PHP - Is exec'ing awk or using fread faster for reading a column from a very large file?
I have a file containing plot data; each line holds 4 coordinates, and in total the data file can exceed 1 GB. Say I need the third column of the data file: which method is considered better practice, and which is faster?
Using exec():
exec("awk '{ print $3 }' data", $output);
Or using a PHP script:
$data = file("data");
$points = array();
foreach ($data as $line) {
    $cols = explode(' ', $line);
    $points[] = $cols[2];   // third column
}
Moreover, since the server does not allow reading the whole large file at once, I have to use fread() to read the file in several parts. fread() is not line-aware, so extra work has to be done to combine the partial last line of each part with the start of the next one. Any suggestions, or is there a better method to read a column from a file in PHP?
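To illustrate what I mean by combining the last line of each part, here is a rough sketch of the chunked approach (the chunk size and the single-space delimiter are my assumptions):

$fp = fopen("data", "r");
$points = array();
$carry = "";                                         // partial line left over from the previous chunk
while (!feof($fp)) {
    $chunk = $carry . fread($fp, 8 * 1024 * 1024);   // read ~8 MB at a time
    $lines = explode("\n", $chunk);
    $carry = array_pop($lines);                      // last piece may be incomplete; keep it for the next pass
    foreach ($lines as $line) {
        if ($line === "") continue;
        $cols = explode(" ", $line);
        $points[] = $cols[2];                        // third column
    }
}
if ($carry !== "") {                                 // final line without a trailing newline
    $cols = explode(" ", $carry);
    $points[] = $cols[2];
}
fclose($fp);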
Here is a test on /file, a 3.1 GB file:
root# time awk '{ print $3 }' /file >/dev/null

real    1m42.430s
user    1m0.241s
sys     0m2.198s
Okay, ±1.7 minutes for awk. Now let's test PHP (no field splitting, just the third character of each line):
root# time php -r '$fp = fopen("/file", "r"); while (($buf = fgets($fp)) !== false) echo $buf[2]; fclose($fp);' >/dev/null

real    4m17.322s
user    3m16.571s
sys     0m31.625s
±4.3 minutes for PHP! I don't want to imagine how long it would take if I had used @Jack's code...
PHP is far slower than awk. On big files, use awk (invoked via exec()). As the timings show, PHP spends a lot of time in user space (roughly three times more than awk).
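If keeping exec()'s whole $output array in memory is also a problem on that server, one possible variation (my suggestion, not part of the benchmark above) is to stream awk's output line by line through popen():

$fp = popen("awk '{ print \$3 }' data", "r");  // \$3 keeps the literal $3 for awk
$points = array();
while (($value = fgets($fp)) !== false) {
    $points[] = rtrim($value, "\n");           // third-column value, newline stripped
}
pclose($fp);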