Reputation: 4761
I have a big CSV file (about 30 MB) that I want to read with my PHP program, convert to another format, and save as several smaller files. When I use the traditional fopen / fwrite methods I get an error: Fatal error: Allowed memory size of 134217728 bytes exhausted
. I am aware that I can raise the memory limit in php.ini, but is there any way to read the file as a stream so that it won't create much memory overhead? Maybe something like the StreamReader
classes in Java?
Upvotes: 3
Views: 6357
Reputation: 20873
You could just read the file one line at a time with fgets()
, provided you reassign your variable on each pass through the loop (and don't store the lines in an array or similar, where they would all remain in memory).
One way to see the difference, with a ~65 MB file:
// load the whole thing
$file = file_get_contents('hugefile.txt');
echo memory_get_peak_usage() / 1024 / 1024, ' MB';
// prints '66.153938293457 MB'
Second way:
// load only one line at a time
$fh = fopen('hugefile.txt', 'r');
while ($line = fgets($fh)) {}
fclose($fh);
echo memory_get_peak_usage() / 1024 / 1024, ' MB';
// prints '0.62477111816406 MB'
Also, if you want to rearrange the data in a different format, you could parse each line as CSV as you go using fgetcsv()
instead.
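Putting that together, here is a minimal sketch of the split-and-convert the question describes: stream the CSV with fgetcsv(), buffer a bounded number of rows, and flush each batch to a smaller JSON file. The function name, output naming scheme, JSON target format, and chunk size are all assumptions for illustration; adapt them to your actual output format.

```php
<?php
// Sketch: stream a large CSV and split it into small JSON files,
// holding at most $chunkSize parsed rows in memory at a time.
// splitCsvToJson, the 'prefixN.json' layout, and the default chunk
// size are illustrative assumptions, not the asker's actual spec.
function splitCsvToJson(string $csvPath, string $outPrefix, int $chunkSize = 1000): int
{
    $in = fopen($csvPath, 'r');
    if ($in === false) {
        throw new RuntimeException("Cannot open $csvPath");
    }

    $rows = [];
    $fileIndex = 0;
    while (($fields = fgetcsv($in)) !== false) {   // one parsed row at a time
        $rows[] = $fields;
        if (count($rows) >= $chunkSize) {          // flush a full chunk to disk
            file_put_contents($outPrefix . $fileIndex++ . '.json', json_encode($rows));
            $rows = [];                            // free the chunk
        }
    }
    if ($rows) {                                   // flush the final partial chunk
        file_put_contents($outPrefix . $fileIndex++ . '.json', json_encode($rows));
    }
    fclose($in);
    return $fileIndex;                             // number of files written
}
```

Because only one chunk of rows is ever held at once, peak memory stays roughly proportional to $chunkSize rather than to the size of the input file.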
Upvotes: 9