Reputation: 2893
I have the following code
$file="postcodes.csv";
$csv= file_get_contents($file);
$array = array_map("str_getcsv", explode("\n", $csv));
$json = json_encode($array);
print_r($json);
postcodes.csv is 603MB in size, so a large file.
In php.ini, if I have
memory_limit=1024M
I get the error
Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 256 bytes) in ...
If I increase the memory limit to 2056M, I get the error
Fatal error: Out of memory (allocated 1919680512) (tried to allocate 36 bytes) in...
It is similar if I change it to -1.
So how can I load this csv file without having memory issues?
Thanks
Upvotes: 4
Views: 6468
Reputation: 8560
The answer is simple: you need to increase memory_limit in php.ini. The file is 603MB, but all these functions build structures in memory from the CSV and JSON data, and those take considerably more than 603MB. Alternatively, you can optimize memory usage by changing the code, but your question is how to increase the memory limit.
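For reference, the limit can also be raised from the script itself instead of editing php.ini; a minimal sketch (the 4096M value is only an example, pick whatever the machine can spare):
ini_set('memory_limit', '4096M'); // raise the limit at runtime; -1 would remove it entirely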
Upvotes: 2
Reputation: 7283
Instead of reading the full file into a variable, splitting it on newlines and then running str_getcsv on each array element, read the file line by line with fgetcsv. Depending on what you are after, you can build either one full JSON string containing the values from every line, or multiple JSON strings, one per CSV line.
$h = fopen("postcodes.csv", "r");
if ($h !== FALSE) {
    $str = '';
    $array = array();
    while (($data = fgetcsv($h)) !== FALSE) {
        $str .= json_encode($data); // add each JSON string to a string variable, save later
        // or
        $array[] = $data;
    }
    fclose($h);
}
$finalJsonString = json_encode($array);
I wouldn't recommend that you print_r an entire array or JSON object of that size, since the output would be difficult to follow.
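If one JSON string per CSV line is enough, here is a minimal sketch of the "save later" idea that writes each row straight to an output file so nothing accumulates in memory (the postcodes.ndjson name is just an example):
$in = fopen("postcodes.csv", "r");
$out = fopen("postcodes.ndjson", "w"); // example output name: one JSON document per line
if ($in !== FALSE && $out !== FALSE) {
    while (($data = fgetcsv($in)) !== FALSE) {
        fwrite($out, json_encode($data) . "\n"); // write each row immediately instead of concatenating
    }
    fclose($in);
    fclose($out);
}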
Upvotes: 2
Reputation: 451
If you are reading a large file, I would recommend using a file pointer and the fgetcsv() function, looping line by line rather than loading the whole file.
Also, a newline does not necessarily mean the end of a CSV row, so explode("\n", $csv) may give you some unwanted results... It would be safer to use fgetcsv().
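To illustrate with a made-up record (the address and postcode are invented; the point is the newline inside the quoted field):
$row = "\"10 Downing St,\nLondon\",SW1A 2AA\n"; // one CSV record spread over two physical lines
print_r(explode("\n", $row)); // splits the record in the middle of the quoted field
$h = fopen("php://memory", "r+"); // in-memory stream, just for the demo
fwrite($h, $row);
rewind($h);
print_r(fgetcsv($h)); // keeps "10 Downing St,\nLondon" together as a single field
fclose($h);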
Upvotes: 1
Reputation: 2005
You can read your file line by line.
For example,
$file="postcodes.csv";
$array = array();
if (($handle = fopen($file, "r")) !== FALSE) {
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
$array[]=$data;
}
fclose($handle);
}
$json = json_encode($array);
print_r($json);
But memory problems can still happen if you really have a lot of data and your array gets too big.
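If the array itself becomes the problem, one way around it is to write the JSON array incrementally to a file instead of building it in memory; a minimal sketch, assuming the output goes to a hypothetical postcodes.json:
$in = fopen("postcodes.csv", "r");
$out = fopen("postcodes.json", "w"); // example output name
if ($in !== FALSE && $out !== FALSE) {
    fwrite($out, "[");
    $first = true;
    while (($data = fgetcsv($in, 1000, ",")) !== FALSE) {
        if (!$first) {
            fwrite($out, ",");
        }
        fwrite($out, json_encode($data)); // only one row is ever held in memory
        $first = false;
    }
    fwrite($out, "]");
    fclose($in);
    fclose($out);
}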
Upvotes: 2