Reputation: 1645
I read the contents of a CSV file like this:
//$mypath . '/' . $filename <=> ../abc.csv
$val = file_get_contents($mypath . '/' . $filename);
$escaped = pg_escape_bytea($val);
$model->addFileImport($tmp, $data['email'], $escaped);
My file is about 100MB. In php.ini the setting is memory_limit = 128M.
But it still shows the error: Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to allocate 133120 bytes) in...
at the line: $val = file_get_contents($mypath . '/' . $filename);
I tried to fix it by adding ini_set('memory_limit', '-1');:
//$mypath . '/' . $filename <=> ../abc.csv
ini_set('memory_limit', '-1');
$val = file_get_contents($mypath . '/' . $filename);
$escaped = pg_escape_bytea($val);
$model->addFileImport($tmp, $data['email'], $escaped);
But it shows the error:
Fatal error: Out of memory (allocated 230686720) (tried to allocate 657099991 bytes) in C:\wamp\www\joomlandk\components\com_servicemanager\views\i0701\view.html.php on line 112
at the line $escaped = pg_escape_bytea($val);
Why? How can I fix this error?
Upvotes: 2
Views: 21776
Reputation: 28840
According to the docs:
pg_escape_bytea() escapes string for bytea datatype. It returns escaped string.
When you SELECT a bytea type, PostgreSQL returns octal byte values prefixed with '\' (e.g. \032). Users are supposed to convert back to binary format manually.
meaning a single input byte can become 4 output bytes, i.e. up to 4 times the initial size.
You need a lot of RAM to handle your file (maybe your system cannot allocate that much memory, even ignoring the PHP limit). A solution is to process it in chunks of e.g. 40MB, read for instance with the fread() and fwrite() functions.
$val = file_get_contents($mypath . '/' . $filename);
will take 100MB, so the next line can take up to 400MB more, 500MB in total. You need to read less at a time than file_get_contents() does, e.g. only 20 (or 40) MB per chunk.
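A minimal sketch of the chunked approach (assumptions: the same $mypath and $filename as in the question, and that the escaped result is still assembled in one string before being handed to the model):

```php
<?php
// Sketch only: read the file in 20 MB chunks instead of one 100 MB string.
// pg_escape_bytea() works byte by byte, so escaping each chunk separately
// and concatenating the results gives the same output as escaping the whole
// file at once.
$handle = fopen($mypath . '/' . $filename, 'rb');
if ($handle === false) {
    die('Cannot open file');
}

$escaped = '';
while (!feof($handle)) {
    $chunk    = fread($handle, 20 * 1024 * 1024); // 20 MB at a time
    $escaped .= pg_escape_bytea($chunk);
}
fclose($handle);

$model->addFileImport($tmp, $data['email'], $escaped);
```

Note that this only lowers the peak for the raw file data; the full escaped string still ends up in memory. If even that is too much, a different technique is to bypass bytea entirely and stream the file into a PostgreSQL large object with pg_lo_create()/pg_lo_open()/pg_lo_write() inside a transaction, storing only the returned OID in your table, which keeps PHP's memory use constant regardless of file size.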
Upvotes: 1