Reputation: 827
I have a PHP script that splits a large file and inserts it into PostgreSQL. This import worked before with PHP 5.3, PostgreSQL 8.3, and Mac OS X 10.5.8. I have now moved everything over to a new Mac Pro with plenty of RAM (16 GB), Mac OS X 10.9.2, PHP 5.5.8, and PostgreSQL 9.3.
The problem occurs when reading the large import file, a tab-separated file of over 181 MB. I have tried increasing the PHP memory limit up to 2 GB (!) with no success, so I guess the problem must be in the code that reads the text file and splits it. I get this error:
PHP Fatal error: Allowed memory size of 2097152000 bytes exhausted (tried to allocate 72 bytes) in /Library/FileMaker Server/Data/Scripts/getGBIFdata.php on line 20
Is there a better way to do this? I read the file and split it into lines, then split each line by \t (tab). The error occurs on this line:
$arr = explode("\t", $line);
Here is my code:
<?php
## I have tried everything here, memory_limit in php.ini is 256M
ini_set("memory_limit","1000M");
$db= pg_connect('host=127.0.0.1 dbname=My_DB_Name user=Username password=Pass');
### Set error state:
pg_set_error_verbosity($db, PGSQL_ERRORS_VERBOSE);
### Empty DB
$result = pg_query("TRUNCATE TABLE My_DB_Name");
$fcontents = file ('///Library/FileMaker\ Server/Data/Documents/EXPORT/export_file.tab');
for ($i = 0; $i < sizeof($fcontents); $i++) {
    $line = trim($fcontents[$i]);
    $arr = explode("\t", $line);
    $query = "insert into My_DB_Name(
        field1, field2 etc.... )
        values (
        '{$arr[0]}','{$arr[1]}','{$arr[2]}','{$arr[3]}', etc........
        )";
    $result = pg_query($query);
    echo "\n Lines:" . $i;
    pg_send_query($db, $query);
    $res1 = pg_get_result($db);
}
## Update geometry column
$sql = pg_query("
update darwincore2 set punkt_geom=
ST_SetSRID(ST_MakePoint(My_DB_Name.longitude, darwincore2.latitude),4326);
");
?>
Upvotes: 0
Views: 1710
Reputation: 12117
This error can be caused by things like an infinite loop, processing a large amount of data, or even database queries. You should check your code; there may be an infinite loop or a similar issue.
Upvotes: 0
Reputation: 11347
I think the problem is that you're using the file() function, which reads the whole file into memory at once. Try reading it line by line using fopen() and fgets().
$fp = fopen($filename, "r");
while (($line = fgets($fp)) !== false) {
    // ... insert $line into the db ...
}
fclose($fp);
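For example, the loop body could split each tab-separated line and insert it with a parameterized query. This is a minimal sketch, assuming a hypothetical table my_table with columns col1, col2, col3 and an open connection $db:
$fp = fopen($filename, "r");
while (($line = fgets($fp)) !== false) {
    // Split the tab-separated line into fields
    $arr = explode("\t", trim($line));
    // Parameterized insert avoids building the SQL string by hand
    pg_query_params($db, 'INSERT INTO my_table (col1, col2, col3) VALUES ($1, $2, $3)',
        array($arr[0], $arr[1], $arr[2]));
}
fclose($fp);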
You can also import a file directly with the COPY command (http://www.postgresql.org/docs/9.2/static/sql-copy.html)
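As a rough illustration (the table name, column names, and file path are placeholders; the file must be readable by the PostgreSQL server process, and the default text format already uses tab as the delimiter):
// Server-side COPY: PostgreSQL reads the file itself, so this is usually
// much faster than inserting row by row from PHP.
pg_query($db, "COPY my_table (col1, col2, col3) FROM '/path/to/export_file.tab'");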
Upvotes: 3