Reputation: 6471
I insert data from a text file into a MySQL database, and around 6,000 or more entries get inserted at once.
try {
    $sql = "INSERT INTO data (id, name, date, place) VALUES (?, ?, ?, ?)";
    $q = $pdo->prepare($sql);
    foreach ($data as $row) {
        $q->execute(array($row['id'], $row['name'], $row['date'], $row['place']));
    }
} catch (PDOException $pe) {
    echo $pe->getMessage();
} catch (Exception $e) {
    echo $e->getMessage();
}
I tried it with 3,000 entries and everything works fine. But when there is more data to insert, my page ends up blank and nothing is inserted into the database.
What could cause this problem and how can I solve it?
Upvotes: 2
Views: 4014
Reputation: 3200
Since you already have a text file, use the query below:
LOAD DATA INFILE 'file/path/to/file.txt'
INTO TABLE table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Make sure the fields in your file are in the same order as the columns in your table.
Read more here https://dev.mysql.com/doc/refman/5.6/en/load-data.html
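If you want to run this from PHP rather than the mysql client, here is a minimal sketch. It assumes local_infile is enabled on the server, the LOCAL variant is acceptable, and that the host, credentials, file path, and column list below are placeholders you replace with your own:

<?php
// Sketch: running LOAD DATA LOCAL INFILE through PDO.
// Assumes local_infile is enabled and the path/credentials are adjusted.
$pdo = new PDO(
    'mysql:host=localhost;dbname=test;charset=utf8mb4',
    'user',
    'password',
    [
        PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_LOCAL_INFILE => true, // allow LOCAL on the client side
    ]
);

$sql = "LOAD DATA LOCAL INFILE 'file/path/to/file.txt'
        INTO TABLE data
        FIELDS TERMINATED BY ','
        ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 ROWS
        (id, name, date, place)";

$rows = $pdo->exec($sql);   // returns the number of affected rows
echo "Loaded $rows rows\n";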
Upvotes: 5
Reputation: 24959
Use MySQL's LOAD DATA INFILE.
Refer to this answer.
If you use PDO or a similar API in a loop, you are doing it the slow way.
Upvotes: 1
Reputation: 19879
Instead of running multiple queries inside a loop, you should use the multi-insert syntax. Example:
INSERT INTO data (id, name, date, place) VALUES
    (:id1, :name1, :date1, :place1),
    (:id2, :name2, :date2, :place2),
    (:id3, :name3, :date3, :place3);
Check out this article on doing multi-inserts with prepared statements.
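As a rough illustration (not the article's exact code), a prepared multi-row insert built in chunks could look like this; it assumes $pdo and $data exist as in the question, and the chunk size of 500 is an arbitrary choice:

<?php
// Sketch: one prepared statement inserts many rows at a time.
// Assumes $pdo is an open PDO connection and $data is the parsed file.
$chunkSize = 500;

foreach (array_chunk($data, $chunkSize) as $chunk) {
    // One "(?,?,?,?)" group per row in this chunk.
    $placeholders = implode(',', array_fill(0, count($chunk), '(?,?,?,?)'));
    $sql = "INSERT INTO data (id, name, date, place) VALUES $placeholders";

    // Flatten the chunk into a single flat list of bound values.
    $values = [];
    foreach ($chunk as $row) {
        $values[] = $row['id'];
        $values[] = $row['name'];
        $values[] = $row['date'];
        $values[] = $row['place'];
    }

    $stmt = $pdo->prepare($sql);
    $stmt->execute($values);
}

Chunking keeps each statement below the server's max_allowed_packet limit, and wrapping the loop in a transaction ($pdo->beginTransaction() / $pdo->commit()) usually speeds it up further.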
Upvotes: 1
Reputation: 1439
INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas. Example:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
From How to do a batch insert in MySQL
Upvotes: 1