Reputation: 1
I've got a large (about 20 MB) CSV file I'm trying to import into a table in my database. The file has a little over 78,000 rows and 26 columns, and I only need the first few columns. I enter the following code into the SQL terminal in phpMyAdmin:
LOAD DATA INFILE 'path to my file/myfile.csv'
REPLACE INTO TABLE mytable
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\"'
IGNORE 1 LINES
(id, name, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy, @dummy)
It gives a success message that says, "156306 rows inserted. (Query took 8.2870 seconds.)" This is exactly double the number of rows in the file.
I went in to browse the table and see what happened, and phpMyAdmin's browse window indicates there are only 13264 records there. The rows that did import worked perfectly. Since I'm using REPLACE to eliminate duplicates, I expected far fewer than the 78k rows, but many of the records are missing. Any idea what might cause this issue? I've never experienced this before...
Upvotes: 0
Views: 725
Reputation: 75
Edit /etc/php5/apache2/php.ini, set upload_max_filesize and post_max_size to 150M, and restart Apache. See https://www.digitalocean.com/community/questions/how-to-extend-limit-of-import-file-in-phpmyadmin
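For reference, a minimal sketch of the relevant php.ini directives, assuming the /etc/php5/apache2/php.ini path given in this answer (newer systems place it elsewhere); the 150M value is the one suggested above:

    ; /etc/php5/apache2/php.ini  (path as given in the answer)
    upload_max_filesize = 150M
    post_max_size = 150M

After saving, restart Apache (for example with "sudo service apache2 restart") so the new limits take effect.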
Upvotes: 2
Reputation: 77
Change the value in the file C:/xampp/mysql/bin/my.cnf using Notepad or any other text editor: increase max_allowed_packet as needed, then try importing the CSV file again. If it still fails, copy the entire contents of the file into a new document, paste it, and save it as plain text (don't save it as a web page).
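For illustration, a hedged sketch of the my.cnf change described above; the 256M value is only an example, and the [mysqld] section header is standard MySQL/MariaDB configuration:

    # C:/xampp/mysql/bin/my.cnf (path as given in the answer)
    [mysqld]
    max_allowed_packet = 256M

Restart MySQL from the XAMPP control panel afterwards so the new packet size is picked up.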
Upvotes: 0