Reputation: 1394
I have a large text file containing Arabic text data. When I try to load it into a MySQL table, I get an error saying Error code 1300: invalid utf8 character string. This is what I have tried so far:
LOAD DATA INFILE '/var/lib/mysql-files/text_file.txt'
IGNORE INTO TABLE tblTest
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
I tried to ignore this error, but that does not work. I also tried LOAD DATA LOCAL INFILE,
but it did not work either. My database was created with DEFAULT CHARACTER SET utf8
and DEFAULT COLLATE utf8_general_ci. The text file is UTF-8
encoded.
I do not want the records that contain invalid utf8 characters. So how can I load the data while skipping the records that contain such invalid characters?
Thanks in advance!
Upvotes: 4
Views: 18165
Reputation: 142278
It would help to have the HEX of the naughty character.
A possible approach is to read all the text first, then deal with any bad characters:

1. Load into a column of type VARBINARY or BLOB, which accepts any bytes.
2. Loop through the rows, trying to copy each value into a VARCHAR or TEXT column.
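A sketch of that staging approach (the staging table, and the column name txt, are hypothetical since the question does not show the table definition): load the raw bytes into a BLOB column, then copy over only the rows whose bytes survive a round trip through utf8mb4, which filters out invalid sequences.

```sql
-- Hypothetical staging table: BLOB accepts any bytes, so the load cannot fail
-- on bad characters.
CREATE TABLE tblTest_raw (
  txt BLOB
);

LOAD DATA INFILE '/var/lib/mysql-files/text_file.txt'
INTO TABLE tblTest_raw
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

-- Copy only rows that are valid utf8mb4. Invalid bytes get replaced during
-- CONVERT, so a value that survives the round trip unchanged was valid.
INSERT INTO tblTest (txt)
SELECT CONVERT(txt USING utf8mb4)
FROM tblTest_raw
WHERE CONVERT(CONVERT(txt USING utf8mb4) USING binary) = txt;
```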
Another plan is to use utf8mb4 instead of utf8. The bad character may be an Emoji or a Chinese character that works in utf8mb4 but not in utf8.
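If you go the utf8mb4 route, the existing table's columns have to be converted before reloading; a minimal example, assuming the table from the question:

```sql
-- Convert the table and all of its string columns to utf8mb4.
ALTER TABLE tblTest
  CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci;
```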
Ignore errors
This may let you ignore errors by turning off strict mode for the duration of the load:
SET @save := @@sql_mode;
SET sql_mode = '';   -- without strict mode, bad characters become warnings
LOAD DATA ...;
SET @@sql_mode := @save;
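Note that with sql_mode cleared, LOAD DATA downgrades the character-set errors to warnings and still inserts the affected rows (with the bad bytes mangled) rather than skipping them, so it is worth inspecting the warnings right after the load:

```sql
LOAD DATA INFILE '/var/lib/mysql-files/text_file.txt'
IGNORE INTO TABLE tblTest
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

-- Lists the warnings from the load, including which character
-- strings were invalid and for which rows.
SHOW WARNINGS;
```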
Upvotes: 3
Reputation: 134
I had this problem when trying to use MySQL 5.7.14, too.
I went back to MySQL 5.6 and the problem disappeared.
Upvotes: 3