Sonu R

Reputation: 11

MySQL CSV import is failing with a "Data too long for column" error

When I try to import a CSV file into a MySQL table, I get an error:

Data too long for column 'incident' at row 1

I'm sure none of the values are longer than varchar(12), but I'm still getting the error.

MariaDB [pagerduty]> 

    LOAD DATA INFILE '/var/lib/mysql/pagerduty/script_output.csv' 
    REPLACE INTO TABLE incidents 
    ignore 1 lines;

ERROR 1406 (22001): Data too long for column 'incident' at row 1

MariaDB [pagerduty]> 

    LOAD DATA INFILE '/var/lib/mysql/pagerduty/script_output.csv' 
    INTO TABLE incidents 
    ignore 1 lines;

ERROR 1406 (22001): Data too long for column 'incident' at row 1

When trying with REPLACE, only one column (the one set as the primary key) gets populated.

MariaDB [pagerduty]> 

    LOAD DATA INFILE '/var/lib/mysql/pagerduty/script_output.csv' 
    IGNORE INTO TABLE incidents 
    ignore 1 lines;

Query OK, 246 rows affected, 1968 warnings (0.015 sec)
Records: 246  Deleted: 0  Skipped: 0  Warnings: 1968
**Columns:**
+----------------+--------------+------+-----+---------+-------+
| Field          | Type         | Null | Key | Default | Extra |
+----------------+--------------+------+-----+---------+-------+
| incident       | varchar(12)  | NO   | PRI | NULL    |       |
| description    | varchar(300) | YES  |     | NULL    |       |
| status         | varchar(12)  | YES  |     | NULL    |       |
| urgency        | varchar(7)   | YES  |     | NULL    |       |
| service        | varchar(27)  | YES  |     | NULL    |       |
| trigger        | varchar(25)  | YES  |     | NULL    |       |
| team           | varchar(20)  | YES  |     | NULL    |       |
| incident_start | datetime(6)  | YES  |     | NULL    |       |
| incident_end   | datetime(6)  | YES  |     | NULL    |       |
| resolved_by    | varchar(20)  | YES  |     | NULL    |       |
+----------------+--------------+------+-----+---------+-------+
10 rows in set (0.003 sec)

Upvotes: 0

Views: 2462

Answers (2)

By default, MySQL looks for a TAB character to separate values. Your file is using a comma, so MySQL reads the entire line and assumes it is the value for the first column only.

You need to tell MySQL that the column terminator is a comma, and while you're at it, tell it about the enclosing double quotes.

Try this:

LOAD DATA INFILE '/var/lib/mysql/pagerduty/script_output.csv' REPLACE INTO TABLE incidents
    COLUMNS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    IGNORE 1 LINES;

Reference
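If you want to see exactly what got truncated after a load that reports warnings (like the 1968 warnings from the IGNORE run), you can inspect the warnings buffer and the loaded values. A sketch, using the `incidents` table and `incident` column from the question:

```sql
-- Show the first few warnings from the most recent load
SHOW WARNINGS LIMIT 10;

-- Inspect the longest values that actually landed in the column
SELECT incident, CHAR_LENGTH(incident) AS len
FROM incidents
ORDER BY len DESC
LIMIT 10;
```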

Upvotes: 1

DRapp

Reputation: 48139

If you THINK your data appears ok, and it's still nagging about the data being too long, how about creating a new temporary table structure and setting your first column incident to a varchar(100) just for grins... maybe even a few others if they too might be causing a problem.

Import the data to THAT table to see if same error or not.
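A minimal sketch of that scratch table, assuming a hypothetical name `incidents_tmp` (clone the production structure, widen the suspect columns, then re-run the same load against it):

```sql
-- Clone the production structure, then widen the suspect columns
CREATE TABLE incidents_tmp LIKE incidents;
ALTER TABLE incidents_tmp
    MODIFY incident VARCHAR(100),
    MODIFY status   VARCHAR(100),
    MODIFY urgency  VARCHAR(100);

-- Repeat the load from the question against the wider table
LOAD DATA INFILE '/var/lib/mysql/pagerduty/script_output.csv'
INTO TABLE incidents_tmp
IGNORE 1 LINES;
```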

If there is no error, then check the maximum trimmed length of the data in the respective columns and analyze the data itself... bad format, longer than expected, etc.
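That length check can be expressed directly in SQL. A sketch, assuming the wider scratch table is named `incidents_tmp`; compare the results against the production varchar sizes:

```sql
-- Maximum trimmed length per column, to compare against varchar(12), varchar(300), etc.
SELECT
    MAX(CHAR_LENGTH(TRIM(incident)))    AS max_incident,
    MAX(CHAR_LENGTH(TRIM(description))) AS max_description,
    MAX(CHAR_LENGTH(TRIM(status)))      AS max_status,
    MAX(CHAR_LENGTH(TRIM(urgency)))     AS max_urgency
FROM incidents_tmp;
```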

Once you have figured it out, you can pull the data into production. You could also always PRE-LOAD the data into this larger table structure, truncating it before each load so a fresh load produces no duplicates via the primary key.

I have had to do that in the past; it was also efficient for pre-qualifying lookup table IDs for new incoming data. Additionally, you could apply data cleansing in the temp table before pulling into production.

Upvotes: 0
