dmreshet

Reputation: 1526

SQOOP export CSV to MySQL fails

I have a CSV file in HDFS with lines like:

"2015-12-01","Augusta","46728.0","1"

I am trying to export this file to a MySQL table:

CREATE TABLE test.events_top10(
   dt VARCHAR(255),
   name VARCHAR(255),
   summary VARCHAR(255),
   row_number VARCHAR(255)
  );

With the command:

sqoop export  --table events_top10 --export-dir /user/hive/warehouse/result --escaped-by \" --connect ...

This command fails with error:

Error: java.io.IOException: Can't export data, please check failed map task logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.RuntimeException: Can't parse input data: '2015-12-02,Ashburn,43040.0,9'
    at events_top10.__loadFromFields(events_top10.java:335)
    at events_top10.parse(events_top10.java:268)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
Caused by: java.util.NoSuchElementException
    at java.util.ArrayList$Itr.next(ArrayList.java:834)
    at events_top10.__loadFromFields(events_top10.java:320)
    ... 12 more

If I do not use the --escaped-by \" parameter, then the MySQL table contains rows like this:

"2015-12-01" | "Augusta"       | "46728.0" | "1" 

Could you please explain how to export a CSV file to a MySQL table without the double quotes?

Upvotes: 1

Views: 1013

Answers (1)

dmreshet

Reputation: 1526

I had to use both --escaped-by '\\' and --enclosed-by '\"'. So the correct command is:

sqoop export  --table events_top10 --export-dir /user/hive/warehouse/result  --escaped-by '\\' --enclosed-by '\"'  --connect ...

For more information, please see the official Sqoop documentation.
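To see why the enclosing character matters, here is a simplified Python model of the parsing step. This is not Sqoop's actual generated code, and parse_line is a hypothetical helper; it only sketches the idea that without a declared enclosing character the quotes stay inside the field values, which is why the generated events_top10 parser choked on the data.

```python
# Simplified sketch (NOT Sqoop's generated parser) of how an
# enclosing character changes field parsing. Assumes fields are
# comma-separated and optionally wrapped in the enclosing character,
# with no embedded commas inside fields.

def parse_line(line, enclosed_by=None):
    """Split a line on commas; strip the optional enclosing character."""
    fields = line.split(",")
    if enclosed_by:
        fields = [f.strip(enclosed_by) for f in fields]
    return fields

# Without an enclosing character, the quotes remain in the values:
raw = parse_line('"2015-12-01","Augusta","46728.0","1"')
# With enclosed_by='"', the quotes are stripped before loading:
clean = parse_line('"2015-12-01","Augusta","46728.0","1"', enclosed_by='"')
```

In this sketch, `raw` keeps the literal quotes (which is what ended up in the MySQL table without the extra flags), while `clean` yields the bare values 2015-12-01, Augusta, and so on.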

Upvotes: 1
