pravin chavan

Reputation: 11

sqoop export failed after successful map

1. sqoop export --connect jdbc:mysql://localhost:3306/hduser_db \
       --username hduser --password hduser --table export \
       --export-dir /user/hive/warehouse/three --fields-terminated-by ','

17/09/13 14:10:45 INFO mapreduce.Job:  map 0% reduce 0%
17/09/13 14:10:50 INFO mapreduce.Job:  map 100% reduce 0%
17/09/13 14:10:51 INFO mapreduce.Job: Job job_1505199140014_0033 failed with state FAILED due to: Task failed task_1505199140014_0033_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

2. 17/09/13 14:10:51 INFO mapreduce.Job: Counters: 8

   Job Counters
       Failed map tasks=1
       Launched map tasks=1
       Rack-local map tasks=1
       Total time spent by all maps in occupied slots (ms)=2947
       Total time spent by all reduces in occupied slots (ms)=0
       Total time spent by all map tasks (ms)=2947
       Total vcore-milliseconds taken by all map tasks=2947
       Total megabyte-milliseconds taken by all map tasks=3017728
   17/09/13 14:10:51 WARN mapreduce.Counters: Group FileSystemCounters is deprecate
   17/09/13 14:10:51 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 14.8875 s
   17/09/13 14:10:51 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$C
   17/09/13 14:10:51 INFO mapreduce.ExportJobBase: Exported 0 records.
   17/09/13 14:10:51 ERROR tool.ExportTool: Error during export: Export job failed!
       at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java
       at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:931)
       at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
       at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
       at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
       at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
       at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

Upvotes: 0

Views: 2578

Answers (1)

args

Reputation: 532

While running the export command, the steps below have to be taken care of:

  1. Datatypes and column names should match between the source (the HDFS data) and the destination (the table on the RDBMS).

  2. All column names should be specified in the --columns parameter.

Eg:

sqoop export --connect jdbc:mysql://localhost:3306/hduser_db --username hduser --password hduser --table export --export-dir /user/hive/warehouse/three --fields-terminated-by ',' --columns "column1,column2,...." ;
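A quick way to catch a column mismatch before exporting is to count the delimited fields in the HDFS data and compare against the number of columns in the target table. The sketch below does this locally; the sample file and the expected column count of 3 are assumptions standing in for the real export directory and table definition:

```shell
# Assumed column count of the target MySQL table (hypothetical).
expected_cols=3

# Stand-in for the data under the --export-dir (hypothetical sample rows).
printf '1,alice,100\n2,bob,200\n' > /tmp/sample_rows.csv

# Flag any line whose comma-separated field count differs from the table's
# column count -- a common cause of "Export job failed!" after the map phase.
awk -F',' -v n="$expected_cols" \
    'NF != n { bad++ } END { print (bad ? "MISMATCH" : "OK") }' \
    /tmp/sample_rows.csv
```

If this prints MISMATCH, fix the data or the --columns list before rerunning the sqoop export.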

Upvotes: 0
