Reputation: 21
I'm trying to export a table from HDFS to an Oracle database, using the command:
sqoop export \
  --connect jdbc:oracle:thin:@ip:port/db \
  --username user -P \
  --table OPORTUNIDADESHIVE \
  --export-dir /user/hadoop/OPORTUNIDADES/000000_0 \
  --input-fields-terminated-by "\t"
where OPORTUNIDADESHIVE is the Oracle table and the file "000000_0" is the Hive table's data extracted to HDFS. Both tables have the same columns.
The Hive table is declared with ROW FORMAT DELIMITED and FIELDS TERMINATED BY '\t'.
But in the end, the export gives me this error message:
2021-11-04 14:58:04,633 INFO mapreduce.Job: map 0% reduce 0%
2021-11-04 14:58:13,711 INFO mapreduce.Job: map 100% reduce 0%
2021-11-04 14:58:13,723 INFO mapreduce.Job: Job job_1635324128846_0049 failed with state FAILED due to: Task failed task_1635324128846_0049_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces: 0
2021-11-04 14:58:13,792 INFO mapreduce.Job: Counters: 9
    Job Counters
        Failed map tasks=2
        Killed map tasks=2
        Launched map tasks=2
        Data-local map tasks=2
        Total time spent by all maps in occupied slots (ms)=28818
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=14409
        Total vcore-milliseconds taken by all map tasks=14409
        Total megabyte-milliseconds taken by all map tasks=29509632
2021-11-04 14:58:13,799 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
2021-11-04 14:58:13,800 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 19.1507 seconds (0 bytes/sec)
2021-11-04 14:58:13,803 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2021-11-04 14:58:13,804 INFO mapreduce.ExportJobBase: Exported 0 records.
2021-11-04 14:58:13,804 ERROR mapreduce.ExportJobBase: Export job failed!
2021-11-04 14:58:13,804 ERROR tool.ExportTool: Error during export: Export job failed!
    at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:445)
    at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:465)
    at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
    at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Upvotes: 0
Views: 1948
Reputation: 7387
sqoop export is very sensitive to data type and column length mismatches, so please follow the steps below to fix this issue:
The --export-dir argument should be the path to the table's directory, which is /user/hadoop/OPORTUNIDADES, not to a single file inside it.
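Applied to the command from the question, the fix is just the changed --export-dir (a sketch reusing the question's placeholder connection details, not verified against a live cluster):

```shell
# Point --export-dir at the Hive table's directory, not at the 000000_0 file;
# Sqoop will pick up every part file in the directory.
# ip, port, db, and user are placeholders from the question.
sqoop export \
  --connect jdbc:oracle:thin:@ip:port/db \
  --username user -P \
  --table OPORTUNIDADESHIVE \
  --export-dir /user/hadoop/OPORTUNIDADES \
  --input-fields-terminated-by "\t"
```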
That said, I feel it's easier to use --hcatalog-table. Here is the complete command:

sqoop export --connect jdbc:oracle:thin:@mydb:5217/user --username user --password pass --hcatalog-database hive_db --hcatalog-table hive_tab --table orac_tab --input-null-string '\\N' --input-null-non-string '\\N' --m 6

Please note we are specifying just the input and output tables, plus optional null handling.
Upvotes: 0