Chandru

Reputation: 1

Wrong data types in Hive with Sqoop import from Oracle

I am trying to import Oracle tables directly into Hive with Sqoop.

The Oracle tables use the data types NUMBER, VARCHAR2, and RAW.

When I tried:

sqoop import ... --hive-import --hive-overwrite --hive-database default --fields-terminated-by '|' --hive-drop-import-delims --null-string '\\N' --null-non-string '\\N' --warehouse-dir "/test"

all columns in the resulting Hive tables came out as either DOUBLE or STRING, but I want INT, DATE, etc. for the NUMBER(1) and DATE columns.

I have tried adding a few options, such as:

--map-column-hive O_abc=INT,O_def=DATE,pqr=INT,O_uvw=INT,O_xyz=INT

Is there any way I can automate this? I need to import 150 to 200 tables, and it is tedious to spell out the --map-column-hive mappings for every table.

Environment:

Thanks in advance!

Upvotes: 0

Views: 1145

Answers (1)

Sandish Kumar H N

Reputation: 312

You could import all tables from Oracle into HDFS (sqoop import-all-tables {generic-args} {import-args}) and then create external or internal (managed) Hive tables on top of the imported files, declaring the column types based on your requirement.
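A rough sketch of that approach (the connection string, username, table name, and column names below are placeholders, not taken from the question): first pull everything into HDFS as delimited text,

sqoop import-all-tables --connect jdbc:oracle:thin:@//dbhost:1521/ORCL --username MYUSER -P --as-textfile --fields-terminated-by '|' --null-string '\\N' --null-non-string '\\N' --warehouse-dir /test -m 1

then declare the types yourself in Hive, one external table per imported directory, for example:

CREATE EXTERNAL TABLE default.abc_table (
  o_abc INT,
  o_def TIMESTAMP,
  pqr   INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/test/ABC_TABLE';

Note that Sqoop writes Oracle DATE values with a time portion in text output, so TIMESTAMP is usually a safer Hive type than DATE for those columns, and the DDL still has to be supplied per table; Sqoop itself will not infer INT or DATE for NUMBER and DATE columns automatically.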

Upvotes: 1
