Mohamed Emad

Reputation: 104

Sqoop does not import VARCHAR2 datatype

I have a table in an Oracle database and I want to import its data to HDFS. I am trying to do this with Sqoop, but the VARCHAR2 columns are not imported: their data never arrives in the HDFS files. My Sqoop command:

sqoop import -D mapred.job.name='default oraoop' --driver oracle.jdbc.driver.OracleDriver --connect "jdbc:oracle:thin:@//MyIp:MyPort/MyServiceName" --username "XXXX" --password "XX" --target-dir "My_dir" --query 'select * from MyTable where $CONDITIONS' --split-by "splitColumn" --boundary-query "SELECT min(splitColumn),max(splitColumn) FROM DUAL" --num-mappers 30

Upvotes: 4

Views: 675

Answers (2)

melbadry

Reputation: 26

You can try downgrading the OJDBC driver: instead of a newer jar ("ojdbc6" or "ojdbc7"), use "ojdbc14". This solved the problem for me. However, to avoid an exception about some encoding classes not being found, remove or rename "orai18n.jar" while importing data from Oracle 9i.

You can find the paths to these jar files in "$HADOOP_CLASSPATH" and under "$SQOOP_HOME".
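A minimal sketch of that jar swap, assuming the jars live under $SQOOP_HOME/lib (exact paths and file names vary by installation, and /path/to/ojdbc14.jar is illustrative):

# show which Oracle jars are currently on the classpath
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | grep -i -e ojdbc -e orai18n

# park the newer driver and drop in ojdbc14
mv "$SQOOP_HOME/lib/ojdbc6.jar" "$SQOOP_HOME/lib/ojdbc6.jar.bak"
cp /path/to/ojdbc14.jar "$SQOOP_HOME/lib/"

# rename orai18n.jar so its encoding classes are not picked up
mv "$SQOOP_HOME/lib/orai18n.jar" "$SQOOP_HOME/lib/orai18n.jar.bak"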

Upvotes: 1

Sathiyan S

Reputation: 1023

Maybe Sqoop couldn't identify the matching Java type for VARCHAR2, so try --map-column-java.

Let's say column A is of VARCHAR2 type; then your Sqoop command would be:

sqoop import -D mapred.job.name='default oraoop' --driver oracle.jdbc.driver.OracleDriver --connect "jdbc:oracle:thin:@//MyIp:MyPort/MyServiceName" --username "XXXX" --password "XX" --target-dir "My_dir" --query 'select * from MyTable where $CONDITIONS' --map-column-java A=String --split-by "splitColumn" --boundary-query "SELECT min(splitColumn),max(splitColumn) FROM DUAL" --num-mappers 30

Let me know if this works.
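If more than one VARCHAR2 column is affected, --map-column-java accepts a comma-separated list of mappings; the column names below are just placeholders:

--map-column-java A=String,B=String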

Upvotes: 0
