Jaime Paz

Reputation: 11

Sqoop create hive table ERROR - Encountered IOException running create table job

I am running Sqoop on a CentOS 7 machine that already has Hadoop/MapReduce and Hive installed. I read in a tutorial that when importing data from an RDBMS (SQL Server in my case) to HDFS, I need to run the following command:

sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true  --connect 'jdbc:sqlserver://hostname;database=databasename' --username admin --password admin123  --table tableA
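
For reference, a quick way to confirm the import landed in HDFS (assuming Sqoop's default target directory under your HDFS home; the relative path and part-file name are my assumptions):

hdfs dfs -ls tableA
hdfs dfs -cat tableA/part-m-00000 | head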

Everything works perfectly with this step. The next step is to create a Hive table with the same structure as the RDBMS table (SQL Server in my case), using the following Sqoop command:

sqoop create-hive-table --connect 'jdbc:sqlserver://hostname;database=databasename' --username admin --password admin123 --table tableA --hive-table hivetablename --fields-terminated-by ','

However, whenever I run the above command, I get the following error:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
18/04/01 19:37:52 ERROR ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
18/04/01 19:37:52 INFO ql.Driver: Completed executing command(queryId=hadoop_20180401193745_1f3cf07d-ca16-40dd-8f8d-1e426ecd5860); Time taken: 0.212 seconds
18/04/01 19:37:52 INFO conf.HiveConf: Using the default value passed in for log id: 0813b5c9-f374-4920-b8c6-b8541449a6eb
18/04/01 19:37:52 INFO session.SessionState: Resetting thread name to main
18/04/01 19:37:52 INFO conf.HiveConf: Using the default value passed in for log id: 0813b5c9-f374-4920-b8c6-b8541449a6eb
18/04/01 19:37:52 INFO session.SessionState: Deleted directory: /tmp/hive/hadoop/0813b5c9-f374-4920-b8c6-b8541449a6eb on fs with scheme hdfs
18/04/01 19:37:52 INFO session.SessionState: Deleted directory: /tmp/hive/java/hadoop/0813b5c9-f374-4920-b8c6-b8541449a6eb on fs with scheme file
18/04/01 19:37:52 ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Hive CliDriver exited with status=1

I am not a Java expert, but I would like to know if you have any idea what is causing this error?

Upvotes: 1

Views: 962

Answers (4)

Aniket Bhawkar

Reputation: 1

For this, check the jackson-core, jackson-databind, and jackson-annotations jars. They should be the latest versions; this error usually comes from older versions. Place these jars in both the Hive lib and the Sqoop lib directories. Also check the libthrift jar: it should be the same version in Hive and HBase, and copy that same jar into the Sqoop lib as well.
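
In case it helps, a quick way to see which versions are actually on each classpath (assuming $SQOOP_HOME, $HIVE_HOME, and $HBASE_HOME point at your installations; those variable names are my assumption):

ls $SQOOP_HOME/lib/jackson-*.jar
ls $HIVE_HOME/lib/jackson-*.jar
# libthrift should match between Hive and HBase:
ls $HIVE_HOME/lib/libthrift-*.jar $HBASE_HOME/lib/libthrift-*.jar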

Upvotes: 0

P Déchorgnat

Reputation: 19

I've faced the same issue. It seems there are some compatibility issues between my versions of Sqoop (1.4.7) and Hive (2.3.4).
The problem arises from the versions of the jackson-* jar files in $SQOOP_HOME/lib: some of them are too old for Hive, which needs versions 2.6 or newer.

The solution I found was to replace the following files in $SQOOP_HOME/lib with their counterparts from $HIVE_HOME/lib:

  • jackson-core-*.jar
  • jackson-databind-*.jar
  • jackson-annotations-*.jar

They are all version 2.6+, and this seems to work. I am not sure it's good practice, though.
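
Concretely, here is a sketch of that swap as shell commands (assuming $SQOOP_HOME and $HIVE_HOME are set; the exact version numbers in the jar names will differ per installation):

cd $SQOOP_HOME/lib
# Keep the old jars around rather than deleting them outright.
mkdir -p jackson-backup
mv jackson-core-*.jar jackson-databind-*.jar jackson-annotations-*.jar jackson-backup/
# Copy in the newer (2.6+) Jackson jars that ship with Hive.
cp $HIVE_HOME/lib/jackson-core-*.jar $HIVE_HOME/lib/jackson-databind-*.jar $HIVE_HOME/lib/jackson-annotations-*.jar .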

Upvotes: 2

goutham

Reputation: 31

I was facing the same issue, and downgrading Hive to 1.2.2 worked for me; that will solve the problem.

That said, I am not really sure this helps if you want to use Sqoop with Hive 2.
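
For what it's worth, a rough sketch of such a downgrade for a tarball install (the target path and archive URL are my assumptions; adjust to your layout):

wget https://archive.apache.org/dist/hive/hive-1.2.2/apache-hive-1.2.2-bin.tar.gz
tar -xzf apache-hive-1.2.2-bin.tar.gz -C /opt
export HIVE_HOME=/opt/apache-hive-1.2.2-bin
export PATH=$HIVE_HOME/bin:$PATH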

Upvotes: 1

Prabhat Ratnala

Reputation: 705

Instead of writing two different statements, you can put the whole thing into one statement, which will fetch the data from SQL Server and also create the Hive table:

sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true --connect 'jdbc:sqlserver://hostname;database=databasename' --username admin --password admin123 --table tableA --hive-import --hive-overwrite --hive-table hivetablename --fields-terminated-by ',' --hive-drop-import-delims --null-string '\\N' --null-non-string '\\N'
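
A quick note on the extra flags: --hive-import creates the Hive table (if it does not exist) and loads the data in one pass, --hive-overwrite replaces any existing data in that table, --hive-drop-import-delims strips \n, \r, and \01 characters from string fields so they cannot break Hive's row format, and --null-string/--null-non-string write NULLs as \N, which Hive reads back as NULL.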

Upvotes: 0
