user1521306

Reputation: 185

How to import tables from sql server through sqoop to hdfs

I have Hadoop, Hive, and Sqoop installed. I imported a table from my database into HDFS, but I couldn't import it into Hive. Do I need to configure any file in Hive? Also, when I browsed the web, the configuration shown was for MySQL, but I am using the jdbc:sqlserver driver. Can anyone please help? I have been stuck on this for many days.

Upvotes: 2

Views: 17083

Answers (5)

Rishi

Reputation: 1060

I also had the same problem: I could store my MySQL table in HDFS but couldn't load it into Hive. I simply imported the table into Hive using the following command, without first storing it in HDFS, and it worked for me.

sqoop import --connect jdbc:mysql://ipAddress:portNo/mysqldatabase --table mysqltablename --username mysqlusername --password mysqlpassword --hive-import --hive-table hivedatabase.hivetablename
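Since the question is about SQL Server rather than MySQL, the same one-step Hive import should work with the connect string and driver swapped. A hedged sketch (host, port, database, credentials, and table names below are placeholders; adjust to your environment):

```shell
# One-step import from SQL Server directly into a Hive table.
# All names here are illustrative, not from the original question.
sqoop import \
  --connect "jdbc:sqlserver://sqlserver-host:1433;database=demo" \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --username sqoop \
  --password simp1e \
  --table customers \
  --hive-import \
  --hive-table hivedatabase.customers
```

With --hive-import, Sqoop stages the data in HDFS, creates the Hive table if needed, and loads the data in one run.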

Upvotes: 0

sandeepkakkeri

Reputation: 31

jdbc:mysql is for MySQL and won't work for SQL Server; I tried it and it gave errors. The command below worked wonderfully for me.

Command: import — copy data from a database table to the HDFS file system.

In the example below, our database & hdfs configuration is:

  • server name: labDB
  • database name: demo
  • SQL user name: sqoop
  • SQL password: simp1e
  • driver class name: com.microsoft.sqlserver.jdbc.SQLServerDriver
  • table: dbo.customers
  • target directory: /tmp/dbo-customer (HDFS folder name)

Syntax:

  sqoop import --connect jdbc:sqlserver://sqlserver-name \
      --username <username> \
      --password <password> \
      --driver   <driver-manager-class> \
      --table    <table-name> \
      --target-dir  <target-folder-name>

Sample:

  sqoop import --connect "jdbc:sqlserver://labDB;database=demo" \
       --username sqoop  \
       --password simp1e \
       --driver   com.microsoft.sqlserver.jdbc.SQLServerDriver \
       --table    "dbo.customers" \
       --target-dir  "/tmp/dbo-customer"
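Once the import finishes, you can confirm that the files landed in the target directory. A quick check (the part-file names depend on how many mappers ran; part-m-00000 is just the usual first one):

```shell
# List the files Sqoop wrote to the target directory
hdfs dfs -ls /tmp/dbo-customer

# Peek at the first few imported rows (comma-separated text by default)
hdfs dfs -cat /tmp/dbo-customer/part-m-00000 | head
```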

https://danieladeniji.wordpress.com/2013/05/06/hadoop-sqoop-importing-data-from-microsoft-sql-server/

Upvotes: 3

sathish reddy

Reputation: 119

Just create an external Hive table over the path in HDFS, or use --hive-import.

Either of the two should work for you.
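The external-table route can be sketched as follows, assuming the data was imported to /tmp/dbo-customer as comma-separated text; the column names and types here are illustrative and must match your actual table:

```shell
# Create an external Hive table over the directory Sqoop wrote to.
# Dropping an EXTERNAL table later leaves the HDFS files untouched.
hive -e "
CREATE EXTERNAL TABLE customers (
  id   INT,
  name STRING,
  city STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/tmp/dbo-customer';"
```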

Upvotes: 0

Dave Poole

Reputation: 1068

Have you used the specific --hive-import switch in the sqoop command line?

sqoop import --connect 'jdbc:sqlserver://sqlservername.mycompany.com;username=hadoop;password=hadoop;database=hadoop' --table dataforhive --hive-import

Upvotes: 0

You should be able to import a table and see it in Hive using the --hive-import flag. Check that you have defined all the global variables: HADOOP_HOME, SQOOP_HOME, and HIVE_HOME. If it doesn't work for you, in the meantime you can always use the CREATE EXTERNAL TABLE syntax to make use of your imported data in Hive.
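A quick sanity check for those variables (the paths printed will of course depend on your installation):

```shell
# Print each variable Sqoop's Hive import relies on, or "<not set>" if missing.
echo "HADOOP_HOME=${HADOOP_HOME:-<not set>}"
echo "SQOOP_HOME=${SQOOP_HOME:-<not set>}"
echo "HIVE_HOME=${HIVE_HOME:-<not set>}"
```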

Upvotes: 1
