Nick

Reputation: 572

Moving Sqoop data from HDFS to Hive

When importing a bunch of large MySQL tables into HDFS using Sqoop, I forgot to include the --hive-import flag. So now I've got these tables sitting in HDFS, and am wondering if there's an easy way to load the data into Hive (without writing the LOAD DATA statements myself).
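For reference, what I'm hoping to avoid is hand-writing something like the statement below for every table (the path and table name here are just placeholders; Sqoop's default target directory is usually /user/<username>/<table>):

-- placeholder path/table; this is the kind of per-table statement I'd rather not write by hand
LOAD DATA INPATH 'hdfs:///user/hadoop/tweets' INTO TABLE tweets;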

I tried using sqoop create-hive-table:

./bin/sqoop create-hive-table --connect jdbc:mysql://xxx:3306/dw --username xxx --password xxx --hive-import --table tweets

While this did create the correct hive table, it didn't import any data into it. I have a feeling I'm missing something simple here...

For the record, I am using Elastic MapReduce, with Sqoop 1.4.1.

Upvotes: 1

Views: 3942

Answers (2)

Chris Marotta

Reputation: 600

You did not specify the import tool in your command. The syntax is sqoop tool-name [tool-arguments].

It should look like this:

$ sqoop import --create-hive-table --connect jdbc:mysql://xxx:3306/dw --username xxx --password xxx --hive-import --table tweets

Upvotes: 1

Matthew Rathbone

Reputation: 8269

Can't you create an external table in Hive and point it at these files? Sqoop's text imports are comma-delimited by default, so declare that in the table definition:

create external table something (a string, b string) row format delimited fields terminated by ',' location 'hdfs:///some/path';
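Once the table is defined, a quick sanity check along these lines (still using the placeholder table name above) should confirm Hive can read the Sqoop output in place:

-- verify the external table maps onto the Sqoop files correctly
select * from something limit 10;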

Upvotes: 5
