Yogzzz

Reputation: 2785

Hadoop - Most efficient way to get data

I have a lot of data in Hadoop which I need to copy into a MySQL DB.

Would it be more efficient to select the columns I need (which is almost all of them) in Hive and write the results using INSERT OVERWRITE, or would it be better to use the copyToLocal shell command to copy the file and transform the data manually?
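For context, the Hive route would be something like the following (the table, columns, and output path here are placeholders; the ROW FORMAT clause needs Hive 0.11 or later):

INSERT OVERWRITE DIRECTORY '/tmp/mysql_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT col1, col2, col3
FROM my_table;

and then load the resulting files into MySQL.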

Upvotes: 0

Views: 120

Answers (1)

Navneet Kumar

Reputation: 3752

If your data is already in Hadoop, you can just use Sqoop to move it to the MySQL DB. Using Sqoop directly will be more efficient than adding Hive as an extra layer. Get the MySQL JDBC connector JAR and try something like the command below (angle-bracketed values are placeholders); let me know if you need more help.

bin/sqoop export \
  --connect jdbc:mysql://<mysql-host>/<database> \
  --table <table-name> \
  --username <mysql-user> -P \
  --export-dir <hdfs-export-dir> \
  -m 1
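One caveat if the export directory holds Hive-managed table data: Hive's default field delimiter is \001 (Ctrl-A) rather than a comma, so you would likely need to tell Sqoop about it with --input-fields-terminated-by. A sketch, assuming the table lives under the default Hive warehouse path:

bin/sqoop export \
  --connect jdbc:mysql://<mysql-host>/<database> \
  --table <table-name> \
  --username <mysql-user> -P \
  --export-dir /user/hive/warehouse/<table-name> \
  --input-fields-terminated-by '\001' \
  -m 1

Adjust the --export-dir to wherever your table's files actually live.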

Upvotes: 2
