Reputation: 77
I have to copy a whole MySQL table into Hive, and I cannot use Sqoop.
I found a way: fetch the whole dataset from MySQL and copy it into Hive row by row (looping over the result set)...
but it is a very slow method (50k rows take hours on my cluster).
Is there a way to do something like "*insert into hive select * from mysql*"?
Thank you, Marco
Upvotes: 0
Views: 234
Reputation: 1642
Why don't you use the `mysql` command-line client to execute the query and dump the result into a file, and then put that file into HDFS? (Note: `sqlplus` is Oracle's client and won't connect to MySQL; the `mysql` CLI is the equivalent here.)
mysql --batch --skip-column-names -u user -p'password' dbname <<EOF > file_name.txt
select * from table_name;
EOF
`--batch` produces tab-separated output and `--skip-column-names` drops the header row. Once your data is in file_name.txt, you can put it directly into HDFS:
hadoop fs -put file_name.txt /myhdfsfolder/
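To finish the round trip, the file still has to be registered in Hive. A minimal sketch, assuming a hypothetical target table `my_table` whose columns match your MySQL schema and that the dump is tab-separated (as `mysql --batch` produces):

-- Hypothetical table; adjust the column list and types to your real schema.
CREATE TABLE IF NOT EXISTS my_table (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

-- Moves the HDFS file into the table's warehouse directory (the source file is consumed).
LOAD DATA INPATH '/myhdfsfolder/file_name.txt' INTO TABLE my_table;

Because `LOAD DATA INPATH` moves rather than copies the file, keep a copy in HDFS if you need the raw export afterwards.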
Upvotes: 1