Reputation: 43
I have a MySQL table called dept which I load into a Hive EXTERNAL table dept as an incremental append import (check-column / last-value) based on a timestamp column ts. This is working fine, and now I want to add dynamic partitioning to my Hive table dept based on the dept_name and ts columns.
I know that I can create the dynamically partitioned table outside Sqoop, but I need help loading data into the dynamic partitions. I saw one example using --hive-partition-key and --hive-partition-value, but there the value is hard-coded. Without hard-coding the hive-partition-value, can I load data automatically into the corresponding partitions using Sqoop, e.g. (dept1,timestamp1 / dept1,timestamp2, etc.) and (dept2,timestamp1 / dept2,timestamp2, etc.)?
Upvotes: 0
Views: 1903
Reputation: 2469
Yes, you can load the data from the external table you have created into the dynamically partitioned table without hard-coding the partition values.
Assuming that you already have your external table externalDept populated with the values, and that you have created your partitioned table dept, you could use something like this:
INSERT OVERWRITE TABLE dept
PARTITION (dept_name, ts)
SELECT ..., ed.dept_name, ed.ts
FROM externalDept ed
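One caveat worth adding: an insert where every partition column is dynamic, as above, requires Hive's dynamic partitioning to be enabled in non-strict mode first. These are standard Hive session properties (names and defaults may vary slightly by Hive version):

```sql
-- Allow dynamic partition inserts at all
SET hive.exec.dynamic.partition = true;
-- "nonstrict" lets every partition column be dynamic; the default "strict"
-- mode requires at least one static partition key in the PARTITION clause
SET hive.exec.dynamic.partition.mode = nonstrict;
```

Run these in the same session before the INSERT OVERWRITE, otherwise Hive will reject the fully dynamic PARTITION (dept_name, ts) clause in strict mode.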
It's not part of your question, but you can also mix static and dynamic partitions. For instance, if you wanted a dynamic value for ts but a static value for dept_name, you could do something like this (note that a statically valued partition column is omitted from the SELECT list, and static keys must precede dynamic ones in the PARTITION clause):
INSERT OVERWRITE TABLE dept
PARTITION (dept_name = 'Math', ts)
SELECT ..., ed.ts
FROM externalDept ed
WHERE ed.dept_name = 'Math'
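After either insert, you can check which partitions Hive actually created; SHOW PARTITIONS is standard HiveQL (the partition values shown below are just an illustration, the real ones depend on your data):

```sql
-- List all partitions of the table, one line per partition
SHOW PARTITIONS dept;
-- sample output line: dept_name=Math/ts=2016-01-01
```

This is a quick way to verify that each distinct (dept_name, ts) pair in externalDept ended up as its own partition directory.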
Upvotes: 0