Reputation: 1
I have a table named "analytics" in Hive which has nearly 5 TB of data and more than 10,000 partitions. Now I want to rename the table to analytics_backup, so I used the command
alter table analytics rename to analytics_backup;
It hung in the terminal for 30-45 minutes and then threw an out-of-memory error.
Has anyone noticed this kind of issue, and is there any solution to overcome it? I am using the CDH3 Hadoop/Hive version. Thanks in advance.
Upvotes: 0
Views: 514
Reputation: 18434
You can modify hive-env.sh to increase the heap size of the Hive client; export HADOOP_HEAPSIZE=___ is the setting you want.
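For example, a minimal sketch of the change, assuming hive-env.sh sits in your Hive conf directory and that 2048 MB is enough for your partition count (the value is a guess; tune it as needed):

    # hive-env.sh -- heap size (in MB) for the Hive client JVM.
    # Raise this if the rename still runs out of memory.
    export HADOOP_HEAPSIZE=2048

Start a new Hive session afterwards so the larger heap is picked up before retrying the ALTER TABLE.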
I don't know specifically why, but I've seen memory issues before when dealing with many partitions; some step in the rename is probably trying to load all of the partition info into memory.
Also, if all you want is a backup, it may be easier to do it at the file level and just move the data yourself on HDFS, for example:
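A rough sketch, assuming the table lives under the default warehouse path (check the real location with DESCRIBE EXTENDED analytics first):

    # Copy the table's directory to a backup location on HDFS.
    # -cp leaves the original files in place; use -mv only if you
    # really want to move them out from under the table.
    hadoop fs -cp /user/hive/warehouse/analytics /user/hive/warehouse/analytics_backup

Note that this only copies the files; the metastore won't know about analytics_backup unless you later create a table pointing at that path.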
Upvotes: 1