Reputation: 173
Below is the error message:
Unable to move source hdfs://sandbox-hdp.hortonworks.com:8020/user/maria_dev/DimDepartmentGroup/part-m-00000 to destination hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/dbodimemployee/delta_0000001_0000001_0000: Permission denied: user=hive, access=WRITE, inode="/user/maria_dev/DimDepartmentGroup":maria_dev:hdfs:drwxr-xr-x
I am totally confused. The error message itself shows that maria_dev has write permission on the folder: inode="/user/maria_dev/DimDepartmentGroup":maria_dev:hdfs:drwxr-xr-x
What did I miss?
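(A quick way to decode the permission string: in drwxr-xr-x, the owner maria_dev gets rwx, the group hdfs gets r-x, and everyone else gets r-x. The move is attempted by user=hive, which is neither the owner nor in the group here, so it falls under "others" and has no WRITE bit. You can verify the listing yourself; this is a generic check, not from the original post:
hdfs dfs -ls /user/maria_dev)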
Upvotes: 1
Views: 734
Reputation: 9584
When you run Sqoop with --hive-import, it generally works in two steps: it first loads the data from your external database and stores it as a multi-part file at the given location (--target-dir /goldman/yahoo), then moves it from that location into the Hive table (--hive-table topclient.mpool).
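(As an illustration, after the first step you should see the staging files at the target dir before they are moved into the Hive warehouse; a hypothetical check using the example path:
hadoop fs -ls /goldman/yahoo
You would expect part-m-0000* files there, owned by whichever user ran the Sqoop import.)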
So you can get "access denied" at two levels:
1) If you see access denied at the file location /goldman/yahoo, open up the location's permissions to 777, running as the hdfs superuser: sudo -u hdfs hadoop fs -chmod 777 /goldman/yahoo
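(If you would rather not open the directory to everyone with 777, a narrower alternative, assuming HDFS ACLs are enabled on the cluster (dfs.namenode.acls.enabled=true), is to grant write access to the hive user only:
sudo -u hdfs hdfs dfs -setfacl -m user:hive:rwx /goldman/yahoo
sudo -u hdfs hdfs dfs -getfacl /goldman/yahoo
The second command just prints the ACL so you can confirm the new user:hive:rwx entry.)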
2) If you see access denied while creating the table, run the Sqoop command as the user hive, because the hive user has access to the Hive warehouse, i.e.
sudo -u hive sqoop import \
  --connect 'jdbc:sqlserver://test.goldman-invest.data:1433;databaseName=Investment_Banking' \
  --username user_***_cqe --password ****** \
  --table cases \
  --target-dir /goldman/yahoo \
  --hive-import --create-hive-table \
  --hive-table topclient.mpool
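(To confirm the import actually landed, a quick sanity check, assuming the Hive CLI is available on the box:
sudo -u hive hive -e 'SELECT COUNT(*) FROM topclient.mpool;')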
Upvotes: 1
Reputation: 173
Finally, I got it to work. I logged in as root and switched to the hive user using su - hive. Then I was able to run the Sqoop command successfully. Previously, I had logged in as maria_dev and could not use the su command: I do not have the password for the hive user, because hive is not a regular user in the HDP sandbox.
Still, it seems strange to me that a user needs root access just to load some data into Hive on HDP.
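(For anyone repeating this on the sandbox, the sequence amounts to the following; the SSH port 2222 is the usual sandbox default and may differ in your setup:
ssh -p 2222 root@sandbox-hdp.hortonworks.com
su - hive
Then run your sqoop import command as that user; su - hive asks for no password when you are already root.)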
Upvotes: 0