Reputation: 117
I'm using Ubuntu
When I try to save a DataFrame to HDFS (Spark Scala):
processed.write.format("json").save("hdfs://localhost:54310/mydata/enedis/POC/processed.json")
I get this error:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/mydata/enedis/POC":hadoop_amine:supergroup:drwxr-xr-x
Upvotes: 2
Views: 2556
Reputation: 31540
You are trying to write data as the root
user, but the HDFS directory (/mydata/enedis/POC) is owned by the hadoop_amine
user, and its permissions (drwxr-xr-x) only allow the owner to write.
Change the permissions on the HDFS directory so that the root
user is allowed to write to the /mydata/enedis/POC
directory.
# Log in as the hadoop_amine user, then execute the command below
hdfs dfs -chmod -R 777 /mydata/enedis/POC
(Or)
Initialize the Spark shell as the hadoop_amine
user; then there is no need to change the permissions of the directory.
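One way to do that, assuming the cluster uses Hadoop's default simple authentication (no Kerberos), is to set the HADOOP_USER_NAME environment variable before launching the shell, so Spark talks to HDFS as hadoop_amine instead of root:

```shell
# Make Spark/HDFS clients identify as hadoop_amine (simple auth only).
# This does not work on Kerberos-secured clusters.
export HADOOP_USER_NAME=hadoop_amine
spark-shell
```

With this set, the original processed.write.format("json").save(...) call runs as the directory owner, so no chmod is needed. Note that 777 from the previous option grants write access to every user; a narrower fix such as chown may be preferable in shared environments.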
Upvotes: 2