Reputation: 529
I'm using Java-Spark.
I'm trying to write to an external HDFS directory as follows:
df.write().mode(mode).save("hdfs://myservername:8020/user/path/to/hdfs");
And got this exception:
host details: local host is: ... destination host is: ...
How can I write to an "external" HDFS directory from Spark rather than to the local Hadoop/HDFS?
Thanks
Upvotes: 1
Views: 1814
Reputation: 1912
Check that the HDFS NameNode hostname is resolvable and reachable from the Spark cluster; you can also use the IP address instead:
hdfs://<HDFS_NAMENODE_IP>:8020/user/path/to/hdfs
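For example, you could verify connectivity with the Hadoop FileSystem API before writing (a minimal sketch; <HDFS_NAMENODE_IP> and the directory path are placeholders, and 8020 is the default NameNode RPC port):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConnectivityCheck {
    public static void main(String[] args) throws Exception {
        // Connect directly to the external NameNode (placeholder address).
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://<HDFS_NAMENODE_IP>:8020/"), new Configuration());
        // Confirm the target parent directory is visible before df.write() runs.
        System.out.println(fs.exists(new Path("/user/path/to")));
    }
}

If this fails with the same "local host is: ... destination host is: ..." error, the problem is network/DNS reachability, not Spark itself.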
You can also update the Spark configuration in the application using:
spark.conf.set("fs.defaultFS", "hdfs://<HDFS_NAMENODE_IP>:8020/")
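Since the question uses Java, an equivalent sketch is to set the default filesystem when building the session (the spark.hadoop. prefix forwards the property to the Hadoop Configuration; <HDFS_NAMENODE_IP> is a placeholder):

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder()
        .appName("write-to-external-hdfs")
        // "spark.hadoop.*" properties are passed through to the Hadoop
        // Configuration, so FileSystem calls default to the external cluster.
        .config("spark.hadoop.fs.defaultFS", "hdfs://<HDFS_NAMENODE_IP>:8020/")
        .getOrCreate();

// With the default FS pointed at the external cluster, a plain path works too:
// df.write().mode(mode).save("/user/path/to/hdfs");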
Upvotes: 2