Reputation: 1124
I am trying to write in Hadoop HDFS, using this line of code:
Files.write(Paths.get("hdfs:////localhost:9000/user/cloudera/trial/" + "cat.txt"), "miao miao!".getBytes());
The Spark Application gives me this exception:
java.nio.file.NoSuchFileException: hdfs:/quickstart.cloudera:9000/user/cloudera/trial/cat2
I interpret this as an error caused by the path ending up with only one slash after "hdfs:".
I remember having used the java.nio.Files methods to write to HDFS before, so I would rule that out as the problem.
What should I do to prevent that exception?
EDIT: The import section
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
Upvotes: 0
Views: 1169
Reputation: 1496
No, you cannot use java.nio.Files to write to HDFS. The java.nio classes know nothing about the NameNode and DataNodes of a Hadoop cluster; you need to use the Hadoop client libraries to communicate with HDFS.
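Incidentally, the single slash in the exception message comes from java.nio itself: Paths.get(String) resolves the string against the default (local) file system, so "hdfs://" is just ordinary path text rather than a URI scheme, and repeated separators are collapsed. A quick sketch (assuming a Unix-like platform, as on the Cloudera VM):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class SlashDemo {
    public static void main(String[] args) {
        // Paths.get(String) treats this as a plain local path;
        // "hdfs:" is not recognized as a scheme.
        Path p = Paths.get("hdfs:////localhost:9000/user/cloudera/trial/cat.txt");
        // Duplicate separators are collapsed at construction, which is why
        // the exception message shows a single slash after "hdfs:".
        System.out.println(p);
    }
}
```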
Here is an example of writing to HDFS using Java:
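A minimal sketch using the Hadoop FileSystem API (from the `hadoop-client` dependency); the NameNode host/port and the target path are placeholders taken from the question and should be adjusted for your cluster:

```java
import java.net.URI;
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the NameNode; adjust host/port for your cluster.
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
        Path file = new Path("/user/cloudera/trial/cat.txt");
        // create(path, true) overwrites the file if it already exists.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("miao miao!".getBytes(StandardCharsets.UTF_8));
        }
        fs.close();
    }
}
```

Note that the path passed to org.apache.hadoop.fs.Path has no "hdfs://" prefix here; the file system instance obtained from FileSystem.get already carries the scheme and authority.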
Upvotes: 1