Reputation: 23
I am trying to run the wordcount operation using Hadoop. Hadoop is configured, and I can see the datanode, namenode, resourcemanager, and nodemanager running. I am using Hadoop version 3.4.0 and Java version 8. However, when I run this command:
C:\hadoop\sbin>hadoop jar C:/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.4.0.jar wordcount /input /output_dir
This is the error I get:
Exception in thread "main" java.lang.UnsupportedOperationException: 'posix:permissions' not supported as initial attribute
at sun.nio.fs.WindowsSecurityDescriptor.fromAttribute(WindowsSecurityDescriptor.java:358)
at sun.nio.fs.WindowsFileSystemProvider.createDirectory(WindowsFileSystemProvider.java:492)
at java.nio.file.Files.createDirectory(Files.java:674)
at java.nio.file.TempFileHelper.create(TempFileHelper.java:136)
at java.nio.file.TempFileHelper.createTempDirectory(TempFileHelper.java:173)
at java.nio.file.Files.createTempDirectory(Files.java:950)
at org.apache.hadoop.util.RunJar.run(RunJar.java:296)
at org.apache.hadoop.util.RunJar.main(RunJar.java:245)
Upvotes: 0
Views: 366
Reputation: 11
I was facing the same issue and solved it by downgrading Hadoop to 3.3.6. I think 3.4.0 is not compatible with Windows.
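If you want to confirm which version actually ends up on your classpath after the downgrade, hadoop version prints it from the command line; a small Java check does the same (a minimal sketch, the class name is mine, and it assumes hadoop-common is on the classpath):

import org.apache.hadoop.util.VersionInfo;

public class PrintHadoopVersion {
    public static void main(String[] args) {
        // VersionInfo reports the Hadoop build found on the classpath, e.g. 3.3.6
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
    }
}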
Upvotes: 1
Reputation: 191743
You're getting an internal Java error starting here, so it's not related to Hadoop itself:
at java.nio.file.Files.createTempDirectory(Files.java:950)
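You can reproduce it without Hadoop at all. Here is a minimal sketch (the class name is mine) that asks NIO for a temp directory with POSIX permissions as an initial attribute, which is exactly what the Windows file system provider rejects:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermissions;

public class PosixAttrRepro {
    public static void main(String[] args) throws IOException {
        // 'posix:permissions' is not a supported initial attribute on the
        // Windows file system provider, so this call throws
        // UnsupportedOperationException there.
        FileAttribute<?> attr = PosixFilePermissions.asFileAttribute(
                PosixFilePermissions.fromString("rwx------"));
        Files.createTempDirectory("repro", attr);
    }
}

Run on Windows, this fails with the same 'posix:permissions' not supported as initial attribute message; on Linux it succeeds.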
Note - recent Hadoop releases support Java 11 as well as Java 8, so a newer JDK is worth trying. And if that doesn't work, I'd suggest using a proper Linux environment such as WSL2 or a VM rather than trying to run Hadoop on Windows, since it isn't designed to run there.
Additionally, not many people use plain MapReduce anymore, since Flink and Spark are much better processing engines.
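For comparison, word count on the same paths in Spark's Java API is only a few lines. This is a sketch, not a drop-in replacement: it assumes a working Spark installation and the spark-core dependency, and that you package it and run it with spark-submit; the /input and /output_dir paths are taken from your command.

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class SparkWordCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkWordCount");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Split each line on whitespace, pair each word with 1, then sum the counts per word
            JavaRDD<String> lines = sc.textFile("/input");
            JavaPairRDD<String, Integer> counts = lines
                    .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                    .mapToPair(word -> new Tuple2<>(word, 1))
                    .reduceByKey(Integer::sum);
            counts.saveAsTextFile("/output_dir");
        }
    }
}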
Upvotes: 1