Reputation: 25
I need your help with hadoop-minicluster.
I'm working with Scala (with sbt) and I'm trying to mock calls to HDFS. I came across hadoop-minicluster, which can deploy a small cluster to test against.
However, when I add the sbt dependency:
libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "3.1.0" % Test
The sources are not added and I can't import the class org.apache.hadoop.hdfs.MiniDFSCluster.
Do you know how I can solve this problem?
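For context, here is roughly the kind of test I want to write once the import resolves (just a sketch, and the paths are placeholders):

import java.nio.file.Files
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hdfs.MiniDFSCluster

// Start a single-node, in-memory HDFS, write a file to it, then shut it down.
val baseDir = Files.createTempDirectory("minidfs").toFile
val conf = new Configuration()
conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath)
val cluster = new MiniDFSCluster.Builder(conf).build()
try {
  val fs = cluster.getFileSystem // a DistributedFileSystem backed by the mini cluster
  val out = fs.create(new Path("/test/hello.txt"))
  out.writeUTF("hello")
  out.close()
  assert(fs.exists(new Path("/test/hello.txt")))
} finally {
  cluster.shutdown()
}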
Thank you for your answers.
Upvotes: 1
Views: 3377
Reputation: 25
Thank you very much for your answer.
So to get the test files and the source files (for example DistributedFileSystem), I use these lines in my sbt file:
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "3.1.0" % Test classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "3.1.0" % Test classifier "tests"
hadoop-common was needed for the code to compile.
However, I have another problem when I run my tests:
An exception or error caused a run to abort: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
I saw that this is about HADOOP_HOME and the path; I set it, but nothing changes.
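In case it helps, here is what I understood the Windows workaround to be, as a rough sketch (C:\hadoop is only a placeholder for wherever winutils.exe and hadoop.dll are unpacked):

// Point hadoop.home.dir at a directory containing bin\winutils.exe,
// before any Hadoop class is loaded (e.g. at the top of the test suite).
System.setProperty("hadoop.home.dir", "C:\\hadoop")
// hadoop.dll must also be resolvable by the JVM, e.g. by adding C:\hadoop\bin
// to PATH (or to java.library.path) before starting sbt.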
Upvotes: 1
Reputation: 78
Surprisingly, MiniDFSCluster is not in hadoop-minicluster. Try
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % HADOOP_VERSION classifier "tests"
You may also have to exclude some components, such as
"org.apache.hadoop" % "hadoop-hdfs" % HADOOP_VERSION classifier "tests" exclude ("javax.servlet", "servlet-api")
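For reference, in build.sbt that could look something like this (HADOOP_VERSION is just whatever version you target, e.g. the 3.1.0 from your question):

val HADOOP_VERSION = "3.1.0" // the version from the question; adjust to yours

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-minicluster" % HADOOP_VERSION % Test,
  // MiniDFSCluster actually ships in the hdfs "tests" artifact
  ("org.apache.hadoop" % "hadoop-hdfs" % HADOOP_VERSION % Test classifier "tests")
    .exclude("javax.servlet", "servlet-api")
)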
Upvotes: 3