Reputation: 113
I'm attempting to do unit testing with ScalaTest, using the HBase testing utility to test development code locally. The struggle right now is setting up the HBase testing utility in sbt. When I compile, I get the following error:
[warn] module not found: org.apache.hbase#${compat.module};1.2.1
[warn] ==== local: tried
[warn] /root/.ivy2/local/org.apache.hbase/${compat.module}/1.2.1/ivys/ivy.xml
[warn] ==== public: tried
[warn] https://repo1.maven.org/maven2/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== Akka Repository: tried
[warn] http://repo.akka.io/releases/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== scala-tools: tried
[warn] https://oss.sonatype.org/content/groups/scala-tools/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== cloudera-repos: tried
[warn] https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ==== Sonatype OSS Snapshots: tried
[warn] https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.hbase#${compat.module};1.2.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.apache.hbase:${compat.module}:1.2.1
[warn] +- org.apache.hbase:hbase-testing-util:1.2.1 (/workspace/spark/etl/built.sbt#L30-62)
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.hbase#${compat.module};1.2.1: not found
[error] Total time: 32 s, completed Apr 29, 2016 9:25:27 AM
My build.sbt file is as follows:
val hbaseVersion = "1.2.1"
val sparkVersion = "1.6.1"
val hadoopVersion = "2.7.1"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"org.apache.spark" %% "spark-streaming-kafka" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-mllib" % sparkVersion ,
"org.apache.hbase" % "hbase" % hbaseVersion,
"org.apache.hbase" % "hbase-server" % hbaseVersion,
"org.apache.hbase" % "hbase-server" % hbaseVersion classifier "tests",
"org.apache.hbase" % "hbase-client" % hbaseVersion,
"org.apache.hbase" % "hbase-common" % hbaseVersion,
"org.apache.hbase" % "hbase-common" % hbaseVersion classifier "tests",
"org.apache.hbase" % "hbase-annotations" % hbaseVersion,
"org.apache.hbase" % "hbase-testing-util" % hbaseVersion % "test",
"org.apache.hadoop" % "hadoop-minicluster" % hadoopVersion,
"org.apache.hadoop" % "hadoop-mapreduce-client-jobclient" % hadoopVersion classifier "tests",
"org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion,
"org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion classifier "tests",
"org.apache.hbase" % "hbase-hadoop-compat" % hbaseVersion,
"org.apache.hbase" % "hbase-hadoop-compat" % hbaseVersion classifier "tests",
"org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion,
"org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion classifier "tests",
"org.apache.hadoop" % "hadoop-common" % hadoopVersion,
"org.apache.hadoop" % "hadoop-common" % hadoopVersion classifier "tests",
"org.apache.hadoop" % "hadoop-annotations" % hadoopVersion,
"org.scalatest" %% "scalatest" % "2.2.6" % "test" ,
//"org.scalacheck" %% "scalacheck" % "1.12.5" % "test",
"com.cloudera.sparkts" % "sparkts" % "0.3.0",
"com.ecwid.consul" % "consul-api" % "1.1.9",
"joda-time" % "joda-time" % "2.7"
)
resolvers ++= Seq(
"Akka Repository" at "http://repo.akka.io/releases/",
"scala-tools" at "https://oss.sonatype.org/content/groups/scala-tools",
"cloudera-repos" at "https://repository.cloudera.com/artifactory/cloudera-repos/",
"Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
)
Does anyone understand why this failure is occurring?
Upvotes: 3
Views: 1287
Reputation: 113
Sorry for the delayed response. I couldn't get it to work as-is, so I changed the versions like so:
val sparkVersion = "1.6.1"
val hbaseVersion = "1.2.0-cdh5.7.0"
val hadoopVersion = "2.6.0-cdh5.7.0"
This led to more headaches. I had to pin Guava to an earlier version because of a reference to an older library method, so this is required:
"com.google.guava" % "guava" % "14.0" force()
(I think anything up to version 16.0 is fine.)
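For context, here is a minimal sketch of how that pin could look in the build.sbt shown above (14.0 is simply what worked for me; the dependencyOverrides alternative is an assumption I haven't used here):
// Pin Guava so the HBase minicluster finds the older method it expects.
// I believe anything up to 16.0 works.
libraryDependencies += "com.google.guava" % "guava" % "14.0" force()

// Alternative (untested assumption): pin the version via an override instead of force()
// dependencyOverrides += "com.google.guava" % "guava" % "14.0"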
I also had to comment out the following:
// "com.cloudera" % "spark-hbase" % "0.0.2-clabs",
(this wasn't in the original question)
Finally, it looks like the original problem is a bug that needed to be resolved; see the issue below, thanks to a reference from David Portabella:
https://issues.apache.org/jira/browse/HBASE-15925
It was fixed in the 1.2.2 release.
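So if you are using the stock Apache artifacts rather than the CDH ones, a simpler fix should just be bumping the HBase version past that release. I haven't re-tested this path myself, so treat it as a sketch:
// Sketch: 1.2.2 contains the HBASE-15925 fix, so the ${compat.module}
// property in hbase-testing-util's parent POM resolves correctly.
val hbaseVersion = "1.2.2"
libraryDependencies += "org.apache.hbase" % "hbase-testing-util" % hbaseVersion % "test"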
Upvotes: 4