Reputation: 63062
I have inherited old code that depends on org.apache.spark.LocalSparkContext, which lives in the Spark core tests. But the Spark core jar (correctly) does not include test-only classes. I was unable to determine if/where the Spark test classes have their own Maven artifacts. What is the correct approach here?
Upvotes: 5
Views: 2044
Reputation: 502
If you want to add the test jars, you can declare them in SBT with the tests classifier, as shown below:
version := "0.1"
scalaVersion := "2.11.11"

val sparkVersion = "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-core" % sparkVersion % Test classifier "test-sources",
  "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-sql" % sparkVersion % Test classifier "test-sources",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "tests",
  "org.apache.spark" %% "spark-catalyst" % sparkVersion % Test classifier "test-sources",
  "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0",
  "org.scalatest" %% "scalatest" % "3.0.4" % "test"
)
If you want to do the same with Maven, you can add the dependencies as shown below:
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.parent.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.parent.version}</artifactId>
    <version>${spark.version}</version>
    <classifier>tests</classifier>
    <type>test-jar</type>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.parent.version}</artifactId>
    <version>${spark.version}</version>
    <classifier>test-sources</classifier>
    <type>test-jar</type>
    <scope>test</scope>
  </dependency>
</dependencies>
Upvotes: 0
Reputation: 26560
I came here hoping to find some inspiration for doing the same in SBT. As a reference for other SBT users: applying the test-jar pattern in SBT for Spark 2.0 results in:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0" classifier "tests"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.0.0" classifier "tests"
Upvotes: 1
Reputation: 137084
You can add a dependency on Spark's test-jar by adding <type>test-jar</type> to the dependency declaration. For example, for Spark 1.5.1 built against Scala 2.11:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.5.1</version>
  <type>test-jar</type>
  <scope>test</scope>
</dependency>
This dependency provides all the test classes of Spark, including LocalSparkContext.
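As a quick illustration of what that buys you, here is a sketch of a test that uses the withSpark helper from LocalSparkContext's companion object (assuming it keeps the shape it has in the Spark sources, i.e. run a function against a SparkContext and stop the context afterwards, and that ScalaTest is also on your test classpath):

import org.apache.spark.{LocalSparkContext, SparkConf, SparkContext}
import org.scalatest.FunSuite

// Sketch only: withSpark runs the body and then stops the SparkContext,
// so nothing is left running between tests.
class SmokeSuite extends FunSuite {
  test("sum of 1 to 10") {
    val conf = new SparkConf().setMaster("local[2]").setAppName("SmokeSuite")
    val total = LocalSparkContext.withSpark(new SparkContext(conf)) { sc =>
      sc.parallelize(1 to 10).sum()
    }
    assert(total == 55.0)
  }
}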
Upvotes: 6