Reputation: 21
I am trying to write unit tests for Spark code. I know we can install Spark and then use SparkConf and SparkContext to write tests.
However, I wanted to check whether there is any way to write unit tests without installing Spark, as my client doesn't want Spark installed on the Jenkins server where we intend to run our tests as part of an automated process.
Upvotes: 2
Views: 2315
Reputation: 5782
You can set up Spark to run in local mode directly from code:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName(appName).setMaster("local")
val context = new SparkContext(conf)
Then you can use the context to create RDDs of your test data directly from in-memory collections, for example with context.makeRDD.
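
As a rough sketch, a full test could look like the following. It assumes ScalaTest 3.x (AnyFunSuite) and the spark-core dependency are pulled in by your build tool as test dependencies; no standalone Spark installation is required for local mode, which is why this can run on a plain Jenkins agent. The WordCountSpec name and the sample data are purely illustrative.

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.funsuite.AnyFunSuite

class WordCountSpec extends AnyFunSuite {

  test("counts words in a small in-memory dataset") {
    // "local" master runs Spark in-process, no cluster or install needed
    val conf = new SparkConf().setAppName("word-count-test").setMaster("local")
    val context = new SparkContext(conf)
    try {
      // makeRDD builds an RDD straight from an in-memory collection
      val lines = context.makeRDD(Seq("a b", "b c", "c"))
      val counts = lines
        .flatMap(_.split(" "))
        .map(word => (word, 1))
        .reduceByKey(_ + _)
        .collectAsMap()

      assert(counts("b") == 2)
    } finally {
      // always stop the context so subsequent tests can create their own
      context.stop()
    }
  }
}

Stopping the context in a finally block (or in an afterAll/afterEach hook) matters because only one SparkContext can be active per JVM.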
Upvotes: 2