Endika Montejo

Reputation: 17

Usage of Junit in Spark shell

I am struggling to make JUnit work in my Spark shell.

When trying to import Assert from JUnit, I get the following error message:

scala> import org.junit.Assert._
<console>:23: error: object junit is not a member of package org
       import org.junit.Assert._

Is there any way to fix this? Any idea how I can download org.junit from the Scala shell?

EDIT: Following the recommendation from zsxwing, I used spark-shell --packages junit:junit:4.12, with the following output:

C:\spark>spark-shell --packages junit:junit:4.12
Ivy Default Cache set to: C:\xxx\.ivy2\cache
The jars for the packages stored in: C:\xxxx\.ivy2\jars
:: loading settings :: url = jar:file:/C:/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
junit#junit added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found junit#junit;4.12 in central
        found org.hamcrest#hamcrest-core;1.3 in central
:: resolution report :: resolve 365ms :: artifacts dl 7ms
        :: modules in use:
        junit#junit;4.12 from central in [default]
        org.hamcrest#hamcrest-core;1.3 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   2   |   0   |   0   |   0   ||   2   |   0   |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 2 already retrieved (0kB/20ms)
Setting default log level to "WARN".
Spark context Web UI available at http://xxxxx
Spark context available as 'sc' (master = local[*], app id = local-xxx).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0
      /_/

However, I am still facing the same issue when trying to import org.junit.Assert._

Upvotes: 0

Views: 242

Answers (1)

zsxwing

Reputation: 20826

JUnit is a test dependency, so it is not included on the Spark shell's classpath. You can use the --packages parameter to add any dependency, for example:

bin/spark-shell --packages junit:junit:4.12
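Once the shell starts with the dependency resolved (as in the Ivy output above), the import from the question should succeed, and the JUnit 4 assertion methods become available directly. A quick check inside the shell might look like this (an illustrative session; assertEquals and assertTrue are standard org.junit.Assert methods):

```
scala> import org.junit.Assert._
import org.junit.Assert._

scala> assertEquals(4, 2 + 2)

scala> assertTrue(Seq(1, 2, 3).contains(2))
```

If the import still fails after --packages reports the artifacts as retrieved, it is worth confirming that no other spark-shell instance or stale session is being reused, since the classpath is only set when the shell is launched.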

Upvotes: 0
