bsky

Reputation: 20242

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps

I have the following class:

import scala.util.{Success, Failure, Try}


class MyClass {

  def openFile(fileName: String): Try[String] = {
    Failure( new Exception("some message"))
  }

  def main(args: Array[String]): Unit = {
    openFile(args.head)
  }

}

Which has the following unit test:

class MyClassTest extends org.scalatest.FunSuite {

  test("pass inexistent file name") {
    val myClass = new MyClass()
    assert(myClass.openFile("./noFile").failed.get.getMessage == "Invalid file name")
  }

}

When I run sbt test I get the following error:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.scalatest.tools.FriendlyParamsTranslator$.translateArguments(FriendlyParamsTranslator.scala:174)
        at org.scalatest.tools.Framework.runner(Framework.scala:918)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:533)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:527)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.immutable.Map$Map1.foreach(Map.scala:109)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at sbt.Defaults$.createTestRunners(Defaults.scala:527)
        at sbt.Defaults$.allTestGroupsTask(Defaults.scala:543)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:35)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:34)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[error] (test:executeTests) java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

Build definitions:

version := "1.0"

scalaVersion := "2.12.0"

// https://mvnrepository.com/artifact/org.scalatest/scalatest_2.11
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.0"

I can't figure out what causes this. My class and unit test seem simple enough. Any ideas?

Upvotes: 63

Views: 122233

Answers (14)

Ankur Pandey

Reputation: 51

For me, the issue was resolved after upgrading the Scala version from 2.11.12 to 2.12.10. Along with this, I also updated the artifactId suffix from 2.11 to 2.12:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.0</version>
</dependency>

Upvotes: 0

J.Hammond

Reputation: 270

For anyone encountering this issue with Scala in AWS Glue 3:

Glue 3 uses Spark 3 and Scala 2.12, and as the other answers indicate, you can't use Scala 2.11 jars on a cluster running Scala 2.12.

So if you were like me and were trying to use an extra jar file compiled with Scala 2.11 (which worked in earlier versions of Glue), you will now get this error and will need to rebuild/change your jar to one built with Scala 2.12 for use with Glue 3.

See the AWS migration recommendations: https://docs.aws.amazon.com/glue/latest/dg/migrating-version-30.html
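For the rebuild, the same %% rule from the accepted answer applies. A minimal build.sbt sketch for a Glue 3 job jar (the exact Spark and Scala patch versions here are illustrative; check the Glue docs for what your Glue version ships):

```scala
// build.sbt — target the Scala 2.12 / Spark 3 runtime that Glue 3 provides
scalaVersion := "2.12.15"

// "provided" because the Glue cluster supplies Spark at run time;
// %% picks the _2.12 artifact automatically to match scalaVersion
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.1.1" % "provided"
```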

Upvotes: 0

Kiran Gali

Reputation: 111

This happens if you use Scala 2.12 libraries with Scala 2.11, or vice versa.

If cleaning up the pom.xml with the right packages didn't work, it is because the jars are already downloaded to your local repository.

Delete the ~/.m2/ folder and reload the packages.
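A more surgical variant of this is to delete only the artifacts carrying the wrong Scala-version suffix instead of the whole ~/.m2. A sketch, demonstrated on a throwaway directory so nothing real is touched; point REPO at "$HOME/.m2/repository" to run it against your actual cache:

```shell
# Simulate a local Maven repository layout in a scratch directory
REPO="${TMPDIR:-/tmp}/m2-demo-repo"
rm -rf "$REPO"
mkdir -p "$REPO/org/scalatest/scalatest_2.11" "$REPO/org/scalatest/scalatest_2.12"

# Artifact directories carry the Scala binary version as a suffix,
# so the mismatched ones can be found by name...
find "$REPO" -type d -name '*_2.11' -print

# ...and removed, leaving the correctly-suffixed artifacts in place
find "$REPO" -type d -name '*_2.11' -prune -exec rm -rf {} +
```

After the purge, re-running mvn (or sbt) re-resolves only the deleted dependencies instead of re-downloading everything.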

Upvotes: 0

Powers

Reputation: 19328

This error occurs when you use a Scala JAR file that was compiled with Scala 2.11 for a Scala 2.12 project.

Scala libraries are generally cross compiled against different versions of Scala, so separate JAR files are published to Maven for each Scala version. For example, ScalaTest version 3.2.3 publishes separate JAR files to Maven for Scala 2.10, 2.11, 2.12, and 2.13, as you can see here.

Lots of Spark programmers will run into this error when they attach a JAR file that was compiled with Scala 2.11 to a cluster that's running Scala 2.12. See here for a detailed guide on how to migrate Spark projects from Scala 2.11 to Scala 2.12.

As the accepted answer mentions, the SBT %% operator should be used when specifying Scala dependencies so that you automatically grab the library JAR that corresponds to your project's Scala version. The %% operator won't help you, though, if the library doesn't publish a JAR file for the Scala version you're looking for. Look at the Spark releases on Maven, for example.

This build.sbt file will work because there is a Scala 2.12 release for Spark 3.0.1:

scalaVersion := "2.12.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

This code will not work because there isn't a Scala 2.11 release for Spark 3.0.1:

scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.1"

You can cross compile your project and build JAR files for different Scala versions if your library dependencies are also cross compiled. Spark 2.4.7 is cross compiled with Scala 2.11 and Scala 2.12, so you can cross compile your project with this code:

scalaVersion := "2.11.12"
crossScalaVersions := Seq("2.11.12", "2.12.10")
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.7"

Running sbt +assembly will build two JAR files for your project: one compiled with Scala 2.11 and another compiled with Scala 2.12. Libraries that release multiple JAR files follow a similar cross-compilation workflow.

Upvotes: 8

voxoid

Reputation: 1299

In my case, I had a project jar dependency which depended on a different version of Scala. This was found under Project Structure -> Modules -> (selected project) -> Dependencies tab. Everything else in the project and its libraries lined up on one Scala version (2.12), but the other jar was hiding a transitive dependency on an older version (2.11).

Upvotes: 0

ibaralf

Reputation: 12528

This was happening to me in Databricks. The problem was the same as noted in previous answers: an incompatibility between the Spark and Scala versions. For Databricks, I had to change the cluster's Databricks Runtime Version. The default was Scala 2.11/Spark 2.4.5; bump this up to at least Scala 2.12/Spark 3.0.0:

Click Clusters > Cluster_Name > Edit > Databricks Runtime Version

Upvotes: 2

louis l

Reputation: 31

In my case, the Spark version was the incompatibility; changing to Spark 2.4.0 worked for me.

Upvotes: 3

MeirDayan

Reputation: 650

When you use Spark, Hadoop, Scala, and Java together, some incompatibilities arise, so you have to use versions of each that are compatible with the others. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12, which are compatible with each other.

Upvotes: 1

Yousef Irman

Reputation: 109

Try adding the following line to your build.sbt:

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

Your build.sbt should look like this:

libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"

With this, the error was solved for me.

Upvotes: 0

Andrew Norman

Reputation: 911

In the Eclipse IDE, the project tends to be preselected with the Scala installation 'Latest 2.12 bundle (dynamic)' configuration. If you are not actually using 2.12 for your Scala project and you attempt to run your project through the IDE, this issue will manifest itself.

I've also noticed that if I rebuild my Eclipse project with the sbt command "eclipse with-source", this has the side effect of resetting the Eclipse project's Scala installation back to the 2.12 setting (even though my build.sbt file is configured for a 2.11 version of Scala). So be on the lookout for both of those scenarios.

Upvotes: 0

Anton Tkachov

Reputation: 751

I had an SDK in Global Libraries with a different version of Scala (IntelliJ IDEA).
File -> Project Structure -> Global Libraries -> Remove SDK -> Rebuild. It fixed the exception for me.

Upvotes: 71

HHH

Reputation: 310

I used IntelliJ and just imported the project again. I mean: close the open project and import it as a Maven or SBT project. Note: for Maven I selected 'Import Maven projects automatically'. The error disappeared.

Upvotes: 15

MyounghoonKim

Reputation: 1090

In my experience, if you still get errors after matching the scalatest version and the Scala version in build.sbt, you have to think about the actual Scala version that is running on your machine. You can check it by running $ scala and looking at the welcome message:

Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121). Type in expressions for evaluation. Or try :help.

You need to match this Scala version (e.g. 2.12.1 here) with the one in build.sbt.

Upvotes: 3

Alexey Romanov

Reputation: 170839

scalatest_2.11 is the version of ScalaTest compatible only with Scala 2.11.x. Write libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test" (note the %%) instead to pick the correct version automatically, and switch to Scala 2.11.8 until scalatest_2.12 is released (it should be very soon). See http://www.scala-sbt.org/0.13/docs/Cross-Build.html for more.
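Putting that together, the build definitions from the question would become something like this (a sketch; the Scala and ScalaTest versions reflect what was current when the question was asked):

```scala
version := "1.0"

// Stay on a 2.11.x compiler until a scalatest_2.12 artifact is published
scalaVersion := "2.11.8"

// %% appends the Scala binary-version suffix (_2.11) to the artifactId,
// so the resolved jar always matches scalaVersion
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
```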

Upvotes: 42
