clay

Reputation: 20440

Flink 1.12.3 upgrade triggers `NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps'`

When I upgrade my Flink Java app from 1.12.2 to 1.12.3, I get a new runtime error. I can strip my Flink app down to this two-liner:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class TableEnvOnly {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment streamEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(streamEnv);
    }
}

This works and doesn't trigger any errors with Flink version 1.12.2. When I upgrade the Flink dependencies to 1.12.3, the same simple app throws the error:

Exception in thread "main" java.lang.NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps(java.lang.Object[])'
        at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:118)
        at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:47)
        at org.apache.flink.table.planner.delegation.BlinkPlannerFactory.create(BlinkPlannerFactory.java:48)
        at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.create(StreamTableEnvironmentImpl.java:143)
        at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:113)
        at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:85)
        at simple.TableEnvOnly.main(TableEnvOnly.java:12)

FYI, I'm not using Scala directly. My Gradle dependencies are:

    implementation("org.apache.flink:flink-table-planner-blink_2.12:1.12.3")
    implementation("org.apache.flink:flink-clients_2.12:1.12.3")
    implementation("org.apache.flink:flink-connector-kafka_2.12:1.12.3")
    implementation("org.apache.flink:flink-connector-jdbc_2.12:1.12.3")
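Since all of the artifacts above are `_2.12` builds, a version mismatch isn't obvious from the coordinates alone. One way to see which `scala-library` jar actually ends up on the runtime classpath is a small reflective check (the class name `ScalaJarLocator` is just for illustration; it degrades gracefully when scala-library is absent):

```java
import java.security.CodeSource;

// Hypothetical helper: prints the location of the jar that provides
// scala.Predef, i.e. the scala-library actually resolved at runtime.
public class ScalaJarLocator {
    public static void main(String[] args) {
        try {
            Class<?> predef = Class.forName("scala.Predef");
            CodeSource src = predef.getProtectionDomain().getCodeSource();
            System.out.println(src != null ? src.getLocation() : "unknown location");
        } catch (ClassNotFoundException e) {
            System.out.println("scala-library is not on the classpath");
        }
    }
}
```

If the printed jar's version differs from the one your Flink artifacts were compiled against, that would explain a `NoSuchMethodError` at runtime.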

Upvotes: 0

Views: 454

Answers (1)

tashoyan

Reputation: 428

TL;DR: After upgrading to Flink 1.12.4, the problem magically disappears.

Details

After upgrading from Flink 1.12.2 to Flink 1.12.3, the following code stopped compiling:

import scala.collection.JavaConverters._
import org.apache.flink.streaming.api.scala.DataStream

// env is a Java StreamExecutionEnvironment
val input = new DataStream[String](env.fromCollection(Seq("a", "b", "c").asJava))
val res = input.map(_.toUpperCase)

The Scala compiler reports the error:

could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[String]

The version of scala-compiler and scala-library is 2.12.7, exactly the version Flink uses.

To overcome the compilation problem, we provide an implicit instance of TypeInformation:

import org.apache.flink.api.common.typeinfo.TypeInformation

implicit val typeInfo: TypeInformation[String] = TypeInformation.of(classOf[String])

Then the code compiles. Nevertheless, at runtime we hit the same failure described in the question:

  java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
  at org.apache.flink.api.scala.ClosureCleaner$.getSerializedLambda(ClosureCleaner.scala:184)
  at org.apache.flink.api.scala.ClosureCleaner$.org$apache$flink$api$scala$ClosureCleaner$$clean(ClosureCleaner.scala:257)
  at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:168)
  at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.scalaClean(StreamExecutionEnvironment.scala:859)
  at org.apache.flink.streaming.api.scala.DataStream.clean(DataStream.scala:1189)
  at org.apache.flink.streaming.api.scala.DataStream.map(DataStream.scala:623)

As mentioned, upgrading to Flink 1.12.4 helps: both the compilation and the runtime failures disappear.

My guess is that some Flink 1.12.3 jars were accidentally compiled with the wrong Scala version, and the subsequent 1.12.4 release was compiled with the correct one.
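This guess can be checked directly: a `NoSuchMethodError` like this means the bytecode was compiled against one `scala-library` binary and is running against another. A small reflective sketch (the class name `ScalaSignatureCheck` is hypothetical) lists the `refArrayOps` overloads that the scala-library on the classpath actually provides, for comparison against the signature named in the error:

```java
import java.lang.reflect.Method;

// Prints every refArrayOps overload found on scala.Predef$ so its
// signature can be compared with the one the error message expects.
public class ScalaSignatureCheck {
    public static void main(String[] args) {
        try {
            Class<?> predef = Class.forName("scala.Predef$");
            for (Method m : predef.getMethods()) {
                if (m.getName().equals("refArrayOps")) {
                    System.out.println(m);
                }
            }
        } catch (ClassNotFoundException e) {
            System.out.println("scala-library is not on the classpath");
        }
    }
}
```

Run against the Flink 1.12.3 classpath, the printed signature would reveal whether the scala-library there matches the one the planner jars were compiled with.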

Upvotes: 1
