Reputation: 3574
I keep seeing this error when trying to run Spark; I have tried Spark 2.3.2, 2.3.3, and 2.4.3:
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
    at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
This is invoked on the last line of this block:
lazy val spark: SparkSession = {
  SparkSession
    .builder()
    .appName("SparkProfiler")
    .master("local[*]")
    .config("spark.driver.host", "localhost")
    .getOrCreate()
}
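To confirm which jar the failing class is actually loaded from at runtime, here is a quick diagnostic sketch (not part of the app itself):
// Diagnostic sketch: print the jar that PooledByteBufAllocator resolves to.
// If it points at a Netty jar older than 4.1.x, metric() does not exist,
// which matches the NoSuchMethodError above.
val nettyJar = classOf[io.netty.buffer.PooledByteBufAllocator]
  .getProtectionDomain.getCodeSource.getLocation
println(s"PooledByteBufAllocator loaded from: $nettyJar")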
I have tried the suggestions from a similar thread, but to no avail. For example, in my build.sbt I have these dependency overrides:
dependencyOverrides += "io.netty" % "netty" % "3.9.9.Final" // have tried not including this
dependencyOverrides += "io.netty" % "netty-all" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-buffer" % "3.9.9.Final" // have tried keeping this version to 4.1.8Final
dependencyOverrides += "io.netty" % "netty-codec" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-codec-http" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-common" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-handler" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-resolver" % "4.1.8.Final"
dependencyOverrides += "io.netty" % "netty-transport" % "4.1.8.Final"
When I look in the external libraries, I do see:
sbt: io.netty:netty:3.9.9.Final.jar
sbt: io.netty:netty-all:4.1.8.Final.jar
But I have also tried including in my build.sbt :
excludeDependencies += ExclusionRule("io.netty", "netty")
so that sbt: io.netty:netty:3.9.9.Final.jar is excluded from my external libraries.
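An exclusion can also be scoped to the dependency that drags the old Netty in, rather than applied globally. A sketch, assuming spark-core is the artifact pulling in io.netty:netty 3.x (adjust to whatever your dependency tree actually shows):
// Sketch: drop Netty 3.x only from the dependency that brings it in.
libraryDependencies += ("org.apache.spark" %% "spark-core" % "2.4.3")
  .exclude("io.netty", "netty")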
When I explore the error in IntelliJ and step into the imported NettyMemoryMetrics class, I can see that the import io.netty.buffer.PooledByteBufAllocatorMetric; cannot be resolved. I would have thought that simply keeping netty-all in the dependencies would fix this, but I can't seem to find the right combination for Spark to find this class after the build. Any suggestions?
Upvotes: 0
Views: 559
Reputation: 3574
Finally found the answer in this comment:
https://stackoverflow.com/a/51565332/7082628
All of my Netty versions needed to be 4.1.17.Final instead of 4.1.8.Final, except for the plain 'netty' artifact, which stays at 3.9.9.Final. Added to the build.sbt:
dependencyOverrides += "io.netty" % "netty" % "3.9.9.Final"
dependencyOverrides += "io.netty" % "netty-all" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-buffer" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-codec" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-codec-http" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-common" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-handler" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-resolver" % "4.1.17.Final"
dependencyOverrides += "io.netty" % "netty-transport" % "4.1.17.Final"
Upvotes: 2