Reputation: 8132
I created a Spark job which requires a newer version of commons-compress to function. I added this to the pom.xml:
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-compress</artifactId>
  <version>1.20</version>
</dependency>
And I am using the maven-shade-plugin to bake the library into my resulting jar.
However, when I want to use things from this library, I still get:
java.lang.NoSuchMethodError: org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(Ljava/nio/channels/SeekableByteChannel;)V
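For reference, the failing call is along these lines (a minimal sketch; the class name and archive path are placeholders):

import java.nio.channels.SeekableByteChannel;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry;
import org.apache.commons.compress.archivers.sevenz.SevenZFile;

public class SevenZExample {
    public static void main(String[] args) throws Exception {
        // The SeekableByteChannel constructor only exists in newer
        // commons-compress versions, which is why an older copy on the
        // classpath produces the NoSuchMethodError above.
        SeekableByteChannel channel = Files.newByteChannel(Paths.get("archive.7z"));
        try (SevenZFile sevenZFile = new SevenZFile(channel)) {
            SevenZArchiveEntry entry;
            while ((entry = sevenZFile.getNextEntry()) != null) {
                System.out.println(entry.getName());
            }
        }
    }
}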
It seems the job picks up the (older) commons-compress that ships with Spark and ignores the one I baked into my jar.
How can I tell Spark to use my more modern version instead?
Upvotes: 0
Views: 238
Reputation: 2495
In this type of scenario, I've had good luck using the shade plugin to relocate the classes from the newer jar.
What effectively happens is that the newer version's classes live under a different package name, and the shade plugin automatically rewrites all references to them in your bytecode. It's worked quite well for me in the past to avoid dependency version conflicts.
Modify your maven-shade-plugin configuration with a relocation similar to the one below. You will also have to make sure that the newer version of the jar is embedded (it sounds like you've done this already).
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <relocation>
                <pattern>org.apache.commons</pattern>
                <shadedPattern>org.shaded.apache.commons</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
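Note that your own source code keeps importing org.apache.commons.compress as before; the relocation rewrites those references in the packaged bytecode, so at runtime your job resolves the relocated 1.20 classes instead of Spark's older copy. After running mvn package, you can verify this by listing the shaded jar's contents (for example with jar tf; the jar name will be whatever your build produces) and checking that the commons-compress classes now sit under org/shaded/apache/commons/.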
Upvotes: 2