Reputation: 176
I use the aspectj-maven-plugin in order to weave my aspect into the existing dependencies of my project (in my case, the org.apache.spark jars). Then I use the maven-assembly-plugin to generate a standalone jar containing all the dependencies.
The AspectJ plugin seems to correctly weave the aspect into the external jars. However, when I run the jar with dependencies created by the Maven Assembly Plugin, the aspects are not called for the external jars (but they work perfectly for the pointcuts in my own code).
I suspect that the Maven Assembly Plugin does not use the woven jars when creating the jar with dependencies. But I have no idea where the AspectJ plugin stores the woven jars and how to use them instead of the original ones.
Here is my Maven configuration:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>be.example.aspectj</groupId>
  <artifactId>minimal</artifactId>
  <version>1.0-SNAPSHOT</version>
  <name>minimal</name>
  <!-- FIXME change it to the project's website -->
  <url>http://www.example.com</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.aspectj/aspectjrt -->
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjrt</artifactId>
      <version>1.9.7</version>
      <scope>runtime</scope>
    </dependency>
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjweaver</artifactId>
      <version>1.9.7</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.12</artifactId>
      <version>3.1.2</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.12</artifactId>
      <version>3.1.2</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/mysql/mysql-connector-java -->
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>8.0.26</version>
    </dependency>
  </dependencies>
  <build>
    <!-- lock down plugins versions to avoid using Maven defaults (may be moved to parent pom) -->
    <plugins>
      <!-- Maven assembly plugin -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.3.0</version>
        <configuration>
          <!-- get all project dependencies -->
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
          <!-- MainClass in mainfest make a executable jar -->
          <archive>
            <manifest>
              <mainClass>be.example.aspectj.App</mainClass>
            </manifest>
          </archive>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <!-- Deactivating the default maven compiler plugin -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <executions>
          <execution>
            <id>default-compile</id>
            <phase>none</phase>
          </execution>
        </executions>
      </plugin>
      <!-- Aspectj configuration -->
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>aspectj-maven-plugin</artifactId>
        <version>1.14.0</version>
        <configuration>
          <complianceLevel>${maven.compiler.source}</complianceLevel>
          <source>${maven.compiler.source}</source>
          <target>${maven.compiler.source}</target>
          <showWeaveInfo>true</showWeaveInfo>
          <verbose>true</verbose>
          <Xlint>ignore</Xlint>
          <encoding>UTF-8 </encoding>
          <archive>
            <manifest>
              <addClasspath>true</addClasspath>
              <classpathPrefix>lib/</classpathPrefix>
              <mainClass>be.example.aspectj.App</mainClass>
            </manifest>
          </archive>
          <!-- Weaved dependencies -->
          <weaveDependencies>
            <weaveDependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-sql_2.12</artifactId>
            </weaveDependency>
            <weaveDependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-core_2.12</artifactId>
            </weaveDependency>
          </weaveDependencies>
        </configuration>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
And here is the output of the aspectj-maven-plugin. It seems to indicate that the weaving was done correctly, but the warnings seem to indicate that there is a conflict between several versions of the same jar:
[INFO] --- aspectj-maven-plugin:1.14.0:compile (default) @ minimal ---
[INFO] Showing AJC message detail for messages of types: [error, warning, fail]
[INFO] Join point 'method-call(java.sql.Connection java.sql.DriverManager.getConnection(java.lang.String, java.lang.String, java.lang.String))' in Type 'org.sparkproject.jetty.security.JDBCLoginService' (JDBCLoginService.java:183) advised by before advice from 'DriverManagerAspect' (DriverManagerAspect.aj:8)
[INFO] Join point 'method-call(java.sql.Connection java.sql.DriverManager.getConnection(java.lang.String))' in Type 'org.sparkproject.jetty.server.session.DatabaseAdaptor' (DatabaseAdaptor.java:305) advised by before advice from 'DriverManagerAspect' (DriverManagerAspect.aj:8)
[INFO] Join point 'method-call(boolean be.example.aspectj.Account.withdraw(int))' in Type 'be.example.aspectj.App' (App.java:19) advised by before advice from 'AccountAspect' (AccountAspect.aj:9)
[INFO] Join point 'method-call(java.sql.Connection java.sql.DriverManager.getConnection(java.lang.String))' in Type 'be.example.aspectj.App' (App.java:36) advised by before advice from 'DriverManagerAspect' (DriverManagerAspect.aj:8)
[WARNING] duplicate resource: 'META-INF/NOTICE'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/LICENSE'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/DEPENDENCIES'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/services/org.apache.spark.deploy.history.EventFilterBuilder'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/maven/org.spark-project.spark/unused/pom.xml'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
[WARNING] duplicate resource: 'META-INF/maven/org.spark-project.spark/unused/pom.properties'
/home/jeromefink/.m2/repository/org/apache/spark/spark-core_2.12/3.1.2/spark-core_2.12-3.1.2.jar:0
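For context, the DriverManagerAspect mentioned in the log is essentially a before advice on calls to DriverManager.getConnection(..). This is only a simplified sketch of its shape; the exact aspect is in the repository linked below:
public aspect DriverManagerAspect {
    // Log every call to DriverManager.getConnection(..), wherever it occurs
    before() : call(java.sql.Connection java.sql.DriverManager.getConnection(..)) {
        System.out.println("[DriverManagerAspect] getConnection called at " + thisJoinPoint.getSourceLocation());
    }
}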
My full minimal example is available in this GitHub repository. Do you have any idea how to create a jar with dependencies that contains the dependency jars woven by AspectJ? Or do you know where the aspectj-maven-plugin stores the jars it has woven?
Thank you for reading.
Upvotes: 1
Views: 889
Reputation: 67437
Your MCVE on GitHub is very helpful, thanks for that. Other people should follow your example in this regard, especially if they have complex problems like you do.
Regarding AspectJ binary weaving, it works as follows: all weave dependencies (i.e. everything on ajc's inpath) are written to the compiler's output directory. There is no option to filter or exclude anything, i.e. woven Java classes as well as unwoven ones and all resource files are written.
In your particular case, you have two weave dependencies which happen to contain identically named resource files, hence the compiler warnings. The simplest way to avoid this in your case would be to only weave spark-core_2.12, because according to the compiler output no classes from spark-sql_2.12 are weaving targets. But this only works in your specific case. Assuming that both weave dependencies contain target classes to be woven, you can
- either ignore the warnings, because most of the conflicting files are just metadata files that are unnecessary at runtime. The two EventFilterBuilder service descriptor files, however, have conflicting contents (I checked by comparing them manually); in that case, one file gets overwritten by the other one in your target/classes directory. Whether ignoring this problem works for you strongly depends on your use case,
- or handle the situation by providing a more detailed Maven Assembly descriptor file, telling it exactly which files to include and which to ignore. Maybe you even need to merge the two service provider files (see the sketch right after this list).
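In case you really need to merge them, the Maven Assembly Plugin offers a container descriptor handler named metaInf-services which concatenates identically named META-INF/services files instead of letting one overwrite the other. It is not part of my diff further below, just a minimal sketch of what you could add to a custom assembly descriptor if you need it:
<containerDescriptorHandlers>
  <containerDescriptorHandler>
    <!-- aggregate META-INF/services files with the same name instead of overwriting them -->
    <handlerName>metaInf-services</handlerName>
  </containerDescriptorHandler>
</containerDescriptorHandlers>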
Your second problem is that your woven files get overwritten by the original dependencies when creating the assembly. This is because you have duplicates on the classpath: the originals and the woven copies in your own target directory. This can also be fixed by modifying the assembly descriptor so that it ignores the original(s). That way you make sure that the woven ones will be preferred.
Here is how I minimally modified your project in order to
- remove the archive section in AspectJ Maven,
- remove the spark-sql_2.12 weave dependency in order to avoid the conflict warnings,
- add an assembly descriptor excluding the original spark-core_2.12 files.
diff --git a/minimal/src/assembly/executable-jar.xml b/minimal/src/assembly/executable-jar.xml
new file mode 100644
--- /dev/null (revision Staged)
+++ b/minimal/src/assembly/executable-jar.xml (revision Staged)
@@ -0,0 +1,27 @@
+<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.1.0"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/ASSEMBLY/2.1.0 http://maven.apache.org/xsd/assembly-2.1.0.xsd">
+ <id>jar-with-dependencies</id>
+ <formats>
+ <format>jar</format>
+ </formats>
+ <includeBaseDirectory>false</includeBaseDirectory>
+ <dependencySets>
+ <dependencySet>
+ <outputDirectory>/</outputDirectory>
+ <useProjectArtifact>true</useProjectArtifact>
+ <unpack>true</unpack>
+ <scope>runtime</scope>
+ <excludes>
+<!--
+ <exclude>
+ org.apache.spark:spark-sql_2.12:*
+ </exclude>
+-->
+ <exclude>
+ org.apache.spark:spark-core_2.12:*
+ </exclude>
+ </excludes>
+ </dependencySet>
+ </dependencySets>
+</assembly>
diff --git a/minimal/pom.xml b/minimal/pom.xml
--- a/minimal/pom.xml (revision HEAD)
+++ b/minimal/pom.xml (revision Staged)
@@ -73,10 +73,9 @@
<artifactId>maven-assembly-plugin</artifactId>
<version>3.3.0</version>
<configuration>
- <!-- get all project dependencies -->
- <descriptorRefs>
- <descriptorRef>jar-with-dependencies</descriptorRef>
- </descriptorRefs>
+ <descriptors>
+ <descriptor>src/assembly/executable-jar.xml</descriptor>
+ </descriptors>
<!-- MainClass in mainfest make a executable jar -->
<archive>
@@ -121,22 +120,10 @@
<showWeaveInfo>true</showWeaveInfo>
<verbose>true</verbose>
<Xlint>ignore</Xlint>
- <encoding>UTF-8 </encoding>
- <archive>
- <manifest>
- <addClasspath>true</addClasspath>
- <classpathPrefix>lib/</classpathPrefix>
- <mainClass>be.example.aspectj.App</mainClass>
- </manifest>
- </archive>
+ <encoding>UTF-8</encoding>
<!-- Weaved dependencies -->
<weaveDependencies>
- <weaveDependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-sql_2.12</artifactId>
- </weaveDependency>
-
<weaveDependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
If you want a more generic solution, you can
Upvotes: 0