halafi

Reputation: 1254

Maven exec plugin passing arguments to external executable

I am trying to configure my app to work with mvn package exec:exec only.

What I want is to build a .jar and submit it to an external binary. I can do that manually after mvn package, but I want it to happen automatically, and I think the Maven exec plugin is the way to go.

My current configuration is:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.4.0</version>
  <executions>
    <execution>
      <goals>
        <goal>exec</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <executable>
      /bin/spark-submit
    </executable>
    <arguments>
      <argument>
--class "package.Class" --master "local[4]" ${project.build.directory}/${project.build.finalName}.${packaging}
      </argument>
    </arguments>
  </configuration>
</plugin>

What this does is it executes (from maven DEBUG info):

/bin/spark-submit, --class "package.Class" --master "local[4]" myApp.jar

I think the problem is that the arguments are meant to be passed to a Java executable, so they are separated by commas, which I do not want when executing a shell command.

Another thing I tried was to split the arguments:

<arguments>
  <argument>--class "package.Class"</argument>
  <argument>--master "local[4]"</argument>
  <argument>${project.build.directory}/${project.build.finalName}.${packaging}</argument>
</arguments>

which executes:

/bin/spark-submit, --class "package.Class", --master "local[4]", myApp.jar

(more commas, not what I want either)

I want maven to execute:

/bin/spark-submit --class "package.Class" --master "local[4]" myApp.jar

I hope what I want can be done using the Maven exec plugin and that my problem is clear. I would greatly appreciate any help you can give me.
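
Edit: I suspect (but have not verified) that splitting every token into its own <argument> — each flag and its value separately, and without the quotes — would also produce the right command line, since each <argument> should become one entry of the argument vector and the commas in the DEBUG output may just be how Maven prints the list:

```xml
<arguments>
  <argument>--class</argument>
  <argument>package.Class</argument>
  <argument>--master</argument>
  <argument>local[4]</argument>
  <argument>${project.build.directory}/${project.build.finalName}.${packaging}</argument>
</arguments>
```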

Upvotes: 2

Views: 1903

Answers (2)

Mike George

Reputation: 1

This also works:

<configuration>
  <executable>/Users/mike/Applns/apache/spark/spark-3.0.0-bin-hadoop3.2/bin/spark-submit</executable>
  <arguments>
    <argument>--master</argument>
    <argument>local</argument>
    <argument>${project.build.directory}/${project.artifactId}-${project.version}.jar</argument>
  </arguments>
</configuration>

Upvotes: 0

halafi

Reputation: 1254

I found that there is an optional configuration parameter <commandlineArgs> which does exactly what I need.

So the correct configuration for me is:

...
<configuration>
  <executable>
    /bin/spark-submit
  </executable>
  <commandlineArgs>
    --class "package.Class" --master "local[4]" ${project.build.directory}/${project.build.finalName}.${packaging}
  </commandlineArgs>
</configuration>
...
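
For completeness: to have this run automatically as part of mvn package (rather than invoking exec:exec explicitly), I believe the execution can also be bound to the package phase — an untested sketch combining that binding with the configuration above:

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.4.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>exec</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <executable>/bin/spark-submit</executable>
    <commandlineArgs>--class "package.Class" --master "local[4]" ${project.build.directory}/${project.build.finalName}.${packaging}</commandlineArgs>
  </configuration>
</plugin>
```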

Upvotes: 2
