Reputation: 2235
I'm currently building apps for Apache Spark. At runtime, Spark provides many dependencies that I also need when I test/run the apps locally in the IDE (IntelliJ).
Is there any way to have a different set of dependencies depending on whether I use the 'package' goal or the usual compile/run target in IntelliJ?
For instance, this is a required dependency on Hadoop:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
    <scope>provided</scope>
</dependency>
But with the 'provided' scope, the dependency is missing when I run the app locally in the IDE.
Upvotes: 0
Views: 788
Reputation: 4185
If you want IntelliJ to use its own build process rather than Maven's, it's probably better to add a (global) library to your project dependencies in the IDE.
The IDE definitely won't provide these Spark JARs by default, which is what you're telling Maven it will do with the 'provided' scope.
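Alternatively, if you'd rather keep everything in the POM, one common pattern is to make the scope a Maven property and override it in a profile. This is only a minimal sketch; the profile id 'local-run' and the property name 'hadoop.scope' are made up for illustration:

<properties>
    <!-- default: let the cluster provide the JARs when packaging -->
    <hadoop.scope>provided</hadoop.scope>
</properties>

<profiles>
    <profile>
        <id>local-run</id>
        <properties>
            <!-- override: put the JARs on the runtime classpath for local IDE runs -->
            <hadoop.scope>compile</hadoop.scope>
        </properties>
    </profile>
</profiles>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
    <scope>${hadoop.scope}</scope>
</dependency>

With that, 'mvn package' still treats the dependency as provided, while activating the profile (e.g. 'mvn -Plocal-run ...', or ticking it in IntelliJ's Maven Projects tool window) pulls the JARs onto the runtime classpath for local runs.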
Upvotes: 1