Reputation: 441
I would like to compile a Java file into a jar that can be run by Spark. I tried to compile it normally, but I get an error like this:
java.lang.NoClassDefFoundError: JavaWordCount (wrong name: org/apache/spark/examples/JavaWordCount)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:229)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:700)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
This is how I compile the Java file:
javac -classpath spark-sql_2.11-2.1.1.jar:spark-core_2.11-2.1.1.jar:scala-compiler-2.11.8.jar:scala-library-2.11.8.jar JavaWordCount.java
And this is how I make the jar file:
jar cvf JavaWordCount.jar JavaWordCount*.class
However, doing it this way produces the error above when I try to spark-submit:
spark-submit --class JavaWordCount JavaWordCount.jar README.md
I also tried changing the class to org.apache.spark.examples.JavaWordCount, but it still gives me the same error.
Where did I go wrong? Any suggestions? P.S. I am using the JavaWordCount example from the Spark examples folder.
Upvotes: 1
Views: 1196
Reputation: 441
I solved the problem. I only needed to run javac with all the required Spark jar files on the classpath (just the ones that are actually needed), and also include them when running jar cvf, which I had not done when I posted the question.
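For reference, here is a minimal sketch of the full build sequence, assuming the stock JavaWordCount.java, which declares the package org.apache.spark.examples (so the compiled class files have to sit under that directory path inside the jar, and spark-submit needs the fully qualified class name). The -d flag is my assumption for getting that directory layout; the jar list is simply the one from the question.

# Compile against the jars from the question; -d . writes the class files under ./org/apache/spark/examples/
javac -classpath spark-sql_2.11-2.1.1.jar:spark-core_2.11-2.1.1.jar:scala-compiler-2.11.8.jar:scala-library-2.11.8.jar -d . JavaWordCount.java

# Package the class files together with their package directories
jar cvf JavaWordCount.jar org/apache/spark/examples/*.class

# Submit with the fully qualified class name
spark-submit --class org.apache.spark.examples.JavaWordCount JavaWordCount.jar README.md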
Upvotes: 1