Reputation: 158
I have a Scala application that successfully runs on the JVM using an uber jar via the command java -jar myapp.jar. I need to create a separate, but related Scala job that uses many of the same objects/functions/dependencies as the first, making it a great candidate to keep in the same code repository & uber jar. Please note that these Scala jobs do not use Spark, so spark2-submit is out of the equation.
Question: How can I run 2 separate Scala jobs from the same uber jar on the JVM? (I am using Scala 2.11.8 and SBT for jar assembly)
Additional Context:
I've already looked into related StackOverflow discussions, namely this post about specifying Java classes using java -cp myapp.jar MyClass, and this post, which only presented the solution of running the Scala equivalent with scala -classpath myapp.jar MyClass.
While the scala -classpath solution may have worked for the OP of the second linked discussion, I'll be deploying my code to an environment that has no scala or sbt executables, only java.
Let's say these are the 2 Scala jobs I want to run:
// MyClass.scala
package mypackage

object MyClass {
  def main(args: Array[String]): Unit = {
    println("Hello, World!")
  }
}

// MyClass2.scala
package mypackage

object MyClass2 {
  def main(args: Array[String]): Unit = {
    println("Hello, World! This is the second job!")
  }
}
Is there a way to run Scala code using java -cp myapp.jar MyClass? I've tried this and get the following error:
Error: Could not find or load main class MyClass
The main alternative I can think of would be to create a Scala object that serves as a single main entry point and takes a parameter to determine which job gets run. I'd like to avoid that solution if possible, but it would allow me to continue using java -jar myapp.jar, which has been working fine.
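For reference, that entry-point alternative could be sketched as below. JobRunner and the "job1"/"job2" argument values are made-up names, and the two job objects are the ones from the question (in a real project all three would sit in package mypackage):

```scala
// The two jobs from the question, unchanged (package declaration omitted
// here so the sketch is self-contained).
object MyClass {
  def main(args: Array[String]): Unit = println("Hello, World!")
}

object MyClass2 {
  def main(args: Array[String]): Unit = println("Hello, World! This is the second job!")
}

// Hypothetical dispatcher: a single main that picks the job from the first
// argument and forwards the remaining arguments to it.
object JobRunner {
  def main(args: Array[String]): Unit = args.headOption match {
    case Some("job1") => MyClass.main(args.drop(1))   // delegate to the first job
    case Some("job2") => MyClass2.main(args.drop(1))  // delegate to the second job
    case other        => sys.error(s"Expected job1 or job2, got: $other")
  }
}
```

With sbt-assembly's mainClass setting pointed at the dispatcher (e.g. mainClass in assembly := Some("mypackage.JobRunner") in build.sbt, which controls the jar's Main-Class manifest entry), the jar could then be run as java -jar myapp.jar job1.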
Upvotes: 2
Views: 502
Reputation: 27356
You need to use a fully qualified name (package plus object name) for the App instance:
java -cp myapp.jar mypackage.MyClass
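Applied to both jobs from the question, each can be launched from the same uber jar by its fully qualified object name (jar name and package as given in the question):

```
java -cp myapp.jar mypackage.MyClass
java -cp myapp.jar mypackage.MyClass2
```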
Upvotes: 4