Reputation: 1809
I'm not very clear about the whole picture of Spark. Let's say I create a regular Java jar without involving anything Spark-related: no SparkSession, no SparkContext, no RDD, no Dataset. What would happen if I submit it to a Spark cluster via spark-submit with deploy-mode=cluster?
I wrote a simple jar which only prints some lines, and it seems to work fine on my toy Spark setup. I had expected it to fail with some error, since it's not a Spark application...
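For illustration, the jar is essentially just something like this (the class name is only a placeholder, not my actual code):

    public class HelloJar {
        // A plain JVM entry point: no SparkSession, SparkContext, RDD, or Dataset anywhere.
        public static void main(String[] args) {
            System.out.println("Hello from a jar that does not use Spark at all");
        }
    }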
Can I expect the same result when submitting it to a real-world Spark cluster with many nodes?
Upvotes: 1
Views: 316
Reputation: 46
That can depend on the cluster manager and deploy mode, but in general nothing strange happens. A Spark application is a plain JVM application with a normal main
function; it doesn't have to implement any particular interface, and the lack of an active session is not an issue.
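For example, a jar like the one in the question could be launched in cluster mode with something along these lines (the class and jar names are placeholders, and YARN is just one possible cluster manager); spark-submit simply starts a driver JVM and invokes that class's main method:

    spark-submit \
      --class HelloJar \
      --master yarn \
      --deploy-mode cluster \
      hello-jar.jar

One practical difference in cluster mode is that the printed lines end up in the driver's log on the cluster (for example in the YARN container logs) rather than on the console of the machine you submitted from.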
Upvotes: 3