Suresh

Reputation: 39541

Spark-submit Executors are not getting the properties

I am trying to deploy a Spark application to a 4-node DSE Spark cluster. I have created a fat JAR with all dependent JARs, and I have created a property file under src/main/resources which holds properties like batch interval, master URL, etc.

I have copied this fat JAR to the master and I am submitting the application with "spark-submit"; below is my submit command.

dse spark-submit --class com.Processor.utils.jobLauncher --supervise application-1.0.0-develop-SNAPSHOT.jar qa

Everything works properly when I run on a single-node cluster, but when I run on the DSE Spark standalone cluster, the properties mentioned above (like batch interval) become unavailable to the executors. I googled and found that this is a common issue that many have solved, so I followed one of the solutions, created a fat JAR and tried again, but my properties are still unavailable to the executors.

Can someone please give me any pointers on how to solve this issue?

I am using DSE 4.8.5 and Spark 1.4.2.

This is how I am loading the properties:

    // Set the environment name (e.g. "qa") from the first command-line argument
    System.setProperty("env", args(0))

    // Load <env>_application.conf from the classpath via Typesafe Config
    val conf = com.typesafe.config.ConfigFactory.load(System.getProperty("env") + "_application")
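For reference, a minimal sketch of what this driver-side loading looks like as a complete snippet; the keys batchInterval and masterUrl are illustrative assumptions, since the question does not show the actual property names.

    import com.typesafe.config.{Config, ConfigFactory}

    object JobLauncherSketch {
      def main(args: Array[String]): Unit = {
        // args(0) is the environment name passed on the command line, e.g. "qa"
        System.setProperty("env", args(0))

        // ConfigFactory.load("qa_application") resolves qa_application.conf
        // (or .json/.properties) from the classpath of the JVM it runs in --
        // here, only the driver JVM.
        val conf: Config = ConfigFactory.load(System.getProperty("env") + "_application")

        // Illustrative keys, not confirmed by the question
        val batchInterval = conf.getInt("batchInterval")
        val masterUrl     = conf.getString("masterUrl")
        println(s"batchInterval=$batchInterval, masterUrl=$masterUrl")
      }
    }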

Upvotes: 0

Views: 139

Answers (1)

Suresh

Reputation: 39541

Figured out the solution:

I was deriving the property file name from a system property (which I set in the main method from a command-line parameter), and when the code gets shipped and executed on a worker node, that system property is not available (obviously!). So instead of using Typesafe ConfigFactory to load the property file, I am using plain Scala file reading.
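The answer does not show the replacement code, but a minimal sketch of the "plain Scala file reading" approach could look like the following. The file name pattern and the key=value line format are assumptions; the file is read as a classpath resource so that the copy bundled in the fat JAR is visible to both the driver and the executors.

    import scala.io.Source

    object PropsLoader {
      // Read key=value pairs from a resource bundled inside the fat JAR, so the
      // same file is available on the driver and on every executor's classpath.
      def load(env: String): Map[String, String] = {
        val stream = getClass.getResourceAsStream(s"/${env}_application.conf")  // assumed file name
        Source.fromInputStream(stream)
          .getLines()
          .map(_.trim)
          .filter(line => line.nonEmpty && !line.startsWith("#") && line.contains("="))
          .map { line =>
            val Array(key, value) = line.split("=", 2)
            (key.trim, value.trim)
          }
          .toMap
      }
    }

    // Usage: val props = PropsLoader.load("qa"); val interval = props("batchInterval")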

Upvotes: 0
