Reputation: 61
Can't connect to the Spark master from an application deployed on a JBoss server
I have an application, built with Java and the Spark API, that loads data into an Oracle database. I have deployed it on JBoss, and it works fine with a local master Spark session. However, when I change spark.master to cluster mode and hit the URL from my local machine, it does not connect to the master. On the client (local) side I always see:

Error occured while loading the member file: java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address

and on the master machine:

ERROR TransportRequestHandler: Error while invoking RpcHandler
java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = 6257082371135760434

I have the same versions on both my local machine and the master (Spark 2.4.2, Hadoop 2.7, Scala 2.12.8, and spark-core 2.8.0 in my pom.xml). Everything I found about this error points to a version mismatch, but I don't have one. Can someone please help with this?
Creating the SparkSession -

sparkSession = new SparkSession.Builder()
        .master("spark://ip.address:7077")
        .config("spark.submit.deployMode", "cluster")
        .appName("Java JDBC Spark")
        .config("spark.driver.bindAddress", "ip.address")
        .getOrCreate();
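For context on the bind error: spark.driver.bindAddress must name an interface that exists on the machine running the driver (here, the JBoss host), while spark.driver.host is the address the cluster uses to reach back to the driver. Note also that spark.submit.deployMode has no effect when the session is created in-process like this; the JVM that builds the SparkSession is always the driver (client mode), and standalone cluster mode requires spark-submit. A minimal sketch, assuming a hypothetical local address 192.168.1.10 for the JBoss machine and master-host for the cluster master:

import org.apache.spark.sql.SparkSession;

// Sketch only: "192.168.1.10" stands in for the JBoss machine's own IP,
// "master-host" for the Spark master's hostname.
SparkSession sparkSession = SparkSession.builder()
        .master("spark://master-host:7077")
        .appName("Java JDBC Spark")
        // Must be an address of THIS machine; binding to the master's
        // address raises "Can't assign requested address".
        .config("spark.driver.bindAddress", "192.168.1.10")
        // Address the master and executors use to connect back to the driver.
        .config("spark.driver.host", "192.168.1.10")
        .getOrCreate();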
Upvotes: 6
Views: 7157
Reputation: 502
This happens because of the Spark configuration file you use to create the Spark session. In my case, once I corrected it, the connection worked.
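For anyone checking the same thing: the entries worth verifying in that file are the driver address settings, which have to match the machine the session is created on. A minimal sketch, assuming a hypothetical spark.properties on the classpath and a hypothetical MyApp class, of loading the file and passing its entries to the builder:

import java.io.InputStream;
import java.util.Properties;
import org.apache.spark.sql.SparkSession;

// Hypothetical spark.properties, e.g.:
//   spark.master=spark://master-host:7077
//   spark.driver.bindAddress=192.168.1.10
//   spark.driver.host=192.168.1.10
Properties props = new Properties();
try (InputStream in = MyApp.class.getResourceAsStream("/spark.properties")) {
    props.load(in);
}

// Forward every entry from the file into the session builder, so the
// running session cannot drift from what the configuration file says.
SparkSession.Builder builder = SparkSession.builder().appName("Java JDBC Spark");
for (String key : props.stringPropertyNames()) {
    builder.config(key, props.getProperty(key));
}
SparkSession session = builder.getOrCreate();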
Upvotes: 0