Henry

Reputation: 1686

SparkR re-initialisation not working

I am trying to follow the answer provided here:

Trying to find R equivalent for SetConf from Java

When I do that, I'm loading SparkR:

sparkR
....initialisation spam....
sparkR.stop()
sc <- sparkR.init(sparkEnvir=....)
sqlContext <- sparkRSQL.init(sc)

I get an error message:

Error in callJMethod(x,'getClass'):
  Invalid jobj 1. If SparkR was restarted, Spark operations need to be re-executed.

The same error appears even when I omit the sparkEnvir argument, so plain stop-and-re-initialisation itself seems to be the problem.

Upvotes: 2

Views: 639

Answers (1)

Alexey P.

Reputation: 11

I suppose you have a DataFrame that was loaded in the stopped Spark session. Stopping the context invalidates every Java object reference SparkR holds, so you need to reload the DataFrame after you initialise Spark again.
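A minimal sketch of the restart sequence, using the SparkR 1.x API from the question; the sparkEnvir setting and the data path are hypothetical placeholders:

```r
library(SparkR)

sparkR.stop()

# After sparkR.stop(), every jobj (including any previously loaded
# DataFrame) is invalid -- re-create the contexts from scratch.
sc <- sparkR.init(sparkEnvir = list(spark.executor.memory = "2g"))  # example setting
sqlContext <- sparkRSQL.init(sc)

# Reload the data instead of reusing a DataFrame from the old session;
# "path/to/data.json" is an illustrative path.
df <- read.df(sqlContext, "path/to/data.json", source = "json")
```

Any variable still pointing at a DataFrame from the old session will keep raising the "Invalid jobj" error until it is reassigned from the new context.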

Upvotes: 1
