Reputation: 542
I installed Spark with sbt as a project dependency. Now I want to change variables of the Spark environment without doing it inside my code with .setMaster(). The problem is that I cannot find any config file on my computer.
The reason I need this is that I get the following error:
org.apache.spark.SparkException: Invalid Spark URL: spark://[email protected]_not_set.invalid:50487
even after trying to change my hostname. So I would like to dig into the Spark library and try a few things.
I have already tried pretty much everything in this SO post: Invalid Spark URL in local spark session.
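To illustrate the kind of thing I mean by configuring Spark outside the code: a sketch of injecting spark.* properties from build.sbt instead of from the application (assuming the app is launched with sbt run; the property values are only examples, not something I have confirmed fixes the error):

// build.sbt -- sketch only; values are examples
// Fork a separate JVM for `sbt run` so that javaOptions are actually applied.
fork := true

// SparkConf (as used by SparkSession.builder().getOrCreate()) picks up every
// -Dspark.* system property, so master and driver host could be set here
// instead of calling .setMaster()/.config() in the application code.
javaOptions ++= Seq(
  "-Dspark.master=local[*]",
  "-Dspark.driver.host=localhost"
)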
Many thanks
Upvotes: 0
Views: 409
Reputation: 542
What fixed the issue:
export SPARK_LOCAL_HOSTNAME=localhost
in the shell profile (e.g. ~/.bash_profile).
sbt was not able to resolve the host even when I ran the export command right before launching sbt; I had to put it in the profile so that sbt picks it up in the right context.
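A quick way to confirm the variable actually reaches the JVM started by sbt (a throwaway sketch; the object name and printed labels are made up) is to print it together with the driver host Spark resolved:

import org.apache.spark.sql.SparkSession

// Minimal check, run from sbt, that the exported variable reaches the JVM
// and that the driver URL is now built from "localhost".
object HostCheck extends App {
  println("SPARK_LOCAL_HOSTNAME = " + sys.env.getOrElse("SPARK_LOCAL_HOSTNAME", "<not set>"))

  val spark = SparkSession.builder()
    .appName("host-check")
    .master("local[*]")
    .getOrCreate()

  // spark.driver.host is filled in by Spark at startup from the resolved hostname.
  println("spark.driver.host    = " +
    spark.sparkContext.getConf.getOption("spark.driver.host").getOrElse("<not set>"))

  spark.stop()
}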
Upvotes: 1