takaomag

Reputation: 1635

Multiple Spark executors with the same spark.local.dir

I am running a Spark application on the Mesos Docker containerizer with the following properties:

--conf 'spark.local.dir=/var/lib/spark'
--conf 'spark.mesos.executor.docker.image=my_spark_docker_image'
--conf 'spark.mesos.executor.docker.volumes=/var/data/x-spark:/var/lib/spark,/opt/local/mesos:/opt/local/mesos:ro'
--conf 'spark.executorEnv.MESOS_NATIVE_JAVA_LIBRARY=/opt/local/mesos/lib/libmesos.so'

That is, all Spark executors on a host share the same local directory (/var/data/x-spark).

Everything seems to work OK, but I worry about file corruption. Is it safe?

Upvotes: 3

Views: 1016

Answers (1)

Michael Gummelt

Reputation: 723

It's safe. Each job will create its own subdirectory.
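
For illustration, here is a minimal Scala sketch of that scheme (the object and method names are hypothetical, not Spark's actual API): each executor derives a uniquely named, UUID-suffixed subdirectory under the shared volume, so concurrent writers never operate on the same paths.

import java.io.File
import java.io.IOException
import java.util.UUID

// Hypothetical sketch of the collision-avoidance idea, not Spark's
// real disk-management code: every executor creates its own uniquely
// named subdirectory under the shared spark.local.dir.
object LocalDirSketch {
  def createExecutorDir(sharedLocalDir: String): File = {
    // A random UUID makes the directory name unique per executor.
    val dir = new File(sharedLocalDir, s"spark-${UUID.randomUUID()}")
    if (!dir.mkdirs() && !dir.isDirectory) {
      throw new IOException(s"Failed to create ${dir.getAbsolutePath}")
    }
    dir
  }

  def main(args: Array[String]): Unit = {
    // Two "executors" sharing /var/lib/spark still get disjoint dirs.
    val a = createExecutorDir("/var/lib/spark")
    val b = createExecutorDir("/var/lib/spark")
    println(s"executor A writes under: $a")
    println(s"executor B writes under: $b")
  }
}

Running the sketch prints two distinct directories even though both calls target the same shared mount, which is why the shared volume in the question does not lead to file corruption.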

Upvotes: 4
