AKC

Reputation: 1023

Spark Error: invalid log directory /app/spark/spark-1.6.1-bin-hadoop2.6/work/app-20161018015113-0000/3/


My Spark application is failing with the above error.

My Spark program is actually writing logs to that directory; both stderr and stdout are being written on all the workers.

My program used to work fine earlier, but yesterday I changed the folder pointed to by SPARK_WORKER_DIR. Today I put the old setting back and restarted Spark.

Can anyone give me a clue as to why I am getting this error?

Upvotes: 2

Views: 2576

Answers (1)

Dumitru

Reputation: 31

In my case the problem was caused by enabling SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true" in spark-env.sh. That option is supposed to remove only old app/driver data directories, but it seems to be buggy and removes the data of running apps as well.

Just comment out that line and see if it helps.
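For reference, a minimal sketch of what the relevant spark-env.sh lines might look like. Disabling cleanup is the fix described above; the alternative shown below keeps cleanup enabled but tunes the interval and TTL (the numeric values are illustrative, not from the question):

```shell
# spark-env.sh

# Fix described above: comment out the cleanup option so the worker
# stops deleting application work directories:
# export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

# Alternative (illustrative values): keep cleanup on, but run it every
# 30 minutes and only delete data from apps stopped over 7 days ago.
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=1800 \
  -Dspark.worker.cleanup.appDataTtl=604800"
```

spark.worker.cleanup.interval and spark.worker.cleanup.appDataTtl are standard standalone-mode worker properties; restart the workers after changing spark-env.sh for the new options to take effect.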

Upvotes: 3
