Reputation: 11
I have a single-node MemSQL cluster with Spark deployed on it for ETL purposes, and I am unable to configure Spark on MemSQL:

How do I set a rotation policy for the Spark work directory, /var/lib/memsql-ops/data/spark/install/work/? How can I change that path?

How large should spark.executor.memory be set to avoid OutOfMemoryExceptions?

How do I set configuration options in general for Spark deployed on a MemSQL cluster?
Upvotes: 1
Views: 321
Reputation: 1188
Hopefully the following will fix your issue: set spark.worker.cleanup.enabled (and the related cleanup options documented at https://spark.apache.org/docs/1.5.1/spark-standalone.html) in /var/lib/memsql-ops/data/spark/install/conf/spark_{master,worker}.conf. Once the configuration is changed, you must restart the Spark cluster with memsql-ops spark-component-stop --all
and then memsql-ops spark-component-start --all
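A sketch of what that change might look like, assuming the worker config file accepts standard Spark properties (the interval and TTL values below are illustrative defaults from the Spark standalone docs, not tuned recommendations):

```shell
# Append standard Spark standalone cleanup settings to the worker config.
# spark.worker.cleanup.interval is seconds between sweeps; appDataTtl is
# how long (seconds) each application's work dir is kept (604800 = 7 days).
cat >> /var/lib/memsql-ops/data/spark/install/conf/spark_worker.conf <<'EOF'
spark.worker.cleanup.enabled true
spark.worker.cleanup.interval 1800
spark.worker.cleanup.appDataTtl 604800
EOF

# Restart the Spark components so the new settings take effect.
memsql-ops spark-component-stop --all
memsql-ops spark-component-start --all
```

Note that the cleanup only removes directories of stopped applications; directories of running applications are left alone regardless of the TTL.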
Upvotes: 1