Ravi Prakash

Reputation: 11

How to set up Spark with a single-node MemSQL cluster?

I have a single-node MemSQL cluster with Spark deployed on it for ETL purposes.

I am unable to configure Spark on MemSQL:

  1. How do I set a rotation policy for the Spark work directory /var/lib/memsql-ops/data/spark/install/work/?

  2. How can I change this path?

  3. How large should spark.executor.memory be to avoid OutOfMemoryExceptions?

In general, how do I change configuration settings for Spark once it has been deployed on a MemSQL cluster?

Upvotes: 1

Views: 321

Answers (1)

Carl Sverre

Reputation: 1188

Hopefully the following will fix your issue:

  1. See spark.worker.cleanup.enabled and the related configuration options in the Spark standalone docs: https://spark.apache.org/docs/1.5.1/spark-standalone.html (a sketch of these settings follows this list).
  2. The config can be changed in /var/lib/memsql-ops/data/spark/install/conf/spark_{master,worker}.conf. Once the configuration is changed, you must restart the Spark cluster with memsql-ops spark-component-stop --all and then memsql-ops spark-component-start --all.
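For reference, here is a minimal sketch of what that might look like. The spark.worker.cleanup.* property names and defaults are from the Spark 1.5.1 standalone docs linked above; I'm assuming spark_worker.conf takes standard whitespace-separated spark.* key-value pairs, so check the existing file for the exact syntax:

    # /var/lib/memsql-ops/data/spark/install/conf/spark_worker.conf
    # (assumed key-value format; mirror whatever syntax the file already uses)

    # Enable periodic cleanup of application work directories
    # (standalone mode only; only stopped applications are cleaned up).
    spark.worker.cleanup.enabled true
    # How often, in seconds, the worker checks for stale work dirs (default: 1800).
    spark.worker.cleanup.interval 1800
    # Remove per-application work dirs older than this many seconds
    # (default: 604800, i.e. 7 days).
    spark.worker.cleanup.appDataTtl 604800

Then restart the Spark components through MemSQL Ops so the worker picks up the change:

    memsql-ops spark-component-stop --all
    memsql-ops spark-component-start --all

Note that the cleanup trades disk usage against being able to inspect the logs and scratch space of finished applications under the work directory.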

Upvotes: 1
