Reputation: 870
I am running a PySpark job:
spark-submit --driver-memory 2g --executor-memory 2g --conf spark.driver.maxResultSize=2g job.py
I tried changing multiple options, but every time I get the error below:
I am new to Spark; can someone help me figure out what the solution should be?
Upvotes: 1
Views: 5414
Reputation: 358
Try reducing the driver memory: the node where you're submitting the job is running out of RAM. In client mode (the default for spark-submit), the driver JVM runs on the submitting machine, so the 2g driver heap, including any results collected back to it (capped by spark.driver.maxResultSize), has to fit in that machine's memory.
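For example, assuming the submitting node has limited free RAM (the values below are illustrative, not a tuned recommendation), you could lower both the driver memory and the result-size cap:

spark-submit --driver-memory 1g --executor-memory 2g --conf spark.driver.maxResultSize=1g job.py

If the job still fails while collecting results, it usually means too much data is being pulled back to the driver; writing the output out with df.write instead of calling collect() keeps the data on the executors.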
Upvotes: 4