BAR

Reputation: 17061

Increase Spark memory when using local[*]

How do I increase Spark memory when using local[*]?

I tried setting the memory like this:

  val conf = new SparkConf()
    .set("spark.executor.memory", "1g")
    .set("spark.driver.memory", "4g")
    .setMaster("local[*]")
    .setAppName("MyApp")

But I still get:

MemoryStore: MemoryStore started with capacity 524.1 MB

Does this have something to do with:

.setMaster("local[*]")

Upvotes: 22

Views: 23522

Answers (9)

Ryan Miao

Reputation: 1

Version

spark-2.3.1

Source Code

org.apache.spark.launcher.SparkSubmitCommandBuilder:267

String memory = firstNonEmpty(tsMemory, config.get(SparkLauncher.DRIVER_MEMORY),
System.getenv("SPARK_DRIVER_MEMORY"), System.getenv("SPARK_MEM"), DEFAULT_MEM);
cmd.add("-Xmx" + memory);

The driver heap (-Xmx) is built from the first non-empty value among these, in order:

  1. SparkLauncher.DRIVER_MEMORY (the spark.driver.memory config), e.g. via the command-line flag below or the launcher sketch after this list:

--driver-memory 2g

  2. SPARK_DRIVER_MEMORY

vim conf/spark-env.sh

SPARK_DRIVER_MEMORY="2g"

  3. SPARK_MEM

vim conf/spark-env.sh

SPARK_MEM="2g"

  4. DEFAULT_MEM

1g
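
For the first source, a minimal Scala sketch of setting SparkLauncher.DRIVER_MEMORY through the launcher API (an illustration only; yourApp.jar and main.class are the placeholder names used elsewhere on this page):

    import org.apache.spark.launcher.SparkLauncher

    // The driver heap has to be fixed before the driver JVM is forked, so the
    // memory is handed to the launcher (or to spark-submit) rather than being
    // set inside the application itself.
    val driverProcess = new SparkLauncher()
      .setMaster("local[*]")
      .setAppResource("yourApp.jar")               // placeholder application jar
      .setMainClass("main.class")                  // placeholder main class
      .setConf(SparkLauncher.DRIVER_MEMORY, "2g")  // ends up as -Xmx2g on the driver
      .launch()
    driverProcess.waitFor()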

Upvotes: 0

hamza tuna

Reputation: 1497

You can't change driver memory after the application has started (link).

Upvotes: 1

fansy1990

Reputation: 141

In Spark 2.x you can use SparkSession, which looks like this:

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder()
          .config("spark.executor.memory", "1g")
          .config("spark.driver.memory", "4g")
          .master("local[*]")
          .appName("MyApp")
          .getOrCreate()

Upvotes: 9

Rajiv Singh

Reputation: 1078

/usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G --conf spark.driver.maxResultSize=2G

Upvotes: -2

Rajiv Singh

Reputation: 1078

To assign memory to Spark:

On the command shell: /usr/lib/spark/bin/spark-shell --driver-memory=16G --num-executors=100 --executor-cores=8 --executor-memory=16G

Upvotes: -1

bshyamkumar

Reputation: 41

I tried --driver-memory 4g and --executor-memory 4g, but neither worked to increase the working memory. However, I noticed that bin/spark-submit was picking up _JAVA_OPTIONS; setting that to -Xmx4g resolved it. I use JDK 7.

Upvotes: 4

BAR

Reputation: 17061

I was able to solve this by running SBT with:

sbt -mem 4096

However, the MemoryStore is only about half that size. Still looking into where this fraction comes from.

Upvotes: 10

Glennie Helles Sindholt

Reputation: 13154

The fraction of the heap used for Spark's memory cache is 0.6 by default, so if you need more than 524.1 MB you should increase the spark.executor.memory setting :)

Technically you could also increase the fraction used for Spark's memory cache, but I believe this is discouraged, or at least requires some additional configuration. See https://spark.apache.org/docs/1.0.2/configuration.html for more details.
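
As a rough back-of-the-envelope sketch of where the 524.1 MB figure comes from (an assumption on my part: this uses the legacy Spark 1.x settings spark.storage.memoryFraction = 0.6 with a 0.9 safety fraction; newer versions use spark.memory.fraction instead):

    // Sketch only, assuming the legacy Spark 1.x defaults: the MemoryStore gets
    // roughly memoryFraction (0.6) * safetyFraction (0.9) ≈ 0.54 of the heap,
    // which is close to 524 MB for the default ~1 GB driver heap and is why the
    // reported capacity looks like about half of whatever -Xmx you pass in.
    val heapBytes = Runtime.getRuntime.maxMemory          // usable heap, from -Xmx
    val storageBytes = (heapBytes * 0.6 * 0.9).toLong
    println(f"MemoryStore capacity ≈ ${storageBytes / 1024.0 / 1024.0}%.1f MB")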

Upvotes: 2

Gillespie

Reputation: 2228

Assuming that you are using the spark-shell: setting spark.driver.memory in your application isn't working because your driver process has already started with the default memory.

You can either launch your spark-shell using:

./bin/spark-shell --driver-memory 4g

or you can set it in spark-defaults.conf:

spark.driver.memory 4g

If you are launching an application using spark-submit, you must specify the driver memory as an argument:

./bin/spark-submit --driver-memory 4g --class main.class yourApp.jar

Upvotes: 12
