derek

Reputation: 10227

Setting "spark.memory.storageFraction" in Spark does not work

I am trying to tune Spark's memory parameters. I tried:

sparkSession.conf.set("spark.memory.storageFraction", "0.1") // sparkSession has already been created

After I submitted the job and checked the Spark UI, I found that "Storage Memory" was still the same as before, so the setting above did not take effect.

What is the correct way to set "spark.memory.storageFraction"?

I am using Spark 2.0.
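
For reference, here is a minimal, self-contained sketch of what I am doing (the app name and master are only placeholders), including reading the value back from the session:

import org.apache.spark.sql.SparkSession

object StorageFractionTest {
  def main(args: Array[String]): Unit = {
    // The session is created first, as in my job ("local[*]" and the app name are placeholders)
    val sparkSession = SparkSession.builder()
      .appName("storage-fraction-test")
      .master("local[*]")
      .getOrCreate()

    // The setting I am trying to apply
    sparkSession.conf.set("spark.memory.storageFraction", "0.1")

    // This prints 0.1, yet the "Storage Memory" column in the UI is unchanged
    println(sparkSession.conf.get("spark.memory.storageFraction"))

    sparkSession.stop()
  }
}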

Upvotes: 7

Views: 4825

Answers (2)

mnicky

Reputation: 1388

As per the docs, the spark.memory.storageFraction option configures only the "amount of storage memory immune to eviction", not an upper limit. In fact, all memory not used for execution can be used for storage, so the upper limit of storage memory, assuming no memory is used for execution, is (executor memory - reserved memory) * memoryFraction. See also the Memory Management Overview.

The column's tooltip in the UI correctly states that it displays the "total available memory for storage...".

This means you won't see the effect of spark.memory.storageFraction in the UI column you are probably looking at.
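
To make that concrete, here is a small sketch of the arithmetic (the 2 GB executor heap and the default values are illustrative assumptions, not numbers taken from the question):

// Illustrative numbers only
val executorMemoryMb = 2048.0   // example executor heap
val reservedMemoryMb = 300.0    // reserved system memory in Spark 2.x
val memoryFraction = 0.6        // spark.memory.fraction default

// Upper limit of storage memory (what the UI column reflects); storageFraction plays no role here
val storageUpperLimitMb = (executorMemoryMb - reservedMemoryMb) * memoryFraction  // ≈ 1048.8 MB

// What spark.memory.storageFraction actually controls: the slice of that pool immune to eviction
val immuneWithDefaultMb = storageUpperLimitMb * 0.5   // default storageFraction = 0.5, ≈ 524.4 MB
val immuneWithTunedMb = storageUpperLimitMb * 0.1     // storageFraction = 0.1, ≈ 104.9 MB

Lowering storageFraction shrinks only the eviction-immune slice; the number in the UI column stays the same.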

Upvotes: 2

FelixHo

Reputation: 1304

I faced the same problem. After reading some of the code on Spark's GitHub, I think the "Storage Memory" on the Spark UI is misleading: it does not indicate the size of the storage region; it actually represents the maxMemory:

maxMemory = (executorMemory - reservedMemory [default 300 MB]) * memoryFraction [default 0.6]
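
For example, with a 2 GB executor and the defaults above (the executor size is just an illustration):

// Illustrative: 2 GB executor heap, defaults as in the formula above
val maxMemoryMb = (2048.0 - 300.0) * 0.6   // ≈ 1048.8 MB, unaffected by spark.memory.storageFraction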

Check these for more detail:

Spark UI executors page source code

getMaxMemory source code

Upvotes: 3
