user2404124

Reputation: 25

Set Apache Spark's BlockManager memory

My Spark cluster's runtime log shows:

15/12/29 17:45:33 INFO BlockManagerMasterEndpoint: Registering block manager 10.108.98.123:51075 with 530.3 MB RAM, BlockManagerId(8, 10.108.98.123, 51075)

How can I change that 530.3 MB to 20g or 10g?

I have written this configuration code:

from pyspark import SparkConf

conf = SparkConf().set("spark.python.worker.memory", "10g") \
                  .set("spark.driver.memory", "10g")
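
I then pass that conf when creating the context, roughly like this (a sketch of my script; SparkContext is the standard PySpark entry point):

from pyspark import SparkContext

sc = SparkContext(conf=conf)  # the conf above is applied when the context starts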

It still registers with only 530.3 MB of RAM; my settings are not reflected.

Is there a way to set it?

Upvotes: 2

Views: 2305

Answers (1)

jopasserat

Reputation: 5930

The BlockManager is only a component running on each node (driver or worker).
See this source for a bit more detail on what it does: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-blockmanager.html

It's not directly impacted by the settings you're manipulating.

I couldn't find how to specifically set the memory it's using though.
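
As a hedged aside (an assumption on my part, not something I have verified): the storage figure logged at registration generally tracks the executor heap, so raising spark.executor.memory before the SparkContext is created might be worth trying, for example:

from pyspark import SparkConf, SparkContext

# Assumption: the BlockManager's reported storage memory scales with the
# executor heap, so a larger spark.executor.memory may raise the 530.3 MB figure.
conf = (SparkConf()
        .setAppName("blockmanager-memory-test")  # hypothetical app name
        .set("spark.executor.memory", "10g"))    # heap per executor
sc = SparkContext(conf=conf)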

Upvotes: 1
