Reputation: 351
I launched with the command
IPYTHON=1 MASTER=local[4] pyspark
Spark greets me with
Welcome to Spark, version 1.2.1
SparkContext available as sc.
But using sc, I am not able to find out how much memory it has. How can I find this out, and, if possible, how can I set it to another value as well?
Upvotes: 12
Views: 13223
Reputation: 3615
@karlson's answer is great. You could also use the Spark UI: by default it runs at http://localhost:4040, and its Environment tab lists the configured Spark properties.
Upvotes: 0
Reputation: 5433
You can query the configuration of the SparkContext like so:
sc._conf.get('spark.executor.memory')
or, if you're interested in the driver's memory:
sc._conf.get('spark.driver.memory')
The complete configuration can be viewed as a list of tuples with
sc._conf.getAll()
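As for setting the memory to another value: these settings are read when the JVM starts, so you can't change them on a live context. A minimal sketch, assuming it's acceptable to stop the running context and create a new one (the '4g' and '2g' values are just examples; note that spark.driver.memory normally only takes effect if set before the driver JVM launches, e.g. via pyspark --driver-memory 2g, so in local mode the SparkConf setting below may not apply to the driver):

from pyspark import SparkConf, SparkContext

# Stop the context created at shell startup.
sc.stop()

# Build a new configuration with the desired memory settings.
conf = (SparkConf()
        .setMaster('local[4]')
        .set('spark.executor.memory', '4g')   # example value
        .set('spark.driver.memory', '2g'))    # usually must be set at launch time

sc = SparkContext(conf=conf)

# Verify the setting took effect.
print(sc._conf.get('spark.executor.memory'))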
Upvotes: 28