user1521607

Reputation: 351

How to find out the amount of memory PySpark has from the IPython interface?

I launched with the command

IPYTHON=1 MASTER=local[4] pyspark

Spark greets me with

Welcome to Spark version 1.2.1
SparkContext available as sc.

But using sc, I am not able to find out how much memory it has. How can I find this out, and, if possible, how can I set it to a different value?

Upvotes: 12

Views: 13223

Answers (2)

samhita

Reputation: 3615

@karlson's answer is great; you could also use the Spark UI.

  • Start the Spark UI: When you launch your Spark application, a web interface is typically started.
  • Navigate to the "Executors" tab: This tab provides information about the executors running your application, including their memory usage. You can see the total memory allocated to each executor, as well as the memory used by different components like the JVM heap, storage, and other overhead.
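For a local shell like the one in the question, the UI is served on the driver at port 4040 by default, so http://localhost:4040 should work while the application is running.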

Upvotes: 0

karlson

Reputation: 5433

You can query the configuration of the SparkContext like so:

sc._conf.get('spark.executor.memory')

or, if you're interested in the driver's memory:

sc._conf.get('spark.driver.memory')

The complete configuration can be viewed as a list of tuples by

sc._conf.getAll()
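To change a value rather than just read it, one option (a sketch, assuming you are free to restart the context; the '2g' value is just illustrative) is to stop the shell's SparkContext and build a new one with your own SparkConf:

from pyspark import SparkConf, SparkContext

# Stop the context the shell created for us
sc.stop()

# Build a new context with a custom executor memory setting
conf = SparkConf().setMaster('local[4]').set('spark.executor.memory', '2g')
sc = SparkContext(conf=conf)

sc._conf.get('spark.executor.memory')  # '2g'

Note that spark.driver.memory only takes effect when the driver JVM starts, so it cannot be changed from a running shell; you would set it at launch instead (for example with pyspark --driver-memory 2g, or in spark-defaults.conf). In local mode everything runs inside the driver, so the driver memory is usually the setting that matters.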

Upvotes: 28
