pythonic

Reputation: 21625

How can I know programmatically whether my Spark program is running in local or cluster mode?

I want to know at runtime whether the program is being run in local mode (just one node) or on a cluster (for example, yarn-client or yarn-cluster).

Upvotes: 1

Views: 1764

Answers (2)

wbo4958

Reputation: 31

You can directly call

spark.sparkContext.isLocal
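
For example, a minimal sketch (assuming a SparkSession named spark is already in scope):

// Branch on isLocal, which is true when the master is a local[...] URL
if (spark.sparkContext.isLocal) {
  println("Running in local mode")
} else {
  println("Running on a cluster")
}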

Upvotes: 3

user8552640

Reputation: 46

Just use the master property:

spark.sparkContext.master: String
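
A minimal sketch of how this could be used (assuming a SparkSession named spark is already in scope); the master URL is, for example, "local[*]" in local mode and "yarn" when running on YARN:

// Inspect the master URL string to decide which mode we are in
val master = spark.sparkContext.master
if (master.startsWith("local")) {
  println(s"Local mode ($master)")
} else {
  println(s"Cluster mode ($master)")
}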

Upvotes: 3
