Reputation: 131
Suppose I run a PySpark job using a Dataproc workflow template and an ephemeral cluster... How can I get the name of the created cluster from inside my PySpark job?
Upvotes: 2
Views: 177
Reputation: 2158
One way would be to fork out and run this command:
/usr/share/google/get_metadata_value attributes/dataproc-cluster-name
The only output will be the cluster name, without any newline characters or anything else to clean up. See Running shell command and capturing the output
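For example, a minimal sketch of doing that from the PySpark driver with subprocess (the helper function name get_cluster_name is just for illustration; the metadata helper script is present on standard Dataproc images):

```python
import subprocess

def get_cluster_name():
    # Fork out to the Dataproc metadata helper, which prints the
    # ephemeral cluster's name with no trailing newline.
    return subprocess.check_output(
        ["/usr/share/google/get_metadata_value",
         "attributes/dataproc-cluster-name"]
    ).decode("utf-8").strip()

cluster_name = get_cluster_name()
print(cluster_name)
```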
Upvotes: 2