VS_FF

Reputation: 2363

Kill Dataproc job from YARN UI no longer works -- only from Dataproc UI

I used to be able to kill Spark jobs running on Dataproc via the YARN UI's Kill command, which is much faster than killing them from the GCP Dataproc UI. However, I am no longer able to do that -- only the GCP UI works.

Did something change or am I doing something wrong now?

I am using Dataproc image version 1.2 (where this has worked in the past).

Upvotes: 2

Views: 798

Answers (1)

Dagang Wei

Reputation: 26478

To avoid YARN security vulnerabilities, non-GET HTTP methods are now disabled by default. Users can change this (with caution) when creating a cluster, or, for a running cluster, update the config and then restart the Hadoop services. Also, as mentioned in the title of this question, users can kill jobs from the Dataproc UI, which is the recommended approach.
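For context, the YARN UI's Kill button goes through the ResourceManager REST API with a PUT request, which is exactly what the new default blocks. A minimal sketch of the equivalent call and the recommended alternative (the placeholders <master-node>, <application_id>, <job-id>, and <region> are illustrative, not from the original post):

    # This PUT is rejected once only GET,HEAD are allowed on port 8088:
    curl -X PUT -H 'Content-Type: application/json' \
      -d '{"state": "KILLED"}' \
      "http://<master-node>:8088/ws/v1/cluster/apps/<application_id>/state"

    # Recommended: kill the job through Dataproc instead, e.g. from the
    # Dataproc UI or via the CLI:
    gcloud dataproc jobs kill <job-id> --region=<region>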

The yarn-site.xml property yarn.resourcemanager.webapp.methods-allowed now defaults to "GET,HEAD". This change restricts the HTTP methods that can be called on the YARN Resource Manager web UI (default port 8088) and REST APIs to GET and HEAD only, and disables job submission and modification via the YARN REST API.

You can override the default and enable specific HTTP methods on port 8088 by setting the yarn.resourcemanager.webapp.methods-allowed property to one or more comma-separated HTTP method names when you create a cluster. A value of ALL allows all HTTP methods on the port. Example:

    gcloud dataproc clusters create --properties='yarn:yarn.resourcemanager.webapp.methods-allowed=GET,POST,DELETE'

Recommendation: if you set this property to allow non-default HTTP methods, make sure to configure firewall rules and other security settings to restrict access to port 8088 (see Cluster Web Interfaces → Avoid Security Vulnerabilities).
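For a cluster that is already running, a rough sketch of the update-and-restart approach (this assumes SSH access to the master node, the default Dataproc config path, and the standard service name, all of which may vary by image version):

    # 1. On the master node, add the property inside <configuration>
    #    in /etc/hadoop/conf/yarn-site.xml:
    #      <property>
    #        <name>yarn.resourcemanager.webapp.methods-allowed</name>
    #        <value>GET,POST,PUT,DELETE</value>
    #      </property>
    # 2. Restart the ResourceManager so the change takes effect:
    sudo systemctl restart hadoop-yarn-resourcemanager

Note that PUT is the method the YARN UI kill action relies on, so it must be in the allowed list if you want that button to work again.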

See the release notes for more details.

Upvotes: 2
