Kalin Borisov

Reputation: 1120

How to disable logs for a whole cluster in GCE

Is it possible to disable logging for all servers inside an already created (YARN/Hadoop) cluster?

I can't find anything like this. Is there anything in Dataproc or Compute Engine that can help me disable the logs?

Upvotes: 1

Views: 1429

Answers (3)

Kalin Borisov

Reputation: 1120

With a little help and a suggestion from Google support, there is a complete solution to skip logging for the whole YARN/Hadoop cluster. This is only possible when creating a new cluster in Dataproc, either through the Google Cloud console page or through the command line. The property that needs to be set in the cluster properties field is dataproc:dataproc.logging.stackdriver.enable, set to false.

(screenshot: Dataproc cluster properties field)

More info at: https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/cluster-properties

If you create the cluster through the command line, you can refer to https://cloud.google.com/sdk/gcloud/reference/dataproc/clusters/create#--properties and use a command like:

 gcloud dataproc clusters create CLUSTER_NAME --properties 'dataproc:dataproc.logging.stackdriver.enable=false'

Upvotes: 2

Wojtek_B

Reputation: 4443

You can create a resource-based exclusion in Stackdriver - select the Dataproc cluster you want and it will stop collecting any logs from it, so you won't be billed for them.

Go to the Logs Ingestion page, select Exclusions and click the blue "Create exclusion" button.

As the resource type, select "Cloud Dataproc Cluster" > your_cluster_name > All cluster_uuid. Also select "No limit" for the time frame.

Fill in the "Name" field on the right and click the blue "Create Exclusion" button.

You can create up to 50 exclusion queries in Stackdriver.
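For reference, the exclusion the UI builds corresponds to a log filter along these lines (a sketch; your_cluster_name is a placeholder for the cluster selected above):

 resource.type="cloud_dataproc_cluster"
 resource.labels.cluster_name="your_cluster_name"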

Upvotes: 2

Yuri Grinshteyn

Reputation: 727

One easy way would be to create an exclusion in Stackdriver Logging that would prevent logs from that cluster from being ingested into Stackdriver.

Upvotes: 2
