gona

Reputation: 33

Enable Databricks Cluster logs via global init script

I want to set up cluster log delivery for all clusters (new or existing) in my workspace via a global init script.

I tried to add the underlying Spark properties via a custom Spark conf file, /databricks/driver/conf/00-custom-ud-spark.conf (see below), but this is not working. Maybe these properties are set for display purposes only.

cat <<EOF >/databricks/driver/conf/00-custom-ud-spark.conf
[driver] {
    "spark.databricks.clusterUsageTags.clusterLogDeliveryEnabled" = "true"
    "spark.databricks.clusterUsageTags.clusterLogDestination" = "dbfs:/cluster-logs"
}
EOF

Any pointers to get this working?

Upvotes: 2

Views: 680

Answers (1)

gona

Reputation: 33

Reached out to the Databricks team and they said this is not possible. Even the init script logs are available only if cluster logging is enabled first, so enabling it at a later stage has no effect.
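Since the init-script route doesn't work, log delivery has to be set in the cluster specification itself, for example through the `cluster_log_conf` field of the Databricks Clusters REST API. Below is a minimal sketch that just builds the relevant JSON fragment; the cluster ID is a placeholder, and a real `clusters/edit` call would need the cluster's full attribute set alongside this field:

```python
import json

def cluster_log_payload(cluster_id: str, destination: str = "dbfs:/cluster-logs") -> str:
    """Build the `cluster_log_conf` fragment of a Clusters API payload
    that enables cluster log delivery to a DBFS destination.

    Note: `clusters/edit` replaces the whole cluster spec, so in practice
    this fragment must be merged into the cluster's existing attributes
    (spark_version, node_type_id, etc.) before sending.
    """
    payload = {
        "cluster_id": cluster_id,  # placeholder cluster ID
        "cluster_log_conf": {
            "dbfs": {"destination": destination}
        },
    }
    return json.dumps(payload)

# The resulting JSON would be POSTed to
# https://<workspace-url>/api/2.0/clusters/edit with a bearer token.
print(cluster_log_payload("1234-567890-abcde123"))
```

This has to be repeated per cluster (or baked into a cluster policy), which is consistent with the answer above: there is no workspace-wide switch that an init script can flip.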

Upvotes: 0
