Reputation: 348
I recently started using Stackdriver Logging on my Kubernetes cluster. The services are logging JSON payloads. In Stackdriver Logging I see the JSON payload parsed correctly, but every entry has severity "ERROR". This is not intended; most of these logs aren't errors, and they don't contain error fields or anything similar. Is there a way to tell Stackdriver how to determine the severity of a log entry received from the logging agent in Kubernetes? Or do I need to modify my structured log output in some way to make Stackdriver understand it better?
Thanks in advance.
Upvotes: 15
Views: 5042
Reputation: 9562
You do not need to hard-code a key-value pair into the jsonPayload. You can just pass the severity parameter to the google-cloud-logging function log_struct(), for example in Python:
logger.log_struct(json_for_gcp_lbm, labels={"cloud_function_name": environ['LOG_FILE']}, severity=200)
200 stands for "INFO", and you can also pass the string "INFO" instead. The severity is then written on the log entry and is available in the Logs Explorer for filtering.
The same applies to log_text().
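For context, here is a minimal hedged sketch of that call with the google-cloud-logging Python client; the log name, payload, and label values are placeholders, not anything from the question. The numeric severities follow the Cloud Logging LogSeverity enum, where 200 maps to INFO.

```python
# Cloud Logging severity numbers and their names (per the LogSeverity enum).
SEVERITY_NAMES = {
    0: "DEFAULT", 100: "DEBUG", 200: "INFO", 300: "NOTICE",
    400: "WARNING", 500: "ERROR", 600: "CRITICAL",
}

def write_entry(payload: dict) -> None:
    # Requires the google-cloud-logging package and ambient GCP credentials.
    from google.cloud import logging as gcp_logging

    client = gcp_logging.Client()
    logger = client.logger("my-log")  # placeholder log name
    # severity accepts either the name ("INFO") or the numeric value (200).
    logger.log_struct(payload, severity="INFO",
                      labels={"cloud_function_name": "my-function"})
```

Passing the name ("INFO") and the number (200) are equivalent; the entry ends up with the same severity either way.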
Upvotes: 0
Reputation: 7717
A simple example of a payload: as of August 2018, if you log the string
{"msg" : "starting", "severity" : "INFO"}
Stackdriver will show
{
...
jsonPayload: { msg: "starting" },
severity: "INFO"
}
and the resulting severity will be INFO (with the blue icon).
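To produce exactly that kind of line from a container, it is enough to write one JSON object per line to stdout. A minimal sketch (the helper names are my own, not from any library):

```python
import json
import sys

def format_entry(msg: str, severity: str = "INFO") -> str:
    # One JSON object per line; the logging agent lifts "severity" out of
    # the payload and keeps the remaining keys in jsonPayload.
    return json.dumps({"msg": msg, "severity": severity})

def log(msg: str, severity: str = "INFO") -> None:
    sys.stdout.write(format_entry(msg, severity) + "\n")

log("starting")  # prints {"msg": "starting", "severity": "INFO"}
```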
Upvotes: 7
Reputation: 727
If you put a severity field into your JSON record, the Stackdriver logging agent should turn that into the entry's severity. Otherwise it hard-codes ERROR for stderr and INFO for stdout (for Kubernetes logs).
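Building on that, here is a hedged sketch of a formatter for Python's standard logging module that emits such JSON lines to stdout, so each record carries its own severity instead of inheriting ERROR/INFO from the stream it was written to. The class and its level mapping are my own illustration, not part of any Stackdriver library.

```python
import json
import logging
import sys

class StackdriverJsonFormatter(logging.Formatter):
    """Format each record as a JSON line with a Stackdriver severity field."""

    # Python level names happen to match Cloud Logging severity names here.
    LEVEL_TO_SEVERITY = {"DEBUG": "DEBUG", "INFO": "INFO", "WARNING": "WARNING",
                         "ERROR": "ERROR", "CRITICAL": "CRITICAL"}

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "msg": record.getMessage(),
            "severity": self.LEVEL_TO_SEVERITY.get(record.levelname, "DEFAULT"),
        })

# Log to stdout explicitly; the default StreamHandler writes to stderr,
# which the agent would otherwise tag as ERROR.
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(StackdriverJsonFormatter())
logger = logging.getLogger("app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.warning("disk almost full")  # emits {"msg": "disk almost full", "severity": "WARNING"}
```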
Upvotes: 10