Reputation: 41
Per the Google docs (https://cloud.google.com/logging/docs/export/using_exported_logs#log_entries_in_google_bigquery), I have set up my GCP App Engine app to auto-export to BigQuery. However, I am running Node.js using bunyan, so my logs are in JSON format. I'd like to take advantage of the Cloud Logging "structPayload" LogEntry, but the auto-export seems to dump everything into a "textPayload". Is there any way to configure this?
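For reference, bunyan writes one JSON object per line to its stream. A minimal sketch of the record shape (field names follow bunyan's core fields; the values here are illustrative):

```javascript
// Sketch of a single bunyan log line: one JSON object per line.
// Core bunyan fields: v, level, name, hostname, pid, time, msg.
const record = {
  v: 0,
  level: 30, // bunyan's numeric INFO level
  name: "my-app",
  hostname: "gae-instance", // illustrative
  pid: process.pid,
  time: new Date().toISOString(),
  msg: "request handled",
};
console.log(JSON.stringify(record));
```

It is these per-line JSON fields that would ideally map onto structPayload rather than being flattened into a single textPayload string.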
Upvotes: 1
Views: 577
Reputation: 96
I'm one of the engineers working on Cloud Logging. We haven't yet announced the structured logging feature, and documentation will be available when we do, but the functionality is present in the cloud logging plugin and can be used.
In your case, edit the configuration file that captures your logs (under /etc/google-fluentd/config.d/), set 'format json', and then run 'service google-fluentd reload'. You should then see your logs ingested as structPayload, with each JSON field becoming a column in BigQuery.
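The change amounts to one line in the tail source block. A sketch of what the edited config might look like (the path, pos_file, and tag values are illustrative; only 'format json' is the actual change):

```
<source>
  type tail
  format json                          # parse each line as JSON instead of plain text
  path /var/log/app/app.log            # illustrative: path to your bunyan log file
  pos_file /var/lib/google-fluentd/pos/app.pos  # illustrative position file
  tag app-log                          # illustrative tag
</source>
```

After saving, reload the agent with 'service google-fluentd reload' so the new format takes effect.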
See the tail input plugin documentation for more details on the configuration options: http://docs.fluentd.org/articles/in_tail
Upvotes: 1