Reputation: 969
In Google App Engine, first generation, logs are grouped automatically by request in Logs Viewer, and in the second generation it's easy enough to set up.
In background Cloud Functions I can't find any way of doing it (short of manually filtering by executionId
in Logs Viewer). From various articles around the web I've read that the key is to set the trace
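(For reference, the manual filtering I mean looks roughly like this in Logs Viewer; the execution ID value here is just a placeholder:)

```
resource.type="cloud_function"
resource.labels.function_name="group-logs"
labels.execution_id="EXECUTION_ID"
```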
argument to the Trace ID when calling the Stackdriver Logging API, and that in HTTP contexts this ID can be found in the X-Cloud-Trace-Context
header.
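(For context, in an HTTP-triggered function the Trace ID is the first segment of that header, whose format is `TRACE_ID/SPAN_ID;o=OPTIONS`. A minimal sketch of splitting it off; `parse_trace_id` is just a helper name of mine, not part of any Google library:)

```python
def parse_trace_id(header_value):
    """Return the TRACE_ID portion of an X-Cloud-Trace-Context header value.

    The header has the form "TRACE_ID/SPAN_ID;o=OPTIONS".
    Returns None if the header is missing or empty.
    """
    return header_value.split('/')[0] if header_value else None
```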
There are no headers in background contexts (for example, functions triggered by Pub/Sub or Storage). I've tried setting the trace argument to an arbitrary value, such as the event_id
from the function context, but no grouping happens.
Here's a minified representation of how I've tried it:
from google.cloud.logging.resource import Resource
import google.cloud.logging

log_name = 'cloudfunctions.googleapis.com%2Fcloud-functions'
cloud_client = google.cloud.logging.Client()
cloud_logger = cloud_client.logger(log_name)

request_id = None


def log(message):
    labels = {
        'project_id': 'settle-leif',
        'function_name': 'group-logs',
        'region': 'europe-west1',
    }
    resource = Resource(type='cloud_function', labels=labels)
    trace_id = f'projects/settle-leif/traces/{request_id}'
    cloud_logger.log_text(message, trace=trace_id, resource=resource)


def main(_data, context):
    global request_id
    request_id = context.event_id
    log('First message')
    log('Second message')
Upvotes: 4
Views: 1493
Reputation: 21580
This isn't currently possible.
It's on our roadmap to provide this support: https://github.com/GoogleCloudPlatform/functions-framework-python/issues/79
Upvotes: 3