beQr

Reputation: 523

Google Cloud Functions Python Logging issue

I'm not sure how to put this, but I feel like something was changed under the hood by Google without me knowing about it. I used to get the logs from my Python Cloud Functions in the Google Cloud Console logging dashboard. And now, it just stopped working.

So after investigating for a while, I made a minimal logging "hello world" Python Cloud Function:

import logging

def cf_endpoint(req):
    logging.debug('log debug')
    logging.info('log info')
    logging.warning('log warning')
    logging.error('log error')
    logging.critical('log critical')
    return 'ok'

This is my main.py, which I deploy as a Cloud Function with an HTTP trigger.

Since I had a log ingestion exclusion filter on all "debug"-level logs, I wasn't seeing anything in the logging dashboard. But when I removed it, I discovered this:

logging dashboard screenshot

So it seems like whatever was parsing the Python built-in log records into Stackdriver has stopped picking up the log severity! I'm sorry if I look stupid, but that's the only explanation I can think of :/

Do you have any explanation or solution for this? Am I doing it the wrong way?

Thank you in advance for your help.

UPDATE 2022/01:

The output now looks, for example, like this:

[INFO]: Connecting to DB ... 

And the drop-down menu for the severity looks like:

screenshot of the severity drop-down menu

With "Default" as the filter that is needed to show the Python logging logs, which means to show just any log available, and all of the Python logs are under "Default", the severity is still dropped.

Upvotes: 50

Views: 29208

Answers (7)

Ferroao

Reputation: 3033

Based on the official documentation: https://cloud.google.com/run/docs/logging

import json
import logging
import google.cloud.logging
from flask import jsonify

log_client = google.cloud.logging.Client()
log_client.setup_logging()

def main_func(request):
    # Build a structured log entry (note: no trace info is actually attached here)
    entry = dict(
        severity="NOTICE",
        message="This is a log message with trace info visible in gcp!",
        component="main_func"
    )

    logging.info(json.dumps(entry))

    return jsonify({"message": "not in gcp logs only local!"})

Deploy:

gcloud functions deploy main_func --runtime=python312 --trigger-http --source=. --region=$LOCATION

After triggering the function, check the logs:

gcloud logging read 'resource.type="cloud_run_revision" AND resource.labels.service_name="main-func" AND logName="projects/[project-name]/logs/run.googleapis.com%2Fstdout"' --limit=1
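
The entry above mentions trace info but never attaches any. Here is a minimal sketch of how that could be done, following the structured-logging-to-stdout pattern in the linked Cloud Run docs; the X-Cloud-Trace-Context header is standard, but the function name main_func_with_trace is hypothetical and the GOOGLE_CLOUD_PROJECT environment variable is an assumption about your runtime:

import json
import os

def main_func_with_trace(request):
    # Hypothetical variant of main_func: attach the trace field so the
    # entry can be correlated with its originating request in Cloud Logging.
    project = os.environ.get("GOOGLE_CLOUD_PROJECT", "")  # assumed to be set by the runtime
    trace_header = request.headers.get("X-Cloud-Trace-Context", "")

    entry = {
        "severity": "NOTICE",
        "message": "This is a log message with trace info visible in gcp!",
        "component": "main_func_with_trace",
    }
    if project and trace_header:
        trace_id = trace_header.split("/")[0]
        # Special key that Cloud Logging maps to the entry's trace field
        entry["logging.googleapis.com/trace"] = f"projects/{project}/traces/{trace_id}"

    # Structured logs written to stdout are parsed by the logging agent
    print(json.dumps(entry))
    return "ok"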

Upvotes: 0

Sander van den Oord

Reputation: 12808

To use the standard Python logging module on GCP (tested on Python 3.9), you can do the following:

import google.cloud.logging
logging_client = google.cloud.logging.Client()
logging_client.setup_logging()

import logging
logging.warning("A warning")

See also: https://cloud.google.com/logging/docs/setup/python

Upvotes: 2

Sander van den Oord

Reputation: 12808

I use a very simple custom logging function to log to Cloud Logging:

import json

def cloud_logging(severity, message):
    print(json.dumps({"severity": severity, "message": message}))

cloud_logging(severity="INFO", message="Your logging message")
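
A possible extension of that helper (my own variation, not from the answer): any extra fields in the printed JSON object end up in the entry's jsonPayload, so the function can simply pass them through:

import json

def cloud_logging(severity, message, **extra):
    # Extra keyword arguments show up as additional jsonPayload fields
    print(json.dumps({"severity": severity, "message": message, **extra}))

cloud_logging(severity="INFO", message="Your logging message", component="db", attempt=3)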

Upvotes: -1

dtheodor

Reputation: 5060

Not wanting to deal with the Cloud Logging libraries, I created a custom Formatter that emits a structured log with the fields Cloud Logging expects.

import json
import logging

class CloudLoggingFormatter(logging.Formatter):
    """Produces messages compatible with google cloud logging"""
    def format(self, record: logging.LogRecord) -> str:
        s = super().format(record)
        return json.dumps(
            {
                "message": s,
                "severity": record.levelname,
                "timestamp": {"seconds": int(record.created), "nanos": 0},
            }
        )

Attaching this handler to a logger results in logs being parsed and shown properly in the logging console. In cloud functions I would configure the root logger to send to stdout and attach the formatter to it.

import sys

# Set up logging: send everything to stdout through the structured formatter
root = logging.getLogger()
handler = logging.StreamHandler(sys.stdout)
formatter = CloudLoggingFormatter(fmt="[%(name)s] %(message)s")
handler.setFormatter(formatter)
root.addHandler(handler)
root.setLevel(logging.DEBUG)
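
With the root logger configured like that, ordinary logging calls should arrive in Cloud Logging with the matching severity; a quick sanity check might look like this (the module name "my_module" is just an example):

import logging

logger = logging.getLogger("my_module")  # any module-level logger inherits the root handler
logger.debug("a debug message")       # shown with DEBUG severity
logger.error("something went wrong")  # shown with ERROR severity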

Upvotes: 16

José Cols

Reputation: 111

From the Python 3.8 runtime onwards, you can simply print a JSON structure with severity and message properties. For example:

import json

print(
    json.dumps(
        dict(
            severity="ERROR",
            message="This is an error message",
            custom_property="I will appear inside the log's jsonPayload field",
        )
    )
)

Official documentation: https://cloud.google.com/functions/docs/monitoring/logging#writing_structured_logs

Upvotes: 11

Hui Zheng

Reputation: 3077

I encountered the same issue.

In the link that @Joan Grau shared, I also see there is a way to integrate the Cloud Logging client with the Python logging module, so that you can use the Python root logger as usual, and all logs will be sent to Stackdriver Logging.

https://googleapis.github.io/google-cloud-python/latest/logging/usage.html#integration-with-python-logging-module

...

I tried it and it works. In short, you can do it in two ways.

One simple way is to bind the Cloud Logging handler to the root logger:

from google.cloud import logging as cloudlogging
import logging

lg_client = cloudlogging.Client()

# Attach the handler to the root Python logger, so that for example a plain
# logging.warn call would be sent to Stackdriver Logging, as well as any
# other loggers created.
lg_client.setup_logging(log_level=logging.INFO)

Alternatively, you can set up a logger with more fine-grained control:

from google.cloud import logging as cloudlogging
import logging
lg_client = cloudlogging.Client()

lg_handler = lg_client.get_default_handler()
cloud_logger = logging.getLogger("cloudLogger")
cloud_logger.setLevel(logging.INFO)
cloud_logger.addHandler(lg_handler)
cloud_logger.info("test out logger carrying normal news")
cloud_logger.error("test out logger carrying bad news")

Upvotes: 20

Joan Grau Noël

Reputation: 3192

Stackdriver Logging severity filters are no longer supported when using the Python native logging module.

However, you can still create logs with a certain severity by using the Stackdriver Logging client libraries. Check this documentation for the Python libraries, and this one for some use-case examples.

Notice that in order to place the logs under the correct resource, you will have to configure them manually; see this list for the supported resource types. Also, each resource type has some required labels that need to be present in the log structure.

As an example, the following code will write a log to the Cloud Function resource, in Stackdriver Logging, with an ERROR severity:

from google.cloud import logging
from google.cloud.logging.resource import Resource

log_client = logging.Client()

# This is the resource type of the log
log_name = 'cloudfunctions.googleapis.com%2Fcloud-functions'

# Inside the resource, nest the required labels specific to the resource type
res = Resource(type="cloud_function",
               labels={
                   "function_name": "YOUR-CLOUD-FUNCTION-NAME",
                   "region": "YOUR-FUNCTION-LOCATION"
               },
              )


def cf_endpoint(request):
    logger = log_client.logger(log_name)
    logger.log_struct(
        {"message": "message string to log"}, resource=res, severity='ERROR')

    return 'Wrote logs to {}.'.format(logger.name)  # Return cloud function response

Notice that the strings YOUR-CLOUD-FUNCTION-NAME and YOUR-FUNCTION-LOCATION need to be replaced with values specific to your function and project.

Upvotes: 32
