Devesh Poojari

Reputation: 93

Log to Cloud Logging with correct severity from Cloud Run Job and package used in the job

What we are trying:
We are trying to run a Cloud Run Job that does some computation and also uses one of our custom packages for that computation. The Cloud Run Job uses google-cloud-logging and Python's default logging package as described here. The custom Python package also logs its data (only a logger is defined, as suggested here).
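For reference, "only a logger is defined" means the package follows the standard library-logging guidance: it defines a module-level logger, attaches only a NullHandler, and leaves handler configuration to the application. A hypothetical sketch (the package and function names come from the question; the body is illustrative):

```python
import logging

# our_package: define a logger but no real handlers; the application
# that imports the package decides where records go.
logger = logging.getLogger("our_package")
logger.addHandler(logging.NullHandler())

def do_something():
    logger.info("doing some computation")
    return "computation result"
```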

Simple illustration:

from google.cloud import logging as gcp_logging
import logging
import os
import google.auth
from our_package import do_something

def log_test_function():
    SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
    credentials, project_id = google.auth.default(scopes=SCOPES)
    try:
        function_logger_name = os.getenv("FUNCTION_LOGGER_NAME")

        logging_client = gcp_logging.Client(credentials=credentials, project=project_id)
        logging.basicConfig()
        logger = logging.getLogger(function_logger_name)
        logger.setLevel(logging.INFO)
        logging_client.setup_logging(log_level=logging.INFO)

        logger.critical("Critical Log TEST")
        logger.error("Error Log TEST")
        logger.info("Info Log TEST")
        logger.debug("Debug Log TEST")

        result = do_something()
        logger.info(result)
    except Exception as e:
        print(e)    # just to test how print works

    return "Returned"

if __name__ == "__main__":
    result = log_test_function()
    print(result)

Cloud Run Job Logs

The Blue Box indicates logs from custom package
The Black Box indicates logs from Cloud Run Job
Cloud Logging is not able to identify the severity of these logs; it parses every log entry at the default level.

But if I run the same code in a Cloud Function, it works as expected (i.e. the severity of logs from the Cloud Function and the custom package is respected), as shown in the image below.

Cloud Function Logs


Both are serverless architectures, so why does it work in Cloud Functions but not in Cloud Run?

What we want to do:
We want to log every message from Cloud Run Job and custom package to Cloud Logging with correct severity.
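For context: independent of the client library, Cloud Run's logging agent also parses JSON lines written to stdout and maps a `severity` field to the entry's log level, so structured stdout logging is a common workaround. A minimal sketch (the helper name is ours, not from any library):

```python
import json

def log_structured(severity, message):
    # Emit one JSON object per line on stdout; Cloud Run's logging
    # agent maps the "severity" field to the entry's log level.
    line = json.dumps({"severity": severity, "message": message})
    print(line)
    return line

log_structured("WARNING", "Warning Log TEST")
```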

We would appreciate your help!


Edit 1
Following the Google Cloud Python library committers' solution almost solved the problem. Here is the modified code.

from google.cloud import logging as gcp_logging
import logging
import os
import google.auth
from our_package import do_something
from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging_v2.handlers import setup_logging
from google.cloud.logging_v2.resource import Resource
from google.cloud.logging_v2.handlers._monitored_resources import retrieve_metadata_server, _REGION_ID, _PROJECT_NAME

def log_test_function():
    SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
    region = retrieve_metadata_server(_REGION_ID)
    project = retrieve_metadata_server(_PROJECT_NAME)
    try:
        function_logger_name = os.getenv("FUNCTION_LOGGER_NAME")

        # build a manual resource object
        cr_job_resource = Resource(
            type="cloud_run_job",
            labels={
                "job_name": os.environ.get('CLOUD_RUN_JOB', 'unknownJobId'),
                "location": region.split("/")[-1] if region else "",
                "project_id": project
            }
        )

        logging_client = gcp_logging.Client()
        gcloud_logging_handler = CloudLoggingHandler(logging_client, resource=cr_job_resource)
        setup_logging(gcloud_logging_handler, log_level=logging.INFO)

        logging.basicConfig()
        logger = logging.getLogger(function_logger_name)
        logger.setLevel(logging.INFO)

        logger.critical("Critical Log TEST")
        logger.error("Error Log TEST")
        logger.warning("Warning Log TEST")
        logger.info("Info Log TEST")
        logger.debug("Debug Log TEST")

        result = do_something()
        logger.info(result)
    except Exception as e:
        print(e)    # just to test how print works

    return "Returned"

if __name__ == "__main__":
    result = log_test_function()
    print(result)

Now every log is logged twice: one severity-aware entry, and one severity-unaware entry at the "default" level, as shown below. Logs

Upvotes: 0

Views: 3188

Answers (2)

Dokook Choe

Reputation: 304

You might want to get rid of the handlers you don't need. Take a look at https://stackoverflow.com/a/61602361/13161301
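Applied to the question, the duplicates likely come from `logging.basicConfig()` attaching a plain `StreamHandler` to the root logger alongside the `CloudLoggingHandler`, so each record goes both to Cloud Logging (with severity) and to stderr (parsed at the default level). A stdlib-only sketch of stripping the plain handler while keeping the others:

```python
import logging

# Simulate the question's setup: basicConfig() attaches a plain
# StreamHandler to the root logger (in the real job, setup_logging()
# additionally attaches the CloudLoggingHandler).
logging.basicConfig()

# Remove only the plain StreamHandler(s), leaving any other handlers
# (such as the CloudLoggingHandler) intact, so each record is
# emitted exactly once.
root = logging.getLogger()
for handler in list(root.handlers):
    if type(handler) is logging.StreamHandler:
        root.removeHandler(handler)
```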

Upvotes: 0

guillaume blaquiere

Reputation: 75940

Cloud Functions takes your code, wraps it in a web server, builds a container, and deploys it.

With Cloud Run, you only build and deploy the container.

That means the Cloud Functions web server wrapper does something more than you do: it correctly initializes the logger in Python.

Have a look at that doc page; it should solve your issue.


EDIT 1

I took the exact example and added it to my Flask server like this:

import os
from flask import Flask

app = Flask(__name__)

import google.cloud.logging

# Instantiates a client
client = google.cloud.logging.Client()

# Retrieves a Cloud Logging handler based on the environment
# you're running in and integrates the handler with the
# Python logging module. By default this captures all logs
# at INFO level and higher
client.setup_logging()
# [END logging_handler_setup]

# [START logging_handler_usage]
# Imports Python standard library logging
import logging

@app.route('/')
def call_function():
    # The data to log
    text = "Hello, world!"

    # Emits the data using the standard logging module
    logging.warning(text)
    # [END logging_handler_usage]

    print("Logged: {}".format(text))
    return text


# For local execution
if __name__ == "__main__":
    app.run(host='0.0.0.0',port=int(os.environ.get('PORT',8080)))

A rough copy of that sample.

And the result is correct: a "Logged:" entry at the default level (the print), and my "Hello, world!" at warning severity (the logging.warning).



EDIT 2

Thanks to the help of the Google Cloud Python library committers, I got a solution to my issue while waiting for native integration in the library.

Here is my new code, for Cloud Run Jobs this time:

import google.cloud.logging
from google.cloud.logging.handlers import CloudLoggingHandler
from google.cloud.logging_v2.handlers import setup_logging
from google.cloud.logging_v2.resource import Resource
from google.cloud.logging_v2.handlers._monitored_resources import retrieve_metadata_server, _REGION_ID, _PROJECT_NAME
import os

# find metadata about the execution environment
region = retrieve_metadata_server(_REGION_ID)
project = retrieve_metadata_server(_PROJECT_NAME)

# build a manual resource object
cr_job_resource = Resource(
    type = "cloud_run_job",
    labels = {
        "job_name": os.environ.get('CLOUD_RUN_JOB', 'unknownJobId'),
        "location":  region.split("/")[-1] if region else "",
        "project_id": project
    }
)

# configure handling using CloudLoggingHandler with custom resource
client = google.cloud.logging.Client()
handler = CloudLoggingHandler(client, resource=cr_job_resource)
setup_logging(handler)

import logging

def call_function():
    # The data to log
    text = "Hello, world!"

    # Emits the data using the standard logging module
    logging.warning(text)
    # [END logging_handler_usage]

    print("Logged: {}".format(text))
    return text


# For local execution
if __name__ == "__main__":
    call_function()

And the result works great:

  • The Logged: .... entry is the simple print to stdout, at the default level
  • The warning entry appears with the expected severity

Upvotes: 0
