GZZ

Reputation: 708

Python logging to Azure

I am using Python and was wondering whether there is any package or simple way to log directly to Azure?

I found a package (azure-storage-logging) that would be really nice; however, it is no longer maintained and is not compatible with the new Azure API.

Any help is welcome.

Upvotes: 6

Views: 14407

Answers (3)

Hesam Eskandari

Reputation: 96

You can just create your own handler. Here is how to log to an Azure table; storing in a blob would be similar. The biggest benefit is that each record is emitted as it is logged, rather than all logs being sent at the end of the process.

Create a table in Azure Table Storage, then run the following code.

import logging
from datetime import datetime
from logging import Logger, getLogger

from azure.core.credentials import AzureSasCredential
from azure.data.tables import TableClient, TableEntity


class _AzureTableHandler(logging.Handler):

    def __init__(self, *args, **kwargs):
        super(_AzureTableHandler, self).__init__(*args, **kwargs)
        credentials: AzureSasCredential = AzureSasCredential(signature="<sas-token>")
        self._table_client: TableClient = TableClient(endpoint="<storage-account-url>", table_name="<table-name>", credential=credentials)

    def emit(self, record):
        # Each record is written to the table immediately as one entity.
        level = record.levelname
        message = record.getMessage()
        self._table_client.create_entity(TableEntity({'Severity': level,
                                                      'Message': message,
                                                      'PartitionKey': f'{datetime.now().date()}',
                                                      'RowKey': f'{datetime.now().microsecond}'}))


if __name__ == "__main__":
    logger: Logger = getLogger(__name__)
    logger.addHandler(_AzureTableHandler())
    logger.warning('testing azure logging')

This approach also lets you create custom columns for your table. For example, you can have separate columns for the name of the project that is logging, or the username of the dev running the script.

logger.addHandler(_AzureTableHandler(Username="Hesam", Project="Deployment-support-tools-client-portal"))

Make sure to add your custom column names to the entity dictionary, or put the project name in the partition key. The constructor changes this requires are sketched below.
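Note that the handler as defined above passes unknown keyword arguments straight to logging.Handler, which would reject Username and Project. A minimal sketch of the constructor and emit changes that call implies (the Username and Project column names are just examples):

class _AzureTableHandler(logging.Handler):

    def __init__(self, level=logging.NOTSET, **extra_columns):
        super(_AzureTableHandler, self).__init__(level)
        # Keyword arguments such as Username or Project become extra columns.
        self._extra_columns = extra_columns
        credentials = AzureSasCredential(signature="<sas-token>")
        self._table_client = TableClient(endpoint="<storage-account-url>",
                                         table_name="<table-name>",
                                         credential=credentials)

    def emit(self, record):
        entity = {'Severity': record.levelname,
                  'Message': record.getMessage(),
                  'PartitionKey': f'{datetime.now().date()}',
                  'RowKey': f'{datetime.now().microsecond}'}
        entity.update(self._extra_columns)  # add the custom columns
        self._table_client.create_entity(TableEntity(entity))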

(Screenshot: the entries as logged in Azure Table Storage.)

Upvotes: 1

nl09

Reputation: 103

I had the same requirement to log error and debug messages for a small application and store the logs in Azure Data Lake. We did not want to use Application Insights, as ours was not a web application and we just needed the logs to debug the code.

To solve this, I first wrote the logs to a local temp.log file:

import logging

logging.basicConfig(filename='temp.log',
                    level=logging.DEBUG,  # capture debug messages as well
                    format='%(asctime)s  %(levelname)-8s [%(filename)s:%(lineno)d] %(message)s',
                    datefmt='%Y-%m-%d:%H:%M:%S')

At the end of the program, I uploaded temp.log to Azure Data Lake using DataLakeFileClient.append_data:

with open("temp.log", 'r') as local_file:
    file_contents = local_file.read()

file_client.append_data(data=file_contents, offset=0, length=len(file_contents))
file_client.flush_data(len(file_contents))
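This snippet assumes a file_client has already been created. A minimal sketch of how one might build it with the azure-storage-file-datalake package (the account URL, credential, filesystem name, and file path are placeholders):

from azure.storage.filedatalake import DataLakeServiceClient

service_client = DataLakeServiceClient(account_url="<storage-account-url>",
                                       credential="<storage-account-key>")
file_system_client = service_client.get_file_system_client("<filesystem-name>")
file_client = file_system_client.get_file_client("temp.log")
file_client.create_file()  # create (or overwrite) the target file before appending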

https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-python

Upvotes: 1

Thiago Custodio

Reputation: 18387

You should use Application Insights, which will send the logs to Azure Monitor (formerly Log Analytics).

https://learn.microsoft.com/en-us/azure/azure-monitor/app/opencensus-python
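A minimal sketch using the opencensus-ext-azure package from that guide (the connection string is a placeholder for your own Application Insights resource):

import logging

from opencensus.ext.azure.log_exporter import AzureLogHandler

logger = logging.getLogger(__name__)
# Every record at WARNING and above is shipped to Application Insights.
logger.addHandler(AzureLogHandler(connection_string='InstrumentationKey=<your-instrumentation-key>'))
logger.warning('testing azure monitor logging')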

Upvotes: 5
