TheRedSeth

Reputation: 77

Mass delete Cloudwatch log groups using Boto3 - delete_log_group

I have a pretty lengthy list of CloudWatch log groups I need to delete... close to a hundred. Since you have to delete them one at a time, I thought a little Python script could help me out, but now I'm stuck.

Here's my script so far...

import boto3
from botocore.exceptions import ClientError
import json

#Connect to AWS using default AWS credentials in awscli config
cwlogs = boto3.client('logs')

loglist = cwlogs.describe_log_groups(
    logGroupNamePrefix='/aws/lambda/staging-east1-'
)

#writes json output to file...
with open('loglist.json', 'w') as outfile:
    json.dump(loglist, outfile, ensure_ascii=False, indent=4,
              sort_keys=True)

#Opens file and searches through to find given loggroup name
with open("loglist.json") as f:
    file_parsed = json.load(f)

for i in file_parsed['logGroups']:
    print(i['logGroupName'])


#   cwlogs.delete_log_group(
#       logGroupName='string'   <--- here is where I'm stuck
#   )

How do I take the value of 'logGroupName' in i, convert it to a string that the delete_log_group command can use, and iterate through it to delete all of the log groups I need gone? I tried using json.loads and it errored out with the following...

Traceback (most recent call last):
  File "CWLogCleaner.py", line 18, in <module>
    file_parsed = json.loads(f)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())

Or am I totally going about this the wrong way?

TIA

Upvotes: 3

Views: 5823

Answers (3)

Vignesh Rao

Reputation: 179

None of the solutions here worked the way I wanted (some due to pagination), so I built my own script. It deletes logs older than 7 days. You can change the timedelta as you like, set it to 0, or remove the deletion-date condition to remove all logs.

from datetime import datetime, timedelta

import boto3

app_name = 'your function name here'


def login():
    # Create a CloudWatch Logs client and paginate over the function's log
    # streams, so more than one page of streams is handled.
    client = boto3.client('logs')
    paginator = client.get_paginator('describe_log_streams')
    response_iterator = paginator.paginate(
        logGroupName=f'/aws/lambda/{app_name}',
    )
    return client, response_iterator


def deletion_date():
    # CloudWatch creationTime values are epoch milliseconds, so pad the
    # 10-digit epoch seconds with zeros to get a 13-digit cutoff.
    tod = datetime.today() - timedelta(days=7)
    epoch_date = str(int(tod.timestamp()))
    selected_date = int(epoch_date.ljust(13, '0'))
    return selected_date


def purger():
    # Walk every page of log streams and delete those created before the cutoff.
    n = 0
    print('Deleting log files..')
    for item in response:
        collection = item['logStreams']
        for collected_value in collection:
            if collected_value['creationTime'] < req_date:
                resp = client_.delete_log_stream(
                    logGroupName=f'/aws/lambda/{app_name}',
                    logStreamName=f"{collected_value['logStreamName']}"
                )
                n = n + 1
                if resp['ResponseMetadata']['HTTPStatusCode'] != 200:
                    print(f"Unable to purge logStream: {collected_value['logStreamName']}")
    return n


if __name__ == '__main__':
    client_, response = login()
    req_date = deletion_date()
    print(f'{purger()} log streams were purged for the function {app_name}')
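
Also, if the only goal is to keep a rolling week of logs, CloudWatch can do the pruning itself via a retention policy, with no script at all. A minimal sketch (the log group name is a placeholder):

import boto3

logs = boto3.client('logs')

# Ask CloudWatch to expire events older than 7 days automatically.
# retentionInDays only accepts a fixed set of values (1, 3, 5, 7, 14, 30, ...).
logs.put_retention_policy(
    logGroupName='/aws/lambda/your-function-name',  # placeholder
    retentionInDays=7
)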

Upvotes: 2

jarmod

Reputation: 78583

Unless you specifically need to save the JSON responses to disk for some other purpose, perhaps you could simply use some variant of this code:

import boto3

def delete_log_streams(prefix=None):
    """Delete CloudWatch Logs log streams with given prefix or all."""
    logs = boto3.client('logs')

    if prefix:
        log_groups = logs.describe_log_groups(logGroupNamePrefix=prefix)
    else:
        log_groups = logs.describe_log_groups()

    for log_group in log_groups['logGroups']:
        log_group_name = log_group['logGroupName']
        print("Delete log group:", log_group_name)

        # Reset stream pagination for each log group
        next_token = None

        while True:
            if next_token:
                log_streams = logs.describe_log_streams(logGroupName=log_group_name,
                                                        nextToken=next_token)
            else:
                log_streams = logs.describe_log_streams(logGroupName=log_group_name)

            next_token = log_streams.get('nextToken', None)

            for stream in log_streams['logStreams']:
                log_stream_name = stream['logStreamName']
                print("Delete log stream:", log_stream_name)
                logs.delete_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
                # delete_log_stream(log_group_name, log_stream_name, logs)

            if not next_token or len(log_streams['logStreams']) == 0:
                break

Upvotes: 6

TheRedSeth

Reputation: 77

Here's what I got working for me. I'm sure this is hacky and I'm no developer, but it worked for me...

import boto3
import json

cwlogs = boto3.client('logs')

loglist = cwlogs.describe_log_groups(
    logGroupNamePrefix='ENTER NAME OF YOUR LOG GROUP HERE'
)

#writes json output to file...
with open('loglist.json', 'w') as outfile:
    json.dump(loglist, outfile, ensure_ascii=False, indent=4,
              sort_keys=True)

#Opens file and searches through to find given loggroup name
with open("loglist.json") as f:
    file_parsed = json.load(f)

for i in file_parsed['logGroups']:
    print(i['logGroupName'])

for i in file_parsed['logGroups']:
    cwlogs.delete_log_group(
        logGroupName=(i['logGroupName'])
    )
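
Note that describe_log_groups returns at most 50 log groups per call, so with close to a hundred groups the snippet above may only see the first page. A rough sketch that pages through all of them and skips the JSON file (the prefix is still a placeholder):

import boto3

cwlogs = boto3.client('logs')
paginator = cwlogs.get_paginator('describe_log_groups')

# Walk every page of matching log groups and delete each one.
for page in paginator.paginate(logGroupNamePrefix='ENTER NAME OF YOUR LOG GROUP HERE'):
    for group in page['logGroups']:
        print(group['logGroupName'])
        cwlogs.delete_log_group(logGroupName=group['logGroupName'])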

Upvotes: 2
