jimiclapton

Reputation: 889

Using multiple Python functions within AWS Lambda script

SITUATION

I'm using a Lambda function that takes a CSV attachment from an incoming email and places it in what is, in effect, a sub-folder of an S3 bucket. This part of the Lambda works well; however, there are other UDFs which I need to execute, within the same Lambda function, to perform subsequent tasks.

CODE

    import boto3
    import urllib.parse

    import email
    import base64
    
    import math
    import pickle
    
    import numpy as np
    import pandas as pd
    
    import io 
    
    
    ###############################
    ###    GET THE ATTACHMENT   ###
    ###############################
    
    #s3 = boto3.client('s3')
    
    
    FILE_MIMETYPE = 'text/csv'
    #'application/octet-stream'
    
    # destination folder
    S3_OUTPUT_BUCKETNAME = 'my_bucket' 
    
    print('Loading function')
    
    s3 = boto3.client('s3')
    
    
    def lambda_handler(event, context):
    
        #source email bucket 
        inBucket = event['Records'][0]['s3']['bucket']['name']
        key = urllib.parse.quote(event['Records'][0]['s3']['object']['key'].encode('utf8'))
    
    
        try:
            response = s3.get_object(Bucket=inBucket, Key=key)
            msg = email.message_from_string(response['Body'].read().decode('utf-8'))   
    
        except Exception as e:
            print(e)
            print('Error retrieving object {} from source bucket {}. Verify existence and ensure bucket is in same region as function.'.format(key, inBucket))
            raise e
        
    
        attachment_list = []
       
    
        try:
            #scan each part of email 
            for message in msg.walk():
                
                # Check filename and email MIME type
                if message.get_content_type() == FILE_MIMETYPE and message.get_filename() is not None:
                    attachment_list.append({
                        'original_msg_key': key,
                        'attachment_filename': message.get_filename(),
                        'body': base64.b64decode(message.get_payload())
                    })
        except Exception as e:
            print(e)
            print ('Error processing email for CSV attachments')
            raise e
        
        # if multiple attachments send all to bucket 
        for attachment in attachment_list:
    
            try:
                s3.put_object(
                    Bucket=S3_OUTPUT_BUCKETNAME,
                    Key='attachments/' + attachment['original_msg_key'] + '-' + attachment['attachment_filename'],
                    Body=attachment['body']
                )
            except Exception as e:
                print(e)
                print ('Error sending object {} to destination bucket {}. Verify existence and ensure bucket is in same region as function.'.format(attachment['attachment_filename'], S3_OUTPUT_BUCKETNAME))
                raise e

    #################################
    ###    ADDITIONAL FUNCTIONS   ###
    #################################

    def my_function():
        print("Hello, this is another function")

OUTCOME

The CSV attachment is successfully retrieved and placed in the destination specified by s3.put_object; however, there is no evidence in the CloudWatch logs that my_function runs.

WHAT I HAVE TRIED

I've tried defining it as def my_function(event, context): in an attempt to ascertain whether the function requires the same signature as the handler in order to be executed. I've also tried to include my_function() as part of the first function, but this does not appear to work either.

How can I ensure that both functions are executed within the Lambda?

Upvotes: 0

Views: 1896

Answers (1)

Marcin

Reputation: 238975

Based on the comments.

The issue was that my_function was never called inside the Lambda handler. A def statement at module level only defines the function; nothing executes it.

The solution is to call my_function() from within lambda_handler so that it actually runs.
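A minimal sketch of the fix (the handler and helper names come from the question; the return value is illustrative, not part of the original code):

```python
def my_function():
    # Module-level definition alone never runs; the handler must call it.
    print("Hello, this is another function")

def lambda_handler(event, context):
    # ... retrieve the email and upload the CSV attachments as before ...
    my_function()  # explicit call, so its output now appears in the CloudWatch logs
    return {"status": "ok"}
```

Anything printed by my_function during the invocation is then captured in the function's CloudWatch log stream along with the rest of the handler's output.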

Upvotes: 2
