Salam

Reputation: 1

Python Lambda Function with MSK kafka Trigger to S3 bucket

I have created a Python Lambda function with an MSK Kafka trigger that writes to an S3 bucket. It is failing with: "errorMessage": "'Records'", "errorType": "KeyError".

    import json
    import boto3
    import base64
    import uuid

    s3 = boto3.client('s3')

    def lambda_handler(event, context):
        for record in event['Records']:
            # Decode the base64-encoded Kafka data
            kinesis_data = base64.b64decode(record['kinesis']['data']).decode('utf-8')
            payload = json.loads(kinesis_data)

            # Generate a unique key for each object
            unique_key = f"{payload['event_type']}/{str(uuid.uuid4())}.json"

            # Process the payload as needed and write it to S3
            s3.put_object(
                Bucket='dev-mos-xcorr-broadcast',
                Key=unique_key,
                Body=json.dumps(payload)
            )

        return {'statusCode': 200, 'body': 'Data written to S3'}

====ERROR==

    {
      "errorMessage": "'Records'",
      "errorType": "KeyError",
      "requestId": "60be31e6-ce4a-4ee7-a3ab-f8aff3f49717",
      "stackTrace": [
        "  File \"/var/task/lambda_function.py\", line 9, in lambda_handler\n    for record in event['Records']:\n"
      ]
    }

I am trying to find the solution.

Upvotes: 0

Views: 152

Answers (1)

donald fossouo

Reputation: 1

Look at the first line of your handler's loop: "for record in event['Records']".

You need to make sure that the incoming event actually has a key called 'Records'; if it does not, your code will fail with exactly this KeyError.

Please let me know if my answer helps, or share the input event by running a simple print(event) before your for loop.
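For reference, an MSK trigger does not deliver a Kinesis-style event: there is no top-level 'Records' list and no record['kinesis'] field. The documented MSK event has a lowercase 'records' dict keyed by "topic-partition", where each record carries its message base64-encoded under 'value'. Below is a minimal sketch of decoding such an event; the topic name 'my-topic' and the payload are made up for illustration, and the s3.put_object call from your function would go where the comment indicates.

    import base64
    import json

    def iter_msk_payloads(event):
        """Yield each Kafka message from an MSK event, base64-decoded and parsed as JSON."""
        # Note: lowercase 'records', and it is a dict of topic-partition -> list
        for records in event.get('records', {}).values():
            for record in records:
                raw = base64.b64decode(record['value']).decode('utf-8')
                yield json.loads(raw)
                # ...here you could build unique_key and call s3.put_object(...)

    # Synthetic event with the MSK shape (hypothetical topic 'my-topic')
    sample_event = {
        "eventSource": "aws:kafka",
        "records": {
            "my-topic-0": [
                {
                    "topic": "my-topic",
                    "partition": 0,
                    "offset": 15,
                    "value": base64.b64encode(
                        json.dumps({"event_type": "demo", "id": 1}).encode()
                    ).decode(),
                }
            ]
        },
    }

    payloads = list(iter_msk_payloads(sample_event))
    print(payloads)  # [{'event_type': 'demo', 'id': 1}]

Running print(event) inside your real function, as suggested above, will show you the exact keys your trigger delivers so you can adapt the loop accordingly.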

Upvotes: 0
