IoT user

Reputation: 1300

Amazon SQS to files in Amazon S3

According to this, if I create a Lambda function to send the data from SQS to S3, each SQS message will be stored in an individual S3 object (I assume this is because the Lambda function is triggered each time SQS receives a message).

Is there any way to send, for example, all the messages that SQS received in the last 24 hours to the same S3 object?

EDIT

This could be the code to receive the messages from the queue and send them to S3:

    var receiveMessageRequest = new ReceiveMessageRequest { QueueUrl = myQueueUrl };
    var receiveMessageResponse = sqs.ReceiveMessageAsync(receiveMessageRequest).GetAwaiter().GetResult();
    while (receiveMessageResponse.Messages != null && receiveMessageResponse.Messages.Count > 0)
    {
        Console.WriteLine("Printing received message.\n");
        foreach (var message in receiveMessageResponse.Messages)
        {
            if (!string.IsNullOrEmpty(message.Body))
            {
                <...> SEND TO S3
            }

            // Delete each message once it has been processed
            Console.WriteLine("Deleting the message.\n");
            var deleteRequest = new DeleteMessageRequest { QueueUrl = myQueueUrl, ReceiptHandle = message.ReceiptHandle };
            sqs.DeleteMessageAsync(deleteRequest).GetAwaiter().GetResult();
        }

        receiveMessageRequest = new ReceiveMessageRequest { QueueUrl = myQueueUrl };
        receiveMessageResponse = sqs.ReceiveMessageAsync(receiveMessageRequest).GetAwaiter().GetResult();
    }

But which would be the best way to send the data to S3? I mean, in S3 I would pay per PUT request, so if I upload element by element this could be quite inefficient.

I also imagine that accumulating the items in memory would not be a good idea either, so I'm not sure what I should use for the best result.

Another question: when I developed locally I used ReceiveMessage, but in the Lambda function I have to use ReceiveMessageAsync. Why is that?

Upvotes: 0

Views: 16054

Answers (2)

Arafat Nalkhande

Reputation: 11718

If you want to send all the messages from the last 24 hours to S3, you can do that with a scheduled Lambda function, with certain limitations as explained below.

First of all, the flow would be that you have a scheduled Lambda that executes every 24 hours. Whenever this Lambda is invoked on its schedule, you read all the available messages from SQS, append them one after the other, and then write the entire string to an S3 object.

This way, all the events that accumulated over the 24-hour window get stored in the same S3 object.
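A minimal sketch of that flow in C# could look like the following. This is an illustration, not production code: the queue URL, bucket name, and object key are placeholders, and error handling is omitted.

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.SQS;
using Amazon.SQS.Model;

public class SqsToS3Batcher
{
    // Placeholder names: substitute your own queue URL and bucket
    private const string QueueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue";
    private const string BucketName = "my-archive-bucket";

    public static async Task Handler()
    {
        var sqs = new AmazonSQSClient();
        var s3 = new AmazonS3Client();
        var buffer = new StringBuilder();

        while (true)
        {
            // Drain the queue, up to 10 messages per receive call
            var response = await sqs.ReceiveMessageAsync(new ReceiveMessageRequest
            {
                QueueUrl = QueueUrl,
                MaxNumberOfMessages = 10
            });

            if (response.Messages == null || response.Messages.Count == 0)
                break;

            foreach (var message in response.Messages)
            {
                buffer.AppendLine(message.Body);
                await sqs.DeleteMessageAsync(QueueUrl, message.ReceiptHandle);
            }
        }

        if (buffer.Length > 0)
        {
            // A single PUT for the whole window instead of one PUT per message
            await s3.PutObjectAsync(new PutObjectRequest
            {
                BucketName = BucketName,
                Key = $"sqs-dump/{DateTime.UtcNow:yyyy-MM-dd}.txt",
                ContentBody = buffer.ToString()
            });
        }
    }
}
```

Note that this also addresses the PUT-request cost concern in the question: there is exactly one PutObject call per scheduled run, regardless of how many messages were read.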

Limitations with this approach

  1. Timeout constraint: if there are too many events, you might exhaust the maximum timeout limit of 15 minutes for AWS Lambda.
  2. Memory constraint: if there are too many events, you might exhaust the maximum memory limit of 1.5 GB for AWS Lambda.

To work around these limitations, you can consider scheduling the Lambda at a higher frequency than every 24 hours (maybe every 12, 6, or 3 hours, or whatever is feasible in your case).

Upvotes: 3

ene_salinas

Reputation: 705

A possible solution is:

You can schedule a CloudWatch event that triggers a Lambda.

This Lambda will read the SQS messages and store them in an S3 bucket (each message represented as its own object).

Check out this article: link
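A sketch of that per-message variant, again with placeholder queue and bucket names and no error handling, could look like:

```csharp
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.SQS;
using Amazon.SQS.Model;

public class SqsToS3PerMessage
{
    // Placeholder names: substitute your own queue URL and bucket
    private const string QueueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue";
    private const string BucketName = "my-archive-bucket";

    public static async Task Handler()
    {
        var sqs = new AmazonSQSClient();
        var s3 = new AmazonS3Client();

        var response = await sqs.ReceiveMessageAsync(new ReceiveMessageRequest
        {
            QueueUrl = QueueUrl,
            MaxNumberOfMessages = 10
        });

        foreach (var message in response.Messages)
        {
            // One S3 object per SQS message, keyed by the message id
            await s3.PutObjectAsync(new PutObjectRequest
            {
                BucketName = BucketName,
                Key = $"messages/{message.MessageId}.json",
                ContentBody = message.Body
            });
            await sqs.DeleteMessageAsync(QueueUrl, message.ReceiptHandle);
        }
    }
}
```

The trade-off versus the batching approach above is cost: this issues one PUT request per message, which is what the question was trying to avoid.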

Upvotes: 0
