Abhishek Singh

Reputation: 426

SQS and AWS Lambda Integration

I am developing an Audit Trail System that will act as a central location for all the critical events happening around the organization. I am planning to use Amazon SQS as a temporary queue to hold the messages, which in turn will trigger an AWS Lambda function to write the messages to Amazon S3. I want to segregate the data at the tenantId level (some identifiable ID) and persist the messages as batches in S3, which will reduce the number of calls from Lambda to S3. Moreover, I want to trigger the Lambda every hour.

However, I have two issues here: first, the maximum batch size provided by SQS is 10; second, the Lambda trigger polls SQS on a regular basis, which will increase the number of calls to S3. I want to accumulate a manual batch of, say, 1,000 messages before calling the S3 API.

I am not sure how to architect my system so that the above requirements can be met. Any help or ideas would be much appreciated!
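For reference, this is a rough sketch of the Lambda I have in mind: it groups the SQS message bodies by tenantId and writes one S3 object per tenant instead of one per message (the bucket name and key layout are placeholders):

```python
import json
from collections import defaultdict

def group_by_tenant(messages):
    # Group SQS message bodies by tenantId so each tenant's events
    # can be written to S3 as one batched object.
    batches = defaultdict(list)
    for msg in messages:
        event = json.loads(msg["body"])  # "body" is the SQS->Lambda field
        batches[event["tenantId"]].append(event)
    return batches

def handler(event, context):
    # event["Records"] is the standard shape of an SQS-triggered
    # Lambda invocation.
    batches = group_by_tenant(event["Records"])
    import boto3  # imported here so the grouping logic is testable offline
    s3 = boto3.client("s3")
    for tenant_id, events in batches.items():
        # One PUT per tenant per invocation instead of one per message.
        s3.put_object(
            Bucket="audit-trail-bucket",  # placeholder bucket name
            Key=f"{tenant_id}/batch.json",  # placeholder key layout
            Body=json.dumps(events).encode(),
        )
```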

Simplified Architecture:

(architecture diagram)

Thanks!

Upvotes: 0

Views: 177

Answers (1)

John Rotenstein

Reputation: 269081

I would recommend that you instead use Amazon Kinesis Data Firehose. It does essentially what you are trying to do:

  • Accepts incoming messages
  • Buffers them for a period of time
  • Writes output to S3 or Elasticsearch

This is all done as a managed service, and it can also integrate with AWS Lambda to provide custom processing (e.g. filtering out certain records).
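A Firehose transformation Lambda receives base64-encoded records and returns each one with a result flag. As a hedged sketch of the filtering idea (the `severity` field is an assumption about your event payloads):

```python
import base64
import json

def handler(event, context):
    # Firehose passes records in event["records"], each with a
    # recordId and base64-encoded data.
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        if payload.get("severity") == "debug":  # assumed field name
            # "Dropped" records are excluded from the S3 output.
            output.append({"recordId": record["recordId"],
                           "result": "Dropped"})
        else:
            output.append({
                "recordId": record["recordId"],
                "result": "Ok",
                "data": base64.b64encode(
                    json.dumps(payload).encode()).decode(),
            })
    return {"records": output}
```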

However, you might have to do something special to segregate the data by tenantId. See: Can I customize partitioning in Kinesis Firehose before delivering to S3?

Upvotes: 3
