Reputation: 1917
I'm working with AWS Lambda functions (in Python) that process new files appearing in the same Amazon S3 bucket and folders.
When a new file appears in s3:/folder1/folderA, B, or C, an s3:ObjectCreated:* event is generated and goes into sqs1, where it is processed by Lambda1 (and then deleted from sqs1 after successful processing).
I need the same event, for the same new file that appears in s3:/folder1/folderA (but not folderB or folderC), to also go into sqs2, to be processed by Lambda2. Lambda1 modifies that file and saves it somewhere; Lambda2 loads that file into a DB, for example.
But the AWS docs say:
Notification configurations that use Filter cannot define filtering rules with overlapping prefixes, overlapping suffixes, or prefix and suffix overlapping.
So the question is: how do I work around this limitation? Are there any known, recommended, or standard solutions?
Upvotes: 2
Views: 2627
Reputation: 19
To send the same S3 event (the folderA event) to both SQS1 and SQS2, send the event to an SNS topic and subscribe both SQS queues (SQS1 and SQS2) to that topic.
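A minimal sketch of the subscription step with boto3. The ARNs are hypothetical placeholders; note that each SQS queue also needs an access policy allowing the SNS topic to deliver to it, which is easy to forget:

```python
import json

# Hypothetical topic ARN -- substitute your own resource.
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:s3-folderA-events"

def sqs_allow_sns_policy(queue_arn, topic_arn):
    """Build the access policy an SQS queue needs so the SNS topic
    is allowed to deliver messages to it."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    })

def subscribe_queues(topic_arn, queue_arns):
    # boto3 is preinstalled in the Lambda/CloudShell runtimes; imported
    # lazily so the policy helper above works without AWS credentials.
    import boto3
    sns = boto3.client("sns")
    for arn in queue_arns:
        sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=arn)
```

Enabling raw message delivery on the subscriptions keeps the SQS message body identical to the original S3 event, so the existing Lambda1 parsing code keeps working unchanged.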
Upvotes: 0
Reputation: 269081
It appears that your requirement is:

- For a file in folderA, you wish to send a message to sqs1 AND sqs2 (can be done in parallel)
- For a file in folderB or folderC, you wish to send a message to sqs1 only

This can be done by configuring a separate event for each folder:

- Prefix = folderA
- Prefix = folderB
- Prefix = folderC

Since each prefix appears in only one notification configuration, the "no overlapping prefixes" restriction is not violated.
You can then use an Amazon SNS topic to fan-out to multiple queues:
eventA -> sns1 +-> sqs1 -> Lambda1
               |
               +-> sqs2 -> Lambda2

eventB -> sqs1 -> Lambda1
eventC -> sqs1 -> Lambda1
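The per-folder rules above can be expressed as one bucket notification configuration, sketched here with boto3. The bucket name, topic/queue ARNs, and the exact `folder1/...` prefixes are assumptions based on the question's paths:

```python
# Three non-overlapping notification rules: folderA fans out through SNS,
# folderB and folderC go straight to sqs1. Names/ARNs are hypothetical.
BUCKET = "my-bucket"
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:folderA-fanout"
SQS1_ARN = "arn:aws:sqs:us-east-1:123456789012:sqs1"

def build_notification_config():
    return {
        "TopicConfigurations": [{
            # folderA events go to SNS, which fans out to sqs1 and sqs2
            "TopicArn": SNS_TOPIC_ARN,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [
                {"Name": "prefix", "Value": "folder1/folderA/"}]}},
        }],
        "QueueConfigurations": [
            {
                # folderB/folderC events go directly to sqs1
                "QueueArn": SQS1_ARN,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {"Key": {"FilterRules": [
                    {"Name": "prefix", "Value": prefix}]}},
            }
            for prefix in ("folder1/folderB/", "folder1/folderC/")
        ],
    }

def apply_config(bucket=BUCKET):
    import boto3  # preinstalled in AWS runtimes
    boto3.client("s3").put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=build_notification_config(),
    )
```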
Upvotes: 5
Reputation: 1045
Instead of setting up an S3 object notification of (S3 -> SQS), you can set up a notification of (S3 -> Lambda).
In your Lambda function, you parse the S3 event and then write your own logic to send whatever content about the event to whichever SQS queues you like.
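A minimal sketch of such a router Lambda, assuming the question's `folder1/folderA/` layout and hypothetical queue URLs. The routing decision is a pure function so it can be tested without AWS:

```python
import json
from urllib.parse import unquote_plus

# Hypothetical queue URLs -- replace with your own.
SQS1_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/sqs1"
SQS2_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/sqs2"

def queues_for_key(key):
    """Every object goes to sqs1; objects under folder1/folderA/
    also go to sqs2 (paths assumed from the question)."""
    targets = [SQS1_URL]
    if key.startswith("folder1/folderA/"):
        targets.append(SQS2_URL)
    return targets

def lambda_handler(event, context):
    import boto3  # preinstalled in the Lambda runtime
    sqs = boto3.client("sqs")
    for record in event["Records"]:
        # S3 event keys are URL-encoded; decode before matching prefixes.
        key = unquote_plus(record["s3"]["object"]["key"])
        for url in queues_for_key(key):
            # Forward the original S3 record so downstream consumers see
            # the same payload shape as a direct S3 -> SQS notification.
            sqs.send_message(QueueUrl=url, MessageBody=json.dumps(record))
    return {"forwarded": len(event["Records"])}
```

The trade-off versus the SNS fan-out approach: you gain arbitrary routing logic, but this Lambda becomes a single point of failure in the delivery path, and you must handle its retries and errors yourself.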
Upvotes: 1