J. O'Ryan

Reputation: 155

AWS question - How can I get Cloudwatch event data in a Fargate task with Python

I'm new to Cloudwatch events and to Fargate. I want to trigger a Fargate task (Python) to run whenever a file is uploaded to a specific S3 bucket. I can get the task to run whenever I upload a file, and can see the file name in the event log; however, I can't figure out a simple way to read the event data inside the Fargate task. I've been researching this for the past couple of days and haven't found a solution other than reading the event log, or using a Lambda to invoke the task and put the event data in a message queue.

Is there a simple way to obtain the event data in Fargate with boto3? It's likely that I'm not looking in the right places or asking the right question.

Thanks

Upvotes: 6

Views: 953

Answers (2)

Adiii

Reputation: 59946

One of the easiest options is to configure two targets for the same S3 upload event:

  • Push the same event to SQS
  • Launch the Fargate task at the same time

Read the message from SQS once the Fargate task is up (no Lambda in between). The same task definition will work for the normal use case; just make sure you exit the process after reading the message from SQS.

So in this case, whenever the Fargate task comes up, it will read the message from the SQS queue.
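The approach above can be sketched as a task entrypoint that polls the queue once, processes the event, and exits. This is a minimal sketch, not code from the answer: the queue name `s3-upload-events` is a placeholder, and the exact layout of the event body depends on how the rule delivers S3 events.

```python
import json

def extract_detail(body):
    """Parse a raw SQS message body and return the event's "detail" field.

    EventBridge/CloudWatch Events deliveries nest the useful S3 fields
    under "detail"; adjust for your actual event shape.
    """
    event = json.loads(body)
    return event.get("detail", {})

def main():
    # boto3 import kept local so the parsing helper above has no SDK dependency.
    import boto3

    sqs = boto3.client("sqs")
    # Placeholder queue name -- use the queue the event rule actually targets.
    queue_url = sqs.get_queue_url(QueueName="s3-upload-events")["QueueUrl"]

    # Long-poll once for the event that started this task.
    resp = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=1,
        WaitTimeSeconds=20,
    )
    for msg in resp.get("Messages", []):
        detail = extract_detail(msg["Body"])
        print(detail)
        # Delete the message so a later task run does not see it again.
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
    # Returning lets the container exit, which stops the Fargate task,
    # as the answer advises.
```

One caveat with this design: nothing guarantees the message a given task reads is the one for the upload that launched it, which is usually fine here since any task can process any pending upload.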

Upvotes: 8

Chris Williams

Reputation: 35188

To do this you would need to use an input transformer.

Each time an event rule is triggered, a JSON object is made accessible for use in the transformation.

As the event itself is not accessible within the container (unlike with Lambda functions), the idea is that you forward the key information as environment variables and work with those in your container.

At this time it does not look like every service supports configuring this in the console, so you may need to set it up through the API instead.

You can view a tutorial for this exact scenario from this link.

Upvotes: 2
