Reputation: 197
Currently, I am attempting to create a series of Lambdas that execute given specific payloads from the Step Functions input. I have everything working; however, it's not behaving as I intended.
I have finally grasped the differences between InputPath, ResultPath, and OutputPath. The only problem I have now is getting ResultPath to 'append' the returned JSON to the payload rather than nesting it inside it.
Here is the state machine:
{
  "StartAt": "GetDailyEmails",
  "States": {
    "GetDailyEmails": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:account:function:DailyEmailExtractor",
      "InputPath": "$.GetDailyEmailsInputs",
      "ResultPath": "$.TransformEmailsToCSVInputs.GetDailyEmailsResults",
      "Next": "TransformEmailsToCSV"
    },
    "TransformEmailsToCSV": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:account:function:EmailTransform",
      "InputPath": "$.TransformEmailsToCSVInputs",
      "End": true
    }
  }
}
Here is the input I'm providing:
{
  "GetDailyEmailsInputs": {
    "secret_name": "email_password",
    "subject_contains": "stuff",
    "json_output_file_name": "test_emails",
    "bucket_name": "emails"
  },
  "TransformEmailsToCSVInputs": {
    "csv_output_file_name": "email_errors"
  }
}
Here was the output I received:
{
  "GetDailyEmailsInputs": {
    "secret_name": "email_password",
    "subject_contains": "stuff",
    "json_output_file_name": "test_emails",
    "bucket_name": "emails"
  },
  "TransformEmailsToCSVInputs": {
    "csv_output_file_name": "apex_errors",
    "GetDailyEmailsResults": {
      "object_key": "raw_emails/test_emails.json",
      "bucket_name": "emails"
    }
  }
}
While this does work, I have to manually extract GetDailyEmailsResults inside the TransformEmailsToCSV Lambda. I want the Lambdas to be fully agnostic to whether they're being invoked from a test payload or from Step Functions.
Here is the output I'm trying to receive:
{
  "GetDailyEmailsInputs": {
    "secret_name": "email_password",
    "subject_contains": "stuff",
    "json_output_file_name": "test_emails",
    "bucket_name": "emails"
  },
  "TransformEmailsToCSVInputs": {
    "csv_output_file_name": "apex_errors",
    "object_key": "raw_emails/test_emails.json",
    "bucket_name": "emails"
  }
}
This way, there is no GetDailyEmailsResults nested dictionary that I need to account for. I could write this in a generalized way where each Lambda just receives a data payload; however, I'm trying to keep everything in a single payload without nesting, if that's possible.
Upvotes: 7
Views: 6573
Reputation: 111
Have a look at the intrinsic functions of the Amazon States Language. You can use the States.JsonMerge intrinsic to achieve what you want.
Beware that it does not work in the data flow simulator. You can, however, use a dummy machine consisting of Pass states to simulate and play around.
Unfortunately, intrinsic functions are not allowed everywhere; at the bottom of the linked docs there is a table telling you where they can be used.
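For example, here is a minimal sketch of how the merge could be done with a Pass state before the transform task. It assumes GetDailyEmails writes its result to $.GetDailyEmailsResults instead of nesting it inside TransformEmailsToCSVInputs; the MergeInputs state name and the merged key are placeholders, not anything from your machine:
"GetDailyEmails": {
  "Type": "Task",
  "Resource": "arn:aws:lambda:region:account:function:DailyEmailExtractor",
  "InputPath": "$.GetDailyEmailsInputs",
  "ResultPath": "$.GetDailyEmailsResults",
  "Next": "MergeInputs"
},
"MergeInputs": {
  "Type": "Pass",
  "Parameters": {
    "merged.$": "States.JsonMerge($.TransformEmailsToCSVInputs, $.GetDailyEmailsResults, false)"
  },
  "OutputPath": "$.merged",
  "Next": "TransformEmailsToCSV"
},
"TransformEmailsToCSV": {
  "Type": "Task",
  "Resource": "arn:aws:lambda:region:account:function:EmailTransform",
  "End": true
}
Only shallow merging is currently supported, so the last argument to States.JsonMerge has to be false; if the same key appears in both objects, the value from the second one wins. With this in place, EmailTransform should receive a single flat payload containing csv_output_file_name, object_key and bucket_name.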
Upvotes: 2
Reputation: 1076
I can see that you are trying to achieve a good design. Although there is no out-of-the-box solution for exactly what you need, I have a different proposal that you might want to consider: handle GetDailyEmailsResults differently, rather than nesting it inside the input that you provided. Here is the state machine I am proposing (it may not be completely correct syntactically, but you will get the idea):
{
  "StartAt": "GetDailyEmails",
  "States": {
    "GetDailyEmails": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:account:function:DailyEmailExtractor",
      "InputPath": "$.GetDailyEmailsInputs",
      "ResultPath": "$.GetDailyEmailsResults",
      "Next": "TransformEmailsToCSV"
    },
    "TransformEmailsToCSV": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": {
        "FunctionName": "EmailTransform",
        "Payload": {
          "csv_output_file_name": "apex_errors",
          "object_key.$": "$.GetDailyEmailsResults.object_key",
          "bucket_name.$": "$.GetDailyEmailsResults.bucket_name"
        }
      },
      "End": true
    }
  }
}
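One thing to note about the arn:aws:states:::lambda:invoke integration used above (this is general Step Functions behaviour, not something specific to this proposal): the state's result wraps the function's return value in a Payload field alongside invocation metadata, so if a later state needed only the return value, a ResultSelector could unwrap it, along the lines of:
"ResultSelector": {
  "output.$": "$.Payload"
},
"ResultPath": "$.TransformEmailsToCSVResults"
Here output and TransformEmailsToCSVResults are just placeholder names; since the state above ends the execution, this only affects the execution's final output.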
I hope this helps.
Cheers :)
Upvotes: 3