Reputation: 98
I am designing a system using AWS serverless functions and S3 buckets. I have some Word files in an S3 bucket and want to run a transformation on them to generate new files (I will write this code myself and host it with serverless functions; let's call this service FileTransformation).
Suppose the user sends a request to transform the files, and the request includes the name of the files.
Now I have the following questions:
1. user1 has given 10 file names in a request, and user2 has given another request for 15 files. I don't want something like "once user1's request is finished, then user2's request starts". How can I handle this?
2. If user1 and user2 ask for the status of their requests, how can I report this? Does it require something extra in my FileTransformation service?
3. Is there any AWS service I should consider while developing the above service?
I think we can use a message broker to send requests to the service, and CloudWatch for error reporting or something similar.
Upvotes: 0
Views: 99
Reputation: 3097
Your design could look like this:
I don't want the services to do sequential stuff
When you use AWS Lambda, which is serverless, Lambda runs multiple concurrent instances of your function (which you can cap and/or reserve using the reserved concurrent executions setting).
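For example, assuming your function is named FileTransformation, you could reserve concurrency for it with the AWS CLI (the function name and the limit of 50 are placeholders):

```shell
# Reserve up to 50 concurrent executions for this function.
# Requests beyond the limit are throttled (and retried by the event
# source), not forced to run one after another.
aws lambda put-function-concurrency \
  --function-name FileTransformation \
  --reserved-concurrent-executions 50
```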
Suppose user1 and user2 asked for the status of their requests
To support this, you might want to use a DynamoDB table that holds the status of each request. When a Lambda is triggered, it writes a status of "in progress" to DynamoDB, then updates it to "completed" or "failed" accordingly. A separate API then reads that table whenever a user asks for an update.
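A minimal sketch of that status table, assuming a hypothetical `TransformationRequests` table with `request_id` as the partition key (the table name, attribute names, and status values are all illustrative):

```python
TABLE_NAME = "TransformationRequests"  # hypothetical table, partition key: request_id

def status_item(request_id, file_names):
    """DynamoDB item recording a freshly accepted request."""
    return {
        "request_id": {"S": request_id},
        "files": {"SS": list(file_names)},
        "status": {"S": "IN_PROGRESS"},
    }

def mark_in_progress(request_id, file_names):
    import boto3  # AWS SDK, preinstalled in the Lambda runtime
    boto3.client("dynamodb").put_item(
        TableName=TABLE_NAME, Item=status_item(request_id, file_names)
    )

def mark_finished(request_id, succeeded):
    import boto3
    boto3.client("dynamodb").update_item(
        TableName=TABLE_NAME,
        Key={"request_id": {"S": request_id}},
        # "status" is a DynamoDB reserved word, so alias it.
        UpdateExpression="SET #s = :s",
        ExpressionAttributeNames={"#s": "status"},
        ExpressionAttributeValues={
            ":s": {"S": "COMPLETED" if succeeded else "FAILED"}
        },
    )

def get_status(request_id):
    """What the separate status API would read back for a user."""
    import boto3
    resp = boto3.client("dynamodb").get_item(
        TableName=TABLE_NAME, Key={"request_id": {"S": request_id}}
    )
    return resp.get("Item", {}).get("status", {}).get("S", "UNKNOWN")
```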
How can I notify the user if a request fails in between?
You can upload the file with a key prefix that encodes the user's email (or any other notification medium), for example: files/userA/[email protected]/file.doc
. This way, when the Lambda is triggered and processing fails, it knows which address to send the result to.
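A sketch of pulling the address back out of such a key, assuming a hypothetical `files/<user>/<email>/<file>` layout (the example address below is made up):

```python
def notify_address(s3_key):
    """Extract the notification address embedded in the object key.
    Assumes the hypothetical layout files/<user>/<email>/<file>."""
    parts = s3_key.split("/")
    if len(parts) < 4:
        raise ValueError("key does not follow files/<user>/<email>/<file>")
    return parts[-2]

# notify_address("files/userA/a@example.com/report.doc") -> "a@example.com"
```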
Edit: Based on the comment, if you want an approach where the file is already present in S3 and a user requests processing, then simply send the user's message to an SQS queue and add a Lambda trigger for it (the queue triggers concurrent Lambda executions, not sequential ones). Alternatively, the API could invoke the Lambda directly and have the user wait (if the processing is relatively quick, this is fine as well).
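A sketch of the queue-based approach: the API enqueues one message per request, and an SQS-triggered Lambda processes batches concurrently, so user1's and user2's requests are not serialized behind each other. The queue URL, message shape, and `transform` stub are all assumptions:

```python
import json

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/file-transform"  # placeholder

def request_message(request_id, file_names):
    """JSON body the API sends to SQS for one user's request."""
    return json.dumps({"request_id": request_id, "files": list(file_names)})

def enqueue(request_id, file_names):
    import boto3  # AWS SDK, preinstalled in the Lambda runtime
    boto3.client("sqs").send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=request_message(request_id, file_names),
    )

def transform(file_name):
    """Placeholder for the FileTransformation logic:
    download the Word file from S3, transform it, upload the result."""
    pass

def handler(event, context):
    """SQS-triggered Lambda entry point. Each invocation receives a batch
    of records; separate batches run on concurrent Lambda instances."""
    processed = 0
    for record in event["Records"]:
        job = json.loads(record["body"])
        for name in job["files"]:
            transform(name)
            processed += 1
    return {"processed": processed}
```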
Upvotes: 2