Reputation: 3036
I have a bunch of simple and short Python scripts to run on a daily basis. Right now I am using Amazon Lightsail (of course I could just as well use another cloud service) and run my scripts with crontab on a Linux server.
Is there a more suitable solution for running scheduled jobs in AWS? I have heard or read about AWS Batch, Lambda, Elastic Beanstalk, etc. Is there a best practice?
Ideally I would only need to write the code, deploy it, set a timer, and be able to see the logs.
Upvotes: 0
Views: 314
Reputation: 9402
Lambda will be a great option for this. Your solution could be built with these components:
- an AWS Lambda function (Python runtime) that contains the script logic,
- a CloudWatch Events (EventBridge) scheduled rule that triggers the function on your timer, and
- an IAM execution role that grants the function whatever access it needs.
Log files are automatically available for Lambda functions via CloudWatch Logs, so you can see the output of each invocation. Lambda is also fully managed, so you don't have the overhead of maintaining the infrastructure it runs on.
Other components may be needed depending on what the scripts are doing. Each Lambda function runs under an execution role, so make sure that role allows the function to access any other resources it needs (such as databases, other servers, or even resources in other AWS accounts).
Lambda also makes it easy to expose the functions as a service (via API Gateway) or to invoke them manually by running a test from the Lambda console.
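As a rough sketch (names and the helper function are placeholders, not from your scripts), one of your daily scripts converted into a Lambda handler could look like this; anything you log or print ends up in the function's CloudWatch log group:

    # lambda_function.py -- hypothetical example of a daily script as a Lambda handler
    import json
    import logging

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    def lambda_handler(event, context):
        # 'event' is the payload from the scheduled CloudWatch Events / EventBridge rule;
        # for a plain timer you usually don't need anything from it.
        logger.info("Triggered with event: %s", json.dumps(event))

        # ... your existing script logic goes here ...
        result = do_daily_work()

        # Anything logged or printed here shows up in CloudWatch Logs.
        logger.info("Finished, result: %s", result)
        return {"status": "ok", "result": result}

    def do_daily_work():
        # Placeholder for the body of your current cron script.
        return "done"

The handler setting in the Lambda configuration would then be `lambda_function.lambda_handler`.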
You can deploy the code either inline in the Lambda console editor or by uploading a .zip that contains your code files, including any 3rd-party dependencies you have installed.
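If you would rather script the deployment than click through the console, something along these lines with boto3 could upload the .zip and wire it to a daily schedule. The function name, role ARN, rule name, and zip file name below are placeholders you would replace with your own:

    # deploy_sketch.py -- hypothetical boto3 sketch: create the function from a zip
    # and trigger it once a day via a CloudWatch Events / EventBridge rule.
    import boto3

    FUNCTION_NAME = "daily-script"                                   # placeholder
    ROLE_ARN = "arn:aws:iam::123456789012:role/daily-script-role"    # placeholder
    RULE_NAME = "daily-script-schedule"                              # placeholder

    lambda_client = boto3.client("lambda")
    events_client = boto3.client("events")

    # Create the function from a zip containing lambda_function.py and dependencies.
    with open("function.zip", "rb") as f:
        zip_bytes = f.read()

    lambda_client.create_function(
        FunctionName=FUNCTION_NAME,
        Runtime="python3.12",
        Role=ROLE_ARN,
        Handler="lambda_function.lambda_handler",
        Code={"ZipFile": zip_bytes},
        Timeout=60,
    )

    # Daily schedule rule.
    rule = events_client.put_rule(Name=RULE_NAME, ScheduleExpression="rate(1 day)")

    # Allow the rule to invoke the function, then attach the function as its target.
    lambda_client.add_permission(
        FunctionName=FUNCTION_NAME,
        StatementId="allow-eventbridge-invoke",
        Action="lambda:InvokeFunction",
        Principal="events.amazonaws.com",
        SourceArn=rule["RuleArn"],
    )
    func_arn = lambda_client.get_function(FunctionName=FUNCTION_NAME)["Configuration"]["FunctionArn"]
    events_client.put_targets(
        Rule=RULE_NAME,
        Targets=[{"Id": "daily-script-target", "Arn": func_arn}],
    )

The same wiring can of course be done entirely in the console (Lambda "Add trigger" with a schedule expression) if you prefer not to script it.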
Upvotes: 1