Reputation: 343
Maybe the simplest solution is to invoke this public API from my API. The catch is that this public API has a limit of 1000 invocations daily, but we expect our customers to invoke my API far more often than that.
I can run a cronjob to get the latest information at 10 AM every day, but I don't know how to transfer this information to my API in the AWS environment.
A database is clearly overkill, since it would only need to store a single entry of daily info.
Can anybody suggest a better solution for this use case?
Upvotes: 0
Views: 46
Reputation: 973
Storing to S3 should be easy.
// AWS SDK v2 for Node.js
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

let xr = 5.2838498;

// Write the daily value as a small text object.
await s3
  .putObject({
    Bucket: 'mybucket',
    Key: 'mydataobject',
    Body: xr.toString(),
    ContentType: 'text/plain;charset=utf-8',
  })
  .promise();

// Read it back and parse it as a number.
xr = Number(
  (await s3.getObject({
    Bucket: 'mybucket',
    Key: 'mydataobject',
  }).promise()).Body?.toString('utf-8')
);
Upvotes: 1
Reputation: 3124
There are tons of ways to implement this. Fetch the data via the API call, then use any of the following to store it:
- Store the data in S3 in any format (txt, csv, json, yml, etc.) and have your API read it from that bucket.
- If you're planning to use API Gateway, you can cache the API call. Serve the data from this cache and you don't have to persist it anywhere else. You almost certainly won't hit the 1k limit with caching enabled. https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-caching.html
- DynamoDB is also a good place to store such data. It is cheap if the data is not huge, and super performant.
- ElastiCache (Redis) is another place to keep the data for a day.
- CloudFront in front of S3 is also a great fit for data that isn't very dynamic. Cache the data for a day and just read it from CloudFront.
- SSM Parameter Store is also an option, though SSM is not meant to be a persistent database.
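For the DynamoDB option above, a minimal sketch (AWS SDK v2 for Node.js; the table name `daily-info`, the key `latest`, and both function names are illustrative assumptions, and the table is assumed to already exist with partition key `id`). A TTL attribute lets DynamoDB expire yesterday's entry automatically:

```javascript
// Client construction shown as a comment so the sketch stays self-contained:
//   const AWS = require('aws-sdk');
//   const dynamo = new AWS.DynamoDB.DocumentClient();

// Epoch seconds 24 hours after `nowMs`; with TTL enabled on the
// `expiresAt` attribute, DynamoDB deletes the item once this passes.
function ttlOneDayFrom(nowMs) {
  return Math.floor(nowMs / 1000) + 24 * 60 * 60;
}

// Writer (the daily cron): overwrite the single well-known entry.
async function saveDailyInfo(dynamo, payload) {
  await dynamo.put({
    TableName: 'daily-info',          // assumed table name
    Item: {
      id: 'latest',                   // single well-known key
      payload,
      expiresAt: ttlOneDayFrom(Date.now()),
    },
  }).promise();
}

// Reader (your API, on every customer request).
async function loadDailyInfo(dynamo) {
  const res = await dynamo.get({
    TableName: 'daily-info',
    Key: { id: 'latest' },
  }).promise();
  return res.Item && res.Item.payload;
}
```

Since there is only ever one item and it is tiny, this stays comfortably inside the DynamoDB free tier.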
Upvotes: 1