Reputation: 71
I want to make a script (maybe a Lambda function?) so that every new JSON file uploaded to this S3 bucket is also loaded directly into a table in a PostgreSQL RDS instance.
The JSON is nested and contains lists of JSON objects inside, so it is not that simple to just parse it in Postgres. In addition, it has a changing number of columns, so a new file may add a new column to the table. (If a file has a column that hasn't appeared yet, I want to add it and put NULL in that column for the existing rows.)
How can I do it efficiently?
Upvotes: 1
Views: 1929
Reputation: 10740
As suggested, you can write a Lambda function that listens for S3 events and is triggered whenever a new file is uploaded.
https://n2ws.com/blog/aws-automation/lambda-function-s3-event-triggers
Once the event is triggered, you need to read and parse the file.
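A minimal sketch of that step, assuming the Lambda is subscribed to `s3:ObjectCreated:*` events. Since the files are nested, one common approach is to flatten each object into a single-level dict with dotted keys (the `flatten` helper below is hypothetical, not part of any AWS API); lists are kept as JSON strings so each file still maps to one row:

```python
import json

def flatten(obj, prefix=""):
    """Flatten a nested dict into a single-level dict with dotted keys.
    Lists are serialized back to JSON strings so one file maps to one row."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        elif isinstance(value, list):
            flat[name] = json.dumps(value)
        else:
            flat[name] = value
    return flat

def handler(event, context):
    import boto3  # available by default in the AWS Lambda runtime
    s3 = boto3.client("s3")
    rows = []
    for record in event["Records"]:
        # The S3 event payload carries the bucket and key of the new object
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows.append(flatten(json.loads(body)))
    return rows  # hand these rows on to the database step
```

Whether to stringify nested lists or explode them into child tables depends on how you want to query them later; for a single wide table, strings are the simplest option.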
Now connect to the database and run SQL statements generated from the parsed object.
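For the changing-columns requirement, one way to generate those statements is to compare the flattened row's keys against `information_schema.columns` and issue `ALTER TABLE ... ADD COLUMN` for anything new; Postgres fills the new column with NULL for existing rows automatically, which is exactly the behavior asked for. A sketch, assuming a target table named `events` and a psycopg2-style connection (both assumptions, not from the question):

```python
def missing_columns(row, existing):
    """Keys present in this file but not yet in the table."""
    return [k for k in row if k not in existing]

def alter_statements(table, new_cols):
    # Existing rows get NULL in each newly added column automatically.
    return [f'ALTER TABLE {table} ADD COLUMN "{c}" TEXT' for c in new_cols]

def insert_statement(table, row):
    cols = ", ".join(f'"{c}"' for c in row)
    placeholders = ", ".join(["%s"] * len(row))
    return f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"

def load_row(conn, table, row):
    """Add any new columns, then insert one flattened row."""
    with conn.cursor() as cur:
        cur.execute(
            "SELECT column_name FROM information_schema.columns "
            "WHERE table_name = %s", (table,))
        existing = {r[0] for r in cur.fetchall()}
        for stmt in alter_statements(table, missing_columns(row, existing)):
            cur.execute(stmt)
        cur.execute(insert_statement(table, row), list(row.values()))
    conn.commit()
```

Quoting the column names lets dotted keys like `"user.address.city"` work as plain columns; values are passed as parameters rather than interpolated into the SQL string.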
Upvotes: 2