Reputation: 1
I have a SQLite db file that resides in an S3 bucket. I am trying to read the information (mostly read) from its tables and transform it to JSON. I saw the Node.js sqlite3 package, but somehow I can't seem to make it work on Lambda. Here is what I tried for the download step:
const fs = require('fs');
const { pipeline } = require('stream/promises');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event, context) => {
  const params = {
    Bucket: 'xyzBucket',
    Key: 'test.db'
  };
  // Stream the object from S3 into /tmp, the only writable path in Lambda
  const readStream = s3.getObject(params).createReadStream();
  const writeStream = fs.createWriteStream('/tmp/test.db');
  await pipeline(readStream, writeStream);
};
Upvotes: 0
Views: 867
Reputation: 46
I got something similar working. I downloaded the SQLite file from my S3 bucket like this:
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3'); // AWS SDK v3
const { writeFile } = require('fs/promises');

const s3Conn = new S3Client({});
const command = new GetObjectCommand({
  Bucket: "bucketname",
  Key: "filename.db",
});
const response = await s3Conn.send(command);
// response.Body is a readable stream; writeFile accepts it directly
await writeFile('/tmp/filename.db', response.Body);
And then I was able to work with the file in the lambda.
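Once the file is in /tmp, the table-to-JSON part the question asks about can be sketched like this (a minimal sketch, assuming the sqlite3 npm package from the layer; the table name `'users'` in the usage line is hypothetical):

```javascript
// Pure helper: serialize an array of row objects to pretty-printed JSON.
function rowsToJson(rows) {
  return JSON.stringify(rows, null, 2);
}

// Read every row of a table from a SQLite file and resolve with JSON.
// The table name is interpolated (identifiers cannot be bound parameters),
// so it must come from trusted code, never from user input.
function dumpTable(dbPath, table) {
  // Lazy require so the helper above stays usable without the native module;
  // in Lambda this resolves to the sqlite3 build shipped in the layer.
  const sqlite3 = require('sqlite3');
  return new Promise((resolve, reject) => {
    const db = new sqlite3.Database(dbPath, sqlite3.OPEN_READONLY, (err) => {
      if (err) return reject(err);
      db.all(`SELECT * FROM ${table}`, (err, rows) => {
        db.close();
        if (err) return reject(err);
        resolve(rowsToJson(rows));
      });
    });
  });
}
```

Inside the handler you would then call something like `const json = await dumpTable('/tmp/filename.db', 'users');`.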
To get sqlite3 working in Lambda, though, I had to build the package on a Linux system, which I didn't have, so I used this very helpful tool: https://nodelayer.xyz/. From there I created a Lambda layer for my sqlite3 package.
Upvotes: 0