Reputation: 359
I am new to AWS and I want to zip a directory that is in an S3 bucket and get the S3 URL of the zip in a Lambda function. Currently I am using the s3-zip module for that, but it downloads the zip directly. I actually don't want to download it; I want it to be zipped in the bucket itself. Is there a way to do this? I have shared my code below:
const fs = require('fs')
const { join } = require('path')
const AWS = require('aws-sdk')
const XmlStream = require('xml-stream')
const s3Zip = require('s3-zip')

const s3 = new AWS.S3()

// S3BucketConfig, folder and fileName are defined elsewhere in my code.
const params = {
  Bucket: S3BucketConfig.bucket,
  Prefix: folder
}

const filesArray = []

// Stream the ListObjects XML response and collect the object keys.
const files = s3.listObjects(params).createReadStream()
const xml = new XmlStream(files)
xml.collect('Key')
xml.on('endElement: Key', function (item) {
  filesArray.push(item['$text'].substr(folder.length))
})
xml.on('end', function () {
  zip(filesArray)
})

function zip(files) {
  // This writes the zip to the local filesystem, which is what I want to avoid.
  const output = fs.createWriteStream(join(__dirname, fileName))
  s3Zip
    .archive({ region: "eu-west-1", bucket: S3BucketConfig.bucket, preserveFolderStructure: true }, folder, files)
    .pipe(output)
}
Can anyone help me with this? Thanks in advance.
Upvotes: 0
Views: 1799
Reputation: 238189
I want it to be zipped in the bucket itself. Is there a way for this?
Sadly, there is not. S3 is not a filesystem; it is object storage. This means that you can't perform modifications on the objects "inside" a bucket.
Your zip operation will need to download all of the objects that you want to archive, zip them locally (e.g. on an EC2 instance, in a Lambda function, or on a local workstation), and then upload the zip back to the bucket.
The s3-zip module downloads the objects and packs them into a zip. There is no way for this, or any other module, to perform the zipping operation "in" a bucket without reading the objects first. The objects don't have to be physically stored on local disk before zipping, as they can be streamed from S3, but the work still doesn't happen "in" the bucket.
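If your goal is just to get the resulting zip back into the bucket without touching the local filesystem, you can pipe the archive stream straight into s3.upload, which accepts a readable stream as Body. A minimal sketch of that approach, assuming the aws-sdk v2 and s3-zip modules; the bucket, prefix and key names below are placeholders, and the listing is assumed to fit in a single page:

const AWS = require('aws-sdk')
const s3Zip = require('s3-zip')

const s3 = new AWS.S3({ region: 'eu-west-1' })

// List the keys under the prefix (assumed to fit in one page here;
// otherwise you would paginate with ContinuationToken).
s3.listObjectsV2({ Bucket: 'my-bucket', Prefix: 'my-folder/' }, function (err, data) {
  if (err) throw err
  const keys = data.Contents.map(function (obj) { return obj.Key })

  // s3-zip streams each object from S3 and emits the zip as a stream;
  // nothing is written to the local filesystem.
  const zipStream = s3Zip.archive({ region: 'eu-west-1', bucket: 'my-bucket' }, '', keys)

  // s3.upload accepts a readable stream as Body, so the zip goes
  // straight back into the bucket.
  s3.upload({ Bucket: 'my-bucket', Key: 'my-folder.zip', Body: zipStream }, function (err, result) {
    if (err) throw err
    console.log('Zip uploaded to', result.Location)
  })
})

Note that the objects still flow through wherever this code runs (and count against Lambda's memory and time limits); the zip data just never lands on disk.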
Upvotes: 5