Bigbohne

Reputation: 1356

AWS S3 leaking connections

I'm trying to create an Express.js application that streams objects from S3 via HTTP.

My problem is that under high load, around a minute after the load starts, requests begin to time out. When checking the open connections of the Node.js process using lsof -p <pid>, I can see that it has 50 open connections to AWS. (The documentation states that this is the default maximum number of connections in the AWS SDK.)
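
For reference, that 50-connection cap is the SDK's HTTP agent pool size. A minimal sketch of tuning it, assuming the aws-sdk v2 httpOptions.agent setting (the keepAlive and maxSockets values below are illustrative, not a recommendation):

var https = require('https');
var AWS = require('aws-sdk');

// Replace the default agent (maxSockets: 50) with one that keeps
// connections alive and allows a larger pool.
AWS.config.update({
    httpOptions: {
        agent: new https.Agent({ keepAlive: true, maxSockets: 100 })
    }
});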

Question: How can I close or reuse the open connections to AWS so that I can continue serving objects via HTTP?

(Hopefully) minimal working example

var express = require('express');
var AWS = require('aws-sdk');

var app = express();
var s3 = new AWS.S3();

app.get('/', function (req, res) {
    // [ some code to create the bucket key ]

    const params = {
        'Bucket' : BUCKET,
        'Key' : '<objectkey>'
    };

    s3.getObject(params).on('httpHeaders', function (statusCode, headers) {
        // Mirror the object's metadata onto the HTTP response.
        res.set('Content-Length', headers['content-length']);
        res.set('Content-Type', headers['content-type']);
        // Pipe the object body through without buffering it in memory.
        this.response.httpResponse.createUnbufferedStream()
            .pipe(res);
    })
    .send();
});

app.listen(3000, function () {
    console.log('Example app listening on port 3000!');
});

Upvotes: 0

Views: 565

Answers (1)

Bigbohne

Reputation: 1356

I don't know what I might be doing wrong with pipe(), but the following code does not get stuck even after tens of thousands of requests:

var stream = s3.getObject(params).createReadStream();

stream.on('readable', () => {
    // read() returns null once the internal buffer is drained;
    // wait for the next 'readable' event in that case.
    var data = stream.read();
    if (data == null) {
        return;
    }

    res.write(data);
});

stream.on('end', () => {
    res.end();
});
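
A possible refinement (an assumption on my part, not something I verified against the original problem): if clients disconnect mid-transfer, the S3 stream should be torn down explicitly so its socket returns to the pool. A minimal sketch, placed inside the same route handler:

req.on('close', () => {
    // The client went away; destroy the S3 stream so its
    // underlying socket is released back to the agent.
    stream.destroy();
});

stream.on('error', (err) => {
    // If the S3 read fails mid-stream, just end the response.
    res.end();
});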

Upvotes: 0
