Reputation: 72868
I'm very new to streams in NodeJS - basically clueless about them - and I'm trying to get the KnoxJS client for Amazon S3 to stream a file from an HTTP GET.
The sample code on the Knox github page shows this:
http.get('http://google.com/doodle.png', function(res){
  var headers = {
      'Content-Length': res.headers['content-length']
    , 'Content-Type': res.headers['content-type']
  };
  client.putStream(res, '/doodle.png', headers, function(err, res){
    // check `err`, then do `res.pipe(..)` or `res.resume()` or whatever.
  });
});
But this is very clearly incomplete... it really doesn't do much of anything other than open the http.get and call putStream for S3.
So where do I go from here? Can someone help me complete this code so that I can stream a file from an HTTP GET to my bucket on S3?
Upvotes: 2
Views: 1515
Reputation: 146014
Once you're inside this callback:
//check `err`, then do `res.pipe(..)` or `res.resume()` or whatever
The response from google has been streamed to S3 already (that's what knox's putStream does for you), and err, res are S3's response, so you don't have to do anything else here other than check for an error and do something like console.log("Upload done") if this is a command-line snippet. Their docs are saying that if this entire snippet were within the context of an HTTP req/res interaction between, say, a browser and a node.js web app, and you decided you wanted to pipe the image back to the browser as well, you could do that (at least in theory). I think the docs are a bit confusing about that, but that's my interpretation.
Upvotes: 2