Reputation: 11790
Currently I am running this in an HTTP handler:
err := mongoCollection.Find(bson.M{"name": vars["name"]}).One(&result)
data, err := json.Marshal(result)
w.Write(data)
How can I begin serving the result before the full BSON data is in?
Edit: the answer needs to go beyond the mgo extension and into bson. As far as I can see, mgo will only deliver full documents, if I am not mistaken. I have one, possibly large, document, as my code example clearly shows.
Upvotes: 2
Views: 1238
Reputation: 943
Take a look at chanson; with it you can easily construct and stream json. There's an example that reads data from channels to add elements to a list. You could probably do something similar.
Upvotes: 0
Reputation: 1803
In order for this to be possible, you would need these things:

1. a Reader for the incoming bson stream
2. a streaming json encoder that consumes document parts
3. a streaming bson parser that consumes a Reader and produces document parts
4. a streaming json Writer for the outgoing response

mgo does not provide number 1. encoding/json does not provide number 2 or 4. mgo/bson does not provide number 3. A bit of googling doesn't turn up any help for any of those points in Go, though there are streaming json parsers in other languages (see the answers to Is there a streaming API for JSON?).

Your desire to do this is reasonable, but the support just doesn't exist yet. Fortunately, json and bson are simple enough, and all the components you're using are open source, so in theory you could write the tools you need.
Upvotes: 3
Reputation: 8526
I don't think there's anything you can do to avoid unmarshalling the whole BSON (and therefore not serving the result until the BSON has been fully delivered by mgo), short of hacking on mgo. Its API only deals in whole, unmarshalled documents, with no access to any BSON-encoded []byte or Reader that you could potentially bson-decode element by element and json-encode as the data comes in.
Upvotes: 1
Reputation: 17443
Take a look at json.Encoder. It writes JSON values to an output stream, whereas json.Marshal produces a []byte in one shot and does not provide a stream.

On the MongoDB side, take a look at mgo.Iter. If your result contains a large number of documents, you can serialize them in batches and make your application more memory-efficient.

A sample of using json.Encoder:
data := map[string]int{"apple": 5, "lettuce": 7}
enc := json.NewEncoder(w)
if err := enc.Encode(data); err != nil {
    // handle the encode/write error, e.g. log it
}
Upvotes: -1