I have an array holding a huge amount of data that I need to insert into MongoDB.
I am able to achieve this using the code below, but it takes 1.5 minutes. I need the push to finish within a few seconds. Is there a faster way to push a huge array into MongoDB?
HeadDet is an array holding 3 million records.
session, err := mgo.Dial("localhost")
if err != nil {
    panic(err)
}
defer session.Close()

// Optional. Switch the session to a monotonic behavior.
session.SetMode(mgo.Monotonic, true)

c := session.DB("Test").C("Indicators")
for i := 0; i < len(HeadDet); i++ {
    err = c.Insert(HeadDet[i])
}
if err != nil {
    log.Fatal(err)
}
I have referred to this link.
First, drop labix.org/mgo (a.k.a. gopkg.in/mgo.v2): it's obsolete and unmaintained. Instead, use the community-supported fork: github.com/globalsign/mgo.
Next, to perform inserts or updates en masse, use the Bulk API introduced in MongoDB 2.6. The mgo driver supports bulk operations via the mgo.Bulk type.
You want to insert "30 lakhs records". For those who don't know, "lakh" is a unit in the Indian numbering system equal to one hundred thousand (100,000). So 30 lakhs is equal to 3 million.
Using the Bulk API, this is how you can insert all those efficiently:
c := session.DB("Test").C("Indicators")

// BULK, ORDERED
bulk := c.Bulk()
for i := 0; i < len(HeadDet); i++ {
    bulk.Insert(HeadDet[i])
}
res, err := bulk.Run()
Note that if you don't care about the order of inserts, you may put the bulk operation in unordered mode which may speed things up:
// BULK, UNORDERED
bulk := c.Bulk()
bulk.Unordered()
for i := 0; i < len(HeadDet); i++ {
    bulk.Insert(HeadDet[i])
}
res, err := bulk.Run()
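Note that MongoDB servers and drivers may limit how many operations a single bulk run can contain (older servers capped write batches at 1,000 operations). If you run into such a limit, you can split the slice yourself and run one Bulk per batch. Below is a minimal sketch of just the batching logic; the chunk helper and the batch size of 1000 are illustrative choices, not part of the mgo API. Each resulting batch would get its own c.Bulk() / bulk.Run() pair:

```go
package main

import "fmt"

// chunk splits docs into consecutive batches of at most size elements.
// The final batch holds whatever remains (possibly fewer than size).
func chunk(docs []interface{}, size int) [][]interface{} {
	var batches [][]interface{}
	for size < len(docs) {
		batches = append(batches, docs[:size])
		docs = docs[size:]
	}
	return append(batches, docs)
}

func main() {
	// 2500 placeholder documents split into batches of 1000.
	docs := make([]interface{}, 2500)
	for _, batch := range chunk(docs, 1000) {
		// In real code: bulk := c.Bulk(); bulk.Insert(batch...); bulk.Run()
		fmt.Println("batch of", len(batch))
	}
}
```

This keeps each bulk run comfortably under any per-batch cap while preserving the overall insert order across batches.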
For comparison, on my computer (client and server on the same machine, so no network latency) a loop of 3 million individual inserts takes 5 minutes and 43 seconds.
The ordered Bulk operation to insert 3 million documents takes 18.6 seconds!
The unordered Bulk operation to insert 3 million documents takes 18.22 seconds!