Reputation: 2673
I have a JSON file with an array of ~120K documents. I'm importing the JSON file into a MongoDB collection:
mongoimport --db my-db --collection my_collection -j 4 file.json --jsonArray --batchSize 5 -v
The import stopped at a random point and never finished. I tried running with -v but couldn't see anything useful in the log.
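For reference, one way to capture more detail (a sketch assuming a mongoimport version that, like the other MongoDB tools, increases verbosity when -v is repeated; the log file name is just an example) is to raise the verbosity and redirect stderr to a file:

# repeat -v for maximum verbosity and keep the output for inspection
mongoimport --db my-db --collection my_collection -j 4 file.json --jsonArray --batchSize 5 -vvvvv 2> import.log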
Upvotes: 2
Views: 740
Reputation: 11
I was also facing this issue with my file, which was around 1.5 GB. I was able to solve it using both batchSize and numInsertionWorkers:
mongoimport --db cam --collection cost --type json \
    --file /data/db/cost1/cost500000.json --jsonArray \
    --numInsertionWorkers 500 --batchSize 1
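If tuning those two flags isn't enough, a common workaround (a sketch assuming jq is installed; mongoimport's default mode expects one document per line, so --jsonArray is dropped) is to convert the array to newline-delimited JSON first, so the file can be streamed rather than parsed as one large array:

# emit each array element as compact JSON, one document per line
jq -c '.[]' /data/db/cost1/cost500000.json > cost500000.ndjson
mongoimport --db cam --collection cost --file cost500000.ndjson

Afterwards you can sanity-check the result with the legacy mongo shell, e.g. mongo cam --eval "db.cost.count()".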
Upvotes: 1
Reputation: 2673
The problem was solved after upgrading MongoDB to the latest version, 3.2.11.
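To check which versions you are actually running before and after the upgrade (both binaries support a --version flag):

mongod --version
mongoimport --version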
Upvotes: 2
Reputation: 15036
Can you try reducing the batch size to 1 with --batchSize 1?
mongoimport --db my-db --collection my_collection -j 4 --file file.json --jsonArray --batchSize 1
You can also try increasing the -j parameter to 8 if you have that many logical cores (a quick way to check is shown below).
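To see how many logical cores the machine has (on Linux and macOS respectively):

nproc
sysctl -n hw.ncpu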
Upvotes: 0