Reputation: 4135
I am a newbie with MongoDB. I have thousands of files in different folders, all containing JSON data; there are more than 30 million files in total, so I think a document-based DB is the best way to store this data.
I am aware of this SO post: Import more than 1 json file using mongoimport. However, the accepted answer requires a collection that holds the file names, and I cannot put 30 million file names in a collection...
How can I import multiple JSON files into MongoDB on Windows?
Upvotes: 0
Views: 7564
Reputation: 31
I was looking for a solution for two days, and here is the one that works for me. From the MongoDB bin directory, run the loop as a single line:
C:\MongoDB\Server\3.0\bin> for %i in (C:\test\*) do mongoimport --file %i --type json --db mydb --collection mycollection
Just copy and paste this into cmd, changing the paths C:\MongoDB\Server\3.0\bin and C:\test\ to match your setup.
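If the files are spread across nested folders (as in the question), cmd's plain for loop won't recurse. Below is a minimal cross-platform sketch in Python that walks a directory tree and shells out to mongoimport; the db/collection names are placeholders, and dry_run=True just returns the commands so the file-discovery logic can be checked without a running MongoDB:

```python
import subprocess
from pathlib import Path

def import_tree(root, db="mydb", collection="mycollection", dry_run=False):
    """Run mongoimport for every *.json file under root, recursively.

    With dry_run=True the commands are collected and returned instead of
    executed, which makes it easy to verify which files would be imported.
    """
    commands = []
    for path in sorted(Path(root).rglob("*.json")):
        cmd = ["mongoimport", "--db", db, "--collection", collection,
               "--file", str(path)]
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)
    return commands
```

For 30 million files this still launches one mongoimport process per file, so expect it to take a while; the dry run is mainly useful for sanity-checking the glob before committing to that.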
Upvotes: 3
Reputation: 962
For anyone searching for a cross-platform solution to this, I created a little Perl script that does it. It takes a database argument and an optional directory argument, and imports any .json files it finds in that directory into MongoDB; if you don't give it a directory, it uses the current one. I'm sure this can be done with less code (I'm a newbie Perl monk), but it works and I like Perl, so to anyone who finds this: enjoy.
#!/usr/bin/perl
use strict;
use warnings;

# Enumerate every .json file in a folder and import it into MongoDB,
# using the file name (minus the extension) as the collection name.
my ($database, $directoryPath) = @ARGV;

if (!$database) {    # check for the required database argument
    die "A database argument must be provided to the script. Ex: perl mongorestore.pl wasp";
}

# If a directory path is not given in the arguments, operate in the current directory.
if (!$directoryPath) {
    $directoryPath = '.';
}

# Open the directory and import its JSON files into Mongo.
opendir my $dir, $directoryPath or die "Cannot open directory at path $directoryPath.";
my @files = readdir $dir;
closedir $dir;
importJSONToMongo(@files);

# Takes a list of file names and imports the .json ones into the given database.
sub importJSONToMongo {
    foreach my $file (@_) {
        if ($file =~ /^(.+)\.json$/) {    # only import *.json files; $1 captures the name without the extension
            system("mongoimport -d $database -c $1 --jsonArray --file $directoryPath/$file");
        }
    }
}
Upvotes: 1
Reputation: 139
You can create a batch script that fetches all the JSON files in a given folder and imports them into the db:
@echo off
for %%f in (*.json) do (
    "mongoimport.exe" --jsonArray --db databasename --collection collectionname --file "%%f"
)
Hope this helps.
Upvotes: 1
Reputation: 36784
You will need to write a script in your favourite language that reads each file, JSON-decodes it, and inserts the documents one by one into MongoDB. In PHP (with the legacy mongo driver), such a script would be akin to:
<?php
// Import every *.json file in the current directory into MongoDB.
$f = glob("*.json");
$m = new MongoClient;
$c = $m->myDb->myCollection;
foreach ( $f as $fileName )
{
    $contents = json_decode( file_get_contents( $fileName ) );
    if ( $contents === null ) { continue; } // skip files that are not valid JSON
    $c->insert( $contents );
}
?>
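With 30 million files, inserting one document at a time means one round trip per file, and batching the inserts helps considerably. Here is a sketch of the same loop in Python that handles only the reading and batching; the pymongo insert_many call (an assumption, as is the myDb/myCollection naming) is shown as a usage comment since it needs a running MongoDB:

```python
import json
from pathlib import Path

def load_batches(folder, batch_size=1000):
    """Yield lists of decoded JSON documents, batch_size at a time."""
    batch = []
    for path in sorted(Path(folder).glob("*.json")):
        with open(path) as fh:
            batch.append(json.load(fh))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch   # final partial batch

# Usage with pymongo (assumed installed; db/collection names are placeholders):
# from pymongo import MongoClient
# coll = MongoClient().myDb.myCollection
# for docs in load_batches("C:/test", batch_size=1000):
#     coll.insert_many(docs)
```

Separating the batching from the insert also makes the file-reading side easy to test without a database.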
Upvotes: 1