Reputation: 95
I created a Java application on OpenShift with the MongoDB cartridge. My application runs fine, both locally on JBoss AS7 and on OpenShift. So far so good. Now I would like to import a CSV file into the MongoDB on the OpenShift cloud. The command is fairly simple:
mongoimport -d dbName -c collectionName --type csv data.csv --headerline
This works fine locally, and I know how to connect to the OpenShift shell and the remote MongoDB. But my question is: how can I use a locally stored file (data.csv) when executing this command in an SSH shell?
I found this on the OpenShift forum, but I don't really know what this tmp directory is or how to use it. I work on Windows, so I use Cygwin as a shell substitute.
Thanks for any help
Upvotes: 5
Views: 3775
Reputation: 1454
Similar to Simon's answer, but this is how I imported a .json file into the database:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST -u admin -p 123456 --db dbname --collection grades < grades.json
Upvotes: 0
Reputation: 8111
For users who wish to use mongorestore, the following worked for me:
First, copy your dump using scp to the data dir on OpenShift:
scp yourfile.bson [email protected]:app-root/data
rhc ssh into your app and cd to the app-root/data folder.
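A minimal sketch of those two steps, assuming your app is named yourapp:
rhc ssh yourapp
cd app-root/data
Then run: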
mongorestore --host $OPENSHIFT_MONGODB_DB_HOST \
    --port $OPENSHIFT_MONGODB_DB_PORT \
    --username $OPENSHIFT_MONGODB_DB_USERNAME \
    --password $OPENSHIFT_MONGODB_DB_PASSWORD \
    -d yourdb \
    -c yourcollection \
    yourfilename.bson --drop
Upvotes: 1
Reputation: 21005
This is what I needed in October 2014:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST --port $OPENSHIFT_MONGODB_DB_PORT -u admin -p 123456789 -d dbName -c users /tmp/db.json
Note that I used a JSON file instead of CSV.
Upvotes: 4
Reputation: 885
When using OpenShift you must use the environment variables to ensure your values are always correct. See the OpenShift documentation to read more about OpenShift environment variables.
SSH into your OpenShift server, then run the following (remember to change the bold bits in the command to match your values):
mongoimport --headerline --type csv \
--host $OPENSHIFT_NOSQL_DB_HOST \
--port $OPENSHIFT_NOSQL_DB_PORT \
--db **your db name** \
--collection **your collection name** \
--username $OPENSHIFT_NOSQL_DB_USERNAME \
--password $OPENSHIFT_NOSQL_DB_PASSWORD \
--file ~/**your app name**/data/**your csv file name**
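The exact variable names differ between cartridge versions (other answers here use $OPENSHIFT_MONGODB_DB_HOST and friends); once you are SSH'd in, you can check which ones your gear actually defines:
env | grep OPENSHIFT_ | grep _DB_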
NOTE: When importing CSV files using mongoimport, the data is saved as strings and numbers only. It will not save arrays or objects. If you have arrays or objects to be saved, you must first convert your CSV file into a proper JSON file and then mongoimport the JSON file.
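If you do need arrays or nested objects, a minimal conversion sketch (assuming a hypothetical data.csv whose tags column holds semicolon-separated values that should become an array; the field name is made up for illustration):
python - <<'EOF'
import csv, json
# read the CSV, turn the "tags" column into an array, write one JSON document per line
with open('data.csv') as src, open('data.json', 'w') as dst:
    for row in csv.DictReader(src):
        row['tags'] = row['tags'].split(';')
        dst.write(json.dumps(row) + '\n')
EOF
Then import the resulting file the same way, just without the CSV options:
mongoimport --host $OPENSHIFT_NOSQL_DB_HOST \
--port $OPENSHIFT_NOSQL_DB_PORT \
--username $OPENSHIFT_NOSQL_DB_USERNAME \
--password $OPENSHIFT_NOSQL_DB_PASSWORD \
--db **your db name** \
--collection **your collection name** \
--file ~/**your app name**/data/data.json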
Upvotes: 3
Reputation: 95
I installed RockMongo on my OpenShift instance to manage the MongoDB. It's a nice user interface, a bit like phpMyAdmin for MySQL.
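For anyone looking for the command, adding RockMongo as an embedded cartridge is typically done with the rhc client; the exact cartridge name/version below is an assumption and may differ on your gear:
rhc cartridge list
rhc cartridge add rockmongo-1.1 -a yourAppName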
Upvotes: 1
Reputation: 23592
The tmp directory is shorthand for /tmp. On Linux, it's a directory that is cleaned out whenever you restart the computer, so it's a good place for temporary files.
So, you could do something like:
$ rsync data.csv openshiftUsername@openshiftHostname:/tmp
$ ssh openshiftUsername@openshiftHostname
$ mongoimport -d dbName -c collectionName --type csv /tmp/data.csv --headerline
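Since the question mentions Cygwin on Windows: if rsync isn't available there, plain scp does the same job:
$ scp data.csv openshiftUsername@openshiftHostname:/tmp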
Upvotes: 8