I've created a Docker image in order to seed my dockerized mongo instance:
FROM mongo:2.6
MAINTAINER Living Digital Way
COPY ./clients.init.json .
COPY ./users.init.json .
CMD mongoimport --host mongo --db lvdb --collection clients --type json --file ./clients.init.json --jsonArray --upsert --upsertFields client_id
CMD mongoimport --host mongo --db lvdb --collection users --type json --file ./users.init.json --jsonArray --upsert --upsertFields username
I first kick off my mongo instance:
docker run --name mongo -p 27017:27017 --hostname mongo mongo:2.6
After that, I run my image:
docker run --link mongo:mongo registry.private.com/mongo_seed:demo
This is the output:
# docker run --name mongo-seed --link mongo:mongo registry.private.com/mongo-seed:demo
Unable to find image 'registry.private.com/mongo-seed:demo' locally
v1: Pulling from mongo-seed
046d0f015c61: Already exists
ba95eb02831f: Already exists
53dc8636c4de: Already exists
a1ba40c46d70: Already exists
58b7d37cc7a7: Already exists
6fc4041cef29: Already exists
4cb494f83a39: Already exists
29839a673e80: Pull complete
cc731752cc1a: Pull complete
Digest: sha256:9a88d141b426fb7e96d2418f63c1587f6c055602d77c13ddd4ef744d66d6acc2
Status: Downloaded newer image for registry.private.com/mongo-seed:demo
connected to: mongo
2016-09-09T12:11:42.194+0000 imported 1 objects <<<<<<<<<<<<<<<<
As you can see, only the last CMD is executed.
Am I doing something wrong?
There can only be one CMD instruction in a Dockerfile. If you list more than one CMD, only the last CMD takes effect. I suggest you put the two commands into a separate import.sh, copy it into your container, and run it using CMD.
COPY ./clients.init.json .
COPY ./users.init.json .
COPY ./import.sh .
RUN ["chmod", "+x", "import.sh"] # -> only required, if import.sh is not executable
CMD ["import.sh"]
I made a backup of a MongoDB collection with the following command:
mongodump -h 127.0.0.1 --port 9001 -d meteor -c products
I have copied the dump folder recursively to my server with the following command
scp -r dump root@66.204.148.25:/root
I cannot restore it with the following command:
docker exec -i mongodb mongorestore -d mew -c audioQuestions_Joker dump/meteor
The files are there, but I get the following message:
2020-01-12T11:38:10.863+0000 Failed: mongorestore target 'dump/meteor' invalid: stat dump/meteor: no such file or directory
What would be the correct command to restore the collection backup from the BSON files?
FYI, the Docker image is abernix/meteord:node-8.4.0-base.
Thanks
You need to have sudo privileges.
1. Modify the docker-compose.* file for the MongoDB container and add a new volume.
Suppose you have something like this:
mongo:
  image: mongo:4.2
  container_name: mongodb
  ports:
    - 27017:27017
  volumes:
    - "./local/path:/data/db"
    - "./local/tmp/path:/home"
  restart: always
  command: --auth
2. You need to build:
sudo docker-compose -f docker-compose.yml up -d
3. Copy your dump folder into ./local/tmp/path and check that the MongoDB container has access to it.
sudo docker ps
sudo docker exec -it mongodb /bin/bash
# ls /home  -> if you see the dump folder there, continue with step 4
4. Now, execute the MongoDB restore (you may do it inside the MongoDB container):
mongorestore -d mew -c audioQuestions_Joker /home/dump/meteor
5. If you have restored successfully, update docker-compose again, remove - "./local/tmp/path:/home", and rebuild.
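Alternatively, a sketch that avoids editing docker-compose (assuming the container is named mongodb, as above, and the dump folder sits in your current directory): the original command failed because dump/meteor is resolved inside the container, where it doesn't exist, so copy the dump in first and restore from the in-container path.

# Copy the local dump folder into the container
docker cp dump mongodb:/tmp/dump
# Restore from the path as seen inside the container
docker exec -i mongodb mongorestore -d mew -c audioQuestions_Joker /tmp/dump/meteor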
I use Docker to develop.
docker exec -it <My-Name> mongo
I want to import data into MongoDB from a JSON file, but it fails.
The command is
mongoimport -d <db-name> -c <c-name> --file xxx.json
What can I do?
From your description, it seems that you have a data file in JSON format on your host machine, and you want to import this data into MongoDB running in a container.
You can follow these steps to do so.
#>docker cp xxx.json <container-name-or-id>:/tmp/xxx.json
#>docker exec <container-name-or-id> mongoimport -d <db-name> -c <c-name> --file /tmp/xxx.json
In the last step you have to use a file path that is available inside the container.
To debug further if required, you can log in to the container and run commands the way you would on a Linux machine.
#>docker exec -it <container-name-or-id> sh
sh $>cat /tmp/xxx.json
sh $>mongoimport -d <db-name> -c <c-name> --file /tmp/xxx.json
To run without copying the file:
docker exec -i <container-name-or-id> sh -c 'mongoimport -c <c-name> -d <db-name> --drop' < xxx.json
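Note that mongoimport expects one JSON document per line by default; if xxx.json instead holds a single JSON array of documents, add --jsonArray (placeholders as above):

docker exec -i <container-name-or-id> sh -c 'mongoimport -c <c-name> -d <db-name> --jsonArray --drop' < xxx.json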
Step 1: Navigate to the directory where the JSON file is located in your host terminal.
Step 2: Use the command "docker cp xxx.json mongo:/tmp/xxx.json" to copy the JSON file from the current host directory to the container's "tmp" directory.
Step 3: Enter the container's command shell with "docker container exec -it mongo bash".
Step 4: Import the collection from the "tmp" folder into the container's database with: "mongoimport --uri="<mongodb connection uri>" --collection=<c-name> --file /tmp/xxx.json"
What we have:
MongoDB running in a Docker container.
A JSON file on the local machine to import.
mongoimport.exe on the local machine.
What to do to import this JSON file as a collection:
mongoimport --uri=<connection-string> --collection=<collection-name> --file=<path-to-file>
Example:
mongoimport --uri="mongodb://localhost:27017/test" --collection=books --file:"C:\Data\books.json"
More details regarding mongoimport here: https://www.mongodb.com/docs/database-tools/mongoimport/
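If mongoimport is not installed on the local machine, a sketch of an alternative is to run it from the official mongo image instead (the paths are the example's; host.docker.internal resolves to the host on Docker Desktop):

docker run --rm -v C:\Data:/data mongo mongoimport --uri="mongodb://host.docker.internal:27017/test" --collection=books --file=/data/books.json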
I am trying to create a Docker container with MongoDB and import data into it. I have tried the following Dockerfile:
FROM mongo
# This will be created if it doesn't exist
WORKDIR /app/data/
# Copy dependency definitions
COPY mydata.csv .
ENTRYPOINT mongod
# Import data
RUN mongoimport --host=127.0.0.1 -d mydb -c reports --type csv --file mydata.csv --headerline
I get the following error:
Failed: error connecting to the db server: no reachable servers
Any suggestions? Thanks!
Try this:
mongoimport --host 127.0.0.1 --port <specifyPort> -d mydb -c reports --type csv --file mydata.csv --headerline
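Note that RUN executes while the image is being built, when no mongod is running, which is why mongoimport cannot reach a server. A sketch of another approach (with a hypothetical import.sh, relying on the official mongo image running scripts from /docker-entrypoint-initdb.d on the container's first startup):

FROM mongo
WORKDIR /app/data/
COPY mydata.csv .
# Executed by the image's entrypoint on first startup, against a local mongod
COPY import.sh /docker-entrypoint-initdb.d/

where import.sh contains:

#!/bin/bash
mongoimport -d mydb -c reports --type csv --file /app/data/mydata.csv --headerline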
Setup for the problem:
Create a data volume container
$ docker create --name dbdata -v /dbdata mongo /bin/true
Start mongo in a container linked to the data volume container
$ docker run -d --name mongo --volumes-from dbdata mongo
Verify you can connect to mongo using the mongo client
$ docker run -it --link mongo:mongo --rm mongo sh -c 'exec mongo "$MONGO_PORT_27017_TCP_ADDR:$MONGO_PORT_27017_TCP_PORT/test"'
The problem:
The docker-machine ssh command takes a host and a command argument to execute on the host. I'd like to execute the following mongodump command, which works once I ssh into the Docker host:
$ docker-machine ssh demo
root@demo:~# docker run --rm --link mongo:mongo -v $HOME:/backup mongo bash -c 'mongodump --out /backup --host $MONGO_PORT_27017_TCP_ADDR'
2015-09-15T16:34:02.676+0000 writing test.samples to /backup/test/samples.bson
2015-09-15T16:34:02.678+0000 writing test.samples metadata to /backup/test/samples.metadata.json
2015-09-15T16:34:02.678+0000 done dumping test.samples (1 document)
2015-09-15T16:34:02.679+0000 writing test.system.indexes to /backup/test/system.indexes.bson
However, using the docker-machine ssh command to execute the above command in a single step doesn't work for me:
$ docker-machine ssh demo -- docker run --rm --link mongo:mongo -v $HOME:/backup mongo bash -c 'mongodump --out /backup --host $MONGO_PORT_27017_TCP_ADDR'
SSH cmd error!
command: docker run --rm --link mongo:mongo -v /Users/tony:/backup mongo bash -c mongodump --out /backup --host $MONGO_PORT_27017_TCP_ADDR
err : exit status 1
output : 2015-09-15T16:53:07.717+0000 Failed: error connecting to db server: no reachable servers
So if the container running the mongodump command can't connect to the mongo container, I figure there's probably an issue with --host $MONGO_PORT_27017_TCP_ADDR (it should be passed as-is into the container, so is premature expansion producing an empty string?), but I'm a bit stumped trying to get it right. Any ideas are appreciated.
Update: I'm one step closer. The following appears to execute the command correctly, although the data isn't written to the system and the session hangs:
$ docker-machine ssh demo -- $(docker run --rm --link mongo:mongo -v $HOME:/backup mongo bash -c 'mongodump --out /backup --host $MONGO_PORT_27017_TCP_ADDR')
2015-09-15T18:02:03.347+0000 writing test.samples to /backup/test/samples.bson
2015-09-15T18:02:03.349+0000 writing test.samples metadata to /backup/test/samples.metadata.json
2015-09-15T18:02:03.349+0000 done dumping test.samples (1 document)
2015-09-15T18:02:03.350+0000 writing test.system.indexes to /backup/test/system.indexes.bson
The question asked for a solution based on docker-machine ssh, but since no one responded, I'll answer the question myself with what is a better solution anyway.
As suggested to me on Twitter by Nathan LeClaire (@upthecyberpunks), the better solution is to avoid the hassle altogether and simply run a container to execute the mongodump command.
$ docker run \
--rm \
--link mongo:mongo \
-v /root:/backup mongo bash \
-c 'mongodump --out /backup --host $MONGO_PORT_27017_TCP_ADDR'
Not technically required for the answer, but the resulting test db backup file can then be transferred from the Docker host machine to your current directory via docker-machine scp:
$ docker-machine scp -r dev:/root/test .
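For the reverse direction, a sketch along the same lines (same link and volume layout assumed) should restore the dump:

$ docker run \
    --rm \
    --link mongo:mongo \
    -v /root:/backup mongo bash \
    -c 'mongorestore --host $MONGO_PORT_27017_TCP_ADDR /backup'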
Since I cannot add a comment to the original nice answer, just a little explanation here: $MONGO_PORT_27017_TCP_ADDR should be the IP of our machine. For example, my virtual machine's IP in VirtualBox is 100.100.100.10, so the last line should be:
-c 'mongodump --out /backup --host 100.100.100.10' or
-c 'mongodump --out /backup --host 100.100.100.10:27017'.
If the host field is not added, chances are that we will encounter an error like:
*** Failed: error connecting to db server: no reachable servers.
And thanks again to the original answer ^_^.
I created a Java application on OpenShift with the MongoDB cartridge.
My application runs fine, both locally on JBoss AS7 and on OpenShift.
So far so good.
Now I would like to import a CSV into the MongoDB on the OpenShift cloud.
The command is fairly simple:
mongoimport -d dbName -c collectionName --type csv data.csv --headerline
This works fine locally, and I know how to connect to the OpenShift shell and the remote mongo-db. But my question is: how can I use a locally stored file (data.csv) when executing this command in an ssh shell?
I found this on the OpenShift forum, but I don't really know what this tmp directory is or how to use it.
I work on Windows, so I use Cygwin as a shell substitute.
Thanks for any help
The tmp directory is shorthand for /tmp. On Linux, it's a directory that is cleaned out whenever you restart the computer, so it's a good place for temporary files.
So, you could do something like:
$ rsync data.csv openshiftUsername@openshiftHostname:/tmp
$ ssh openshiftUsername@openshiftHostname
$ mongoimport -d dbName -c collectionName --type csv /tmp/data.csv --headerline
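A one-step sketch of an alternative (same placeholder username and hostname): since mongoimport reads from stdin when --file is omitted, the file can be piped over ssh without copying it first.

$ ssh openshiftUsername@openshiftHostname "mongoimport -d dbName -c collectionName --type csv --headerline" < data.csv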
This is what I needed in October 2014:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST --port $OPENSHIFT_MONGODB_DB_PORT -u admin -p 123456789 -d dbName -c users /tmp/db.json
Note that I used a JSON file instead of CSV.
When using OpenShift you must use the environment variables to ensure your values are always correct. See the OpenShift documentation on environment variables to read more.
SSH into your OpenShift server, then run the following (remember to change the bold bits in the command to match your values):
mongoimport --headerline --type csv \
--host $OPENSHIFT_NOSQL_DB_HOST \
--port $OPENSHIFT_NOSQL_DB_PORT \
--db **your db name** \
--collection **your collection name** \
--username $OPENSHIFT_NOSQL_DB_USERNAME \
--password $OPENSHIFT_NOSQL_DB_PASSWORD \
--file ~/**your app name**/data/**your csv file name**
NOTE
When importing CSV files using mongoimport, the data is saved as strings and numbers only. It will not save arrays or objects. If you have arrays or objects to be saved, you must first convert your CSV file into a proper JSON file and then mongoimport the JSON file.
I installed RockMongo on my OpenShift instance to manage the MongoDB.
It's a nice user interface, a bit like phpMyAdmin for MySQL.
For users who wish to use mongorestore, the following worked for me:
First copy your dump to the data dir on OpenShift using scp:
scp yourfile.bson yourhex@yourappname.rhcloud.com:app-root/data
rhc ssh into your app, cd to the app-root/data folder, and run:
mongorestore --host $OPENSHIFT_MONGODB_DB_HOST \
  --port $OPENSHIFT_MONGODB_DB_PORT \
  --username $OPENSHIFT_MONGODB_DB_USERNAME \
  --password $OPENSHIFT_MONGODB_DB_PASSWORD \
  -d yourdb \
  -c yourcollection \
  yourfilename.bson --drop
Similar to Simon's answer, but this is how I imported a .json file into the database:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST -u admin -p 123456 --db dbname --collection grades < grades.json