mongodump is abruptly getting killed inside a container of mongo:latest image - mongodb

I am running a container with image mongo:latest. Starting the container:
sudo docker run -it -v /tmp/adhock-container/mongo_latest:/tmp/ mongo:latest /bin/bash
I want to take the backup of Mongo database so I run the following:
# mongodump -vvvv --host MY_HOSTNAME --port 27017 --username MY_USERNAME --password MY_PASSWORD --gzip --archive=ARCHIVE_PATH
In the console log I am getting only the following output:
2020-04-06T08:00:08.007+0000 done dumping ******** (1104 documents)
2020-04-06T08:00:08.007+0000 writing ************ to archive 'standalone.gzip'
2020-04-06T08:00:08.007+0000 MuxIn open ************_log
Killed
#
Server memory stats
$ free -m
              total        used        free      shared  buff/cache   available
Mem:            459         328           6           0         124         123
Swap:             0           0           0
Not sure why my mongodump process is getting killed. I suspect it is running out of memory, but I am not sure how to tune things to avoid that.

I would start by consulting the MongoDB server log. See "Container shell access and viewing MongoDB logs" here if this is the image you are using.
To determine whether your container is running out of memory, get a shell on it and run top while dumping.
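A quick way to confirm the memory theory (a general Linux/Docker check, not specific to the mongo image): an abrupt "Killed" with no other output is the usual signature of the kernel OOM killer, which logs to the kernel ring buffer on the host.
# On the Docker host, look for OOM-killer messages around the time of the failure
dmesg -T | grep -iE 'out of memory|killed process'
# And watch the container's memory usage live while the dump runs
docker stats <container-name-or-id>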

I faced the same issue; it happened when I was executing mongodump on the database host itself. I found a small workaround. My infrastructure looked like this:
Source Mongo <---> My PC <---> Dest Mongo (K8S)
I ran mongodump and mongorestore on my PC, inside a container, connected to the remote machines: dumping from one and restoring to the other. This was slower, but the process was not killed.
Run the tools on your PC and make sure you can reach the remote databases, or set up port forwarding (see the sketch at the end of this answer).
$ docker run --network host --entrypoint bash -it mongo:4.2
Dump the archive
$ mongodump --archive=test.archive --db db --username username --host <address>
Restore the archive from inside the container
$ mongorestore --archive=test.archive --db db --username username --host <address>
This is the only way I managed to keep mongodump from crashing. If someone else runs into this issue, I would consider running a mongo container on your own machine (or some other machine), doing the dump from there, and then restoring as described above.
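For the port-forwarding part: if the destination really is inside Kubernetes, as in the diagram above, something like the following works from the PC in the middle (a sketch; the service name mongo, the namespace, and the local port are assumptions):
# Forward the cluster's mongo service to localhost:27018 on your PC
kubectl port-forward svc/mongo 27018:27017
# Since the container above uses --network host (Linux), it can reach that forward directly
mongorestore --archive=test.archive --db db --username username --host 127.0.0.1 --port 27018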

Related

How can I import a data file from local and build an image with that data using MongoDB

I built an image (userdb) with the usersInfo.js data file using this Dockerfile:
FROM mongo
COPY usersInfo.js /data/db
COPY script.sh .
RUN chmod +x script.sh
CMD [ "./script.sh"]
and the script.sh file contains:
mongoimport --host 127.0.0.1 --db users --collection usersInfo --drop \
  --file /data/db/usersInfo.js
when I run the container as:
docker run --name test -it userdb
it gives me this error:
2017-11-10T23:14:39.781+0000 [........................] users.usersInfo 0B/10.9KB (0.0%)
2017-11-10T23:14:40.286+0000 [........................] users.usersInfo 0B/10.9KB (0.0%)
2017-11-10T23:14:40.286+0000 Failed: error connecting to db server: no reachable servers
Even when I run the server on my local computer, it still does not work.
What I am trying to do is import the usersInfo.js file from the image (userdb) itself, so why is it trying to connect to a server here? How can I fix it?
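One way around this (a sketch based on the official mongo image's init hook, not something from the question): any *.sh script placed in /docker-entrypoint-initdb.d is executed on the container's first start, after a temporary mongod is already listening on 127.0.0.1, so mongoimport has a server to talk to. Paths and file names here are assumptions:
FROM mongo
# keep the data file out of /data/db so it does not interfere with initialization
COPY usersInfo.js /seed/usersInfo.js
COPY import.sh /docker-entrypoint-initdb.d/import.sh
and import.sh as:
#!/bin/bash
# runs once, on first start with an empty data directory, while the init mongod is up
mongoimport --host 127.0.0.1 --db users --collection usersInfo --drop --file /seed/usersInfo.js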

How to restore remote MongoDB server with local mongodump data

We have a remote MongoDB server and we have mongodump data on a local developer's machine. What is the best way to restore the remote MongoDB server data with the local data? Is there a mongo command that we can use?
Alright, so we did this in two steps. I think you can do it in one step with just mongorestore.
First we moved the data from the local machine to the remote machine with the scp command:
scp <path-to-mongofile> <remote-host>:<absolute-file-path>
then we ssh'd into the remote mongod server, and used mongorestore to restore the db
mongorestore --host=$HOST --port=$PORT -u $ADMIN_USER -p $PSWD --db <your-db> <absolute-path-to-restore-db> --authenticationDatabase "admin"
but I think the first scp command is redundant. In fact, if you cannot ssh into the server running mongod, then you will have to use the mongorestore command directly from the local developer's machine.
Just use mongorestore but point it towards the remote server, such as:
$ mongorestore -h ds01234567.mlab.com:12345 -d heroku_fj33kf -u <user> -p <password> <input db directory>
From MongoLab's docs

Calling mongodump wrapped into docker

My setup is as follows:
there is a MongoDB replica set (v. 2.4.8), which I would like to back up via mongodump
there is a machine (a NAS) outside the replica set that should perform the backup task but does not have the mongodump binary installed. It does, however, have Docker support.
So my idea is to use Docker to perform the mongodump on the NAS. A shell script "mongodump.sh" should wrap the docker call to mongodump with all the needed parameters, and I would call it like this:
mongodump.sh --host rs/url -u backup -p "password" --out ./dump/
Is this possible with docker? What would the shell script look like?
I found the solution. The command I use to perform the mongodump via docker is:
docker run --rm --name some-mongo -v /volume1/Backups/mongodump:/dumps --entrypoint mongodump mongo --host rs1/myserver.net -u backup -p "password" --out /dumps
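And to answer the original question of what mongodump.sh itself could look like, here is a minimal sketch (it assumes the same /volume1/Backups/mongodump backup directory; all arguments are passed straight through to mongodump, so --out should point at /dumps, the path inside the container):
#!/bin/sh
# mongodump.sh - run mongodump from the official mongo image, mounting the NAS backup dir
exec docker run --rm --name some-mongo \
  -v /volume1/Backups/mongodump:/dumps \
  --entrypoint mongodump mongo "$@"
Called as: ./mongodump.sh --host rs1/myserver.net -u backup -p "password" --out /dumps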

MongoDB Container Dockerfile no reachable servers

I'm trying to build a simple Dockerfile that copies files from the current directory into the container and then runs a mongorestore command to seed the data. I've looked at many different websites and I'm still getting the following error.
2016-08-17T03:03:22.639+0000 Failed: error connecting to db server: no reachable servers
The command '/bin/sh -c mongorestore --drop /mongo-seed/mongo-seed-data/mongo-dump --host 127.0.0.1:27017' returned a non-zero code: 1
When I "bash" into the container and run the mongorestore command with the same parameters it populates database. I'm at a loss, please help.
Below is the Dockerfile
FROM mongo
COPY . /mongo-seed
EXPOSE 27017
CMD ["mongod"]
RUN mongorestore --drop /mongo-seed/mongo-seed-data/mongo-dump --host 127.0.0.1:27017
CMD is run when you start the container, so mongod is not running when Docker executes the last RUN instruction of your Dockerfile while building the image.
FROM mongo
COPY . /mongo-seed
# EXPOSE 27017 //not necessary, the mongo base image already has that instruction
ENTRYPOINT mongod
Build and run: docker build -t foo . && docker run -d --name bar foo
Execute the mongorestore command:
docker exec bar mongorestore --drop ...
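One caveat (my addition, not part of the answer above): right after docker run returns, mongod may not be accepting connections yet, so the exec can fail with the same "no reachable servers" error. A small wait loop helps; mongosh is assumed here (use mongo on older images):
# wait until mongod answers a ping, then run the restore
until docker exec bar mongosh --quiet --eval 'db.runCommand({ ping: 1 })' >/dev/null 2>&1; do
  sleep 1
done
docker exec bar mongorestore --drop /mongo-seed/mongo-seed-data/mongo-dump --host 127.0.0.1:27017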

How to allow remote connections from mongo docker container

I am using the official mongodb docker container.
I want to connect to the mongodb container from my host machine on port 27017.
I ran the container with these ports exposed
-p 27017:27017
I am not able to connect (connection refused), and I believe it's because the mongo conf file is not configured to allow remote connections. How can I configure it to allow them? The official container does not have vi/nano installed, so I cannot edit the config inside the image.
I am able to connect to mongodb from another container by creating a link, but that is not what I want.
Better resources for further reading:
https://blog.madisonhub.org/setting-up-a-mongodb-server-with-auth-on-docker/
https://docs.mongodb.com/v2.6/tutorial/add-user-administrator/
See also my answer to another question: How to enable authentication on MongoDB through Docker?
Here's what I did for the same problem, and it worked.
Run the mongo docker instance on your server
docker run -d -p 27017:27017 -v ~/dataMongo:/data/db mongo
Open bash on the running docker instance.
docker ps
CONTAINER ID   IMAGE   COMMAND                  CREATED          STATUS          PORTS                      NAMES
b07599e429fb   mongo   "docker-entrypoint..."   35 minutes ago   Up 35 minutes   0.0.0.0:27017->27017/tcp   musing_stallman
docker exec -it b07599e429fb bash
root@b07599e429fb:/#
Reference- https://github.com/arunoda/meteor-up-legacy/wiki/Accessing-the-running-Mongodb-docker-container-from-command-line-on-EC2
Enter the mongo shell by typing mongo.
root@b07599e429fb:/# mongo
For this example, I will set up a user named ian and give that user read & write access to the cool_db database.
> use cool_db
> db.createUser({
user: 'ian',
pwd: 'secretPassword',
roles: [{ role: 'readWrite', db:'cool_db'}]
})
Reference: https://ianlondon.github.io/blog/mongodb-auth/ (First point only)
Exit from the mongo shell and bash, and stop the first container (the new one needs the same host port).
Now run the mongo container again with auth enabled.
docker run -d -p 27017:27017 -v ~/dataMongo:/data/db mongo mongod --auth
Reference: How to enable authentication on MongoDB through Docker? (Usman Ismail's answer to this question)
I was able to connect to the instance running on a Google Cloud server from my local Windows laptop using the command below.
mongo <ip>:27017/cool_db -u ian -p secretPassword
Reference: how can I connect to a remote mongo server from Mac OS terminal
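To tie this back to backups: once remote access and authentication are working, mongodump can be pointed at the container from outside in the same way (a sketch using the ian/cool_db user created above; replace <ip> with your server's address):
mongodump --host <ip> --port 27017 -u ian -p secretPassword \
  --authenticationDatabase cool_db --db cool_db \
  --gzip --archive=cool_db.archive.gz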