Can't connect to remote mongodb with macOS - mongodb

I am trying to connect to a MongoDB service hosted on IBM Cloud, following these instructions.
When I run the following command
mongo -u $USERNAME -p $PASSWORD --ssl --sslCAFile c5f07836-d94c-11e8-a2e9-62ec2ed68f84 --authenticationDatabase admin --host replset/bdb98a3ac10-0.b8a5e798d2d04f2e860d042c915.databases.appdomain.cloud:30484,bd576-96db98a3ac10-1.b8a5e4e5d042c915.databases.appdomain.cloud:30484
I get this error on macOS, while on Windows 10 the connection is established correctly:
SSL peer certificate validation failed: Certificate trust failure:
Invalid Extended Key Usage for policy; connection rejected
If I connect via MongoDB Compass instead of the terminal, the connection works.

I had to add the --sslAllowInvalidCertificates flag:
https://docs.mongodb.com/manual/reference/configuration-options/#net.ssl.allowConnectionsWithoutCertificates
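For reference, the connecting command then looks something like this (same credentials, CA file and hosts as above; note that the flag skips server certificate validation, so it is a workaround rather than a proper fix):
mongo -u $USERNAME -p $PASSWORD --ssl --sslCAFile c5f07836-d94c-11e8-a2e9-62ec2ed68f84 --sslAllowInvalidCertificates --authenticationDatabase admin --host replset/bdb98a3ac10-0.b8a5e798d2d04f2e860d042c915.databases.appdomain.cloud:30484,bd576-96db98a3ac10-1.b8a5e4e5d042c915.databases.appdomain.cloud:30484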

Related

Windows Batch Script for Automated Mongodb Backup

I have set up a superuser in MongoDB and I am able to do a backup manually on the command line, or connect to the database with the username and password in the console or in Compass, but if I move that command to a batch script and run the script, I always get the error:
Failed: can't create session: could not connect to server: connection() : auth error: sasl conversation error: unable to authenticate using mechanism "SCRAM-SHA-256": (AuthenticationFailed) Authentication failed.
The command is:
mongodump --username myusername --password mypassword --out C:\backups --db mydb --authenticationDatabase admin
I have updated the bindIp to 0.0.0.0 and added security: authorization: "enabled".
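In mongod.conf terms, those two settings look roughly like this (everything else omitted):
net:
  bindIp: 0.0.0.0
security:
  authorization: "enabled"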
Any thoughts on why I can't run this in the batch script? I am using Mongo 4.2.
Thanks
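For what it's worth, one difference between typing the command at a prompt and running it from a .bat file is how the batch interpreter treats special characters in the arguments. A minimal sketch with the credentials quoted (same names as in the question; this is a guess at the cause, not a confirmed fix):
mongodump --username "myusername" --password "mypassword" --authenticationDatabase admin --db mydb --out "C:\backups"
If the real password contains a literal %, it also needs to be doubled to %% inside a batch file, since the interpreter otherwise swallows it.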

Programmatically connect to remote MongoDB with SSH

I need to use the terminal to connect to MongoDB. I have almost precisely the same issue as this StackExchange question.
In my case I can connect correctly with Robo3T, and the following command works locally:
mongo --host 111.111.111.111 --port 111 --authenticationDatabase DB --username USER --password PASS
With the same command executed remotely, I receive the following error:
No connection could be made because the target machine actively refused it.
I wanted to recreate my Robo3T connection setup exactly, to see whether an SSH tunnel solves my issue.
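A rough sketch of what that tunnel would look like (host, port and credentials are the placeholders from above; SSH_USER stands for whatever account the Robo3T SSH tab uses, and the local port 27018 is arbitrary):
ssh -N -L 27018:127.0.0.1:111 SSH_USER@111.111.111.111
mongo --host 127.0.0.1 --port 27018 --authenticationDatabase DB --username USER --password PASS
The first command forwards local port 27018 to port 111 on the server's loopback interface, so the second command connects as if the database were local.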

Why can't `mongoimport` connect to my free-tier MongoDB Atlas cluster?

I have a free-tier MongoDB Atlas cluster, and I want to upload some JSON to it from my Mac.
I tried to run this:
mongoimport -h cluster0-shard-00-00-cxacx.mongodb.net:27017 -d literature -c books -u xxxx -p xxxx --file ~/Desktop/books.json --type json --ssl
However, I got this error:
Failed: error connecting to db server: no reachable servers, openssl error: SSL errors: SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version
I tried without --ssl in the command, and I got this instead:
Failed: error connecting to db server: no reachable servers
Why can't it reach the server?
More info:
My mongoimport version is r3.2.22.
My OS is macOS 10.14.5.
My cluster version is 4.0.10.
The user has admin permission.
My IP address is white-listed.
I can connect to the cluster with the MongoDB Node Driver.
I get the same problem with mongofiles.
I also tried variations of the command, with no success, including --authenticationDatabase admin, the name of the replica set, and the other server nodes.
Update: I was able to upload the JSON file by sending it to a Linux cloud server and running mongoimport from there.
MongoDB Atlas gives you service records (SRV), not a plain host.
"cluster0-shard-00-00-cxacx.mongodb.net" is not a proper host on its own.
Use it like this:
mongoimport --uri "mongodb://root:<PASSWORD>@atlas-host1:27017,atlas-host2:27017,atlas-host3:27017/?ssl=true&replicaSet=myAtlasRS&authSource=admin" -d literature -c books --file ~/Desktop/books.json --type json
You will get the complete URI from the MongoDB Atlas connect screen.
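With tools newer than the r3.2.22 mongoimport mentioned above (the 3.2 tools are too old for --uri and SRV strings, as far as I know), the connect screen typically hands out an SRV-style string, roughly like this (cluster name and credentials are illustrative):
mongoimport --uri "mongodb+srv://xxxx:xxxx@cluster0-cxacx.mongodb.net/literature" -c books --file ~/Desktop/books.json --type json
The +srv form resolves the individual shard hosts via DNS and enables TLS by default, and a newer client also speaks a newer TLS version, which is likely what the "tlsv1 alert protocol version" error is about.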

How do I access a remote AWS Lightsail MongoDB over an SSH tunnel

I have a Lightsail AWS instance up and running with a MEAN stack. I have an existing MEAN stack running on a different network. At the moment the node server.js connects to localhost for the Mongo bit (on the same machine), and all I want to do is replace that localhost connection with a connection to the Mongo running on my remote AWS server.
I understand that, for security reasons, it is best to tunnel this connection over SSH, which I think I am familiar with.
What I have done so far is this:
In a console on the machine hosting the node server (remote to the db) I have run:
ssh -L 8181:127.0.0.1:80 -i ~/LightsailDefaultPrivateKey-eu-west-2.pem bitnami@31.16.56.125 -N
I can then browse to the RockMongo UI from the local machine using localhost:8181/rockmongo ...yay.
If I then run the following:
ssh -L 8181:127.0.0.1:27017 -i ~/LightsailDefaultPrivateKey-eu-west-2.pem bitnami@31.16.56.125 -N
(27017 being the mongo port)
Then try and access the db from my remote machine using:
mongo --username XXXXXX --password XXXXXX 31.16.56.125:8181/testdata
I get the following error:
2017-12-28T22:11:09.791+0000 Error: couldn't connect to server 31.16.56.125:8181 (31.16.56.125), connection attempt failed at src/mongo/shell/mongo.js:148
exception: connect failed
Am I doing this wrong? I.e. is the tunnel only for HTTP connections and not for mongo command-line use? Do I need to test the connection some other way?
I've Googled all over the place for this and not had much luck (a lot of the AWS docs suggest punching a hole in the firewall - which one can no longer do!)
OK, I've (partially) solved this; there were a few things wrong.
1) The mongo client was 2.6 and the mongo running on AWS was 3.4. Upgrading the client solved some issues, in that I started getting more meaningful error messages.
One thing I did have trouble with is that apt-get seemed to perform an update, yet the version reported when issuing the mongo command was still 2.6.
To solve this I had to run sudo apt-get purge mongodb-org* (note the asterisk), then perform the update. If you need to do this, follow these instructions:
https://docs.mongodb.com/v3.2/tutorial/install-mongodb-on-ubuntu/
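Roughly the sequence I mean, assuming the MongoDB apt repository from that tutorial is already set up (package names as per the tutorial; install the full mongodb-org metapackage instead if you want more than the shell):
sudo apt-get purge mongodb-org*
sudo apt-get update
sudo apt-get install -y mongodb-org-shell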
2) This command
mongo --username XXXXXX --password XXXXXX 31.16.56.125:8181/testdata
won't work because I've omitted 'admin' from it and haven't specified localhost,
but
mongo admin --username XXXXXX --password XXXXXX localhost:8181/testdata
doesn't work either, and gives the following output:
2018-01-03T22:00:42.380+0000 W NETWORK [thread1] Failed to connect to 127.0.0.1:27017, in(checking socket for error after poll), reason: errno:111 Connection refused
2018-01-03T22:00:42.380+0000 E QUERY [thread1] Error: couldn't connect to server 127.0.0.1:27017, connection attempt failed :
connect@src/mongo/shell/mongo.js:229:14
@(connect):1:6
The only command I could get to work is:
mongo admin --username XXXXXX --password XXXXXX --port 8181
The default host is localhost, so in this case it uses the tunnel; this also just connects to the test db, and you can then admin from there.
What I haven't got to the bottom of is the host:port/db form of the db address argument, as per the output from running mongo --help:
usage: mongo [options] [db address] [file names (ending in .js)]
db address can be:
  foo                    foo database on local machine
  192.168.0.5/foo        foo database on 192.168.0.5 machine
  192.168.0.5:9999/foo   foo database on 192.168.0.5 machine on port 9999
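Reading that usage line again, my guess is that in mongo admin --username XXXXXX --password XXXXXX localhost:8181/testdata the word admin is already taken as the db address, so the shell falls back to the default 127.0.0.1:27017 and ignores the second positional argument, which matches the error above. A sketch with the db address as the only positional argument (same tunnel on local port 8181; not verified against this exact setup):
mongo localhost:8181/testdata --username XXXXXX --password XXXXXX --authenticationDatabase admin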

mongodump and mongorestore with SSL

Getting mongodump and mongorestore to work with security enabled is proving quite troublesome.
I have mongod v3.4.1 with requireSSL running at 192.168.99.100. That is the IP address of the VirtualBox docker machine running on my Windows host; it is just for testing, of course.
The instance is already configured to use TLS/SSL, with both the server and client certificates signed by the same CA. I use the IP address as the mongod Common Name to allow hostname validation. Authentication is already enabled to accept my client certificate.
So everything is working. I can connect to it like this:
mongo --ssl --host 192.168.99.100 --sslCAFile rootCA.pem --sslPEMKeyFile me.pem
but now I can't get either mongodump or mongorestore working:
mongodump --ssl --host 192.168.99.100 --sslCAFile rootCA.pem --sslPEMKeyFile me.pem -d olddb
mongorestore --ssl --host 192.168.99.100 --sslCAFile rootCA.pem --sslPEMKeyFile me.pem -d newdb --dir=dump/olddb
Both return this error:
2017-01-13T04:28:03.881+0800 Failed: error connecting to db server: no reachable servers, openssl error: Host validation error
I have tried turning off the client certificate and using a username/password instead, but it still did not work. I have to remove SSL entirely to make them work.
That means I can only use preferSSL in production.
There is no way to bypass SSL on localhost if I stick with requireSSL.
Is anyone else getting the same error? Is it a known issue?
Add this option to the command line:
--sslAllowInvalidHostnames
Full connection sample:
mongo --host 192.168.99.100 --username luke --password skywalker --authenticationDatabase admin --ssl --sslCAFile rootCA.pem --sslPEMKeyFile me.pem --sslAllowInvalidHostnames
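The same flag should work for the dump and restore commands from the question, along these lines (a sketch, not tested against that exact setup):
mongodump --ssl --host 192.168.99.100 --sslCAFile rootCA.pem --sslPEMKeyFile me.pem --sslAllowInvalidHostnames -d olddb
mongorestore --ssl --host 192.168.99.100 --sslCAFile rootCA.pem --sslPEMKeyFile me.pem --sslAllowInvalidHostnames -d newdb --dir=dump/olddb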
First, check your logs at /var/log/mongodb/mongod.log.
There is also a default path for SSL certificates; on Unix-based systems the certificate is typically located at /etc/ssl/mongodb.pem.
As I understand this problem, the certificate path needs to be checked: the SSL certificate is not where your Windows environment expects it, so try giving the full path to the certificate.
Also look into https://docs.mongodb.com/manual/tutorial/configure-ssl-clients/
Happy coding
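For example, with absolute Windows paths to the certificate files (the paths here are hypothetical; use wherever your rootCA.pem and me.pem actually live):
mongodump --ssl --host 192.168.99.100 --sslCAFile C:\certs\rootCA.pem --sslPEMKeyFile C:\certs\me.pem -d olddb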