Not able to migrate the data from Parse to local machine - mongodb

As some of you might be aware, the Parse service is shutting down in about a year, so I am following the migration process described in their tutorials. However, I am not able to migrate the data from Parse to my local database (i.e. MongoDB).
I've started the MongoDB instance locally on port 27017 and also created an admin user as part of the migration, based on these tutorials: Reference-1 & Reference-2.
But when I try to migrate the data from the Parse developer console, I get "No Reachable Servers" or a network error, and I don't understand why. I suspect the connection string I am using is the problem, but I am not sure; please see the following image.
I am new to MongoDB, so I don't have much of an idea about this; any help would be greatly appreciated.

Since the migration tool runs at parse.com, the tool needs to be able to access your MongoDB instance over the Internet.
Since you're using a local IP (192.168.1.101), parse.com cannot connect to your IP and the transfer will time out.
Either you need to make your MongoDB reachable from the Internet, or you can - as they do in their guide - use an existing MongoDB service.
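As a quick sanity check before pasting a connection string into the Parse migration dialog, you can verify that the host in the string is reachable at all. This is only a sketch with the Node.js MongoDB driver; the hostname, credentials, and database name below are placeholders, not values from the question.

// npm install mongodb
const { MongoClient } = require('mongodb');

// A LAN address such as mongodb://admin:secret@192.168.1.101:27017/parsedb is only
// reachable from inside your own network; parse.com needs a public hostname or IP
// (port-forwarded router, cloud VM, or a hosted MongoDB service).
const uri = 'mongodb://admin:secret@your-public-host.example.com:27017/parsedb';

async function main() {
  const client = new MongoClient(uri, { serverSelectionTimeoutMS: 5000 });
  try {
    await client.connect();                          // fails fast if the host cannot be reached
    await client.db('admin').command({ ping: 1 });   // round trip to confirm auth and reachability
    console.log('Reachable: this URI has a chance of working in the migration dialog');
  } finally {
    await client.close();
  }
}

main().catch(err => console.error('Not reachable:', err.message));

Note that running this from inside your own network only proves local reachability; the real test is whether parse.com's servers can reach that host, so run it from a machine outside your LAN if you can.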

Related

Transfer MongoDB dump on external hard drive to google cloud platform

As part of my thesis project, I have been given a MongoDB dump of 240 GB which is on my external hard drive. I have to use this data to run my Python scripts for a short duration. However, since my dataset is huge and I cannot mongoimport it into my local MongoDB server (I don't have enough internal memory), my professor gave me a $100 Google Cloud Platform coupon so I can use Google's cloud computing resources.
So far I have found that I can do it this way:
Create a Compute Engine VM in GCP, install MongoDB on it, transfer the MongoDB dump to the remote instance, and run the scripts there to get the output.
This method works well, but I'm looking for a way to create a remote database server in GCP so that I can run my scripts locally, something like one of the following:
Creating a remote MongoDB server on GCP so that I can establish a remote Mongo connection and run my scripts locally.
Transferring the MongoDB dump to Google's Datastore so that I can use the Datastore API to connect remotely and run my scripts locally.
I have given some thought to using MongoDB Atlas, but because of the size of the data I would be billed heavily, and I cannot use my GCP coupon there.
Any help or suggestions on how either of the two methods can be implemented are appreciated.
There are two parts to your question.
First, you can create a Compute Engine VM with MongoDB installed and load your backup onto it. Then open the right firewall rules to allow connections from your local environment to the Google Compute Engine VM. The connection is made with a simple login/password, as sketched below.
You can use a static IP on your VM. That way, if the VM reboots, you keep the same IP (which makes your local connection easier).
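A minimal sketch of that login/password setup; the user, password, database name, and STATIC_IP below are placeholders, not anything Google-specific.

// On the GCE VM, in the mongo shell: create a user for remote access
// (user, password, and database name are placeholders).
db.getSiblingDB('admin').createUser({
  user: 'thesis',
  pwd: 'a-strong-password',
  roles: [{ role: 'readWrite', db: 'thesisdb' }]
});

// From your local environment, connect with a URI of this form
// (STATIC_IP is the VM's reserved external address; port 27017 must be allowed
// by the GCP firewall rule and mongod must listen on an external interface):
//
//   mongodb://thesis:a-strong-password@STATIC_IP:27017/thesisdb?authSource=admin

The same URI works from the mongo shell, PyMongo, or any other driver running on your local machine.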
Second, BE CAREFUL with Datastore. It's a good product: a serverless, document-oriented NoSQL database, but it is absolutely not a MongoDB equivalent. You can't perform aggregations, you are limited in search capabilities, and so on. It's designed for specific use cases (I don't know yours, but don't assume it is the MongoDB equivalent!).
Anyway, if you use Datastore, you will have to use a service account, or install the Google Cloud SDK on your local environment, to be authenticated and able to call the Datastore API. There is no login/password in this case.
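If you do go the Datastore route, access from a local script typically goes through a service account key picked up from an environment variable. Here is a minimal sketch with the Node.js Datastore client; the kind, key name, and file path are made up for illustration.

// npm install @google-cloud/datastore
// Authenticate with a service account key downloaded from the GCP console:
//   export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
const { Datastore } = require('@google-cloud/datastore');

async function main() {
  const datastore = new Datastore();                        // reads credentials from the env var
  const key = datastore.key(['ThesisRecord', 'sample-1']);  // hypothetical kind and name
  await datastore.save({ key, data: { value: 42 } });       // write one entity
  const [entity] = await datastore.get(key);                // read it back
  console.log(entity);
}

main().catch(console.error);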

How can we access Bluemix hosted "Compose for MongoDB" service from "outside"?

Situation:
Created a new Compose for MongoDB service instance in Bluemix today
Need:
I have to access this MongoDB DIRECTLY with tools (e.g. Mongo Management Studio Pro, mongo.exe, etc.) for bulk loading, testing, ad-hoc data fixes, etc.
Problem:
I have not found any docs, samples nor a CLEAR statement that
a) gives me some confirmation that THIS is possible
b) gives me COMPLETE information (not just some technical fragments that might have worked a year ago) on how to do it.
Maybe I am looking in the wrong places or do not know the right people. However, I am stuck on this, and before quitting Bluemix MongoDB, maybe somebody has a copy/paste solution or a hands-on step-by-step manual.
Any help welcome. Thanks!
Connecting to the MongoDB service in Bluemix from an application is possible. For this answer I have used the application "Robo3T"; here are the steps:
1. Access your MongoDB service in your Bluemix account, usually under "Cloud Foundry Services".
2. Open the "Manage" section and, from "Connection Settings", copy the "HTTPS" connection address and port. In this example: "sl-eu-lon-2-portal.5.dblayer.com" and "20651".
3. In Robo3T, create a new connection with the connection address from the previous step.
4. In the "Authentication" tab, configure the database name, username and password. The credentials are found as in step 1.
5. From "Connection Settings", copy the SSL certificate into a text file and save it locally.
6. In Robo3T, add the certificate to the connection in the "SSL" tab.
7. Test the connection and save the settings.
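For reference, the same settings can be used from code as well. This is only a sketch with the Node.js driver, reusing the example host and port from step 2 and assuming the certificate from step 5 was saved as ca.pem; the database name, user, and password are placeholders taken from your service credentials, and the authSource may differ for your deployment.

// npm install mongodb
const { MongoClient } = require('mongodb');

// Host and port as copied from "Connection Settings" (the example values from step 2);
// database, user, and password come from the same credentials panel.
const uri = 'mongodb://user:password@sl-eu-lon-2-portal.5.dblayer.com:20651/mydb?ssl=true&authSource=admin';

async function main() {
  const client = new MongoClient(uri, {
    tls: true,
    tlsCAFile: 'ca.pem',   // the SSL certificate saved from "Connection Settings" in step 5
  });
  await client.connect();
  console.log(await client.db('mydb').command({ ping: 1 }));
  await client.close();
}

main().catch(console.error);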
Answer
YES, Bluemix-hosted Compose for MongoDB instances can be connected to from the mongo shell and from some up-to-date DB management tools.
However, you have to make sure that, if you are running the newest DB versions, your tools (shell and DB management GUIs) support the newest DB features such as encryption.
Origin of the Problem
My problem was due to older, and therefore incompatible, versions of the mongo shell and DB management tools running against the newest MongoDB versions, with their particularities around encryption and the multiple servers that have to be handled in the URI.
At least two DB management tools are not compatible with the newest DB version and will take their time to get fixed. The problem is that neither will tell you about this. They just do not connect. No logs on either side. Period.
So my advice here: look for tool providers who state dedicated compatibility with the specific version of your DB.
Advice to the Bluemix Team
It might not take much time to provide some sample connection strings for the most common tools like the mongo shell, MongoBooster, etc., to take the hassle and guesswork out of interpreting the environment variables and figuring out what is needed for a specific connection string and what is not.
For instance, MongoDB Atlas provides ready-made connection strings for every cluster, for many tools; you can just copy/paste them and you are done!
Connecting to Atlas took me 5 minutes; for Bluemix I have lost hours! Not because it is complex, but because the documentation and the generated info is somewhat incomplete and messy, at least for those of us who do not write connection strings for a living!

Is there any way other than MLab/ObjectRocket to migrate the app from Parse?

I have created an AWS EC2 instance with the RocksDB storage engine, and now I am trying to migrate the Parse application to this instance as instructed here:
https://gyminutesapp.wordpress.com/2016/04/28/migrating-parse-mongodb-to-mongorocks-on-aws
Is it compulsory to do this via MLab/ObjectRocket, or is there another way?
Can anyone help me out with the further steps? How do I connect to the Parse server and migrate the data?
You can move to any MongoDB database. You can set up any server, install MongoDB, allow remote access, and push your data from parse.com to your own MongoDB database. This is the first step in the Parse migration process.
Below are the other steps to take care of:
1. Host the open-source Parse Server and configure it to connect to your database (see the sketch after this list).
2. Fix your Cloud Code; minor changes might be required.
3. Replace the cloud modules you are using with their npm module counterparts.
4. Deploy!
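A minimal sketch of step 1, following the classic parse-server-example layout (newer parse-server versions wire up Express slightly differently); the database URI, keys, host names, and Cloud Code path are placeholders, not values from your app.

// npm install express parse-server
const express = require('express');
const { ParseServer } = require('parse-server');

const app = express();

// Placeholders: point databaseURI at the MongoDB you pushed your Parse data into
// (for example the MongoRocks instance on your EC2 box), and reuse your existing
// appId/masterKey so existing clients keep working.
const api = new ParseServer({
  databaseURI: 'mongodb://user:password@your-ec2-host:27017/parsedb',
  cloud: __dirname + '/cloud/main.js',          // your fixed Cloud Code (step 2)
  appId: 'YOUR_APP_ID',
  masterKey: 'YOUR_MASTER_KEY',
  serverURL: 'http://your-ec2-host:1337/parse',
});

app.use('/parse', api);                          // mount the Parse API
app.listen(1337, () => console.log('parse-server running on port 1337'));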

mongodb - user connection string, secure password

I've been following a tutorial with express, node and mongo.
I have in a config file on the server side:
production: {
  db: 'mongodb://MYUSERNAME:MYPASSWORD@ds033307.mongolab.com:33307/dbname',
  rootPath: rootPath,
  port: process.env.PORT || 80
}
So, I have my username and password in clear text in a server-side JavaScript file. Should I be worried about this? If yes, where else can I put it?
Thanks.
Edit: I went back and had a look at the MongoLab and Heroku (where my site is hosted) docs.
Where I found: "The MongoLab add-on contributes one config variable to your Heroku environment: MONGOLAB_URI", and so I was able to put the MONGOLAB_URI env var into my config and move the password out of the source code.
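For anyone following along, this is roughly what that change looks like in the config file shown above; MONGOLAB_URI is the variable the add-on sets, and the rest mirrors the original snippet.

production: {
  // Heroku's MongoLab add-on injects the full connection string, credentials included,
  // into this environment variable, so nothing sensitive lives in the source code.
  db: process.env.MONGOLAB_URI,
  rootPath: rootPath,
  port: process.env.PORT || 80
}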
With regard to the same datacenter: am I right to assume Heroku would not be hosting my MongoLab database in their own datacenter, but would instead be calling out to a cloud-hosted Mongo database? There is not much I can do about that, is there, if I want to stick with MongoLab and Heroku?
I know this question is old but according to Heroku's docs they currently use 2 datacenters (https://devcenter.heroku.com/articles/regions#data-center-locations).
Their US server is 'amazon-web-services::us-east-1' and their EU alternative is 'amazon-web-services::eu-west-1'.
Both of these data centers are available when launching mongo instances on Mongolab so you can choose for both your app and your db to be on the same datacenter giving much improved security.
I think you should always be concerned about storing passwords in source code files. Generally you would be much better off keeping it in a configuration file that is managed separately. This gives you the flexibility to use the same code with a different configuration file to point to development or qa databases.
Of bigger concern perhaps: are you hosting your application in the same datacenter where MongoLab is hosting your database? If not, that username and password, along with your data, will traverse the internet in the clear.
MongoLab does not currently support SSL (other than for their RestAPI) so even they recommend being in the same data center:
Do you support SSL?
Not yet but it is on our roadmap to be available in Summer 2014. In
the meantime, we highly recommend that you run your application and
database in the same datacenter. If you have a Dedicated plan, we also
highly recommend that you configure custom firewall rules for your
database(s).
Rest API:
Each MongoLab account comes with a REST API that can be used to access
the databases, collections and documents belonging to that account.
The API exposes most the operations you would find in the MongoDB
driver, but offers them as a RESTful interface over HTTPS.
I would definitely read MongoLab's security page fairly closely:
https://docs.mongodb.com/manual/security/

Talend, MongoDB connection

I am facing a problem with a MongoDB connection.
I have successfully imported the tMongo components into my Talend Open Studio 5.1.1, and after copying the mongo-1.3.jar file to the lib/java folder, my MongoDB jobs run successfully. The problem is that even if I provide a fake server path (IP) and a fake port for MongoDB, my job runs without an error and gives me 1 row with no data; the same goes for the right IP and port.
How do I resolve this?
I think the connection is not working. As you may know, MongoDB only checks whether the connection is actually working when you perform a query on it.
(It doesn't check for a successful connection when you merely connect to it.)
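To illustrate that lazy behaviour (the host and port below are deliberately fake), here is a sketch with the Node.js driver, which mirrors what is described above for the driver Talend uses: constructing the client never fails, and the error only surfaces once a real operation needs a server.

// npm install mongodb
const { MongoClient } = require('mongodb');

// Deliberately unreachable host and port: constructing the client does not error.
const client = new MongoClient('mongodb://10.255.255.1:27018/test', {
  serverSelectionTimeoutMS: 2000,
});

async function main() {
  // Only here, when an actual operation needs a server, does the failure show up.
  await client.db('test').command({ ping: 1 });
}

main()
  .catch(err => console.error('Fails only on the query:', err.message))
  .finally(() => client.close());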
I would suggest instead adding the MongoDB components present in Talend for Big Data by following the steps below.
The components provided for MongoDB are: tMongoDBInput, tMongoDBOutput, tMongoDBConnection, etc.
Alternatively, you can download the components from http://www.talendforge.org/exchange/ and search for Mongo instead of using Talend Big Data, but I would suggest using Talend for Big Data for this.
1. The components will be in zipped format; unzip them. In Talend Big Data you will find the components in the Components folder.
2. Copy these unzipped components to the installation path of TOS:
C:\Talend\TOS_DI-Win32-r84309-V5.1.1\plugins\org.talend.designer.components.localprovider_5.1.1.r84309\components
3. Copy the mongo-1.3.jar file from the component folder into:
C:\Talend\TOS_DI-Win32-r84309-V5.1.1\lib\java
On many systems you might not be able to see this file; in that case proceed with ADMINISTRATOR privileges.
4. (Optional, for a few systems) Inside index.xml, add the entries for the new components, then save index.xml.
5. Restart TOS.
Then you will be able to use them as normal components.
Cheers!
The reason the job runs without any error could be the connection/metadata you have used for the Mongo connector. It should not be possible for the job to run without any error after giving a fake path.
I guess you might have configured (re-modified) the repository connection but are using built-in metadata for the component.