Running pymongo to save data in a local MongoDB database

I was trying to save data from a Jupyter notebook in Visual Studio Code into my local MongoDB database. Previously, I had multiple databases on my local MongoDB server. However, after running the pymongo code, all the existing databases were gone and only the new database created in the Python code exists.
Is there a reason why this has happened? How can I resolve it?
Here's the code I used:
# Making a connection with MongoClient
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
# database
db = client["stocks_database"]
# collection
company = db["Company"]
print(db.list_collection_names())
company.insert_one({"index": "Sensex", "data": "abc"})
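For context, a minimal pymongo sketch (assuming the same local server) that lists the databases the server still reports; inserting a document as above cannot drop other databases, and a database only shows up in this listing once it actually holds data:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
# list_database_names() returns every database that currently contains data;
# "stocks_database" only appears here after the insert_one above has run
print(client.list_database_names())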

Related

Delete or drop a MongoDB database on server ecs2 through another server's shell file

I am on a Linux server where my MongoDB database is stored. Now I want to delete that database from another machine, say my development machine. I created the steps below, but they are not working for me.
Create a delete.sh file:
#!usr/bin/env mongo
#instance Ip assume:
mongo 12.122.12.12:27017/tempDatabase --eval "db.dropDatabase"
But I am unable to delete the database. Any idea where I have made a mistake? Any help is really appreciated.
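As a point of comparison, a minimal pymongo sketch of the same drop, reusing the IP address and database name from the question (note that in the mongo shell the method has to be invoked as a function, i.e. db.dropDatabase() with parentheses):

from pymongo import MongoClient

# Connect to the remote instance (IP and database name taken from the question)
client = MongoClient("mongodb://12.122.12.12:27017/")
# drop_database removes the named database and all of its collections
client.drop_database("tempDatabase")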

Byte size decreased after copying a MongoDB database to another server

I am a newbie to MongoDB.
I used the db.copyDatabase command to copy a database to another server.
After the copy (I got the OK sign from the mongo shell), I found something strange.
On my source server:
> show dbs
newstrust 13.947GB
On my target server:
> show dbs
ntrust1 2.188GB
I checked and compared the source server's database with the target server's database. The number of collections and documents is the same.
I am not able to understand the problem.
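One way to see whether anything was actually lost is to compare the logical data size rather than the file size: a fresh copy rewrites the data compactly, so storageSize can shrink while dataSize and document counts stay identical. A rough pymongo sketch, with both host names as placeholders:

from pymongo import MongoClient

# Placeholder host names; point these at the source and target servers
source = MongoClient("mongodb://source.example.com:27017/")["newstrust"]
target = MongoClient("mongodb://target.example.com:27017/")["ntrust1"]

for label, db in [("source", source), ("target", target)]:
    stats = db.command("dbStats")
    # dataSize is the logical size of the documents; storageSize also counts
    # preallocated and fragmented space, which a fresh copy does not carry over
    print(label, stats["objects"], stats["dataSize"], stats["storageSize"])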

MongoDB connects to a database with old data when connecting through mongo --host

I have an instance running MongoDB. I used a config file when starting the database with mongod -f mongod.conf. I can connect to this MongoDB instance from my application server instance. Recently, due to a software upgrade, I had to restart the system. Since then, I have been facing the following issue:
When I connect to the MongoDB instance locally from the database instance using mongo, it connects me to the proper database with the latest data. But when I try to connect from the application server instance using mongo --host "ip_address", it connects to the same database but shows data that is some days old. I need to know what the issue is and how I can fetch the latest data residing in the database from the application server instance.
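A quick way to rule out the application server reaching a different mongod (for example one restarted after the upgrade without the config file, and therefore using a different dbPath) is to compare what each connection reports about the server it actually hit. A rough pymongo sketch, with the host address as a placeholder:

from pymongo import MongoClient

# Placeholder address; use the same value passed to mongo --host
client = MongoClient("mongodb://ip_address:27017/")

# serverStatus reports which host and process answered the connection
status = client.admin.command("serverStatus")
print(status["host"], status["pid"], status["uptime"])

# getCmdLineOpts shows the dbPath and config file the mongod was started with
opts = client.admin.command("getCmdLineOpts")
print(opts["parsed"])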

OrientDB import data

I am trying to do an export/import in OrientDB 1.7.4 Community Edition between two different OrientDB servers running the same version.
I exported database dbone, which created a dbone.json.gz file.
I connected to the other server, created a new database, and ran import database dbone.json.gz.
It terminated with the error below.
Error on importing database 'dbthree' from file: dbthree.json.gz
Error while removing cluster '10'
E:\Installs\orientdb-community-1.7.4\orientdb-community-1.7.4\databases\dbthree\e.0.ocl: The process cannot access the file because it is being used by another process.
This exception:
The process cannot access the file because it is being used by another process.
means you have a console or server open that is locking the database.

How to perform one-time DB sync to another DB in MongoDB?

I have separate development and production MongoDB servers, and I want to keep reasonably current data in the development server for some time. What should I use for this: mongodump, mongoimport, or something else?
Clarification: I want to copy data from production to development.
If it's a one-time thing
and you want fine control over parameters such as which collections to sync, you should use:
mongodump to dump BSON files of your production DB to your local machine
mongorestore to then restore the dumped BSON files into your local DB (a sketch of both steps follows below)
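A minimal Python sketch of those two steps, assuming the mongodump and mongorestore binaries are on the PATH (and recent enough to accept --uri); the host and database names are placeholders:

import subprocess

PROD_URI = "mongodb://prod.example.com:27017"   # placeholder production host
LOCAL_URI = "mongodb://localhost:27017"
DB_NAME = "mydb"                                # placeholder database name
DUMP_DIR = "dump"

# mongodump writes BSON files for each collection under DUMP_DIR/DB_NAME
subprocess.run(["mongodump", "--uri", f"{PROD_URI}/{DB_NAME}", "--out", DUMP_DIR], check=True)

# mongorestore loads those BSON files into the local server;
# --drop replaces any existing collections with the dumped ones
subprocess.run(["mongorestore", "--uri", LOCAL_URI, "--drop", DUMP_DIR], check=True)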
Otherwise you should check out mongo-sync
It's a script I wrote for myself when I had to constantly copy my local MongoDB database to and from my production DB for a project (I know it's stupid).
Once you put your DB details in config.yml, you can start syncing using two simple commands:
./mongo-sync push # Push DB to Remote
./mongo-sync pull # Pull DB to Local
If you use it inside some project, it's a good idea to add config.yml to .gitignore
You can use the db.copyDatabase(...) or db.cloneDatabase(...) commands:
http://www.mongodb.org/display/DOCS/Copy+Database+Commands
This is faster than mongodump/mongorestore because it skips creating the BSON representation on disk.
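For completeness, a hedged pymongo sketch of the underlying copydb command; note that this only applies to older servers, since copydb and the related clone commands were removed in MongoDB 4.2. Host and database names are placeholders:

from pymongo import MongoClient

# Run against the target (development) server; copydb was removed in MongoDB 4.2
client = MongoClient("mongodb://localhost:27017/")
client.admin.command(
    "copydb",
    fromhost="prod.example.com",   # placeholder production host
    fromdb="mydb",                 # placeholder source database name
    todb="mydb_dev",               # placeholder target database name
)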
When you want the dev database to look exactly like the production database, you can just copy the files. I am currently running a setup where I synchronize my MongoDB database between my desktop and my notebook with Dropbox; even that works flawlessly.