How to import csv in meteor mongo - mongodb

I'm facing a problem in importing a CSV file to my db. I tried this command:
mongoimport --db meteor -h localhost:3001 -c TollPlaza -d meteor --headerline --type csv --file TollPlaza.csv
I referred to this question but am still having a problem.

Try this command, separating the host and port:
mongoimport --host localhost --port 3001 --db meteor -c TollPlaza --headerline --type csv --file TollPlaza.csv

The command itself is perfectly fine. Follow these steps:
Run the meteor project.
Open one more command prompt in the same location.
Make sure the CSV file is present in the same directory.
Run the mongoimport command.
Check the DB (show collections), as sketched below.
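A quick way to verify the result, assuming the default Meteor dev setup where MongoDB listens on localhost:3001:
meteor mongo
show collections
db.TollPlaza.findOne()
meteor mongo opens a shell against the running app's database; findOne() simply confirms that documents landed in TollPlaza.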

Related

Upload Data into MongoLab database from terminal

I'm having trouble figuring out how to upload csv data to my MongoLab database. From my terminal I have used
sudo mongoimport --db heroku_hkr86p3z -u <dbusername> -p <dbpassword> --collection contributors --type csv --headerline --file /Users/tonywinglau/Desktop/independent-expenditure.csv
and
sudo mongoimport --host mongodb://<username>:<password>#ds035310.mlab.com:35310/heroku_hkr86p3z --db heroku_hkr86p3z -u <username> -p <password> --collection contributors --type csv --headerline --file /Users/tonywinglau/Desktop/independent-expenditure.csv
both of which respond with
Failed: error connecting to db server: no reachable servers
imported 0 documents
From what I have read it might be something to do with my 'mongo config' file (I can't find it if it does exist) being set to only connect with localhost? How do I import data directly into my mongolab hosted database?
Your command line should look like this:
mongoimport -d <databasename> -c <collectionname> --type csv --file <filelocation/file.csv> --host <hostdir example:ds011291.mlab.com> --port <portnumber example:11111> -u <username> -p <password> --headerline
The host address and port number are given by mLab when you create the database.
Example:
ds000000.mlab.com:000000/databaseName
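A filled-in sketch using the values from the question above (the host and port are the ones mLab shows for the database; the credentials are placeholders):
mongoimport -d heroku_hkr86p3z -c contributors --type csv --file /Users/tonywinglau/Desktop/independent-expenditure.csv --host ds035310.mlab.com --port 35310 -u <dbusername> -p <dbpassword> --headerline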

Restoring single collection in an existing mongodb

I'm failing miserably to be able to restore a single collection into an existing database.
I'm running Ubuntu 14.04 with mongo version 2.6.7
There is a dump/mydbname/contents.bson under my home directory.
If I run
mongorestore --collection contents --db mydbname
Then I get:
connected to: 127.0.0.1
don't know what to do with file [dump]
If I add in the path
mongorestore --collection contents --db mydbname --dbpath dump/mydbname
Then I get
If you are running a mongod on the same path you should connect to that instead of direct data file access
I've tried various other combinations, options, etc. and just can't puzzle it out, so I'm coming to the community for help!
If you want to restore a single collection, you have to specify the dump file of that collection. The dump file of the collection is found in the dump/dbname/ folder. So, assuming your dump folder is in your current working directory, the command would go something like:
mongorestore --db mydbname --collection mycollection dump/mydbname/mycollection.bson
I think this is now done with the --nsInclude option:
mongorestore --nsInclude test.purchaseorders dump/
dump/ is the folder with your mongodump data, test is the db, and purchaseorders is the collection.
https://docs.mongodb.com/manual/reference/program/mongorestore/
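Applied to the question's setup (assuming the dump/ folder is in your current working directory), that would look something like:
mongorestore --nsInclude mydbname.contents dump/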
Steps to restore a specific collection in MongoDB:
1) Go to the directory where your dump folder exists.
2) Execute the following command, modified to match your db name and your collection name.
mongorestore --db mydbname --collection mycollection dump/mydbname/mycollection.bson
If you get an error like Failed: yourdbname.collection.name: error creating indexes for collection.name: createIndex error: The field 'safe' is not valid for an index specification, then you can use the following command:
mongorestore --db mydbname --collection mycollection dump/mydbname/mycollection.bson --noIndexRestore
If you are restoring multiple collections, you can use a loop:
for file in "$HOME/mongodump/dev/<your-db>/"* ; do
if [[ "$file" != "*metadata*" && "$file" != "system.*" && "$file" != "locks.*" ]]; then
file="$(basename "$file”)"
mongorestore \
--db cdt_dev \
--collection "${file%.*}" \ # filename w/o extension
--host "<your-host>" \
--authenticationDatabase "<your-auth-db>" \
-u "user" \
-p "pwd" \
"$HOME/mongodump/dev/<your-db>/$file"
fi;
done

How to copy a collection from one mongodb to another?

I have a collection named tracks in a db named socialmedia in my mongo. How can I copy this collection to another MongoDB in my network?
Update:
there is only one mongodb instance
Export your collection to file, copy the file to the other machine and import it on your other machine.
Export from the command line to a file:
mongoexport -d socialmedia -c tracks -o filename.json
Import the file (from the same folder) on the other machine from the command line:
mongoimport -d socialmedia -c tracks --file filename.json
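If the other machine's mongod is reachable over the network, you can also skip the file copy and point mongoimport at it directly (hypothetical host name):
mongoimport -d socialmedia -c tracks --host othermachine --port 27017 --file filename.json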
Use cloneCollection
http://docs.mongodb.org/manual/reference/command/cloneCollection/
On the target server, run this in the mongo shell:
db.runCommand({ cloneCollection: "socialmedia.tracks", from: "mongodb.example.net:27017" })
If you wanted to do this on the same server:
db.tracks.copyTo("newCollectionName")
http://docs.mongodb.org/manual/reference/method/db.collection.copyTo/
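Note that both cloneCollection and copyTo() have since been deprecated and removed in recent MongoDB versions. On those versions, a same-database copy can be done with an aggregation that writes to a new collection, for example:
db.tracks.aggregate([{ $match: {} }, { $out: "tracksCopy" }])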
Use mongoexport and mongoimport. An explanation can be found here.
mongoexport --db project_test_db --collection users --out export/users.json
mongoexport --db project_test_db --collection users --sort '{fieldName: 1}' --limit 100 --skip 10 --out export/users.json
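To load the exported file into the target database, the counterpart import would be something along these lines:
mongoimport --db project_test_db --collection users --file export/users.json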

import data into openshift mongoDb

I created a Java application on OpenShift with the MongoDB cartridge.
My application runs fine, both locally on JBoss AS7 and on OpenShift.
So far so good.
Now I would like to import a CSV into the MongoDB on the OpenShift cloud.
The command is fairly simple:
mongoimport -d dbName -c collectionName --type csv data.csv --headerline
This works fine locally, and I know how to connect to the OpenShift shell and the remote mongo DB. But my question is: how can I use a locally stored file (data.csv) when executing this command in an SSH shell?
I found this on the OpenShift forum, but I don't really know what this tmp directory is and how to use it.
I work on Windows, so I use Cygwin as a shell substitute.
Thanks for any help
The tmp directory is shorthand for /tmp. On Linux, it's a directory that is cleaned out whenever you restart the computer, so it's a good place for temporary files.
So, you could do something like:
$ rsync data.csv openshiftUsername@openshiftHostname:/tmp
$ ssh openshiftUsername@openshiftHostname
$ mongoimport -d dbName -c collectionName --type csv /tmp/data.csv --headerline
This is what I needed in October 2014:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST --port $OPENSHIFT_MONGODB_DB_PORT -u admin -p 123456789 -d dbName -c users /tmp/db.json
Note that I used a JSON file instead of CSV.
When using OpenShift you must use the environment variables to ensure your values are always correct; see the OpenShift documentation on environment variables for more.
SSH into your OpenShift server, then run the following (remember to change the bold placeholders in the command to match your values):
mongoimport --headerline --type csv \
--host $OPENSHIFT_NOSQL_DB_HOST \
--port $OPENSHIFT_NOSQL_DB_PORT \
--db **your db name** \
--collection **your collection name** \
--username $OPENSHIFT_NOSQL_DB_USERNAME \
--password $OPENSHIFT_NOSQL_DB_PASSWORD \
--file ~/**your app name**/data/**your csv file name**
NOTE
When importing CSV files with mongoimport, the data is saved as strings and numbers only; it will not save arrays or objects. If you have arrays or objects to save, you must first convert your CSV file into a proper JSON file and then mongoimport the JSON file.
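To illustrate with a made-up tags column: a CSV row such as
name,tags
Jane Doe,reader;writer
imports tags as the single string "reader;writer", whereas importing the equivalent JSON line (one document per line, the default --type json) preserves the array:
{ "name": "Jane Doe", "tags": ["reader", "writer"] }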
I installed RockMongo on my OpenShift instance to manage MongoDB.
It's a nice user interface, a bit like phpMyAdmin for MySQL.
For users who wish to use mongorestore, the following worked for me:
First copy your dump to the data dir on OpenShift using scp:
scp yourfile.bson yourhex@yourappname.rhcloud.com:app-root/data
rhc ssh into your app, cd to the app-root/data folder, and run:
mongorestore --host $OPENSHIFT_MONGODB_DB_HOST \
  --port $OPENSHIFT_MONGODB_DB_PORT \
  --username $OPENSHIFT_MONGODB_DB_USERNAME \
  --password $OPENSHIFT_MONGODB_DB_PASSWORD \
  -d yourdb \
  -c yourcollection \
  yourfilename.bson --drop
Similar to Simon's answer, but this is how I imported .json to the database:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST -u admin -p 123456 --db dbname --collection grades < grades.json

How to use mongoimport to import CSV files?

CSV file with contact information:
Name,Address,City,State,ZIP
Jane Doe,123 Main St,Whereverville,CA,90210
John Doe,555 Broadway Ave,New York,NY,10010
Running this doesn't add documents to the database:
$ mongoimport -d mydb -c things --type csv --file locations.csv --headerline
Trace says imported 1 objects, but in the MongoDB shell running db.things.find() doesn't show any new documents.
What am I missing?
Your example worked for me with MongoDB 1.6.3 and 1.7.3. Example below was for 1.7.3. Are you using an older version of MongoDB?
$ cat > locations.csv
Name,Address,City,State,ZIP
Jane Doe,123 Main St,Whereverville,CA,90210
John Doe,555 Broadway Ave,New York,NY,10010
ctrl-d
$ mongoimport -d mydb -c things --type csv --file locations.csv --headerline
connected to: 127.0.0.1
imported 3 objects
$ mongo
MongoDB shell version: 1.7.3
connecting to: test
> use mydb
switched to db mydb
> db.things.find()
{ "_id" : ObjectId("4d32a36ed63d057130c08fca"), "Name" : "Jane Doe", "Address" : "123 Main St", "City" : "Whereverville", "State" : "CA", "ZIP" : 90210 }
{ "_id" : ObjectId("4d32a36ed63d057130c08fcb"), "Name" : "John Doe", "Address" : "555 Broadway Ave", "City" : "New York", "State" : "NY", "ZIP" : 10010 }
I was perplexed by a similar problem where mongoimport did not give me an error but would report importing 0 records. I had saved the file that didn't work with the default "Save As..." "xls as csv" option in Excel for Mac 2011, without specifying the "Windows Comma Separated (.csv)" format. After researching this site and trying "Save As" again using the "Windows Comma Separated (.csv)" format, mongoimport worked fine. I think mongoimport expects a newline character at the end of each line, and the default Mac Excel 2011 CSV export didn't provide that character.
We need to execute the following command:
mongoimport --host=127.0.0.1 -d database_name -c collection_name --type csv --file csv_location --headerline
-d is database name
-c is collection name
--headerline If using --type csv or --type tsv, uses the first line as field names. Otherwise, mongoimport will import the first line as a distinct document.
For more information: mongoimport
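If your file has no header row, you can pass the column names yourself with --fields instead of --headerline; a sketch based on the sample file shown earlier:
mongoimport --host=127.0.0.1 -d mydb -c things --type csv --file locations.csv --fields Name,Address,City,State,ZIP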
You will most likely need to authenticate if you're working in a production environment. You can use something like this to authenticate against the correct database with the appropriate credentials.
mongoimport -d db_name -c collection_name --type csv --file filename.csv --headerline --host hostname:portnumber --authenticationDatabase admin --username 'iamauser' --password 'pwd123'
I use this mongoimport command:
mongoimport --db db_name --collection collection_name --type csv --file C:\Your_file_path\target_file.csv --headerline
--type can be csv, tsv, or json, but only csv and tsv can be used with --headerline.
You can read more in the official doc.
Check that you have a blank line at the end of the file; otherwise the last line will be ignored by some versions of mongoimport.
When I was trying to import the CSV file, I was getting an error. What I did: first I changed the header line's column names to capital letters, removed "-", and added "_" where needed. Then I typed the command below to import the CSV into Mongo:
$ mongoimport --db=database_name --collection=collection_name --type=csv --file=file_name.csv --headerline
Robert Stewart has already answered how to import with mongoimport.
I am suggesting an easy way to import a CSV elegantly with the 3T MongoChef tool (version 3.2+). It might help someone in the future.
You just need to select the collection.
Select the file to import.
You can also deselect data that would otherwise be imported; many other options are available too.
The collection is imported.
See the how-to-import video.
First exit the mongo shell, then execute the mongoimport command like this:
Manojs-MacBook-Air:bin Aditya$ mongoimport -d marketdata -c minibars --type csv --headerline --file '/Users/Aditya/Downloads/mstf.csv'
2017-05-13T20:00:41.989+0800 connected to: localhost
2017-05-13T20:00:44.123+0800 imported 97609 documents
Manojs-MacBook-Air:bin Aditya$
Robert Stewart's answer is great.
I'd like to add that you can also type your fields with --columnsHaveTypes and --fields, like this:
mongoimport -d myDb -c myCollection --type csv --file myCsv.csv --columnsHaveTypes --fields "label.string(),code.string(),aBoolean.boolean()"
(Be careful not to have any spaces after the commas between your fields.)
For other types, see the doc here: https://docs.mongodb.com/manual/reference/program/mongoimport/#cmdoption-mongoimport-columnshavetypes
For the 3.4 version, please use the following syntax:
mongoimport -u "username" -p "password" -d "test" -c "collections" --type csv --file myCsv.csv --headerline
After 3 days, I finally made it on my own. Thanks to all the users who supported me.
My requirement was to import a .csv (with no header line) into a remote MongoDB instance. For mongoimport v3.0.7, the command below worked for me:
mongoimport -h <host>:<port> -u <db-user> -p <db-password> -d <database-name> -c <collection-name> --file <csv file location> --fields <names of the columns (comma separated) in the csv> --type csv
For example:
mongoimport -h 1234.mlab.com:61486 -u arpitaggarwal -p password -d my-database -c employees --file employees.csv --fields name,email --type csv
where name and email are the columns in the .csv file.
Given a .csv file with only one column and no header, the command below worked for me:
mongoimport -h <mongodb-host>:<mongodb-port> -u <username> -p <password> -d <mongodb-database-name> -c <collection-name> --file file.csv --fields <field-name> --type csv
where field-name refers to the header name of the column in the .csv file.
Going the other direction, to export a collection to CSV (a Windows example run from the MongoDB bin directory):
C:\wamp\mongodb\bin>mongoexport --db proj_mmm --collection offerings --csv --fieldFile offerings_fields.txt --out offerings.csv
Just run this after executing mongoimport; it will return the number of documents imported:
use db
db.collectionname.find().count()
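On newer shells, db.collectionname.countDocuments() gives the same check and is the preferred counting method.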
Use:
mongoimport -d 'database_name' -c 'collection_name' --type csv --headerline --file filepath/file_name.csv
mongoimport -d test -c test --type csv --file SampleCSVFile_119kb.csv --headerline
Check the collection data:
var collections = db.getCollectionNames();
for (var i = 0; i < collections.length; i++) {
    // print the name of each collection
    print('Collection: ' + collections[i]);
    // and then print the json of each of its elements
    db.getCollection(collections[i]).find().forEach(printjson);
}
1] We can save the xls as a .csv file.
2] Go to the MongoDB bin path on cmd -> cd D:\Arkay\soft\MongoDB\bin
3] Run the command below:
> mongoimport.exe -d dbname -c collectionname --type csv --file "D:\Arkay\test.csv" --headerline
4] Verify on the Mongo side using the command below:
> db.collectionname.find().pretty().limit(1)
Strangely, no one has mentioned the --uri flag:
mongoimport --uri connectionString -c questions --type csv --file questions.csv --headerline
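A sketch with the connection string filled in (hypothetical credentials, host, and database):
mongoimport --uri "mongodb://user:pwd@myhost.example.com:27017/mydb" -c questions --type csv --file questions.csv --headerline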
Sharing for future readers:
In our case, we needed to add the host parameter to make it work:
mongoimport -h mongodb://someMongoDBhostUrl:somePORTrunningMongoDB/someDB -d someDB -c someCollection -u someUserName -p somePassword --file someCSVFile.csv --type csv --headerline
Make sure to copy the .csv file to /usr/local/bin or whatever folder your mongodb is in.
All the answers above are great, and the way to go for a full-featured application.
But if you want to prototype fast, need flexibility while the collection is still changing, and want to minimize your early code base, there is a much simpler way that is not much discussed.
You can basically forgo mongoimport by now. I could have saved 3 hours if it had been mentioned on this question, so let me share it for others:
MongoDB has a GUI called MongoDB Compass, which has both CSV and JSON import features out of the box, in a matter of clicks. It is an official part of the Mongo ecosystem. At the time of writing it is free, and it works very well for my use case.
https://www.mongodb.com/products/compass
You simply get MongoDB Compass running on your machine by following the simple installation, and fill in a couple of fields for DB connection and authentication directly in the GUI.
Import the CSV/JSON file. It took less than a second for a 30 KB file to be parsed before the user (me) validates it.
Validate the "type" of each property. Great feature: I could directly set the property types, such as booleans, integers, etc. In my experience they all seem to default to string, and you can update them before importing. Dates were more finicky and needed special attention on the coding side.
One click further, the CSV is a collection in your MongoDB, local or in the cloud. Voila!
If you have multiple files and you want to import all of them using Python, you can do the following:
import os
import subprocess

# directory containing the files to import
dir_files = r'C:\data'

# create a list of all files in that directory
_, _, fns = next(os.walk(dir_files))
files = [os.path.join(dir_files, fn) for fn in fns]

# path to the mongoimport tool
mongotool = r'C:\Program Files\MongoDB\Server\4.4\bin\mongoimport.exe'

# name of the mongodb database
mydatabase = 'mydatabase'

# name of the mongodb collection
mycollection = 'mycollection'

# import all files into mongodb, one at a time
for fl in files:
    commands = [mongotool, '--db', mydatabase,
                '--collection', mycollection,
                '--file', fl,
                '--type', 'tsv',
                '--headerline']
    subprocess.run(commands, check=True)