I have an Excel file that I converted to a CSV and imported into my running MongoDB instance, but one column of the data gave me trouble. The column, called Room, sometimes (but not always) contains values separated by a comma (e.g. "101, 103").
Running:
$ mongoimport -d mydb -c things --type csv --file locations.csv --headerline
gave no errors, but documents that are supposed to have two values for Room ended up with just one. For example, "101, 102" became "101," in the db.
Is there an option for mongoimport that allows me to specify an array for a certain column?
First, import the data from the CSV:
$ mongoimport -d mydb -c things --type csv --file locations.csv --headerline
After that, split the field in the mongo shell:
db.things.find().snapshot().forEach(function (el) {
    el.Room = el.Room.split(',').map(function (s) { return s.trim(); });  // trim the stray space after each comma
    db.things.save(el);
});
That should solve your problem.
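Note that snapshot() was removed in MongoDB 4.0. On 4.2 or newer the same conversion can be done server-side with a pipeline update; a minimal sketch, assuming Room was imported as a plain string:
db.things.updateMany(
    { Room: { $type: "string" } },
    [ { $set: { Room: { $map: {
        input: { $split: ["$Room", ","] },
        as: "r",
        in: { $trim: { input: "$$r" } }
    } } } } ]
);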
Related
I'm trying to import a csv into mongodb using the following command:
mongoimport --db users --collection contacts --file data.csv --headerline
The database exists but not the collection; I want to create it and use the first row of the CSV as the field names. Why am I getting this error:
error validating settings: must specify --fields, --fieldFile or
--headerline to import this file type
I also would like to know:
- how to copy/import data from one collection into another (basically the syntax)
- how datatypes from the csv are handled in MongoDB when imported; do I need to specify datatypes for the headers, or will MongoDB infer them from the csv values?
To solve this:
Either make sure the first line of your data.csv file contains the field names and then execute:
mongoimport --db users --collection contacts --type csv --headerline --file data.csv
Or
Define the list of field names the CSV values should be mapped to, using --fields (a single comma-separated list):
mongoimport --db users --collection contacts --type csv --file data.csv --fields "name,surname,etc"
You can also use the short form of the flag:
mongoimport --db users --collection contacts --type csv --file data.csv -f name,surname,etc
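As for the other two questions: copying documents from one collection into another can be done from the mongo shell. A minimal sketch, assuming the source collection is contacts and contacts_copy is a hypothetical target (the $out stage writes the pipeline result into that collection):
db.contacts.aggregate([ { $match: {} }, { $out: "contacts_copy" } ]);
Regarding datatypes, mongoimport guesses simple types from the CSV values themselves (roughly, numeric-looking values become numbers and everything else stays a string); newer versions of mongoimport also offer a --columnsHaveTypes option alongside --fields if you need to declare types explicitly.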
I keep getting this error whenever I try to import anything.
In the terminal I input:
name:~ computer$ mongoimport --db users --collection contacts --type csv --file /Users/computer/Desktop/ftse100.csv
connected to: 127.0.0.1
assertion: 9998 you need to specify fields
I'm not sure what else to try. I tried adding --field after this command but just got the help output.
As per the MongoDB docs:
--fields <field1[,field2]>, -f
Specify a comma separated list of field names when importing csv or tsv files that do not have field names in the first (i.e. header) line of the file.
mongoimport --db users --collection contacts --type csv --file /Users/computer/Desktop/ftse100.csv --fields field1,field2,field3
Also note the typo in your attempt: the option is --fields, not --field.
In 2.4.6, mongoimport does not find the header in csv files that I make, with or without double quotes around the names.
If I chop off the header line and supply that same text to the -f or --fields option, my files import fine.
If you want to import all columns using the header row for field names, use the --headerline option instead of --fields.
In your case it would be:
mongoimport --db users --collection contacts --type csv --headerline --file /Users/computer/Desktop/ftse100.csv
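To confirm that the header row was picked up as field names, you can spot-check one imported document in the mongo shell (collection name taken from the command above):
db.contacts.findOne()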
I have a data file whose values are separated by the | character, and I want to load it into MongoDB, so I need the data split on |. What should my mongoimport command look like?
Previously, I successfully imported a csv file with the following command:
$ mongoimport -d mydb -c things --type csv --file locations.csv --headerline
mongoimport supports either JSON, CSV (comma separated values) or TSV (tab separated values). The | character is not a valid delimiter for either CSV or TSV, so you will need to change the | in your input file to , or a tab, and specify --type accordingly.
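A minimal Node.js sketch of that conversion, assuming the file fits in memory and the values themselves contain no commas or quotes (the file names here are just examples):
const fs = require("fs");
// read the |-separated file, swap the delimiter, and write a CSV that mongoimport can parse
const text = fs.readFileSync("locations.unl", "utf8");
fs.writeFileSync("locations.csv", text.split("|").join(","));
Then import the result with --type csv as usual.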
mongoimport can actually take a |-separated record in a .unl, .txt, or .csv file.
Just make sure you do it in the format below. For the extensions mentioned, use --type csv:
mongoimport -c <table_name> -d <database_name> --mode upsert --file <filename> --type csv --headerline
I'm trying to import and merge multiple CSVs into MongoDB; however, documents are getting replaced rather than merged.
For example, if I have first.csv:
key1, first column, second column
and second.csv:
key1, third column
I would like to end up with:
key1, first column, second column, third column
But instead I'm getting:
key1,third column
Currently I'm using:
mongoimport.exe --type csv --file first.csv --fields key,firstColumn,secondColumn
mongoimport.exe --type csv --file second.csv --fields key,thirdColumn --upsert --upsertFields key
That's the way mongoimport works. There's an existing feature request for merge imports, but for now you'll have to write your own import to get merge behavior.
A cross-collection workaround: forEach can be run on a dummy collection and the resulting documents used to look up and update your target collection:
mongoimport.exe --collection mycoll --type csv --file first.csv --fields key,firstColumn,secondColumn
mongoimport.exe --collection dummy --type csv --file second.csv --fields key,third
db.dummy.find().forEach(function (doc) {
    db.mycoll.update({ key: doc.key }, { $set: { thirdcol: doc.third } });
});
That's correct: mongoimport --upsert replaces full documents.
You may achieve your goal by importing to a temporary collection and using the following Gist.
Load the script into the mongo shell and run:
mergeCollections("srcCollectionName", "destCollectionName", {}, ["thirdColl"]);
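If you'd rather not depend on the Gist, here is a minimal mongo-shell sketch of the same idea, assuming the second CSV was imported into a temporary collection named tmp and that both collections share the key field (the names are placeholders):
db.tmp.find().forEach(function (doc) {
    var extra = {};
    for (var field in doc) {
        if (field !== "_id" && field !== "key") {
            extra[field] = doc[field];   // copy everything except _id and the join key
        }
    }
    db.mycoll.updateOne({ key: doc.key }, { $set: extra }, { upsert: true });
});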
I just had a very similar problem. There is a Node module for MongoDB, and jline is my command-line Node tool for stream-processing JSON lines. So:
echo '{"page":"index.html","hour":"2015-09-18T21:00:00Z","visitors":1001}' |\
jline-foreach \
'beg::dp=require("bluebird").promisifyAll(require("mongodb").MongoClient).connectAsync("mongodb://localhost:27017/nginx")' \
'dp.then(function(db){
updates = {}
updates["visitors.hour."+record.hour] = record.visitors;
db.collection("pagestats").update({_id:record.page},{$set:updates},{upsert:true});});' \
'end::dp.then(function(db){db.close()})'
In your case you'd have to convert from csv to JSON lines first by piping it through jline-csv2jl. That converts each CSV line into a dictionary with names taken from the header.
I have added this example to the manual: https://github.com/bitdivine/jline/blob/master/bin/foreach.md
I haven't used jline with promises much but so far it's OK.
Disclaimer: I am the author of jline.
I have a csv file containing the following data and want to import it into MongoDB:
ID;"AdmissionID";"SeatNo";"RegistrationNo";"ResultDate";"ResultStatusId"
1;12;"2323";"23";07-05-2013;1
2;23;"35";"32";10-05-2013;5
This data is to be imported into MongoDB 2.2. I'm using the following command:
mongoimport -d test -c exam --type csv --headerline <f:\exam.csv
When I run it I get the following error:
SyntaxError: missing ; before statement (shell):1
Please help me figure out the error.
This should do the trick:
mongoimport -d mydb -c collectionName --type csv --file myfile.csv --headerline
Your problem is the <f:\exam.csv bit, which by the look of it is not being handled properly; pass the file with --file instead of shell redirection.
> --headerline
> If using “--type csv” or “--type tsv,” use the first line as field names. Otherwise, mongoimport will import the first line as a distinct
> document.
Try this command:
C:\Program Files\MongoDB\Server\3.2\bin>mongoimport -d pravin -c FOC --type csv
--file D:\Script\ImportData\FOC.csv --headerline
2016-08-10T15:42:38.685+0530 connected to: localhost
2016-08-10T15:42:38.758+0530 imported 13 documents
I solved it by opening the .csv file in Excel (File -> Options -> Advanced), unchecking the "Use system separators" box, removing the comma from the separator box below, and then saving as .csv again.
That way there are no stray commas in the .csv file, and the documents come out correctly formatted in MongoDB.
Run mongoimport from the system command line, not the mongo shell.
Ref - https://docs.mongodb.com/manual/reference/program/mongoimport/