The problem: I need to import a CSV file into MongoDB, but I need to specify the field types and still use --headerline, because in my CSV the first line is where the field names are.
Here is what I'm trying:
mongoimport --type csv -d solution2 -c data --headerline --drop dados.csv direction.int32\(\),latitude.double\(\),longitude.double\(\),metrictimestamp.date\(\),odometer.int32\(\),routecode.int32\(\),speed.int32\(\),device_deviceid.int32\(\),vehicle_vehicleid.int32\(\) --columnsHaveTypes
"solution2" is the DB name, "data" is the collection and "dados.csv" is my archive.
I'm getting this error message: Error validating settings: only one positional argument is allowed.
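Note: the typed field list is being passed as a second positional argument, which mongoimport rejects. Per the mongoimport documentation, when --headerline is combined with --columnsHaveTypes the type annotations go into the CSV's first line itself (e.g. direction.int32(),latitude.double(),... as the header of dados.csv), so the command reduces to this hedged sketch, keeping the names from the question:
mongoimport --type csv -d solution2 -c data --headerline --columnsHaveTypes --drop --file dados.csv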
Related
I'm trying to import a CSV file into MongoDB using the following command:
mongoimport -db results --collection socialogy --type csv --file "F:\Projects\pandas\results.csv" --headerline
MongoDB gives the following error message:
error validating settings: incompatible options: --file and positional argument(s)
What am I doing wrong?
You have a typo in your command: -db should be --db. Since -d is the short form of --db, mongoimport most likely reads -db as -d with the value "b", leaving "results" as a positional argument that conflicts with --file, just as the error message says.
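With the typo fixed, the command from the question becomes:
mongoimport --db results --collection socialogy --type csv --file "F:\Projects\pandas\results.csv" --headerline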
I'm trying to insert the TWDS1E1.json file into MongoDB through the command prompt:
db.collections.insert( TWDS1E1.json )
But I'm getting the error:
TWDS1E1.json is not defined.
Mongo is not my thing, what am I doing wrong here?
In a command prompt opened in the directory where mongoimport.exe is located, run the appropriate command:
For normal JSON
mongoimport -d test -c docs --file example2.json
For array-type JSON (a file whose top level is a single JSON array)
mongoimport --jsonArray -d test -c docs --file example2.json
Please see the docs for more information.
You cannot use the collection.insert() command to insert a file.
insert() is used to insert actual objects, e.g.
db.myCollection.insert({"name":"buzz"});
To bulk load a JSON file, use mongoimport.
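For example, a minimal sketch for this question's file (the test database and episodes collection names here are assumptions, not from the original post):
mongoimport -d test -c episodes --file TWDS1E1.json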
I have a comma-separated CSV file with data for the following fields: Train_ID, Train_Number, Train_Name
I wrote the following command to import the data from the CSV into MongoDB:
mongoimport --db test --collection csvimporting --type csv --file "C:/Darshil Babel/Desktop/sample_data.csv" --fields Train_ID,Train_Number,Train_Name
It is giving the following error:
Error parsing command line: too many positional options
What am I doing wrong?
When importing data from a file (CSV in my case), mongoimport automatically chooses a data type for each field.
Is it possible to choose the data type manually for a specific field?
I ran into a situation where my file contains phone numbers that I want to (and should) treat as strings, but mongoimport (quite reasonably) treats those phone numbers as numbers (NumberLong).
When importing CSV/TSV into MongoDB, the --columnsHaveTypes option lets you define the column types, but the documentation seems very unclear. I tried several times before I finally succeeded.
You should add the --columnsHaveTypes option and append a type to every column listed after --fields, remembering to put "\" before "(" and ")" so your shell does not interpret them.
For example, change:
mongoimport -h foohost -d bardb -c fooc --type tsv --fields col1,col2,col3 --file path/to/file.txt
into
mongoimport -h foohost -d bardb -c fooc --type tsv --fields col1.int32\(\),col2.double\(\),col3.string\(\) --columnsHaveTypes --file path/to/file.txt
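Equivalently, you can double-quote the whole field list instead of escaping each parenthesis:
mongoimport -h foohost -d bardb -c fooc --type tsv --fields "col1.int32(),col2.double(),col3.string()" --columnsHaveTypes --file path/to/file.txt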
What you can do is import the data as-is from the CSV and then run an update on the existing documents in MongoDB to convert the fields into the format you want.
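For the phone-number case above, a hedged sketch of such a conversion in the mongo shell (assuming MongoDB 4.2+ for pipeline updates, and a hypothetical contacts collection with a numeric phone field):
// Rewrite the numeric phone field as a string, in place
db.contacts.updateMany({}, [ { $set: { phone: { $toString: "$phone" } } } ])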
From version 3.4 onward, mongoimport supports specifying field types explicitly while importing data. See the link below:
https://docs.mongodb.com/manual/reference/program/mongoimport/#cmdoption--columnsHaveTypes
See the Type Fidelity section in the documentation:
mongoimport and mongoexport do not reliably preserve all rich BSON data types because JSON can only represent a subset of the types supported by BSON. As a result, data exported or imported with these tools may lose some measure of fidelity. See MongoDB Extended JSON for more information.
Use mongodump and mongorestore to preserve types.
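For example (a minimal sketch, with mydb standing in for your database name):
mongodump --db mydb --out ./dump
mongorestore ./dump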
When I tried to import a CSV into Mongo Atlas, I ran into a similar issue. Here's how I dealt with it.
To avoid shell errors, you can enclose the field list in double quotes.
In the example below, I used two columns, "Name" and "Barcode". Use whatever columns you need, and don't forget to replace <connectionString>, <collectionName>, and <CSVpath> with your own values.
For more Mongo types, refer to the mongoimport documentation.
mongoimport --uri <connectionString> --collection <collectionName> --type csv --file <CSVpath> -f "Name.string(),Barcode.string()" --columnsHaveTypes
You can also choose to put the column types in a field file to make things easier. Just make sure you specify all of the columns in your field file.
In my case, I named it "field.txt".
In the field file, you write each column with its type in the form <column>.<type>. Note that date_go() takes Go's reference-time layout (built from the fixed date 2006-01-02 15:04:05) describing the format your data uses. To get the list of all types used in the mongoimport syntax, please visit https://www.mongodb.com/docs/database-tools/mongoimport/
field.txt
name.string()
usercode.int64()
city.string()
town.string()
address.string()
price.decimal()
date_created.date_go(2006-01-02 15:04:05)
You can name the file anything you want, as long as you point --fieldFile at it, e.g. --fieldFile=myfieldname.txt
mongoimport --uri <connectionString> --collection <collectionName> --type csv --file <csv path> --columnsHaveTypes --fieldFile=field.txt --mode=insert
I'm trying to do a bulk update with the following:
mongoimport -d my_db -c db_collection -upsertFields email ~/Desktop/update_list.csv
The CSV that I'm trying to import looks like this:
email, full_name
stack#overflow.com,stackoverflow
mongo#db.com,mongodb
It should use the email column as the query field and update the full name accordingly. However, nothing was imported; it encountered errors.
exception:Failure parsing JSON string near: abc#sa
abc#sasa.com,abc
imported 0 objects
encountered 99398 errors
Where is the problem? How should I be doing it?
Your mongoimport command is missing the --upsert option, which is needed in combination with --upsertFields. Try:
mongoimport -d my_db -c db_collection --upsert --upsertFields email ~/Desktop/update_list.csv
Add --type csv
Otherwise it assumes your input is JSON.
Also, it looks like you should pass --headerline so that the first line of the file is used for the field names.
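Putting those fixes together, a hedged corrected command:
mongoimport -d my_db -c db_collection --type csv --headerline --upsert --upsertFields email --file ~/Desktop/update_list.csv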
I assume that the data inside your CSV file must be double-quoted.