I'm using MongoDB 2.6, and trying to create a dump using the query option gives "positional arguments not allowed".
I am trying to get all the products whose parameter timestamp is within the specified range and whose id matches any of the specified formats.
mongodump --host 10.xx.xxx.xx:xxxx --db test --collection products --username abc --password uvw --query '{"parameterList":{$elemMatch:{ "paramName":"TimeStamp","paramValue":{$gte:"20160620000000",$lt:"20160724000000"}}},"parameterList.paramValue": {$in:[/SPC126/,/CSC234/]}}' --authenticationDatabase test --out "c:\New folder\dump"
Document structure:
{
    "_id": ObjectId("590074c362f41f15144996fa"),
    "product": "device1",
    "parameterList": [
        { "paramName": "TimeStamp", "paramValue": "20160731000700" },
        { "paramName": "Id", "paramValue": "SPC126332" }
    ]
}
Unlike UNIX bash, Windows cmd.exe doesn't recognize single quotes as a delimiter.
Running your example command as-is in cmd.exe gives the error:
Error parsing command line: too many positional options
Try swapping your quotes, replacing the single quotes with double quotes and vice versa. For example, using the command you posted:
mongodump --host 10.xx.xxx.xx:xxxx --db test --collection products --username abc --password uvw --query "{'parameterList':{$elemMatch:{ 'paramName':'TimeStamp','paramValue':{$gte:'20160620000000',$lt:'20160724000000'}}},'parameterList.paramValue': {$in:[/SPC126/,/CSC234/]}}" --authenticationDatabase test --out "c:\New folder\dump"
Note the --query "..." instead of --query '...' in the example above.
The dump should then complete successfully.
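As an alternative (not from the answer above, and untested here, but a pattern that generally works with cmd.exe and the mongo tools), you should also be able to keep the double quotes inside the JSON by escaping them with backslashes:
mongodump --host 10.xx.xxx.xx:xxxx --db test --collection products --username abc --password uvw --query "{\"parameterList\":{$elemMatch:{\"paramName\":\"TimeStamp\",\"paramValue\":{$gte:\"20160620000000\",$lt:\"20160724000000\"}}},\"parameterList.paramValue\":{$in:[/SPC126/,/CSC234/]}}" --authenticationDatabase test --out "c:\New folder\dump"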
This is the query I'm executing:
mongoexport --db solutions --collection solution3 --query "{ 'metrictimestamp': { '$gte': { '$date': '2016-03-01T00:00:00.001Z' },'$lte': { '$date': '2016-03-29T23:59:59.000Z' }}}" --out a.json
but I keep getting this error:
Failed: error parsing query as Extended JSON: invalid JSON input
I've tried inverting the quotes and all the solutions I saw in "Use mongoexport with a --query for ISODate".
After a lot of trial and error, I found it works perfectly when passing the query through a file instead of on the command line.
Create a file query.json and put your query in normal JSON format, with double quotes:
{"metrictimestamp":{"$gte":{"$date":"2016-03-01T00:00:00.001Z"},"$lte":{"$date":"2016-03-29T23:59:59.000Z"}}}
Then run the command, passing the file to --queryfile instead of --query:
mongoexport --db solutions --collection solution3 --queryfile query.json --out a.json
When I try to take a backup with mongoexport using the --query option to get the documents whose status is equal to A, I get the error below:
mongoexport --port 27017 --db ex --collection A --type=csv --fields _id,status --query '{"status":"A"}' -o eg.csv
error validating settings: query ''{status:A}'' is not valid JSON
Please let me know how to use the --query option.
Assuming you run this from the DOS command prompt, you need to swap the single and double quotes: wrap the entire query in double quotes and use single quotes inside the JSON document, like this:
--query "{'status':'A'}"
I have tested this with mongoexport version 3.0.0 and 3.2.0 and it works for both versions.
I'm trying to do a mongoexport to CSV, but only selecting certain records with a query. Here's my command (Windows 7 cmd):
mongoexport --host foo.com --port 27017 --username bar -p --db foo --csv --fields col1,col2,col3 --collection bar --out dump_q.csv --query '{"recent":"yes"}'
However after entering the password, I get an error:
assertion: 16619 code FailedToParse: FailedToParse: Expecting '{': offset:0
The command works fine without the query argument, but I can't figure out what's wrong with the query:
--query '{"recent":"yes"}'
Any help much appreciated
Summary of answer:
Make sure you use double quotes to enclose the query and single quotes to enclose strings, e.g.
--query "{'recent':'yes'}"
Also make sure you don't have a space in your query, otherwise the command prompt will parse it as another argument. So don't use:
--query "{'recent': 'yes'}"
(notice the space in-between)
Queries which include nested fields don't work, such as:
--query "{'folder.recent':'yes'}"
You'll need to use double quotes to contain the query string (and either single quotes or escaped quotes inside the string):
--query "{'recent':'yes'}"
Complete:
mongoexport --host foo.com --port 27017 --username bar -p
--db foo --csv --fields col1,col2,col3
--collection bar --out dump_q.csv --query "{'recent':'yes'}"
From mongoexport documentation:
--query , -q
Provides a JSON document as a query that optionally limits the documents returned in the export.
Your query string seems to be correctly formatted. You can even omit the double quotes around recent.
Single or double quotes don't seem to matter, as long as you are consistent in using different types on the outside and the inside.
Are you sure this is a valid query though? What is the output if you run the following in the database? What about a find()?
db.bar.count({"recent":"yes"})
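And a quick find() along the same lines, as an illustrative check using the same collection and filter as the count above:
db.bar.find({"recent":"yes"}).limit(5)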
mongoexport -h db.mysite.com -u myUser -p myPass -c myCollection
But the response I get is:
ERROR: too many positional options
What's that about?
I had this same problem. In my case, I was using mongoexport with the --query option, which expects a JSON document, such as:
mongoexport ... --query {field: 'value'} ...
I needed to surround the document with quotes:
mongoexport ... --query "{field: 'value'}" ...
I had the same problem. Found a group post somewhere which said to remove the space between the '-p' and the password, which worked for me.
Your sample command should be:
mongoexport -h db.mysite.com -u myUser -pmyPass -c myCollection
I encountered the same error while importing a CSV file.
It's just that the field list you pass for the CSV import may have blank spaces.
Just remove the blank spaces from the field list; it's a parsing error.
I had the same issue with mongodump. After searching a bit, I found out that using the --out parameter to specify the output directory would solve this issue. The syntax for using the --out parameter is:
mongoexport --collection collection --out collection.json
Also, in case your MongoDB instance isn't running, you could use --dbpath to specify the exact path to your instance's data files.
Source: http://docs.mongodb.org/manual/core/import-export/
I had the same issue with the mongoexport utility (using 2.0.2). My fix was to use the FULL parameter name (i.e. use --db instead of -d).
Sometimes an editor will mangle the command (Evernote, for example). I fixed the issue by retyping the command in the terminal.
I was also stuck in the same situation and found what was causing it.
Make sure you are exporting in CSV format by adding the parameter --type csv.
Make sure there are no spaces in the field names.
Example: --fields _id, desc is wrong, but --fields _id,desc,price is good.
This also works if you place the -c option first. For me, this order does work:
mongoexport -c collection -h ds111111.mlab.com:11111 -u user -p pass -d mydb
You can also leave the password out and the server will ask you to enter it. This only works if the server supports SASL authentication (mlab does not, for example).
For the error "Too many arguments":
Don't use spaces between the fields.
Try:
mongoexport --host localhost --db local --collection epfo_input --type=csv --out epfo_input.csv --fields cin,name,search_string,EstablishmentID,EstablishmentName,Address,officeName
Don't try:
mongoexport --host localhost --db local --collection epfo_input --type=csv --out epfo_input.csv --fields cin,name,search_string,Establishment ID,Establishment Name,Address,office Name
I had a similar issue:
too many positional arguments
try 'mongorestore --help' for more information
The simple fix for me was to wrap the path location in quotes (" ").
This Failed:
mongorestore -h MY.mlab.com:MYPORT -d MYDBNAME -u ADMIN -p PASSWORD C:\Here\There\And\Back\Again
This Worked:
mongorestore -h MY.mlab.com:MYPORT -d MYDBNAME -u ADMIN -p PASSWORD "C:\Here\There\And\Back\Again"
I had the same issue with starting mongod. I used the following command:
./mongod --port 27001 --replSet abc -- dbpath /Users/seanfoley/Downloads/mongodb-osx-x86_64-3.4.3/bin/1 --logpath /Users/seanfoley/Downloads/mongodb-osx-x86_64-3.4.3/log.1 --logappend --oplogSize 5 --smallfiles --fork
The following error message appeared:
Error parsing command line: too many positional options have been specified on the command line
What fixed this issue was removing the single space between '--' and 'dbpath'.
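For reference, the corrected command with that space removed:
./mongod --port 27001 --replSet abc --dbpath /Users/seanfoley/Downloads/mongodb-osx-x86_64-3.4.3/bin/1 --logpath /Users/seanfoley/Downloads/mongodb-osx-x86_64-3.4.3/log.1 --logappend --oplogSize 5 --smallfiles --fork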
I had the same issue while using the "mongod --dbpath" command. What I was doing looked somewhat like this:
mongod --dbpath c:/Users/HP/Desktop/Mongo_Data
whereas the command syntax was supposed to be:
mongod --dbpath=c:/Users/HP/Desktop/Mongo_Data
This worked for me. Apart from this, you can review the available options and syntax using the mongod --help command.
In my case, I had to write the port separately from the server connection. This worked for me:
mongoexport --host=HOST --port=PORT --db=DB --collection=COLLECTION
--out=OUTPUT.json -u USER -p PASS
Create a JSON file in the same folder where you have your mongod.exe, e.g. coll.json, and open a command prompt in this folder.
Type the following in cmd:
mongoexport --db databasename --collection collectionname --out coll.json
You will see a progress bar as it exports all the data.
How do you export all the records in a MongoDB collection to a .csv file?
mongoexport --host localhost --db dbname --collection name --type=csv > test.csv
This asks me to specify the names of the fields I need to export. Can I just export all the fields without specifying the field names?
@karoly-horvath has it right. Fields are required for CSV.
According to this bug in the MongoDB issue tracker (https://jira.mongodb.org/browse/SERVER-4224), you MUST provide the fields when exporting to CSV. The docs are not clear on it. That is the reason for the error.
Try this:
mongoexport --host localhost --db dbname --collection name --csv --out text.csv --fields firstName,middleName,lastName
UPDATE:
This commit: https://github.com/mongodb/mongo-tools/commit/586c00ef09c32c77907bd20d722049ed23065398 fixes the docs for 3.0.0-rc10 and later. It changes
Fields string `long:"fields" short:"f" description:"comma separated list of field names, e.g. -f name,age"`
to
Fields string `long:"fields" short:"f" description:"comma separated list of field names (required for exporting CSV) e.g. -f \"name,age\" "`
VERSION 3.0 AND ABOVE:
You should use --type=csv instead of --csv, since --csv has been deprecated.
More details: https://docs.mongodb.com/manual/reference/program/mongoexport/#export-in-csv-format
Full command:
mongoexport --host localhost --db dbname --collection name --type=csv --out text.csv --fields firstName,middleName,lastName
Also, spaces are not allowed between comma-separated field names.
BAD:
-f firstname, lastname
GOOD:
-f firstname,lastname
mongoexport --help
....
-f [ --fields ] arg comma separated list of field names e.g. -f name,age
--fieldFile arg file with fields names - 1 per line
You have to specify the fields manually, and if you think about it, it makes perfect sense. MongoDB is schemaless; CSV, on the other hand, has a fixed column layout. Without knowing what fields are used in the different documents, it's impossible to output the CSV dump.
If you have a fixed schema, perhaps you could retrieve one document, harvest the field names from it with a script, and pass them to mongoexport.
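A minimal sketch of that idea (my illustration, not part of the original answer; it assumes a local unauthenticated mongod, the legacy mongo shell, and that every document has the same top-level fields as the first one; db and collection names are the ones from the question):
fields=`mongo dbname --quiet --eval "Object.keys(db.getCollection('name').findOne()).join(',')"`
mongoexport --db dbname --collection name --type=csv --fields "$fields" --out test.csv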
If you want, you can export all collections to CSV without specifying --fields (it will export all fields).
From http://drzon.net/export-mongodb-collections-to-csv-without-specifying-fields/ run this bash script:
OIFS=$IFS;
IFS=",";
# fill in your details here
dbname=DBNAME
user=USERNAME
pass=PASSWORD
host=HOSTNAME:PORT
# first get all collections in the database
collections=`mongo "$host/$dbname" -u $user -p $pass --quiet --eval "rs.slaveOk();db.getCollectionNames();"`;
# for a local unauthenticated instance you could use instead: collections=`mongo $dbname --quiet --eval "rs.slaveOk();db.getCollectionNames();"`;
collectionArray=($collections);
# for each collection
for ((i=0; i<${#collectionArray[@]}; ++i));
do
echo 'exporting collection' ${collectionArray[$i]}
# get comma separated list of keys. do this by peeking into the first document in the collection and get his set of keys
keys=`mongo "$host/$dbname" -u $user -p $pass --eval "rs.slaveOk();var keys = []; for(var key in db.${collectionArray[$i]}.find().sort({_id: -1}).limit(1)[0]) { keys.push(key); }; keys;" --quiet`;
# now use mongoexport with the set of keys to export the collection to csv
mongoexport --host $host -u $user -p $pass -d $dbname -c ${collectionArray[$i]} --fields "$keys" --csv --out $dbname.${collectionArray[$i]}.csv;
done
IFS=$OIFS;
This works for me, connecting remotely to a Docker container running mongo:4.2.6:
mongoexport -h mongodb:27017 --authenticationDatabase=admin -u username -p password -d database -c collection -q '{"created_date": { "$gte": { "$date": "2020-08-03T00:00:00.000Z" }, "$lt": { "$date": "2020-08-09T23:59:59.999Z" } } }' --fields=somefield1,somefield2 --type=csv --out=/archive.csv
Easily export a CSV or JSON file with the MongoDB Compass tool.
As the GUI for MongoDB, MongoDB Compass allows you to make smarter decisions about document structure, querying, indexing, document validation, and more. Commercial subscriptions include technical support for MongoDB Compass.
https://www.mongodb.com/try/download/compass
I could not get mongoexport to do this for me. I found that, to get an exhaustive list of all the fields, you need to loop through the entire collection once. Use this to generate the headers, then loop through the collection again to populate those headers for each document.
I've written a script to do just this, converting MongoDB documents to CSV irrespective of schema differences between individual documents: https://github.com/surya-shodan/mongoexportcsv
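That script isn't reproduced here, but a rough bash sketch of the same two-pass idea (my own sketch, assuming a local unauthenticated mongod and the legacy mongo shell; it only collects top-level field names and does not flatten nested documents; db and collection names are placeholders):
db=dbname
coll=name
# pass 1: scan every document once and collect the union of all top-level field names
fields=`mongo $db --quiet --eval "var keys = {}; db.getCollection('$coll').find().forEach(function(doc) { for (var k in doc) keys[k] = 1; }); Object.keys(keys).join(',');"`
# pass 2: export the collection using the harvested header list
mongoexport --db $db --collection $coll --type=csv --fields "$fields" --out $coll.csv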
Also, if you want to export inner JSON fields, use the dot (.) operator.
JSON record:
{
    "_id" : "00118685076F2C77",
    "value" : {
        "userIds" : [ "u1" ],
        "deviceId" : "dev"
    }
}
mongoexport command with dot operator (using mongo version 3.4.7):
./mongoexport --host localhost --db myDB --collection myColl
--type=csv --out out.csv --fields value.deviceId,value.userIds
Output csv:
value.deviceId,value.userIds
d1,"[""u1""]"
d2,"[""u2""]"
Note: Make sure you do not export an array field; it corrupts the CSV format, as the userIds field shown above does.
Solution for MongoDB Atlas users!
Add the --fields parameter as comma-separated field names enclosed in double quotes:
--fields "<FIELD 1>,<FIELD 2>..."
This is a complete example:
mongoexport --host Cluster0-shard-0/shard1URL.mongodb.net:27017,shard2URL.mongodb.net:27017,shard3URL.mongodb.net:27017 --ssl --username <USERNAME> --password <PASSWORD> --authenticationDatabase admin --db <DB NAME> --collection <COLLECTION NAME> --type <OUTPUT FILE TYPE> --out <OUTPUT FILE NAME> --fields "<FIELD 1>,<FIELD 2>..."
This is working for me, try it:
mongoexport --host cluster0-shard-dummy-link.mongodb.net:27017 --db yourdbname --forceTableScan --collection users --type json --out /var/www/html/user.json --authenticationDatabase admin --ssl --username Yourusername --password Yourpassword
The above command returns all the data of the users collection.
If you want to filter fields, then add --fields=email,name
For all those who are stuck with an error, let me give you a solution with a brief explanation.
Command to connect:
mongoexport --host your_host --port your_port -u your_username -p your_password --db your_db --collection your_collection --type=csv --out file_name.csv --fields all_the_fields --authenticationDatabase admin
--host --> host of Mongo server
--port --> port of Mongo server
-u --> username
-p --> password
--db --> db from which you want to export
--collection --> collection you want to export
--type --> type of export, in my case CSV
--out --> file name you want to export to
--fields --> all the fields you want to export (don't put spaces between the comma-separated field names in the CSV case)
--authenticationDatabase --> database where all your user information is stored
The command below is used to export a collection to CSV format.
Note: naag is the database, employee1_json is the collection.
mongoexport --db naag --collection employee1_json --type csv --out /home/orienit/work/mongodb/employee1_csv_op1