Mongo export query - mongodb

I am running this mongoexport command with a regex query:
mongoexport --host localhost --db testml75spectredb_UK_IRE --collection peoplecollec_UK_IRE --type=csv --out /home/ubuntu/production2.csv -f full_name,headline,linkedin_id,given_name,family_name,url,num_connections,industry,locality,experience.0.organization.0.name -q '{ "headline" : { $regex: / php| javascript| js | ios | android| ruby| rubyonrail| java| laravel| scala| python| django| nodejs| node.js| angularjs| angular.js| devops| fullstack| full-stack| C++| C#| WPF| J2EE| golang| objective-c| swift| .NET| reactjs| react.js| backbonejs| backbone.js| .js |symfony2| zend| pyramid| ASP.NET/i}}'
because I'm looking for people who have one of these words in their headline, and I'm getting tons of irrelevant documents.
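A likely cause of the irrelevant matches: in that regex, the unescaped dots in node.js, angular.js, .NET, and so on match any character, and the leading-space terms such as / php| .../ miss headlines that begin with the keyword. A minimal sketch of a tightened query, reusing the host, database, collection, and output path from the command above (the shortened keyword list is only illustrative):
# word boundaries plus escaped dots, case-insensitive via $options
mongoexport --host localhost --db testml75spectredb_UK_IRE --collection peoplecollec_UK_IRE \
--type=csv --out /home/ubuntu/production2.csv -f full_name,headline,linkedin_id \
-q '{ "headline": { "$regex": "\\b(php|javascript|ios|android|ruby|python|django|node\\.js|angular\\.js)\\b", "$options": "i" } }'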


How to export mongo data into csv using pymongo?

My code:
data = db.get_collection('activity_tracker').find({"companyId" : "527d4b23-347a-4ad2-81d8-dfd66af5631a", 'userEmail': {'$ne': 'abc@xyz.in'}})
with open('asdxk.csv', 'w') as outfile:
    fields = ['companyId', 'userEmail']
    writer = csv.writer(outfile, fields)
    for post in data:
        writer.writerow([post])
Problem statement:
Using the above code I'm exporting the data to the CSV file as junk, but what I want is for the companyId and userEmail values to be exported to the CSV in row-and-column format, with companyId and userEmail as the header names.
Use the mongoexport utility to export the data to CSV; the --fields list supplies both the selected columns and the CSV header row:
mongoexport -h localhost -d test -c activity_tracker --type=csv \
--fields companyId,userEmail \
--query '{"companyId":"527d4b23-347a-4ad2-81d8-dfd66af5631a","userEmail":{"$ne":"abc@xyz.in"}}' \
--out asdxk.csv
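If the export succeeds, the first line of asdxk.csv should be the header row named after the fields:
head -1 asdxk.csv
# companyId,userEmail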

Import CSV with semicolon separator into MongoDB database

I've just tried to import a CSV file (with semicolon ; separators) into a MongoDB database. I managed an import with mongoimport -d mydb -c things --type csv --file files.csv --headerline, but the result is not what I'm expecting.
The file has this form:
geom_x_y;circonfere;adresse;hauteurnm;espece;varieteouc;dateplanta
48.87792844925 , 2.3664591564;25.0;PARIS 10E ARRDT - QUAI DE JEMMAPES;5.0;Acer platanoides;'Schwedleri';31/12/2014
but the imported documents don't get one field per column. I want to have something like:
{
"_id": ObjectId("57b6e2654bf4a357b679305"),
"geom_x_y" : "48.87792844925 , 2.3664591564",
"circonfere" : "25.0",
"adresse" : "PARIS 10E ARRDT - QUAI DE JEMMAPES",
"hauteurnm" : "5.0",
"espece" : "Acer platanoides",
"varieteouc" : "'Schwedleri'",
"dateplanta" : "31/12/2014"
}
mongoimport unfortunately doesn't allow you to specify your separator character. But it does work automatically with tabs as well as commas. If you know you won't ever have tabs in your input, you could replace all semicolons with tabs and that should then import correctly.
tr ";" "\t" < file.csv | mongoimport --type tsv ...

MongoDB: FailedToParse: Bad characters in value

mongodump command:
mongodump --host myhost.com --port 12345 --username myUsername --password PSWRD --out /opt/somepath --db myDb --collection my_collection --query "{ content_type_id: { \$not: { \$eq: db.my_type.findOne({slug: 'form_submissions'} )._id } } }" --verbose
Results in:
assertion: 16619 code FailedToParse: FailedToParse: Bad characters in value:
offset:33 of:{ content_type_id: { $not: { $eq: db.my_type.findOne({slug: 'form_submissions'} )._id } } }
That's not a valid query; --query must be a JSON document. Your error is in thinking that mongodump is something programmatic like the mongo shell, which could evaluate the findOne and substitute the value into the query. It isn't. Find the _id from the result of the findOne and put it in the mongodump --query manually, using extended JSON format for an ObjectId type, if that is the type of _id.
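A sketch of that two-step approach, reusing the host, credentials, and names from the command above (the all-zero hex id is a placeholder for whatever your findOne returns):
# step 1: look up the _id in the mongo shell
mongo myhost.com:12345/myDb -u myUsername -p PSWRD --eval 'printjson(db.my_type.findOne({slug: "form_submissions"})._id)'
# step 2: paste that id into the query, written as extended JSON
mongodump --host myhost.com --port 12345 --username myUsername --password PSWRD \
--out /opt/somepath --db myDb --collection my_collection --verbose \
--query '{ "content_type_id": { "$not": { "$eq": { "$oid": "000000000000000000000000" } } } }'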

How to use mongoexport with query script file

I'm trying to follow this tutorial: http://www.ultrabug.fr/tag/mongoexport/
and use a .sh file for the query line.
this is my file:
#!/bin/bash
d=`date --date="-3 month"`
echo "{ timeCreated: { "\$lte": $d} }"
this is my mongoexport line:
mongoexport --db game_server --collection GameHistory -query /home/dev/test2.sh --out /home/dev/file.json
I keep getting:
assertion: 16619 code FailedToParse: FailedToParse: Expecting '{': offset:0 of:/home/dev/test2.sh
why? How can I make this work?
I found several errors in your approach; let's examine them one by one.
Date format
MongoDB expects the date to be a number or an ISO 8601 string.
Unfortunately, the Unix date utility has no built-in support for this format, so you should use:
d=`date --date="-3 month" -u +"%Y-%m-%dT%H:%M:%SZ"`
Using extended JSON
The JSON specification has no support for dates, so you should use MongoDB extended JSON. Your final query should look like this:
{ "timeCreated": { "$lte": { "$date": "2014-05-12T08:53:29Z" } } }
test2.sh output
You messed up the quotation marks. Here is an example script that outputs correct JSON:
#!/bin/bash
d=`date --date="-3 month" -u +"%Y-%m-%dT%H:%M:%SZ"`
echo '{ "timeCreated": { "$lte": { "$date": "'$d'" } } }'
Passing query to mongoexport
mongoexport expects --query to be a JSON string, not a .sh script. When you pass a file path to --query, mongoexport tries to parse the path itself as JSON, which is exactly the Expecting '{' error you got.
To fix it, execute test2.sh yourself and pass the resulting string to mongoexport:
mongoexport --db game_server --collection GameHistory \
--query "`./test2.sh`" --out ./test2.json
N.B. Notice " quotation marks around ./test2.sh call. They're telling bash to treat ./test2.sh output as a single parameter, ignoring all inner quotation marks and whitespaces.
You need to add backticks around a script or command to have it evaluated, and quotes around the result so it is passed as a single argument:
mongoexport --db game_server --collection GameHistory \
--query "`/home/dev/test2.sh`" --out /home/dev/file.json

mongoexport too many options error while creating changelog

I'm trying to use mongoexport to export a CSV of the oplog... I've tried all the quote combinations I have read about so far...
../mongodb/bin/mongoexport --csv -d local -c oplog.rs -o export.csv -f {op,ns,o._id} -q "{ts: { \"$gte\": Timestamp(1355100998000,1)} , op :{ \"$nin\" : [\"c\",\"n\"]}"
but I keep getting
ERROR: too many positional options
.....
what could be wrong?
After a lot of screwing around, I tried this:
q="{op: { \$nin: [\"c\",\"n\"]}}"
mongoexport --csv -d local -c oplog.rs -o export.csv -f {op,ns,o._id} -q "$q"
and this works like a charm.
but still this
q="{ts: { \$gte: Timestamp(1355100998000,1)}, op: { \$nin: [\"c\",\"n\"]}}"
../mongodb/bin/mongoexport --csv --db local --collection oplog.rs -o changelog.csv --fields op,ns -q "$q"
does not work. Output
Assertion: 10340:Failure parsing JSON string near: ts: { $gte
Is something wrong with Timestamp()?
So finally this is how it should be done, or at least how I did it. It is pretty fast: I tried it on 30000 records and it takes at most 2 seconds.
All that's happening is that I am storing the results in a new collection by running mongo with the --eval option:
q="db.oplog.rs.find({ ts : { \$gte : Timestamp( $timestamp, 1)}, op : { \$nin : [\"c\",\"n\"] } }, { op : 1 , ns : 1 , \"o._id\" : 1 , h : 1 } ).forEach(function(x){db.changelog.save(x);})"
../mongodb/bin/mongo localhost:27017/local --eval "$q"
and then exporting it as .csv using mongoexport:
../mongodb/bin/mongoexport --csv --db local --collection changelog -o changelog.csv --fields "o._id","op","ns","h"
and removing the temporary collection to support future changelogs:
../mongodb/bin/mongo localhost:27017/local --eval 'db.changelog.remove()'
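As for the Timestamp() suspicion: as in the mongodump question above, -q must be valid JSON, and Timestamp(...) is mongo shell syntax, not JSON. A sketch of the original filter using the extended JSON form instead (assuming the "t" value is seconds since the epoch; adjust it to your case):
# extended JSON spelling of Timestamp(t, i)
q='{"ts": {"$gte": {"$timestamp": {"t": 1355100998, "i": 1}}}, "op": {"$nin": ["c","n"]}}'
../mongodb/bin/mongoexport --csv --db local --collection oplog.rs -o changelog.csv --fields op,ns,o._id -q "$q"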