Spring Data with Mongo: Subdocument name as query parameter

I have the following document structure:
{
    "_id": "project1",
    "customer": "someDefaultCustomer",
    "users": {
        "user1": {
            "projectRoles": ["CUSTOMER"]
        }
    }
}
Now I want to query all projects whose users contain 'user1' with Spring Data MongoDB:
@Query("{'users.?0': {$exists : true} }")
Project findUserProject(String login);
The problem is that Spring Data escapes the replacement value in queries, so I get the following 'real' query:
o.s.d.m.r.q.StringBasedMongoQuery - Created query { "users.\"user1\"" : { "$exists" : true}}
Is it possible to avoid the escaping? Of course, I could create a custom query using Spring Criteria, but I'd like to keep the '@Query' approach.

Pass in the full users.user1 path as the variable, i.e.:
@Query("{?0: {$exists : true} }")
If you really want to avoid callers having to do that, add a helper method that prepends "users." to the string and then calls this query method.
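For reference, the raw query you want Spring Data to end up sending is the plain shell form below; the projects collection name here is just assumed from the document shown:
db.projects.find({ "users.user1": { "$exists": true } })
With the full users.user1 path bound as the single parameter, the @Query above resolves to exactly this, without the extra quoting seen in the log output.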

Related

Mongodb with queries in JSON format

I was trying to solve a query given in an assessment to fetch details from a collection. The assessment was in an online editor, so I had to follow the base query structure.
MongoDB version 4.4
The base query looked like below.
{
find : "MY_COLLECTION_1"
}
The collection name is "MY_COLLECTION_1", and there is a "FirstName" field in the collection, like below.
{
    "FirstName": "ABC"
}
{
    "FirstName": "XYZ"
}
I have never seen the below format for fetching data before. Need help with this.
{
find : "MY_COLLECTION_1"
}
I tried to fetch something like below.
{
"find" : "MY_COLLECTION_1",
"FirstName" : "ABC"
}
but it fails, saying the BSON format is invalid.
The "online editor" appears to be using the find database command, which has the basic structure:
{
find: <string>,
filter: <document>,
sort: <document>,
projection: <document>,
hint: <document or string>
}
You will likely need to use the filter field to pass in a query.
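As a sketch of what that might look like for the collection in the question, with the condition moved under filter instead of sitting at the top level:
{
    "find": "MY_COLLECTION_1",
    "filter": { "FirstName": "ABC" }
}
In the mongo shell the equivalent would be db.runCommand({ "find": "MY_COLLECTION_1", "filter": { "FirstName": "ABC" } }), which returns the matching documents in a cursor batch.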

pymongo db query with multiple conditions- $and $exists

An example document looks like this
{
    "_id" : ObjectId("562e7c594c12942f08fe4192"),
    "Type": "f",
    "runTime": ISODate("2016-12-21T13:34:00.000+0000"),
    "data" : {
        "PRICES SPOT" : [
            {
                "value" : 29.64,
                "timeStamp" : ISODate("2016-12-21T23:00:00.000+0000")
            },
            {
                "value" : 29.24,
                "timeStamp" : ISODate("2016-12-22T00:00:00.000+0000")
            },
            {
                "value" : 29.81,
                "timeStamp" : ISODate("2016-12-22T01:00:00.000+0000")
            },
            {
                "value" : 30.2,
                "timeStamp" : ISODate("2016-12-22T02:00:00.000+0000")
            },
            {
                "value" : 29.55,
                "timeStamp" : ISODate("2016-12-22T03:00:00.000+0000")
            }
        ]
    }
}
My MongoDB has documents of different Types. I'd like to get a cursor for all of the documents from a time range that are of Type "f", but only those that actually have the nested prices. There are some documents in the database that broke the code I had previously (which did not check whether PRICES SPOT existed).
I saw from the documentation that I can use $and and $exists. However, I am having trouble setting it up because of the range and the nesting. I am using PyMongo as my Python driver and also noticed that I have to wrap the $and and $exists operators in quotes.
My code
def grab_forecast_cursor(self, model_dt_from, model_dt_till):
# create cursor with all items that actually exist
cursor = self._collection.find(
{
"$and":[
{'Type': 'f', 'runTime': {"$gte": model_dt_from, "$lte": model_dt_till}
['data']['PRICES SPOT': "$exists": true]}
]})
return cursor
This results in a KeyError saying it cannot find 'data'. A sample document that has no PRICES SPOT looks exactly like the one I posted at the beginning, just without that field.
In short: can someone help me set up a query that grabs a cursor over all documents of a certain type that actually have the expected nested contents?
Update
I added a comma after model_dt_till and now have a syntax error.
def grab_forecast_cursor(self, model_dt_from, model_dt_till):
# create cursor with all items that actually exist
cursor = self._collection.find(
{
"$and":[
{'Type': 'f', 'runTime': {"$gte": model_dt_from, "$lte": model_dt_till},
['data']['PRICES SPOT': "$exists": true]}
]})
return cursor
You're trying to use Python syntax to denote the path to a data structure, but the "database" wants its own syntax for the "key", using "dot notation":
cursor = self._collection.find({
"Type": "f",
"runTime": { "$gte": model_dt_from, "$lte": model_dt_till },
"data.PRICES SPOT.0": { "$exists": True }
})
You also don't need to write $and like that, as ALL MongoDB query conditions are already AND expressions, and part of your statement was actually doing that anyway, so make it consistent.
Also, the check for a "non-empty" array is 'data.PRICES SPOT.0', with the added bonus that not only do you know it "exists", but also that it has at least one item to process within it.
Python and JavaScript are almost identical in terms of object/dict construction, so you really should be able to just follow the general documentation and the many samples here that are predominantly JavaScript.
I personally even try to notate answers here with valid JSON, so they can be picked up and "parsed" by users of any language. But here, the Python is just identical to what you could enter into the mongo shell, except for True of course.
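For comparison, here is a sketch of the same query as it would be typed into the mongo shell; the collection name and the literal ISODate values are just placeholders for the variables used in the Python code above:
db.collection.find({
    "Type": "f",
    "runTime": { "$gte": ISODate("2016-12-21T00:00:00Z"), "$lte": ISODate("2016-12-22T00:00:00Z") },
    "data.PRICES SPOT.0": { "$exists": true }
})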
See "Dot Notation" for an overview of the syntax with more information at Query on Embedded / Nested Documents

How to update a field value globally in mongodb

Is there an easier way to update a field value in MongoDB globally?
I don't have much MongoDB knowledge, but I want something like this:
Update "all collections that have field1" set field1=some where field1=1234
Please take a look at the MySQL solution using DB dumps, similar to Find and replace in entire mysql database.
If the above is not possible, what is the best way to go about writing a one-time script to migrate data in MongoDB? This is in a Linux environment.
To update multiple documents in the same collection, use the multi:true option when you do an update:
db.collection.update(
{ "field1": 1234 },
{ "$set": { "field1" : somevalue } },
{ multi: true }
);
MongoDB has no commands which affect more than one collection at a time, so you will have to execute this for every collection separately. When you want to do this in the shell, you can perform the command on every collection separately with a script like this:
db.getCollectionNames().forEach( function(name) {
    db[name].update(
        { "field1": 1234 },
        { "$set": { "field1" : somevalue } },
        { multi: true }
    );
});
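If you are on a newer shell or driver where the multi-document form of update() is deprecated, the same loop can be written with updateMany(); this is just a sketch of the equivalent call, since updateMany() always affects every matching document and so needs no multi option:
db.getCollectionNames().forEach( function(name) {
    db[name].updateMany(
        { "field1": 1234 },
        { "$set": { "field1" : somevalue } }
    );
});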

Mongo : Trouble updating a Mongo DB row

I am trying to update a row in MongoDB.
I have a collection named users
db.users.find()
{ "_id" : ObjectId("50e2efe968caee13be412413"), "username" : "sss", "age" : 32 }
I am trying to update that row, changing the username to Erinamodobo and the age to 55.
I have tried the below way, but it's not working.
db.users.update({ "_id": "50e2efe968caee13be412413" }, { $set: { "username": "Erinamodobo" } });
Please let me know where I am making the mistake.
Pass in the _id as an ObjectId if you're using the mongo shell; otherwise, it won't find the existing user.
db.users.update({"_id": ObjectId("50e2efe968caee13be412413")},
{ "$set" :
{ "username": "Erinamodobo", "age" : "55" }})
With this query you are updating the name itself.
Notice that the syntax of an update is the following:
db.COLLECTION.update( {query}, {update}, {options} )
where query selects the records to update and update specifies the values of the fields to update.
So, in your case the correct command is:
db.users.update({ "name": "Erinamodobo" }, { $set: { "age": 55 } });
However, I suggets you to read the mongodb documentation, is very well written (http://docs.mongodb.org/manual/applications/update/)
As an extension to @WiredPrairie: even though he states the right answer, he doesn't really explain it.
The ObjectId is not a string, it is actually an Object, so to search by that Object you must supply an Object of the same type, i.e. here an ObjectId:
db.users.update({ "_id": ObjectId("50e2efe968caee13be412413") }, { $set: { "username": "Erinamodobo" } });
The same goes for any specific BSON type you use, from Date to NumberLong: you will need to wrap the parameters in Objects of that type.
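For example, a quick sketch with hypothetical createdAt and views fields shows the same wrapping for other BSON types:
db.users.find({
    "createdAt": { "$gte": new Date("2013-01-01") },   // a BSON Date, not the string "2013-01-01"
    "views": NumberLong("2147483648")                  // a 64-bit integer, not a plain double
})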

Using stored JavaScript functions in the Aggregation pipeline, MapReduce or runCommand

Is there a way to use a user-defined function saved with db.system.js.save(...) in the aggregation pipeline or in mapReduce?
Any function you save to system.js is available for usage by "JavaScript" processing statements such as the $where operator and mapReduce, and can be referenced by the _id value it was assigned.
db.system.js.save({
"_id": "squareThis",
"value": function(a) { return a*a }
})
And with some data inserted into the "sample" collection:
{ "_id" : ObjectId("55aafd2bacbed38e06f9eccf"), "a" : 1 }
{ "_id" : ObjectId("55aafea6acbed38e06f9ecd0"), "a" : 2 }
{ "_id" : ObjectId("55aafeabacbed38e06f9ecd1"), "a" : 3 }
Then:
db.sample.mapReduce(
    function() {
        emit(null, squareThis(this.a));
    },
    function(key,values) {
        return Array.sum(values);
    },
    { "out": { "inline": 1 } }
);
Gives:
"results" : [
{
"_id" : null,
"value" : 14
}
],
Or with $where:
db.sample.find(function() { return squareThis(this.a) == 9 })
{ "_id" : ObjectId("55aafeabacbed38e06f9ecd1"), "a" : 3 }
But in "neither" case can you use globals such as the database db reference or other functions. Both $where and mapReduce documentation contain information of the limits of what you can do here. So if you thought you were going to do something like "look up data in another collection", then you can forget it because it is "Not Allowed".
Every MongoDB command action is actually a call to a "runCommand" action "under the hood" anyway. But unless what that command is actually doing is "calling a JavaScript processing engine" then the usage becomes irrelevant. There are only a few commands anyway that do this, being mapReduce, group or eval, and of course the find operations with $where.
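To illustrate, the mapReduce example above could just as well be issued as a raw command; a rough sketch with the same map and reduce functions would be:
db.runCommand({
    "mapReduce": "sample",
    "map": function() { emit(null, squareThis(this.a)); },
    "reduce": function(key, values) { return Array.sum(values); },
    "out": { "inline": 1 }
})
The shell helper form and the raw command form end up doing exactly the same thing on the server.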
The aggregation framework does not use JavaScript in any way at all. You might be mistaken, just as others have been, by a statement like this, which does not do what you think it does:
db.sample.aggregate([
{ "$match": {
"a": { "$in": db.sample.distinct("a") }
}}
])
So that is "not running inside" the aggregation pipeline, but rather the "result" of that .distinct() call is "evaluated" before the pipeline is sent to the server. Much the same as is done with an external variable anyway:
var items = [1,2,3];
db.sample.aggregate([
{ "$match": {
"a": { "$in": items }
}}
])
Both are essentially sent to the server in the same way:
db.sample.aggregate([
{ "$match": {
"a": { "$in": [1,2,3] }
}}
])
So it is "not possible" to "call" any JavaScript function in the aggregation pipeline, nor is there really any point in "passing in" results in general from something saved in system.js. The "code" needs to be "loaded to the client" and only a JavaScript engine can actually do anything with it.
With the aggregation framework, all of the "operators" available are actually natively coded functions as opposed to the "free form" JavaScript interpretation provided for mapReduce. So instead of writing "JavaScript", you use the operators themselves:
db.sample.aggregate([
    { "$group": {
        "_id": null,
        "squared": { "$sum": {
            "$multiply": [ "$a", "$a" ]
        }}
    }}
])
{ "_id" : null, "squared" : 14 }
So there are limitations on what you can do with functions saved in system.js, and the chances are that what you want to do is either:
Not allowed, such as accessing data from another collection
Not really required as the logic is generally self contained anyway
Or probably better implemented in client logic or in some other form anyway
Just about the only practical use I can really think of is that you have a number of "mapReduce" operations that cannot be done any other way and you have various "shared" functions that you would rather just store on the server than maintain within every mapReduce function call.
But then again, the 90% reason for mapReduce over the aggregation framework is usually that the "document structure" of the collections has been poorly chosen and the JavaScript functionality is "required" to traverse the document for search and analysis.
So you can use it under the allowed constraints, but in most cases you probably should not be using this at all, and should instead fix the other issues that caused you to believe you needed this feature in the first place.