Python check MongoDB field exists or not - mongodb

I have two collections. One is websites, which stores information like:
{
    "_id" : ObjectId("5ac5efd6a37efa4c0e28f5aa"),
    "main_id" : 3,
    "status" : "",
    "website" : "http://test.com",
    "last_access_time" : "2018-04-16 17:49:03",
    "links" : [
        {
            "link_id" : "test-1",
            "link" : "test1.html"
        },
        {
            "link_id" : "test-2",
            "link" : "test.html"
        }
    ]
}
And another is website_info, in which I want to store info like:
{
    "_id" : ObjectId("5ad72ddecf45b60dffcbf9f2"),
    "main_id" : 3,
    "last_access_time" : "2018-04-18 15:37:02",
    "test-1" : {
        "no_of_links" : 55,
        "links_2" : [
            {
                "link" : "/home",
                "link_id" : "secnd-1"
            },
            {
                "link" : "/login",
                "link_id" : "secnd-2"
            },
            {
                "link" : "/services",
                "link_id" : "secnd-3"
            }
        ]
    },
    "test-2" : {
        "no_of_links" : 55,
        "links_2" : [
            {
                "link" : "/home",
                "link_id" : "secnd-1"
            },
            {
                "link" : "/login",
                "link_id" : "secnd-2"
            },
            {
                "link" : "/services",
                "link_id" : "secnd-3"
            }
        ]
    }
}
I am using Python 3 and MongoDB.
Here I want to check whether a field named after a link_id value such as "test-1" exists in website_info for main_id = 3. If it exists I will update that document; if it does not exist I want to insert a new record set.
The question is how to check whether the field "test-1" (a value taken from the websites collection) exists in the website_info collection.
Help is appreciated.

In my case, link_id is unique in the website_info collection, so there is no need to check main_id; checking only for link_id solved my issue:
@classmethod
def find_link(cls, link_id):
    # the variable link_id becomes the field name, e.g. {"test-1": {"$exists": True}}
    cursor = cls.db.collection.find({link_id: {'$exists': True}})
    results = list(cursor)
    return results
And check for existence like (is_exists being the list returned by find_link):
if len(is_exists) > 0:
    # field exists – update the existing document here
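For the update-or-insert step itself, here is a minimal pymongo sketch; the connection string, database name and the info payload below are assumptions for illustration only:
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumed connection string
db = client["mydb"]                                  # assumed database name

link_id = "test-1"                                   # value taken from the websites collection
info = {"no_of_links": 55, "links_2": []}            # illustrative payload to store under that field

if db.website_info.count_documents({link_id: {"$exists": True}}) > 0:
    # field already present – update the matching document
    db.website_info.update_one({link_id: {"$exists": True}},
                               {"$set": {link_id: info}})
else:
    # field missing – insert a new record set
    db.website_info.insert_one({"main_id": 3, link_id: info})
If all you need is "create it if missing, overwrite it otherwise", an update_one(..., upsert=True) filtered on main_id collapses the check and the write into a single call.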

Related

Mongoose updateMany :: won't find any on given condition

I have an updateMany function as follows:
Article.updateMany({author: userId}, {author: anonym}, function(err, updated) {
    if (err) {
        res.send(err);
    } else {
        res.send(updated);
    }
});
userId is 6068b57dbe4eef0b579120c7
anonym is 6069870676d6320f39e7e5a2
For testing purposes, I have a single article in MongoDB as follows:
db.articles.find()
{ "_id" : ObjectId("6068b591be4eef0b579120c8"), "favoritesCount" : 1, "comments" : [ ], "tagList" : [ ], "title" : "Martin", "description" : "Testib", "body" : "Asju", "author" : ObjectId("6068b57dbe4eef0b579120c7"), "slug" : "martin-2hzx78", "createdAt" : ISODate("2021-04-03T18:36:01.977Z"), "updatedAt" : ISODate("2021-04-03T18:53:29.809Z"), "__v" : 0 }
You can see that the article has an "author" field which currently holds userId as the author.
I want to update that field and transfer authorship to anonym user.
When I send this request in Postman I get the following response:
{
    "n": 0,
    "nModified": 0,
    "ok": 1
}
And the database remains unchanged. What am I doing wrong here?

Add field to every document with existing data (move field data to a new field)

I have almost no experience with SQL or NoSQL.
I need to update every document so that my "Log*" fields are nested under a new field "Log".
I found some help from this StackOverflow post, but I am still wondering how to move the data.
Thank you very much.
Original document
// collection: Services
{
    "_id" : ObjectId("5ccb4f99f4953d4894acbe79"),
    "Name" : "WebAPI",
    "LogPath" : "Product\\APIService\\",
    "LogTypeList" : [
        {
            "Name" : "ApiComCounter",
            "FileName" : "ApiComCounter.log"
        },
        {
            "Name" : "ApiService",
            "FileName" : "ApiService.log"
        }
    ]
}
Final Document
// collection: Services
{
    "_id" : ObjectId("5ccb6fa2ae8f8a5d7037a5dd"),
    "Name" : "InvoicingService",
    "Log" : {
        "LogPath" : "Product\\APIService\\",
        "LogTypeList" : [
            {
                "Name" : "ApiComCounter",
                "FileName" : "ApiComCounter.log"
            },
            {
                "Name" : "ApiService",
                "FileName" : "ApiService.log"
            }
        ]
    }
}
This requires MongoDB 4.2 or higher:
db.<collection>.updateMany({}, [
    {$set: {"Log.LogPath": "$LogPath", "Log.LogTypeList": "$LogTypeList"}},
    {$unset: ["LogPath", "LogTypeList"]}
])
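If you drive this migration from Python, pymongo's update_many accepts the same aggregation-pipeline update on MongoDB 4.2+. A minimal sketch, assuming the collection is named Services and the connection details below:
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["mydb"]   # assumed connection details

# pipeline-style update: copy the two fields under Log, then remove the originals
db.Services.update_many({}, [
    {"$set": {"Log.LogPath": "$LogPath", "Log.LogTypeList": "$LogTypeList"}},
    {"$unset": ["LogPath", "LogTypeList"]},
])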

MongoDB update and delete operations in a single query

I have documents in which I would like to update the hostUser with one of the members of the document. I also have to delete that member's record from the members field and add the deleted member's chips to the club chips.
Here is a sample document:
{
    "_id" : "1002",
    "hostUser" : "1111111111",
    "clubChips" : 10000,
    "requests" : {},
    "profile" : {
        "clubname" : "AAAAA",
        "image" : "0"
    },
    "tables" : [
        "SJCA3S0Wm"
    ],
    "isDeleted" : false,
    "members" : {
        "1111111111" : {
            "chips" : 0,
            "id" : "1111111111"
        },
        "2222222222" : {
            "chips" : 0,
            "id" : "2222222222"
        }
    }
}
This is what I have tried:
db.getCollection('test').updateMany({"hostUser":"1111111111"},
{"$set":{"hostUser":"2222222222"},"$unset":{"members.1111111111":""}})
This is how you would handle unset and set in a single call to updateMany. Can you please clarify what you meant by "check if the values exist in the member field"?
db.getCollection('test').updateMany(
    {"hostUser": "1111111111"},
    {
        '$set': {"hostUser": "2222222222"},
        '$unset': {"members.1111111111": ""}
    }
)
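For completeness, the same set-plus-unset update can be issued from Python with pymongo; a sketch, assuming the collection really is called test and the connection details below:
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["mydb"]   # assumed connection details

db.test.update_many(
    {"hostUser": "1111111111"},
    {
        "$set": {"hostUser": "2222222222"},      # promote the other member to host
        "$unset": {"members.1111111111": ""},    # drop the old host from members
    },
)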

meteor client find is not working due to $eq

I subscribed to my server's publication as follows:
Template.observedQuestions.onCreated(function(){
    var self = this;
    self.autorun(function(){
        self.subscribe('observedQuestionsFeed');
    });
});
Now I need to fetch my data using a helper function:
Template.observedQuestions.helpers({
    observedQuestionsList : function(){
        questions = Questions.find({
            observedByUsers : {$exists: true, $elemMatch: {$eq: Meteor.userId()}}});
        return questions;
    }
});
but it does not work because $eq is not recognised in minimongo.
How can I solve it?
doc sample:
{
    "_id" : "rP4JP8jkprwwi3ZCp",
    "qUserId" : "NLLW3RBXqnbSGuZ3n",
    "type" : "question",
    "date" : ISODate("2016-02-13T11:23:10.845Z"),
    "subject" : "test",
    "question" : "test",
    "replies" : [
        {
            "rID" : "LphcqKnkTHf25SCwq",
            "rUserID" : "NLLW3RBXqnbSGuZ3n",
            "date" : ISODate("2016-02-13T11:23:10.847Z"),
            "answer" : "reply1."
        },
        {
            "rID" : "HxaohnEgxwNJLtf2z",
            "rUserID" : "NLLW3RBXqnbSGuZ22",
            "date" : ISODate("2016-02-13T11:23:10.848Z"),
            "answer" : "reply2"
        }
    ],
    "observedByUsers" : [ "Bi24LGozvtihxFrNe" ]
}
Judging from your sample Questions document, the field observedByUsers is a simple array of user IDs. MongoDB (and minimongo) treats equality against an array field as "the array contains this value", so you don't need $elemMatch or $eq at all; you could simply use the following query:
Questions.find({observedByUsers: Meteor.userId()});

MongoDB MapReduce producing different results for each document

This is a follow-up from this question, where I tried to solve this problem with the aggregation framework. Unfortunately, I have to wait before being able to update this particular MongoDB installation to a version that includes the aggregation framework, so I have had to use MapReduce for this fairly simple pivot operation.
I have input data in the format below, with multiple daily dumps:
"_id" : "daily_dump_2013-05-23",
"authors_who_sold_books" : [
{
"id" : "Charles Dickens",
"original_stock" : 253,
"customers" : [
{
"time_bought" : 1368627290,
"customer_id" : 9715923
}
]
},
{
"id" : "JRR Tolkien",
"original_stock" : 24,
"customers" : [
{
"date_bought" : 1368540890,
"customer_id" : 9872345
},
{
"date_bought" : 1368537290,
"customer_id" : 9163893
}
]
}
]
}
I'm after output in the following format, which aggregates across all instances of each (unique) author across all daily dumps:
{
    "_id" : "Charles Dickens",
    "original_stock" : 253,
    "customers" : [
        {
            "date_bought" : 1368627290,
            "customer_id" : 9715923
        },
        {
            "date_bought" : 1368622358,
            "customer_id" : 9876234
        },
        etc...
    ]
}
I have written this map function...
function map() {
    for (var i in this.authors_who_sold_books)
    {
        author = this.authors_who_sold_books[i];
        emit(author.id, {customers: author.customers, original_stock: author.original_stock, num_sold: 1});
    }
}
...and this reduce function.
function reduce(key, values) {
    sum = 0
    for (i in values)
    {
        sum += values[i].customers.length
    }
    return {num_sold : sum};
}
However, this gives me the following output:
{
    "_id" : "Charles Dickens",
    "value" : {
        "customers" : [
            {
                "date_bought" : 1368627290,
                "customer_id" : 9715923
            },
            {
                "date_bought" : 1368622358,
                "customer_id" : 9876234
            }
        ],
        "original_stock" : 253,
        "num_sold" : 1
    }
}
{ "_id" : "JRR Tolkien", "value" : { "num_sold" : 3 } }
{
    "_id" : "JK Rowling",
    "value" : {
        "customers" : [
            {
                "date_bought" : 1368627290,
                "customer_id" : 9715923
            },
            {
                "date_bought" : 1368622358,
                "customer_id" : 9876234
            }
        ],
        "original_stock" : 183,
        "num_sold" : 1
    }
}
{ "_id" : "John Grisham", "value" : { "num_sold" : 2 } }
The even-indexed documents have customers and original_stock listed, but an incorrect sum for num_sold.
The odd-indexed documents only have num_sold listed, but it is the correct number.
Could anyone tell me what it is I'm missing, please?
Your problem is due to the fact that the format of the value returned by the reduce function must be identical to the format of the values emitted by the map function (see the requirements for the reduce function for an explanation).
You need to change the code to something like the following to fix the problem:
function map() {
    for (var i in this.authors_who_sold_books)
    {
        author = this.authors_who_sold_books[i];
        // emit one value per author, already counting this dump's sales
        emit(author.id, {customers: author.customers, original_stock: author.original_stock, num_sold: author.customers.length});
    }
}

function reduce(key, values) {
    // the returned object has the same shape as the emitted values
    var result = {customers: [], num_sold: 0, original_stock: (values.length ? values[0].original_stock : 0)};
    for (i in values)
    {
        result.num_sold += values[i].num_sold;
        result.customers = result.customers.concat(values[i].customers);
    }
    return result;
}
I hope that helps.
Note: the change is num_sold: author.customers.length in the map function; I think that's what you want.
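For reference, once the installation can be upgraded, the same pivot becomes a short aggregation pipeline. A pymongo sketch, assuming the daily dumps live in a collection named daily_dumps and the connection details below:
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["mydb"]   # assumed connection details

pipeline = [
    {"$unwind": "$authors_who_sold_books"},              # one document per author per dump
    {"$unwind": "$authors_who_sold_books.customers"},    # one document per purchase
    {"$group": {
        "_id": "$authors_who_sold_books.id",
        "original_stock": {"$first": "$authors_who_sold_books.original_stock"},
        "customers": {"$push": "$authors_who_sold_books.customers"},
        "num_sold": {"$sum": 1},
    }},
]

for author in db.daily_dumps.aggregate(pipeline):
    print(author)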