Adding to a non-existent array in a collection with $addToSet - mongodb

This should be very simple. I have two collections, one of which holds two types of data (name, age), and the other should simply add the age values to an array (with no duplicates).
I "start" my collections like usual:
People = new Mongo.Collection('people')
Ages = new Mongo.Collection('ages')
Right now I'm working with seed data, but the question could easily extend to when I actually want to dynamically add data to the array. I seed it like so:
Meteor.startup(function() {
  if (People.find().count() === 0) {
    [
      {
        name: 'John',
        age: '24' // Yes, I want to store it as strings.
      },
      { ... } // more data
    ].forEach(function(person) {
      People.insert(person)
      Ages.update({ $addToSet: { age: person.age } }) // Not working
    })
  }
})
That last part there is what's not working. I guess I figured $addToSet would fix things for me, since the docs say:
If the field is absent in the document to update, $addToSet creates
the array field with the specified value as its element.
Now I suppose I have to create the field first, but I'm not sure where or how. I have a strong, strong feeling that I'm overlooking something ridiculously simple here...

If I got it right, your db should look like this when filled:
Persons (_id, name, age)
1, John, 24
2, Pete, 21
3, Michele, 27
4, Sandy, 21
Ages (_id, ageset)
?, [ 24, 21, 27 ]
Solution 1: Just insert one record with a fixed key, and then only ever update that one.
Have a look at this MeteorPad
Solution 2: Using a local Meteor.Collection which is synced by the server and gets DISTINCT field values via the mrt:mongodb-aggregation package.
Have a look at this MeteorPad
Solution 3: Using a server-side synced Mongo.Collection to hold the distinct ages list.
Have a look at this MeteorPad
Remark: Check the log output of the server process. There are timeouts that add, change and remove a record to exercise the updates (5 sec, 10 sec, 15 sec).
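Since the MeteorPad links are external, here is a minimal in-memory sketch of Solution 1's logic (not driver code): keep a single Ages document under a fixed key (the `_id` value `'ages'` and the helper name `addAge` are hypothetical choices) and emulate an upsert with `$addToSet` so each age is stored at most once.

```javascript
// In-memory stand-in for the Ages collection: an array of documents.
var ages = [];

// Emulates Ages.upsert({ _id: 'ages' }, { $addToSet: { ageset: age } }):
// creates the fixed-key document on first use, then adds the age
// only if it is not already present in the array.
function addAge(age) {
  var doc = ages.find(function (d) { return d._id === 'ages'; });
  if (!doc) {
    doc = { _id: 'ages', ageset: [] };
    ages.push(doc);
  }
  if (doc.ageset.indexOf(age) === -1) {
    doc.ageset.push(age);
  }
}

// Seed with a duplicate to show the set semantics.
['24', '21', '27', '21'].forEach(addAge);
// ages[0].ageset is now ['24', '21', '27']
```

In real Meteor code the same effect comes from `Ages.upsert` with a fixed selector, so the array field is created on the first call and deduplicated afterwards.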

Right now, I see that you're defining your People collection, but I don't see you actually defining "person" or "Age" anywhere. Maybe that's just due to how you've formatted your question.
Either way, though, I'm not entirely sure you'd be getting anything to happen. As far as I know, you'll need to select the documents each time through the loop if you want to update them.
This is how I'm doing something similar in an app I'm working on:
Meteor.users.update({ _id: Meteor.userId() }, { $addToSet: { 'profile.viewedRequests' : this._id }});
The key there being that I'm selecting an individual document, before attempting to update it.
It's either that, or you need to switch to People.update.

Related

Increment or create object in sub-array

I want to store/update statistics for each year. I have the following model
{
  entityid: ObjectId,
  stats: [
    { year: 2018, value: 25 }
  ]
}
(This model is a bit simplified, in reality the year has also an array with months -> days -> hours. But the problem stays the same for the simplified model)
For updating I can simply use $inc like
db.statistics.updateOne(
  { entityid, 'stats.year': 2018 },
  { $inc: { 'stats.$.value': 1 } }
)
But now a problem arises when a new year begins because there will be no { year: 2019, value: 0 } inside the stats array. Upsert can not really be used because of the positional operator $.
The current solution is to check the result of the update query above if we actually modified a document. If no changes were applied we execute a push to insert the array element for the new year and execute the update again.
The solution feels like a hack and produces some problem with race conditions where multiple objects are pushed for the same year, although this can be fixed easily.
Can the update/push operation be performed in one go? Or is there a better database model to store this information?
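For reference, the two-step fallback described above can be modeled in plain JavaScript (an in-memory sketch of the operator semantics, not driver code; the helper names are made up): try the positional `$inc` first, and only if nothing matched, `$push` the missing year and retry.

```javascript
// In-memory stand-in for one statistics document.
var doc = { entityid: 1, stats: [{ year: 2018, value: 25 }] };

// Step 1: emulates updateOne({ 'stats.year': year }, { $inc: { 'stats.$.value': by } }).
// Returns the number of modified documents (0 or 1).
function incYear(doc, year, by) {
  var entry = doc.stats.find(function (s) { return s.year === year; });
  if (!entry) return 0;
  entry.value += by;
  return 1;
}

// Step 2: the fallback — if nothing matched, push an empty entry
// for the year (the $push), then retry the $inc.
function incYearOrCreate(doc, year, by) {
  if (incYear(doc, year, by) === 0) {
    doc.stats.push({ year: year, value: 0 });
    incYear(doc, year, by);
  }
}

incYearOrCreate(doc, 2018, 1); // existing year: plain increment
incYearOrCreate(doc, 2019, 1); // new year: push { year: 2019, value: 0 }, then increment
```

The race window sits between the failed step 1 and the push in step 2, which is why two concurrent writers can both push an entry for the same year.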
You can either keep your hack, or restructure your documents like this and use an upsert on the year key while using $inc on the value:
{
  entityid: ObjectId,
  year: 2018,
  value: 25
}
and use $group on entityid when fetching, if you want the data grouped per entity.
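With the flattened model, the per-year write the answer describes would be a single shell call along the lines of `db.statistics.updateOne({ entityid, year }, { $inc: { value: 1 } }, { upsert: true })`. A runnable in-memory sketch of that upsert-plus-$inc semantics (the helper name is made up):

```javascript
// In-memory stand-in for the flattened statistics collection.
var statistics = [];

// Emulates updateOne({ entityid, year }, { $inc: { value: by } }, { upsert: true }):
// increments the matching document, or inserts a fresh one if none exists.
function upsertInc(entityid, year, by) {
  var doc = statistics.find(function (d) {
    return d.entityid === entityid && d.year === year;
  });
  if (!doc) {
    statistics.push({ entityid: entityid, year: year, value: by });
  } else {
    doc.value += by;
  }
}

upsertInc('a', 2018, 25); // first write creates the document
upsertInc('a', 2018, 1);  // subsequent writes just increment
upsertInc('a', 2019, 1);  // a new year needs no special casing
```

Because the real operation is a single atomic upsert per document, the new-year race from the array model disappears.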

duplicate a set of objects instances that match the specific field

I am a newbie in MongoDB (my second day of playing with it), and I would like to get some help with my question below!
I would like to duplicate a set of object instances that match a specific field value, and save the copies with new ObjectIds.
Example
db.getCollection('food').find( { food_type: "fruit" } )
The query above will show me 8 documents with 8 fields each: food_type, name, description, color, origin, import_price, export_price, margin.
I have 8 types of fruits, and the query
totalFoodTypes = db.getCollection('food').count( { food_type: "fruit" } )
will return 8. I would like to copy all these 8 documents into another brand new 8 documents with new ObjectIds and food_type field value changed to Vegetable. Is there an easy way for doing that?
My current best thinking is to store the result from
db.getCollection('food').find( { food_type: "fruit" } )
and then loop totalFoodTypes documents and insert them one by one with Java.
It would be really appreciated if MongoDB had a shortcut for doing this.
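In the mongo shell this can be done without Java, roughly as `db.food.find({ food_type: 'fruit' }).forEach(...)` with `delete doc._id` before re-inserting, so the server assigns fresh ObjectIds. A plain-JavaScript sketch of the same copy logic (the sample documents and the integer id counter are stand-ins):

```javascript
// In-memory stand-in for the food collection; only the relevant fields shown.
var food = [
  { _id: 1, food_type: 'fruit', name: 'apple' },
  { _id: 2, food_type: 'fruit', name: 'pear' }
];
var nextId = 3; // stand-in for ObjectId generation

// Copy every matching document, replacing _id so a new one is assigned,
// and overriding food_type on the copy. filter() snapshots the matches
// first, so pushing into the same array is safe.
food.filter(function (d) { return d.food_type === 'fruit'; })
  .forEach(function (d) {
    var copy = Object.assign({}, d, { _id: nextId++, food_type: 'Vegetable' });
    food.push(copy);
  });
// food now holds 4 documents: the 2 fruits plus 2 vegetable copies with new ids.
```

The same loop translated to the shell avoids pulling the documents into an application and re-inserting them one by one.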

Unique Values in NoSQL

Consider mongodb or couchbase. What if I need a certain value to be unique (maybe incremental) within the range of UINT32?
Well, I guess I could add a field like another_id and use something like this to increment it (mongo).
function getNextSequence(name) {
  var ret = db.counters.findAndModify(
    {
      query: { _id: name },
      update: { $inc: { seq: 1 } },
      new: true
    }
  );
  return ret.seq;
}
db.users.insert(
  {
    another_id: getNextSequence("userid"),
    name: "Stack O. Flow"
  }
)
But really the question is,
Is this approach safe?
Should I even use NoSQL for this? (Consider that I only have around 50M rows of data, but I really need fast reads and writes, because those 50M rows get updated several times per second.)
If I should stick with SQL, which one should I use? I've used MySQL and it was too slow (though a lack of optimization, and joining quite a few tables, might be at fault).
Thank you for any suggestions.
There is a specific counter object in Couchbase that should do what you want. Here is an example of it with Node.js.
You could relate it to the main object you are using by doing an objectID such as:
original_objectID::counter.
Then when you go to get the original object, you just do another get for the counter object by ID and done. You can iterate it easily as well. So if you needed to get the object and the original objectID was
user::kirk
then that user's counter object would be:
user::kirk::counter
And you can get and set it by that ID. It works very well in Couchbase.

MongoDB order by "number" ascending

I'm trying to create a registration form with mongoose and MongoDB. I have a unique key, UserId, and every time I create a new entry I would like to take the greatest UserId in the database and increase it by one.
I tried with db.user.find({}).sort({userId: 1}); but it seems not to work.
Thanks
Masiar
What you want to do sounds more like a schema for a relational database with an auto increment. I would recommend another solution.
First, you already have a unique id: it gets created automatically and lives in the "_id" field. It seems you want a UserId for building relations, but you can already use the value in _id for that.
The other reason you might want incremented ids could be that you are building a web application and probably want "nicer" URLs? For example, /user/1 instead of /user/abc48df...?
If that is the case, I would prefer to create a unique constraint on a username, and instead of an id use the username in the URL: "/user/john".
With this your URLs are much nicer, and for building relations you can use _id. And you don't run into the problem of fetching the highest number first.
To create a unique index:
db.collection.ensureIndex({username: 1}, {unique: true})
You can do this to get the user with the current highest UserId:
db.user.insert( { UserId: 1 } )
db.user.insert( { UserId: 2 } )
db.user.insert( { UserId: 3 } )
db.user.find().sort( { UserId: -1 } ).limit(1)
It's worth noting that there isn't a way in MongoDB to fetch this value and insert a new user in a single atomic transaction, it only supports atomic operations on single documents. You'd need to take care that another operation didn't insert another user at the same time, you could end up with two users with the same UserId.
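One common way around that race is the counter pattern: keep a counters document and claim the next UserId with a single findOneAndUpdate (with upsert and $inc), which is atomic per document. A hedged in-memory sketch of those semantics (the counter name "userid" and sample users are illustrative):

```javascript
// In-memory stand-in for a counters collection keyed by name.
var counters = {};

// Emulates db.counters.findOneAndUpdate({ _id: name },
//   { $inc: { seq: 1 } }, { upsert: true, returnDocument: 'after' }).
// Each call hands out the next integer; in real MongoDB no two callers
// can receive the same value, because the operation is atomic per document.
function getNextSequence(name) {
  if (!counters[name]) counters[name] = { seq: 0 };
  counters[name].seq += 1;
  return counters[name].seq;
}

var u1 = { UserId: getNextSequence('userid'), name: 'Masiar' };
var u2 = { UserId: getNextSequence('userid'), name: 'Bob' };
// u1.UserId is 1, u2.UserId is 2
```

Unlike the sort-and-read-back approach, claiming the id and bumping the counter happen in one server-side operation, so concurrent inserts cannot observe the same "highest" value.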
To iterate over the cursor and put each returned doc in an array:
var myArray = [];
User.find().sort({ UserId: -1 }).limit(1).exec(function(err, docs) {
  docs.forEach(function(doc) {
    myArray.push(doc);
  });
});

mongoDB: unique index on a repeated value

I'm pretty new to MongoDB, so I figure this could be a misunderstanding of general usage; bear with me.
I have a document schema I'm working with as such
{
  name: "bob",
  email: "bob@gmail.com",
  logins: [
    { u: 'a', p: 'b', public_id: '123' },
    { u: 'x', p: 'y', public_id: 'abc' }
  ]
}
My problem is that I need to ensure that the public_ids are unique within a document and across the collection.
Furthermore, there are some existing records being migrated from a MySQL DB that don't have this value, and it will therefore be null for them in Mongo.
I figure its either an index
db.users.ensureIndex({ 'logins.public_id': 1 }, { unique: true });
which isn't working because of the missing keys and is throwing an E11000 duplicate key error,
or this is a more fundamental schema problem, in that I shouldn't be nesting objects in an array structure like that. In which case, what? A separate collection for the user logins? That seems to go against the idea of an embedded document.
If you expect u and p to always have the same values on each insert (as in your example snippet), you might want to use the $addToSet operator on inserts to ensure the uniqueness of your public_id field. Otherwise I think it's quite difficult to make them unique across a whole collection without resorting to external maintenance or JS functions.
If not, I would possibly store them in their own collection and use the public_id as _id field to ensure their cross-document uniqueness inside a collection. Maybe that would contradict the idea of embedded docs in a doc database, but according to different requirements I think that's negligible.
Furthermore there are some existing records being migrated from a mySQL DB that dont have records, and will therefore all be replaced by null values in mongo.
So you want to apply a unique index on a data set that's not truly unique. I think this is just a modeling problem.
If a null logins.public_id is going to violate your uniqueness constraint, then just don't write the field at all:
{
  logins: [
    { u: 'a', p: 'b' },
    { u: 'x', p: 'y' }
  ]
}
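If the field is omitted rather than set to null, a sparse unique index should tolerate the migrated documents (an assumption worth verifying against your MongoDB version; in the shell that would be `db.users.ensureIndex({ 'logins.public_id': 1 }, { unique: true, sparse: true })`, since sparse indexes skip documents that lack the indexed field entirely). A small sketch of the uniqueness check such an index effectively performs (the helper name is made up):

```javascript
// Sketch: what a sparse unique index on 'logins.public_id' effectively enforces.
// Login entries that omit public_id are skipped entirely; among the rest,
// no public_id value may repeat anywhere in the collection.
function violatesSparseUnique(users) {
  var seen = {};
  for (var i = 0; i < users.length; i++) {
    var logins = users[i].logins || [];
    for (var j = 0; j < logins.length; j++) {
      var id = logins[j].public_id;
      if (id === undefined) continue; // absent field: not indexed
      if (seen[id]) return true;      // duplicate key -> E11000
      seen[id] = true;
    }
  }
  return false;
}

var ok = violatesSparseUnique([
  { logins: [{ u: 'a', p: 'b', public_id: '123' }] },
  { logins: [{ u: 'x', p: 'y' }] } // migrated doc without public_id: fine
]);
// ok is false: no violation
```

Note this only helps when the field is truly absent; an explicit null counts as a present value and would still collide.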
Thanks all.
In the end I opted to separate this into two collections, one for users and one for logins.
The users documents looked a little like:
userDocument = {
  ...
  logins: [
    DBRef('loginsCollection', loginDocument._id),
    DBRef('loginsCollection', loginDocument2._id)
  ]
}
loginDocument = {
  ...
  user: new DBRef('userCollection', userDocument._id)
}
Although it's not what I was originally after (a single collection), it is working nicely, and by utilising the uniqueness of MongoDB's _id there is now a constraint built in at the database level rather than implemented at the application level.