Indexing array/subobject in MongoDB causes duplicate key error - mongodb

I have a collection where I will have a _children attribute like this:
{
  _children: {
    videoTags: [ { id: '1', name: 'one' }, { id: '2', name: 'two' } ]
  },
  a: 10
}
Since I WILL search in videoTags, I create an index as such:
> db.test4.createIndex({ "_children.videosTags.id" : 1 }, { "unique" : true } );
{
  "createdCollectionAutomatically" : false,
  "numIndexesBefore" : 1,
  "numIndexesAfter" : 2,
  "ok" : 1
}
Trouble is, I can no longer add anything to that collection, since I get a duplicate key error. Here is how to reproduce it:
Step 1: insert to a collection
db.test4.insert({a:20})
WriteResult({ "nInserted" : 1 })
Step 2: make the index
db.test4.createIndex({ "_children.videosTags.id" : 1 }, { "unique" : true } );
{
  "createdCollectionAutomatically" : false,
  "numIndexesBefore" : 1,
  "numIndexesAfter" : 2,
  "ok" : 1
}
Step 3: try to insert again
db.test4.insert({a:30})
WriteResult({
  "nInserted" : 0,
  "writeError" : {
    "code" : 11000,
    "errmsg" : "insertDocument :: caused by :: 11000 E11000 duplicate key error index: wonder_1.test4.$_children.videosTags.id_1 dup key: { : null }"
  }
})
I think the issue here is that there is already a record where _children.videoTags.id is not defined.
However, what I expected was a behaviour where if videoTags.id was specified, it needed to be unique. Instead, an empty one is considered a "taken" key.
What am I doing that is stupidly wrong?
This will work if you don't set unique to true, but I have the feeling I need to fix it for real...

There could be two reasons.
1. Other documents may already exist in the collection with the same _children.videosTags.id value.
2. More than one document may be missing _children.videosTags.id, or have it set to null.
Because you are creating a unique index, null or missing values will give you a tough time: every document without the field is indexed under the key null, and the second such document violates the unique constraint. The solution is to create a sparse index or, if your MongoDB version is 3.2+, a partial index. See the documentation for partial indexes.
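For example, a partial index that only enforces uniqueness when the field is present behaves the way the question expects. A minimal sketch against the test4 collection from the question (note that the question indexes _children.videosTags.id while the sample document uses videoTags; the path below follows the index command as written):
> db.test4.createIndex(
...   { "_children.videosTags.id" : 1 },
...   { unique : true,
...     partialFilterExpression : { "_children.videosTags.id" : { $exists : true } } }
... );
On versions older than 3.2, the sparse variant achieves a similar effect by skipping documents that lack the indexed field entirely:
> db.test4.createIndex(
...   { "_children.videosTags.id" : 1 },
...   { unique : true, sparse : true }
... );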

Related

mongodb ensure uniqueness on two fields both ways

Say I have the fields a and b. I want compound uniqueness such that if a: 1, b: 2 exists, I would not be able to insert a: 2, b: 1.
The reason I want this is because I'm making a "friends list" kind of collection, where if a is connected to b, then it's automatically the reverse as well.
Is this possible on a schema level, or do I need to run queries to check?
If you don't need to differentiate between requester and requestee, you could sort the values before saving or querying so that your two fields a and b have a predictable order for any pair of friend IDs (and you can take advantage of the unique index constraint).
For example, using the mongo shell:
Create a helper function to return friend pairs in predictable order:
function friendpair(friend1, friend2) {
  if (friend1 < friend2) {
    return { a: friend1, b: friend2 };
  } else {
    return { a: friend2, b: friend1 };
  }
}
Add a compound unique index:
> db.friends.createIndex({a:1, b:1}, {unique: true});
{
  "createdCollectionAutomatically" : true,
  "numIndexesBefore" : 1,
  "numIndexesAfter" : 2,
  "ok" : 1
}
Insert unique pairs (should work)
> db.friends.insert(friendpair(1,2))
WriteResult({ "nInserted" : 1 })
> db.friends.insert(friendpair(1,3))
WriteResult({ "nInserted" : 1 })
Insert non-unique pair (should return duplicate key error):
> db.friends.insert(friendpair(2,1))
WriteResult({
  "nInserted" : 0,
  "writeError" : {
    "code" : 11000,
    "errmsg" : "E11000 duplicate key error collection: test.friends index: a_1_b_1 dup key: { : 1.0, : 2.0 }"
  }
})
Search should work in either order:
db.friends.find(friendpair(3,1)).pretty()
{ "_id" : ObjectId("5bc80ed11466009f3b56fa52"), "a" : 1, "b" : 3 }
db.friends.find(friendpair(1,3)).pretty()
{ "_id" : ObjectId("5bc80ed11466009f3b56fa52"), "a" : 1, "b" : 3 }
Instead of handling duplicate key errors or insert versus update, you could also use findAndModify with an upsert since this is expected to be a unique pair:
> var pair = friendpair(2,1)
> db.friends.findAndModify({
    query: pair,
    update: {
      $set: {
        a: pair.a,
        b: pair.b
      },
      $setOnInsert: { status: 'pending' }
    },
    upsert: true
  })
{
  "_id" : ObjectId("5bc81722ce51da0e4118c92f"),
  "a" : 1,
  "b" : 2,
  "status" : "pending"
}
It doesn't seem like you can enforce uniqueness on an entire array's values, so I'm doing a kind of workaround: I'm using a $jsonSchema validator, as follows:
{
  $jsonSchema: {
    bsonType: "object",
    required: [ "status", "users" ],
    properties: {
      status: {
        enum: [ "pending", "accepted" ],
        bsonType: "string"
      },
      users: {
        bsonType: "array",
        description: "references two user_id",
        items: {
          bsonType: "objectId"
        },
        maxItems: 2,
        minItems: 2
      }
    }
  }
}
Then I use $all to find the connected users, e.g.
db.collection.find( { users: { $all: [ ObjectId1, ObjectId2 ] } } )
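For completeness, a sketch of how such a validator can be attached when creating the collection (the collection name friends is illustrative, not from the question):
db.createCollection("friends", {
  validator: { /* the $jsonSchema document above */ }
})
Note that the validator only constrains the shape of each document (exactly two objectIds plus a status); it does not by itself prevent two documents from holding the same pair, since schema validation applies per document, not across the collection.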

What's causing this indexing error in mongo db?

I've created 3 indexes based on several json files of Yelp that I imported into my mongodb.
> db.review.createIndex({"text":"text"})
{
  "createdCollectionAutomatically" : false,
  "numIndexesBefore" : 1,
  "numIndexesAfter" : 2,
  "ok" : 1
}
> db.business.createIndex({"categories":"text"})
{
  "createdCollectionAutomatically" : false,
  "numIndexesBefore" : 1,
  "numIndexesAfter" : 2,
  "ok" : 1
}
> db.business.createIndex({"attributes":"text"})
{
  "ok" : 0,
  "errmsg" : "Index with pattern: { _fts: \"text\", _ftsx: 1 } already exists with different options",
  "code" : 85
}
Basically I'm trying to create 3 indexes to make count queries faster in my MongoDB.
What does "errmsg" : "Index with pattern: { _fts: \"text\", _ftsx: 1 } already exists with different options" mean?
Should I pick a different thing as an attribute, or should I drop it?
MongoDB (as of v3.4) only allows one text index per collection.
In your business collection, you have already built a text index on categories, so creating a second text index, on attributes, will fail.
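A common workaround is to drop the single-field text index and build one compound text index that covers both fields. A sketch (the index name categories_text is the default MongoDB generates for that index; verify it with db.business.getIndexes() first):
> db.business.dropIndex("categories_text")
> db.business.createIndex({ "categories": "text", "attributes": "text" })
Queries using $text will then search across both fields at once.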

mongodb combination of fields unique

I need to ensure that a combination of some fields is unique, so I found out we can use indexes to achieve it, like:
db.user.ensureIndex({'fname':1,'lname':1},{unique:true})
However, in my scenario I allow the user to add an email or phone, or even both. So how do I achieve uniqueness in this case? I tried setting 2 indexes like
db.user.ensureIndex({'fname':1,'lname':1,'email':1},{unique:true})
And
db.user.ensureIndex({'fname':1,'lname':1,'mobile':1},{unique:true})
This didn't work when I tried to add a document with a unique email, like below:
> db.user.insert({fname:'tony', lname:'stark', email:'aa@gmail.com'})
WriteResult({ "nInserted" : 1 })
> db.user.insert({fname:'tony', lname:'stark', email:'xyz@gmail.com'})
WriteResult({
  "nInserted" : 0,
  "writeError" : {
    "code" : 11000,
    "errmsg" : "E11000 duplicate key error index: temp.user.$fname_1_lname_1_mobile_1 dup key: { : \"tony\", : \"stark\", : null }"
  }
})
I guess the error is because of the second index, since both inserts have a null value for the mobile field. I then tried this:
db.user.ensureIndex({'fname':1,'lname':1,'email':1,'mobile':1},{unique:true})
And that didn't help. Is there any way to do this with MongoDB, or should I otherwise check uniqueness in code?
Update
I need to ensure uniqueness in this case too :
db.user.insert({fname:'tony', lname:'stark', email:'abc@gmail.com'})
<<insert>>
db.user.insert({fname:'tony', lname:'stark', email:'abc@gmail.com', mobile:554545435})
<<don't insert>>
Update 2
Even Unique partial Index didn't help
> db.users.createIndex({ "fname": 1, "lname": 1, "email":1 },
... { unique: true, partialFilterExpression:{
... email: { $exists: true, $gt : { $type : 10 } } } })
{
  "createdCollectionAutomatically" : true,
  "numIndexesBefore" : 1,
  "numIndexesAfter" : 2,
  "ok" : 1
}
> db.users.createIndex({ "fname": 1, "lname": 1, "mobile":1 },
... { unique: true, partialFilterExpression:{
... mobile: { $exists: true, $gt : { $type : 10 } } } })
{
  "createdCollectionAutomatically" : false,
  "numIndexesBefore" : 2,
  "numIndexesAfter" : 3,
  "ok" : 1
}
Here is the query :
> db.users.insert({fname:"tony", lname:"stark", mobile:"123", email:"123@gmail.com"})
WriteResult({ "nInserted" : 1 })
> db.users.insert({fname:"tony", lname:"stark", mobile:"123", email:"123@gmail.com"})
WriteResult({ "nInserted" : 1 })
Mongo version - v3.2.0
I don't have much experience with MongoDB, nor have I used indexes in other databases, so I guess this is kind of a noob question. Thanks in advance.
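For reference, the partialFilterExpression in Update 2 nests $type inside $gt, which compares email against a literal object; a string is never greater than an object in BSON ordering, so the filter matches nothing, the partial indexes stay empty, and nothing is enforced. A sketch of the syntax that attempt was likely aiming for, filtering on field existence instead:
> db.users.createIndex({ "fname": 1, "lname": 1, "email": 1 },
...   { unique: true, partialFilterExpression: { email: { $exists: true } } })
> db.users.createIndex({ "fname": 1, "lname": 1, "mobile": 1 },
...   { unique: true, partialFilterExpression: { mobile: { $exists: true } } })
With these, the second insert in the Update 2 test would be rejected, since its fname, lname, and email match an existing document where email is present.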

dropDups true not working mongodb

I am using mongoDB shell with version 3.0.2
I am trying to ensure uniqueness constraint over a username field in collection users.
This is what I gave:
db.users.ensureIndex({"username": 1},{unique: true})
It gave me the following error:
{
  "createdCollectionAutomatically" : false,
  "numIndexesBefore" : 1,
  "errmsg" : "exception: E11000 duplicate key error index: mybackend.users.$username_1 dup key: { : \"rahat\" }",
  "code" : 11000,
  "ok" : 0
}
Then I used dropDups: true in the command:
db.users.ensureIndex({"username": 1},{unique: true, dropDups: true})
I then got the same error again:
{
  "createdCollectionAutomatically" : false,
  "numIndexesBefore" : 1,
  "errmsg" : "exception: E11000 duplicate key error index: mybackend.users.$username_1 dup key: { : \"rahat\" }",
  "code" : 11000,
  "ok" : 0
}
I also saw this SO link, but there the collection already had an extra index; mine does not have one.
db.users.getIndexes() ->
[
  {
    "v" : 1,
    "key" : {
      "_id" : 1
    },
    "name" : "_id_",
    "ns" : "mybackend.users"
  }
]
I looked for this issue on GitHub and restarted mongod, but to no avail. What am I doing wrong? I think I am making a silly mistake. Please help.
The drop duplicates functionality (dropDups) on index creation is no longer supported since MongoDB 3.0; the option is ignored and the unique index build simply fails while duplicates exist. See the compatibility changes page for the 3.0 release.
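Since the server will no longer drop duplicates for you, the collection has to be deduplicated manually before the unique index can be built. A minimal sketch of one possible policy (destructive: it keeps the first document per username and removes the rest, so back up the collection first):
db.users.aggregate([
  { $group: { _id: "$username", count: { $sum: 1 }, ids: { $push: "$_id" } } },
  { $match: { count: { $gt: 1 } } }
]).forEach(function (dup) {
  dup.ids.slice(1).forEach(function (id) {
    db.users.remove({ _id: id }); // delete every duplicate after the first
  });
});
db.users.ensureIndex({ "username": 1 }, { unique: true });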

mongo _id field duplicate key error

I have a collection whose _id field is an IP address stored as a String.
I'm using mongoose, but here's the error in the console:
$ db.servers.remove()
$ db.servers.insert({"_id":"1.2.3.4"})
$ db.servers.insert({"_id":"1.2.3.5"}) <-- Throws dup key: { : null }
Likely, it's because you have an index that requires a unique value for one of the fields, as shown below:
> db.servers.remove()
> db.servers.ensureIndex({"name": 1}, { unique: 1})
> db.servers.insert({"_id": "1.2.3"})
> db.servers.insert({"_id": "1.2.4"})
E11000 duplicate key error index: test.servers.$name_1 dup key: { : null }
You can see your indexes using getIndexes() on the collection:
> db.servers.getIndexes()
[
  {
    "v" : 1,
    "key" : {
      "_id" : 1
    },
    "ns" : "test.servers",
    "name" : "_id_"
  },
  {
    "v" : 1,
    "key" : {
      "name" : 1
    },
    "unique" : true,
    "ns" : "test.servers",
    "name" : "name_1"
  }
]
I was confused by exactly the same error today, and later figured it out. It was because I had removed an indexed property from a mongoose schema but did not drop that index from the MongoDB collection. The error message in fact means that the new document has an indexed property whose value is null (i.e. not present in the JSON).
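In either case, the fix is to drop the stale unique index; the name name_1 here comes from the getIndexes() output above:
> db.servers.dropIndex("name_1")
Or, if the field is genuinely optional but should be unique when present, rebuild the index as sparse so documents missing the field are not indexed under null:
> db.servers.ensureIndex({ "name": 1 }, { unique: true, sparse: true })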