Create an index and delete the duplicate documents in MongoDB

I want to create an index with ensureIndex on the name field, using the dropDups parameter, because my collection books has two documents whose name is "0book".
db.books.ensureIndex({name:1},{unique:true,dropDups:true})
{
"createdCollectionAutomatically" : false,
"numIndexesBefore" : 1,
"errmsg" : "exception: E11000 duplicate key error index:
foobar.books.$name_1 dup key: { : \"0book\" }",
"code" : 11000,
"ok" : 0
}
Instead, it returns the errmsg shown above.
I don't know how to use the "dropDups" parameter when creating an index in MongoDB.
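If you are on MongoDB 3.0 or later, dropDups is no longer honoured (as noted in the dropDups question further down), so the duplicates have to be found and removed by hand before the unique index is built. A minimal sketch, assuming the books collection above, that lists each duplicated name together with the _ids that share it:
db.books.aggregate([
  { $group: { _id: "$name", count: { $sum: 1 }, ids: { $push: "$_id" } } },
  { $match: { count: { $gt: 1 } } }
])
Each returned document groups the _ids that share a name (here the two "0book" documents), so all but one of them can be removed before running ensureIndex with unique: true.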

Related

Unable to shard a mongodb collection with existing index. Getting "couldn't find valid index for shard key"

I have a collection with the following index...
{
"feedId" : 1,
"timekey" : 1,
"entity.samplingRate" : 1,
"endTimekey" : 1,
"geo" : "2dsphere"
}
and I'm trying to shard the collection using the following shard key...
{
"feedId": 1,
"timekey": 1,
"entity.samplingRate": 1
}
which is giving me the following error...
{
"ok" : 0,
"errmsg" : "couldn't find valid index for shard key",
"code" : 96,
...
}
Why doesn't this work? Is it because "geo" is a "2dsphere" index? Or is it because of the "." in "entity.samplingRate"? It is not an index on an array (i.e. "entity" is a subobject, not an array). Is there something else going on here?
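One thing worth trying, sketched below under the assumption that the existing compound index cannot act as the shard-key index because of its "2dsphere" component, is to create a plain compound index whose fields exactly match the shard key and then re-run shardCollection (the namespace "mydb.feeds" is a placeholder):
// Hypothetical names; substitute your own database and collection.
db.feeds.createIndex({ "feedId": 1, "timekey": 1, "entity.samplingRate": 1 })
sh.shardCollection("mydb.feeds", { "feedId": 1, "timekey": 1, "entity.samplingRate": 1 })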

Indexing array/subobject in MongoDB causes duplicate key error

I have a collection where I will have a _children attribute like this:
{
_children: {
videoTags: [ { id: '1', name: 'one'}, { id: '2', name: 'two'} ],
},
a: 10
}
Since I WILL search in videoTags, I create an index as such:
> db.test4.createIndex({ "_children.videosTags.id" : 1 }, { "unique" : true } );
{
"createdCollectionAutomatically" : false,
"numIndexesBefore" : 1,
"numIndexesAfter" : 2,
"ok" : 1
}
Trouble is, I can no longer add anything to that collection, since I get a duplicate key error. Here is how to reproduce it:
Step 1: insert to a collection
db.test4.insert({a:20})
WriteResult({ "nInserted" : 1 })
Step 2: make the index
db.test4.createIndex({ "_children.videosTags.id" : 1 }, { "unique" : true } );
{
"createdCollectionAutomatically" : false,
"numIndexesBefore" : 1,
"numIndexesAfter" : 2,
"ok" : 1
}
Step 3: try to insert again
db.test4.insert({a:30})
WriteResult({
"nInserted" : 0,
"writeError" : {
"code" : 11000,
"errmsg" : "insertDocument :: caused by :: 11000 E11000 duplicate key error index: wonder_1.test4.$_children.videosTags.id_1 dup key: { : null }"
}
})
I think the issue here is that there is already a record where _children.videoTags.id is not defined.
However, what I expected was that uniqueness would only be enforced when videoTags.id was actually specified. Instead, a missing value is treated as a "taken" key.
What am I doing that is stupidly wrong?
This works if I don't set unique to true, but I have the feeling I need to fix it properly...
There could be two reasons.
Other documents may already exist in the collection with the same _children.videosTags.id.
It's quite possible that more than one document is missing _children.videosTags.id or has it set to null.
Since you are creating a unique index, null or missing values will give you a hard time. The solution is to create either a sparse index or, if your MongoDB version is 3.2+, a partial index; see the documentation for partial indexes, as sketched below.
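A minimal sketch of both options, using the collection and field names from the question (use one or the other, since the key pattern is the same; partialFilterExpression requires MongoDB 3.2+):
// Option 1: sparse unique index -- documents missing the field are not indexed.
db.test4.createIndex(
  { "_children.videosTags.id": 1 },
  { unique: true, sparse: true }
)
// Option 2 (MongoDB 3.2+): partial unique index -- only documents where the
// field actually exists take part in the uniqueness check.
db.test4.createIndex(
  { "_children.videosTags.id": 1 },
  { unique: true, partialFilterExpression: { "_children.videosTags.id": { $exists: true } } }
)
With either index in place, db.test4.insert({a:30}) no longer collides on the missing key.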

dropDups: true not working in MongoDB

I am using the MongoDB shell, version 3.0.2.
I am trying to enforce a uniqueness constraint on the username field in the users collection.
This is what I ran:
db.users.ensureIndex({"username": 1},{unique: true})
It gave me following error:
{
"createdCollectionAutomatically" : false,
"numIndexesBefore" : 1,
"errmsg" : "exception: E11000 duplicate key error index: mybackend.users.$username_1 dup key: { : \"rahat\" }",
"code" : 11000,
"ok" : 0
}
Then I used dropDups: true in the command:
db.users.ensureIndex({"username": 1},{unique: true, dropDups: true})
But I still get the same error:
{
"createdCollectionAutomatically" : false,
"numIndexesBefore" : 1,
"errmsg" : "exception: E11000 duplicate key error index: mybackend.users.$username_1 dup key: { : \"rahat\" }",
"code" : 11000,
"ok" : 0
}
I also saw this SO link, but there the collection already had an existing index; mine does not have one:
db.users.getIndexes() ->
[
{
"v" : 1,
"key" : {
"_id" : 1
},
"name" : "_id_",
"ns" : "mybackend.users"
}
]
I looked for this issue on GitHub and restarted mongod, but to no avail. What am I doing wrong? I think I am making a silly mistake. Please help.
The drop-duplicates functionality (dropDups) on index creation is no longer supported since MongoDB 3.0; see the compatibility changes page for the 3.0 release. As your output shows, the option is simply ignored and the index build still fails on the first duplicate, so you have to remove the duplicate documents yourself before creating the unique index.
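A minimal sketch of a manual replacement for dropDups, assuming the users collection above; it keeps the first document found for each duplicated username, removes the rest, and then builds the unique index:
// Group documents by username and collect the _ids that share each value.
db.users.aggregate([
  { $group: { _id: "$username", count: { $sum: 1 }, ids: { $push: "$_id" } } },
  { $match: { count: { $gt: 1 } } }
]).forEach(function (dup) {
  // Keep the first _id, remove the others.
  dup.ids.slice(1).forEach(function (id) {
    db.users.remove({ _id: id });
  });
});
// With the duplicates gone, the unique index builds cleanly.
db.users.ensureIndex({ "username": 1 }, { unique: true })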

Add field that is unique index to collection in MongoDB

I'm trying to add a username field to documents in a 'users' collection, and I'd like it to be a unique index. (So far, we've been using email addresses for login but we'd like to add a username field as well.) However, running db.users.ensureIndex({username:1},{unique:true}) fails because mongo considers all the unset usernames to be duplicates and therefore not unique. Anybody know how to get around this?
Show the current users and username if they have one:
> db.users.find({},{_id:0,display_name:1,username:1})
{ "display_name" : "james" }
{ "display_name" : "sammy", "username" : "sammy" }
{ "display_name" : "patrick" }
Attempt to make the 'username' field a unique index:
> db.users.ensureIndex({username:1},{unique:true})
{
"err" : "E11000 duplicate key error index: blend-db1.users.$username_1 dup key: { : null }",
"code" : 11000,
"n" : 0,
"connectionId" : 272,
"ok" : 1
}
It doesn't work because both james and patrick have username:null (the field is unset, which the index treats as null).
Let's set patrick's username to 'patrick' to eliminate the duplicate null value.
> db.users.update({display_name: 'patrick'}, { $set: {username: 'patrick'}});
> db.users.ensureIndex({username:1},{unique:true})
> db.users.getIndexes()
[
{
"v" : 1,
"key" : {
"_id" : 1
},
"ns" : "blend-db1.users",
"name" : "_id_"
},
{
"v" : 1,
"key" : {
"username" : 1
},
"unique" : true,
"ns" : "blend-db1.users",
"name" : "username_1"
}
]
Now it works!
To clarify the question, what I'd like is to be able to make username a unique index without having to worry about all the documents that have username still set to null.
Try creating a unique sparse index:
db.users.ensureIndex({username:1},{unique:true,sparse:true})
As per the docs:
You can combine the sparse index option with the unique indexes option
so that mongod will reject documents that have duplicate values for a
field, but that ignore documents that do not have the key.
Note, though, that sparse only skips documents that are missing the field entirely; documents that do have the field with an explicit null value are still indexed, so two explicit nulls will still collide.
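If explicit null values (rather than missing fields) are the problem and you are on MongoDB 3.2+, a partial index is a possible alternative; a minimal sketch that only indexes usernames that are actually strings, so both missing fields and explicit nulls are left out of the uniqueness check:
db.users.createIndex(
  { "username": 1 },
  { unique: true, partialFilterExpression: { "username": { $type: "string" } } }
)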

mongo _id field duplicate key error

I have a collection where the _id field is an IP address stored as a String.
I'm using mongoose, but here's the error on the console:
$ db.servers.remove()
$ db.servers.insert({"_id":"1.2.3.4"})
$ db.servers.insert({"_id":"1.2.3.5"}) <-- Throws dup key: { : null }
Likely, it's because you have an index that requires a unique value for one of the fields as shown below:
> db.servers.remove()
> db.servers.ensureIndex({"name": 1}, { unique: 1})
> db.servers.insert({"_id": "1.2.3"})
> db.servers.insert({"_id": "1.2.4"})
E11000 duplicate key error index: test.servers.$name_1 dup key: { : null }
You can see your indexes using getIndexes() on the collection:
> db.servers.getIndexes()
[
{
"v" : 1,
"key" : {
"_id" : 1
},
"ns" : "test.servers",
"name" : "_id_"
},
{
"v" : 1,
"key" : {
"name" : 1
},
"unique" : true,
"ns" : "test.servers",
"name" : "name_1"
}
]
I was confused by exactly the same error today, and later figured it out. It was because I had removed an indexed property from a mongoose schema but did not drop the corresponding index from MongoDB. The error message in fact means that the new document has an indexed property whose value is null (i.e. the property is not present in the JSON).
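In both answers the root cause is a leftover unique index, so a minimal sketch of the fix (assuming the unique index on name is genuinely no longer wanted) is simply to drop it and retry the inserts:
// Drop the stale unique index by its name, then the inserts succeed.
db.servers.dropIndex("name_1")
db.servers.insert({ "_id": "1.2.3.4" })
db.servers.insert({ "_id": "1.2.3.5" })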