MongoDB - Duplicate Key Error After Changing Model To Allow Duplicates - mongodb

I am using the mean stack. In Mongoose I defined a model with these properties:
var personSchema = new mongoose.Schema({
  personName: { type: String, unique: true, required: true, index: true },
  start: { type: Date },
  end: { type: Date }
});
However, when testing I realised I had made a mistake and that personName should not be unique. I removed the unique: true property and restarted MongoDB and the app.
However, I still get the duplicate key error when submitting.
Can anyone tell me what I'm doing wrong?

You might have created an index for the personName field.
Remove the index associated with the personName field and try again; it will work.
Reason:
While personName had unique: true, Mongoose created a unique index on that field in MongoDB. Removing unique: true from the schema does not drop the existing index, so when you try to insert a record with a personName that is already in the DB, MongoDB still throws the duplicate key error.
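A minimal sketch of removing the stale index in the mongo shell (the collection name people and the index name personName_1 are assumptions; check getIndexes() for the actual names in your database):

```javascript
// In the mongo shell — collection and index names here are assumptions:
db.people.getIndexes()               // find the index created for personName
db.people.dropIndex("personName_1")  // drop it; duplicate inserts then succeed
```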

Related

Uncaught Error: After filtering out keys not in the schema, your modifier is now empty

Hi everyone. I am working on an attendance system using meteor.js, so I have two collections: subject and student. In this system, a form is submitted with a selected studentID and subjectCode for subject enrollment. However, I got this error when I submitted: Uncaught Error: After filtering out keys not in the schema, your modifier is now empty.
The subject schema is like this
const subjectSchema = new SimpleSchema({
  subjectCode: {
    type: String,
    label: "Subject Code",
    index: true,
    unique: true
  },
  subjectName: {
    type: String,
    label: "Subject Name",
    index: true,
    unique: true
  },
  enrollment: {
    type: Array,
    optional: true
  },
  'enrollment.$': String,
});
For actions
Template.enroll.events({
  'submit form': function(e) {
    e.preventDefault();
    var name = $(e.target).find('[name=studentID]').val();
    var subjectCode = $(e.target).find('[name=subjectCode]').val();
    subject.update(
      {subjectCode: subjectCode},
      {$push: {"enrollment.$": name}});
  }
});
If anyone could help me out here I would really appreciate it.
Thanks a lot.
The solution is here: Meteor Forum.
The code should be like this:
subject.update(
  {subjectCode: subjectCode},
  {$addToSet: {"enrollment": name}});
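As a plain-JavaScript analogy of why $addToSet fixes the duplicate problem (illustrative only, not Meteor or Mongo code): $push always appends, while $addToSet appends only when the value is not already present.

```javascript
// Plain-JS sketch of the two update modifiers' semantics (illustrative only)
function push(arr, value) {
  return [...arr, value];               // always appends, like $push
}

function addToSet(arr, value) {
  return arr.includes(value) ? arr : [...arr, value]; // like $addToSet
}

console.log(push(['s001'], 's001'));     // [ 's001', 's001' ] — duplicate kept
console.log(addToSet(['s001'], 's001')); // [ 's001' ] — duplicate ignored
```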

Duplicate Key Error on MongoDB Model, Even When Model is Not Enforcing Uniqueness for that Value

In my app I have an endpoint that allows a user to create a new document by simply passing something like this:
{ name: { long: "some name" } }
Now, the relevant portion of the model for this document looks like this:
name: {
  long: {
    type: String,
    trim: true
  },
  short: {
    type: String,
    trim: true
  }
}
As you can see, I don't have "short" set to "unique: true". However, the user is getting this error:
"errmsg": "E11000 duplicate key error collection: hr.agencies index: name.short_1 dup key: { : null }"
So, clearly the problem here is that once you have more than one "name.short" with a value of null, it's producing a dupe error. However, since I don't have unique set to true in the model, I'm not sure why it's enforcing this as a rule.
What could the issue be here, and how can I resolve it? Is there a way I can explicitly say, don't enforce uniqueness on this field?
Try removing the index from the short key using:
db.collection.dropIndex({ "name.short": 1 })
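If you still want uniqueness among documents that actually have a name.short value (an assumption about your intent, not part of the original answer), another option is to recreate the index as a sparse unique index, which excludes documents missing the field:

```javascript
// mongo shell sketch — only if you do want uniqueness for non-null values.
// The collection name comes from the error message ("hr.agencies"):
db.agencies.dropIndex({ "name.short": 1 })
db.agencies.createIndex({ "name.short": 1 }, { unique: true, sparse: true })
```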

Preventing duplicate mongoDB entries

I have a mongoDB with a mongoose schema:
const newsSchema = new Schema({
  serverid: Number,
  resetid: Number,
  newsid: Number,
  timestamp: Number,
  type: Number,
  win: Number,
  attacker_num: Number,
  attacker_name: String,
  defender_num: Number,
  defender_name: String,
  result1: Number,
  result2: Number,
  a_tag: String,
  d_tag: String,
  killhit: Number
});
Below is what the CSV from the API looks like, which I insert into my mongoDB:
9,672,22697434,1408587629,5,1,351,LaFing at SoF,9,ReDflag,10,0,SoL,LaF,0
9,672,22697435,1408587629,5,1,377,Commorragh,9,ReDflag,10,0,PDM,LaF,0
9,672,22697436,1408587629,5,1,589,The IX Kiss,9,ReDflag,10,0,SoL,LaF,0
Field #3 from the API is unique; it is never duplicated. In my schema it is called newsid. When my script updates the database from the feed, is there a way to prevent it from inserting a row whose newsid is already in the database?
A unique key constraint would do exactly what you want.
The unique key can be set in mongoose with either the schema field options:
const s = new Schema({newsid: {type: Number, unique: true}});
or by the index method:
Schema.path('newsid').index({unique: true});
If an attempt is made to create a document that already has an entry for that key, an error will be thrown.
Note: violating the constraint returns an E11000 error from MongoDB when saving, not a Mongoose validation error.
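Because the violation surfaces as a MongoDB error with code 11000 rather than a Mongoose validation error, a common pattern is to check the error code when saving. A minimal sketch (the error object here is simulated; in practice it comes from the save/insert callback or rejected promise):

```javascript
// Sketch: distinguish a duplicate-key error from other save failures.
function isDuplicateKeyError(err) {
  return Boolean(err) && err.code === 11000;
}

// Simulated error shaped like the one the driver returns on a violation:
const dupErr = {
  code: 11000,
  message: 'E11000 duplicate key error collection: db.news index: newsid_1'
};

console.log(isDuplicateKeyError(dupErr));         // true
console.log(isDuplicateKeyError(new Error('x'))); // false
```

With this check, the script can skip rows whose newsid already exists instead of aborting the whole import.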

Got duplicate key error dup key: { : undefined }

I have an array field called udids in Meteor.users schema, which should contains unique elements. This is how I defined the index using SimpleSchema and Collection2:
new SimpleSchema({
  ...
  udids: {
    type: Array,
    index: true,
    unique: true,
    optional: true,
    sparse: true,
  },
  'udids.$': {
    type: String,
  },
  ...
})
However, when I start the app, I got this error: E11000 duplicate key error collection: meteor.users index: c2_udids dup key: { : undefined }.
I tried searching for the documents with udids = undefined in the database: db.users.find({ udids: { $type: 6 } }) ($type: 6 for undefined value) but it returns nothing.
The error message is a bit unclear so I had to guess the reason why. I found out that the current database already has some users with udids = []. I'm writing a migration script to unset this field from those users. Hopefully this will help others who have the same problem as me.
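The migration the asker describes could look something like this in the mongo shell (a sketch, assuming only the empty arrays need unsetting):

```javascript
// mongosh sketch: unset udids on users where it is an empty array
db.users.updateMany({ udids: { $size: 0 } }, { $unset: { udids: "" } })
```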
I've not tested this, but it should work.
I used Meteor.users as the collection name; replace it with whichever collection you want to run the validation against.
It makes use of a custom function to find at least one doc whose udids contains the field's value.
If you don't have access to the collection on the client side, you can edit the custom function and have it handled asynchronously.
new SimpleSchema({
  ...
  udids: {
    optional: true,
    type: [String], // an array of strings, e.g. ['A', 'B']
    custom: function() {
      // this.value = the current field's value.
      // The query below checks whether at least one doc's udids array
      // already contains the entered value; if found, return the error key.
      if (Meteor.users.findOne({
        udids: {
          $in: [this.value]
        }
      })) {
        return "duplicateMsg";
      }
    }
  },
  ...
});
SimpleSchema.messages({ duplicateMsg:"Udid already exists"});

Grails 3 and GORM 6 for MongoDB - duplicated key error

Environment:
Grails 3.2.9
GORM 6.1.2 for MongoDB 3.4.2
This is my (simplified) domain class:
class Cluster {
  String name
  String slug

  static constraints = {
    name blank: false, unique: true
    slug blank: false, unique: true, validator: { return it == it.toLowerCase().replaceAll(/[^\w-]/, '') }
  }

  static mapping = {
    collection 'Cluster'
    id name: 'slug'
  }
}
As you can see I mapped the slug property to be the document _id.
I can successfully add a document with
Cluster cluster = new Cluster(name: 'Dallas', slug: 'dal05')
cluster.insert(failOnError: true)
and everything works fine. But if I execute the same insert command again I get a duplicated key exception:
com.mongodb.MongoBulkWriteException: Bulk write operation error on server localhost:27017. Write errors: [BulkWriteError{index=0, code=11000, message='E11000 duplicate key error index: db.Cluster.$_id_ dup key: { : "dal05" }', details={ }}]
whereas I would have expected a simple validation error stating that the key is duplicated.
Oddly, although the unique constraint fails to trigger a validation error, validation is correctly triggered for the other two constraints (an empty value, or e.g. 'Dal05', since capital letters are not allowed).
Without mapping the id on slug property, so leaving the default assigned logic, everything works as expected.
Am I missing something? Thanks in advance!
It seems this is actually a bug, scheduled to be fixed in upcoming GORM release 6.1.5.
Ref. issue: https://github.com/grails/grails-data-mapping/issues/951