Mongo bulk insert while avoiding duplicate values for multiple keys - mongodb

I have two collections: items with 120,000 entries and itemHistories with more than 20 million entries. I periodically update all items and itemHistories by fetching an API that lists all history data for an item.
What I need to do is batch insert the history data into the collection while avoiding duplicates. The history API returns only date, info, item_id values.
Is it possible to batch insert in Mongo so that it doesn't add duplicates for the combination of two values (date, item_id)? In other words, if there already is an entry with the same date and item_id, don't add it. Essentially, date should be unique per item_id: duplicate date values are allowed in the collection, but only if the item_id differs for each of the duplicates.
One item can have close to a million entries, so I don't think fetching the history from the collection and comparing it to the API response is going to be optimal.
My current idea was to add another key to the collection called hash, computed as md5(date, info, item_id), and make it a unique index. Suggestions?

After a little digging in the Mongoose and MongoDB documentation, I found out that there is a thing called a unique compound index, which solves my problem and answers this question. Since I've never used indexes before, I didn't know such a thing was possible.
You can also enforce a unique constraint on compound indexes. If you use the unique constraint on a compound index, then MongoDB will enforce uniqueness on the combination of the index key values.
For example, to create a unique index on the groupNumber, lastname, and firstname fields of the members collection, use the following operation in the mongo shell:
db.members.createIndex( { groupNumber: 1, lastname: 1, firstname: 1 }, { unique: true } )
Source: https://docs.mongodb.org/manual/core/index-unique/
In my case, I can use the code below to avoid duplicates:
db.itemHistories.createIndex( { date: 1, item_id: 1 }, { unique: true } )
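With that index in place, the batch insert itself can simply let MongoDB reject the duplicates. A minimal sketch for the mongo shell (the sample documents are made up); insertMany with ordered: false keeps inserting the remaining documents when one of them hits the unique index:
try {
  db.itemHistories.insertMany(
    [
      { date: ISODate("2016-01-01T00:00:00Z"), info: "a", item_id: 1 },
      { date: ISODate("2016-01-01T00:00:00Z"), info: "b", item_id: 1 },  // same date + item_id: rejected
      { date: ISODate("2016-01-02T00:00:00Z"), info: "c", item_id: 1 }
    ],
    { ordered: false }  // keep going past duplicate key errors
  );
} catch (e) {
  // e is a BulkWriteError; e.writeErrors lists the E11000 duplicates that were skipped
  print("skipped " + e.writeErrors.length + " duplicate(s)");
}
The non-duplicate documents still get inserted; only the offending ones are dropped, so no extra hash field is needed.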

Related

How to index and sort with pagination using a custom field in MongoDB, e.g. name instead of _id

https://scalegrid.io/blog/fast-paging-with-mongodb/
Example:
{
  _id,
  name,
  company,
  state
}
I've gone through the two scenarios explained in the link above, and it says sorting by the object id gives good performance when retrieving and sorting results. Instead of the default sort on the object id, I want to index my own custom fields "name" and "company" and sort and paginate on those two fields (both fields hold string values).
I am not sure how we can use $gt or $lt on a name; I'm currently blocked on how to provide pagination when a user sorts by name.
How do I index and paginate on two fields?
The answer to your question is:
db.Example.createIndex( { name: 1, company: 1 } )
As for pagination, the link you shared in your question explains it well enough. For example:
db.Example.find({ name: "John", company: "Ireland" }).limit(10);
For sorting:
db.Example.find().sort({ name: 1, company: 1 }).skip(userPassedOffset).limit(userPassedPageSize);
If the user requests documents 21-30 after sorting on name and then company, both in ascending order:
db.Example.find().sort({ name: 1, company: 1 }).skip(20).limit(10);
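skip/limit gets slow for deep pages, which is what the article you linked addresses with range queries. If you want the $gt variant on a non-unique field such as name, a common approach is to page on the pair (name, _id) so that ties on name are broken by _id. A rough sketch under that assumption (not taken from the article itself):
// An index that matches both the sort order and the range condition
db.Example.createIndex({ name: 1, _id: 1 });

// First page
var page = db.Example.find().sort({ name: 1, _id: 1 }).limit(10).toArray();
var last = page[page.length - 1];

// Next page: everything strictly after (last.name, last._id)
db.Example.find({
  $or: [
    { name: { $gt: last.name } },
    { name: last.name, _id: { $gt: last._id } }
  ]
}).sort({ name: 1, _id: 1 }).limit(10);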
For a basic understanding of indexing in MongoDB:
Indexes support the efficient execution of queries in MongoDB. Without indexes, MongoDB must perform a collection scan, i.e. scan every document in a collection, to select those documents that match the query statement. If an appropriate index exists for a query, MongoDB can use the index to limit the number of documents it must inspect.
Indexes are special data structures, that store a small portion of the collection’s data set in an easy to traverse form. The index stores the value of a specific field or set of fields, ordered by the value of the field.
Default _id Index
MongoDB creates a unique index on the _id field during the creation of a collection. The _id index prevents clients from inserting two documents with the same value for the _id field. You cannot drop this index on the _id field.
Create an Index
Syntax to execute in the mongo shell:
db.collection.createIndex( <key and index type specification>, <options> )
Ex:
db.collection.createIndex( { name: -1 } )
Use 1 for ascending and -1 for descending; the example above creates a descending index on name.
The above command only creates an index if an index of the same specification does not already exist.
Index Types
MongoDB provides different index types to support specific types of data and queries, but I would like to mention two important types:
1. Single Field
In addition to the MongoDB-defined _id index, MongoDB supports the creation of user-defined ascending/descending indexes on a single field of a document.
2. Compound Index
MongoDB also supports user-defined indexes on multiple fields, i.e. compound indexes.
The order of fields listed in a compound index has significance. For instance, if a compound index consists of { name: 1, company: 1 }, the index sorts first by name and then, within each name value, sorts by company.
Source for my understanding and this answer, and for more about MongoDB indexing: MongoDB Indexing
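To make the field-order point concrete, here is a small shell sketch (the "Acme" value is made up) showing which queries can use the { name: 1, company: 1 } index from above; a query that skips the leading field cannot:
db.Example.createIndex({ name: 1, company: 1 });

// Can use the index: both filter on the leading field, name
db.Example.find({ name: "John" });
db.Example.find({ name: "John", company: "Acme" });

// Cannot use the index efficiently: skips the leading field, so it falls back to a collection scan
db.Example.find({ company: "Acme" });

// explain() shows IXSCAN for the first two queries and COLLSCAN for the last one
db.Example.find({ company: "Acme" }).explain("executionStats");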

Validate uniqueness of a relationship model

I have an application where users can follow each other. Once this relationship is made, a document is added to the collection. That document has two fields, follower and followee. I want to prevent insertion of duplicate relationships. I do not want to query the db, wait for a promise, and then insert, as this seems like an inefficient approach. I'd rather stop a new document from being saved if its follower and followee match an existing document.
Look into creating a unique compound index:
db.members.createIndex( { follower: 1, followee: 1 }, { unique: true } )
The created index enforces uniqueness for the combination of follower and followee values.
A unique index ensures that the indexed fields do not store duplicate values; i.e. it enforces uniqueness for the indexed fields. By default, MongoDB creates a unique index on the _id field during the creation of a collection.
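With that index in place you can also make the write itself idempotent instead of checking first or handling the duplicate key error: an upsert on the (follower, followee) pair either creates the relationship or leaves the existing one alone. A sketch, assuming the relationship collection is called follows and followerId/followeeId hold the two user ids:
// The unique index also guards against races between concurrent requests
db.follows.createIndex({ follower: 1, followee: 1 }, { unique: true });

// Creates the relationship the first time; afterwards it matches the existing document
// and changes nothing (the equality fields from the filter are copied into the inserted document)
db.follows.updateOne(
  { follower: followerId, followee: followeeId },
  { $setOnInsert: { createdAt: new Date() } },
  { upsert: true }
);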

Purpose of Index in Mongoose Schema

I am trying to add unique documents to the collection. However, the problem is that I want to decide uniqueness based on two fields, so I found a solution for it online. What confuses me is: what is the purpose of index here? In an RDBMS, "index" is usually the word used for the row id; what does it mean here, and how does it affect uniqueness?
var patientSchema = mongoose.Schema({
  name: String,
  fatherOrHusbandName: String,
  address: String,
});
patientSchema.index({ email: 1, sweepstakes_id: 1 }, { unique: true });
Indexes support the efficient execution of queries in MongoDB. Without indexes, MongoDB must perform a collection scan, i.e. scan every document in a collection, to select those documents that match the query statement. If an appropriate index exists for a query, MongoDB can use the index to limit the number of documents it must inspect.
For more details, see this documentation.
Don't be confused when you read that document: it uses createIndex while your code uses index. createIndex is the MongoDB method, and index is the Mongoose method that internally executes the corresponding MongoDB operation.
If you have no data in your database yet, then this will work fine:
patientSchema.index({ email: 1, sweepstakes_id: 1 }, { unique: true });
But if you already have data in the database with duplicate entries, the old way was to add the dropDups option when making the index unique.
One thing you should know before using dropDups: if you set dropDups: true and you have documents with duplicate values, MongoDB keeps one of them and deletes all the other duplicates. (Also note that dropDups was removed in MongoDB 3.0; on current versions you have to remove the duplicates yourself before creating the unique index.)
Like:
patientSchema.index({ email: 1, sweepstakes_id: 1 }, { unique: true, dropDups: true });
It actually has the same purpose as an index in an RDBMS; you just have to think in Mongo terminology. In MongoDB, instead of tables you have collections, and instead of rows you have documents.
With Mongoose you defined a schema for the collection 'Patient', and within this collection (the index belongs to the Patient collection) you defined a unique index on the document properties email and sweepstakes_id.
Every time you save a document to the Patient collection, MongoDB makes sure that the combination of the email and sweepstakes_id properties is unique across all other documents.
So, instead of 'rows', think in 'documents'
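If it helps, here is a rough sketch of how the duplicate shows up in application code once the index has been built. The email and sweepstakes_id fields are added to the schema here (the snippet in the question did not show them) and the id value is just a placeholder; 11000 is MongoDB's duplicate key error code:
var mongoose = require('mongoose');

var patientSchema = new mongoose.Schema({
  name: String,
  email: String,
  sweepstakes_id: mongoose.Schema.Types.ObjectId
});
patientSchema.index({ email: 1, sweepstakes_id: 1 }, { unique: true });

var Patient = mongoose.model('Patient', patientSchema);

var someSweepstakesId = new mongoose.Types.ObjectId();  // placeholder value
new Patient({ email: 'bob@example.com', sweepstakes_id: someSweepstakesId })
  .save()
  .catch(function (err) {
    if (err.code === 11000) {
      // a document with this email + sweepstakes_id combination already exists
      console.log('duplicate entry');
    }
  });
Keep in mind that Mongoose only asks MongoDB to build the index when the model is compiled, so duplicates are rejected only once the index actually exists on the collection.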

How can I set multiple fields as primary key in MongoDB?

I am trying to create a collection with 50+ fields. I understand that the purpose of the primary key is to uniquely identify a record. Since the primary key is the _id in MongoDB, which gets created automatically, isn't it obvious that all my records, including duplicates, would go into my DB with a unique _id for every record? Tell me where I'm going wrong; other articles and discussions are more confusing.
How do I set any one or more of the other fields as a primary key? I don't want the default _id as the primary key.
In what way are compound indexes different from a compound/primary key?
There is no such notion as a primary key in MongoDB. Terminology matters. Not knowing the terminology is a sure sign someone hasn't read the docs or at least not carefully.
A document in a collection must have an _id field, which may be, and by default is, an ObjectId. This field has an index on it which enforces a unique constraint, so there cannot be any two documents with the same value or combination of values in the _id field. Which, by what you describe, presumably is what you want.
My suggestion is to reuse the default _id as often as you can. Additional indices are expensive (RAM-wise). You have two options here: either use a different single value as _id or use multiple values, if the cardinality of the single field isn't enough.
Let us assume you want a clickstream recorded per user. Obviously, you need the unique user. But that alone would not be enough, since each user could then only have one entry. Since you need a timestamp for each click anyway, you move it into the _id field:
{
  _id: {
    user: "some user",
    ts: new ISODate()
  },
  ...
}
Unless your Mongo installation is sharded, you can create a unique compound index on multiple fields and use this as a surrogate composite primary key.
db.collection.createIndex( { a: 1, b: 1 }, { unique: true } )
Alternatively, you could create your own _id values. However, as the default ObjectId also encodes a timestamp, I personally find it useful for auditing purposes.
Regarding the difference between a compound index and a composite primary key: by definition, a primary key cannot be defined on missing (null) fields and there can only be one primary key per document. In MongoDB, only the _id field can be used as a primary key, as it is added by default when missing. In contrast, a compound index can be applied to missing fields by defining it as sparse, and you can define multiple compound indexes on the same document.
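A quick shell sketch of the embedded-_id approach described above (the collection name and values are made up); the second insert uses the same user/timestamp pair and is rejected by the default unique index on _id:
// First click for this user/timestamp pair: inserted
db.clicks.insertOne({
  _id: { user: "some user", ts: ISODate("2016-01-01T10:00:00Z") },
  page: "/home"
});

// Same user + ts again: fails with an E11000 duplicate key error on _id
db.clicks.insertOne({
  _id: { user: "some user", ts: ISODate("2016-01-01T10:00:00Z") },
  page: "/about"
});
One caveat with embedded documents in _id: field order matters for equality, so { user: ..., ts: ... } and { ts: ..., user: ... } count as different keys.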

MongoDB: Unique Key in Embedded Document

Is it possible to set a unique key for a key in an embedded document?
I have a Users collection with the following sample documents:
{
  Name: "Bob",
  Items: [
    {
      Name: "Milk"
    },
    {
      Name: "Bread"
    }
  ]
},
{
  Name: "Jim"
}
Is there a way to create an index on the property Items.Name?
I got the following error when I tried to create an index:
> db.Users.ensureIndex({"Items.Name": 1}, {unique: true});
E11000 duplicate key error index: GroceryGuruApp.Users.$Items.Name_1  dup key: { : null }
Any suggestions? Thank you!
Unique indexes are enforced across the whole collection. To enforce uniqueness and other constraints within a single document you must do it in client code. (Probably virtual collections would allow that; you could vote for the feature.)
What you are trying to do is create a unique index on the key Items.Name. Documents that have no Items.Name value at all (such as the "Jim" document) are indexed with a null key, so having more than one of them violates the unique constraint across the collection.
You can create a unique compound sparse index to accomplish something like what you are hoping for. It may not be the best option (doing it client-side might still be better), but it can do what you're asking, depending on your specific requirements.
To do it, you'll need to create another field at the same level as Name: "Bob" that is unique to each top-level record (it could be FirstName + LastName + Address; we'll call this key Identifier).
Then create an index like this:
db.Users.createIndex({ Identifier: 1, "Items.Name": 1 }, { unique: true, sparse: true })
A sparse index will ignore documents that don't have the field, so that should get around your null key issue. Combining your unique Identifier and Items.Name in a compound unique index should ensure that you can't have the same item name twice per person.
Although I should add that I've only been working with Mongo for a couple of months, so my understanding could be off; this is based on observed behavior rather than on the documentation.
More on MongoDB Indexes
Compound Keys Indexes
Sparse Indexes
An alternative would be to model the items as a hash with the item name as the key.
Items: { "Milk": 1, "Bread": 1 }
I'm not sure whether you're trying to use the index for performance or purely for the constraint. The right approach depends on your use cases and on whether the atomic operations are enough to keep your data consistent.
The index will be across all Users, and since you asked for 'unique', no user will be able to have two items with the same name AND no two users will be able to have an item with the same name.
Is that what you want?
Furthermore, it appears that it's objecting to two Users having a 'null' value for Items.Name; clearly Jim does, so is there another record like that?
It would be unusual to require uniqueness on an indexed collection like this.
MongoDB does allow unique indexes that index only the first of each value (see http://www.mongodb.org/display/DOCS/Indexes#Indexes-DuplicateValues), but I suspect the real solution is not to require uniqueness in this case.
If you want to ensure uniqueness only within the Items for a single user you might want to try the $addToSet option. See http://www.mongodb.org/display/DOCS/Updating#Updating-%24addToSet
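A minimal sketch of the $addToSet approach, using the documents from the question and the current updateOne shell method (the linked page shows the older update syntax); the second call is a no-op because an identical element is already in the array:
// Adds { Name: "Milk" } only if an identical element is not already in Bob's Items array
db.Users.updateOne(
  { Name: "Bob" },
  { $addToSet: { Items: { Name: "Milk" } } }
);

// Running the same update again leaves the array unchanged (modifiedCount is 0)
db.Users.updateOne(
  { Name: "Bob" },
  { $addToSet: { Items: { Name: "Milk" } } }
);
Note that $addToSet compares whole elements, so { Name: "Milk", Qty: 2 } would still be added as a separate entry.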
You can use findAndModify to create a sequence/counter function.
function getNextSequence(name) {
  var ret = db.counters.findAndModify({
    query: { _id: name },
    update: { $inc: { seq: 1 } },
    new: true,
    upsert: true
  });
  return ret.seq;
}
Then use it whenever a new id is needed...
db.users.insert({
  _id: getNextSequence("userid"),
  name: "Sarah C."
})
This is from http://docs.mongodb.org/manual/tutorial/create-an-auto-incrementing-field/. Check it out.