How to do bulkCreate with updateOnDuplicate when there are composite keys? - postgresql

I'm using Postgres with Sequelize.
I need to do a bulk upsert, and the table is partitioned, so I can't use "INSERT INTO ... ON CONFLICT".
It looks like I can use bulkCreate with the 'updateOnDuplicate' option, but I don't know how to define multiple keys. There is no primary key in the table, but I know two columns together make records unique.
In this case, how do I do a bulk upsert?
Model.bulkCreate(dataToUpdate, { updateOnDuplicate: ["user_id", "token", "created_at"] })

In my case I was using an array of JSON objects (name/value pairs) in the fields property of my model's indexes, which was causing the issue. Updating it to an array of strings fixed it for me in Sequelize version 6.
indexes: [
  {
    name: "payment_clearance_pkey",
    unique: true,
    fields: ["payment_module_id", "account_id"]
  },
]
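For context, a minimal sketch of how that index definition sits next to the bulkCreate call from the question (the model name, column types, and the amount column are placeholders; on Postgres, updateOnDuplicate is implemented with a conflict clause that needs a unique index over the columns identifying a record, which is what the indexes entry above provides):

// Sketch only: "payment_clearance" and its columns are placeholder names.
const { Sequelize, DataTypes } = require("sequelize");
const sequelize = new Sequelize(process.env.DATABASE_URL);

const PaymentClearance = sequelize.define("payment_clearance", {
  payment_module_id: { type: DataTypes.INTEGER, allowNull: false },
  account_id: { type: DataTypes.INTEGER, allowNull: false },
  amount: DataTypes.DECIMAL
}, {
  indexes: [
    {
      name: "payment_clearance_pkey",
      unique: true,
      // the two columns that together identify a record
      fields: ["payment_module_id", "account_id"]
    }
  ]
});

// bulkCreate returns a promise; updateOnDuplicate lists the columns to overwrite
// when the unique index on (payment_module_id, account_id) reports a conflict.
// dataToUpdate is the array of row objects from the question.
PaymentClearance.bulkCreate(dataToUpdate, {
  updateOnDuplicate: ["amount"]
});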

Related

Cosmos DB unique key

Is it possible to create a unique constraint on a property in a sub-collection in JSON?
I have JSON like this:
{
  "id": 111,
  "DataStructs": [
    {
      "Name": "aaaaa",
      "DataStructCells": [
        {
          "RowName": "Default",
          "ColumnName": "Perfect / HT",
          "CellValue": "0.1"
        },
        {
          "RowName": "Default",
          "ColumnName": "100% / HT",
          "CellValue": "0.2"
        }
      ]
    }
  ]
}
I wanted to add a unique key to prevent adding two cells with the same RowName: Default.
When I created the collection I added the unique key /DataStructs/DataStructCells/RowName, but it doesn't work.
No, that is impossible. Unique keys only work across different documents within a logical partition. To achieve your requirement, you can split your array into separate documents in the same logical partition. You can refer to: How do I define unique keys involving properties in embedded arrays in Azure Cosmos DB?
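A rough sketch of that workaround (the id, pk, and structName values are hypothetical): each cell becomes its own document, the container uses /pk as the partition key, and a unique key policy over /structName and /RowName then rejects a second "Default" row for the same struct in the same logical partition:

{
  "id": "111-aaaaa-Default",
  "pk": "111",
  "structName": "aaaaa",
  "RowName": "Default",
  "ColumnName": "Perfect / HT",
  "CellValue": "0.1"
}

Inserting another document with pk "111", structName "aaaaa", and RowName "Default" would then fail with a conflict, which is the behavior the nested array could not give you.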

programmatically sort multiple columns at a time

I want to sort multiple columns at a time programmatically. I am NOT using the default sorting method, which is to click on the header name to sort and Ctrl/Shift + header name to sort multiple columns. I have a separate option in the column options menu which is used to sort that specific column. For a single-column sort, I am using the following API.
params.api.setSortModel([
  {
    colId: params.column.colId,
    sort: "asc",
  }
]);
Is there any API or any way to sort multiple columns?
You need to construct a sortModel array which looks like this:
var sortModel = [
  {
    "colId": "athlete",
    "sort": "desc"
  },
  {
    "colId": "country",
    "sort": "asc"
  }
]
Then you can use the sorting API the same way you already have, passing this model, which is just an array containing more than one column instead of the single column in your question:
params.api.setSortModel(sortModel);
Example on sorting api.
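If the goal is to accumulate sorts as the user picks columns from the menu, a minimal sketch could look like this (addColumnToSort is a made-up helper; it assumes getSortModel is available alongside the setSortModel call used above, which is the case for the older ag-Grid sorting API these snippets use; newer versions expose the same idea through columnApi.applyColumnState):

// Sketch: invoked from a custom "sort" item in the column options menu.
// "params" is assumed to carry the grid api and the clicked column, as in the question.
function addColumnToSort(params, direction) {
  // keep whatever is already sorted, dropping any stale entry for this column
  var sortModel = params.api.getSortModel().filter(function (s) {
    return s.colId !== params.column.colId;
  });

  // append the clicked column so several columns end up sorted together
  sortModel.push({ colId: params.column.colId, sort: direction });

  params.api.setSortModel(sortModel);
}

// usage: addColumnToSort(params, "asc");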

Unique index in mongoDB 3.2 ignoring null values

I want to add a unique index to a field, ignoring null values in the indexed field and ignoring the documents that are filtered out by a partialFilterExpression.
The problem is that sparse indexes can't be combined with partial indexes.
Also, adding a unique index adds a null value to the index key field, and hence those documents can't be ignored based on an $exists criterion in the partialFilterExpression.
Is it possible in MongoDB 3.2 to get around this situation?
I am adding this answer because I was looking for a solution and didn't find one. It may not answer this exact question, but it will help a lot of others out there like me.
For example, if the field that can be null is houseName and it is of type string, the solution can look like this:
db.collectionName.createIndex(
  { name: 1, houseName: 1 },
  { unique: true, partialFilterExpression: { houseName: { $type: "string" } } }
);
This will ignore the null values in the field houseName and still be unique.
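To illustrate the behaviour with some made-up documents: only documents where houseName is actually a string take part in the unique constraint, so nulls and missing values slide through:

db.collectionName.insert({ name: "A", houseName: "Rose" });  // indexed
db.collectionName.insert({ name: "A", houseName: "Rose" });  // fails: E11000 duplicate key
db.collectionName.insert({ name: "B", houseName: null });    // allowed: null is not a string, not indexed
db.collectionName.insert({ name: "C" });                     // allowed: field missing, not indexed
db.collectionName.insert({ name: "C" });                     // allowed again, for the same reason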
Yes, you can create a partial index in MongoDB 3.2.
Please see https://docs.mongodb.org/manual/core/index-partial/#index-type-partial
MongoDB recommends partial indexes over sparse indexes, so I suggest you drop your sparse index in favor of a partial one.
You can create a partial index in MongoDB 3.2.
For example, if ipaddress can be "" but values like "127.0.0.1" should be unique, the solution can look like this:
db.collectionName.createIndex(
  { "ipaddress": 1 },
  { "unique": true, "partialFilterExpression": { "ipaddress": { "$gt": "" } } }
);
This will ignore "" values in the ipaddress field and still enforce uniqueness.
To create the index in MongoDB Compass you must write the partial filter expression as JSON:
{
  "YourField": {
    "$exists": true,
    "$gt": "0",
    "$type": "string"
  }
}
To find the other supported types, see this link.
Yes, it can be a bit of a problem that the partial filter expression cannot contain any 'not' filters.
For those who may be interested in a C# solution for an index like this, here is an example.
We have a 'User' entity, which has a one-to-one 'relation' to a 'Doctor' entity.
This relation is represented by the optional, nullable field 'DoctorId' in the 'User' entity. In other words, the requirement is that a given 'Doctor' can be linked to only a single 'User' at a time.
So we need a unique index which fires an exception when something attempts to set DoctorId to a Guid that is already set for another 'User' entity. At the same time, multiple 'null' entries must be allowed for the 'DoctorId' field, since many users do not have a doctor attached to them.
The solution to build this kind of an index looks like:
var uniqueDoctorIdIndexDefinition = new IndexKeysDefinitionBuilder<User>()
    .Ascending(o => o.DoctorId);

var existsFilter = Builders<User>.Filter.Exists(o => o.DoctorId);
var notNullFilter = Builders<User>.Filter.Type(o => o.DoctorId, BsonType.String);
var andFilter = Builders<User>.Filter.And(existsFilter, notNullFilter);

var createIndexOptions = new CreateIndexOptions<User>
{
    Unique = true,
    Name = UniqueDoctorIdIndexName,
    PartialFilterExpression = andFilter,
};

var uniqueDoctorIdIndex = new CreateIndexModel<User>(
    uniqueDoctorIdIndexDefinition,
    createIndexOptions);

users.Indexes.CreateOne(uniqueDoctorIdIndex);
You will probably also have to specify the BsonType of the 'DoctorId' field directly in the mapping of the 'User' entity, using an attribute; for example, in our case it was:
[BsonRepresentation(BsonType.String)]
public Guid? DoctorId { get; set; }
I am quite sure there is a more elegant and compact solution to this problem, so I would be happy if somebody suggested it here.
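For reference, a shell-level sketch of roughly the same index (the collection and index names are made up; it assumes DoctorId is stored as a string, which is what the BsonRepresentation attribute above implies):

db.users.createIndex(
  { "DoctorId": 1 },
  {
    unique: true,
    name: "unique_doctor_id",   // placeholder for UniqueDoctorIdIndexName
    partialFilterExpression: { DoctorId: { $exists: true, $type: "string" } }
  }
);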
Here is an example that I modified from the mongoDB partial index documentation:
db.contacts.createIndex(
  { email: 1 },
  { unique: true, partialFilterExpression: { email: { $exists: true } } }
)
IMPORTANT
To use the partial index, a query must contain the filter expression (or a modified filter expression that specifies a subset of the filter expression) as part of its query condition.
You can see that queries such as:
db.contacts.find({ 'email': 'name@email.com' }).explain()
will indicate that they are doing an index scan, even if you don't specify {$exists: true}, because by specifying an email in your filter you're implicitly specifying a subset of the partialFilterExpression.
On the other hand, the following query will do a collection scan:
db.contacts.find({email: {$exists: false}})
WARNING
mythicalcoder's answer (currently the highest voted answer) is very misleading because it successfully creates a unique index, but the query planner will not generally be able to use the index you've created unless you add houseName: {$type: "string"} into your filter expression. This can have performance costs which you might not be aware of and can cause problems down the road.
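Concretely, for the houseName index from the first answer, the difference looks roughly like this (a sketch, assuming no other index on name exists):

// Collection scan: documents with a null or missing houseName are not in the
// partial index, so the planner cannot rely on it for this predicate.
db.collectionName.find({ name: "A" })

// Index scan: the extra $type clause matches the partialFilterExpression,
// so the unique partial index can be used.
db.collectionName.find({ name: "A", houseName: { $type: "string" } })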

Keeping default mongo _id and unique index of MongoDB

Is it good or bad practice to keep the standard "_id" generated by Mongo in a document as well as my own unique identifier such as "name"? Or should I just replace the generated _id with the actual name, so my documents would look like this:
{
  _id: 782yb238b2327b3,
  name: "my_name"
}
or just like this:
{
  _id: "my_name"
}
This depends on the scenario. There is nothing wrong with having your own unique ID; it may be a string or a number, depending entirely on your situation, as long as it's unique. The important thing is that you are in charge of it. You would want to add an index to it, of course.
For example, I have an additional numeric field called 'ID', because I required a sequential number as an identifier. Another use case may be that you're migrating an application and have to conform to a particular sequence pattern.
The sequences for the unique identifiers could easily be stored in a separate document/collection (see the sketch after the example below).
There is no issue with using the built-in _id if you have no requirement for a custom one. An interesting fact is that you can get the creation date out of the _id, which is always useful.
db.col.insert( { name: "test" } );
var doc = db.col.findOne( { name: "test" } );
var timestamp = doc._id.getTimestamp();
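The separate sequence document mentioned above could be sketched roughly like this (the counters collection and seq field are made-up names; findAndModify with $inc is the usual atomic counter pattern):

// Atomically fetch-and-increment the next value of the "users" sequence;
// upsert creates the counter document on first use.
var next = db.counters.findAndModify({
  query: { _id: "users" },
  update: { $inc: { seq: 1 } },
  new: true,
  upsert: true
});

// Keep the default ObjectId _id and store the sequential number in its own field.
db.col.insert({ ID: next.seq, name: "my_name" });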

mongoDB: unique index on a repeated value

I'm pretty new to MongoDB, so I figure this could be a misunderstanding of general usage, so bear with me.
I have a document schema I'm working with, as such:
{
  name: "bob",
  email: "bob@gmail.com",
  logins: [
    { u: 'a', p: 'b', public_id: '123' },
    { u: 'x', p: 'y', public_id: 'abc' }
  ]
}
My problem is that I need to ensure that the public_ids are unique within a document and across the collection.
Furthermore, there are some existing records being migrated from a MySQL DB that don't have this value, which will therefore be replaced by null in Mongo.
I figure it's either an index:
db.users.ensureIndex({ "logins.public_id": 1 }, { unique: true });
which isn't working because of the missing keys and is throwing an E11000 duplicate key error,
or this is a more fundamental schema problem, in that I shouldn't be nesting objects in an array structure like that. In which case, what? A separate collection for the user logins? That seems to go against the idea of an embedded document.
If you expect u and p to always have the same values on each insert (as in your example snippet), you might want to use the $addToSet operator on inserts to ensure the uniqueness of your public_id field (see the sketch below). Otherwise I think it's quite difficult to make them unique across a whole collection without external maintenance or JS functions.
If not, I would store them in their own collection and use the public_id as the _id field to ensure cross-document uniqueness inside a collection. That may contradict the idea of embedded docs in a document database, but given the requirements I think that's negligible.
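A minimal sketch of the $addToSet idea (the values are made up): $addToSet only appends when an identical subdocument is not already in the array, so pushing the same { u, p, public_id } twice leaves a single entry in that document:

db.users.update(
  { name: "bob" },
  { $addToSet: { logins: { u: "a", p: "b", public_id: "123" } } }
);
// Running the exact same update again does not add a second identical login.

Note that this only guards against exact duplicates of the whole subdocument within one document, which is why the caveat about u and p always having the same values matters.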
Furthermore there are some existing records being migrated from a mySQL DB that dont have records, and will therefore all be replaced by null values in mongo.
So you want to apply a unique index on a data set that's not truly unique. I think this is just a modeling problem.
If a null logins.public_id is going to violate your uniqueness constraint, then just don't write it at all:
{
  logins: [
    { u: 'a', p: 'b' },
    { u: 'x', p: 'y' }
  ]
}
Thanks all.
In the end I opted to separate this into 2 collections, one for users and one for logins.
For users this looked a little like:
userDocument = {
  ...
  logins: [
    DBRef('loginsCollection', loginDocument._id),
    DBRef('loginsCollection', loginDocument2._id),
  ]
}
loginDocument = {
  ...
  user: new DBRef('userCollection', userDocument._id)
}
Although not what I was originally after (a single collection), it is working nicely, and by utilising MongoDB's _id uniqueness there is a constraint now built in at the database level rather than implemented at the application level.
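A sketch of how that wiring might look at insert time (assuming, as the earlier answer suggested, that the login's public_id doubles as its _id so the database itself enforces uniqueness; userId stands for the existing user's ObjectId):

// Inserting a login with public_id as _id fails with E11000 if that id already exists.
db.loginsCollection.insert({
  _id: "123",                                   // the public_id doubles as _id
  u: "a",
  p: "b",
  user: new DBRef("userCollection", userId)
});

// The user document then just references the login by that _id.
db.userCollection.update(
  { _id: userId },
  { $push: { logins: new DBRef("loginsCollection", "123") } }
);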