How to check if email and username are unique in a DynamoDB table design - NoSQL

I am creating a leads table where each lead has an id (UUID), which is the partition key, and createdAt as the range key. I need to implement a duplicate check each time I insert a new item into the leads table.
On running the following command:
C:\Users\user>aws dynamodb describe-table --table-name serverless-isd-app-leads-dev --endpoint-url http://localhost:8008
this is the result shown:
{
    "Table": {
        "AttributeDefinitions": [
            {
                "AttributeName": "id",
                "AttributeType": "S"
            },
            {
                "AttributeName": "email",
                "AttributeType": "S"
            },
            {
                "AttributeName": "phone",
                "AttributeType": "S"
            },
            {
                "AttributeName": "createdAt",
                "AttributeType": "N"
            }
        ],
        "TableName": "serverless-isd-app-leads-dev",
        "KeySchema": [
            {
                "AttributeName": "id",
                "KeyType": "HASH"
            },
            {
                "AttributeName": "createdAt",
                "KeyType": "RANGE"
            }
        ],
        "TableStatus": "ACTIVE",
        "CreationDateTime": "2021-12-18T20:54:11.940000+05:30",
        "ProvisionedThroughput": {
            "LastIncreaseDateTime": "1970-01-01T05:30:00+05:30",
            "LastDecreaseDateTime": "1970-01-01T05:30:00+05:30",
            "NumberOfDecreasesToday": 0,
            "ReadCapacityUnits": 1,
            "WriteCapacityUnits": 1
        },
        "TableSizeBytes": 0,
        "ItemCount": 0,
        "TableArn": "arn:aws:dynamodb:ddblocal:000000000000:table/serverless-isd-app-leads-dev",
        "GlobalSecondaryIndexes": [
            {
                "IndexName": "emai_phone_index",
                "KeySchema": [
                    {
                        "AttributeName": "email",
                        "KeyType": "HASH"
                    },
                    {
                        "AttributeName": "phone",
                        "KeyType": "RANGE"
                    }
                ],
                "Projection": {
                    "ProjectionType": "ALL"
                },
                "IndexStatus": "ACTIVE",
                "ProvisionedThroughput": {
                    "ReadCapacityUnits": 1,
                    "WriteCapacityUnits": 1
                },
                "IndexSizeBytes": 0,
                "ItemCount": 0,
                "IndexArn": "arn:aws:dynamodb:ddblocal:000000000000:table/serverless-isd-app-leads-dev/index/emai_phone_index"
            }
        ]
    }
}
and these are my query params:
const params = {
  TableName: process.env.LEADS_TABLE,
  IndexName: "emai_phone_index",
  Item: {
    id: data.id,
    email: data.email,
    phone: data.phone,
    firstName: data.firstName,
    lastName: data.lastName,
    createdAt: data.createdAt,
    updatedAt: data.updatedAt,
  },
  ConditionExpression:
    "attribute_not_exists(#email) AND attribute_not_exists(#phone)",
  ExpressionAttributeNames: {
    "#email": "email",
    "#phone": "phone",
  },
};
But since email and phone are only keys of a global secondary index, I am not able to detect whether the new item is a duplicate: the ConditionExpression of a put is evaluated against the base-table item being written, not against an index.
How can I solve this? Or is there another, better way of designing the table?

Instead of relying on a global secondary index, you can enforce uniqueness by writing two extra marker items, one for the email and one for the phone, at the time of inserting a new record. For these marker items, the partition key is set to the attribute value prefixed with the attribute name and a hash sign (for example email#...). Since your table also has createdAt as its sort key, every item must include it; give the marker items a fixed sort-key value (for example 0) so the uniqueness condition always checks the same item. A write therefore involves three items in total, performed as one transaction so that either all of them succeed or none does, and you also have to delete all three items when deleting a lead.
Something similar to this (as the --transact-items argument of aws dynamodb transact-write-items):
'[
  {
    "Put": {
      "TableName": "serverless-isd-app-leads-dev",
      "ConditionExpression": "attribute_not_exists(id)",
      "Item": {
        "id": {"S": "b201c1f2-238e-461f-88e6-0e606fbc3c51"},
        "createdAt": {"N": "1639841051940"},
        "email": {"S": "bobby.tables@gmail.com"},
        "firstName": {"S": "Bobby"},
        "lastName": {"S": "Tables"},
        "phone": {"S": "+1-202-555-0124"}
      }
    }
  },
  {
    "Put": {
      "TableName": "serverless-isd-app-leads-dev",
      "ConditionExpression": "attribute_not_exists(id)",
      "Item": {
        "id": {"S": "phone#+1-202-555-0124"},
        "createdAt": {"N": "0"}
      }
    }
  },
  {
    "Put": {
      "TableName": "serverless-isd-app-leads-dev",
      "ConditionExpression": "attribute_not_exists(id)",
      "Item": {
        "id": {"S": "email#bobby.tables@gmail.com"},
        "createdAt": {"N": "0"}
      }
    }
  }
]'
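In application code, the same three conditional puts can be issued atomically with the TransactWriteItems API. Below is a minimal sketch in Node that only builds the request payload; the helper name is my own invention, the table and attribute names mirror the question, and the marker items get a fixed createdAt of 0 since the table's sort key must be present on every item:

```javascript
// Sketch: enforce email/phone uniqueness with one DynamoDB transaction.
// All three puts share the same condition, so the transaction fails if
// the lead id, the email marker, or the phone marker already exists.
function buildUniqueLeadTransaction(lead) {
  const table = "serverless-isd-app-leads-dev";
  const put = (item) => ({
    Put: {
      TableName: table,
      ConditionExpression: "attribute_not_exists(id)",
      Item: item,
    },
  });
  return {
    TransactItems: [
      put({
        id: { S: lead.id },
        createdAt: { N: String(lead.createdAt) },
        email: { S: lead.email },
        phone: { S: lead.phone },
      }),
      // Marker items: partition key is "attribute#value", fixed sort key.
      put({ id: { S: `email#${lead.email}` }, createdAt: { N: "0" } }),
      put({ id: { S: `phone#${lead.phone}` }, createdAt: { N: "0" } }),
    ],
  };
}
```

The resulting object can then be passed to the SDK's transactWriteItems call; if any condition fails, the whole transaction is cancelled and none of the three items is written.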
Please read the link below for a detailed walk-through of a similar use case:
Simulating Amazon DynamoDB unique constraints using transactions

Related

Is there a way to include an Int32 field in a search index in MongoDB (with Atlas Search)?

I have a collection in a MongoDB Atlas database on which I have a search index covering some specific string fields. What I want to do is include an Int32 field in this search index so I can search on this number along with the other fields. I tried adding the field (Number) to the search index with the type number, but it doesn't work. I guess that's because it compares the query, a string, with an Int32. Is there a way to make it work, or do I have to copy Number into another field NumberString and include that in the search index?
Here is an example of one of these documents:
{
  "_id": ObjectId("010000000000000000000003"),
  "Description": {
    "fr-CA": "Un lot de test",
    "en-CA": "A test item"
  },
  "Name": {
    "fr-CA": "Lot de test",
    "en-CA": "Test item"
  },
  "Number": 345,
  "Partners": [],
  [...]
}
The index:
{
  "mappings": {
    "dynamic": false,
    "fields": {
      "Description": {
        "fields": {
          "en-CA": {
            "analyzer": "lucene.english",
            "searchAnalyzer": "lucene.english",
            "type": "string"
          },
          "fr-CA": {
            "analyzer": "lucene.french",
            "searchAnalyzer": "lucene.french",
            "type": "string"
          }
        },
        "type": "document"
      },
      "Name": {
        "fields": {
          "en-CA": {
            "analyzer": "lucene.english",
            "searchAnalyzer": "lucene.english",
            "type": "string"
          },
          "fr-CA": {
            "analyzer": "lucene.french",
            "searchAnalyzer": "lucene.french",
            "type": "string"
          }
        },
        "type": "document"
      },
      "Number": {
        "representation": "int64",
        "type": "number"
      },
      "Partners": {
        "fields": {
          "Name": {
            "type": "string"
          }
        },
        "type": "document"
      }
    }
  }
}
And finally, the query I'm trying:
db.[myDB].aggregate([{ $search: { "index": "default", "text": { "query": "345", "path": ["Number", "Name.fr-CA", "Description.fr-CA", "Partners.Name"]}}}])
For this example, I want the query to be applied to Number, Name, Description and Partners, and to return everything that matches. I would expect to get item #345, but also any items with 345 in the name or description. Is that possible?
Thanks!
With your current data types you should be able to search for "345" in the text fields. However, I would structure the query like so, to support the numeric field as well:
db.[myDB].aggregate([
  {
    $search: {
      "index": "default",
      "compound": {
        "should": [
          {
            "text": {
              "query": "345",
              "path": ["Name.fr-CA", "Description.fr-CA", "Partners.Name"]
            }
          },
          {
            "near": {
              "origin": 345,
              "path": "Number",
              "pivot": 2
            }
          }
        ]
      }
    }
  }
])
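For intuition about the near clause: Atlas Search scores it as pivot / (pivot + distance-from-origin), so pivot controls how quickly relevance decays as Number moves away from 345. A small sketch of that relationship (the formula follows the Atlas Search docs; treat exact scores as indicative):

```javascript
// The "near" operator's score: pivot / (pivot + |origin - value|).
// A match at the origin scores 1; at a distance equal to pivot it scores 0.5.
function nearScore(origin, value, pivot) {
  return pivot / (pivot + Math.abs(origin - value));
}
```

So with a pivot of 2, nearScore(345, 345, 2) is 1 and nearScore(345, 347, 2) is 0.5, meaning only values very close to 345 stay competitive in the compound score.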

mongoDB find and update or insert

I am using MongoDB and mongoose.
I have the following schema:
{
  "group": {
    "items": [
      {
        "name": "aaa",
        "value": "aaa_value",
        "description": "some_text"
      },
      {
        "name": "bbb",
        "value": "bbb_value",
        "description": "some_text2"
      },
      {
        "name": "ccc",
        "value": "ccc_value",
        "description": "some_text3"
      }
    ]
  }
}
My function receives a name and a value.
If an item with this name is present, I want to update its value according to the value parameter (and leave the description unchanged).
If not, I want to add a new item to the array with the given name and value and a default description.
How can this be done?
Thank you
An upsert operation is not possible on an embedded array, so it has to be a two-step process:
either first (try to) remove the record and then push it, or update the record first and, if that matches nothing, insert it.
First approach (first delete it):
db.collection.update(
  { _id: ObjectId("xyz") },
  { $pull: { items: { name: "aaa" } } }
)
then, insert it:
db.collection.update(
  { _id: ObjectId("xyz") },
  { $push: { items: {
      name: "aaa",
      value: "new value",
      description: "new description"
  } } }
)
Second approach (first update it):
var result = db.collection.update(
  {
    _id: ObjectId("xyz"),
    "items.name": "aaa"
  },
  {
    $set: { "items.$.value": "new value" }
  }
);
And then, if nothing matched, insert it:
if (!result.nMatched) {
  db.collection.update(
    {
      _id: ObjectId("xyz"),
      "items.name": { $ne: "aaa" }
    },
    {
      $push: {
        items: {
          name: "aaa",
          value: "new value",
          description: "new description"
        }
      }
    }
  );
}
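The two-step logic above can be wrapped in small helpers that build the filter/update documents; a sketch (the helper names are my own, not a mongoose or MongoDB API):

```javascript
// Sketch of the "update first, insert if nothing matched" approach as
// plain helper functions that construct the filter/update documents.
function buildUpdateExisting(groupId, name, value) {
  return {
    filter: { _id: groupId, "items.name": name },
    // $ targets the array element matched by "items.name" in the filter;
    // only its value changes, the description is left untouched.
    update: { $set: { "items.$.value": value } },
  };
}

function buildInsertNew(groupId, name, value, defaultDescription) {
  return {
    // The $ne guard prevents a duplicate push if the same name was
    // inserted concurrently between the two steps.
    filter: { _id: groupId, "items.name": { $ne: name } },
    update: {
      $push: {
        items: { name: name, value: value, description: defaultDescription },
      },
    },
  };
}
```

Run the first update; if its result reports zero matched documents, run the second.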

How to limit the number of columns in the output while doing an aggregate operation in MongoDB

My function looks like this:
function (x)
{
  var SO2Min = db.AirPollution.aggregate(
    [
      {
        $match: { "SO2": { $ne: 'NA' }, "State": { $eq: x } }
      },
      {
        $group:
        {
          _id: x,
          SO2MinQuantity: { $min: "$SO2" }
        }
      },
      {
        $project:
        { SO2MinQuantity: '$SO2MinQuantity' }
      }
    ]
  )
  db.AirPollution.update
  (
    { "State": "West Bengal" },
    {
      $set: { "MaxSO2": SO2Max }
    },
    { "multi": true }
  );
}
Here, AirPollution is my collection. If I run this function, the collection gets updated with a new field MaxSO2, as below:
{
  "_id": ObjectId("5860a2237796484df5656e0c"),
  "Stn Code": 11,
  "Sampling Date": "02/01/15",
  "State": "West Bengal",
  "City/Town/Village/Area": "Howrah",
  "Location of Monitoring Station": "Bator, Howrah",
  "Agency": "West Bengal State Pollution Control Board",
  "Type of Location": "Residential, Rural and other Areas",
  "SO2": 10,
  "NO2": 40,
  "RSPM/PM10": 138,
  "PM 2.5": 83,
  "MaxSO2": {
    "_batch": [
      {
        "_id": "West Bengal",
        "SO2MaxQuantity": 153
      }
    ],
    "_cursor": {}
  }
}
where we can see that MaxSO2 has been added as a subdocument. But I want that value stored as a plain field in the same document, not as part of a subdocument; in particular, I don't want the _batch and _cursor fields to show up. Please help.
Since the aggregate function returns a cursor, you can use the toArray() method, which returns an array containing all of the cursor's documents, and then access the aggregated fields. Because the aggregation returns a single group, there is no need to iterate over the results array; just read the first and only document to get the values.
Once you have these values, you can update your collection using the updateMany() method. So you can refactor your code to:
function updateMinAndMax(x) {
  var results = db.AirPollution.aggregate([
    {
      "$match": {
        "SO2": { "$ne": 'NA' },
        "State": { "$eq": x }
      }
    },
    {
      "$group": {
        "_id": x,
        "SO2MinQuantity": { "$min": "$SO2" },
        "SO2MaxQuantity": { "$max": "$SO2" }
      }
    }
  ]).toArray();
  var SO2Min = results[0]["SO2MinQuantity"];
  var SO2Max = results[0]["SO2MaxQuantity"];
  db.AirPollution.updateMany(
    { "State": x },
    { "$set": { "SO2MinQuantity": SO2Min, "SO2MaxQuantity": SO2Max } }
  );
}
updateMinAndMax("West Bengal");
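As a sanity check on what the $group stage produces, the same min/max can be computed with a plain reduction over the matched documents (the SO2 values below are made up for illustration):

```javascript
// What $min/$max inside the $group stage compute, as a plain reduction.
const docs = [{ SO2: 10 }, { SO2: 153 }, { SO2: 42 }]; // made-up sample
const SO2Min = Math.min(...docs.map(function (d) { return d.SO2; }));
const SO2Max = Math.max(...docs.map(function (d) { return d.SO2; }));
```

Each matched document contributes one SO2 value; the group collapses them to a single document holding both aggregates, which is why results[0] is all you need.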

MongoDB - Updating all matching Nested Documents

I have this MongoDB document collection. I would like to update a PropertyContractor's phone number in all documents where "ContractorTelephone": "0865215486".
{
  "_id": "4",
  "PropertyType": "House",
  "PropertyNameNumber": "49",
  "PropertyStreet": "Paul Street",
  "PropertyTown": "Farmleigh",
  "PropertyCity": "Waterford City",
  "PropertyCounty": "Co Waterford",
  "PropertyBedrooms": "3",
  "PropertyDescription": "Central, Cheap",
  "PropertyFacilities": [
    {
      "FacilitiesSmoking": "Yes",
      "FacilitiesPets": "Yes",
      "FacilitiesBroadBand": "Yes",
      "FacilitiesTV": "Yes"
    }
  ],
  "PropertyAvailable": "1",
  "PropertyContractor": [
    {
      "ContractorName": "John Murphy",
      "ContractorTelephone": "0865215486",
      "ContractorType": "Plumber"
    }
  ]
}
I've tried:
db.Property.update(
  { PropertyContractor.ContractorTelephone: "0865215486" },
  {
    $set: { PropertyContractor.ContractorTelephone: "0854854215" }
  },
  { multi: true }
)
But it just says that the . is invalid syntax.
Thanks in advance.
Use the $ positional operator, which identifies an array element to update without explicitly specifying its position in the array. Note that the dotted paths must be quoted, which is what the syntax error was about:
db.Property.update(
  { "PropertyContractor.ContractorTelephone": "0865215486" },
  {
    "$set": {
      "PropertyContractor.$.ContractorTelephone": "0854854215"
    }
  },
  { "multi": true }
);
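One caveat: the $ operator updates only the first matching element of the array in each document. If a PropertyContractor array could hold several entries with the same number, the filtered positional operator $[<identifier>] with arrayFilters (MongoDB 3.6+) updates every matching element. A sketch of the documents you would pass to updateMany:

```javascript
// Update every matching array element per document, not just the first.
// "c" is an arbitrary identifier bound by the arrayFilters entry below.
const filter = { "PropertyContractor.ContractorTelephone": "0865215486" };
const update = {
  $set: { "PropertyContractor.$[c].ContractorTelephone": "0854854215" },
};
const options = {
  arrayFilters: [{ "c.ContractorTelephone": "0865215486" }],
};
// e.g. db.Property.updateMany(filter, update, options)
```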

How do I replace an entire array of subdocuments in MongoDB?

Here is an example document from my collection:
Books
{
  id: 1,
  links:
  [
    { "text": "ABC", "url": "www.abc.com" },
    { "text": "XYZ", "url": "www.xyz.com" }
  ]
}
I want to replace the links array in one update operation. Here is how the above document should look after the modification:
Books
{
  id: 1,
  links:
  [
    { "text": "XYZ", "url": "www.xyz.com" },  <== NEW COPY OF THE ARRAY
    { "text": "efg", "url": "www.efg.com" },
    { "text": "ijk", "url": "www.ijk.com" }
  ]
}
As you can see, the links array has been replaced (old data removed, and new data added).
I am having a very hard time with Update.Set() because it says my MyLinks<> collection cannot be mapped to a BsonValue.
I've tried many different ways of achieving this, and all of them fail, including .PushAllWrapped<WebLinkRoot>("links", myDoc.WebLinks).
Everything I've tried results in the new values being appended to the array rather than the array being replaced.
Since MongoDB doesn't seem to provide a simple method to replace an array of subdocuments, or a method like .ClearArray(), what is the best way to ensure the array is cleared before adding new elements in a single update?
I am here because I saw 5k views on this post, so I'm adding some material that may help others looking for an answer to the above.
db.collectionName.insertOne({
  'links': [
    {
      "text": "XYZ",
      "url": "www.xyz.com"
    }
  ]
});
Now run this query, which replaces the older data:
db.collectionName.update(
  {
    _id: ObjectId("your object Id")
  },
  {
    $set: {
      'links': [
        {
          "text": "XYZ1",
          "url": "www.xyz.com1"
        }
      ]
    }
  }
);
I think you have to do something like this:
var newArray = new BsonArray {
  new BsonDocument { { "text", "XYZ" }, { "url", "www.xyz.com" } },
  new BsonDocument { { "text", "efg" }, { "url", "www.efg.com" } },
  new BsonDocument { { "text", "ijk" }, { "url", "www.ijk.com" } }
};
var update = Update.Set("links", newArray);
collection.Update(query, update);
Or whatever method you can use to cast to a valid BsonValue.
So, the equivalent in the shell:
{ "links" : [ { "text" : "abc" } ] }
> db.collection.update(
    {},
    { $set:
      { links: [
        { text: "xyz", url: "something" },
        { text: "zzz", url: "else" }
      ] }
    }
  )
> db.collection.find({}, { _id: 0, links: 1 }).pretty()
{
  "links" : [
    {
      "text" : "xyz",
      "url" : "something"
    },
    {
      "text" : "zzz",
      "url" : "else"
    }
  ]
}
So that works.
You clearly need something other than the embedded code you have, but hopefully this puts you on the right track.
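For completeness, the same replace-the-whole-array update expressed as plain JavaScript objects, as you would pass them to a driver's updateOne (a sketch; the field names come from the question, and the filter is assumed to identify the target book):

```javascript
// Replacing an entire array is a single $set: the new array overwrites
// whatever was stored under "links" before (no push/append involved).
const newLinks = [
  { text: "XYZ", url: "www.xyz.com" },
  { text: "efg", url: "www.efg.com" },
  { text: "ijk", url: "www.ijk.com" },
];
const filter = { id: 1 };            // assumed to identify the document
const update = { $set: { links: newLinks } };
// e.g. collection.updateOne(filter, update) in the Node.js driver
```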