I am working with the following data in table my_table:
[
  {
    "item": {
      "id": 1,
      "data": {
        "name": "ABC",
        "status": "Active"
      }
    }
  },
  {
    "item": {
      "id": 2,
      "data": {
        "name": "DEF",
        "status": "Active"
      }
    }
  }
]
I would like to update the name property of data while keeping the rest of data intact. A PostgreSQL query for that purpose would look like this:
UPDATE my_table SET data = data || '{"name":"GHI"}' WHERE id = 1;
However, I am struggling to achieve this with knex, as I've tried:
knex('my_table')
  .update({ data: knex.raw('data || ?', [{ name: 'GHI' }]) })
  .where('id', 1);
and many other similar queries, but in vain. If you have any ideas about this, please share them below. Thanks in advance!
With knex you indeed have to use a raw expression to update a single field inside a JSONB column.
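For reference, a minimal raw-expression sketch with plain knex (assuming the JSONB column is named data; jsonb_set is the standard PostgreSQL function and ?? is knex's identifier binding):

await knex('my_table')
  .update({
    // jsonb_set replaces only the {name} path and keeps the rest of the object;
    // ?::jsonb casts the stringified value so PostgreSQL accepts it as jsonb
    data: knex.raw(`jsonb_set(??, '{name}', ?::jsonb)`, ['data', JSON.stringify('GHI')])
  })
  .where('id', 1);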
However, objection.js, which is an ORM built on top of knex, has some additional support for JSONB operations. With objection.js your update would look like this:
await MyTableModel.query(knex).update({ 'data:name': 'GHI' }).where('id', 1);
This outputs the following SQL (with bindings ["GHI", 1]):
update "my_table" set "data" = jsonb_set("data", '{name}', ?, true) where "id" = ?
RunKit example: https://runkit.com/embed/0dai0bybplxv
I have a model with a JSONB field (Postgres).
from sqlalchemy.dialects.postgresql import JSONB
class Foo(Base):
    __tablename__ = 'foo'
    data = Column(JSONB, nullable=False)
    ...
where the data field looks like this:
[
  {
    "name": "a",
    "value": 0.0143
  },
  {
    "name": "b",
    "value": 0.0039
  },
  {
    "name": "c",
    "value": 0.1537
  },
  ...
]
and, given a search_name, I want to return the rows where that name is among the top x ranked names.
I know I can access the fields with something like:
res = session.query(Foo).filter(Foo.data['name'] == search_name)
but how do I order the JSON and extract the top names from that?
A SQLAlchemy solution would be preferred, but a plain SQL query that I can run as raw is also fine.
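For what it's worth, a hedged plain-SQL sketch, assuming "top x ranked" means ordering each row's array elements by value in descending order (:search_name and :x stand for bind parameters):

SELECT f.*
FROM foo f
WHERE :search_name IN (
    -- unpack this row's JSONB array and keep the x highest-valued names
    SELECT elem->>'name'
    FROM jsonb_array_elements(f.data) AS elem
    ORDER BY (elem->>'value')::float DESC
    LIMIT :x
);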
I am attempting to use PartiQL to query a set of data that looks like this:
{
  "userId": {
    "S": "someuserID"
  },
  "mapWithData": {
    "M": {
      "1": {
        "M": {
          "neededVal": {
            "S": "A Needed Value1"
          },
          "name": {
            "S": "A Name1"
          }
        }
      },
      "2": {
        "M": {
          "neededVal": {
            "S": "A Needed Value2"
          },
          "name": {
            "S": "A Name2"
          }
        }
      }
    }
  },
  "userName": {
    "S": "someuserName"
  }
}
I am developing this query using the NoSQL Workbench. I want to use a Request Parameter in the query to get a specific object from the mapWithData based on its key value. For instance, if I wanted to get the n-th value from the map, I could use this query:
SELECT "mapWithData"."N"."neededVal"
FROM "some-table"
WHERE "userId" = 'someuserID'
But I would like to make the "N" and the 'someuserID' into Request Parameters to prevent any PartiQL injection (assuming the Request Parameters are actually sanitized). So what I'm trying to do is this:
SELECT "mapWithData".?."neededVal"
FROM "some-table"
WHERE "userId" = ?
This does not work unfortunately and I get this error:
Execute PartiQL statement failed: Validation Error: Statement wasn't well formed, can't be processed: Invalid path dot component
So then I tried to use a different format like this:
SELECT "mapWithData"[ ? ]["neededVal"]
FROM "some-table"
WHERE "userId" = ?
But when I do that, I get this error:
Execute PartiQL statement failed: Validation Error: Unexpected path component at 1:23:1
Is it possible to include a ? for inserting request parameters in the SELECT part of this statement? Is there another way to do this that I'm missing?
@Francis, you can bind the userId value in the WHERE clause. If you use the Python SDK to implement it, you can write:

table_name = 'mytable'
user_id = 'someuserID'

stmt = f"SELECT * FROM {table_name} WHERE userId = ?"
pmt = [{
    "S": user_id
}]

resp = dynamodb.execute_statement(
    Statement=stmt, Parameters=pmt
)

Note that statement parameters bind values only, not attribute names or path components, which is why the ? inside the SELECT path is rejected; the map key has to be validated and then interpolated into the statement string itself. Hope that helps.
I have a collection in MongoDB containing the search history of a user, where each document is stored like this:
"_id": "user1"
searchHistory: {
"product1": [
{
"timestamp": 1623482432,
"query": {
"query": "chocolate",
"qty": 2
}
},
{
"timestamp": 1623481234,
"query": {
"query": "lindor",
"qty": 4
}
},
],
"product2": [
{
"timestamp": 1623473622,
"query": {
"query": "table",
"qty": 1
}
},
{
"timestamp": 1623438232,
"query": {
"query": "ike",
"qty": 1
}
},
]
}
Here the _id of the document acts as a foreign key to the user document in another collection.
I have a backend running on Node.js, and this function is used to store a new search history entry in the record:
exports.updateUserSearchCount = function (userId, productId, searchDetails) {
  let addToSetData = {}
  let key = `searchHistory.${productId}`
  addToSetData[key] = { "timestamp": new Date().getTime(), "query": searchDetails }
  return client.db("mydb").collection("userSearchHistory").updateOne(
    { "_id": userId },
    { "$addToSet": addToSetData },
    { upsert: true },
    async (err, res) => {}
  )
}
Now I want to get the search history of a user based on the query alone, using db.find().
I want something like this:
db.find({"_id": "user1", "searchHistory.somewildcard.query": "some query"})
I need a wildcard to stand in for ".somewildcard." so that the search covers all products.
I saw a suggestion that we should store the document like this:
"_id": "user1"
searchHistory: [
{
"key": "product1",
"value": [
{
"timestamp": 1623482432,
"query": {
"query": "chocolate",
"qty": 2
}
}
]
}
]
However, if I store the document like this, then adding a search history entry to an existing document becomes a tedious and confusing task.
What should I do?
It's always a bad idea to save values as keys, for the exact reason you're facing: it heavily limits querying on that field. The obvious trade-off is that it makes updates much easier.
I personally recommend you do not save these searches in nested form at all; this will cause you scaling issues quite quickly. Assuming these fields are indexed, you will start seeing performance issues when the arrays get too large (a few hundred searches).
So my personal recommendation is for you to save it in a new collection like so:
{
  "user_id": "1",
  "key": "product1",
  "timestamp": 1623482432,
  "query": {
    "query": "chocolate",
    "qty": 2
  }
}
Now querying a specific user, a specific product, or even a query substring is all easily supported by creating some basic indexes, and an "update" in this case is just inserting a new document, which is also much faster; see the sketch below.
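A minimal sketch with the Node.js driver; the collection name userSearches and the exact index choices are assumptions for illustration:

// Hypothetical collection name; indexes to support the common lookups
const searches = client.db("mydb").collection("userSearches");
await searches.createIndexes([
  { key: { user_id: 1, timestamp: -1 } },  // per-user history, newest first
  { key: { key: 1 } },                     // per-product lookups
  { key: { "query.query": 1 } }            // lookups by query text
]);

// An "update" is now just an insert of one flat document
await searches.insertOne({
  user_id: "1",
  key: "product1",
  timestamp: Date.now(),
  query: { query: "chocolate", qty: 2 }
});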
If you still prefer to keep the nested structure, then I recommend you switch to the key/value structure you posted. As you mentioned, updates will become slightly more tedious, but you can still do them quite easily using arrayFilters to update a specific element, or just $push to add a new search, as sketched below.
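A rough sketch of the $push-plus-arrayFilters update against that structure (collection name and sample values taken from the question; note the push only applies if an entry with that key already exists):

await client.db("mydb").collection("userSearchHistory").updateOne(
  { _id: "user1" },
  {
    $push: {
      // $[entry] targets the array element whose key matches the filter below
      "searchHistory.$[entry].value": {
        timestamp: Date.now(),
        query: { query: "lindor", qty: 4 }
      }
    }
  },
  { arrayFilters: [{ "entry.key": "product1" }] }
);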
I am new to NoSQL and Morphia. I am using Morphia to query MongoDB.
I have a sample collection as below:
[
  {
    "serviceId": "id1",
    "serviceName": "ding",
    "serviceVersion": "1.0",
    "files": [
      {
        "fileName": "b.html",
        "fileContents": "contentsA"
      },
      {
        "fileName": "b.html",
        "fileContents": "contentsB"
      }
    ]
  },
  {
    "serviceId": "id2",
    "serviceName": "ding",
    "serviceVersion": "2.0",
    "files": [
      {
        "fileName": "b.html",
        "fileContents": "contentsA"
      },
      {
        "fileName": "b.html",
        "fileContents": "contentsB"
      }
    ]
  }
]
I would like to fetch an element of the "files" list, given the service name, service version, and file name, using Morphia.
I was able to get what I want using the query below:
db.ApiDoc.find({ serviceName: "ding", serviceVersion: "2.0"}, { files: { $elemMatch: { fileName: "b.html" } } }).sort({ "_id": 1}).skip(0).limit(30);
What I tried so far:
I tried using the "elemMatch" API that Morphia has, but no luck.
query = ...createQuery(Result.class);
query.and(
    query.criteria("serviceName").equal("ding"),
    query.criteria("serviceVersion").equal("2.0"));
query.filter("files elem", BasicDBObjectBuilder.start("fileName", "a.html").get());
I seem to get the entire Result collection with all the files. I would like to get only the matched files (by file name).
Can someone help me get this to work?
Thanks,
rajesh
I don't believe it's possible to get just the matching sub-element. You can request to have only the 'files' array returned, but all of its elements will be included in the result set and you will have to re-filter in your code.
The other option is to make Files a collection of its own with a serviceId field; then you'll have more power to load only certain files.
It is possible to do that. The filter doesn't really work like a projection; try this:
datastore.createQuery(Result.class)
    .field("serviceName").equal("ding")
    .field("serviceVersion").equal("2.0")
    .field("files.fileName").equal("b.html")
    // the positional "$" projection keeps only the first matching array element
    .project("files.$", true);
I want to query MongoDB in Sails.js. This is the structure of my document:
{
  "users": [
    "52ed09e1d015533c124015d5",
    "52ed4bc75ece1fb013fed7f5"
  ],
  "user_msgs": [
    {
      "sender": "52ed09e1d015533c124015d5",
      "sendTo": "52ed4bc75ece1fb013fed7f5",
      "msg": "ss"
    }
  ],
  "createdAt": ISODate("2014-02-06T16:12:17.751Z"),
  "updatedAt": ISODate("2014-02-06T16:12:17.751Z"),
  "_id": ObjectID("52f3b461f46da23c111582f6")
}
I want to search for those documents whose "users" array exactly matches ["52ed09e1d015533c124015d5", "52ed4bc75ece1fb013fed7f5"].
Message.find({user: ["52ed09e1d015533c124015d5","52ed4bc75ece1fb013fed7f5"]})
This query returns all objects which contain 1 OR 2, but I need only those which exactly match 1 AND 2. I have also tried $all etc., but it did not work. Please tell me how to write the query with Sails.js-supported syntax to get those users.
You'll need to use the native Mongo adapter for this:
Message.native(function(err, collection) {
  collection.find({
    // $all matches documents whose "users" array contains both IDs;
    // add "$size: 2" as well if the array must contain exactly these two
    users: { $all: ["52ed09e1d015533c124015d5", "52ed4bc75ece1fb013fed7f5"] }
  }).toArray(function(err, results) {
    // Do something with results
  });
});
Message.find()
  .where({ users: ["52ed09e1d015533c124015d5", "52ed4bc75ece1fb013fed7f5"] })
  .exec(function() {
    // do something
  });
While the above code may work to pull in just those users, I think a better solution would be to define your user IDs in your message model. I would add the following attributes to your Message model:
senderID: {
  type: 'string'
},
receiverID: {
  type: 'string'
}
Now you can make that lookup more efficient by using the following query:
Message.find()
  .where({ senderID: "52ed09e1d015533c124015d5" })
  .where({ receiverID: "52ed4bc75ece1fb013fed7f5" })
  .exec(function() {
    // do something
  });
This is the route I would take.