I have a table with a jsonb column to which I want to append a new record.
This is the content of my table's data column.
data = [
{ "id" : "1",
"content" : "cat"},
{ "id" : "2",
"content" : "rat"},
{ "id" : "3",
"content" : "dog"},
...
...
{ "id" : "n",
"content" : "hamster"}
]
I want to add
{ "id" : "n+1",
"content" : "goat"}
to data. I know I can use the concatenation operator || to do this. However, I want to understand whether I can use jsonb_insert to do this as well.
Can I also use jsonb_insert to:
- insert the new record anywhere in the array, at an arbitrary position,
- insert the new record at the start of the jsonb array, and
- append the new record to the end of the jsonb array?
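jsonb_insert (available since PostgreSQL 9.6) can cover all three cases. Here is a sketch, assuming the array sits in a column named data of the table tablename from the question:

-- jsonb_insert(target, path, new_value [, insert_after]) inserts new_value
-- before the element addressed by path, or after it when insert_after is true.

-- insert at an arbitrary position, e.g. before the element at index 2
UPDATE tablename SET data = jsonb_insert(data, '{2}', '{"id": "n+1", "content": "goat"}'::jsonb);

-- insert at the start of the array (before index 0)
UPDATE tablename SET data = jsonb_insert(data, '{0}', '{"id": "n+1", "content": "goat"}'::jsonb);

-- append at the end: -1 addresses the last element, insert_after = true puts the new value after it
UPDATE tablename SET data = jsonb_insert(data, '{-1}', '{"id": "n+1", "content": "goat"}'::jsonb, true);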
Extra question: if I want to change the content of id 3 from "dog" to "cow", should I do the following?
SELECT jsonb_set(data, '{id, 4}', '{"content":"cow"}'::jsonb) FROM tablename;
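Note that jsonb_set addresses array elements by zero-based index, not by id value, so a sketch of the intended update (assuming the object with "id": "3" sits at index 2) would look like:

-- '{2,content}' points at the "content" key of the third array element;
-- '"cow"' is a jsonb string literal that replaces just that value
UPDATE tablename SET data = jsonb_set(data, '{2,content}', '"cow"'::jsonb);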
Related
Can anybody tell me the query to remove an entire JSON object from an array in a jsonb-type column, where the object is identified by an id stored inside it?
Example: I have a data column in table t1. The type of the data column is jsonb. The record in the data column is as follows:
"type" : "users",
"info" : [
{
"id":1,
"name" : "user1"
},
{
"id":2,
"name" : "user2"
}
]
}
Now I want to delete the entire JSON object that has id = 1 (I want to identify the object by its id).
The expected result is
{
"type" : "users",
"info" : [
{
"id":2,
"name" : "user2"
}
]
}
Please help me out with queries.
Thanks in advance 🙂
You will need a subquery that unnests the array with jsonb_array_elements, keeps every element except the one you want to remove, and aggregates the rest back into an array:
UPDATE t1
SET data = jsonb_set(data, '{info}', (
SELECT COALESCE(jsonb_agg(element), '[]'::jsonb)
FROM jsonb_array_elements(data -> 'info') element
WHERE element ->> 'id' <> '1'
))
Here is a simpler query to solve the above problem (jsonb_path_query_array requires PostgreSQL 12 or newer):
UPDATE t1
SET data = jsonb_set(data, '{info}', jsonb_path_query_array(data, '$.info[*] ? (@."id" != 1)'));
I am trying to upsert array elements in a single document.
I want to provide a list of objects to a Mongo query, and for each object: if it already exists in the document's array (based on some field), replace that element with the object; if it does not exist, add it to the array.
I know this can be done on the application side, but can it be done in one transactional query?
example:
{
"_id" : ObjectId("50b429ba0e27b508d854483e"),
"array" : [
{
"id" : "1",
"letter" : "a"
},
{
"id" : "2",
"letter" : "b"
},
{
"id" : "3",
"letter" : "c"
}
],
"tester" : "tom"
}
Now, I want to pass the following objects to my query:
{
"id" : "3",
"letter" : "c-new"
"newField": "some string"
},
{
"id" : "4",
"letter" : "D"
}
Based on the id field, I want Mongo to see that:
- id: 3 already exists, so it replaces the current element with the new object provided;
- id: 4 does not exist, so it adds it to the array elements.
(That is basically called an upsert.)
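One way to get this in a single statement, sketched here for MongoDB 4.2+ (which allows aggregation pipelines in updates) and assuming a collection named mycollection and a shell variable items holding the incoming objects, is to drop any existing elements whose id appears in the incoming list and then append the incoming objects:

// incoming objects (assumption: held in a shell variable)
var items = [
    { "id" : "3", "letter" : "c-new", "newField" : "some string" },
    { "id" : "4", "letter" : "D" }
];
var ids = items.map(function (i) { return i.id; });

db.mycollection.updateOne(
    { _id: ObjectId("50b429ba0e27b508d854483e") },
    [
        { $set: { array: { $concatArrays: [
            // keep only the elements whose id is NOT in the incoming list
            { $filter: { input: "$array", as: "el",
                         cond: { $not: [ { $in: [ "$$el.id", ids ] } ] } } },
            // then append the incoming objects
            items
        ] } } }
    ]
);

Note that this replaces matching elements wholesale (it does not merge fields) and moves them to the end of the array.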
How can I merge documents together when the user is the same?
What I find difficult is automatically turning the values of the "field" field into new field names, with the values of the "data" field as the values of those new fields.
For example, merging the two documents below because they have the same user id, producing the fields "Date of birth": "1989-01-12" and "Job": "Teacher".
I know it's a lot to ask, but could someone guide me to how to achieve this?
{
"_id" : ObjectId("5d6b00b960016c4c441d9a16"),
"user" : 1000,
"field" : "Date of birth",
"data" : "1989-01-12",
"timestamp" : "2017-08-27 11:00:59"
}
{
"_id" : ObjectId("5d6b00b960016c4c441d9a17"),
"user" : 1000,
"field" : "Job",
"data" : "Teacher",
"timestamp" : "2017-08-27 10:59:19"
}
Into
{
"_id" : ObjectId("5d6b00b960016c4c441d9a16"),
"user" : 1000,
"Date of birth" : "1989-01-12",
"Job" : "Teacher",
"timestamp" : "2017-08-27 11:00:59"
}
To merge the documents, you can iterate over them to build a new document.
If you want to remove the old documents for the given user, you have to delete them before inserting the merged document.
You can do it like this using the JavaScript interface (the mongo shell):
// Get all docs for the user ("mycollection" is a placeholder for your collection name)
docs = db.mycollection.find({user: 1000}).toArray()
// Initialise the new doc with the common values of one existing doc
new_doc = db.mycollection.findOne({user: 1000})
// Remove the fields that will be flattened into the new object
delete new_doc.field
delete new_doc.data
for (i = 0; i < docs.length; i++) {
    doc = docs[i]
    // Use the value of "field" as the new field name and "data" as its value
    new_doc[doc["field"]] = doc["data"]
    // The timestamp format sorts lexically, so keep the most recent one
    if (doc["timestamp"] > new_doc["timestamp"]) {
        new_doc["timestamp"] = doc["timestamp"]
    }
}
// Remove all old documents for the user if you need to
db.mycollection.remove({user: 1000})
// Insert the merged document
db.mycollection.insert(new_doc)
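If you prefer to let the server do the merge, here is a sketch using the aggregation framework (MongoDB 3.6+ for $mergeObjects and $arrayToObject), again assuming a collection named mycollection; append a $merge or $out stage if you want to store the result:

db.mycollection.aggregate([
    { $match: { user: 1000 } },
    { $sort: { timestamp: -1 } },                       // newest first
    { $group: {
        _id: "$user",
        doc_id:    { $first: "$_id" },                  // keep one _id
        timestamp: { $first: "$timestamp" },            // keep the latest timestamp
        pairs:     { $push: { k: "$field", v: "$data" } }
    } },
    { $replaceRoot: { newRoot: { $mergeObjects: [
        { _id: "$doc_id", user: "$_id", timestamp: "$timestamp" },
        { $arrayToObject: "$pairs" }                    // field/data pairs become real fields
    ] } } }
])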
In my pymongo code, inserting the same doc twice raises an error:
document = {"auteur" : "romain",
"text" : "premier post",
"tag" : "test2",
"date" : datetime.datetime.utcnow()}
collection.insert_one(document)
collection.insert_one(document)
raises:
DuplicateKeyError: E11000 duplicate key error collection: test.myCollection index: _id_ dup key: { : ObjectId('5aa282eff1dba231beada9e3') }
Inserting two documents with different content works fine.
It seems that, according to https://docs.mongodb.com/manual/reference/method/db.collection.createIndex/#options, I should do something about the unique option of indexes:
unique boolean
Optional. Creates a unique index so that the collection will not accept insertion or update of documents where the index key value matches an existing value in the index.
Specify true to create a unique index. The default value is false.
The option is unavailable for hashed indexes.
Adding to Peba's answer, you can use the .copy() method of a Python dictionary to avoid mutating the document itself.
document = {"auteur" : "romain",
"text" : "premier post",
"tag" : "test2",
"date" : datetime.datetime.utcnow()}
collection.insert_one(document.copy())
collection.insert_one(document.copy())
This way, each insert_one call gets a shallow copy of the document, and at the same time it keeps your code more Pythonic.
Inserting a document implicitly adds an _id to it.
So after the first insert the document will have mutated to:
document = {"_id" : ObjectId('random_id_here'),
"auteur" : "romain",
"text" : "premier post",
"tag" : "test2",
"date" : datetime.datetime.utcnow()}
Trying to insert said document again will result in an error due to the duplicated _id.
You can create a new document with the same values and insert it.
document = {"auteur" : "romain",
"text" : "premier post",
"tag" : "test2",
"date" : datetime.datetime.utcnow()}
collection.insert_one(document)
document = {"auteur" : "romain",
"text" : "premier post",
"tag" : "test2",
"date" : datetime.datetime.utcnow()}
collection.insert_one(document)
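A shorter sketch of the same idea: strip the _id that the first insert_one added, so the driver generates a fresh one on the next insert.

# remove the _id added by the first insert_one, then insert again
document.pop("_id", None)
collection.insert_one(document)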
I am attempting to perform a MongoDB update on each element in an array of records.
An example schema is below:
{
"_id" : ObjectId("508710f16dc636ec07000022"),
"summary" : "",
"uid" : "ABCDEF",
"username" : "bigcheese",
"name" : "Name of this document",
"status_id" : 0,
"rows" : [
{
"score" : 12,
"status_id" : 0,
"uid" : 1
},
{
"score" : 51,
"status_id" : 0,
"uid" : 2
}
]
}
So far I have been able to perform single updates like this:
db.mycollection.update({"uid":"ABCDEF","rows.uid":1}, {$set:{"rows.$.status_id":1}},false,false)
However, I am struggling with how to perform an update that will set the status_id of every array element to 1 (for instance).
Below is how I imagine it should work:
db.mycollection.update({"uid":"ABCDEF"}, {$set:{"rows.$.status_id":1}},false,true)
However I get the error:
can't append to array using string field name [$]
I have tried for quite a while with no luck. Any pointers?
You can't do the sort of 'wildcard' update of array elements that you're looking for. I think the best you can do is simultaneously set each element's status_id value like this:
db.mycollection.update(
{"uid":"ABCDEF"},
{$set:{
"rows.0.status_id":1,
"rows.1.status_id":1
}}, false, true);
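For what it's worth, on MongoDB 3.6 and newer the all-positional operator $[] (optionally combined with arrayFilters for conditional matches) removes the need to enumerate indexes; a minimal sketch:

// $[] applies the $set to every element of the rows array
db.mycollection.updateMany(
    { "uid" : "ABCDEF" },
    { $set : { "rows.$[].status_id" : 1 } }
);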