Using SQLAlchemy to query the top x items in a Postgres JSONB field

I have a model with a JSONB field (Postgres).
from sqlalchemy import Column
from sqlalchemy.dialects.postgresql import JSONB

class Foo(Base):
    __tablename__ = 'foo'
    data = Column(JSONB, nullable=False)
    ...
where the data field looks like this:
[
    {
        "name": "a",
        "value": 0.0143
    },
    {
        "name": "b",
        "value": 0.0039
    },
    {
        "name": "c",
        "value": 0.1537
    },
    ...
]
and, given a search_name, I want to return the rows where that name is among the top x ranked names.
I know I can access the fields with something like:
res = session.query(Foo).filter(Foo.data['name'] == search_name)
but how do I order the JSON and extract the top names from that?
A SQLAlchemy solution would be preferred, but a plain SQL one that I can run as a raw query is also fine.
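For the plain-SQL route, a minimal sketch (assuming the ranking is by the "value" key in descending order, with x = 3 and a :search_name bind parameter as illustrative placeholders):

SELECT f.*
FROM foo f
WHERE :search_name IN (
    -- unpack this row's JSONB array and keep the 3 highest-valued names
    SELECT elem ->> 'name'
    FROM jsonb_array_elements(f.data) AS elem
    ORDER BY (elem ->> 'value')::numeric DESC
    LIMIT 3
);

The same statement can be executed from SQLAlchemy as raw SQL via session.execute() with a bound parameter.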

Related

How to add a new property to an object nested in 2 arrays (JSONB, PostgreSQL)

I am looking for help with adding a property to a JSON object nested in 2 arrays.
Example table:
CREATE TABLE events (
    seq_id BIGINT PRIMARY KEY,
    data JSONB NOT NULL,
    evt_type TEXT NOT NULL
);
An example of the JSONB data column in my table:
{
    "Id": "1",
    "Calendar": {
        "Entries": [
            {
                "Id": 1,
                "SubEntries": [
                    {
                        "ExtId": {
                            "Id": "10",
                            "System": "SampleSys"
                        },
                        "Country": "FR",
                        "Details": [
                            {
                                "ItemId": "1",
                                "Quantity": 10
                            },
                            {
                                "ItemId": "2",
                                "Quantity": 3
                            }
                        ],
                        "RequestId": "222",
                        "TypeId": 1
                    }
                ],
                "OrderResult": null
            }
        ],
        "OtherThingsArray": []
    }
}
So I need to add new properties to a SubEntries object based on the Id value of its ExtId object (the WHERE clause).
How can I do that, please?
Thanks a lot
You can use jsonb_set() for this, which takes the target path as a text[] (array of text values):

SELECT jsonb_set(
    input_jsonb,           -- the starting jsonb document
    path_array,            -- '{i,j,k[, ...]}'::text[], where the series {i, j, k}
                           -- progresses at each level with either the (string) key
                           -- or the (integer, zero-based) index denoting the new
                           -- key (or index) to populate
    new_jsonb_value,       -- if adding a key-value pair, you can use something like
                           -- to_jsonb('new_value_string'::text) here to force things
                           -- to format correctly
    create_if_not_exists   -- boolean; if adding new keys/indexes, give this as true
                           -- so they'll be appended; otherwise you'll be limited to
                           -- overwriting existing keys
)
Example
json
{
    "array1": [
        {
            "id": 1,
            "detail": "test"
        }
    ]
}
SQL
SELECT jsonb_set(
    '{"array1": [{"id": 1, "detail": "test"}]}'::jsonb,
    '{array1,0,upd}'::TEXT[],
    to_jsonb('new'::text),
    true
)
Output
{
    "array1": [
        {
            "id": 1,
            "upd": "new",
            "detail": "test"
        }
    ]
}
Note that you can only add 1 nominal level of depth at a time (i.e. either a new key or a new index entry), but you can circumvent this by providing the full depth in the assignment value, or by using jsonb_set() iteratively:
select
    jsonb_set(
        jsonb_set(
            '{"array1": [{"id": 1, "detail": "test"}]}'::jsonb,
            '{array1,0,upd}'::TEXT[],
            '[{"new": "val"}]'::jsonb,
            true
        ),
        '{array1,0,upd,0,check}'::TEXT[],
        '"test_val"',
        true
    )
would be required to produce
{
    "array1": [
        {
            "id": 1,
            "upd": [
                {
                    "new": "val",
                    "check": "test_val"
                }
            ],
            "detail": "test"
        }
    ]
}
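The single-call alternative (supplying the full depth in the assignment value, as mentioned above) would be:

select
    jsonb_set(
        '{"array1": [{"id": 1, "detail": "test"}]}'::jsonb,
        '{array1,0,upd}'::TEXT[],
        '[{"new": "val", "check": "test_val"}]'::jsonb,  -- full nested structure at once
        true
    )

which produces the same output.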
If you need other, more complex logic to evaluate which values need to be added to which objects, you can try:
- dynamically creating a set of jsonb_set() statements for execution
- using the outputs from queries of jsonb_each() and jsonb_array_elements() to evaluate the row logic down at the SubEntries level, and then using jsonb_object_agg() and jsonb_agg() as appropriate to build the document back up to the root level from the resultant object-arrays and key-value collections (see the sketch below)
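A rough sketch of that second approach (not part of the original answer; it assumes every entry carries a SubEntries array, and uses a hypothetical new key "NewProp"):

UPDATE events e
SET data = jsonb_set(
    e.data,
    '{Calendar,Entries}',
    (
        -- rebuild the Entries array, patching each entry's SubEntries
        SELECT jsonb_agg(
            jsonb_set(
                entry,
                '{SubEntries}',
                (
                    -- add the new key only where ExtId.Id matches
                    SELECT jsonb_agg(
                        CASE WHEN sub #>> '{ExtId,Id}' = '10'
                             THEN sub || '{"NewProp": true}'::jsonb
                             ELSE sub
                        END
                    )
                    FROM jsonb_array_elements(entry -> 'SubEntries') AS sub
                )
            )
        )
        FROM jsonb_array_elements(e.data #> '{Calendar,Entries}') AS entry
    )
)
WHERE e.data #> '{Calendar,Entries}' IS NOT NULL;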

jsonb searching for keys in array and returning position

I have the following JSON object stored in a jsonb column:
{
    "msrp": 6000,
    "data": [
        {
            "supplier": "a",
            "price": 5775
        },
        {
            "supplier": "b",
            "price": 6129
        },
        {
            "supplier": "c",
            "price": 5224
        },
        {
            "supplier": "d",
            "price": 5775
        }
    ]
}
There are a few things I'm trying to do but am completely stuck on :(
Check if a supplier exists inside this array, e.g. looking up whether "supplier": "e" is in here. Here's what I tried, but it didn't work: where data #> '{"supplier": "e"}'
(optional but really nice to have) Before returning results from a select *, inject into each array element a "price_diff" so that I can see the difference between msrp and the supplier price, like this:
{
    "supplier": "d",
    "price": 5775,
    "price_diff": 225
}
where data #> '{"supplier": "e"}'
Do you have a column named data? You can't just treat a JSONB key name as if it were a column name.
Containment starts from the root.
colname #> '{"data":[{"supplier": "e"}]}'
You can redefine the 'root' dynamically though:
colname->'data' #> '[{"supplier": "e"}]'
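For the optional price_diff part (not covered above), a rough sketch, assuming the table is named tbl and the JSONB column colname:

SELECT
    t.*,
    (
        -- rebuild the data array, appending a computed price_diff to each element
        SELECT jsonb_agg(
            elem || jsonb_build_object(
                'price_diff',
                (t.colname ->> 'msrp')::numeric - (elem ->> 'price')::numeric
            )
        )
        FROM jsonb_array_elements(t.colname -> 'data') AS elem
    ) AS data_with_diff
FROM tbl t;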

On mongoimport, can I easily map a field to _id on json import?

I need to import a dataset that looks like the following (abbreviated) sample.
[
    {
        "itemId": 1,
        "name": "Item",
        "qty": "10"
    },
    ...
]
The tricky part is that repeating the import will not raise any exception, even though the itemId field would be a valid identifier, equivalent to the _id field, if it could be accepted as such.
Does any option akin to --idField itemId exist?

How to do PostgreSQL '||' operator with knex.update

I am working with the following data in table my_table:
[
    {
        "item": {
            "id": 1,
            "data": {
                "name": "ABC",
                "status": "Active"
            }
        }
    },
    {
        "item": {
            "id": 2,
            "data": {
                "name": "DEF",
                "status": "Active"
            }
        }
    }
]
I would like to update the name property of data, keeping the rest of data intact. A PostgreSQL query for that purpose would look like this:
UPDATE my_table SET data = data || '{"name":"GHI"}' WHERE id = 1;
However, I am struggling to achieve this with knex, as I've tried:
knex('my_table')
    .update({ data: knex.raw('data || ?', [{ name: 'GHI' }]) })
    .where('id', 1);
and many other similar queries, but in vain. If you have any ideas about this, please share them below. Thanks in advance!
Using knex you indeed have to use raw expressions to update a single field inside a JSONB column.
However, in objection.js, an ORM built on top of knex, there is some additional support for JSONB operations.
With objection.js your update would look like
await MyTableModel.query(knex).update({ 'data:name': 'GHI' }).where('id', 1);
Which outputs SQL like this (with bindings ["GHI", 1]):
update "my_table" set "data" = jsonb_set("data", '{name}', ?, true) where "id" = ?
Runkit example https://runkit.com/embed/0dai0bybplxv

Returning set of records

I have a table that stores records broken into fields, i.e. if I have this record:
{
    "name": "John Doe",
    "gender": "male"
}
Then this record is stored in the table in 2 rows, as follows
[
    { "id": "1", "col": "name", "value": "John Doe" },
    { "id": "1", "col": "gender", "value": "male" }
]
If I want to write a postgresql function, that returns the record back to the original form (all attributes in a row form, instead of the one attribute one row form), how can I do it?
(the table design was done as an experiment for data warehousing purposes)
If I understand you right, you could write something like this:
select a.value, b.value
from table1 a
inner join table1 b on a.id = b.id and a.orderNum < b.orderNum
To make it simple you could introduce a field orderNum.
P.S. I do not have Postgres installed, so you may need to fix my query.
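A more conventional sketch (not from the answer above; it assumes the id/col/value columns from the question and the table name table1) pivots the attribute rows back into one row per record with conditional aggregation:

SELECT
    id,
    MAX(value) FILTER (WHERE col = 'name')   AS name,
    MAX(value) FILTER (WHERE col = 'gender') AS gender
FROM table1
GROUP BY id;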