Good day,
I've found a couple of solutions on S.O. for finding rows whose JSON column contains a specified value.
The issue I'm currently facing is that my JSON column (session_data) contains a nested array of objects, with one or several entries:
{
"lastMessages": [
{
"eventId": "1",
"replyDate": "2022-11-23T05:47:18.577Z",
"replyPreview": "response-text-a"
},
{
"eventId": "2",
"replyDate": "2022-11-23T05:48:14.550Z",
"replyPreview": "response-text-b"
},
{
"eventId": "3",
"replyDate": "2022-11-23T06:23:53.234Z",
"replyPreview": "response-text-c"
},
{
"eventId": "4",
"replyDate": "2022-11-23T06:24:13.555Z",
"replyPreview": "response-text-d"
},
{
"eventId": "5",
"replyDate": "2022-11-23T06:24:30.919Z",
"replyPreview": "response-text-z"
}
],
"workflows": {},
"slots": {}
}
How would I go about retrieving all rows from the table where any element of the lastMessages array has a replyPreview of response-text-z?
I've tried the following:
SELECT * FROM dialog_sessions WHERE (session_data->'lastMessages')::jsonb ? 'response-text-z' LIMIT 100
but to no avail.
You can use a JSON path expression:
select *
from dialog_sessions
where session_data @@ '$.lastMessages[*].replyPreview == "response-text-z"'
If you are on an older Postgres version (JSON path support requires Postgres 12) you can use the @> containment operator instead:
select *
from dialog_sessions
where session_data -> 'lastMessages' @> '[{"replyPreview": "response-text-z"}]'
DbFiddle using Postgres 11
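If session_data is a plain json column rather than jsonb (the ::jsonb cast in the original attempt suggests it might be), both queries need a cast first. A minimal sketch, assuming that column type:

-- sketch: cast a json column to jsonb before applying the jsonpath operator
select *
from dialog_sessions
where session_data::jsonb @@ '$.lastMessages[*].replyPreview == "response-text-z"';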
Let's say I have the following JSON
{
"id": 1,
"sets": [
{
"values": [
{
"value": 1
},
{
"value": 2
}
]
},
{
"values": [
{
"value": 5
},
{
"value": 6
}
]
}
]
}
If the table name is X, I expect the query
SELECT x.id, v.value
FROM X as x,
x.sets as sets,
sets.values as v
to give me
id, value
1, 1
1, 2
1, 5
1, 6
and it does work if both sets and values have one object each. When there are more, the query fails with column 'id' had 0 remaining values but expected 2. It seems to me I'm not iterating over "sets" properly?
So my question is: what's the proper way to query data structured like my example above in Redshift (using PartiQL)?
I have the following JSON response and I'm trying to parse it in DB2 v11.
The problem is that I'm not able to do so.
The values 3e00a201d9d1b89732bf8c7a00aa7477ac7212354172ad7780e5296803ad6bbb and 4e00a201d9d1b89732bf8c7a00aa7477ac7212354172ad7780e5296803ad62cd
are IDs, and the response can contain up to 60 of them.
To select them, I would need some kind of tag for those IDs.
Is there any way to do this? Thank you.
{
"3e00a201d9d1b89732bf8c7a00aa7477ac7212354172ad7780e5296803ad62cd": {
"found": true,
"signature": "FhNzQ3N2FjNzIxMjM1NDE3MmFkNzc4MGU1Mjk2OD111111111",
"sectors": [
"1",
"2"
]
},
"4e00a201d9d1b89732bf8c7a00aa7477ac7212354172ad7780e5296803ad62cd": {
"found": false,
"signature": "FhNzQ3N2FjNzIxMjM1NDE3MmFkNzc4MGU1Mjk2O2222222",
"sectors": []
}
}
I tried this:
INSERT INTO tablejson
SELECT '22222',
SYSTOOLS.JSON2BSON('
{
"3e00a201d9d1b89732bf8c7a00aa7477ac7212354172ad7780e5296803ad62cd": {
"found": true,
"signature": "MTY0NDI0MDA5NTUxOTozZTAwYTIwMWQ5ZDFiODk3MzJiZjhjN2EwMGFhNzQ3N2FjNzIxMjM1NDE3MmFkNzc4MGU1Mjk2ODAzYWQ2MmNkOjEsMiwzLDQsNSw2LDcsODpmOGU4YmNlN2I3OGZhZWY3NzVlNDNjM2ZhYzZjNWMzZGRkYzgyMzAzNjI5ZDhjYTc2MDFiODIzYTc0MDRjZWNl",
"sectors": [
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8"
]
},
"4e00a201d9d1b89732bf8c7a00aa7477ac7212354172ad7780e5296803ad62cd": {
"found": false,
"signature": "MTY0NDI0MDA5NTUxOTo0ZTAwYTIwMWQ5ZDFiODk3MzJiZjhjN2EwMGFhNzQ3N2FjNzIxMjM1NDE3MmFkNzc4MGU1Mjk2ODAzYWQ2MmNkOjphNTk2OTk1YjQ0ZTVmNGM4YTdiMGMxN2MzMzgyMmQyMzZkNDc2YTcyODA4ZTMyM2YxODI2Y2E5NWZjNjU2MWE0",
"sectors": []
}
}
')
FROM SYSIBM.SYSDUMMY1;

-- check that the stored document round-trips back to JSON
SELECT ID, SYSTOOLS.BSON2JSON(jsonfield) AS JSON_INFO FROM tablejson;

-- try to read every "signature" through a wildcard path (this is what does not work)
SELECT JSON_VAL(jsonfield, '*.signature', 's:1000') FROM tablejson;
What does work is writing the ID directly, but that doesn't help much since the query needs to be generic.
SELECT JSON_VAL(jsonfield, '4e00a201d9d1b89732bf8c7a00aa7477ac7212354172ad7780e5296803ad62cd.found' ,'s:1000')
FROM tablejson
As output, I would like something like these 3 fields:
IDs                      found   signature
3e00a201d9d1b89732bf8... true    FhNzQ3N2FjNzIxMjM1ND...
4e00a201d9d1b89732bf8... false   FhNzQ3N2FjNzIxMjM1ND...
Thank you.
I have the following JSON:
{
"availability": [{
"qty": 25,
"price": 28990,
"is_available": true
},
{
"qty": 72,
"price": 28990,
"is_available": true
}
]
}
To find the value (or node) price = 28990 in the first array item with full text search, I use this:
select *
from product
where to_tsvector(product.data #>> '{availability, 0, price}') @@ to_tsquery('28990')
Nice.
But I need to find it not only in the first array item; I need to find it in all items of the array.
Something like this (pseudocode):
select *
from product
where to_tsvector(product.data #>> '{availability, ... , price}') @@ to_tsquery('28990')
Is it possible?
What about the containment operator @>:
SELECT *
FROM product
WHERE product.data @> '{"availability": [ { "price": 28990 } ] }';
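If these lookups are frequent and data is a jsonb column (an assumption; @> is only defined for jsonb), a GIN index on the column lets the containment query use the index. A minimal sketch:

-- sketch: a GIN index on the jsonb column supports the @> containment operator
CREATE INDEX product_data_gin ON product USING gin (data);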
I have the following JSON object stored in a jsonb column:
{
"msrp": 6000,
"data": [
{
"supplier": "a",
"price": 5775
},
{
"supplier": "b",
"price": 6129
},
{
"supplier": "c",
"price": 5224
},
{
"supplier": "d",
"price": 5775
}
]
}
There are a few things I'm trying to do but am completely stuck on:
Check if a supplier exists inside this array. For example, I want to know whether "supplier": "e" is in here. Here's what I tried, but it didn't work: "where data #> '{"supplier": "e"}'"
(optional but really nice to have) Before returning results from a select *, inject into each array element a "price_diff" so that I can see the difference between msrp and the supplier price, like this:
{
"supplier": "d",
"price": 5775,
"price_diff": 225
}
where data #> '{"supplier": "e"}'
Do you have a column named data? You can't just treat a JSONB key name as if it were a column name.
Containment starts from the root.
colname @> '{"data":[{"supplier": "e"}]}'
You can redefine the 'root' dynamically though:
colname->'data' @> '[{"supplier": "e"}]'
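For the optional price_diff part, one possible approach is to expand the data array, merge a computed field into each element, and aggregate the elements back. A sketch only; the table name products and column name colname are assumptions:

-- sketch: add a computed "price_diff" to every element of the "data" array
-- (table name "products" and column name "colname" are assumptions)
select jsonb_set(
         p.colname,
         '{data}',
         (select jsonb_agg(elem || jsonb_build_object(
                   'price_diff', (p.colname->>'msrp')::numeric - (elem->>'price')::numeric))
          from jsonb_array_elements(p.colname->'data') as elem)
       ) as with_price_diff
from products p;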
Given a jsonb value and a set of keys, how can I get a new jsonb containing only the required keys?
I've tried extracting the key/value pairs into a text[] and then using jsonb_object(text[]). That works well, but the problem comes when a key holds an array of JSON objects.
create table my_jsonb_table
(
data_col jsonb
);
insert into my_jsonb_table (data_col) Values ('{
"schemaVersion": "1",
"Id": "20180601550002",
"Domains": [
{
"UID": "29aa2923",
"quantity": 1,
"item": "book",
"DepartmentDomain": {
"type": "paper",
"departId": "10"
},
"PriceDomain": {
"Price": 79.00,
"taxA": 6.500,
"discount": 0
}
},
{
"UID": "bbaa2923",
"quantity": 2,
"item": "pencil",
"DepartmentDomain": {
"type": "wood",
"departId": "11"
},
"PriceDomain": {
"Price": 7.00,
"taxA": 1.5175,
"discount": 1
}
}
],
"finalPrice": {
"totalTax": 13.50,
"total": 85.0
},
"MetaData": {
"shopId": "1405596346",
"locId": "95014",
"countryId": "USA",
"regId": "255",
"Date": "20180601"
}
}
')
This is what I am trying to achieve:
SELECT some_magic_fun(data_col,'Id,Domains.UID,Domains.DepartmentDomain.departId,finalPrice.total')::jsonb FROM my_jsonb_table;
I am trying to create that magic function which extracts the given keys into a jsonb value. As of now I am able to extract scalar items, put them in a text[] and use jsonb_object, but I don't know how to extract all elements of an array.
expected output :
{
"Id": "20180601550002",
"Domains": [
{
"UID": "29aa2923",
"DepartmentDomain": {
"departId": "10"
}
},
{
"UID": "bbaa2923",
"DepartmentDomain": {
"departId": "11"
}
}
],
"finalPrice": {
"total": 85.0
}
}
I don't know of any magic. You have to rebuild it yourself.
select jsonb_build_object(
-- Straightforward
'Id', data_col->'Id',
'Domains', (
-- Aggregate all the "rows" back together into an array.
select jsonb_agg(
-- Turn each array element into a new object
jsonb_build_object(
'UID', domain->'UID',
'DepartmentDomain', jsonb_build_object(
'departId', domain#>'{DepartmentDomain,departId}'
)
)
)
-- Turn each element of the Domains array into a row
from jsonb_array_elements( data_col->'Domains' ) d(domain)
),
-- Also pretty straightforward
'finalPrice', jsonb_build_object(
'total', data_col#>'{finalPrice,total}'
)
) from my_jsonb_table;
This probably is not a good use of a JSON column. Your data is relational and would better fit traditional relational tables.
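As a rough illustration of that last point, the same document could be stored in ordinary tables instead (a sketch only; table and column names and the chosen types are assumptions):

-- sketch: a relational layout for the same data (names and types are assumptions)
create table orders (
    id         bigint primary key,   -- "Id"
    total      numeric,              -- finalPrice.total
    total_tax  numeric               -- finalPrice.totalTax
);

create table order_domains (
    order_id   bigint references orders(id),
    uid        text,                 -- "UID"
    quantity   int,
    item       text,
    depart_id  text,                 -- DepartmentDomain.departId
    price      numeric,              -- PriceDomain.Price
    tax_a      numeric,
    discount   numeric
);

Selecting just the columns you need would then replace the key-extraction problem with an ordinary column list.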