Hanging query with JSON type - DB2

I'm running a complex JSON SELECT query in DB2, and sometimes the query hangs.
It is always the same query, and the number of records processed doesn't seem to matter.
Has anyone run into this kind of problem before? Any solutions?
Otherwise it seems to me to be a DB2 bug with JSON.
I'm not sure whether I need to change the DDL of this table to make this work, or change some database parameters.
The structure of the query is like this:
select json_object(
        'pppp' value json_object(
        ) format json,
        'uuuu' value json_object(
            'jjjj' value json_array(
                json_object('type' value 'jjjj', 'value' value jjjjjj) format json
            ) format json
        ) format json,
        'yyy' value json_object(
            'xxx' value json_object(
                format json absent on null
            ) format json
        ) format json
        absent on null
    ) as JsonID
from TABLE
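One way to narrow this down (not a fix, just a diagnostic sketch reusing the obfuscated names from the query above) is to strip the query back to a single nested branch and re-add the other JSON_OBJECT/JSON_ARRAY branches one at a time, to see which nesting level triggers the hang:

```sql
-- Diagnostic sketch: same shape as the middle branch of the full query.
-- If this minimal form also hangs, the problem is not in the deep nesting;
-- if it runs, re-add one JSON_OBJECT branch at a time.
select json_object(
           'uuuu' value json_object(
               'jjjj' value json_array(
                   json_object('type' value 'jjjj', 'value' value jjjjjj) format json
               ) format json
           ) format json
           absent on null
       ) as JsonID
from TABLE
fetch first 10 rows only
```

The FETCH FIRST clause also bounds the result set, which helps separate a query-compilation problem from a problem that only appears while streaming rows.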
Thank You -

Related

How can I compare json field with a string value in postgresql?

I have a json-typed field payload saved in a PostgreSQL table. It has a nested field subcategory, which is a string. Below is the output of this value:
=> select payload->'subcategory' from "Merchant";
?column?
-------------------------------
"Food"
null
"AUTOMOTIVE"
null
"MEDICAL"
null
null
"CLUB"
"Petrol Stations"
However, I can't use this field in the WHERE clause. The query below returns 0 rows, even though the output above shows there are rows whose value is "CLUB". What is the right way to use a json field in a WHERE clause?
=> select count(*) from "Merchant" where ("payload"->'subcategory')::text = 'CLUB';
count
-------
0
Figured out what's wrong: I need to use ->> in the WHERE clause, as in "payload"->>'subcategory'.
That's because ->> extracts the value as text, while -> returns it as json.
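With that change, the count query from above becomes:

```sql
select count(*)
from "Merchant"
where "payload" ->> 'subcategory' = 'CLUB';
```

The earlier ::text cast failed because casting the json value to text keeps the JSON quoting, yielding the string "CLUB" (quotes included), which never equals CLUB.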
An alternative solution is to use the JSONB containment operator @> (containment is only defined for jsonb, hence the cast):
select count(*)
from "Merchant"
where payload::jsonb @> '{"subcategory": "CLUB"}';

How to check JSONB has a field in root?

I tried this:
select * from "User" where "partnerData" -> 'name' != NULL
partnerData is a JSONB column. I would like to see the rows that do not have the name field in the JSON.
You can't use <> (or !=, or any other operator) to check for NULL values; you need IS NULL. Also, -> returns a jsonb value, which might be the JSON literal null rather than the SQL NULL value. So you should use ->>, which returns a text value (and therefore a SQL NULL when the key is missing):
select *
from "User"
where "partnerData" ->> 'name' IS NULL
Note that this doesn't distinguish between a JSON value that contains the key name with a value of null and a JSON value that does not contain the key at all.
If you only want to check if the key exists (regardless of the value - even if it's a JSON null), use the ? operator.
where "partnerData" ? 'name'
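To make the distinction concrete, here is a sketch (sample values assumed, not from the question) of how the three possible cases behave:

```sql
-- Assumed sample "partnerData" values, one per row:
--   {"name": "a"}    {"name": null}    {}
--
--   "partnerData" -> 'name'    "a"      null (jsonb)    SQL NULL
--   "partnerData" ->> 'name'   a        SQL NULL        SQL NULL
--   "partnerData" ? 'name'     true     true            false
select "partnerData" ? 'name'     as has_key,
       "partnerData" ->> 'name'   as text_value
from "User";
```

So ->> IS NULL matches both the missing-key and JSON-null rows, while ? matches exactly the rows where the key exists.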

Postgres - Remove NULL values from a json array

I'm building a JSON ARRAY from a table which has a JSON column and non-JSON columns.
Here is my sample data and query:
create table check_details(SHORT_NAME VARCHAR, UNIQUE_NO JSON,STATUS VARCHAR);
insert into check_details values('Anu','{"ID":"1e699-76af2"}','REJECTED');
select json_agg(json_strip_nulls(
    json_build_object(
        'Name', SHORT_NAME,
        'IDS', jsonb_build_array(
            case sign(position('ACCEPTED' in STATUS))
                when 1 then UNIQUE_NO::jsonb ->> 'ID'
                else json_typeof(NULL::json)
            end
        )
    )
))
from check_details;
I am getting this result:
[{"Name":"Anu","IDS":[null]}]
But I do not want to get "IDS":[null] part in my result when the value of the key IDS is NULL.
How can I achieve this result:
[{"Name":"Anu"}]
When IDS has a valid value, it has to be an array. Hence using jsonb_build_array.
This is because you are placing the result of your CASE expression inside a JSON array, so the result is a non-empty array containing a JSON null rather than a null JSON value.
So you need to stop it being an array if you want the NULL to be stripped:
SELECT
json_strip_nulls(
json_build_object(
'Name',
short_name,
'IDS',
CASE SIGN(position('ACCEPTED' IN status) )
WHEN 1 THEN (unique_no::jsonb->>'ID')::text
ELSE NULL
END
)
)
FROM
check_details;
json_strip_nulls
------------------
{"Name":"Anu"}
(1 row)
Note that json_strip_nulls() doesn't remove null values from JSON arrays.
Edit:
But as you require non-null values to show as an array, move the jsonb_build_array() call into the CASE expression:
SELECT
json_strip_nulls(
json_build_object(
'Name',
short_name,
'IDS',
CASE SIGN(position('ACCEPTED' IN status) )
WHEN 1 THEN jsonb_build_array((unique_no::jsonb->>'ID')::text)
ELSE NULL
END
)
)
FROM
check_details;

How to lower-case all the elements of a JSONB array of strings of each row in a table

I have a table with a field called "data" which is of JSONB type. The content of "data" is an object with one of the fields called "associated_emails", which is an array of strings.
I need to update the existing table so that the content of "associated_emails" is all lower-case. How can I achieve that? This is my attempt so far (it triggers the error ERROR: cannot extract elements from a scalar):
update mytable my
set
"data" = safe_jsonb_set(
my."data",
'{associated_emails}',
to_jsonb(
lower(
(
SELECT array_agg(x) FROM jsonb_array_elements_text(
coalesce(
my."data"->'associated_emails',
'{}'::jsonb
)
) t(x)
)::text[]::text
)::text[]
)
)
where
my.mytype = 'something';
You can use jsonb_set and UPDATE the column with something like the statement given below:
UPDATE jsonb_test
SET data = jsonb_set(data, '{associated_emails}',
    lower(data ->> 'associated_emails')::jsonb);
Extracting the array as text, lowercasing it as a whole, and casting it back to jsonb is safe here because the array contains only strings.
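A per-element alternative (a sketch, reusing the question's mytable/mytype names) rebuilds the array explicitly instead of lowercasing its text representation:

```sql
update mytable my
set "data" = jsonb_set(
    my."data",
    '{associated_emails}',
    (
        -- unnest the array, lowercase each element, re-aggregate
        select coalesce(jsonb_agg(lower(x)), '[]'::jsonb)
        from jsonb_array_elements_text(my."data" -> 'associated_emails') t(x)
    )
)
where my.mytype = 'something'
  and my."data" ? 'associated_emails';
```

The ? guard leaves rows without the key untouched, which also avoids the "cannot extract elements from a scalar" class of errors on irregular rows.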

How to select values in Postgresql that don't match json type?

I have a text column "ЕИУ". Some rows contain json values, but some can be NULL or not json (plain text, for example). I would like to extract all values that are NOT json values.
Looking at your sample data, you can simply get this using NOT LIKE
select * from your_table where column_name not like '{%}'
Otherwise, you can create a function that tries casting the value to JSON and returns true or false based on whether the cast succeeds.
select * from your_table where your_json_function(your_column);
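A sketch of such a validation function (the name is_valid_json is an assumption), catching the cast error in PL/pgSQL:

```sql
create or replace function is_valid_json(p text)
returns boolean
language plpgsql
immutable
as $$
begin
    if p is null then
        return false;        -- treat NULL as "not json"
    end if;
    perform p::json;         -- throws if p is not valid JSON
    return true;
exception
    when others then
        return false;
end;
$$;

-- All values that are NOT json:
select * from your_table where not is_valid_json("ЕИУ");
```

Note that catching exceptions per row is relatively slow, so on a large table the NOT LIKE pre-filter above may be worth combining with this check.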