Update table based on JSONB column joins - PostgreSQL

I'm trying to update my table, and this is the code that isn't working:
UPDATE A
SET person_id = B.id
FROM B
WHERE A.info.EmpNumber = B.info.EmpNumber
Both info columns are of type jsonb.
A.info looks like this:
{
"position": "Data Engineer",
"EmpNumber": "382159"
}
B.info looks like this:
{
"salary": 80000,
"EmpNumber": "382159"
}
Could you point me in the right direction?
The error is
ERROR: missing FROM-clause entry for table A
LINE 4: WHERE

You can just use the ->> operator to extract the value for the related key, such as:
UPDATE A
SET person_id = B.id
FROM B
WHERE A.info ->> 'EmpNumber' = B.info ->> 'EmpNumber'
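For reference, a minimal self-contained sketch of the above (table and column names are from the question; the person_id type and the sample id value are assumptions):
create table B (id int primary key, info jsonb);
create table A (person_id int, info jsonb);

insert into B (id, info) values (1, '{"salary": 80000, "EmpNumber": "382159"}');
insert into A (info) values ('{"position": "Data Engineer", "EmpNumber": "382159"}');

-- joins the two jsonb columns on the extracted EmpNumber text
update A
set person_id = B.id
from B
where A.info ->> 'EmpNumber' = B.info ->> 'EmpNumber';
-- A.person_id is now 1 for the matching row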


PostgreSQL: null values populate the left table when doing a left join. Why is this happening?

When I make the query:
"select * from asset a where (a.login_id = $1) and (a.status = 'pub')";
I get back the expected result:
{
id: 23f8jfj8gh2,
name: 'Asset1',
status: 'pub',
create_date: 2020-11-12T07:00:00.000Z,
...
}
But when I add a left join onto the query like this:
"select * from asset a left join asset_image i on (a.id = i.asset_id) where (a.login_id = $1) and (a.status = 'pub')";
I get back:
{
id: null, <---- THIS NULL VALUE for the asset id
name: 'Asset1',
status: 'pub',
create_date: 2020-11-12T07:00:00.000Z,
...
asset_id: null,
orig_name: null,
mimetype: null,
created: null
}
I am new to SQL, and I have looked at the docs for left joins, but I can't seem to figure out exactly where this is going wrong. Any help would be much appreciated! Thanks.
If the asset_image table contains its own column named id, then when your query gets no match on (a.id = i.asset_id), that second id column comes back as null and masks the asset's id in your result.
You may need to give each table's id column a more unique name, e.g.:
asset_id to replace id in the asset table
image_id to replace id in the asset_image table
Your problem is probably the * in SELECT *, which is something that you should always avoid in code.
If the table asset_image also has a column named id, your result set will contain two columns named id. You are probably looking at the wrong one.
Using SELECT * gives you no control over the columns you get in the result set, their name and their order. You should use an explicit column list and give columns with the same name in both tables an alias that allows you to disambiguate them.
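For example, a sketch of such an explicit column list (column names are taken from the results shown above, and it assumes asset_image really does have its own id column):
select a.id,
       a.name,
       a.status,
       a.create_date,
       i.id as image_id,   -- aliased so it cannot shadow a.id
       i.orig_name,
       i.mimetype,
       i.created
from asset a
left join asset_image i on i.asset_id = a.id
where a.login_id = $1
  and a.status = 'pub';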

Get nested value in JSONB column POSTGRESQL

I have a table guest_group with a jsonb column custom_fields. I want to query the one ID whose ID_Context is equal to protelIO.
Here is the content of the column:
[
{
"protelSurname":"Smith",
"servicio_tags":[
"protel-info"
],
"protelUniqueID":"[{\"ID\":\"294623726\",\"Type\":\"21\",\"ID_Context\":\"GHA\"},{\"ID\":\"4842148\",\"Type\":\"1\",\"ID_Context\":\"protelIO\"}]",
"protelGivenName":"Seth"
},
{
"value":"test",
"display_name":"Traces",
"servicio_tags":[
"trace"
]
}
]
My try:
SELECT field ->> 'protelUniqueID'
FROM guest_group gg
CROSS JOIN LATERAL jsonb_array_elements(gg.custom_fields) AS field
WHERE field @> '{"servicio_tags": ["protel-info"]}'::jsonb
This gave me:
[{"ID":"294623726","Type":"21","ID_Context":"GHA"},{"ID":"4842148","Type":"1","ID_Context":"protelIO"}]
How can I go the last mile and only get the value of the ID key from the element whose ID_Context is protelIO?
I appreciate your help!
This will get you the desired result I think. Not very pretty perhaps :-)
select *
from (
  select jsonb_array_elements(f)
  from (
    select (field ->> 'protelUniqueID')::jsonb AS f
    from guest_group gg,
         lateral jsonb_array_elements(custom_fields) AS field
    where field @> '{"servicio_tags": ["protel-info"]}'::jsonb
  ) d(f)
) dd(x)
where x ->> 'ID_Context' = 'protelIO';
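A somewhat flatter variant of the same idea, written with explicit lateral steps and column aliases (table and column names as in the question; the aliases f, elem and ids are just illustrative), pulling out only the ID value:
select ids.elem ->> 'ID' as id
from guest_group gg
cross join lateral jsonb_array_elements(gg.custom_fields) as f(elem)
-- protelUniqueID is a string containing JSON, so cast it before expanding
cross join lateral jsonb_array_elements((f.elem ->> 'protelUniqueID')::jsonb) as ids(elem)
where f.elem @> '{"servicio_tags": ["protel-info"]}'::jsonb
  and ids.elem ->> 'ID_Context' = 'protelIO';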

Upserting a postgres jsonb based on multiple properties in jsonb field

I am trying to upsert into a table with a jsonb field, based on multiple JSON properties in that field, using the query below:
insert into testtable(data) values('{
"key": "Key",
"id": "350B79AD",
"value": "Custom"
}')
On conflict(data ->>'key',data ->>'id')
do update set data =data || '{"value":"Custom"}'
WHERE data ->> 'key' ='Key' and data ->> 'appid'='350B79AD'
The above query throws the error below:
ERROR: syntax error at or near "->>"
LINE 8: On conflict(data ->>'key',data ->>'id')
am I missing something obvious here?
I suppose you want the id and key combination to be unique in the table. Then you need a unique constraint (or unique index) for them:
create unique index on testtable ( (data->>'key'), (data->>'id') );
and also wrap each expression in its own parentheses in the ON CONFLICT clause:
on conflict( (data->>'key'), (data->>'id') )
and qualify the jsonb column name (data) with the table name (testtable) wherever it appears after DO UPDATE SET or in the WHERE clause, i.e. testtable.data. So, convert your statement to:
insert into testtable(data) values('{
"key": "Key",
"id": "350B79AD",
"value": "Custom1"
}')
on conflict( (data->>'key'), (data->>'id') )
do update set data = testtable.data || '{"value":"Custom2"}'
where testtable.data ->> 'key' ='Key' and testtable.data ->> 'id'='350B79AD';
Btw, data ->> 'appid' = '350B79AD' was converted to data ->> 'id' = '350B79AD' (appid -> id), since the inserted JSON contains an id key, not appid.
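A minimal end-to-end sketch of the whole flow (it assumes testtable has just the jsonb column data; names and values are taken from the statements above):
create table testtable (data jsonb);
create unique index on testtable ( (data->>'key'), (data->>'id') );

-- first execution inserts the row; a second execution hits the
-- (key, id) conflict and only the "value" key is overwritten
insert into testtable(data)
values ('{"key": "Key", "id": "350B79AD", "value": "Custom1"}')
on conflict ( (data->>'key'), (data->>'id') )
do update set data = testtable.data || '{"value":"Custom2"}'
where testtable.data ->> 'key' = 'Key'
  and testtable.data ->> 'id' = '350B79AD';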

JSON_QUERY: check which rows have a specific value in their JSON list

I have a table in which each row contains a JSON column. Inside the JSON column I have an object containing an array of tags. What I want to do is see which rows in my table have the tag that I am searching for.
Here is an example of my data:
Row 1:
Id :xxx
Jsom Column:
{
"tags":[
{"name":"blue dragon", weight:0.80},
{"name":"Game", weight:0.90}
]
}
Row 2:
Id : yyy
Jsom Column:
{
"tags":[
{"name":"Green dragon", weight:0.70},
{"name":"fantasy", weight:0.80}
]
}
So I want to write a query such that if I search for Green, it returns only row 2, and if I search for dragon, it returns both rows 1 and 2. How can I do that?
I know I can write this to access my array, but beyond that I am clueless :\
I am looking for something like this:
Select * from myTable
where JSON_query([JsonColumn], '$.tags[*].name') like '%dragon%'
Update
My final query looks like this:
select DISTINCT t.id, d.[key], d.value
from #t t
cross apply openjson(doc, '$.tags') as d
where json_value(d.value, '$.name') like '%dragon%'
Something like this:
declare #t table(id int, doc nvarchar(max))
insert into #t(id,doc) values
(1,'
{
"tags":[
{"name":"blue dragon", "weight":"0.80"},
{"name":"Game", "weight":"0.90"}
]
}'),(2,'
{
"tags":[
{"name":"Green dragon", "weight":"0.70"},
{"name":"fantasy", "weight":"0.80"}
]
}')
select t.id, dv.[key], dv.value
from #t t
cross apply openjson(doc,'$.tags') as d
cross apply openjson(d.value) dv
where dv.value like '%dragon%'
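If you only want to match against the name property (rather than every value in each tag object), a sketch using OPENJSON's WITH clause works too (the nvarchar length here is arbitrary):
select t.id, d.name
from #t t
cross apply openjson(t.doc, '$.tags')
     with (name nvarchar(200) '$.name') as d
where d.name like '%dragon%'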

PostgreSQL - jsonb_each

I have just started to play around with jsonb on Postgres and am finding examples hard to find online, as it is a relatively new concept. I am trying to use jsonb_each_text to print out a table of keys and values, but I get CSV-like records in a single column.
I have the JSON below saved as jsonb and am using it to test my queries.
{
"lookup_id": "730fca0c-2984-4d5c-8fab-2a9aa2144534",
"service_type": "XXX",
"metadata": "sampledata2",
"matrix": [
{
"payment_selection": "type",
"offer_currencies": [
{
"currency_code": "EUR",
"value": 1220.42
}
]
}
]
}
I can gain access to the offer_currencies array with:
SELECT element -> 'offer_currencies' -> 0
FROM test t, jsonb_array_elements(t.json -> 'matrix') AS element
WHERE element ->> 'payment_selection' = 'type'
which gives a result of "{"value": 1220.42, "currency_code": "EUR"}", so if I run the query below I get (I have to change " for '):
select * from jsonb_each_text('{"value": 1220.42, "currency_code": "EUR"}')
Key | Value
---------------|----------
"value" | "1220.42"
"currency_code"| "EUR"
So using the above theory I created this query
SELECT jsonb_each_text(data)
FROM (SELECT element -> 'offer_currencies' -> 0 AS data
FROM test t, jsonb_array_elements(t.json -> 'matrix') AS element
WHERE element ->> 'payment_selection' = 'type') AS dummy;
But this prints CSV-like records in one column:
record
---------------------
"(value,1220.42)"
"(currency_code,EUR)"
The primary problem here is that you select the whole row as a column (PostgreSQL allows that). You can fix that with SELECT (jsonb_each_text(data)).* ....
But: don't SELECT set-returning functions; that can often lead to errors (or unexpected results). Instead, use e.g. LATERAL joins/sub-queries:
select first_currency.*
from test t
, jsonb_array_elements(t.json -> 'matrix') element
, jsonb_each_text(element -> 'offer_currencies' -> 0) first_currency
where element ->> 'payment_selection' = 'type'
Note: function calls in the FROM clause are implicit LATERAL joins (here: CROSS JOINs).
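With the sample document above, this should return one row per key of the first offer_currencies entry:
key            | value
---------------|--------
value          | 1220.42
currency_code  | EUR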
WITH testa AS (
  SELECT jsonb_array_elements(t.json -> 'matrix') -> 'offer_currencies' -> 0 AS jsonbcolumn
  FROM test t
)
SELECT d.key, d.value
FROM testa
JOIN jsonb_each_text(testa.jsonbcolumn) d ON true
ORDER BY 1, 2;
testa gets the intermediate jsonb data. Then the lateral join transforms the jsonb data into table format.