I'm having some difficulty using the array_length function in psql.
I have a JSON object that looks like this when I call a function named test_function:
{
"outer": [
{
"keys": {
"id": 5
},
"name": "Joe Bloggs",
"age": "16",
"new_rels": [
"a6h922ao-621y-230p-52bk-t6i84rr3vo6g"
],
"old_rels": [
"9c8b67bf-871e-4004-88be-9a68dae3a86f",
"e6a15929-4aab-4af6-903a-8f8c09bef572"
],
"s_id": 1
}
],
"total": 0,
}
I am trying to get the length of new_rels and old_rels but having some difficulty, possibly due to it being an array of strings.
I have tried this:
select array_length(r->'updates'->0->>'new_rels',1)::bigint from test_function(1) r
But I am getting the following error:
No function matches the given name and argument types. You might need to add explicit type casts
I've even tried simplifying it to something like this, but it doesn't work with the double quotes; if I manually change them to single quotes it does work:
select array_length('["90faa4b9-23fe-4bde-81e7-4326e7356cde", "d642157c-8a55-44de-ac88-ddaa3ab02bb0"]',1);
You want to use the jsonb_array_length() function for jsonb data. The array_length() function is for native arrays such as text[].
with invars as (
select '{
"outer": [
{
"keys": {
"id": 5
},
"name": "Joe Bloggs",
"age": "16",
"new_rels": [
"a6h922ao-621y-230p-52bk-t6i84rr3vo6g"
],
"old_rels": [
"9c8b67bf-871e-4004-88be-9a68dae3a86f",
"e6a15929-4aab-4af6-903a-8f8c09bef572"
],
"s_id": 1
}
],
"total": 0
}'::jsonb as r
)
select jsonb_array_length(r->'outer'->0->'new_rels'),
jsonb_array_length(r->'outer'->0->'old_rels')
from invars;
jsonb_array_length | jsonb_array_length
--------------------+--------------------
1 | 2
(1 row)
Also, you had an extra comma after the total key.
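Applied to your original call, a minimal sketch (assuming test_function(1) returns a single jsonb value; if it returns json, use json_array_length() instead) keeps the value as jsonb with -> rather than extracting text with ->>:
-- -> keeps the result as jsonb; ->> returns text, which jsonb_array_length() cannot accept
select jsonb_array_length(r->'outer'->0->'new_rels') as new_rels_len,
       jsonb_array_length(r->'outer'->0->'old_rels') as old_rels_len
from test_function(1) r;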
In Azure Data Factory, I need to be able to process a JSON response. I don't want to hardcode the array position in case they change, so something like this is out of the question:
@activity('Place Details').output.result.components[2].name
How can I get the name "123" where types = "number", given a JSON array like the one below:
"result": {
"components": [
{
"name": "ABC",
"types": [
"alphabet"
]
},
{
"name": "123",
"types": [
"number"
]
}
]
}
One example using the OPENJSON method:
DECLARE @json NVARCHAR(MAX) = '{
"result": {
"components": [
{
"name": "ABC",
"types": [
"alphabet"
]
},
{
"name": "123",
"types": [
"number"
]
}
]
}
}'
;WITH cte AS (
SELECT
JSON_VALUE( o.[value], '$.name' ) [name],
JSON_VALUE( o.[value], '$.types[0]' ) [types]
FROM OPENJSON( @json, '$.result.components' ) o
)
SELECT [name]
FROM cte
WHERE types = 'number'
I will have a look at other methods.
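One such variant checks every element of the types array rather than only the first; a sketch reusing the same @json variable declared above:
SELECT JSON_VALUE( c.[value], '$.name' ) AS [name]
FROM OPENJSON( @json, '$.result.components' ) c
CROSS APPLY OPENJSON( c.[value], '$.types' ) t  -- one row per entry in "types"
WHERE t.[value] = 'number';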
I have a jsonb object like this:
{
"applicants": [
{
"last_name": "ss",
"first_name": "ss",
"age": 31
},
{
"last_name": "kk",
"first_name": "kk",
"age": 32
}
]
}
I want to convert it to:
{
"applicants": [
{
"last_name": "ss",
"data": {
"first_name": "ss",
"age": 31
}
},
{
"last_name": "kk",
"data": {
"first_name": "kk",
"age": 32
}
}
]
}
I have done a similar thing using jsonb_array_elements and jsonb_build_object before, but I can't figure out how I would create a new data object inside each object, and transpose the fields into it.
Is it possible to write this as a plain SQL query?
Thanks.
I have to point out that Postgres is not the best tool for modifying JSON data structures, and if you feel the need to do so, it probably means that your solution is in general not optimal. While Postgres has the necessary features to do this, I wouldn't want to maintain code that contains queries like the following.
update my_table set
json_col = (
select
jsonb_build_object(
'applicants',
jsonb_agg(
elem - 'first_name' - 'age' || jsonb_build_object(
'data',
jsonb_build_object(
'first_name',
elem->'first_name',
'age',
elem->'age'
)
)
)
)
from jsonb_array_elements(json_col->'applicants') as arr(elem)
)
Test it in db<>fiddle.
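If you want to preview the reshaping before running the update, the same expression can be tested against a literal; a minimal sketch (my_table/json_col are the names assumed in the query above):
select jsonb_build_object(
    'applicants',
    jsonb_agg(
        -- drop the moved keys, then append the new "data" object
        (elem - 'first_name' - 'age') || jsonb_build_object(
            'data',
            jsonb_build_object('first_name', elem->'first_name', 'age', elem->'age')
        )
    )
) as json_col
from jsonb_array_elements(
    '{"applicants": [
        {"last_name": "ss", "first_name": "ss", "age": 31},
        {"last_name": "kk", "first_name": "kk", "age": 32}
    ]}'::jsonb -> 'applicants'
) as arr(elem);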
I'm starting out with the JSONB data type and I'm hoping someone can help me out.
I have a table (properties) with two columns (id as primary key and data as jsonb).
The data structure is:
{
"ProductType": "ABC",
"ProductName": "XYZ",
"attributes": [
{
"name": "Color",
"type": "STRING",
"value": "Silver"
},
{
"name": "Case",
"type": "STRING",
"value": "Shells"
},
...
]
}
I would like to update the value of a specific attributes element, selected by name, for a row with a given id. For example, for the element with "name" = "Case", change the value to "Glass", so it ends up like:
{
"ProductType": "ABC",
"ProductName": "XYZ",
"attributes": [
{
"name": "Color",
"type": "STRING",
"value": "Silver"
},
{
"name": "Case",
"type": "STRING",
"value": "Glass"
},
...
]
}
Is this possible with this structure using SQL?
I have created the table structure if any of you would like to give it a shot.
dbfiddle
Use the jsonb concatenation operator, ||, to replace keys on the fly:
WITH properties (id, data) AS (
values
(1, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Silver"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb),
(2, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Red"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb)
)
SELECT id,
data||
jsonb_build_object(
'attributes',
jsonb_agg(
case
when attribs->>'name' = 'Case' then attribs||'{"value": "Glass"}'::jsonb
else attribs
end
)
) as data
FROM properties m
CROSS JOIN LATERAL JSONB_ARRAY_ELEMENTS(data->'attributes') as a(attribs)
GROUP BY id, data
Updated fiddle
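To persist the change rather than just select it, the same expression can drive an UPDATE. A sketch, assuming the real table is named properties with id and data columns as described in the question:
UPDATE properties p
SET data = p.data || jsonb_build_object(
        'attributes',
        (SELECT jsonb_agg(
                    CASE
                        WHEN a.attribs->>'name' = 'Case' THEN a.attribs || '{"value": "Glass"}'::jsonb
                        ELSE a.attribs
                    END)
           FROM jsonb_array_elements(p.data->'attributes') AS a(attribs))
    )
WHERE p.id = 1;  -- the row you want to change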
I have a JSONB column in my PostgreSQL database. The data looks like this:
{
"cars":
[
{
"id": 1,
"brand": "BMW"
"parts":
[
{
"partId": 5,
"type": "battery"
}
]
},
{
"id": 2,
"brand": "Mercedes"
"parts":
[
{
"partId": 5,
"type": "battery"
},
{
"partId": 6,
"type": "engine"
}
]
}
]
}
Is there any way that I can search for all cars that have a part with type "battery"? How can I search inside of cars array and then inside of the parts array of each car element?
Since your question doesn't make clear what output you want, I am assuming you want the id and brand name in the output.
Try this:
select distinct x.y->>'id', x.y->>'brand'
from test
cross join lateral jsonb_array_elements(data->'cars') x(y)
cross join lateral jsonb_array_elements(x.y->'parts') a(b)
where a.b->>'type'='battery'
DEMO
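An alternative that avoids unnesting the parts array is the jsonb containment operator @>; a sketch against the same assumed table test with a jsonb column data:
select distinct x.y->>'id', x.y->>'brand'
from test
cross join lateral jsonb_array_elements(data->'cars') x(y)
-- @> is true when the parts array contains an object with "type": "battery"
where x.y->'parts' @> '[{"type": "battery"}]'::jsonb;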
I have a table with a jsonb column called jsonb that contains data in the following format.
{
"stuff": [
{
"name": "foo",
"percent": "90.0000"
},
{
"name": "bar",
"percent": "10.0000"
}
],
"countries": [
{
"name": "USA",
"value": "30"
},
{
"name": "Canada",
"value": "25"
},
{
"name": "Mexico",
"value": "20"
},
{
"name": "Ecuador",
"value": "10"
}
]
}
I am having a lot of trouble working with this data. Specifically what I want to do is find all the different values "name" can have in "stuff" as well as in "countries", kind of like a SELECT distinct.
But my problem is that I can't seem to extract anything useful from this jsonb. My approach so far was to do
SELECT jsonb->>'stuff' FROM table, but this only gave me a column of type text which contained [{"name": "foo","percent": "90.0000"},{"name": "bar","percent": "10.0000"}].
But since this is text I can't really do anything with it. I also tried SELECT jsonb_array_elements_text(jsonb) FROM table but that returned the following Error:
ERROR: cannot extract elements from an object
SQL state: 22023
Any help with working with this format of data is greatly appreciated!
You need to unnest each array separately and then create a union on the result of those two steps:
select c.x ->> 'name'
from the_table
cross join lateral jsonb_array_elements(json_column -> 'countries') as c(x)
union
select s.x ->> 'name'
from the_table
cross join lateral jsonb_array_elements(json_column -> 'stuff') as s(x);
Online example: https://rextester.com/ZEOIXF91294
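The same result can also be produced in one pass by iterating over the two array keys; a minimal sketch reusing the the_table and json_column names from the query above:
select distinct e.elem ->> 'name' as name
from the_table
-- the two top-level keys whose arrays we want to scan
cross join (values ('stuff'), ('countries')) as k(arr_key)
cross join lateral jsonb_array_elements(json_column -> k.arr_key) as e(elem);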