array_agg for Array Types - PostgreSQL

I'm trying to get array_agg to work with an array type in PostgreSQL, and I'm having trouble figuring out whether this is possible and, if so, how to do it. The pertinent part of my query looks like this:
array_agg(ARRAY[e.alert_type::text, e.id::text, cast(extract(epoch from e.date_happened) as text)] order by e.date_happened asc, e.id asc)
The error that I'm getting in response is ERROR: could not find array type for data type text[]
Is this possible or should I try to find another approach?
Thanks!

You could write a custom aggregate to handle your specific array of arrays, e.g.:
DROP TABLE IF EXISTS e;
CREATE TABLE e
(
id serial PRIMARY KEY,
alert_type text,
date_happened timestamp with time zone
);
INSERT INTO e(alert_type, date_happened) VALUES
('red', '2011-05-10 10:15:06'),
('yellow', '2011-06-22 20:01:19');
CREATE OR REPLACE FUNCTION array_agg_custom_cut(anyarray)
RETURNS anyarray
AS 'SELECT $1[2:array_length($1, 1)]'
LANGUAGE SQL IMMUTABLE;
DROP AGGREGATE IF EXISTS array_agg_custom(anyarray);
CREATE AGGREGATE array_agg_custom(anyarray)
(
SFUNC = array_cat,
STYPE = anyarray,
FINALFUNC = array_agg_custom_cut,
INITCOND = $${{'', '', ''}}$$
);
Query:
SELECT
array_agg_custom(
ARRAY[
alert_type::text,
id::text,
CAST(extract(epoch FROM date_happened) AS text)
])
FROM e;
Result:
array_agg_custom
--------------------------------------------
{{red,1,1305036906},{yellow,2,1308787279}}
(1 row)
EDIT:
Here is a second, shorter way (you don't need the array_agg_custom_cut function, but as you can see, an additional ARRAY level is needed in the query):
CREATE AGGREGATE array_agg_custom(anyarray)
(
SFUNC = array_cat,
STYPE = anyarray
);
SELECT
array_agg_custom(
ARRAY[
ARRAY[
alert_type::text,
id::text,
CAST(extract(epoch FROM date_happened) AS text)
]
])
FROM e;
Result:
array_agg_custom
--------------------------------------------
{{red,1,1305036906},{yellow,2,1308787279}}
(1 row)
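Note that newer PostgreSQL versions (9.5 and later, if I recall correctly) extended array_agg to accept array inputs directly, so on those versions the original query should work without any custom aggregate, along these lines:
SELECT array_agg(
ARRAY[alert_type::text, id::text,
CAST(extract(epoch FROM date_happened) AS text)]
ORDER BY date_happened, id)
FROM e;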

Or cast the array to text, like array_agg(ARRAY[xxx, yyy]::text):
array_agg(ARRAY[e.alert_type::text, e.id::text,
cast(extract(epoch from e.date_happened) as text)]::text
order by e.date_happened asc, e.id asc)
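With the ::text cast the whole inner array is aggregated as its text representation, so the result is a text[] of array literals rather than a two-dimensional array. For illustration (element values taken from the sample table above), each element can be cast back to an array when needed:
SELECT '{red,1,1305036906}'::text[];
-- yields {red,1,1305036906}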

Related

How to return an array of table records in PostgreSQL

I am getting a type mismatch error when trying to return an array of elements of type table1, the row type of the table table1 I have declared.
Error occurred during SQL query execution
Reason:
SQL Error [42P13]: ERROR: return type mismatch in function declared to return table1[]
Detail: Actual return type is record[].
Where: SQL function "arrayof_records"
This is an oversimplified code that reproduces my problem.
drop table if exists table1 cascade;
create table table1 (
id serial primary key,
title text,
create_dt timestamp default now()
);
insert into table1 (title) values
('one'),
('two'),
('three');
create or replace function arrayof_records ()
returns table1[]
stable language sql as $$
select array_agg (t.*)
from (
select * from table1
order by create_dt desc
) as t;
$$;
It is clear that the parser is expecting some other expression in the array_agg function. I have tried t, t.* and *. All of them fail.
I expect there is a syntax for this, as the PostgreSQL 12 documentation states "array_agg(expression) | any non-array type".
Any idea?
You can use a slightly different way of creating the array:
create or replace function arrayof_records ()
returns table1[]
stable language sql
as
$$
select array(
select table1
from table1
order by create_dt desc
);
$$;
That's typically faster than array_agg() as well.
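A minimal usage sketch, assuming the table and function defined above:
-- the whole result as a single table1[] value
SELECT arrayof_records();
-- expand the array back into rows of type table1
SELECT * FROM unnest(arrayof_records());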

Filtering the required JSON object out of a JSON array column in PostgreSQL with the input as a single parameter

I have a select query and it shows a result like the one below:
select * from table gives the result like below
I have a parameter called Apple. If I pass the parameter somewhere in the query, I should get the result like below.
How can I get this in PostgreSQL? If anyone knows, please share the answer below.
I would do this with a helper function for clarity. And it might be reusable.
create or replace function filter_jsonb_array(arr jsonb, fruit text)
returns jsonb language sql immutable as
$$
select coalesce
(
(select jsonb_agg(j) from jsonb_array_elements(arr) j where j ->> 'fruit' = fruit),
'[]'::jsonb
);
$$;
and then
select "Column_A", "Column_B", filter_jsonb_array("Column_JSONARRAY", 'Apple') from table_;
If you do not want a function then the function body can be placed directly into the select query.
select
"Column_A",
"Column_B",
coalesce
(
(select jsonb_agg(j) from jsonb_array_elements("Column_JSONARRAY") j where j ->> 'fruit' = 'Apple'),
'[]'::jsonb
) "Column_JSONARRAY"
from table_;
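As a quick sanity check, the helper can be tried on a literal jsonb array (the sample values here are made up):
SELECT filter_jsonb_array(
'[{"fruit":"Apple","qty":1},{"fruit":"Mango","qty":2}]'::jsonb,
'Apple');
-- returns [{"fruit": "Apple", "qty": 1}]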
Assuming the datatype of column Column_JSONARRAY is JSONB, try this:
with cte as (
SELECT column_a, column_b, (column_jsonarray ->> (index_ - 1)::int)::jsonb AS "column_jsonarray"
FROM table_
CROSS JOIN jsonb_array_elements(column_jsonarray) WITH ORDINALITY arr(array_, index_)
WHERE array_ ->> 'fruit' in ('Apple')
)
select t1.column_a, t1.column_b, jsonb_agg(t2.column_jsonarray)
from table_ t1
left join cte t2 on t1.column_a = t2.column_a and t1.column_b = t2.column_b
group by t1.column_a, t1.column_b;

Make multiple JSON array rows into one single row by grouping on some other column in PostgreSQL

I have a view from a query select * from table which returns the data below.
I want to group by the name column (rows which have the same name) and merge the JSON array column as mentioned below.
One way to do this, is to unnest the arrays and then aggregate them back:
select t.id, t.name, jsonb_agg(a.e)
from the_table t
cross join lateral jsonb_array_elements(t.json_array) as a(e)
group by t.id, t.name;
If you do that a lot, a custom aggregate makes this a bit easier to use (but probably not faster):
create function jsonb_array_combine(p_one jsonb, p_two jsonb)
returns jsonb
as
$$
select jsonb_agg(e)
from (
select e
from jsonb_array_elements(p_one) as o(e)
union all
select e
from jsonb_array_elements(p_two) as t(e)
) t
$$
language sql
immutable;
create aggregate jsonb_array_agg(jsonb)
(
SFUNC = jsonb_array_combine,
STYPE = jsonb
);
Then you can use it like this:
select t.id, t.name, jsonb_array_agg(t.json_array)
from the_table t
group by t.id, t.name;
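A small self-contained illustration (the sample values are assumed, not taken from the question):
CREATE TABLE the_table (id int, name text, json_array jsonb);
INSERT INTO the_table VALUES
(1, 'fruits', '[{"fruit": "Apple"}]'),
(1, 'fruits', '[{"fruit": "Mango"}]');
SELECT t.id, t.name, jsonb_agg(a.e)
FROM the_table t
CROSS JOIN LATERAL jsonb_array_elements(t.json_array) AS a(e)
GROUP BY t.id, t.name;
-- 1 | fruits | [{"fruit": "Apple"}, {"fruit": "Mango"}]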

Saving a query in a table

I would like to store a table variable so it can be accessed by another query within a function. Here is what I have so far.
CREATE OR REPLACE FUNCTION suggest(p_id INTEGER)
RETURNS TABLE (
r_id INT,
i_id INT
) AS $$
DECLARE
p_record INT[];
BEGIN
SELECT ingredient_id INTO p_record FROM shop_ingredients
WHERE item_id IN
(SELECT item_id FROM basket
WHERE user_id = p_id);
...
I am not sure whether the type of p_record should be an array of INTEGER or a RECORD. Within the function I would like to access that list of values, for example:
HAVING SUM(ingredient_id = ANY(p_record)) >= (COUNT(*)*0.6)
How can I achieve this? I have searched endlessly to understand how to manage this but to no avail.
The problem with creating a table, like so:
CREATE TEMP TABLE IF NOT EXISTS p_record
AS SELECT ingredient_id
FROM shop_ingredients
WHERE item_id IN
(SELECT item_id FROM basket
WHERE user_id = p_id);
is that if the p_id parameter changes, the contents of p_record do not.
Maybe the following SQL function can help:
create or replace function suggest(p_id int)
returns table (i_id int)
language sql
as
$$
select ingredient_id
from shop_ingredients
where item_id in
(select item_id from basket
where user_id = p_id);
$$;
Note that the SELECT statement's column list must exactly match the TABLE clause after the RETURNS keyword. Also, since this is a plain SQL function, there is no need for a BEGIN/END block or an intermediate record variable.
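A usage sketch, assuming the function above (the parameter value 42 is just an example):
-- call it like a table
SELECT * FROM suggest(42);
-- or use the returned set inside another query, e.g. as a membership test
SELECT count(*) FROM shop_ingredients
WHERE ingredient_id IN (SELECT i_id FROM suggest(42));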

Merging/Concatenating JSON(B) columns in query

Using Postgres 9.4, I am looking for a way to merge two (or more) json or jsonb columns in a query. Consider the following table as an example:
id | json1 | json2
----------------------------------------
1 | {'a':'b'} | {'c':'d'}
2 | {'a1':'b2'} | {'f':{'g' : 'h'}}
Is it possible to have the query return the following:
id | json
----------------------------------------
1 | {'a':'b', 'c':'d'}
2 | {'a1':'b2', 'f':{'g' : 'h'}}
Unfortunately, I can't define a function as described here. Is this possible with a "traditional" query?
In Postgres 9.5+ you can merge JSONB like this:
select json1 || json2;
Or, if it's JSON, coerce to JSONB if necessary:
select json1::jsonb || json2::jsonb;
Or:
select COALESCE(json1::jsonb||json2::jsonb, json1::jsonb, json2::jsonb);
(Otherwise, a NULL value in json1 or json2 makes the whole result NULL.)
For example:
select data || '{"foo":"bar"}'::jsonb from photos limit 1;
?column?
----------------------------------------------------------------------
{"foo": "bar", "preview_url": "https://unsplash.it/500/720/123"}
Kudos to @MattZukowski for pointing this out in a comment.
Here is the complete list of built-in functions that can be used to create json objects in PostgreSQL: http://www.postgresql.org/docs/9.4/static/functions-json.html
row_to_json and json_object do not allow you to define your own keys, so they can't be used here.
json_build_object expects you to know in advance how many keys and values your object will have; that's the case in your example, but it should not be the case in the real world.
json_object looks like a good tool to tackle this problem, but it forces us to cast our values to text, so we can't use this one either.
Well... ok, so we can't use any of the classic functions.
Let's take a look at some aggregate functions and hope for the best... http://www.postgresql.org/docs/9.4/static/functions-aggregate.html
json_object_agg is the only aggregate function that builds objects, so it's our only chance to tackle this problem. The trick here is to find the correct way to feed the json_object_agg function.
Here is my test table and data
CREATE TABLE test (
id SERIAL PRIMARY KEY,
json1 JSONB,
json2 JSONB
);
INSERT INTO test (json1, json2) VALUES
('{"a":"b", "c":"d"}', '{"e":"f"}'),
('{"a1":"b2"}', '{"f":{"g" : "h"}}');
And after some trial and error with json_object_agg, here is a query you can use to merge json1 and json2 in PostgreSQL 9.4:
WITH all_json_key_value AS (
SELECT id, t1.key, t1.value FROM test, jsonb_each(json1) as t1
UNION
SELECT id, t1.key, t1.value FROM test, jsonb_each(json2) as t1
)
SELECT id, json_object_agg(key, value)
FROM all_json_key_value
GROUP BY id
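With the sample rows above, the merged output should look roughly like this (exact formatting and key order may vary):
id | json_object_agg
----+------------------------------------------
1 | { "a" : "b", "c" : "d", "e" : "f" }
2 | { "a1" : "b2", "f" : {"g": "h"} }
(2 rows)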
For PostgreSQL 9.5+, look at Zubin's answer.
This function would merge nested json objects
create or replace function jsonb_merge(CurrentData jsonb,newData jsonb)
returns jsonb
language sql
immutable
as $jsonb_merge_func$
select case jsonb_typeof(CurrentData)
when 'object' then case jsonb_typeof(newData)
when 'object' then (
select jsonb_object_agg(k, case
when e2.v is null then e1.v
when e1.v is null then e2.v
when e1.v = e2.v then e1.v
else jsonb_merge(e1.v, e2.v)
end)
from jsonb_each(CurrentData) e1(k, v)
full join jsonb_each(newData) e2(k, v) using (k)
)
else newData
end
when 'array' then CurrentData || newData
else newData
end
$jsonb_merge_func$;
Looks like nobody proposed this kind of solution yet, so here's my take, using custom aggregate functions:
create or replace function jsonb_concat(a jsonb, b jsonb) returns jsonb
as 'select $1 || $2'
language sql
immutable
parallel safe
;
create or replace aggregate jsonb_merge_agg(jsonb)
(
sfunc = jsonb_concat,
stype = jsonb,
initcond = '{}'
);
Note: this uses ||, which replaces existing values at the same path instead of deeply merging them.
Now jsonb_merge_agg is accessible like so:
select jsonb_merge_agg(some_col) from some_table group by something;
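Since || replaces whole top-level values rather than merging nested objects, this is the behaviour you get:
SELECT '{"a": {"x": 1}}'::jsonb || '{"a": {"y": 2}}'::jsonb;
-- returns {"a": {"y": 2}}, not {"a": {"x": 1, "y": 2}}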
You can also transform the json into text, concatenate, replace and convert it back to json. Using the same data from Clément you can do:
SELECT replace(
(json1::text || json2::text),
'}{',
', ')::json
FROM test
You could also concatenate all json1 values into a single json with:
SELECT regexp_replace(
array_agg((json1))::text,
'}"(,)"{|\\| |^{"|"}$',
'\1',
'g'
)::json
FROM test
This is a very old solution; since 9.4 you can use json_object_agg, and since 9.5 the simple || concatenation operator. Keeping it here just for reference.
This question was already answered some time ago; however, the fact that when json1 and json2 contain the same key the key appears twice in the document does not seem to be best practice.
Therefore you can use this jsonb_merge function with PostgreSQL 9.5:
CREATE OR REPLACE FUNCTION jsonb_merge(jsonb1 JSONB, jsonb2 JSONB)
RETURNS JSONB AS $$
DECLARE
result JSONB;
v RECORD;
BEGIN
result = (
SELECT json_object_agg(KEY,value)
FROM
(SELECT jsonb_object_keys(jsonb1) AS KEY,
1::int AS jsb,
jsonb1 -> jsonb_object_keys(jsonb1) AS value
UNION SELECT jsonb_object_keys(jsonb2) AS KEY,
2::int AS jsb,
jsonb2 -> jsonb_object_keys(jsonb2) AS value ) AS t1
);
RETURN result;
END;
$$ LANGUAGE plpgsql;
The following query returns the concatenated jsonb columns, where the keys in json2 are dominant over the keys in json1:
select id, jsonb_merge(json1, json2) from test
FYI, if someone's using jsonb in >= 9.5 and they only care about top-level elements being merged without duplicate keys, then it's as easy as using the || operator:
select '{"a1": "b2"}'::jsonb || '{"f":{"g" : "h"}}'::jsonb;
?column?
-----------------------------
{"a1": "b2", "f": {"g": "h"}}
(1 row)
Try this if anyone is having an issue merging two JSON objects:
select table.attributes::jsonb || json_build_object('foo',1,'bar',2)::jsonb FROM table where table.x='y';
CREATE OR REPLACE FUNCTION jsonb_merge(pCurrentData jsonb, pMergeData jsonb, pExcludeKeys text[])
RETURNS jsonb IMMUTABLE LANGUAGE sql
AS $$
SELECT json_object_agg(key,value)::jsonb
FROM (
WITH to_merge AS (
SELECT * FROM jsonb_each(pMergeData)
)
SELECT *
FROM jsonb_each(pCurrentData)
WHERE key NOT IN (SELECT key FROM to_merge)
AND ( pExcludeKeys ISNULL OR key <> ALL(pExcludeKeys))
UNION ALL
SELECT * FROM to_merge
) t;
$$;
SELECT jsonb_merge('{"a": 1, "b": 9, "c": 3, "e":5}'::jsonb, '{"b": 2, "d": 4}'::jsonb, '{"c","e"}'::text[]) as jsonb
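For the example call above, the expected result is along these lines: "b" is taken from the second argument, "c" and "e" are dropped via pExcludeKeys, and "a" and "d" pass through unchanged.
jsonb
--------------------------
{"a": 1, "b": 2, "d": 4}
(1 row)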
This works well as an alternative to || when a recursive deep merge is required (found here):
create or replace function jsonb_merge_recurse(orig jsonb, delta jsonb)
returns jsonb language sql as $$
select
jsonb_object_agg(
coalesce(keyOrig, keyDelta),
case
when valOrig isnull then valDelta
when valDelta isnull then valOrig
when (jsonb_typeof(valOrig) <> 'object' or jsonb_typeof(valDelta) <> 'object') then valDelta
else jsonb_merge_recurse(valOrig, valDelta)
end
)
from jsonb_each(orig) e1(keyOrig, valOrig)
full join jsonb_each(delta) e2(keyDelta, valDelta) on keyOrig = keyDelta
$$;
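A usage sketch of the deep merge, assuming the function above has been created (the sample values are made up):
SELECT jsonb_merge_recurse(
'{"a": {"x": 1, "y": 2}}'::jsonb,
'{"a": {"y": 3}, "b": 4}'::jsonb);
-- {"a": {"x": 1, "y": 3}, "b": 4}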