Convert column which contains array into new lines - postgresql

I have a table with the following data:
val
---
["a", "b", "c"]
["d", "e", "f"]
Is there a way to select it and output it in the following format?
val
----
a
b
c
d
e
f
Thanks!

That looks like a JSON array, so you can use:
select e.val
from the_table t
cross join jsonb_array_elements(t.val) as e(val)
This assumes that val is defined as jsonb (which it should be). If it's "only" json, use json_array_elements() instead. If it's not even json, but just text, cast it first: val::jsonb
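For reference, a minimal end-to-end sketch, with the table and column names taken from the question and the cast included in case val is stored as plain text. It uses jsonb_array_elements_text() so the elements come back as unquoted text, matching the desired output:
create table the_table (val text);
insert into the_table values ('["a", "b", "c"]'), ('["d", "e", "f"]');

select e.val
from the_table t
cross join jsonb_array_elements_text(t.val::jsonb) as e(val);
-- val
-- ---
-- a
-- b
-- c
-- d
-- e
-- f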

You can use the unnest function to achieve this. Documentation can be found in array functions: https://www.postgresql.org/docs/13/functions-array.html
SELECT unnest(val) should do the job
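Note that unnest() applies when val is a native Postgres array rather than a JSON string; a small sketch under that assumption (table name invented for illustration):
create table sample_table (val text[]);
insert into sample_table values (array['a','b','c']), (array['d','e','f']);

select unnest(val) as val
from sample_table;
-- a, b, c, d, e, f -- one element per row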

Related

Combine JSONB array of values by consecutive pairs

In PostgreSQL, I have a simple data store with a single JSONB column:
data
----------------------------
{"foo": [1,2,3,4]}
{"foo": [10,20,30,40,50,60]}
...
I need to convert consecutive pairs of values into data points, essentially calling the array variant of ST_MakeLine like this: ST_MakeLine(ARRAY[ST_MakePoint(10,20), ST_MakePoint(30,40), ST_MakePoint(50,60)]) for each row of the source data.
Needed result (note that the x,y order of each point might need to be reversed):
data geometry (after decoding)
---------------------------- --------------------------
{"foo": [1,2,3,4]} LINE (1 2, 3 4)
{"foo": [10,20,30,40,50,60]} LINE (10 20, 30 40, 50 60)
...
Partial solution
I can already iterate over individual array values, but it is the pairing that is giving me trouble. Also, I am not certain if I need to introduce any ordering into the query to preserve the original ordering of the array elements.
SELECT ARRAY(
    SELECT elem::int
    FROM jsonb_array_elements(data -> 'foo') elem
) arr
FROM mytable;
You can achieve this by using window functions lead or lag, then picking only every second row:
SELECT (
    SELECT array_agg((a, b) ORDER BY o)
    FROM (
        SELECT elem::int AS a, lead(elem::int) OVER (ORDER BY o) AS b, o
        FROM jsonb_array_elements(data -> 'foo') WITH ORDINALITY els(elem, o)
    ) AS pairs
    WHERE o % 2 = 1
) AS arr
FROM example;
And yes, I would recommend specifying the ordering explicitly, making use of WITH ORDINALITY.
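To take this all the way to the geometry, here is a sketch of how the pairs could be fed into ST_MakePoint/ST_MakeLine. It assumes PostGIS is installed, keeps the table name example from the query above, and treats each pair as x,y (swap a and b in ST_MakePoint if the order needs to be reversed):
SELECT data,
       ST_MakeLine(ARRAY(
           SELECT ST_MakePoint(a, b)   -- one point per consecutive pair
           FROM (
               SELECT elem::int AS a, lead(elem::int) OVER (ORDER BY o) AS b, o
               FROM jsonb_array_elements(data -> 'foo') WITH ORDINALITY els(elem, o)
           ) AS pairs
           WHERE o % 2 = 1
           ORDER BY o                  -- preserve the original element order in the line
       )) AS geom
FROM example;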

Flask SQLAlchemy Filter On A Postgres JSON List Object Based on a Single String

I have a model called Testing. The field called alias is a JSON field (a list really) and has values such as ["a", "b"] or ["d", "e"] and so on.
class Testing(db.Model):
    __tablename__ = 'testing'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(25))
    alias = db.Column(JSON)

    def __init__(self, name, alias):
        self.name = name
        self.alias = alias
In my Flask view I grab a URL parameter that I want to use to filter Testing, so that I get all Testing objects in which the parameter value is in the alias JSON list. For example, if url_param_value="a", I want all the Testing objects where "a" is in alias, so the alias value ["a", "b"] would be a hit.
Here is my approach, but it's not working, and I assume it has to do with serialization.
Testing.query.filter(Testing.alias.contains(url_param_val)).all()
I am getting the below error
NotImplementedError: Operator 'contains' is not supported on this expression
The alias field is a JSON type, not an array type. JSON columns don't have a contains method, even if you happen to be storing array data (how would the database know?).
In Postgres, you can use json_array_elements to expand a JSON array to a set of JSON values; this will return one row per element:
select id, json_array_elements(alias) as val from testing;
 id | val
----+-----
  1 | "a"
  1 | "b"
  2 | "d"
  2 | "e"
You can use that as a subquery to select records that contain a matching value:
select t.id, t.name, t.alias, cast(q.val as varchar)
from testing t, (
    select id, json_array_elements(alias) as val
    from testing
) q
where q.id = t.id and cast(q.val as varchar) = '"a"';
In SQLAlchemy syntax:
subq = session.query(
    Testing.id,
    func.json_array_elements(Testing.alias).label("val")
).subquery()

q = session.query(Testing).filter(
    cast(subq.c.val, sa.Unicode) == '"a"',
    subq.c.id == Testing.id)
Warning: this is going to be very inefficient for large tables; you're probably better off fixing the types to match your data, and then creating appropriate indexes.
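For completeness, here is a sketch of what "fixing the types" could mean at the database level; the index name is made up. With the column as jsonb, the containment operator @> does the filtering, and a GIN index can serve it (on the SQLAlchemy side the column would then map to the PostgreSQL JSONB type):
-- convert the column and add an index that supports containment checks
alter table testing alter column alias type jsonb using alias::jsonb;
create index ix_testing_alias on testing using gin (alias);

-- all rows whose alias array contains "a"
select * from testing where alias @> '["a"]';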

select all rows where array elements exist

I have a table:
CREATE TABLE test_array
(
    id  integer,
    arr TEXT[]
);
I insert sample data:
INSERT INTO test_array(id, arr) VALUES (1, '{"a", "b", "c"}');
INSERT INTO test_array(id, arr) VALUES (2, '{"d", "f", "c"}');
INSERT INTO test_array(id, arr) VALUES (3, '{"a", "z", "i"}');
I want to get rows where any of the elements {"a", "c"} exists,
so the result must be:
'{"a", "b", "c"}'
'{"d", "f", "c"}'
'{"a", "z", "i"}'
I wrote this query:
select * from test_array where arr @> '{"a"}' or arr @> '{"c"}';
but I want to write the query without or, as a single condition. Is that possible?
When I run select * from test_array where arr @> '{"a", "c"}';
it returns only one row.
https://rextester.com/ATMU4521
The @> operator means "contains", so all elements of the array on the right-hand side must exist in the array on the left-hand side. You are looking for the overlap operator &&, which is described as "have elements in common":
select *
from test_array
where arr && array['a', 'c'];
I prefer the array[] notation for specifying an array constant, as I don't need to think about nested quotes.
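As a quick way to see the operator in isolation, independent of the table, you can apply it to literal arrays:
select array['a','b','c'] && array['a','c'] as has_common,   -- true: 'a' and 'c' are shared
       array['x','y','z'] && array['a','c'] as no_common;    -- false: nothing in common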

Build jsonb array from jsonb field

I have a column options of type jsonb, in the format {"names": ["name1", "name2"]}, which was created with
UPDATE table1 t1 SET options = (SELECT jsonb_build_object('names', names) FROM table2 t2 WHERE t2.id= t1.id)
and where names is a jsonb array.
SELECT jsonb_typeof(names) FROM table2 gives array.
Now I want to extract the value of names as a jsonb array. But the query
SELECT jsonb_build_array(options->>'names') FROM table
gives me ["[\"name1\", \"name2\"]"], while I expect ["name1", "name2"].
How can I get the value in the right format?
The ->> operator returns the value of the field (in your case, a JSON array) as properly escaped text. What you are looking for is the -> operator instead.
However, note that using jsonb_build_array on that will return an array containing your original array, which is probably not what you want either; simply using options->'names' should get you what you want.
Actually, you don't need the jsonb_build_array() function at all.
Use select options -> 'names' from table; this will fix your issue.
jsonb_build_array() is for building a new array out of jsonb values, so you were going about it the wrong way; that's why you were getting a string like ["[\"name1\", \"name2\"]"].
Try executing this sample SQL script:
select j->'names'
from (
    select '{"names": ["name1", "name2"]}'::JSONB as j
) as a;
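To see the two operators side by side (the inline jsonb literal here simply stands in for your options column):
select j -> 'names'  as as_jsonb,   -- jsonb array: ["name1", "name2"]
       j ->> 'names' as as_text     -- escaped text: the same array as a plain string
from (
    select '{"names": ["name1", "name2"]}'::jsonb as j
) as a;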

postgres: concat string with array

Assume we have a text column with the following structure: ["A", "B", "C"].
How can it be concatenated with the array ARRAY['A','C','D','E'] to produce the string ["A", "B", "C", "D", "E"] (a string without repeated elements)?
The Postgres version is 9.4.8.
The column data can be ["A", "B", "C"] or null. How can it be concatenated with ARRAY['A','C','D','E'] (it can actually remain a string, but I need to add elements to the existing string without repeating them)? The resulting string must have the pattern ["A", "B", "C", "D", "E"].
Solved with a script that updates the db via PDO.
SELECT array_agg(x)
FROM (
    SELECT * FROM unnest(ARRAY['A', 'B', 'C'])
    UNION
    SELECT * FROM unnest(ARRAY['A','C','D','E'])
) a(x);
┌─────────────┐
│ array_agg │
├─────────────┤
│ {D,B,E,C,A} │
└─────────────┘
(1 row)
Transform the string to another array, unnest both and take the ordered UNION of both to form a new array:
SELECT ARRAY(
    SELECT * FROM unnest(ARRAY['A','C','D','E'])
    UNION
    SELECT * FROM unnest(string_to_array(translate('["A", "B", "C"]', '[]"', ''), ', '))
    ORDER BY 1
);
I removed the characters []" to proceed with the simple case. You would need to explain why you have them / need them ...
An ARRAY constructor is faster for the simple case.
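If the merged value really has to go back into the same bracketed string format, one option is to wrap the result in array_to_json; a sketch building on the query above (note that the JSON text form has no space after the commas, unlike the original string):
SELECT array_to_json(ARRAY(
    SELECT * FROM unnest(ARRAY['A','C','D','E'])
    UNION
    SELECT * FROM unnest(string_to_array(translate('["A", "B", "C"]', '[]"', ''), ', '))
    ORDER BY 1
))::text AS merged;
-- merged: ["A","B","C","D","E"]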