Postgres Version: 9.5.0
I have a database table where one of the columns is stored as text representing a JSON value. The JSON value is an array of objects, e.g.:
[{"picture": "XXX", "image_hash": null, "name": "test", "video": null, "link": "http://www.google.com", "table_id": 356}, ..]
I am trying to update the value associated with the key table_id, for the first array element only. Here is the query I ran:
update table1 set "json_column" = jsonb_set("json_column", "{0, table_id}", null, false) where id = 1;
I keep running into the error - ERROR: column "{0, table_id}" does not exist
Could someone please help me understand how this can be fixed?
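A minimal sketch of the fix (assuming the text column described above): in SQL, double quotes delimit identifiers, so "{0, table_id}" is parsed as a column name; the path argument must be a single-quoted text-array literal. Also, passing SQL NULL as the new value makes jsonb_set return NULL, so use the JSON literal 'null'::jsonb, and cast the text column to jsonb and back:

update table1
set "json_column" = jsonb_set("json_column"::jsonb, '{0,table_id}', 'null'::jsonb, false)::text
where id = 1;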
Consider the following:
create table query(id integer, query_definition jsonb);
create table query_item(path text[], id integer);
insert into query (id, query_definition)
values
(100, '{"columns":[{"type":"integer","field":"id"},{"type":"str","field":"firstname"},{"type":"str","field":"lastname"}]}'::jsonb),
(101, '{"columns":[{"type":"integer","field":"id"},{"type":"str","field":"firstname"}]}'::jsonb);
insert into query_item(path, id) values
('{columns,0,type}'::text[], 100),
('{columns,1,type}'::text[], 100),
('{columns,2,type}'::text[], 100),
('{columns,0,type}'::text[], 101),
('{columns,1,type}'::text[], 101);
I have a query table which has a jsonb column named query_definition.
The jsonb value looks like the following:
{
"columns": [
{
"type": "integer",
"field": "id"
},
{
"type": "str",
"field": "firstname"
},
{
"type": "str",
"field": "lastname"
}
]
}
In order to replace all "type": "..." with "type": "string", I've built the query_item table which contains the following data:
path |id |
----------------+---+
{columns,0,type}|100|
{columns,1,type}|100|
{columns,2,type}|100|
{columns,0,type}|101|
{columns,1,type}|101|
path holds the path from the JSON root to each "type" entry, and id is the corresponding query's id.
I came up with the following SQL statement to do what I want:
update query q
set query_definition = jsonb_set(q.query_definition, query_item.path, ('"string"')::jsonb, false)
from query_item
where q.id = query_item.id
But it only partially works: it applies just the first matching row for each id and skips the others (only the 1st and 4th lines of the query_item table take effect).
I know I could write a FOR loop, but that requires a plpgsql context, which I'd rather avoid.
Is there a way to do it with a single update statement?
I've read in this topic that it's possible to do this with strings, but I couldn't figure out how to adapt that mechanism to jsonb.
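A sketch of one single-statement approach, assuming every element under columns should end up with "type": "string" (this bypasses query_item entirely, so it is a different mechanism than the per-path update above): rebuild the columns array with jsonb_agg and write it back with one jsonb_set per row:

update query q
set query_definition = jsonb_set(
    q.query_definition,
    '{columns}',
    (
        -- rewrite every element with "type" forced to "string",
        -- keeping the original element order
        -- (assumes columns is a non-empty array in every row)
        select jsonb_agg(jsonb_set(col, '{type}', '"string"', false) order by ord)
        from jsonb_array_elements(q.query_definition -> 'columns')
             with ordinality as t(col, ord)
    ),
    false
);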
I have a column products in a table test which is of type text, in the format below:
[{"is_bulk_product": false, "rate": 0, "subtotal": 7.17, "qty": 2, "tax": 0.90}]
It is an array with nested dictionary values. When I tried to alter the column using this:
alter table test alter COLUMN products type jsonb using products::jsonb;
I get below error:
ERROR: 22P02: invalid input syntax for type json
DETAIL: Character with value 0x09 must be escaped.
CONTEXT: JSON data, line 1: ...some_id": 2613, "qty": 2, "upc": "1234...
LOCATION: json_lex_string, json.c:789
Time: 57000.237 ms (00:57.000)
How can we make sure the JSON is valid before altering the column?
The JSON string you posted is correct, so this SQL code executes without exception:
select '[{"is_bulk_product": false, "rate": 0, "subtotal": 7.17, "qty": 2, "tax": 0.90}]'::jsonb
Maybe the table has incorrectly formatted JSON in other records. You can first check this by selecting the data, for example:
select products::jsonb from test;
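That select aborts at the first malformed row, though. To list every offending row up front, a sketch using a small helper function (is_valid_json is defined here, not a built-in):

create or replace function is_valid_json(p text) returns boolean
language plpgsql as $$
begin
    perform p::jsonb;         -- attempt the cast
    return true;
exception when others then
    return false;             -- the cast failed: malformed JSON
end $$;

select ctid, products from test where not is_valid_json(products);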
And there is a syntax issue in your SQL code: you can cast the products field to JSONB, but not test, since test is your table name:
alter table test
alter COLUMN products type jsonb using products::jsonb;
I am trying to upsert into a table with a jsonb field, matching on multiple JSON properties within that field, using the query below:
insert into testtable(data) values('{
"key": "Key",
"id": "350B79AD",
"value": "Custom"
}')
On conflict(data ->>'key',data ->>'id')
do update set data =data || '{"value":"Custom"}'
WHERE data ->> 'key' ='Key' and data ->> 'appid'='350B79AD'
The above query throws the error below:
ERROR: syntax error at or near "->>"
LINE 8: On conflict(data ->>'key',data ->>'id')
Am I missing something obvious here?
I suppose you want the combination of id and key to be unique in the table. Then you need a unique index on them:
create unique index on testtable ( (data->>'key'), (data->>'id') );
and you also need to wrap each expression in the ON CONFLICT clause in its own set of parentheses:
on conflict( (data->>'key'), (data->>'id') )
and you need to qualify the jsonb column name (data) with the table name (testtable), as testtable.data, wherever it appears after DO UPDATE SET or in the WHERE clause. So, convert your statement to:
insert into testtable(data) values('{
"key": "Key",
"id": "350B79AD",
"value": "Custom1"
}')
on conflict( (data->>'key'), (data->>'id') )
do update set data = testtable.data || '{"value":"Custom2"}'
where testtable.data ->> 'key' ='Key' and testtable.data ->> 'id'='350B79AD';
By the way, data ->> 'appid' = '350B79AD' was changed to data ->> 'id' = '350B79AD' (appid -> id) to match the key actually present in the inserted value.
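If you run the statement twice, you would expect the following (a sketch of the behavior, not output from the original post):

-- 1st execution: no conflict, the row is inserted with "value": "Custom1"
-- 2nd execution: conflict on ((data->>'key'), (data->>'id')),
--                so the existing row is updated with "value": "Custom2"
select data from testtable where data ->> 'id' = '350B79AD';
-- {"id": "350B79AD", "key": "Key", "value": "Custom2"}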
I have the following table:
CREATE TABLE trip
(
id SERIAL PRIMARY KEY ,
gps_data_json jsonb NOT NULL
);
The JSON in gps_data_json contains an array of trip objects with the following fields (sample data below):
mode
timestamp
latitude
longitude
I'm trying to get all rows that contain a certain "mode".
SELECT * FROM trip
where gps_data_json ->> 'mode' = 'WALK';
I'm pretty sure I'm using the ->> operator wrong, but I'm unsure how to tell the query that the JSONB field is an array of objects.
Sample data:
INSERT INTO trip (gps_data_json) VALUES
('[
{
"latitude": 47.063480377197266,
"timestamp": 1503056880725,
"mode": "TRAIN",
"longitude": 15.450349807739258
},
{
"latitude": 47.06362533569336,
"timestamp": 1503056882725,
"mode": "WALK",
"longitude": 15.450264930725098
}
]');
INSERT INTO trip (gps_data_json) VALUES
('[
{
"latitude": 47.063480377197266,
"timestamp": 1503056880725,
"mode": "BUS",
"longitude": 15.450349807739258
},
{
"latitude": 47.06362533569336,
"timestamp": 1503056882725,
"mode": "WALK",
"longitude": 15.450264930725098
}
]');
The problem arises because the ->> operator cannot walk through an array:
First unnest your JSON array using the jsonb_array_elements function;
Then use the operator for filtering.
The following query does the trick:
WITH
A AS (
SELECT
Id
,jsonb_array_elements(gps_data_json) AS point
FROM trip
)
SELECT *
FROM A
WHERE (point->>'mode') = 'WALK';
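As a side note (an alternative sketch, not part of the original answer): if you want each matching trip row back once, rather than one row per matching element, an EXISTS filter performs the same check without changing the result shape:

SELECT t.*
FROM trip t
WHERE EXISTS (
    SELECT 1
    FROM jsonb_array_elements(t.gps_data_json) AS point
    WHERE point ->> 'mode' = 'WALK'
);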
Unnesting the array works fine if you only want the objects that contain the queried values.
The following checks for containment and returns the full JSONB:
SELECT * FROM trip
WHERE gps_data_json @> '[{"mode": "WALK"}]';
See also Postgresql query array of objects in JSONB field
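A performance note (an addition, assuming the default jsonb_ops operator class; the index name is illustrative): containment queries with @> can use a GIN index on the column:

CREATE INDEX trip_gps_data_json_idx ON trip USING gin (gps_data_json);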
select * from
(select id, jsonb_array_elements(gps_data_json) point from trip where id = 16) t
where point @> '{"mode": "WALK"}';
In my table, the id = 16 filter makes sure that the selected row holds a jsonb array only, since the other rows hold plain JSONB objects. You must restrict the query to the jsonb-array rows first; otherwise: ERROR: cannot extract elements from an object
rows = [{"id": 1, "json_value": [{"key": "value"}, {"key2": "value2"}]}, {"id": 2, "json_value": None}]
insert_query = table.insert().values(rows)
connection.execute(insert_query)
Doing this enters "null" (the serialized JSON value, as a string) into the row where id=2, rather than SQL NULL.
Is there any way to properly do a multi-row insert where the value of some JSON columns is NULL?
The issue was a bug and has been fixed by the SQLAlchemy project maintainer.
Details here: https://groups.google.com/forum/#!topic/sqlalchemy/Bu4lJ18Gsa8
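For versions that predate the fix, a workaround sketch (reusing the table and connection objects from the question): pass sqlalchemy.null() explicitly so SQL NULL is emitted instead of a serialized JSON null; declaring the column as JSON(none_as_null=True) achieves the same for plain None:

from sqlalchemy import null

rows = [
    {"id": 1, "json_value": [{"key": "value"}, {"key2": "value2"}]},
    {"id": 2, "json_value": null()},  # null() forces SQL NULL, not the JSON value null
]
connection.execute(table.insert().values(rows))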