Postgres INSERT ON CONFLICT with JSONB - postgresql

I'm trying to use Postgres as a document store and am running into a problem when trying to upsert a document: the Postgres parser doesn't seem to like the JSONB operator.
I have a table:
CREATE TABLE tbl (data jsonb NOT NULL);
CREATE UNIQUE INDEX ON tbl ((data->>'a'));
and I try to insert data with:
INSERT INTO tbl (data) VALUES ('{ "a": "b" }'::jsonb)
ON CONFLICT (data->>a)
DO UPDATE SET data = data || '{ "a": "b" }'::jsonb
I get this error message:
ERROR: syntax error at or near "->>"
I've tried data->>a, data->>'a', data->a, and maybe data->'a'. All of those are rejected with the same kind of syntax error.
I'd like to leave the identifier column (a in the example) within the JSON and not make it a column on the table.
Is what I'm trying to do currently supported?

There are two issues here:
1) You need to add an extra set of parentheses, like so:
ON CONFLICT ((data->>'a'))
2) You need to preface the last data reference with your table alias, like so:
DO UPDATE SET data = tbl.data || '{ "a": "b" }'::jsonb
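Putting both fixes together, the full statement from the question would then look like this:
INSERT INTO tbl (data) VALUES ('{ "a": "b" }'::jsonb)
ON CONFLICT ((data->>'a'))
DO UPDATE SET data = tbl.data || '{ "a": "b" }'::jsonb;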

Read through the PostgreSQL documentation on this subject. You might want to use the json_populate_record function to fill the table if you are trying to build a JSON parser. Also see the related question:
How to perform update operations on columns of type JSONB in Postgres 9.4

Related

Postgres : Unable to extract data from a bytea column which stores json array data

I'm trying to extract data from a bytea column which stores JSON data in Postgres 11.9 version.
However, my code is throwing an error:
ERROR: invalid input syntax for type json
DETAIL: Token "" is invalid.
CONTEXT: JSON data, line 1: ...
Here is the sample data:
create table EMPLOYEE (PAYMENT bytea,NAME character varying);
insert into EMPLOYEE
values ('[{"totalCode":{"code":"EMPLOYER_TAXES"},"totalValue":{"amount":122.5,"currencyCode":"USD"}},{"totalCode":{"code":"OTHER_PAYMENTS"},"totalValue":{"amount":0.0,"currencyCode":"USD"}},{"totalCode":{"code":"GROSS_PAY"},"totalValue":{"amount":1000.0,"currencyCode":"USD"}},{"totalCode":{"code":"TOTAL_HOURS"},"totalValue":{"amount":40.0}}]'::bytea,'Tom')
;
Here is my query:
SELECT *
FROM EMPLOYEE left outer join lateral
jsonb_array_elements(PAYMENT::text::jsonb) element1 on true ;
Please help me access the data from this array. The data is always in JSON format.
There was a restriction requiring the use of bytea for this column.
You are making your life unnecessarily hard by storing JSON values in a bytea column. Just because this is the recommended way in Oracle doesn't mean it is a good choice for Postgres.
The correct solution is to change that column to jsonb. You will have to have a DBMS-specific layer in your application anyway, as the actual functions and operators you are using are very different.
Having said that, you can get away with this awful choice by using the convert_from() function:
select e.name, element1.*
from employee e
left join lateral jsonb_array_elements(convert_from(PAYMENT, 'UTF-8')::jsonb) element1 on true;
I also think you should change your INSERT statement to do an explicit conversion from text to bytea so that you can be sure the correct encoding is used:
insert into employee (payment, name)
values (convert_to('[{...}]', 'UTF-8'),'Tom');
But again: the only correct solution is to change that column to jsonb (or at least json).
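For reference, a minimal sketch of that recommended fix, assuming every existing value is UTF-8 encoded JSON:
-- convert the bytea payloads in place and change the column type
ALTER TABLE employee
    ALTER COLUMN payment TYPE jsonb
    USING convert_from(payment, 'UTF-8')::jsonb;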

Postgres: update value of TEXT column (CLOB)

I have a column of type TEXT which is supposed to represent a CLOB value and I'm trying to update its value like this:
UPDATE my_table SET my_column = TEXT 'Text value';
Normally this column is written and read by Hibernate and I noticed that values written with Hibernate are stored as integers (perhaps some internal Postgres reference to the CLOB data).
But when I try to update the column with the above SQL, the value is stored as a string and when Hibernate tries to read it, I get the following error: Bad value for type long : ["Text value"]
I tried all the options described in this answer but the result is always the same. How do I insert/update a TEXT column using SQL?
In order to update a CLOB created by Hibernate, you should use the functions for handling large objects.
The documentation can be found at the following links:
https://www.postgresql.org/docs/current/lo-interfaces.html
https://www.postgresql.org/docs/current/lo-funcs.html
Examples:
To query:
select mytable.*, convert_from(loread(lo_open(mycblobfield::int, x'40000'::int), x'40000'::int), 'UTF8') from mytable where mytable.id = 4;
Note:
x'40000' corresponds to read mode (INV_READ)
To Update:
select lowrite(lo_open(16425, x'60000'::int), convert_to('this an updated text','UTF8'));
Note:
x'60000' = INV_READ + INV_WRITE, corresponding to read and write mode.
The number 16425 is an example loid (large object id) that already exists in a record in your table. It's the integer number you can see as the value in the blob field created by Hibernate.
To Insert:
select lowrite(lo_open(lo_creat(-1), x'60000'::int), convert_to('this is a new text','UTF8'));
Note:
lo_creat(-1) generates a new large object and returns its loid
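To tie the pieces together, here is a minimal sketch that creates the object, writes into it, and stores the new loid in the owning row in one step. It reuses the mytable / mycblobfield / id = 4 names from the query example above and assumes, as in the question, that the column holds the loid as text:
DO $$
DECLARE
    new_loid oid := lo_creat(-1);   -- create an empty large object and keep its loid
BEGIN
    -- x'60000' = INV_READ + INV_WRITE; the descriptor stays valid for this transaction
    PERFORM lowrite(lo_open(new_loid, x'60000'::int),
                    convert_to('this is a new text', 'UTF8'));
    -- store the loid in the row so the application (e.g. Hibernate) can find it again
    UPDATE mytable SET mycblobfield = new_loid::text WHERE id = 4;
END
$$;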

How to update jsonb field in Postgres v9.4+ using Peewee

I learned that in Postgres v9.4 and above, one can do partial updates to jsonb columns.
I am using Peewee on top of my Postgres database. Peewee has support for declaring a jsonb column, but I cannot find an efficient way of updating the jsonb column to harness the power of partial updates.
Here's what I am doing -
SQL = "update table_name set data = data || %s where tid = %s"
vals = (json.dumps(partial_data), tid)
db_pointer.execute_sql(SQL, vals)
data is the column name
db_pointer is the pointer to Postgres database using playhouse.
The above code works fine, but here I am not using Peewee to update it. It's just a normal query to the database.
Does anyone know a better way to do this?
As I commented on GitHub, you can instead:
data = {'some': 'data', 'for json field': [1, 2, 3]}
query = ModelClass.update({
    ModelClass.data: ModelClass.data.concat(data)
}).where(ModelClass.id == some_id)
query.execute()

Postgres HStore - Updating existing columns with values from Hstore

I am trying to move data from an HStore to a column using the following query:
update mytable set "height" = tags->"height" ,
"building:levels" = tags->"building:levels", "type" = tags->"type",
"building:part" = tags->"building:part"
Where ( tags->"building:levels" <>'' or "tags"->"height" <> ''
or "tags"->"type" <> '' or "tags"->"building:part" <> '' );
The idea was to try to speed the query up by testing for non-null values in the HStore. (This is a very large database.)
After two days of the query running, I am sure there must be a better way. This is my first attempt at moving data from HStore into a column.
I can see populate_record in the documentation, but cannot figure out how to use it to just get the hstore tags I need to the correct columns.
Is my original syntax correct, and is there any way I can do it faster using populate_record? If so, what should the query look like?
Many Thanks
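For what it's worth, hstore keys are string literals (single quotes); the double-quoted names in the query above are parsed as column references. A minimal sketch of the same move under that assumption, using a key-presence test instead of empty-string comparisons:
UPDATE mytable
SET "height"          = tags -> 'height',
    "building:levels" = tags -> 'building:levels',
    "type"            = tags -> 'type',
    "building:part"   = tags -> 'building:part'
WHERE tags ?| ARRAY['height', 'building:levels', 'type', 'building:part'];
-- ?| is true when the hstore contains any of the listed keys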

How to perform update operations on columns of type JSONB in Postgres 9.4

Looking through the documentation for the Postgres 9.4 datatype JSONB, it is not immediately obvious to me how to do updates on JSONB columns.
Documentation for JSONB types and functions:
http://www.postgresql.org/docs/9.4/static/functions-json.html
http://www.postgresql.org/docs/9.4/static/datatype-json.html
As an examples, I have this basic table structure:
CREATE TABLE test(id serial, data jsonb);
Inserting is easy, as in:
INSERT INTO test(data) values ('{"name": "my-name", "tags": ["tag1", "tag2"]}');
Now, how would I update the 'data' column? This is invalid syntax:
UPDATE test SET data->'name' = 'my-other-name' WHERE id = 1;
Is this documented somewhere obvious that I missed? Thanks.
If you're able to upgrade to Postgresql 9.5, the jsonb_set command is available, as others have mentioned.
In each of the following SQL statements, I've omitted the where clause for brevity; obviously, you'd want to add that back.
Update name:
UPDATE test SET data = jsonb_set(data, '{name}', '"my-other-name"');
Replace the tags (as opposed to adding or removing tags):
UPDATE test SET data = jsonb_set(data, '{tags}', '["tag3", "tag4"]');
Replacing the second tag (0-indexed):
UPDATE test SET data = jsonb_set(data, '{tags,1}', '"tag5"');
Append a tag (originally this worked only as long as there were fewer than 999 tags, since an index argument of 1000 or above generated an error; this no longer appears to be the case in Postgres 9.5.3, so a much larger index can be used, as here):
UPDATE test SET data = jsonb_set(data, '{tags,999999999}', '"tag6"', true);
Remove the last tag:
UPDATE test SET data = data #- '{tags,-1}'
Complex update (delete the last tag, insert a new tag, and change the name):
UPDATE test SET data = jsonb_set(
jsonb_set(data #- '{tags,-1}', '{tags,999999999}', '"tag3"', true),
'{name}', '"my-other-name"');
It's important to note that in each of these examples, you're not actually updating a single field of the JSON data. Instead, you're creating a temporary, modified version of the data, and assigning that modified version back to the column. In practice, the result should be the same, but keeping this in mind should make complex updates, like the last example, more understandable.
In the complex example, there are three transformations and three temporary versions: First, the last tag is removed. Then, that version is transformed by adding a new tag. Next, the second version is transformed by changing the name field. The value in the data column is replaced with the final version.
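To make that chain of temporary versions concrete, the same three-step transformation can be run against a literal copy of the sample document:
SELECT jsonb_set(
           jsonb_set(
               '{"name": "my-name", "tags": ["tag1", "tag2"]}'::jsonb #- '{tags,-1}',
               '{tags,999999999}', '"tag3"', true),
           '{name}', '"my-other-name"');
-- -> {"name": "my-other-name", "tags": ["tag1", "tag3"]}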
Ideally, you don't use JSON documents for structured, regular data that you want to manipulate inside a relational database. Use a normalized relational design instead.
JSON is primarily intended to store whole documents that do not need to be manipulated inside the RDBMS. Related:
JSONB with indexing vs. hstore
Updating a row in Postgres always writes a new version of the whole row. That's the basic principle of Postgres' MVCC model. From a performance perspective, it hardly matters whether you change a single piece of data inside a JSON object or all of it: a new version of the row has to be written.
Thus the advice in the manual:
JSON data is subject to the same concurrency-control considerations as
any other data type when stored in a table. Although storing large
documents is practicable, keep in mind that any update acquires a
row-level lock on the whole row. Consider limiting JSON documents to a
manageable size in order to decrease lock contention among updating
transactions. Ideally, JSON documents should each represent an atomic
datum that business rules dictate cannot reasonably be further
subdivided into smaller datums that could be modified independently.
The gist of it: to modify anything inside a JSON object, you have to assign a modified object to the column. Postgres supplies limited means to build and manipulate json data in addition to its storage capabilities. The arsenal of tools has grown substantially with every new release since version 9.2. But the principle remains: you always have to assign a complete modified object to the column, and Postgres always writes a new row version for any update.
Some techniques how to work with the tools of Postgres 9.3 or later:
How do I modify fields inside the new PostgreSQL JSON datatype?
This answer has attracted about as many downvotes as all my other answers on SO together. People don't seem to like the idea: a normalized design is superior for regular data. This excellent blog post by Craig Ringer explains in more detail:
"PostgreSQL anti-patterns: Unnecessary json/hstore dynamic columns"
Another blog post by Laurenz Albe, also an official Postgres contributor like Craig and myself:
JSON in PostgreSQL: how to use it right
This is coming in 9.5 in the form of jsonb_set, by Andrew Dunstan, based on an existing extension jsonbx that does work with 9.4.
For those that run into this issue and want a very quick fix (and are stuck on 9.4.5 or earlier), here is a potential solution:
Creation of test table
CREATE TABLE test(id serial, data jsonb);
INSERT INTO test(data) values ('{"name": "my-name", "tags": ["tag1", "tag2"]}');
Update statement to change jsonb value
UPDATE test
SET data = replace(data::TEXT,': "my-name"',': "my-other-name"')::jsonb
WHERE id = 1;
Ultimately, the accepted answer is correct in that you cannot modify an individual piece of a jsonb object (in 9.4.5 or earlier); however, you can cast the jsonb column to a string (::TEXT) and then manipulate the string and cast back to the jsonb form (::jsonb).
There are two important caveats:
This will replace all values equaling "my-name" in the JSON, in case you have multiple objects with the same value (see the sketch below).
This is not as efficient as jsonb_set would be if you are using 9.5.
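To illustrate the first caveat, here is a hypothetical document in which a second attribute happens to hold the same value; the text-level replace() rewrites both occurrences:
SELECT replace('{"name": "my-name", "alias": "my-name"}'::jsonb::TEXT,
               ': "my-name"', ': "my-other-name"')::jsonb;
-- -> {"name": "my-other-name", "alias": "my-other-name"}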
Update the 'name' attribute:
UPDATE test SET data=data||'{"name":"my-other-name"}' WHERE id = 1;
And to remove, for example, the 'name' and 'tags' attributes:
UPDATE test SET data=data-'{"name","tags"}'::text[] WHERE id = 1;
This question was asked in the context of Postgres 9.4; however, new viewers coming to this question should be aware that in Postgres 9.5, sub-document Create/Update/Delete operations on JSONB fields are natively supported by the database, without the need for extension functions.
See: JSONB modifying operators and functions
I wrote a small function for myself that works recursively in Postgres 9.4. I had the same problem (fortunately, they solved some of this headache in Postgres 9.5).
Anyway here is the function (I hope it works well for you):
CREATE OR REPLACE FUNCTION jsonb_update(val1 JSONB, val2 JSONB)
RETURNS JSONB AS $$
DECLARE
    result JSONB;
    v RECORD;
BEGIN
    IF jsonb_typeof(val2) = 'null'
    THEN
        RETURN val1;
    END IF;

    result = val1;

    FOR v IN SELECT key, value FROM jsonb_each(val2) LOOP
        IF jsonb_typeof(val2->v.key) = 'object'
        THEN
            result = result || jsonb_build_object(v.key, jsonb_update(val1->v.key, val2->v.key));
        ELSE
            result = result || jsonb_build_object(v.key, v.value);
        END IF;
    END LOOP;

    RETURN result;
END;
$$ LANGUAGE plpgsql;
Here is sample use:
select jsonb_update('{"a":{"b":{"c":{"d":5,"dd":6},"cc":1}},"aaa":5}'::jsonb, '{"a":{"b":{"c":{"d":15}}},"aa":9}'::jsonb);
jsonb_update
---------------------------------------------------------------------
{"a": {"b": {"c": {"d": 15, "dd": 6}, "cc": 1}}, "aa": 9, "aaa": 5}
(1 row)
As you can see, it analyzes deep down and updates/adds values where needed.
Maybe:
UPDATE test SET data = '"my-other-name"'::json WHERE id = 1;
It worked in my case, where data is of json type.
Matheus de Oliveira created handy functions for JSON CRUD operations in PostgreSQL. They can be imported using the \i directive. Notice the jsonb fork of the functions if jsonb is your data type.
9.3 json https://gist.github.com/matheusoliveira/9488951
9.4 jsonb https://gist.github.com/inindev/2219dff96851928c2282
Updating the whole column worked for me:
UPDATE test SET data='{"name": "my-other-name", "tags": ["tag1", "tag2"]}' where id=1;