I am trying to use jsonb_set to change a single attribute of a jsonb object within my Postgres table.
I am using WITH random_string AS to set the variable random_string to a random hexadecimal value and then pass that string into my UPDATE query. However, it is not working.
This is the code I am using:
WITH random_string AS (SELECT substr(md5(random()::text), 0, 25)::varchar)
UPDATE
teams
SET
profile = jsonb_set(profile, '{api_key}', random_string)
WHERE
team_id="abc123";
The error I get suggests Postgres thinks I am trying to access a column that does not exist, because that is how an unquoted name is normally interpreted.
Postgres query failed, PostgresPlugin query failed to execute: error: column "random_string" does not exist
Question: How do I use my random_string variable in the jsonb_set function to update this attribute?
Three issues. First, a WITH clause produces a table-like result (a CTE), not a variable; give it a single named column and reference it through the FROM clause of the UPDATE. Second, the third argument of jsonb_set() must be jsonb, so use to_jsonb(). And last, a proper text literal is written in single quotes: 'abc123' (double quotes denote identifiers such as column names).
WITH var(random_string) AS (SELECT substr(md5(random()::text), 0, 25)::varchar)
UPDATE
teams
SET
profile = jsonb_set(profile, '{api_key}', to_jsonb(random_string))
FROM
var
WHERE
team_id = 'abc123';
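To confirm the update took effect, you can read the key back afterwards (a quick check using the same table and column names as above):
SELECT profile->>'api_key' AS api_key
FROM teams
WHERE team_id = 'abc123';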
Related
In a plpgsql function I need to do several checks, and return some values based on those checks.
I can do SELECT column INTO v_variable FROM table, but what I need to store is not the result of a SELECT; it is the result of an UPDATE table SET column = new_value RETURNING check.
Is there a way to store this check in a variable to use later, or just in an OUT variable so this value is returned by the function?
You can store the value from the RETURNING clause in a variable using INTO, just as for SELECT:
UPDATE table
SET column = new_value
RETURNING check INTO my_variable
It doesn't matter whether it is a SELECT, INSERT or UPDATE. When you specify RETURNING, it works the same way as a SELECT. So you can write:
UPDATE table SET column = new_value RETURNING check INTO <your_variable>
More than that, you can use the results in the same query with the help of a CTE:
WITH updated AS (
UPDATE table SET column = new_value RETURNING check
)
SELECT check FROM updated ...
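Putting it together inside a plpgsql function (a minimal sketch; the table, column and variable names here are hypothetical):
CREATE FUNCTION apply_update(p_id integer)
RETURNS text
LANGUAGE plpgsql AS $$
DECLARE
    my_variable text;
BEGIN
    UPDATE my_table
    SET my_column = 'new_value'
    WHERE id = p_id
    RETURNING my_check INTO my_variable;  -- RETURNING ... INTO works just like SELECT ... INTO

    RETURN my_variable;  -- or hand it back through an OUT parameter instead
END;
$$;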
I have a column of type TEXT which is supposed to represent a CLOB value and I'm trying to update its value like this:
UPDATE my_table SET my_column = TEXT 'Text value';
Normally this column is written and read by Hibernate and I noticed that values written with Hibernate are stored as integers (perhaps some internal Postgres reference to the CLOB data).
But when I try to update the column with the above SQL, the value is stored as a string and when Hibernate tries to read it, I get the following error: Bad value for type long : ["Text value"]
I tried all the options described in this answer but the result is always the same. How do I insert/update a TEXT column using SQL?
In order to update a CLOB created by Hibernate you should use the functions for handling large objects.
The documentation can be found at the following links:
https://www.postgresql.org/docs/current/lo-interfaces.html
https://www.postgresql.org/docs/current/lo-funcs.html
Examples:
To query:
select mytable.*, convert_from(loread(lo_open(mycblobfield::int, x'40000'::int), x'40000'::int), 'UTF8') from mytable where mytable.id = 4;
Note:
x'40000' corresponds to read-only mode (INV_READ)
To update:
select lowrite(lo_open(16425, x'60000'::int), convert_to('this an updated text','UTF8'));
Note:
x'60000' = INV_READ + INV_WRITE corresponds to read/write mode
The number 16425 is an example loid (large object id) that already exists in a record in your table. It is the integer value you can see in the field created by Hibernate.
To insert:
select lowrite(lo_open(lo_creat(-1), x'60000'::int), convert_to('this is a new text','UTF8'));
Note:
lo_creat(-1) generates a new large object and returns its loid
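If you prefer to create the large object and point the Hibernate-mapped column at it in a single statement, lo_from_bytea() (covered in the lo-funcs page linked above) is handy. A minimal sketch, assuming the column stores the loid as text, as described in the question:
-- create a large object holding the text and store its loid in the row
UPDATE my_table
SET my_column = lo_from_bytea(0, convert_to('Text value', 'UTF8'))::text
WHERE id = 4;
Passing 0 as the first argument lets the server pick a free loid; the function returns it, and the cast stores it in the row.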
A user can only modify the ST_ASSMT_NM and CAN_DT columns in the ST_ASSMT_REF record. In our system, we keep history in the same table and we never really update a record; we just insert a new row to represent the updated record. As a result, the "active" record is the record with the greatest LAST_TS timestamp value for a VENDR_ID. To prevent the possibility of an update to columns that cannot be changed, I wrote the logical UPDATE so that it retrieves the non-changeable values from the original record and copies them to the new one being created. For the fields that can be modified, I pass them in as parameters:
INSERT INTO GSAS.ST_ASSMT_REF
(
VENDR_ID
,ST_ASSMT_NM
,ST_CD
,EFF_DT
,CAN_DT
,LAST_TS
,LAST_OPER_ID
)
SELECT
ORIG_ST_ASSMT_REF.VENDR_ID
,#ST_ASSMT_NM
,ORIG_ST_ASSMT_REF.ST_CD
,ORIG_ST_ASSMT_REF.EFF_DT
,#CAN_DT
,CURRENT TIMESTAMP
,#LAST_OPER_ID
FROM
(
SELECT
ST_ASSMT_REF_ACTIVE_V.VENDR_ID
,ST_ASSMT_REF_ACTIVE_V.ST_ASSMT_NM
,ST_ASSMT_REF_ACTIVE_V.ST_CD
,ST_ASSMT_REF_ACTIVE_V.EFF_DT
,ST_ASSMT_REF_ACTIVE_V.CAN_DT
,CURRENT TIMESTAMP
,ST_ASSMT_REF_ACTIVE_V.LAST_OPER_ID
FROM
G2YF.ST_ASSMT_REF_ACTIVE_V ST_ASSMT_REF_ACTIVE_V --The view of only the most recent, active records
WHERE
ST_ASSMT_REF_ACTIVE_V.VENDR_ID = #VENDR_ID
) ORIG_ST_ASSMT_REF;
However, I am getting this error from the DB2 stored procedure:
ERROR [42610] [IBM][DB2] SQL0418N The statement was not processed because the statement contains an invalid use of one of the following: an untyped parameter marker, the DEFAULT keyword, or a null value.
It appears as though DB2 will not allow me to use a variable in a SELECT statement. For example, when I do this in TOAD for DB2:
select 1, #vendorId from SYSIBM.SYSDUMMY1
I get a popup dialog box. When I provide any string value, I get the same error.
I usually use SQL Server and I'm pretty sure I wouldn't have an issue doing this there, but I am not sure how to handle it yet.
Suggestions? I know that I could do this in two separate commands: one SELECT query to retrieve the original values, then supply the returned values and the modified ones to the INSERT command. But I should be able to do this in one. Why can't I?
As you mentioned in your comment, DB2 is really picky about data types and wants you to cast your variables to the right types. Even if you are passing in NULLs, sometimes DB2 wants you to cast the NULL to the data type of the target column.
Here is another answer I have on the topic.
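Applied to the question, that means wrapping each untyped parameter marker in a CAST. For example, the standalone test from TOAD works once the marker is cast (a sketch; VARCHAR(20) is an assumed length, so match it to your column):
select 1, CAST(#vendorId AS VARCHAR(20)) from SYSIBM.SYSDUMMY1
The same treatment applies to #ST_ASSMT_NM, #CAN_DT and #LAST_OPER_ID in the SELECT list of the INSERT, using the data types of their target columns (e.g. CAST(#CAN_DT AS DATE)).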
I would like to know how I can insert a regular expression into a column of a PostgreSQL table.
For example, I have a column called "rule" in a table where I need to store the expression ^[0-9]+$. I tried:
insert into rule_master(rule)
values('^[0-9]+$') where rule_id='7'
But I am getting an error saying the syntax near WHERE is wrong. I tried this with and without single quotes. Please suggest a solution.
It appears you want to UPDATE an existing record. In that case you should do:
UPDATE rule_master
SET rule = '^[0-9]+$'
WHERE rule_id = '7';
But if this is indeed a new record and you want to INSERT that regex with the value of "rule_id" then do:
INSERT INTO rule_master(rule_id, rule)
VALUES ('7', '^[0-9]+$');
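Once the pattern is stored, it can be applied with the regular-expression match operator ~. A small sketch, where the inputs table and its value column are hypothetical:
SELECT i.value
FROM inputs i
JOIN rule_master r ON r.rule_id = '7'
WHERE i.value ~ r.rule;  -- '^[0-9]+$' keeps only digit-only strings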
I want to update an XML column in DB2 with dynamic values, that is, with values that I'll pick from another table and insert into the XML column.
I know how to insert a node along with its value when the value is hard-coded, e.g.
<data>some_value</data>
I want to do it in the following way:
UPDATE my_table SET my_table_column = XMLQuery(..... <data>???</data>)
WHERE my_table_id = other_table_id;
Where I placed ??? I need some kind of SELECT statement that will come up with the actual value for the node.
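One possible approach is to build the node in a correlated subquery with the SQL/XML constructors XMLELEMENT and XMLDOCUMENT rather than XMLQuery. A sketch under the assumption of hypothetical other_table, some_value and other_table_id names, which the question only hints at:
-- replace the XML value with a <data> element built from the other table's value
UPDATE my_table m
SET my_table_column = (
    SELECT XMLDOCUMENT(XMLELEMENT(NAME "data", o.some_value))
    FROM other_table o
    WHERE o.other_table_id = m.my_table_id
)
WHERE EXISTS (
    SELECT 1 FROM other_table o WHERE o.other_table_id = m.my_table_id
);
Note that this replaces the entire XML value with a single <data> element; editing one node inside a larger document would instead call for an XQuery transform expression inside XMLQuery.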