I want to update an XML column in DB2 with dynamic values, that is, with values that I'll pick from another table and insert into the XML column.
I know how to insert a node along with a hard-coded value, e.g.
<data>some_value</data>
I want to do it in the following way:
UPDATE my_table SET my_table_column = XMLQuery(..... <data>???</data>)
WHERE my_table_id = other_table_id;
Where I placed ??? I need some kind of select statement that will come up with the actual value for the node.
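Conceptually, something like this is what I'm after (an untested sketch; other_table, other_table_id and other_val are placeholder names, and the /data path would need to match where the node actually sits in my documents). The value bound to the XQuery variable $val comes from the other table:
UPDATE my_table m
SET my_table_column =
    (SELECT XMLQUERY(
              'copy $new := $doc
               modify do replace value of $new/data with $val
               return $new'
              PASSING m.my_table_column AS "doc",
                      o.other_val AS "val")       -- value picked from the other table
     FROM other_table o
     WHERE o.other_table_id = m.my_table_id)
WHERE EXISTS (SELECT 1 FROM other_table o
              WHERE o.other_table_id = m.my_table_id);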
I have to produce a dynamically generated T-SQL script that inserts records into various tables. I've done a bunch of searching and testing but can't seem to find the path I'm looking for.
I know that the following is valid SQL:
INSERT INTO [MyTable] ( [Col1], [Col2], [Col3] )
SELECT N'Val1', N'Val2', N'Val3';
But, is it at all possible to write something akin to this:
INSERT INTO [MyTable]
SELECT [Col1] = N'Val1', [Col2] = N'Val2', [Col3] = N'Val3';
By having the columns in the select statement, I'm able to do it all at once instead of writing 2 separate lines. Obviously my idea doesn't work; I'm trying to figure out whether something similar is possible or I need to stick with the first one.
Much appreciated.
Best practice for insert statements is to specify the column list in the insert clause, and for very good reasons:
It's far more readable. You know exactly what value goes into what column.
You don't have to provide values for nullable or default-valued columns.
You're not bound to the order of the columns in the table.
If a column is added to the table, your insert statement won't break (unless the newly added column is not nullable and doesn't have a default value).
In some cases, SQL Server requires you to specify the column list explicitly, such as when IDENTITY_INSERT is set to ON.
And in any case, the column names or aliases in the select clause of an insert...select statement have no effect on which target column each value goes to. Values are mapped to target columns based only on their position in the statement.
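For example, using the MyTable columns from your question, the aliases below are simply ignored and each value lands in the column that occupies the same position in the column list:
INSERT INTO [MyTable] ([Col1], [Col2], [Col3])
SELECT N'Val1' AS [Col3], N'Val2' AS [Col1], N'Val3' AS [Col2];
-- Col1 gets N'Val1', Col2 gets N'Val2', Col3 gets N'Val3', despite the aliases.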
I have a column of type TEXT which is supposed to represent a CLOB value and I'm trying to update its value like this:
UPDATE my_table SET my_column = TEXT 'Text value';
Normally this column is written and read by Hibernate and I noticed that values written with Hibernate are stored as integers (perhaps some internal Postgres reference to the CLOB data).
But when I try to update the column with the above SQL, the value is stored as a string and when Hibernate tries to read it, I get the following error: Bad value for type long : ["Text value"]
I tried all the options described in this answer but the result is always the same. How do I insert/update a TEXT column using SQL?
In order to update a CLOB created by Hibernate you should use the functions for handling large objects.
The documentation can be found at the following links:
https://www.postgresql.org/docs/current/lo-interfaces.html
https://www.postgresql.org/docs/current/lo-funcs.html
Examples:
To query:
select mytable.*, convert_from(loread(lo_open(mycblobfield::int, x'40000'::int), x'40000'::int), 'UTF8') from mytable where mytable.id = 4;
Note:
x'40000' corresponds to read mode (INV_READ)
To Update:
select lowrite(lo_open(16425, x'60000'::int), convert_to('this an updated text','UTF8'));
Note:
x'60000' corresponds to read and write mode (INV_READ + INV_WRITE).
The number 16425 is an example loid (large object id) which already exists in a record in your table. It's the integer number you can see as the value in the blob field created by Hibernate.
To Insert:
select lowrite(lo_open(lo_creat(-1), x'60000'::int), convert_to('this is a new text','UTF8'));
Note:
lo_creat(-1) generates a new large object and returns its loid
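After creating the object you still have to store the returned loid in the row so Hibernate can find it. As an untested sketch (mytable, mycblobfield and the id value come from the query example above), on PostgreSQL 9.4+ you can let lo_from_bytea create and fill the large object in one call and assign the resulting loid directly:
update mytable
set mycblobfield = lo_from_bytea(0, convert_to('this is a new text', 'UTF8'))::text
where mytable.id = 4;
-- lo_from_bytea(0, ...) creates a new large object with the given content and
-- returns its loid; 0 means "let the server pick a new loid". The ::text cast
-- matches a TEXT column that stores the loid as a number.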
I needed basic help on how to combine columns into one new column in the same table. I have done the below as a SELECT command and it works fine. I just don't know how to add it to the table permanently so that it becomes part of the table.
SELECT *, concat(z41, z42, z43, z44) AS option_3,
concat(z411, z412, z413, z421, z422, z423, z431, z432, z433, z434, z444,z443, z442, z441) AS option_4,
concat(z4211, z4212, z4213, z4214, z4215, z4311, z4312, z4313, z4314, z4431, z4432, z4433, z4434, z4421, z4422, z4423, z4424, z4425, z4426) AS option_5
FROM combined_full
As others have mentioned, you are probably better off using a view. But if you really need this computed data in a column then you can do this:
ALTER TABLE combined_full ADD COLUMN option_3 varchar,
ADD COLUMN option_4 varchar,
ADD COLUMN option_5 varchar;
UPDATE combined_full
SET option_3 = concat(z41, z42, z43, z44),
option_4 = concat(z411, z412, z413, z421, z422, z423, z431, z432, z433, z434, z444,z443, z442, z441),
option_5 = concat(z4211, z4212, z4213, z4214, z4215, z4311, z4312, z4313, z4314, z4431, z4432, z4433, z4434, z4421, z4422, z4423, z4424, z4425, z4426);
When adding new rows to the table, you should either also enter values for these three new columns, or create an insert trigger so that the values are automatically calculated as you do above.
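A rough sketch of such a trigger (untested, using the column names from your query; it recalculates the three columns on every insert and update):
CREATE OR REPLACE FUNCTION combined_full_set_options()
RETURNS trigger AS
$$
BEGIN
  NEW.option_3 := concat(NEW.z41, NEW.z42, NEW.z43, NEW.z44);
  NEW.option_4 := concat(NEW.z411, NEW.z412, NEW.z413, NEW.z421, NEW.z422, NEW.z423,
                         NEW.z431, NEW.z432, NEW.z433, NEW.z434, NEW.z444, NEW.z443,
                         NEW.z442, NEW.z441);
  NEW.option_5 := concat(NEW.z4211, NEW.z4212, NEW.z4213, NEW.z4214, NEW.z4215,
                         NEW.z4311, NEW.z4312, NEW.z4313, NEW.z4314, NEW.z4431,
                         NEW.z4432, NEW.z4433, NEW.z4434, NEW.z4421, NEW.z4422,
                         NEW.z4423, NEW.z4424, NEW.z4425, NEW.z4426);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_combined_full_set_options
BEFORE INSERT OR UPDATE ON combined_full
FOR EACH ROW EXECUTE PROCEDURE combined_full_set_options();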
"so that it becomes part of the table" - you can't. Unfortunately Postgres (as of 9.6) has no (persisted) computed columns.
If the expression is not very expensive to calculate and you don't need an index on it, I would suggest creating a view that contains the expression.
Given the example in your question, this should be good enough in your case as concatenating values isn't really that expensive.
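For example (the view name is arbitrary), this is essentially just your SELECT wrapped in a view:
CREATE VIEW combined_full_with_options AS
SELECT *,
       concat(z41, z42, z43, z44) AS option_3,
       concat(z411, z412, z413, z421, z422, z423, z431, z432, z433, z434,
              z444, z443, z442, z441) AS option_4,
       concat(z4211, z4212, z4213, z4214, z4215, z4311, z4312, z4313, z4314,
              z4431, z4432, z4433, z4434, z4421, z4422, z4423, z4424, z4425,
              z4426) AS option_5
FROM combined_full;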
If you really think you need to persist the calculation of the expression, because e.g. you want to create an index on it or you constantly use the expression in a where clause, you will need to add a regular column to the table and a trigger that updates the expression when a row is inserted or updated.
A user can only modify the ST_ASSMT_NM and CAN_DT columns in an ST_ASSMT_REF record. In our system, we keep history in the same table and we never really update a record; we just insert a new row to represent the updated record. As a result, the "active" record is the one with the greatest LAST_TS timestamp value for a VENDR_ID. To prevent the possibility of an update to columns that cannot be changed, I wrote the logical UPDATE so that it retrieves the non-changeable values from the original record and copies them to the new one being created. For the fields that can be modified, I pass them in as parameters:
INSERT INTO GSAS.ST_ASSMT_REF
(
VENDR_ID
,ST_ASSMT_NM
,ST_CD
,EFF_DT
,CAN_DT
,LAST_TS
,LAST_OPER_ID
)
SELECT
ORIG_ST_ASSMT_REF.VENDR_ID
,#ST_ASSMT_NM
,ORIG_ST_ASSMT_REF.ST_CD
,ORIG_ST_ASSMT_REF.EFF_DT
,#CAN_DT
,CURRENT TIMESTAMP
,#LAST_OPER_ID
FROM
(
SELECT
ST_ASSMT_REF_ACTIVE_V.VENDR_ID
,ST_ASSMT_REF_ACTIVE_V.ST_ASSMT_NM
,ST_ASSMT_REF_ACTIVE_V.ST_CD
,ST_ASSMT_REF_ACTIVE_V.EFF_DT
,ST_ASSMT_REF_ACTIVE_V.CAN_DT
,CURRENT TIMESTAMP
,ST_ASSMT_REF_ACTIVE_V.LAST_OPER_ID
FROM
G2YF.ST_ASSMT_REF_ACTIVE_V ST_ASSMT_REF_ACTIVE_V --The view of only the most recent, active records
WHERE
ST_ASSMT_REF_ACTIVE_V.VENDR_ID = #VENDR_ID
) ORIG_ST_ASSMT_REF;
However, I am getting this error:
DB2 SP:
ERROR [42610] [IBM][DB2] SQL0418N The statement was not processed because the statement contains an invalid use of one of the following: an untyped parameter marker, the DEFAULT keyword, or a null value.
It appears as though DB2 will not allow me to use a variable in a SELECT statement. For example, when I do this in TOAD for DB2:
select 1, #vendorId from SYSIBM.SYSDUMMY1
I get a popup dialog box. When I provide any string value, I get the same error.
I usually use SQL Server and I'm pretty sure I wouldn't have an issue doing this, but I am not sure how to handle it yet.
Suggestions? I know that I could do this in two separate commands: a SELECT query to retrieve the original values, then supply the returned values along with the modified ones to the INSERT command. But I should be able to do this in one. Why can't I?
As you mentioned in your comment, DB2 is really picky about data types, and it wants you to cast your variables into the right data types. Even if you are passing in NULLs, sometimes DB2 wants you to cast the NULL to the data type of the target column.
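Applied to your statement, that means casting each parameter marker to the type of its target column. A sketch (the VARCHAR lengths and the DATE type below are guesses; use the actual column definitions of ST_ASSMT_REF, and the inner derived table isn't needed):
INSERT INTO GSAS.ST_ASSMT_REF
    (VENDR_ID, ST_ASSMT_NM, ST_CD, EFF_DT, CAN_DT, LAST_TS, LAST_OPER_ID)
SELECT
    ORIG.VENDR_ID
   ,CAST(#ST_ASSMT_NM AS VARCHAR(100))   -- adjust length to the column definition
   ,ORIG.ST_CD
   ,ORIG.EFF_DT
   ,CAST(#CAN_DT AS DATE)                -- or TIMESTAMP, depending on the column
   ,CURRENT TIMESTAMP
   ,CAST(#LAST_OPER_ID AS VARCHAR(30))
FROM G2YF.ST_ASSMT_REF_ACTIVE_V ORIG
WHERE ORIG.VENDR_ID = #VENDR_ID;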
Here is another answer I have on the topic.
I have a table with more than 30,000 entries and have to add a new column (zip_prefixes) containing the first digit of the zip code (zctea).
I created the column successfully:
alter table zeta add column zip_prefixes text;
Then I tried to put the values in the column with:
update zeta
set zip_prefixes = (
select substr(cast(zctea as text), 1, 1)
from zeta
);
Of course I got:
ERROR: more than one row returned by a subquery used as an expression
How can I get the first digit of the value from zctea into column zip_prefixes of the same row?
No need for sub-select:
update zeta
set zip_prefixes = substr(zctea, 1, 1);
update zeta
set zip_prefixes = substr(zctea, 1, 1);
There is no need for a select query or casting.
Consider not adding a functionally dependent column. It's typically cleaner and cheaper overall to retrieve the first character on the fly. If you need a "table", I suggest to add a VIEW.
Why the need to cast(zctea as text)? A zip code should be text to begin with.
Name it zip_prefix, not zip_prefixes.
Use the simpler and cheaper left():
CREATE VIEW zeta_plus AS
SELECT *, left(zctea::text, 1) AS zip_prefix FROM zeta; -- or without cast?
If you need the additional column in the table and the first character is guaranteed to be an ASCII character, consider the data type "char" (with double quotes). 1 byte instead of 2 (on disk) or 5 (in RAM). Details:
What is the overhead for varchar(n)?
Any downsides of using data type "text" for storing strings?
And run both commands in one transaction if you need to minimize lock time and / or avoid a visible column with missing values in the meantime. Faster, too.
BEGIN;
ALTER TABLE zeta ADD COLUMN zip_prefix "char";
UPDATE zeta SET zip_prefix = left(zctea::text, 1);
COMMIT;