Extract a value from an XML string in DB2

I have an XML string in a column called RawData in table Inbound, and I need to read the value Success from the Status element.
The XML string:
<InboundMessage>
<Transaction>
<Status>Success</Status>
</Transaction>
</InboundMessage>

SELECT X.STATUS
FROM (VALUES XMLPARSE(DOCUMENT '
<InboundMessage>
<Transaction>
<Status>Success</Status>
</Transaction>
</InboundMessage>
')) T (DOC)
, XMLTABLE
(
'$D/InboundMessage/Transaction/Status' PASSING T.DOC AS "D" COLUMNS
STATUS VARCHAR(20) PATH '.'
) X;
Refer to the XMLTABLE function overview link for more details.
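The example above parses a literal document; to read from the actual column, the same XMLTABLE call can be driven by the table. A minimal sketch, assuming RawData holds character data and therefore needs XMLPARSE (if the column is already of type XML, pass it directly):
SELECT X.STATUS
FROM INBOUND I
, XMLTABLE
(
'$D/InboundMessage/Transaction/Status'
PASSING XMLPARSE(DOCUMENT I.RAWDATA) AS "D"
COLUMNS
STATUS VARCHAR(20) PATH '.'
) X;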

Related

PostgreSQL xmltable with a string as input to a procedure

Hello, I am trying to use xmltable in a function to update some data in another table; the input I get is a string.
This is a test program; the final result should be that the insert is created from any XML after some data adjustment.
The XML looks like this:
<header>
<data>
<line>
<field1>some data in f1</field1>
<field2>other data f2</field2>
<field3>this data contains numers 012323</field3>
<field4>and the last data</field4>
</line>
</data>
</header>
I am taking the following steps:
create a table with the input string cast to XML:
create table if not exists test_xml_table as select input_string::xml as string_xml;
do some data adjustments;
the final step is the insert:
insert into test_tab( field1, field2, field3, field4 )
select xt.field1, xt.field2, xt.field3, field4 from test_xml_table
cross join xmltable( '/data/line' passing string_xml
columns field1 text path 'field1', field2 text path 'field2',
field3 text path 'field3', field4 text path 'field4' ) as xt;
The problem is that if the table test_xml_table doesn't exist, the program doesn't create it (I still didn't see it after the create table command). I tried a workaround: create the table first and fill it with the XML data, but now I don't know what to put after the PASSING clause. There is no error, just no data is inserted. I would be grateful for help.
The whole code is below:
create or replace function test_function(
input_string character varying )
RETURNS void
LANGUAGE plpgsql
as $BODY$
begin
create table test_xml_table as select input_string::xml as string_xml;
insert into test_tab( field1, field2, field3, field4 )
select xt.field1, xt.field2, xt.field3, field4 from test_xml_table
cross join xmltable( '/data/line' passing string_xml
columns field1 text path 'field1', field2 text path 'field2',
field3 text path 'field3', field4 text path 'field4' ) as xt;
end;
$BODY$;
The call:
select test_function( '<header>
<data>
<line>
<field1>some data in f1</field1>
<field2>other data f2</field2>
<field3>this data contains numers 012323</field3>
<field4>and the last data</field4>
</line>
</data>
</header>' )
I tried to use the Postgres xmltable function to insert data into a table, expecting the data from the function input (type character varying) to be inserted.
The cross join is not needed.
The row XPath expression should be /header/data/line, not just /data/line:
create or replace function test_function( input_string character varying )
RETURNS void
LANGUAGE plpgsql
as $BODY$
begin
create table test_xml_table as select input_string::xml as string_xml;
insert into test_tab( field1, field2, field3, field4 )
select xt.field1, xt.field2, xt.field3, field4
from test_xml_table, xmltable( '/header/data/line' passing string_xml
columns
field1 text path 'field1',
field2 text path 'field2',
field3 text path 'field3',
field4 text path 'field4'
) as xt;
end;
$BODY$;
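As a side note, the intermediate table isn't strictly necessary for the insert itself. A sketch of the same function that passes the cast string straight to xmltable, keeping everything else unchanged:
create or replace function test_function( input_string character varying )
RETURNS void
LANGUAGE plpgsql
as $BODY$
begin
insert into test_tab( field1, field2, field3, field4 )
select xt.field1, xt.field2, xt.field3, xt.field4
from xmltable( '/header/data/line' passing input_string::xml
columns
field1 text path 'field1',
field2 text path 'field2',
field3 text path 'field3',
field4 text path 'field4'
) as xt;
end;
$BODY$;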

DB2 XML: select multiple rows in a single statement

In my code:
SELECT X.DEP_ID
FROM (SELECT XMLPARSE (DOCUMENT '<root><DEP_ID>1000000004</DEP_ID><DEP_ID>1000000005</DEP_ID></root>') AS ELEMENT_VALUE
FROM SYSIBM.SYSDUMMY1) AS A,
XMLTABLE (
'$d/root'
PASSING Element_value AS "d"
COLUMNS
DEP_ID VARCHAR (10) PATH 'DEP_ID'
) AS X;
I need this as the result:
DEP_ID
1000000004
1000000005
With a single DEP_ID in the XML it works, but with multiple DEP_ID elements it fails with an error:
Wrong row-xquery-expression-constant.
How do I get the output above in DB2?
Try this: the row XQuery expression must select the repeating DEP_ID element, so that XMLTABLE produces one row per element:
SELECT X.DEP_ID
FROM
(
SELECT XMLPARSE (DOCUMENT '<root><DEP_ID>1000000004</DEP_ID><DEP_ID>1000000005</DEP_ID></root>') AS ELEMENT_VALUE
FROM SYSIBM.SYSDUMMY1
) AS A
, XMLTABLE
(
'$d/root/DEP_ID' PASSING Element_value AS "d"
COLUMNS
DEP_ID VARCHAR (10) PATH '.'
) AS X;

Unable to INSERT between tables using ST_GeomFromText

I'm trying to insert point geometry values and other data from one table to another table.
-- create tables
create table bh_tmp (bh_id integer, bh_name varchar
, easting decimal, northing decimal, ground_mod decimal);
create table bh (name varchar);
SELECT AddGeometryColumn('bh', 'bh_geom', 27700, 'POINT',3);
-- populate bh_tmp
insert into bh_tmp values
(1,'C5',542945.0,180846.0,3.947),
(3,'B24',542850.0,180850.0,4.020),
(4,'B26',543020.0,180850.0,4.020);
-- populate bh from bh_tmp
insert into bh(name, bh_geom) SELECT
bh_name,
CONCAT($$ST_GeomFromText('POINT($$, Easting, ' ', Northing, ' '
, Ground_mOD, $$)', 27700)$$);
FROM bh_tmp;
Gives this error:
ERROR: parse error - invalid geometry
SQL state: XX000
Hint: "ST" <-- parse error at position 2 within geometry
I can't see anything wrong with the ST_GeomFromText string that I've specified. But I can populate table bh if I insert rows 'manually', e.g.:
INSERT INTO bh (name, bh_geom)
VALUES ('C5', ST_GeomFromText('POINT(542945.0 180846.0 3.947)', 27700));
What am I doing wrong?
First of all, there is a misplaced semicolon after CONCAT(...).
And you can't concatenate the function name itself into the string; the function has to be called, not quoted:
INSERT INTO bh(name, bh_geom)
SELECT bh_name
, ST_GeomFromText('POINT(' || concat_ws(' ', easting, northing, ground_mod) || ')'
, 27700)
FROM bh_tmp;
Or, since you have values already (not text), you could use ST_MakePoint() and ST_SetSRID():
ST_SetSRID(ST_MakePoint(easting, northing, ground_mod), 27700)
Should be faster.
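For completeness, a sketch of the full insert using that form, with the same tables and columns as above:
INSERT INTO bh(name, bh_geom)
SELECT bh_name
, ST_SetSRID(ST_MakePoint(easting, northing, ground_mod), 27700)
FROM bh_tmp;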
You're getting that error because the output of the CONCAT function is text, and your bh_geom column is geometry, so you're trying to insert text into geometry. This will work:
INSERT INTO bh(name, bh_geom) SELECT
bh_name,
ST_GeomFromText('POINT('
|| easting|| ' '
|| Northing
|| ' '
|| Ground_mOD
|| ')', 27700)
FROM bh_tmp;

Prevent lines appearing in XML

I have the following table and content:
CREATE TABLE [dbo].[MyTable](
[PID] [int] NOT NULL,
[CID] [int] NOT NULL
)
INSERT INTO MyTable values (17344,17345)
INSERT INTO MyTable values (17344,17346)
INSERT INTO MyTable values (17272,17273)
INSERT INTO MyTable values (17272,17255)
INSERT INTO MyTable values (17272,17260)
INSERT INTO MyTable values (17272,17274)
INSERT INTO MyTable values (17272,17252)
From this I need to create the following XML layout:
<Item code="17344">
<BOMs>
<BOM code="17344">
<BOMLine type="17345"/>
<BOMLine type="17346"/>
</BOM>
</BOMs>
</Item>
<Item code="17272">
<BOMs>
<BOM code="17272">
<BOMLine type="17273"/>
<BOMLine type="17255"/>
<BOMLine type="17260"/>
<BOMLine type="17274"/>
<BOMLine type="17252"/>
</BOM>
</BOMs>
</Item>
I'm trying to achieve this with the following statement, which gives me far too many lines and duplicates:
DECLARE @test XML
SELECT @test =
(SELECT PID '@code',
(SELECT PID as '@code',
(SELECT CID as '@type'
FROM MyTable
FOR XML PATH('BOMLine'), TYPE)
FROM MyTable GROUP BY PID
FOR XML PATH('BOM'), TYPE, ROOT('BOMs'))
FROM MyTable
FOR XML PATH('Item'), TYPE)
select @test
Can anyone help me with this? I'm using SQL Server 2008 by the way.
It would be greatly appreciated.
Best regards,
Wes
You need the group by in the outermost query and you need to make your sub-query correlated with the outer query on PID.
The extra PID in BOM does not need a from clause.
select T1.PID as '@code',
(
select T1.PID as '@code',
(
select T2.CID as '@type'
from dbo.MyTable as T2
where T1.PID = T2.PID
for xml path('BOMLine'), type
)
for xml path('BOM'), root('BOMs'), type
)
from dbo.MyTable as T1
group by T1.PID
for xml path('Item')
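To capture the result in an XML variable as the original attempt does, the same statement can be wrapped in an assignment; a sketch reusing the query above unchanged:
DECLARE @test XML
SET @test =
(
select T1.PID as '@code',
(
select T1.PID as '@code',
(
select T2.CID as '@type'
from dbo.MyTable as T2
where T1.PID = T2.PID
for xml path('BOMLine'), type
)
for xml path('BOM'), root('BOMs'), type
)
from dbo.MyTable as T1
group by T1.PID
for xml path('Item'), type
)
select @test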

Most succinct way to transform a CSV string to a table in T-SQL?

-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Question: How to get roles into a table view like this:
select 'Pub'
union
select 'RegUser'
union
select 'ServiceAdmin'
After posting this, I started playing with some dynamic SQL. This seems to work, but seems like there might be some security risks by using dynamic SQL - thoughts on this?
declare @rolesSql varchar(800)
select @rolesSql = 'select ''' + replace(@roles, ',', ''' union select ''') + ''''
exec(@rolesSql)
If you're working with SQL Server 2016 or later (compatibility level 130), then the STRING_SPLIT function is now the most succinct method available.
Reference link: https://msdn.microsoft.com/en-gb/library/mt684588.aspx
Usage:
SELECT * FROM string_split('Pub,RegUser,ServiceAdmin',',')
RESULT:
value
-----------
Pub
RegUser
ServiceAdmin
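It also works directly with the variable declared in the question:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
select value from string_split(@roles, ',')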
See my answer from here
But basically you would:
Create this function in your DB:
CREATE FUNCTION dbo.Split(@origString varchar(max), @Delimiter char(1))
returns @temptable TABLE (items varchar(max))
as
begin
declare @idx int
declare @split varchar(max)
select @idx = 1
if len(@origString) < 1 or @origString is null return
while @idx != 0
begin
set @idx = charindex(@Delimiter, @origString)
if @idx != 0
set @split = left(@origString, @idx - 1)
else
set @split = @origString
if(len(@split) > 0)
insert into @temptable(Items) values(@split)
set @origString = right(@origString, len(@origString) - @idx)
if len(@origString) = 0 break
end
return
end
and then call the function and pass in the string you want to split.
Select * From dbo.Split(@roles, ',')
Here's a thorough discussion of your options:
Arrays and Lists in SQL Server
What I do in this case is use some string replacement to convert the CSV to JSON and then open the JSON like a table. It may not be suitable for every use case, but it is very simple to get running and works with strings and files. With files you just need to watch your line break character; mostly I find it to be Char(13)+Char(10). Note that OPENJSON requires database compatibility level 130 or higher.
declare @myCSV nvarchar(MAX)= N'"Id";"Duration";"PosX";"PosY"
"•P001";223;-30;35
"•P002";248;-28;35
"•P003";235;-26;35'
--CSV to JSON
--convert to json by replacing some stuff
declare @myJson nvarchar(MAX)= '[['+ replace(@myCSV, Char(13)+Char(10), '],[' ) +']]'
set @myJson = replace(@myJson, ';',',') -- Optional: ensure comma delimiters for json if the current delimiter differs
-- set @myJson = replace(@myJson, ',,',',null,') -- Optional: empty in between
-- set @myJson = replace(@myJson, ',]',',null]') -- Optional: empty before linebreak
SELECT
ROW_NUMBER() OVER (ORDER BY (SELECT 0))-1 AS LineNumber, *
FROM OPENJSON( @myJson )
with (
col0 varchar(255) '$[0]'
,col1 varchar(255) '$[1]'
,col2 varchar(255) '$[2]'
,col3 varchar(255) '$[3]'
,col4 varchar(255) '$[4]'
,col5 varchar(255) '$[5]'
,col6 varchar(255) '$[6]'
,col7 varchar(255) '$[7]'
,col8 varchar(255) '$[8]'
,col9 varchar(255) '$[9]'
--any name column count is possible
) csv
order by (SELECT 0) OFFSET 1 ROWS --hide header row
Using SQL Server's built-in XML parsing is also an option. Of course, this glosses over all the nuances of an RFC 4180-compliant CSV.
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Here's the XML way
select split.csv.value('.', 'varchar(100)') as value
from (
select cast('<x>' + replace(@roles, ',', '</x><x>') + '</x>' as xml) as data
) as csv
cross apply data.nodes('/x') as split(csv)
If you are using SQL Server 2016+, string_split is better, but this was a common way to do it before SQL Server 2016.
Using BULK INSERT you can import a CSV file into your SQL table:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
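A minimal sketch of that approach; the target table and file path here are hypothetical placeholders, and the file must be accessible from the server:
-- assumes a table dbo.Roles(value varchar(100)) already exists
BULK INSERT dbo.Roles
FROM 'C:\data\roles.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');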
Even though the accepted answer works fine, I found this function much faster, even for thousands of records. Create the function below and use it:
IF EXISTS (
SELECT 1
FROM Information_schema.Routines
WHERE Specific_schema = 'dbo'
AND specific_name = 'FN_CSVToStringListTable'
AND Routine_Type = 'FUNCTION'
)
BEGIN
DROP FUNCTION [dbo].[FN_CSVToStringListTable]
END
GO
CREATE FUNCTION [dbo].[FN_CSVToStringListTable] (@InStr VARCHAR(MAX))
RETURNS @TempTab TABLE (Id NVARCHAR(max) NOT NULL)
AS
BEGIN
-- Ensure input ends with a comma and collapse doubled commas
SET @InStr = REPLACE(@InStr + ',', ',,', ',')
DECLARE @SP INT
DECLARE @VALUE VARCHAR(1000)
WHILE PATINDEX('%,%', @InStr) <> 0
BEGIN
SELECT @SP = PATINDEX('%,%', @InStr)
SELECT @VALUE = LEFT(@InStr, @SP - 1)
SELECT @InStr = STUFF(@InStr, 1, @SP, '')
INSERT INTO @TempTab (Id)
VALUES (@VALUE)
END
RETURN
END
GO
---Test like this.
declare @v as NVARCHAR(max) = N'asdf,,as34df,234df,fs,,34v,5fghwer,56gfg,';
SELECT Id FROM dbo.FN_CSVToStringListTable(@v)
I was about to use the solution mentioned in the accepted answer, but more research led me to Table Value Types:
These are far more efficient, and you don't need a TVF (table-valued function) just to create a table from CSV. You can use the type directly in your scripts or pass it to a stored procedure as a table-valued parameter. The type can be created as:
CREATE TYPE [UniqueIdentifiers] AS TABLE(
[Id] [varchar](20) NOT NULL
)
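A sketch of how the type might then be used, inline or as a parameter; the stored procedure name here is a hypothetical placeholder:
DECLARE @ids UniqueIdentifiers;
INSERT INTO @ids (Id) VALUES ('Pub'), ('RegUser'), ('ServiceAdmin');
-- use it like a table ...
SELECT Id FROM @ids;
-- ... or pass it to a procedure whose parameter is declared
-- as [UniqueIdentifiers] READONLY (hypothetical procedure):
EXEC dbo.ProcessIdentifiers @Ids = @ids;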