I am always getting a null value when using the XMLCAST function in Db2, but a normal function call returns the value in XML format. With the query below, proper XML data comes back; the function Getdatavalue returns the XML data type. I am using Linux 8.2 and Db2 11.5.
db2 "select sample.Getdatavalue(1,4504)) from sysibm.sysdummy1"
In the above result there are some double quotes in the output XML.
But once I use the XMLCAST function as below, it returns the hex value of null. I am using the MD5 hashing mechanism.
db2 "select HEX(HASH(xmlcast(sample.Getdatavalue(1,4504) as varchar(32672)),0)) as hash_val from sysibm.sysdummy1";
I even tried without HASH, and the query below still returned a null result. It looks like XMLCAST is not working.
db2 "select xmlcast(sample.Getdatavalue(1,4504) as varchar(32672)) as hash_val from sysibm.sysdummy1";
Is there any way to convert the XML data type to CLOB or VARCHAR?
The body of Getdatavalue is:
CREATE FUNCTION sample.Getdatavalue (v1 smallint, v2 integer)
RETURNS xml
LANGUAGE SQL
BEGIN ATOMIC
DECLARE v_Data xml;
SET v_Data =
  (SELECT XMLELEMENT(NAME "rt_ky_cmpnnt",
            XMLAGG(
              XMLELEMENT(NAME "rt",
                XMLATTRIBUTES(rt.rt_field_add_val AS "rt_add_val",
                              rt.rt_name AS "rt_name",
                              rt.rt_val1 AS "rt_val1",
                              rt.rt_lim AS "rt_lim",
                              rt.rt_val2 AS "rt_val2",
                              rt.rt_whole_val AS "rt_whole_val"))
              ORDER BY rf.rt_num ASC)
            OPTION NULL ON NULL)
   FROM sample.cmpnnt_val rt
   INNER JOIN sample.field_val rf
     ON rf.abc = rt.abc
    AND rf.pqr = rt.pqr
    AND rt.xyz = rf.xyz
   WHERE rt.abc = v1
     AND rt.bcd = v2);
RETURN v_Data;
END!
Can anyone please explain how to handle XMLCAST?
To retrieve a string from an XML value, use the function XMLSERIALIZE.
The XMLSERIALIZE function returns a serialized XML value of the specified data type generated from the XML-expression argument.
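Applied to the queries above, that means swapping XMLCAST for XMLSERIALIZE. A minimal sketch, reusing the function and parameters from the question:
db2 "select xmlserialize(content sample.Getdatavalue(1,4504) as varchar(32672)) as xml_val from sysibm.sysdummy1"
db2 "select HEX(HASH(xmlserialize(content sample.Getdatavalue(1,4504) as varchar(32672)),0)) as hash_val from sysibm.sysdummy1"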
While going through the documentation (data types) of Apache AGE, I ran into some interesting queries that demonstrate the input/output format of data types, such as:
SELECT *
FROM cypher('graph_name', $$
RETURN 1
$$) AS (int_result agtype);
Which as expected outputs
 int_result
------------
 1
(1 row)
However, if the query is modified to
SELECT *
FROM cypher('graph_name', $$
RETURN 1.4
$$) AS (int_result agtype);
It still outputs
 int_result
------------
 1.4
(1 row)
The same behavior occurs with the boolean data type, where you can enter any numeric value and it is output as-is. For instance,
SELECT *
FROM cypher('graph_name', $$
RETURN 1.5
$$) AS (boolean_result agtype);
Will output
 boolean_result
----------------
 1.5
(1 row)
It is also interesting to note that any other input (except a numeric value, true, or false), for example t, gives an error for the same boolean query.
SELECT *
FROM cypher('graph_name', $$
RETURN t
$$) AS (boolean_result agtype);
Will output
ERROR: could not find rte for t
I was expecting similar behavior for other inputs that did not match the data types; however, they proceeded just fine. What might be the developmental reason for such behavior?
The confusion here is that in (int_result agtype), int_result is not the data type but just the name of the column that will be displayed in the result. The data type itself is agtype, which is handled internally; AGE itself infers and sets the correct underlying type for the agtype value.
SELECT *
FROM cypher('graph_name', $$
RETURN t
$$) AS (boolean_result agtype);
This case gives an error because Cypher treats the unquoted t as a variable reference; since no variable named t was declared in the query, it throws an error.
This behavior can seem unexpected, but it is because Apache AGE supports dynamic typing of data. This means that the system infers the data type of the value returned by the Cypher query and maps it to an appropriate AGE data type. In the first query, the value returned is the integer literal 1, which AGE maps to an integer type; in the second query, the value returned is the floating-point literal 1.4, which AGE maps to a float type.
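By contrast, a quoted string is a valid literal and is mapped to an agtype string. A hypothetical example (agtype prints strings with double quotes):
SELECT *
FROM cypher('graph_name', $$
RETURN 'some text'
$$) AS (string_result agtype);
Which outputs
 string_result
---------------
 "some text"
(1 row)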
I'm trying to insert data into a JSONB field based on a dependent table.
Essentially I want to do this (ignore the why; this is just an example query):
insert into myschema.teams (team_name, params)
select users.team_name, '{"team_name": teams.team_id, "user_name": users.username }'
from myschema.users
where users.team_name is not null;
As written I'm getting these errors:
ERROR: invalid input syntax for type json
LINE 2: ... '{"team_name...
^
DETAIL: Token "teams" is invalid.
CONTEXT: JSON data, line 1: {"team_name": teams...
You are using a string literal that doesn't contain valid JSON; there is no interpolation going on. You need to use the jsonb_build_object function to create the JSONB value from dynamic values. (You could also do string concatenation and then cast from text to jsonb, but please don't.)
insert into myschema.teams (team_name, params)
select users.team_name, jsonb_build_object('team_name', users.team_name, 'user_name', users.username)
from myschema.users
where users.team_name is not null;
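As a quick illustration of what jsonb_build_object produces from its alternating key/value arguments (the values here are made up):
select jsonb_build_object('team_name', 42, 'user_name', 'alice');
-- {"team_name": 42, "user_name": "alice"}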
I use this SQL to build and execute a dynamic statement:
v_sql4 :='
INSERT INTO public.rebatesys(head,contract_no,history_no,f_sin,line_no,s_line_no,departmentcd,catagorycd,jan,seriescd,f_exclude, f_del,ins_date,ins_time,ins_user_id,ins_func_id,ins_ope_id,upd_date,upd_time,upd_user_id,upd_func_id,upd_ope_id)
VALUES (0, '''||v_contract_no||''', '||v_history_no||',1, '||v_line_no||', '||v_down_s_line_no||', '||coalesce(v_deptCD,null)||', '||0||', '''||v_singleJan||''','''||0||''','||v_fExclude||',
0, current_date, current_time, '||v_ins_user_id||', 0, 0,
current_date,current_time,'||v_upd_user_id||',0, 0);';
RAISE NOTICE 'v_sql4 IS : %', v_sql4;
EXECUTE v_sql4;
But when the field "v_deptCD" is null, the whole SQL string is null. Even though I use coalesce, I still can't fix it; the output is:
NOTICE: v_sql4 IS : <NULL>
How to fix it?
When v_deptCD is null, you want to replace it with the string 'null', not the keyword.
', '||coalesce(v_deptCD,'null')||', '
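The underlying gotcha is that concatenating NULL with || nullifies the whole string, which is why the entire statement came out as NULL:
SELECT 'VALUES (' || NULL || ')';                    -- result: NULL
SELECT 'VALUES (' || coalesce(NULL, 'null') || ')';  -- result: VALUES (null)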
You can use this:
case when v_deptCD is not null then v_deptCD else 'null' end
or use concat for string concatenation inside SQL, which treats NULL arguments as empty strings:
concat(field1, ', ', field2)
An alternative approach to JGH's solution is to use the function format(your_string, list, of, values). It can ignore NULL values, but it also has the option to render them as NULL if you use %L in your format string. It will, however, single-quote numbers if you use that format specifier, requiring casts in some cases.
Format arguments according to a format string. This function is similar to the C function sprintf. See Section 9.4.1.
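A minimal sketch of that approach against the original statement, shortened to three of the columns for brevity; %L quotes text values and renders a NULL argument as the keyword NULL:
v_sql4 := format(
    'INSERT INTO public.rebatesys(head, contract_no, departmentcd) VALUES (0, %L, %L)',
    v_contract_no, v_deptCD);
RAISE NOTICE 'v_sql4 IS : %', v_sql4;  -- no longer NULL when v_deptCD is NULL
EXECUTE v_sql4;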
But in my opinion the best solution is to use the USING clause and pass the values in there. It looks somewhat like a prepared statement and protects you from SQL injection, but it does not cache plans the way prepared statements do. There are simple examples of how to do this in the documentation on executing dynamic commands.
EXECUTE 'SELECT count(*) FROM mytable WHERE inserted_by = $1 AND inserted <= $2'
INTO c
USING checked_user, checked_date;
My task is to convert an ABAP-style date (i.e. 2017-11-20, which is represented as the string "20171120") to a HANA date via SQLScript. This can easily be done by:
select to_date('20171120','YYYYMMDD') from dummy;
But there is another requirement: if the ABAP date is initial (value '00000000'), the database shall store a null value. I have found a working solution: I replace the potential initial date '00000000' with 'Z' and trim the string to null if only 'Z' is found:
select to_date(trim(leading 'Z' from replace('00000000','00000000','Z')),'YYYYMMDD') from dummy;
-- result: null
select to_date(trim(leading 'Z' from replace('20171120','00000000','Z')),'YYYYMMDD') from dummy;
-- result: 2017-11-20
But this looks like a dirty hack. Does anybody have an idea for a more elegant solution?
As explained in my presentation "Innovation with SAP HANA - What are my options", all that string manipulation is really not necessary.
Instead, use the appropriate conversion functions when dealing with ABAP date and time data. In this case, DATS_TO_DATE is the correct function.
with in_dates as
( select '20171120' as in_date from dummy
union all select '00000000' as in_date from dummy)
select
dats_to_date(in_date)
, in_date
from in_dates;
|DATS_TO_DATE(IN_DATE)|IN_DATE  |
|---------------------+---------|
|2017-11-20           |20171120 |
|?                    |00000000 |
The ? here is the output representation for NULL.
DATS_TO_DATE does not return NULL if the given date is initial (0000-00-00), but rather a special date value (-1-12-31, to be precise).
To receive a NULL value in this case, as you requested, use the following statement:
NULLIF( DATS_TO_DATE(?), DATS_TO_DATE('00000000'))
e.g.:
INSERT INTO null_test VALUES (NULLIF( DATS_TO_DATE('00000000'), DATS_TO_DATE('00000000')));
=> returns NULL
INSERT INTO null_test VALUES (NULLIF( DATS_TO_DATE('20171224'), DATS_TO_DATE('00000000')));
=> returns 2017-12-24
As there are no tedious string operations involved, this statement should yield good performance.
How can we pass a DataTable to a stored procedure in C# using the Enterprise Library?
I am using this command:
cmd.AddInParameter(objDbCommand, DBAccessDetails.dataTablePrm, SqlDbType.Structured, curatedListprm);
Then this issue comes up:
Conversion failed when converting the nvarchar value 'ZAQ' to data type int.
The data for table-valued parameter "@data" doesn't conform to the table type of the parameter. SQL Server error is: 245, state: 1
Please tell me what to do and what the issue is.
The errors mean that the data in your DataTable doesn't line up with the columns of the table type (a string value such as 'ZAQ' is being sent into an int column), so the column order and types must match. First you need to have created your user-defined table type:
CREATE TYPE [schema].[tableName] AS TABLE(
table columns and types )
GO
Then you need to create your stored procedure to take the table-valued parameter:
create procedure procName
(
    @tvp dbo.tableName readonly
)
as
begin
    -- use @tvp like a read-only table here
end
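A concrete sketch with illustrative names and columns; the key point is that the table type's column order and types must match the DataTable you pass in:
CREATE TYPE [dbo].[CuratedListType] AS TABLE(
    id   int,
    code nvarchar(10)
);
GO

CREATE PROCEDURE dbo.procName
(
    @tvp dbo.CuratedListType READONLY
)
AS
BEGIN
    SELECT id, code FROM @tvp;
END
GO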
Then you need to pass your DataTable in via .NET with something like this:
var param = new SqlParameter("@tvp", SqlDbType.Structured);
param.TypeName = "dbo.tableName";  // name of the user-defined table type
param.Value = yourDataTable;       // columns must match the table type definition