Db2 Nested Query Error SQLCODE=-206, SQLSTATE=42703 - db2

WITH EXECUTIONDATES AS (
    SELECT MONTHNAME(UBEXECUTIONDT) AS MONTHNAME, UBEXECUTIONDT
    FROM WASADMIN.UBTB_CLOSEOFFHIST
), CALCULATED_INT AS (
    SELECT ED.UBEXECUTIONDT AS "EXECUTION DATE",
           IH.ACCOUNTID,
           MIN(IH.CLEAREDBALANCE) AS "MINIMUM BALANCE",
           ED.MONTHNAME
    FROM WASADMIN.INTERESTHISTORY IH,
         (SELECT ACCOUNTID,
                 MAX(VALUEDATE) AS STARTDATE
          FROM WASADMIN.INTERESTHISTORY
          WHERE VALUEDATE <= FIRST_DAY(ED.UBEXECUTIONDT)
          GROUP BY ACCOUNTID) MD,
         EXECUTIONDATES ED
    WHERE IH.ACCOUNTID = MD.ACCOUNTID
      AND IH.ACCOUNTID = '<ACCOUNTID>'
      AND DATE(IH.VALUEDATE) BETWEEN MD.STARTDATE AND ED.UBEXECUTIONDT
    GROUP BY ED.UBEXECUTIONDT, IH.ACCOUNTID, ED.MONTHNAME
)
SELECT * FROM CALCULATED_INT;
This code returns the following error:
DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=ED.UBEXECUTIONDT, DRIVER=4.31.10 [SQL State=42703, DB Errorcode=-206]
Next: DB2 SQL Error: SQLCODE=-727, SQLSTATE=56098, SQLERRMC=2;-206;42703;ED.UBEXECUTIONDT, DRIVER=4.31.10 [SQL State=56098, DB Errorcode=-727]
1 statement failed.
But if I replace FIRST_DAY(ED.UBEXECUTIONDT) with the static value FIRST_DAY('2022-04-30'), it works. What could I be doing wrong here?
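What is likely going on: in Db2, an ordinary nested table expression in the FROM clause cannot reference columns of another table in the same FROM clause, so the correlated reference ED.UBEXECUTIONDT inside the MD subquery raises SQLCODE -206; the static FIRST_DAY('2022-04-30') works because it removes that correlation. A sketch of one way to keep the correlation, assuming a Db2 version that supports lateral correlation through the TABLE keyword (the table being referenced, ED, has to be listed before the TABLE reference):

WITH EXECUTIONDATES AS (
    SELECT MONTHNAME(UBEXECUTIONDT) AS MONTHNAME, UBEXECUTIONDT
    FROM WASADMIN.UBTB_CLOSEOFFHIST
), CALCULATED_INT AS (
    SELECT ED.UBEXECUTIONDT AS "EXECUTION DATE",
           IH.ACCOUNTID,
           MIN(IH.CLEAREDBALANCE) AS "MINIMUM BALANCE",
           ED.MONTHNAME
    FROM EXECUTIONDATES ED,
         WASADMIN.INTERESTHISTORY IH,
         -- TABLE(...) makes the correlated reference to ED.UBEXECUTIONDT legal
         TABLE (SELECT ACCOUNTID,
                       MAX(VALUEDATE) AS STARTDATE
                FROM WASADMIN.INTERESTHISTORY
                WHERE VALUEDATE <= FIRST_DAY(ED.UBEXECUTIONDT)
                GROUP BY ACCOUNTID) MD
    WHERE IH.ACCOUNTID = MD.ACCOUNTID
      AND IH.ACCOUNTID = '<ACCOUNTID>'
      AND DATE(IH.VALUEDATE) BETWEEN MD.STARTDATE AND ED.UBEXECUTIONDT
    GROUP BY ED.UBEXECUTIONDT, IH.ACCOUNTID, ED.MONTHNAME
)
SELECT * FROM CALCULATED_INT;

An alternative would be to compute MAX(VALUEDATE) in a correlated scalar subquery in the SELECT list instead of a derived table.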

Related

Database-migration ms sql to postgresql

I am getting a syntax error when converting from SQL Server to PostgreSQL. Any thoughts?
IF (var_port_with_bmrk_total_mv != 0 AND var_bmrk_info IS NOT NULL) THEN
BEGIN
insert into t$tmp_diff
select #asof_dt asof_dt,#choiceID choiceID ,p.input_array_type ,p.group_order, CONVERT(DECIMAL(32,10),p.port_value/#var_port_total_mv) port_value,convert(decimal(32,10), isnull(bmrk_value/#port_with_bmrk_total_mv,0)) bmrk_value
from t$tmp_port_sum p, t$tmp_bmrk_sum b
where p.input_array_type=b.input_array_type and p.group_order = b.group_order
END;
ELSE
Original before conversion
insert into #tmp_other_diff
select #asof_dt asof_dt,#choiceID choiceID , b.input_array_type,b.grouping,convert(decimal(32,10),0) port_value, (bmrk_value/#port_with_bmrk_total_mv) bmrk_value
from #tmp_bmrk_other_sum b
where b.key_value not in ( select p.key_value from #tmp_port_other_sum p)
Error message:
Error occurred during SQL query execution
Reason:
SQL Error [42601]: ERROR: syntax error at or near ","
Position: 9030
the relevant comma being:
CONVERT(DECIMAL(32,10),p.port_value
There is no convert() function in Postgres. Use the SQL-standard cast() or the Postgres-specific ::datatype syntax. In this case:
...., cast(0 as decimal(32,10)) port_value, ....
OR
...., 0::decimal(32,10) port_value, ...
Note: there is no comma after the expression. In the original, port_value is the column alias, and you need to keep it that way.
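If it helps, a sketch of how the whole converted insert could look on the Postgres side. The var_asof_dt, var_choiceID, var_port_total_mv and var_port_with_bmrk_total_mv names are guesses based on the var_ naming visible in the IF condition, and isnull() is replaced with the standard coalesce():

insert into t$tmp_diff
select var_asof_dt as asof_dt,
       var_choiceID as choiceID,
       p.input_array_type,
       p.group_order,
       cast(p.port_value / var_port_total_mv as decimal(32,10)) as port_value,
       cast(coalesce(bmrk_value / var_port_with_bmrk_total_mv, 0) as decimal(32,10)) as bmrk_value
from t$tmp_port_sum p
join t$tmp_bmrk_sum b
  on p.input_array_type = b.input_array_type
 and p.group_order = b.group_order;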

ERROR: Teradata prepare: Syntax error when using date format 'ddmmyyy'd in SAS

I'm using this code to extract information from this database. However, it is showing me this error:
ERROR: Teradata prepare: Syntax error, expected something like ')' between a string or a Unicode character literal and the word
'd'. SQL statement was: WITH vmher102ult as ( select cod_cte, max(fec_consulta) as max_fec_consulta from
klarmxpw_her.vmher102 where cod_cte not in ('','0','00000000') and fec_consulta>='01MAR2021'd group by cod_cte) select t1.*
from klarmxpw_her.vmher101 as t1 inner join vmher102ult as t2 on t1.cod_cte=t2.cod_cte and
t1.fec_consulta=t2.max_fec_consulta.
The code I'm using for this pass through is the following:
proc sql;
connect to teradata as tera (user=&tuser. password=&tpass. server='TDMX03');
create table vmher101_m as
select * from connection to tera (
WITH vmher102ult as (
select cod_cte, max(fec_consulta) as max_fec_consulta
from klarmxpw_her.vmher102
where cod_cte not in ('','0','00000000')
and fec_consulta>='01MAR2021'd
group by cod_cte)
select t1.*
from klarmxpw_her.vmher101 as t1
inner join vmher102ult as t2
on t1.cod_cte=t2.cod_cte and t1.fec_consulta=t2.max_fec_consulta);
disconnect from tera;
quit;
Does anybody know what I can do?
You need to use Teradata SQL inside the () after from connection to tera; '01MAR2021'd is a SAS date literal, which Teradata does not understand.
Try
and fec_consulta>= DATE '2021-03-01'
Teradata Documentation
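Putting that into the pass-through, the corrected query might look like this (a sketch; apart from the Teradata date literal and the disconnect/quit at the end, everything is taken from the question as-is):

proc sql;
connect to teradata as tera (user=&tuser. password=&tpass. server='TDMX03');
create table vmher101_m as
select * from connection to tera (
    WITH vmher102ult as (
        select cod_cte, max(fec_consulta) as max_fec_consulta
        from klarmxpw_her.vmher102
        where cod_cte not in ('','0','00000000')
          and fec_consulta >= DATE '2021-03-01'
        group by cod_cte)
    select t1.*
    from klarmxpw_her.vmher101 as t1
    inner join vmher102ult as t2
        on t1.cod_cte = t2.cod_cte
       and t1.fec_consulta = t2.max_fec_consulta
);
disconnect from tera;
quit;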

SQL Error: ERROR: not all tokens processed

I'm getting the error below in Postgres while executing insert and delete queries. I have around 50 insert and 50 delete statements. When they are executed, I get this error:
SQL Error: ERROR: not all tokens processed
The error is not consistent. For example:
My 20th delete statement fails.
The next time the same queries are executed, the 25th delete statement fails.
When those statements are executed alone, there is no failure.
I'm not sure if it is a database load issue or an infrastructure-related issue.
Any suggestion would be helpful.
Below is the query:
WITH del_table_1 AS
(
delete from table_1 where to_date('01-'||col1,'DD-mm-YYYY') < current_date-1
RETURNING *
)
update control_table set deleted_count = cnt, status = 'Completed',
update_user_id = 'User', update_datetime = current_date from
(select 'Table1' as table_name, count(*) as cnt from del_table_1) aa
where
control_table.table_name = aa.table_name
and control_table.table_name = 'Table1'
and control_table.status = 'Pending';

syntax error at or near "'select to_char(application_date::timestamp, '"

EXECUTE 'select to_char(application_date::timestamp, 'Mon-YY') as appl_month from my_schema.my_table;';
The above PostgreSQL EXECUTE statement is giving the below error:
ERROR: syntax error at or near "'select
to_char(application_date::timestamp, '" LINE 1: EXECUTE 'select
to_char(application_date::timestamp, 'Mon-YY...
^
********** Error **********
ERROR: syntax error at or near "'select
to_char(application_date::timestamp, '" SQL state: 42601 Character: 9
Any suggestions will be helpful.
I changed it to the statement below:
EXECUTE 'select to_char(application_date::timestamp, ' || quote_literal(Mon-YY) || ') from standard.npo_weekly_export;';
But it gives a new error:
ERROR: syntax error at or near "'select to_char(application_date::timestamp, '"
LINE 1: EXECUTE 'select to_char(application_date::timestamp, ' || qu...
^
********** Error **********
ERROR: syntax error at or near "'select to_char(application_date::timestamp, '"
SQL state: 42601
Character: 9
Expected output: counts by month in Mon-YY format
Application month   Application #   Final Approval #
Jan-17              1,000           800
Feb-17              1,010           808
Mar-17              1,020           816
Apr-17              1,030           824
If I do the below query:
select to_char(application_date, 'Mon-YY') as appl_month,
       count(distinct application_id) as appl_count,
       sum(final_approval_ind) as fa_count
from my_schema.my_table
group by appl_month
order by appl_month;
Generated output: (Note: Sorted by text, not by date)
"Apr-17";94374;19953
"Apr-18";87446;20903
"Aug-17";102043;21536
"Aug-18";91107;20386
"Dec-17";63263;13755
"Dec-18";21358;74
"Feb-17";89447;18084
"Feb-18";75426;16144
"Jan-17";86103;16394
"Jan-18";79403;17766
"Jul-17";90380;18929
"Jul-18";85439;20186
"Jun-17";95596;20403
"Jun-18";85764;18707
"Mar-17";112929;23323
"Mar-18";91179;21841
"May-17";101907;22349
"May-18";90885;21550
"Nov-17";78284;16791
"Nov-18";80472;7656
"Oct-17";87955;18524
"Oct-18";82821;17056
"Sep-17";80740;17788
"Sep-18";75785;18009
Problem: to_char() returns text and it sorts by text and not by date. So the output is jumbled rather than sorted by Mon-YY.
Do the aggregation in a derived table (aka "sub-query") that preserves the data type, then do the sorting in the outer query:
select to_char(ap_month, 'Mon-YY') as appl_month,
       appl_count,
       fa_count
from (
    select date_trunc('month', application_date) as ap_month,
           count(distinct application_id) as appl_count,
           sum(final_approval_ind) as fa_count
    from my_schema.my_table
    group by ap_month
) t
order by ap_month;
date_trunc('month', application_date) will normalize the application_date to the start of the month, but will retain the date data type, so that the sorting in the outer query works correctly.
I have no idea what the dynamic SQL in your question is supposed to do, but if you need to use that query for whatever reasons as dynamic SQL, you need to escape the single quotes by doubling them.
execute '
select to_char(ap_month, ''Mon-YY'') as appl_month,
       appl_count,
       fa_count
from (
    select date_trunc(''month'', application_date) as ap_month,
           count(distinct application_id) as appl_count,
           sum(final_approval_ind) as fa_count
    from my_schema.my_table
    group by ap_month
) t
order by ap_month;
'; -- end of dynamic SQL
But using Postgres' dollar quoting would be easier:
execute $dyn$
select to_char(ap_month, 'Mon-YY') as appl_month,
       appl_count,
       fa_count
from (
    select date_trunc('month', application_date) as ap_month,
           count(distinct application_id) as appl_count,
           sum(final_approval_ind) as fa_count
    from my_schema.my_table
    group by ap_month
) t
order by ap_month;
$dyn$; -- end of dynamic SQL
Note that you can nest dollar-quoted strings, so if that query is used inside a function, just use a different delimiter than the one you use for the function body (see the example in the manual).
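For instance, a minimal sketch of that nesting inside a hypothetical monthly_counts() PL/pgSQL function (the column names and types of my_schema.my_table are assumptions based on the question):

create or replace function monthly_counts()
returns table (appl_month text, appl_count bigint, fa_count numeric)
language plpgsql
-- $func$ delimits the function body
as $func$
begin
    -- $dyn$ is a different delimiter, so the dynamic SQL can sit inside $func$
    return query execute $dyn$
        select to_char(ap_month, 'Mon-YY'),
               appl_count,
               fa_count
        from (
            select date_trunc('month', application_date) as ap_month,
                   count(distinct application_id) as appl_count,
                   sum(final_approval_ind)::numeric as fa_count
            from my_schema.my_table
            group by ap_month
        ) t
        order by ap_month
    $dyn$;
end;
$func$;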

Postgresql CASE statement schema error

I'm getting a 3F000 error from the following CASE statement.
select
entity.level1
, sum(trans_clean.amount_calculated) as total_spend
, sum(CASE WHEN extract(year from age(supplier.open_date::trans_clean.date_calculated)) <= 5 AND org_type = 'SMALL BUSINESS' THEN trans_clean.amount_calculated END) small_young
, trans_clean.date_calculated
from
entity, supplier, trans_clean
where
trans_clean.supplier_id = supplier.supplier_id and
trans_clean.entity_id = entity.entity_id and
date_calculated > '2011-12-01'
group by
entity.level1
, total_spend
, small_young
, trans_clean.date_calculated
;
The error is as follows:
ERROR: schema "trans_clean" does not exist
SQL state: 3F000
The statement works without the CASE line. I'm running the query in pgAdmin III and the right schema (public) is definitely selected.
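The 3F000 almost certainly comes from the :: in age(supplier.open_date::trans_clean.date_calculated): everything after :: is parsed as a (possibly schema-qualified) type name, so Postgres goes looking for a schema called trans_clean. If the intent was the interval between the two dates, a sketch of that line using the two-argument form of age() could look like this (the argument order, i.e. supplier age as of the transaction date, is my assumption about the intent):

, sum(CASE WHEN extract(year from age(trans_clean.date_calculated, supplier.open_date)) <= 5
           AND org_type = 'SMALL BUSINESS'
      THEN trans_clean.amount_calculated END) AS small_young

Note also that the GROUP BY lists the aggregate aliases total_spend and small_young; Postgres will likely reject that next, and grouping by entity.level1 and trans_clean.date_calculated alone should be enough.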