Definition of COBOL variable to use for DB2 date arithmetic

I have cases where I want to add or subtract a variable number of days from a timestamp.
The simplest example is this:
SELECT CURRENT_TIMESTAMP - :MOD-DAY DAY
INTO :MYTIMESTAMP
FROM SYSIBM.SYSDUMMY1
My problem is figuring out the right COBOL definition for MOD-DAY.
As far as I'm aware, we are running DB2 version 11.
According to https://www.ibm.com/support/knowledgecenter/SSEPEK_11.0.0/sqlref/src/tpc/db2z_datearithmetic.html
the DB2 definition of the variable must be DECIMAL(8,0)
That could be 9(08) or S9(08), but in both cases, and with any other variation I have thought up so far, I get the compile error
DSNH312I E DSNHSMUD LINE 1181 COL 49 UNDEFINED OR UNUSABLE HOST VARIABLE "MOD-DAY"
I have of course made certain that MOD-DAY has been defined, so the keyword in question must be UNUSABLE.
The error code definition for DSNH312I is pretty generic:
https://www.ibm.com/support/knowledgecenter/SSEPEK_10.0.0/msgs/src/tpc/dsnh312i.html
So does anyone know the right COBOL variable definition to use in this case?

Decimal in mainframe DB2 means COMP-3.
So the field should be defined as S9(08) COMP-3.
If you look at the COBOL copybooks that DB2 generates for tables and views, you will see both the DB2 definition and the generated COBOL fields. That can be another way to resolve questions like this.
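As an illustration of why COMP-3 matters: packed decimal stores two digits per byte with a trailing sign nibble, so an S9(08) COMP-3 item occupies 5 bytes, which matches the DECIMAL(8,0) host variable DB2 expects. This hypothetical Python helper (a sketch, not DB2 or COBOL code) shows the byte layout:

```python
# Sketch of COMP-3 (packed decimal) encoding for an S9(08) COMP-3 field.
# 8 digits + 1 sign nibble = 9 nibbles, padded to 10 nibbles = 5 bytes.
def pack_comp3(value: int, digits: int = 8) -> bytes:
    sign_nibble = "C" if value >= 0 else "D"   # C = positive, D = negative
    body = str(abs(value)).rjust(digits, "0") + sign_nibble
    if len(body) % 2:                          # pad to a whole number of bytes
        body = "0" + body
    return bytes.fromhex(body)
```

For example, 123 packs into the 5 bytes `00 00 00 12 3C`, with the final `C` nibble carrying the sign.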


Setting toDate function as arrival time in AnyLogic

I've been struggling for days to use dates from Excel in a proper way in AnyLogic.
I've created a database in which the date is stored as integers in separate columns, since otherwise Excel mangles the dates (for example, year=2021, month=12, day=5, hour=6, minute=44, second=0 stands for 2021/12/5 6:44:00).
Now I know this can be converted to a date by the function toDate(year, month, day, hour, minute, second). But how can I use these integers to create agents with specific parameters from the database in a source and add them to a custom population?
The simplest way would be to add a column to the database where the function toDate(...) is applied, but I do not know how to do this (see picture if it is unclear). Or are there other solutions?
One way: use Dynamic Events.
Create one, and in its action code write mySource.inject(1)
In Main, on startup, load all database rows and create a Dynamic Event for each row, scheduled at the row's date (the example assumed only an hour column).
(Use the database query wizard to adjust your query.)
In your Source object, set the arrival mode to "calls of inject() function".
This will work, but it is quite cumbersome, as you can see. It is much easier if you get your Excel file right and import a clean date column, so you can use the Source option "arrival table in database" directly. I know you need regular arrivals, so maybe code that up in Excel to generate these on specific dates...
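The conversion each Dynamic Event's delay boils down to is simple date arithmetic. As an illustrative sketch (plain Python rather than AnyLogic Java, with made-up rows standing in for the database query result):

```python
from datetime import datetime

# Hypothetical rows as they might come back from the database query:
# (year, month, day, hour, minute, second) per arrival.
rows = [
    (2021, 12, 5, 6, 44, 0),
    (2021, 12, 5, 7, 10, 30),
]

def to_date(year, month, day, hour, minute, second):
    """Mirror of AnyLogic's toDate(...) for the integer columns."""
    return datetime(year, month, day, hour, minute, second)

start = to_date(*rows[0])
# Delay in seconds from the first arrival to each row's arrival -- the
# value each Dynamic Event would be scheduled with before calling inject(1).
delays = [(to_date(*r) - start).total_seconds() for r in rows]
```

The delays come out as 0 and 1590 seconds (26.5 minutes) for the two example rows.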

UPDATE SQL Command not saving the results

I looked through the forum but I couldn't find an issue like mine.
Essentially I have a table called [p005_MMAT].[dbo].[Storage_Max]. It has three columns: Date, HistValue and Tag_ID. I want all the values in the HistValue column to have 2 decimal places. For example, if a number is 1.1, I want it to be 1.10, and if it is 1, I want it to look like 1.00.
Here is the SQL update statement I am using:
update [p005_MMAT].[dbo].[Storage_Max]
set [HistValue] = cast([HistValue] as decimal (10,2))
where [Tag_ID] = 94
After executing the query it says 3339 rows affected, but when I perform a simple SELECT statement it appears the column was not affected. I have used that CAST function in a SELECT statement and there it does add two decimal places.
Please advise.
The problem is the data type: in SQL Server, float and real do not store trailing zeros. You either have to change the data type of the column, or accept it and handle the formatting in your queries or application.
You could run something like the following:
select
cast([HistValue] as decimal (10,2))
from [p005_MMAT].[dbo].[Storage_Max]
where [Tag_ID] = 94
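To see why the UPDATE appears to do nothing: a float column has no notion of trailing zeros, so casting to decimal(10,2) and storing the result back into the same float column round-trips to the identical value. A quick Python sketch of the same effect:

```python
# 1.10 and 1.1 are the same stored value in a binary float column, so
# UPDATE ... SET col = CAST(col AS decimal(10,2)) changes nothing visible.
assert float("1.10") == float("1.1")

# The two decimal places only exist at formatting time, in the query
# or in the application layer:
def format_hist_value(v: float) -> str:
    return f"{v:.2f}"
```

Formatting at read time gives "1.10" and "1.00" regardless of what the float column stores.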

SAP HANA Decimal to timestamp or seconddate SLT

I am using SLT to load tables into our HANA DB. SLT uses the ABAP dictionary and sends timestamps to HANA as DECIMAL(15,0). Once the data is in HANA, I am trying to convert the decimals to timestamps or seconddates via a calculated column in a calculation view. The table looks like this:
I run a small SLT transformation to populate columns 27-30. The ABAP layer in SLT populates the columns based on the Database transactions.
The problem comes when I try to convert columns 28-30 to timestamps or seconddates, using syntax like this:
Select to_timestamp(DELETE_TIME)
FROM SLT_REP.AUSP
Select to_seconddate(DELETE_TIME)
FROM SLT_REP.AUSP
I get the following errors:
The problem is that it also works some of the time:
The syntax in calculated column looks like this:
With the error from calculation view being:
Has anyone found a good way to convert ABAP timestamps (Decimal (15,0)) to Timestamp or Seconddate in HANA?
There are conversion functions available that you can use here (unfortunately they are not very well documented).
select tstmp_to_seconddate(TO_DECIMAL(20110518082403, 15, 0)) from dummy;
TSTMP_TO_SECONDDATE(TO_DECIMAL(20110518082403,15,0))
2011-05-18 08:24:03.0
The problem was with the ABAP data type. I was declaring the target variable as DEC(15,0). The ABAP code extracting the data was, in some instances, rounding the timestamp up to the 60th second. Once in HANA, to_timestamp(target_field) would come back invalid for a value like "20150101121060", where the last two digits are the 60th second; that is invalid and would fail. The base HANA layer did not care, as it was merely putting a length-14 value into a field. I changed the source variable to DEC(21,0). This eliminated the ABAP rounding and fixed my problem.
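The failure mode is easy to reproduce outside HANA. A Python sketch of the same conversion (a hypothetical helper, not the HANA function) shows why a 60 in the seconds position is rejected:

```python
from datetime import datetime

def dec15_to_datetime(ts: int) -> datetime:
    """Convert an ABAP-style DEC(15,0) timestamp (YYYYMMDDHHMMSS) to a
    datetime. Raises ValueError for impossible values such as a 60th second."""
    s = str(ts).zfill(14)
    return datetime(int(s[0:4]), int(s[4:6]), int(s[6:8]),
                    int(s[8:10]), int(s[10:12]), int(s[12:14]))
```

20110518082403 converts cleanly to 2011-05-18 08:24:03, while the rounded-up 20150101121060 fails, exactly as to_timestamp did.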

How can we use H2 to test SQL Server code that involves date to string conversions?

We use the H2 database engine as part of our test tooling for a product that uses SQL Server 2012 in production. Some of the existing SQL views use the three-argument CONVERT function on dates to format them as "ODBC Canonical" date:
CONVERT(VARCHAR, some_date, 120)
Normally when we encounter a situation like this we do one of the following two things:
we replace the SQL with something portable that works in MS SQL and H2
we implement a JAVA function to match MS SQL's behaviour and map it as UDF into H2
At the moment both seem to fail us since MS SQL doesn't seem to offer an alternative way of formatting dates and CONVERT is already a function in H2, just not in the three argument form.
We seem to be left with two options that we don't really like:
add a layer of indirection on both sides, by defining a UDF in MS SQL that runs the CONVERT, with a corresponding one in H2,
patch H2
The issue with the former is that it introduces something into production that exists solely for testing. That is true to some extent of migrating to more portable SQL as well, but adding a UDF goes a step further.
Patching H2 could be an option, but it is hard to tell how much effort that would be, in particular considering the existing CONVERT function. If suitable for a wider audience we would have to also cover MS SQL's weird world of styles across the types in a reasonable fashion, whereas we are only after one style for dates.
Is there another way? Has anyone experience with solving this problem?
The equivalent result using the FORMAT function is:
SELECT FORMAT(GETDATE(),'yyyy-MM-dd HH:mm:ss');
It seems that using the FORMAT function instead of CONVERT may resolve your issue.
Another way without using CONVERT is this:
SELECT
CAST(YEAR(GETDATE()) AS VARCHAR(4)) + '-' +
CAST(MONTH(GETDATE()) AS VARCHAR(2)) + '-' +
CAST(DAY(GETDATE()) AS VARCHAR(2))
(this is just an example; it does not contain time components, and MONTH and DAY are not zero-padded, so the output does not match style 120 exactly)
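Whichever route is taken (FORMAT, a UDF, or portable SQL), the target output of style 120 is fixed. A Java UDF mapped into H2 would have to reproduce something equivalent to this (Python used here only to pin down the format):

```python
from datetime import datetime

def odbc_canonical(d: datetime) -> str:
    """Equivalent of CONVERT(VARCHAR, d, 120): the "ODBC canonical"
    yyyy-mm-dd hh:mi:ss style, with all fields zero-padded."""
    return d.strftime("%Y-%m-%d %H:%M:%S")
```

For example, 2012-01-05 09:08:07 formats with every component zero-padded, which is what the unpadded CAST/concatenation approach above does not guarantee.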

Is there any way to get the leading interval precision for a column through DBI in Perl?

I need to be able to get the following column attributes for a column in an ODBC table from a Perl script:
SQL Type
Precision
Scale
Interval Leading Precision
TypeName
Whether it's signed or unsigned
Nullability
Whether it's case sensitive
Whether it's auto-unique
Updatability
Right now I'm using DBI, and it seems I can get most of this information, but I'm stumped with getting the interval leading precision. Does DBI expose this?
That pod (which I wrote) is for SQLColAttributes. What happened is that MS deprecated SQLColAttributes and replaced it with SQLColAttribute. BTW, there is a typo in the pod, as that should be $sth, not $dbh (I've just fixed that). ColAttributes is now superseded by DBI attributes in most cases, e.g. PRECISION. You cannot pass the new SQLColAttribute types to SQLColAttributes with DBD::ODBC: it only knows about the ones that ODBC defined for SQLColAttributes, and since ODBC allowed driver-specific attributes (for which DBD::ODBC could not possibly know the type), it has to trap them and issue a "driver-specific column attributes not supported" message.
See SQLColAttribute which lists all the values you can pass to ColAttribute. You'll have to look in the sql*.h header files for the values. However, I don't see an interval leading precision in the list. I do see that SQL_DESC_PRECISION says "For data types SQL_TYPE_TIME, SQL_TYPE_TIMESTAMP, and all the interval data types that represent a time interval, its value is the applicable precision of the fractional seconds component.". So I'm unsure what you mean by "interval leading precision".
What ODBC driver are you using?