Snowflake TIMESTAMP - preserve sub-second values when storing in a table

I have a JSON value of 2045-06-02T09:23:41.8666668.
I want to convert it via TIMESTAMP to a DATE data type in Snowflake AND hold the exact value, but I have three issues:
1) The TIMESTAMP data type is never stored in tables (per https://docs.snowflake.net/manuals/sql-reference/data-types-datetime.html).
2) When I do use ::TIMESTAMP it cuts off at 2045-06-02 09:23:41.866.
3) It removes the "T" that indicates time.
Can anyone point me to documentation that handles this issue?

1) & 2)
The document you linked to says TIMESTAMP uses 9 decimal places of fractional seconds by default; what you are seeing when you select is a presentation/formatting issue, not a storage issue.
select '2045-06-02 09:23:41.8666668'::text as ta
      ,'2045-06-02 09:23:41.9777779'::text as tb
      ,ta::timestamp ta_d, ta::timestamp(0) ta_0
      ,ta::timestamp(3) ta_3
      ,ta::timestamp(6) ta_6
      ,tb::timestamp tb_d, tb::timestamp(0) tb_0
      ,tb::timestamp(3) tb_3, tb::timestamp(6) tb_6
      ,datediff('millisecond', ta_d, tb_d)
      ,datediff('millisecond', ta_0, tb_0)
      ,datediff('millisecond', ta_3, tb_3)
      ,datediff('millisecond', ta_6, tb_6)
      ,datediff('microsecond', ta_d, tb_d)
      ,datediff('microsecond', ta_0, tb_0)
      ,datediff('microsecond', ta_3, tb_3)
      ,datediff('microsecond', ta_6, tb_6)
      ,datediff('nanosecond', ta_d, tb_d)
      ,datediff('nanosecond', ta_0, tb_0)
      ,datediff('nanosecond', ta_3, tb_3)
      ,datediff('nanosecond', ta_6, tb_6)
;
This shows that by default you are getting 9 decimal places of fractional seconds; only the explicit lower-precision casts (timestamp(0), timestamp(3), timestamp(6)) truncate the value.
3) That is also a formatting thing; the "T" separator is part of the output format, not of the stored value.
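If you want the full precision and the "T" back when the value is displayed, that is controlled by the output format rather than the storage. A minimal sketch, assuming you just want an ISO-8601 style rendering (the format strings here are illustrative, not the only option):
alter session set timestamp_output_format = 'YYYY-MM-DD"T"HH24:MI:SS.FF9';

select '2045-06-02T09:23:41.8666668'::timestamp_ntz as ts
      ,to_char(ts, 'YYYY-MM-DD"T"HH24:MI:SS.FF7') as iso_text;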

Related

T-SQL - How to extract the pattern '(yyyy)' from a string with a WHERE ... LIKE query

I have string data like 'wordword (2018)' and want to extract the data matching the pattern (yyyy). I have tried '%/([0-9][0-9][0-9][0-9]/)%' but it doesn't work.
Building on HABO's comment, you can use something like:
DECLARE @Pattern VARCHAR(50) = '%([0-9][0-9][0-9][0-9])%';

SELECT A.value,
       yyyy = SUBSTRING(A.value, NULLIF(PATINDEX(@Pattern, A.value), 0) + 1, 4)
FROM (
    VALUES
        ('wordword (2018)'),
        ('Nothing here'),
        ('this (2010) and that (2020)')
) A(value);
SQL Server has very limited pattern-matching support, so I converted your regex to the closest thing SQL Server supports. The NULLIF() in the above converts a not-found index of zero to NULL, which propagates through to the result.
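For reference, the expected output of the query above (only the first match per row is returned):
value                           yyyy
wordword (2018)                 2018
Nothing here                    NULL
this (2010) and that (2020)     2010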
Did you try CHARINDEX? Note that CHARINDEX only finds literal substrings, so for a [0-9] character-class pattern you actually need PATINDEX:
SUBSTRING(@str,
    PATINDEX('%[0-9][0-9][0-9][0-9]%', @str), 4)

DB2 SQL Error: SQLCODE=-302 while executing prepared statement

I have a SQL query which takes user input, so a SQL injection flaw is present.
The existing query is:
SELECT BUS_NM, STR_ADDR_1, CITY_NM, STATE_CD, POSTAL_CD, COUNTRY_CD,
BUS_PHONE_NB,PEG_ACCOUNT_ID, GDN_ALERT_ID, GBIN, GDN_MON_REF_NB,
ALERT_DT, ALERT_TYPE, ALERT_DESC,ALERT_PRIORITY
FROM ( SELECT A.BUS_NM, AE.STR_ADDR_1, A.CITY_NM, A.STATE_CD, A.POSTAL_CD,
CC.COUNTRY_CD, A.BUS_PHONE_NB, A.PEG_ACCOUNT_ID, 'I' ||
LPAD(INTL_ALERT_DTL_ID, 9,'0') GDN_ALERT_ID,
LPAD(IA.GBIN, 9,'0') GBIN, IA.GDN_MON_REF_NB,
DATE(IAD.ALERT_TS) ALERT_DT,
XMLCAST(XMLQUERY('$A/alertTypeConfig/biqCode/text()' passing
IAC.INTL_ALERT_TYPE_CONFIG as "A") AS CHAR(4)) ALERT_TYPE,
ROW_NUMBER() OVER () AS "RN"
FROM ACCOUNT A, Other tables
WHERE IA.GDN_MON_REF_NB = '100'
AND A.PEG_ACCOUNT_ID = IAAR.PEG_ACCOUNT_ID
AND CC.COUNTRY_CD = A.COUNTRY_ISO3_CD
ORDER BY IA.INTL_ALERT_ID ASC )
WHERE ALERT_TYPE IN (" +TriggerType+ ");
I changed it to accept TriggerType from setString like:
SELECT BUS_NM, STR_ADDR_1, CITY_NM, STATE_CD, POSTAL_CD, COUNTRY_CD,
BUS_PHONE_NB,PEG_ACCOUNT_ID, GDN_ALERT_ID, GBIN, GDN_MON_REF_NB,
ALERT_DT, ALERT_TYPE, ALERT_DESC,ALERT_PRIORITY
FROM ( SELECT A.BUS_NM, AE.STR_ADDR_1, A.CITY_NM, A.STATE_CD, A.POSTAL_CD,
CC.COUNTRY_CD, A.BUS_PHONE_NB, A.PEG_ACCOUNT_ID,
'I' || LPAD(INTL_ALERT_DTL_ID, 9,'0') GDN_ALERT_ID,
LPAD(IA.GBIN, 9,'0') GBIN, IA.GDN_MON_REF_NB,
DATE(IAD.ALERT_TS) ALERT_DT,
XMLCAST(XMLQUERY('$A/alertTypeConfig/biqCode/text()' passing
IAC.INTL_ALERT_TYPE_CONFIG as "A") AS CHAR(4)) ALERT_TYPE,
ROW_NUMBER() OVER () AS "RN"
FROM ACCOUNT A, other tables
WHERE IA.GDN_MON_REF_NB = '100'
AND A.PEG_ACCOUNT_ID = IAAR.PEG_ACCOUNT_ID
AND CC.COUNTRY_CD = A.COUNTRY_ISO3_CD
ORDER BY IA.INTL_ALERT_ID ASC )
WHERE ALERT_TYPE IN (?);
Setting trigger type as below:
if (StringUtils.isNotBlank(request.getTriggerType())) {
    preparedStatement.setString(1, triggerType != null ? triggerType.toString() : "");
}
I am getting this error:
Caused by: com.ibm.db2.jcc.am.SqlDataException: DB2 SQL Error: SQLCODE=-302, SQLSTATE=22001, SQLERRMC=null, DRIVER=4.19.26
The -302 SQLCODE indicates a conversion error of some sort.
SQLSTATE 22001 narrows that down a bit by telling us that you are trying to force a big string into a small variable. Given the limited information in your question, I am guessing it is the XMLCAST that is the culprit.
DB2 won't jam 30 pounds of crap into a 4-pound bag, so to speak; it gives you an error instead. Giving the XML some extra room in the cast might help. If you need to make sure it ends up being only 4 characters long, you could explicitly do a LEFT(XMLCAST( ... AS VARCHAR(64)), 4). That way the XMLCAST has the space it needs, but you cut it back to fit your variable on the fetch.
The other thing could be that the variable being passed to the parameter marker is too long. DB2 will guess the type and length based on the length of ALERT_TYPE. Note that you can only pass a single value through a parameter marker. If you pass a comma separated list, it will not behave as expected (unless you expect ALERT_TYPE to also contain a comma separated list). If you are getting the comma separated list from a table, you can use a sub-select instead.
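As a rough sketch of that last suggestion, with hypothetical table and column names standing in for wherever the allowed trigger types actually live:
WHERE ALERT_TYPE IN (
    SELECT TRIGGER_TYPE        -- hypothetical column, one alert type per row
    FROM   GDN_TRIGGER_CONFIG  -- hypothetical lookup table
)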
Wrong IN predicate use with a parameter.
Do not expect that IN ('AAAA, M250, ABCD') (which is what you get when you pass a comma-separated string as a single parameter) works like IN ('AAAA', 'M250', 'ABCD') (which is what you need). These predicates are not equivalent.
You need some "string tokenizer" if you want to pass such a comma-separated string, like below.
select t.*
from
(
select XMLCAST(XMLQUERY('$A/alertTypeConfig/biqCode/text()' passing IAC.INTL_ALERT_TYPE_CONFIG as "A") AS CHAR(4)) ALERT_TYPE
from table(values xmlparse(document '<alertTypeConfig><biqCode>M250, really big code</biqCode></alertTypeConfig>')) IAC(INTL_ALERT_TYPE_CONFIG)
) t
--WHERE ALERT_TYPE IN ('AAAA, M250, ABCD')
join xmltable('for $id in tokenize($s, ",\s?") return <i>{string($id)}</i>'
passing cast('AAA, M250 , ABCD' as varchar(200)) as "s"
columns token varchar(200) path '.') x on x.token=t.ALERT_TYPE
;
Run the statement as is. Then you may uncomment the commented-out WHERE clause and comment out the join to see what you are currently trying to do.
P.S.:
The error you get is probably because you don't specify the data type of the parameter (you don't use something like IN (cast(? as varchar(xxx)))), so the DB2 compiler assumes that its length is equal to the length of the ALERT_TYPE expression (4 bytes).
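In other words, a minimal fix on the SQL side could be to cast the parameter marker explicitly; the VARCHAR(100) here is just an assumed upper bound for your trigger values:
WHERE ALERT_TYPE IN (CAST(? AS VARCHAR(100)))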

How to import data into teradata tables from delimited file using BTEQ import?

I am trying to execute the following BTEQ command in a Linux environment, but I could not load the data properly into the Teradata DB server. Can someone please advise how to resolve the issue I am facing while loading?
BTEQ command used:
.SET width 64000;
.SET session transaction btet;
.logmech ldap
.logon XXXXXXX/XXXXXXXX,********;
DATABASE corecm;
.PACK 1000
.IMPORT VARTEXT '~' FILE=/v/global/user/application_event_bus_evt
.REPEAT *
USING(APPLICATION_EVENT_ID CHAR(24),BUS_EVT_ID CHAR(24),BUS_EVT_VID BIGINT,BUS_EVT_RESTATE_IN SMALLINT)
insert into corecm.application_event_bus_evt (APPLICATION_EVENT_ID
, BUS_EVT_ID
, BUS_EVT_VID
, BUS_EVT_RESTATE_IN
)
values
( COALESCE(:APPLICATION_EVENT_ID,1)
, COALESCE(:BUS_EVT_ID,1)
, COALESCE(:BUS_EVT_VID,1)
, COALESCE(:BUS_EVT_RESTATE_IN,1)
) ;
.LOGOFF;
.EXIT;
Sample input file, delimiter "~" [ /v/global/user/application_event_bus_evt ]:
Ckn3gMxLEeOgIQBQVgErYA==~g+GDDtlaY3n7BdUrYshDFA==~1~1
CL1kEcxLEeOgIQBQVgErYA==~qoKoiuGDbClpcGt/z6RKGw==~1~1
oYIVcMxKEeOgIQBQVgErYA==~mfmQiwl7yAteevzJfilMvA==~1~1
5N7ME5bM4xGhM7exj3ykUw==~yFM2FZbM4xGhM7exj3ykUw==~1~0
JLBH4JfM4xGDH9s5+Ds/8w==~doZ/7pfM4xGDH9s5+Ds/8w==~1~0
fGvpoMxKEeOgIQBQVgErYA==~mQUQIK2mY6WIPcszfp5BTQ==~1~1
Table definition:
CREATE MULTISET TABLE CORECM.APPLICATION_EVENT_BUS_EVT ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
APPLICATION_EVENT_ID CHAR(26) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
BUS_EVT_ID CHAR(26) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
BUS_EVT_VID BIGINT NOT NULL,
BUS_EVT_RESTATE_IN SMALLINT)
UNIQUE PRIMARY INDEX ( APPLICATION_EVENT_ID ,BUS_EVT_ID ,BUS_EVT_VID )
INDEX APPLICATION_EVENT_BUS_EVT_IDX1 ( APPLICATION_EVENT_ID )
INDEX APPLICATION_EVENT_BUS_EVT_IDX2 ( BUS_EVT_ID ,BUS_EVT_VID );
Result set in the DB server:
APPLICATION_EVENT_ID BUS_EVT_ID BUS_EVT_VID BUS_EVT_RESTATE_IN
1 Ckn3gMxLEeOgIQBQVgErYA == g+GDDtlaY3n7BdUrYshD 85,849,873,219,141,958 12,544
2 CL1kEcxLEeOgIQBQVgErYA == qoKoiuGDbClpcGt/z6RK 85,849,873,219,155,783 12,544
3 oYIVcMxKEeOgIQBQVgErYA == mfmQiwl7yAteevzJfilM 85,849,873,219,142,006 12,544
4 5N7ME5bM4xGhM7exj3ykUw == JAf0GpbM4xGhM7exj3yk 85,849,873,219,155,797 12,288
5 JLBH4JfM4xGDH9s5+Ds/8w == Du6T7pfM4xGDH9s5+Ds/ 85,849,873,219,155,768 12,288
6 fGvpoMxKEeOgIQBQVgErYA == mQUQIK2mY6WIPcszfp5B 85,849,873,219,146,068 12,544
If we look at the data, we can see two issues:
The first two columns' data is 24 characters long (as per the input file), but the values have been shifted, pushing two characters into the next column.
Columns BUS_EVT_VID and BUS_EVT_RESTATE_IN hold wrong data, 85,849,873,219,141,958 and 12,544 instead of 1 and 1 respectively (this may be because the first two columns' data got shifted).
I tried the following options to resolve the issue, but without success:
Modified the table definition, i.e. changed the datatypes to
CHAR(28), CHAR(24), CHAR(26)
Modified the table definition column
datatypes to VARCHAR(24), VARCHAR(26)
Modified the BTEQ command, i.e. altered the datatypes in the line below:
USING(APPLICATION_EVENT_ID CHAR(24),BUS_EVT_ID CHAR(24),BUS_EVT_VID BIGINT,BUS_EVT_RESTATE_IN SMALLINT)
Thanks in advance.
When you define VARTEXT, all input columns must be defined as VARCHAR, but you used CHAR, BIGINT, and SMALLINT.
This should work, with the VARCHAR lengths based on the definition of your target table:
USING(
APPLICATION_EVENT_ID VARCHAR(26),
BUS_EVT_ID VARCHAR(26),
BUS_EVT_VID VARCHAR(19),
BUS_EVT_RESTATE_IN VARCHAR(6)
)
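If you prefer to be explicit about the conversion of the two numeric columns, the VALUES clause of the insert can cast them as well (a sketch; Teradata would otherwise convert the VARCHAR values implicitly on insert):
values ( COALESCE(:APPLICATION_EVENT_ID, '1')
       , COALESCE(:BUS_EVT_ID, '1')
       , CAST(COALESCE(:BUS_EVT_VID, '1') AS BIGINT)
       , CAST(COALESCE(:BUS_EVT_RESTATE_IN, '1') AS SMALLINT)
       );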

Convert a string representing a timestamp to an actual timestamp in PostgreSQL?

In PostgreSQL I convert a string to a timestamp with to_timestamp():
select * from ms_secondaryhealthcarearea
where to_timestamp((COALESCE(update_datetime, '19900101010101'),'YYYYMMDDHH24MISS')
> to_timestamp('20121128191843','YYYYMMDDHH24MISS')
But I get this error:
ERROR: syntax error at end of input
LINE 1: ...H24MISS') >to_timestamp('20121128191843','YYYYMMDDHH24MISS')
^
********** Error **********
ERROR: syntax error at end of input
SQL state: 42601
Character: 176
Why? How do I convert a string to a timestamp?
One too many opening brackets. Try this:
select *
from ms_secondaryhealthcarearea
where to_timestamp(COALESCE(update_datetime, '19900101010101'),'YYYYMMDDHH24MISS') >to_timestamp('20121128191843','YYYYMMDDHH24MISS')
You had two opening brackets at to_timestamp:
where to_timestamp((COA.. -- <-- the second one is not needed!
@ppeterka has pointed out the syntax error.
The more pressing question is: Why store timestamp data as string to begin with? If your circumstances allow, consider converting the column to its proper type:
ALTER TABLE ms_secondaryhealthcarearea
ALTER COLUMN update_datetime TYPE timestamp
USING to_timestamp(update_datetime,'YYYYMMDDHH24MISS');
Or use timestamptz - depending on your requirements.
Another way to convert a string to a PostgreSQL timestamp type is the following:
SELECT to_timestamp('23-11-1986 06:30:00', 'DD-MM-YYYY hh24:mi:ss')::timestamp without time zone;
I had the same requirement, going by how I read the title: how to convert an epoch timestamp given as text into a real timestamp. In my case I extracted one from a JSON object, so I ended up with a timestamp as text, including milliseconds:
'1528446110978' (GMT: Friday, June 8, 2018 8:21:50.978 AM)
This is what I tried. Only the last one (ts_ok_with_ms) is exactly right.
SELECT
    data->>'expiration' AS expiration,
    pg_typeof(data->>'expiration'),
    -- to_timestamp(data->>'expiration'),  -- ERROR: function to_timestamp(text) does not exist
    to_timestamp(
        (data->>'expiration')::int8
    ) AS ts_wrong,
    to_timestamp(
        LEFT(data->>'expiration', 10)::int8
    ) AS ts_ok,
    to_timestamp(
        LEFT(data->>'expiration', 10)::int8
    ) + (
        CASE
            WHEN LENGTH(data->>'expiration') = 13
            THEN RIGHT(data->>'expiration', 3)
            ELSE '0'
        END || ' ms'
    )::interval AS ts_ok_with_ms
FROM (
    SELECT '{"expiration": 1528446110978}'::json AS data
) dummy
This is the (transposed) record that is returned:
expiration 1528446110978
pg_typeof text
ts_wrong 50404-07-12 12:09:37.999872+00
ts_ok 2018-06-08 08:21:50+00
ts_ok_with_ms 2018-06-08 08:21:50.978+00
I'm sure I overlooked a simpler version of how to get from a timestamp string in a json object to a real timestamp with ms (ts_ok_with_ms), but I hope this helps nonetheless.
Update: Here's a function for your convenience.
CREATE OR REPLACE FUNCTION data.timestamp_from_text(ts text)
RETURNS timestamptz
LANGUAGE SQL AS
$$
SELECT to_timestamp(LEFT(ts, 10)::int8) +
(
CASE
WHEN LENGTH(ts) = 13
THEN RIGHT(ts, 3) ELSE '0'
END||' ms'
)::interval
$$;
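Assuming the function was created in a schema named data, as above, usage looks like this (the result matches the sample value from earlier):
SELECT data.timestamp_from_text('1528446110978');
-- 2018-06-08 08:21:50.978+00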

How to make a function in a DB2 database to convert an integer to a date, handling the case when it is 0?

I was trying to make a function work in DB2:
CREATE FUNCTION TO_DATE8(DATE_STRING numeric(8,0))
RETURNS DATE
LANGUAGE SQL
IF DATE_STRING > 0 THEN
// ERROR ->
RETURN DATE ( TO_DATE ( SUBSTR ( DATE_STRING , 1 , 8 ) , 'YYYYMMDD' ) )
ELSE
RETURN DATE ( TO_DATE ( '00000000' , 'YYYYMMDD' ) )
END IF
END
ERROR: DATE IS NOT VALID
What to do?
The required form of the function seems to be like this (at least on the iSeries version):
CREATE FUNCTION TO_DATE8(DATE_STRING numeric(8,0))
RETURNS DATE
LANGUAGE SQL
BEGIN
RETURN(CASE WHEN DATE_STRING > 0 THEN DATE(SUBSTR(DATE_STRING, 1, 4) || '-' ||
SUBSTR(DATE_STRING, 5, 2) || '-' ||
SUBSTR(DATE_STRING, 7, 2))
ELSE DATE('0001-01-01')
END);
END
However:
Your function is misnamed (it reads from a date-8 value, it does not convert to one).
Your DATE_STRING is not a string (or even a char); it's numeric. Please rename it to something that does not include the datatype (dateToConvert works).
You seem to want to return something that is not a valid date (all 0s). I'm returning *loval here, although it's possible it should actually be null.
I didn't put in enough checks for a valid date - this will blow up really easily.
If at all possible, the database should be changed to contain actual dates, not a numeric value. Disk is (relative to programmer/architect headaches) cheap.
You may also find a calendar file helpful, if the 8-digit numeric was one of the included columns.
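For reference, a quick sanity check of the function above might look like this on DB2 for i (SYSIBM.SYSDUMMY1 is the usual one-row dummy table; the expected results assume the conversion behaves as described):
SELECT TO_DATE8(20181123) AS D1,  -- expected: 2018-11-23
       TO_DATE8(0)        AS D2   -- expected: 0001-01-01 (the *loval placeholder)
FROM SYSIBM.SYSDUMMY1;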
For the benefit of others, this can be done in one line rather than with a function:
CASE WHEN MYDATE = 0 THEN NULL ELSE DATE(INSERT(INSERT(LEFT(CHAR(MYDATE),8),5,0,'-'),8,0,'-')) END
MYDATE was an 8-digit packed decimal in my case.