I am using the COPY command to load a file into a table. It has a timestamp column.
In my file format I defined the timestamp format as "other" and gave the value MM/DD/YYYY HH:MI:SS AM
to match the data.
When I execute it, all records whose timestamp is an AM time load fine, but any record with a PM time fails in the COPY.
Sample failing record:
1, abc, 04/12/2016 12:00:00 PM
Sample successfully loaded record:
2, erd, 04/12/2016 08:00:00 AM
To verify my timestamp format I used the query below, and it worked fine:
SELECT TO_TIMESTAMP('04/12/2016 12:00:00 PM','MM/DD/YYYY HH:MI:SS AM')
The AM in your format definition goes with HH12, not HH24; otherwise it is just treated as a literal string. Try explicitly using HH12 in your format definition and see if that resolves your issue. I agree that it should behave the same as TO_TIMESTAMP(), but since it doesn't, I'd follow the guidelines of the documentation for your file format.
https://docs.snowflake.com/en/sql-reference/functions-conversion.html#date-and-time-formats-in-conversion-functions
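A minimal sketch of that change (the file format, table, and stage names here are placeholders, not from your environment):

CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV
  TIMESTAMP_FORMAT = 'MM/DD/YYYY HH12:MI:SS AM';  -- HH12 plus the AM/PM marker

COPY INTO my_table
  FROM @my_stage/data.csv
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');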
Using a derived column I am adding 3 columns: 2 columns for dates and 1 for a timestamp. For the date columns I am passing a string as a parameter, e.g. 21-11-2021, and for the timestamp I am using the currentTimestamp() function.
I wrote expressions in the derived columns to convert them to the date and timestamp data types, and also into the format the target table needs, which is dd-MM-yyyy and dd-MM-yyyy HH:mm:ss respectively.
For the date:
Expression used: toDate($initialdate, 'dd-MM-yyyy')
Data preview output: 2021-01-21 -- not in the format I want
After a pipeline debug run, the value in the target DB (Azure SQL Database) column is:
2021-01-21T00:00:00 -- in the table it shows like this, I don't understand why
For the timestamp conversion:
Expression used:
toTimestamp(toString(currentTimestamp(), 'dd-MM-yyyy HH:mm:ss', 'Europe/Amsterdam'), 'dd-MM-yyyy HH:mm:ss')
Data preview output: 2021-11-17 19:37:04 -- not in the format I want
After a pipeline debug run, the value in the target DB (Azure SQL Database) column is:
2021-11-17T19:37:04:932 -- in the table it shows like this, I don't understand why
Question 1: I am not getting values in the format the target requires. They should be stored only as the date and datetime2 data types respectively, so no string conversions.
Question 2: After the debug run, I don't know why the values in the table look different from the data preview after the insert.
Kindly let me know if I have written any wrong expressions.
(Apologies, I am not able to post pictures.)
toDate() converts an input date string to a date, with the default format yyyy-[M]M-[d]d. The accepted formats are: [ yyyy, yyyy-[M]M, yyyy-[M]M-[d]d, yyyy-[M]M-[d]dT* ].
The same goes for toTimestamp(); the default pattern is yyyy-[M]M-[d]d hh:mm:ss[.f...] when it is used.
In Azure SQL Database as well, the default date and datetime2 formats are YYYY-MM-DD and YYYY-MM-DD HH:mm:ss, as the sketch below illustrates.
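A minimal sketch of that (the test1 table matches the query further down; the ts1 column and the inserted values are made up here to mirror the question):

CREATE TABLE test1 (id int, col1 varchar(10), date1 date, ts1 datetime2);
INSERT INTO test1 VALUES (1, 'abc', '2021-01-21', '2021-11-17 19:37:04.932');
-- the stored values carry no display format of their own; they should come back
-- as 2021-01-21 and 2021-11-17 19:37:04.9320000 regardless of how they were written
SELECT date1, ts1 FROM test1;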
But if your column data types are string (varchar), then you can change the output format of the date and datetime in the Azure data flow mappings, and the value keeps that format when it is loaded into the Azure SQL database. Note: this only works when the data type is varchar.
If the data type is date in the Azure SQL database, you can convert it to the required format using date conversions, for example:
select id, col1, date1, convert(varchar(10), date1, 105) as 'dd-MM-yyyy' from test1
Azure SQL Database always follows the UTC time zone. Use AT TIME ZONE to convert to another, non-UTC time zone:
select getdate() as a, getdate() AT TIME ZONE 'UTC' AT TIME ZONE 'Central Standard Time' as b
You can also refer to the sys.time_zone_info view to check the current UTC offset information:
select * from sys.time_zone_info
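For instance, to check the offset for the zone used in the AT TIME ZONE example above:

select name, current_utc_offset, is_currently_dst
from sys.time_zone_info
where name = 'Central Standard Time'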
The date format of the machine is yyyy/mm/dd HH:mm:ss.
My time format input is (yyyy-MM-dd-HH.mm.ssssss), e.g. 1988-12-25-17.12.30.000000. Can this time format be used to query logs from the historic_log_info table, irrespective of the date format set on the machine?
Example query: SELECT * FROM TABLE(HISTORY_LOG_INFO( START_TIME => '2021-02-22-09.35.16.508075')) WHERE MESSAGE_ID IS NOT NULL
Based on the description of end-time, that formatting looks correct:
start-time
A timestamp expression that indicates the starting timestamp to use when returning history log information.
If this parameter is omitted, the default of CURRENT DATE - 1 DAY is used.
end-time
A timestamp expression that indicates the ending timestamp to use when returning history log information.
If this parameter is omitted, the default of '9999-12-30-00.00.00.000000' is used.
https://www.ibm.com/support/knowledgecenter/ssw_ibm_i_74/rzajq/rzajqudfhistoryloginfo.htm
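So a call in the question's style, with both parameters spelled out, should be accepted (a sketch only; the START_TIME is the example value from the question, and the END_TIME is just the documented default):

SELECT *
  FROM TABLE(HISTORY_LOG_INFO(
         START_TIME => '1988-12-25-17.12.30.000000',
         END_TIME   => '9999-12-30-00.00.00.000000'))
 WHERE MESSAGE_ID IS NOT NULL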
I am converting a bigint to a timestamp; the value is '1494257400'.
I am using a Presto query, but Presto does not give the correct result from the from_unixtime() function.
Hive version:
select from_unixtime(1494257400) result : '2017-05-09 00:30:00'
Presto version:
select from_unixtime(1494257400) result : '2017-05-08 08:30:00'
Hive gives the correct result, but Presto does not. How can I solve this?
The Presto from_unixtime returns you a date at UTC, while the one from Hive returns you a date in your local time zone.
According to https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF, from_unixtime:
Converts the number of seconds from unix epoch (1970-01-01 00:00:00 UTC) to a string representing the timestamp of that moment in the current system time zone in the format of "1970-01-01 00:00:00".
The Hive output is arguably the less correct one, because ISO-formatted strings should carry an offset indicator whenever they represent anything other than GMT+00.
With Hive, you can use to_utc_timestamp({any primitive type} ts, string timezone) to convert your timestamp to the proper time zone. Take a look at the manual linked above.
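If you want the two engines to agree explicitly, you can pin the zone on both sides. A sketch, where 'Asia/Seoul' is just a stand-in for whatever your Hive system time zone actually is:

-- Presto: render the epoch value in an explicit zone instead of the UTC default
SELECT from_unixtime(1494257400, 'Asia/Seoul');
-- Hive: treat the locally rendered value as being in that zone and shift it to UTC
SELECT to_utc_timestamp(from_unixtime(1494257400), 'Asia/Seoul');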
I have a lot of data, and one field looks like Wed Sep 15 19:17:44 +0100 2010. I need to insert that field into Hive.
I am having trouble choosing a data type. I tried both timestamp and date, but I get null values when loading from the CSV file.
The data type is a string, as it is text. If you want to convert it, I would suggest a TIMESTAMP. However, you will need to do this conversion yourself while loading the data or (even better) afterwards.
To convert to a timestamp, you can use the following syntax:
CAST(FROM_UNIXTIME(UNIX_TIMESTAMP(<date_column>,'FORMAT')) as TIMESTAMP)
Your format seems complex, though. My suggestion is to load it as a string and then just run a simple query on the first record until you get it working:
SELECT your_column as string_representation,
CAST(FROM_UNIXTIME(UNIX_TIMESTAMP(<date_column>,'FORMAT')) as TIMESTAMP) as timestamp_representation
FROM your_table
LIMIT 1
You can find more information on the format here: http://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html
My advice would be to concatenate some substrings first and try to convert only the day, month, and year parts before you look at the time and time zone, et cetera.
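That said, if you want to try the full pattern in one go, my best guess for a value like Wed Sep 15 19:17:44 +0100 2010 is 'EEE MMM dd HH:mm:ss Z yyyy' (verify it against the SimpleDateFormat docs linked above; it also assumes an English locale for the day and month names):

SELECT your_column as string_representation,
       CAST(FROM_UNIXTIME(UNIX_TIMESTAMP(your_column, 'EEE MMM dd HH:mm:ss Z yyyy')) as TIMESTAMP) as timestamp_representation
FROM your_table
LIMIT 1
-- if this returns NULL, the pattern guess is off; fall back to the substring approach above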
I exported data in CSV format from a SQL Server database. It contains 5 columns; one column has a date and time value. When I checked the date-time values, I found some of them are in the wrong format. I added a filter, but the filter is not applied to some of the data. I tried to format the data into one consistent format, but the formatting is not applied. I have tried everything to fix the issue, but it does not get fixed.
I have attached the sample data, please check it:
7/12/2013 14:50
8/12/2013 20:14
9/12/2013 11:38
10/12/2013 15:31
13/12/2013 12:45:50
13/12/2013 14:35:42
13/12/2013 14:37:40
14/12/2013 17:00:10
18/12/2013 14:57:35
The rows starting from 13/12/2013 12:45:50 do not get their date-time format changed.
The trouble is that your dates are in French format (dd/mm/yyyy). You can force them to datetime with the following lines:
[datetime]::ParseExact("7/12/2013 14:50", "d/MM/yyyy HH:mm", $null)
[datetime]::ParseExact("13/12/2013 12:45:50", "d/MM/yyyy HH:mm:ss", $null)
Be careful: in your case you sometimes have seconds and a double space between the day and the time.