Rails date loss of precision - PostgreSQL

I have an annoying issue with Rails/Active Record and dates that I'm trying to figure out. We're using date fields for composite keys, which I am turning into timestamps to make them easier to use as URL parameters. This is working fine, but I get inconsistencies when trying to look up records after converting the timestamp back into a DateTime. When the object is serialised, the ID sent back looks something like 1401810373.197,63 where the first number is the timestamp with milliseconds, and the second value is the original ID that Rails normally uses.
When a request is received with this ID, the timestamp is parsed using the following:
# ... get timestamp from input ...
Time.at(Rational(timestamp)).utc.strftime('%Y-%m-%d %H:%M:%S.%3N')
This works as expected, and the queries produced using this also work as expected. The issue is that the datetime produced here is slightly different from the one on the original object. They're out by something like 1 ms, which I'm assuming is due to the loss of precision when using to_f to get the timestamp.
I did a quick test in a console with the following code to replicate it
# For each record, compare the round-tripped, millisecond-formatted
# timestamp with the original created_at.
Model.all.each do |m|
  puts Time.at(Rational("#{m.to_param.split(',').first}")).utc.strftime('%Y-%m-%d %H:%M:%S.%3N') == m.created_at.utc.strftime('%Y-%m-%d %H:%M:%S.%3N')
end
The output of this shows multiple true and false values, so something is definitely going wrong in the conversion.
Currently, the to_param method simply converts the created_at field using to_f. I've tried changing this to "%.6f" % m.created_at.to_f, but this hasn't changed anything.
Is there anything I can do to prevent this difference in times? It's causing an array of issues.
This is currently running on Postgres, where the created_at column is a timestamp(3) column. We're using Rails 4.1 with JRuby 1.7.12.
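
One possibly relevant detail, hedged since the storage settings aren't shown above: with the default integer timestamp storage, PostgreSQL rounds fractional seconds to the declared precision when storing into a timestamp(3) column, while strftime('%3N') truncates, so the stored value can legitimately come back 1 ms higher than the truncated float suggests. A minimal sketch in SQL:

-- PostgreSQL rounds to the column precision rather than truncating,
-- so microseconds of .xxx500 and above bump the millisecond up by one.
SELECT '2014-06-03 15:46:13.197501'::timestamp(3) AS rounded_up,    -- ...13.198
       '2014-06-03 15:46:13.197499'::timestamp(3) AS rounded_down;  -- ...13.197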

Ended up dropping the accuracy entirely. Now the database column has a type of timestamp(0) and Rails has been modified to not provide the milliseconds. Seems to be working :)
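
For reference, the equivalent schema change in plain SQL might look like the sketch below (hypothetical table name; the USING clause truncates existing values rather than letting Postgres round them):

-- Drop sub-second precision on a hypothetical models table.
ALTER TABLE models
  ALTER COLUMN created_at TYPE timestamp(0)
  USING date_trunc('second', created_at);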

Related

Date CAST in SQL Server throwing conversion failed for a date

I am experiencing a very unusual cast error, and I don't understand why it happens with some dates and only in one particular case.
First of all, I cannot change the current code; it's a dynamic query from a legacy application, assembled from the results of queries against different tables, and it's this assembled query I am having trouble with.
The error is a classic 'conversion failed when converting date and/or time from character string'.
At the beginning I thought it was a classic file-naming error: we obtain the date from the file name in the format YYYYMMDD, and the file name, with its prefix and suffix, is always formatted like that. It used to be pretty common to get wrongly formatted dates, but that doesn't happen anymore. The issue is interesting because it only happens in one case, for some dates that do not look like errors, for example 20201105, which translates to 11/05/2020 (US format with month first).
This is the query:
SELECT TOP 1 CAST(LEFT(REPLACE(FileName, 'XXYYY_', ''), 8) AS DATE) AS MyDate FROM Mytable
The file name in this case is XXYYY_20201105.txt
Why the TOP 1? Well, it is a very bad design: there are many rows with the same value, and it has to take only one to determine the date.
The most interesting part of it: when it fails, I can "fix" the error just by adding one more column:
SELECT TOP 1 CAST(LEFT(REPLACE(FileName, 'XXYYY_', ''), 8) AS DATE) AS MyDate, AnotherColumn
FROM Mytable
This query, just adding a column, doesn't fail. That's the weirdest part. I am trying to wrap my head around what the difference is between obtaining ONE column and TWO columns. When I add any other column, the issue seems to disappear.
Thanks a lot.
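
One hedged note, since no answer is shown above: SQL Server may evaluate the CAST against rows other than the one TOP 1 ultimately returns, and a different column list can change the plan enough to move that evaluation, which would explain the disappearing error. If the legacy query can be touched after all, TRY_CAST (SQL Server 2012+) sidesteps the failure by returning NULL for strings that are not valid dates:

-- TRY_CAST yields NULL instead of raising a conversion error, so a
-- malformed FileName elsewhere in the table cannot abort the query.
SELECT TOP 1
       TRY_CAST(LEFT(REPLACE(FileName, 'XXYYY_', ''), 8) AS DATE) AS MyDate
FROM Mytable;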

Truncate datetimes by second for all queries, but keep milliseconds stored in Postgres

I'm trying to find a way to tell Postgres to truncate all datetime columns so that they are displayed and filtered by seconds (ignoring milliseconds).
I'm aware of the
date_trunc('second', my_date_field)
function, but I do not want to do that for every datetime field in every SELECT and WHERE clause that mentions them. Dates in the WHERE clause also need to match records at the granularity of seconds.
Ideally, I'd avoid stripping milliseconds from the data when it is stored. But then again, maybe this is the best way. I'd really like to avoid that data migration.
I can imagine Postgres having some kind of runtime configuration like this:
SET DATE_TRUNC 'seconds';
similar to how timezones are configured, but of course that doesn't work, and I'm unable to find anything else in the docs. Do I need to write my own Postgres extension? Has someone already written this?
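
Short of writing an extension, one workaround (a sketch with hypothetical table and column names, not something from the thread) is to expose a view that truncates on the way out, so the base table keeps its milliseconds while queries against the view see and filter whole seconds:

-- Hypothetical events table with a millisecond-precision created_at.
CREATE VIEW events_by_second AS
SELECT id,
       date_trunc('second', created_at) AS created_at
FROM events;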

Tableau cannot recognize timestamp field in my log file

I am using Tableau 9.3 to do a preliminary data analysis on one of my log files; the log file looks like this:
"199.72.81.55",01/Jul/1995:00:00:01,/history/apollo/,200,6245,Sat
As you can see, there is a datetime for the timestamp.
In Tableau it is initially recognized as a string.
That's fine; I want to convert the field into a datetime, but Tableau seems to fail at it.
Why? How do I fix it?
Thank you very much.
UPDATED: after applying the formula suggested below, Tableau still cannot recognize the timestamp.
UPDATED AGAIN: nick has tested and confirmed that his first formula is correct and works in his Tableau. Why it fails on mine I don't know; you are welcome to share any clue, thank you.
Tableau implicit conversions are limited to more standard formats. You can still create a DATETIME field from your timestamp string using a calculated field with the following formula:
DATEPARSE('dd/MMM/yyyy:HH:mm:ss',[timestamp])
Using the above will transform a string like 01/Jul/1995:00:00:01 to a date and time of 7/1/1995 12:00:01 AM
Sometimes the DATEPARSE function in Tableau doesn't quite do the job.
When this happens, it is worth testing manual string manipulation on your timestamp field to put it into ISO-standard format, and only then trying to convert it into a date. ISO format is yyyy-mm-dd hh:mm:ss (e.g. 2012-02-28 13:04:30). It is common to find that the original string has spurious characters or spaces that throw off DATEPARSE, but these are usually easy to manipulate away with suitable text functions. This can sometimes be long-winded, but it always works.
It turned out to be a region-setting issue; it works after I switched the region to USA.

Add filter to extract rows where the timestamp falls between yesterday at 4 AM and today at 3 AM in Cognos

I am new to Cognos and I am trying to add a filter to a column that only allows rows between yesterday at 4 AM and today at 3 AM. I have a working query in DB2, but when I try to add it to the filter in Cognos I get a parsing error. Also, I found in the properties that the data type of the column I am trying to filter on is Unknown (Unsupported). I started off by creating two Data Item Expressions, one for each end of the time frame I am trying to limit the data by, but I got a parsing error on the first one:
[Presentation Layer].[Cr dtime]=timestamp(current date) - 1 day + 4 hour
This works in my local DB2 test database but doesn't even compile in Cognos. I also tried casting the column to a timestamp, but that isn't working either, and the _add_days function still gives a parsing error. Sampling the column, I get values that appear to be timestamps, as strings like 2016-01-02T11:11:45.000000000. Any help is appreciated.
Eventually if I get the two filters working I expect the original filter to be close to this syntax:
[Presentation Layer].[Cr dtime] is between [Yesterday 4AM] AND [Today 3AM]
Here is your filter:
[Presentation Layer].[Cr dtime] between
cast(_add_hours(_add_days(current_date,-1),4),timestamp)
and
cast(_add_hours(current_date,3),timestamp)
This works because current_date in Cognos does not have a time component. If you were to cast it directly to a timestamp type, you would see the time part of the date as 12:00:00.000 AM, i.e. midnight. Knowing this, we can simply add however much time after midnight we want, cast to a timestamp type, and use that in the filter.

How did this number become this datetime?

We have a vendor application in which one field in the database is a number, yet somehow the app interface shows it as a date.
I'm trying to figure out how this conversion works.
Here is the data:
this number 15862 generates this date 06/05/2013
I have no idea how. The vendor told us it is NOT a custom logic conversion; a T-SQL function was used, although I can't figure out which one.
I tried using CONVERT without success.
I don't think that's from a T-SQL function, considering it's derived using the Unix time epoch. Basically it's the number of days since 1969-12-31.
But you could get it using T-SQL like so:
select datediff(d,'1969-12-31','2013-06-05')
It looks like it's using a base date of 1/1/1970 (actually 12/31/1969), and the number represents the number of days after that.
Most probably this is saved as an offset in days since 01/01/1970:
date('m/d/Y', 15862*3600*24) gives 06/06/2013, and
date('m/d/Y', 15862*3600*24 - 3600*24) gives exactly 06/05/2013
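
For completeness, the reverse direction in T-SQL (a hedged sketch, reproducing the vendor's conversion from the day offset derived above):

-- Add the stored day count to the shifted Unix epoch to recover the date.
SELECT DATEADD(day, 15862, '1969-12-31');  -- returns 2013-06-05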