This is a pretty straightforward question. It may be simple, but I'm having a hard time solving it:
How can I get the average of a time value? I have a column in my dataset called response time, which holds the amount of time a client waited before they got an answer from the company. The values in this column will never be greater than 00:50:00. I need to get the average response time by month, but the field comes through as a string. I tried creating a calculated field called avg_time by doing this:
DATETIME(AVG(FLOAT([response time])))
but all I got was:
An error occurred while communicating with the PostgreSQL data source 'database_name (namespace.database_name) (company)'
Error Code: A7B6E1FA
ERROR: invalid input syntax for type double precision: "00:09:03"
Can you guys help me with that, please? Thanks!
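In case it helps, here is a minimal sketch of how the average could be computed directly in PostgreSQL, assuming the response time text can be cast to interval and that there is a timestamp column to group by (the table and column names here are placeholders):
-- Cast the "00:09:03"-style text to interval so PostgreSQL can average it,
-- then group the results by month.
SELECT
    date_trunc('month', created_at) AS month,
    AVG(response_time::interval)    AS avg_response_time
FROM tickets
GROUP BY 1
ORDER BY 1;
AVG over an interval returns an interval, so the result keeps the hh:mm:ss form rather than a bare number.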
When I write a date in my input, I receive the wrong time in the database (2 hours earlier than the time I want): the database shows the hour 5:21 even though I wrote the time 7:21 in the input.
I thought it might be because of the time zone (I am in Israel), so I tried using the Moment library, like this:
holiday.beginDate = moment.tz(req.body.beginDate, "Asia/Jerusalem");
But I get the same result.
Can someone tell me what I am doing wrong, or what I should do so that the time I write in the input is the same time stored in the database?
Thank you.
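For reference, a minimal sketch of what is usually going on here, assuming the input string looks something like "2016-01-02 07:21" (the date format and variable names are assumptions): moment.tz interprets the string as Jerusalem wall-clock time, but when the moment object is serialised (for example via JSON.stringify or toISOString before the insert) it is converted to UTC, which is where the 2-hour shift appears.
const moment = require('moment-timezone');

// Hypothetical input value, interpreted as Asia/Jerusalem wall-clock time.
const input = '2016-01-02 07:21';
const local = moment.tz(input, 'YYYY-MM-DD HH:mm', 'Asia/Jerusalem');

console.log(local.toISOString()); // 2016-01-02T05:21:00.000Z  (UTC: what ends up in the database)
console.log(local.format());      // 2016-01-02T07:21:00+02:00 (wall-clock time with its offset)

// One option is to store the formatted string (which carries the offset),
// e.g. holiday.beginDate = local.format(), or to keep storing UTC and
// convert back to Asia/Jerusalem when reading the value out.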
I have a datediff() call that throws an exception.
I am trying to calculate the number of days between two dates. The rub is that one date is a converted integer value in YYYYMMDD format and the second date field is a timestamp. So, in the snippet below I am doing what I think are the correct conversions. Sometimes it actually runs.
The message I get is: Amazon Invalid operation: Data value "0" has invalid format.
select site,
       datediff(days, to_date(cast(posting_dt_sk as varchar), 'YYYYMMDD'), trunc(ship_dt)) days_to_ship
from sales_table
Later I added a WHERE clause to ignore empty values, thinking I had bad data, but that's not it. I still get the message.
where posting_dt_sk is not null and posting_dt_sk > 0
It all looks right to me, but it fails.
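For what it's worth, a hedged sketch of one common workaround, assuming the failure comes from rows where posting_dt_sk is 0 (or otherwise not a valid YYYYMMDD value) and the cast is still evaluated for those rows despite the WHERE clause: wrapping the conversion in a CASE ensures to_date only ever sees plausible values (the 19000101 cutoff is an arbitrary assumption):
select site,
       datediff(days,
                case when posting_dt_sk > 19000101
                     then to_date(cast(posting_dt_sk as varchar), 'YYYYMMDD')
                end,
                trunc(ship_dt)) days_to_ship
from sales_table
where posting_dt_sk is not null
  and posting_dt_sk > 0
Rows that fail the CASE condition produce a NULL date, so datediff returns NULL for them instead of raising the formatting error.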
After using the extract function to pull an hour value out of a timestamp in PostgreSQL, how can I count the number of times each value appears? I'm trying to produce the number of calls coming into a helpdesk based on time of day, so I need to count how many came in at 2pm, 3pm, 4pm, and so on. Thanks for any help.
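A minimal sketch of one way to do this, assuming a helpdesk_calls table with a created_at timestamp column (both names are placeholders):
-- Extract the hour of day from each call and count how many calls fall in each hour.
SELECT
    EXTRACT(hour FROM created_at) AS call_hour,
    COUNT(*)                      AS calls
FROM helpdesk_calls
GROUP BY call_hour
ORDER BY call_hour;
Grouping by the extracted hour gives one row per hour of day (0-23) with the number of calls received in that hour.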
I am new to Cognos and I am trying to add a filter to a column that only allows rows between yesterday at 4 AM and today at 3 AM. I have a working query in DB2, but when I try to add it to the filter in Cognos I get a parsing error. I also found in the properties that the data type of the column I am trying to filter on is Unknown (Unsupported) type. I started by creating two Data Item Expressions, one for each end of the time frame I am trying to limit the data by, but I got a parsing error on the first one:
[Presentation Layer].[Cr dtime]=timestamp(current date) - 1 day + 4 hour
This works in my local DB2 test database but doesn't even compile in Cognos. I also tried casting the column to a timestamp, but that isn't working either. Any help is appreciated. I also tried using the _add_days function, but I still get a parsing error. Sampling the column, I get values that appear to be timestamps, as this string: 2016-01-02T11:11:45.000000000
Eventually, if I get the two filters working, I expect the final filter to be close to this syntax:
[Presentation Layer].[Cr dtime] is between [Yesterday 4AM] AND [Today 3AM]
Here is your filter:
[Presentation Layer].[Cr dtime] between
cast(_add_hours(_add_days(current_date,-1),4),timestamp)
and
cast(_add_hours(current_date,3),timestamp)
This works because current_date in Cognos does not have a time component. If you were to cast it directly to a timestamp type, you would see the time part of the date as 12:00:00.000 AM, i.e. midnight. Knowing this, we can simply add however much time after midnight we want, cast to a timestamp type, and use that in the filter.
I have an annoying issue with Rails/Active Record and dates that I'm trying to figure out. We're using date fields as composite keys, which I am turning into timestamps to make them easier to use as URL parameters. This is working fine, but I see inconsistencies when trying to look up records after converting the timestamp back into a DateTime. When the object is serialised, the ID sent back looks something like 1401810373.197,63, where the first number is the timestamp with milliseconds and the second value is the original ID that Rails normally uses.
When a request is received with this ID, the timestamp is parsed using the following:
# ... get timestamp from input ...
Time.at(Rational(timestamp)).utc.strftime('%Y-%m-%d %H:%M:%S.%3N')
This works as expected, and the queries produced using this also work as expected. The issue is that the datetime produced here is slightly different from the one on the original object. They're off by something like 1 ms, which I'm assuming is due to the loss of precision when using to_f to get the timestamp.
I did a quick test in a console with the following code to replicate it:
Model.all.each do |m|
  puts Time.at(Rational("#{m.to_param.split(',').first}")).utc.strftime('%Y-%m-%d %H:%M:%S.%3N') == m.created_at.utc.strftime('%Y-%m-%d %H:%M:%S.%3N')
end
The output of this shows multiple true and false values, so something is definitely going wrong in the conversion.
Currently, the to_param method simply converts the created_at field using to_f. I've tried changing this to "%.6f" % m.created_at.to_f but this hasn't changed anything.
Is there anything I can do to prevent this difference in times? It's causing an array of issues.
This is currently running on Postgres, where the created_at column is a timestamp(3) column. We're using Rails 4.1 with JRuby 1.7.12.
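For reference, a minimal sketch of one way the round trip could be made exact, assuming the parameter can carry a full Rational instead of a truncated float (the method placement mirrors the to_param described above; everything else here is an assumption):
# In the model: Time#to_r returns the exact seconds since the epoch as a
# Rational, so no precision is lost the way it is with to_f.
def to_param
  "#{created_at.to_r},#{id}"
end

# When handling the request: Rational(...) parses the "numerator/denominator"
# string back into the exact same value, so the reconstructed time matches.
timestamp, original_id = params[:id].split(',')
time = Time.at(Rational(timestamp)).utc
Note that the Rational renders as something like "1401810373197/1000", so the parameter contains a slash and may need URL-encoding.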
Ended up dropping the extra precision entirely. Now the database column has a type of timestamp(0), and Rails has been modified not to provide the milliseconds. Seems to be working :)