I only want data that is older than 28 days - datediff

I'm looking to ignore all the data from the last 30 days.
This is what I have so far. I'm not sure if it's a DATEDIFF or something else:
SELECT MOT_BOOKINGS_FW.REASON_CODE_FW,
       MOT_BOOKINGS_FW.VEHICLE_ID_FW,
       MOT_BOOKINGS_FW.BOOKING_STATUS_FW,
       MOT_BOOKINGS_FW.BOOKING_DATE_FW,
       MOT_BOOKINGS_FW.RECORD_NUMBER_FW
FROM MOT_BOOKINGS_FW
WHERE ((%CRS:%MOT_BOOKINGS_FW.REASON_CODE_FW = N'mot'%CRE:%)
  AND MOT_BOOKINGS_FW.ARCHIVE_STATUS_FW = N'N')
ORDER BY MOT_BOOKINGS_FW.BOOKING_DATE_FW ASC

From your sample code, it seems you're using SQL Server (but please, tag your RDBMS on the question).
What you need is something like this:
select * from your_table
where datediff(day, date_column, getdate()) > 28
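Applied to the query in the question, a sketch of an equivalent filter that leaves the date column bare, so an index on BOOKING_DATE_FW can still be used (SQL Server assumed; table and column names taken from the question, the 28-day cutoff from the title):
-- Sketch: comparing the bare column against DATEADD keeps the predicate index-friendly
SELECT REASON_CODE_FW, VEHICLE_ID_FW, BOOKING_STATUS_FW, BOOKING_DATE_FW, RECORD_NUMBER_FW
FROM MOT_BOOKINGS_FW
WHERE REASON_CODE_FW = N'mot'
  AND ARCHIVE_STATUS_FW = N'N'
  AND BOOKING_DATE_FW < DATEADD(day, -28, GETDATE())
ORDER BY BOOKING_DATE_FW ASC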


Excluding date(s) from a report

Monday this week (26th August) was a public holiday in the UK. I need to exclude this date from some of my reports.
Simple enough, but I have one database where the date is broken into 15-minute segments, for example:
2019-08-26 08:30:00.000
2019-08-26 16:15:00.000
The only way I can work out to exclude all these dates from the report would be to use NOT IN, for example:
AND a.mydate NOT IN ('2019-08-26 08:30:00.000', '2019-08-26 16:15:00.000')
Which seems quite a ponderous way of doing it to me.
There's also:
and a.mydate NOT BETWEEN '2019-08-26 08:30:00.000' and '2019-08-26 19:30:00.000'
but I'd be glad to hear of any improvement on that, if there is one, as there will be more public holidays to exclude, perhaps many more, going forward.
I do not have a separate table with holiday dates in it, so I need to script it as per the above.
Thank you
If you have discrete values, you could add them to a table, do a LEFT JOIN, and put a filter in the WHERE clause saying the exclusion table's Id should be NULL (meaning the join failed).
Something like:
SELECT
    mt.mydata
FROM
    mytable mt LEFT JOIN Holidays h ON mt.mydate = h.date
WHERE
    h.Id IS NULL
This is just an alternative way made possible because you have discrete values; I'm not sure whether it's actually faster or not...
This worked better:
and cast(a.mydate as date) not in
( '2019-08-26',
'2019-12-25',
'2019-12-26',
'2020-01-01',
. . . )
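If the exclusion list keeps growing, the two ideas combine naturally: keep the holidays in a small table with a date-typed column and compare on the date part only. A sketch (the Holidays table and holiday_date column are assumptions, not something from the original setup):
-- Sketch: casting mydate to date means every 15-minute slot on a listed holiday is excluded
SELECT mt.mydata
FROM mytable mt
LEFT JOIN Holidays h ON CAST(mt.mydate AS date) = h.holiday_date
WHERE h.holiday_date IS NULL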

Date Parameter to Cache SQL

I am very new to Cache. I am trying to develop a report with date parameters. When I issue the SQL command:
SELECT TOP 2 ad.admission_date from system.admission ad WHERE convert(sql_date,ad.admission_date) >= convert(sql_date,'08-01-2014' )
I get what I expect: two records.
One of them is 10/1/2010 12:00:00 AM.
Then if I issue the command
SELECT TOP 2 ad.admission_date from system.admission ad WHERE convert(sql_date,ad.admission_date) <= convert(sql_date,'08-01-2014' )
I get no values returned?
When I issue the command
SELECT TOP 2 {fn convert('10-03-2010', sql_date) } FROM system.admission_data
I get two NULL values. Clearly I am confused about how Cache works.
I have found that if you use the standard ODBC format (yyyy-MM-dd) for the date, you don't need to use the convert, and it is much more efficient:
WHERE ad.admission_date <= '2014-08-01'
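For example, a minimal sketch against the same table (names taken from the question; the literal is just an illustration):
-- Sketch: an ODBC-format literal compares directly against the date column, no CONVERT needed
SELECT TOP 2 ad.admission_date
FROM system.admission ad
WHERE ad.admission_date >= '2014-08-01'
ORDER BY ad.admission_date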
I formatted the date incorrectly. I have my code working now. It should look something like select top 2 convert(DATE, '10/03/2010 12:00:00 AM') .... and then I can actually do comparisons.

FileNet - SQL for a date property between specific hours

I would like to fetch documents where a date property's hour is between midnight and 4 AM.
I tried this:
SELECT [This], [Date] FROM Folder_Type_1
WHERE DATEPART(hh,[Date]) >= 0
AND DATEPART(hh,[Date]) <= 4
ORDER BY Date
and
SELECT [This], [Date] FROM Folder_Type_1
WHERE CONVERT(VARCHAR(8),Date,108) between '00:00:00' and '04:00:00'
ORDER BY Date
But neither of them works when I test it in the SQL query builder in the FEM.
DATEPART and CONVERT are not recognised. What is the correct way to do it?
I didn't find anything interesting in this SQL syntax reference.
Thank you in advance!
You are trying to use T-SQL functions within the Content Engine Query Language. While its syntax might look like SQL, it is actually not, and it is certainly not T-SQL.
As of today, it is not possible to accomplish what you want. The TimeSpan function introduced in version 5.1 allows some manipulation of date parts. That, however, is not sufficient for your task. You might want to check the TimeSpan documentation.
I have used the following before:
where c.DateCreated >= 20130101T000000Z
This is a snippet from a query executed using the API and not the FEM, but in principle it should be the same SQL.
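Building on that, if the window only needs to cover a known day, the property can be bounded with two literal timestamps, since hour-extraction functions are not available in CE SQL. A sketch in the same syntax (the specific date is only an illustration, and it covers a single day's 00:00-04:00 window rather than every day):
SELECT [This], [Date]
FROM Folder_Type_1
WHERE [Date] >= 20130101T000000Z
  AND [Date] < 20130101T040000Z
ORDER BY [Date]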

Combining a datetime stamp with 2 other values to make 1 value

New software that we have installed needs a specific id to be used as a reference token, so it knows where it left off in the SQL database. Presently, the other software that enters the data I am referencing does not record a time: 2013-05-20 00:00:00. I would like to combine that datetime stamp with my pipesize and period.
The data looks like this:
trandate = 2013-05-20 00:00:00, Pipesize = 30, Period = A
I need the data converted to look like 2013052000000030A, or something close, if that is possible.
I used the chart here to determine the correct format: w3schools.com/sql/func_convert.asp and the resulting sqlfiddle here: sqlfiddle.com/#!3/d9654/3
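To make that concrete, a sketch of what the CONVERT approach might look like in SQL Server (assumed from the sqlfiddle link; the table name mytable is an assumption, column names are taken from the question):
-- Sketch: style 112 gives yyyymmdd, style 108 gives hh:mi:ss; stripping the colons
-- and appending Pipesize and Period yields 2013052000000030A for the sample row
SELECT CONVERT(varchar(8), trandate, 112)
     + REPLACE(CONVERT(varchar(8), trandate, 108), ':', '')
     + CAST(Pipesize AS varchar(10))
     + Period AS reference_token
FROM mytable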

Format Hour in DB2?

Is there any way to display the output of DB2's HOUR function with AM and PM?
select hour(TIMESTAMP) from ORDERS with ur
This gives output like 5, 6, 7, etc.
But I want AM/PM after the hour.
I'd like to display it as 5AM, 6AM, 7AM. Is that possible in DB2?
Use the TIME() and CHAR() functions:
SELECT CHAR(TIME(timestamp), USA)
FROM Orders
WITH UR
Although, honestly, you should be doing this type of formatting in the application layer, not the SQL layer.
(Statement run on my local DB2 instance)
EDIT:
Sorry, I missed that part earlier. Going through the documentation has shown me a new function, VARCHAR_FORMAT(). Assuming you're on DB2 9.5, the following should grant you some form of what you're looking for:
SELECT VARCHAR_FORMAT(timestamp, 'HH12 AM')
FROM Orders
WITH UR
Unfortunately, I can't test this myself, as iSeries V6R1 doesn't support the HH12 flag (HH24 only, what?). Otherwise, you're going to have to parse it out yourself.
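If neither CHAR(TIME(...), USA) nor VARCHAR_FORMAT gives exactly the 5AM/6AM style, here is a fallback sketch that builds the label from HOUR() alone, avoiding HH12 entirely (the hour_ampm alias is just illustrative; column and table names as in the question):
-- Sketch: RTRIM removes the trailing blanks CHAR() adds when converting an integer
SELECT
  CASE
    WHEN HOUR(timestamp) = 0  THEN '12AM'
    WHEN HOUR(timestamp) < 12 THEN RTRIM(CHAR(HOUR(timestamp))) || 'AM'
    WHEN HOUR(timestamp) = 12 THEN '12PM'
    ELSE RTRIM(CHAR(HOUR(timestamp) - 12)) || 'PM'
  END AS hour_ampm
FROM Orders
WITH UR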