How to check whether a JCR node has a property or not? - aem

I am retrieving a set of nodes from the JCR that were modified after a specific date, using the cq:lastModified property.
But some nodes don't have a lastModified property; for those, my query has to fall back to jcr:created.
How can this be achieved?
This is my query:
SELECT * FROM [nt:base] AS s WHERE [cq:lastModified] > CAST('2014-05-08T17:36:00.400+05:30' AS DATE)
Thanks

A test for the absence of a property looks like this:
[cq:lastModified] IS NULL
Your query could be rewritten to:
SELECT *
FROM [nt:base] AS s
WHERE [cq:lastModified] > CAST('2014-05-08T17:36:00.400+05:30' AS DATE)
   OR ([cq:lastModified] IS NULL
       AND [jcr:created] > CAST('2014-05-08T17:36:00.400+05:30' AS DATE))

I think using an OR condition should work for your case: a node whose cq:lastModified is not greater than the specified date obviously won't have a jcr:created greater than it either, since a node cannot be created after it was last modified.
And when cq:lastModified is absent, the query simply falls back to jcr:created:
SELECT * FROM [nt:base] AS s
WHERE [cq:lastModified] > CAST('2014-05-08T17:36:00.400+05:30' AS DATE)
OR [jcr:created] > CAST('2014-05-08T17:36:00.400+05:30' AS DATE)

Related

How to compare a string field as a date in PostgreSQL?

I have a string field in a table whose values are in a date format like 2020-12-08T18:06:55.132Z or 2020-12-08T18:06:55.132+11:00.
How can I filter this field by date, e.g. select * from my_table where timestamp > '2020-12-08T00:00:00'?
You can try something like:
SELECT *
FROM my_table
WHERE to_timestamp(timestamp,'YYYY-MM-DD"T"HH24:MI:SS.MS"Z"') > '2020-12-08T00:00:00'
EDIT:
Something even easier, which doesn't rely on your timestamp format:
SELECT *
FROM my_table
WHERE timestamp::timestamp > '2020-12-08T00:00:00'
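One caveat worth noting (not in the original answer): the second sample value carries a UTC offset (+11:00), which a plain timestamp cast silently discards. If the strings mix offsets, casting to timestamptz compares actual instants instead; a sketch against the same hypothetical my_table:

```sql
-- timestamptz keeps the offset information, so 'Z' and '+11:00'
-- values are normalized onto the same time line before comparing
SELECT *
FROM my_table
WHERE "timestamp"::timestamptz > timestamptz '2020-12-08T00:00:00Z';
```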

How to get sums of all rows that meet a condition in Postgres

I am trying to return several sums, each with its own condition.
SELECT
COUNT(*),
SUM("transactionTotal" WHERE "entryType"=sold) as soldtotal,
SUM(case when "entryType"=new then "transactionTotal" else 0 end) as newtotal
FROM "MoneyTransactions"
WHERE cast("createdAt" as date) BETWEEN '2020-10-08' AND '2020-10-09'
I am trying to sum up the rows with "entryType"='sold' and "entryType"='new' and return those values separately.
Obviously both of my attempts are wrong.
Can someone lend a hand?
You were on the right track to use conditional aggregation, but your syntax is slightly off. Try this version:
SELECT
    COUNT(*) AS total_cnt,
    SUM("transactionTotal") FILTER (WHERE "entryType" = 'sold') AS soldtotal,
    SUM("transactionTotal") FILTER (WHERE "entryType" = 'new') AS newtotal
FROM "MoneyTransactions"
WHERE "createdAt"::date BETWEEN '2020-10-08' AND '2020-10-09';
Note: if your createdAt column is already of type date, there is no point in casting it. If it is text, then you do need to convert it, possibly with TO_DATE depending on its format.
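Since FILTER is standard SQL (PostgreSQL 9.4+, and also SQLite 3.30+), the conditional-aggregation pattern is easy to sanity-check with a few rows. A minimal sketch using Python's sqlite3 with invented sample data (table and column names mirror the question's schema):

```python
import sqlite3

# In-memory demo of conditional aggregation with FILTER.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE MoneyTransactions (
        entryType        TEXT,
        transactionTotal REAL,
        createdAt        TEXT
    )
""")
conn.executemany(
    "INSERT INTO MoneyTransactions VALUES (?, ?, ?)",
    [
        ("sold", 100.0, "2020-10-08"),
        ("sold",  50.0, "2020-10-09"),
        ("new",   25.0, "2020-10-08"),
    ],
)

# Each SUM sees only the rows matching its FILTER condition.
row = conn.execute("""
    SELECT
        COUNT(*) AS total_cnt,
        SUM(transactionTotal) FILTER (WHERE entryType = 'sold') AS soldtotal,
        SUM(transactionTotal) FILTER (WHERE entryType = 'new')  AS newtotal
    FROM MoneyTransactions
    WHERE date(createdAt) BETWEEN '2020-10-08' AND '2020-10-09'
""").fetchone()

print(row)  # (3, 150.0, 25.0)
```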

How to cast a null date in a Hibernate nativeQuery?

I'm using PostgreSQL and Hibernate. Because I have to use SQL features that Hibernate does not support, I use nativeQuery in my @Query annotation.
I have multiple date parameters, and inevitably at least two of them are null.
I have to check a condition only if the corresponding parameter isn't null. The problem is that I'm not able to cast my date when it is null.
I tried different possible solutions, but nothing works:
:dEchTransacEqual is null OR transaction.d_ech_transac = cast(:dEchTransacEqual as date)
coalesce(:dEchTransacGreaterThanEqual, null) is null OR transaction.d_ech_transac >= cast(:dEchTransacGreaterThanEqual as date)
CASE cast(:dEchTransacGreaterThanEqual as date) WHEN not null THEN transaction.d_ech_transac >= cast(:dEchTransacGreaterThanEqual as date) END
CASE coalesce(cast(:dEchTransacGreaterThanEqual as boolean), null) WHEN not null THEN transaction.d_ech_transac >= cast(:dEchTransacGreaterThanEqual as date) END
to_date(cast(:dEchTransacGreaterThanEqual as date), 'YYYY/MM/DD') is null OR transaction.d_ech_transac >= cast(:dEchTransacGreaterThanEqual as date)
:dEchTransacGreaterThanEqual is null OR transaction.d_ech_transac >= to_date(cast(:dEchTransacGreaterThanEqual as date), 'YYYY/MM/DD')
I have no more ideas for solving my problem.
How can I handle a null date parameter in a Hibernate nativeQuery?
The last (and dirty) resort would be to implement as many repository methods as I have use cases (3).
PS: I have to use nativeQuery because I'm using OVER() and PARTITION BY, which Hibernate doesn't handle.
It looks like PostgreSQL cannot infer a type for a bare bind parameter in an IS NULL test, so the query fails when the parameter is null. That means you must build the query dynamically. If you don't want to manipulate strings, you may use FluentJPA's Dynamic Queries, which also supports the OVER clause.
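A workaround that often helps in this situation (an assumption on my part, not something from the answer above): cast the parameter inside the NULL test itself, so the driver gets a concrete type even for a null bind:

```sql
-- cast(:param as date) is well-typed even when the parameter is bound
-- to null, so the null test no longer trips up type inference
cast(:dEchTransacGreaterThanEqual as date) IS NULL
OR transaction.d_ech_transac >= cast(:dEchTransacGreaterThanEqual as date)
```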

T-SQL: convert a string into a date when possible

I've got a column to import into an Azure SQL DB that is supposed to contain only dates but of course contains errors.
In T-SQL I would like to do something like: convert to date if possible, otherwise null.
Does anyone know a statement to test whether a string is convertible into a date?
Use TRY_CAST or ISDATE:
SELECT TRY_CAST('test' AS date)
SELECT TRY_CAST('4' AS date)
SELECT CASE WHEN ISDATE('test') = 1 THEN CAST('test' AS date) ELSE NULL END
These return NULL for unparseable strings. TRY_CAST itself fails with an error, however, if the explicit conversion is not permitted at all, e.g. int to xml or int to date:
SELECT TRY_CAST(4 AS xml)
SELECT TRY_CAST(4 AS date)
You could use TRY_PARSE:
Returns the result of an expression, translated to the requested data type, or null if the cast fails. Use TRY_PARSE only for converting from string to date/time and number types.
SELECT TRY_PARSE('20129901' AS DATE)
-- NULL
Additionally, you can specify a culture:
SELECT TRY_PARSE('10/25/2015' AS DATE USING 'en-US')
And for the import:
INSERT INTO target_table(date_column, ...)
SELECT TRY_PARSE(date_string_column AS DATE) ...
FROM source_table
...

Averages in Sql Server Management Studio

I need to get daily averages for several tags in my data. I am running into a problem with the following query that I have set up:
SET NOCOUNT ON
DECLARE @StartDate DateTime
SET @StartDate = '20100101 00:00:00.000'
SET NOCOUNT OFF
SELECT TagName, DateTime, avg(Value), avg(vValue)
FROM History
WHERE TagName IN ('BFC_CGA_PE.qAr_Reading', 'BFC_CGA_PE.qBTU_Avg', 'BFC_CGA_PE.qBTU_Calc', 'BFC_CGA_PE.qCH4_Reading', 'BFC_CGA_PE.qCO_Reading', 'BFC_CGA_PE.qCO2_Reading', 'BFC_CGA_PE.qH2_Reading', 'BFC_CGA_PE.qN2_Reading', 'BFC_CGA_PE.qO2_Reading')
AND wwRetrievalMode = 'Cyclic'
AND wwVersion = 'Latest'
AND DateTime >= @StartDate
The error that I receive after my attempt to execute is:
Msg 8120, Level 16, State 1, Line 5
Column 'History.TagName' is invalid in the select list because it is not contained in either an aggregate function or the GROUP BY clause.
Could someone help to develop a query to fetch daily average values for my tags?
Try this: (GROUP BY clause added and DateTime column removed from query)
SELECT TagName, /*DateTime,*/ avg(Value), avg(vValue)
FROM History
WHERE TagName IN ('BFC_CGA_PE.qAr_Reading', 'BFC_CGA_PE.qBTU_Avg', 'BFC_CGA_PE.qBTU_Calc', 'BFC_CGA_PE.qCH4_Reading', 'BFC_CGA_PE.qCO_Reading', 'BFC_CGA_PE.qCO2_Reading', 'BFC_CGA_PE.qH2_Reading', 'BFC_CGA_PE.qN2_Reading', 'BFC_CGA_PE.qO2_Reading')
AND wwRetrievalMode = 'Cyclic'
AND wwVersion = 'Latest'
AND DateTime >= @StartDate
GROUP BY TagName
ORDER BY TagName
You just need to group by TagName. Notice I removed your DateTime column for now: date-time values are likely to be unique and therefore poor candidates for grouping, at least not without some work to isolate just the date portion of the value.
Add a GROUP BY clause. Also, assuming the DateTime field stores both a date and a time, you will want to aggregate by the date alone to get daily averages, as in the query below:
SELECT
TagName,
DATEADD(D, 0, DATEDIFF(D, 0, DateTime)),
avg(Value),
avg(vValue)
FROM History
WHERE TagName IN ('BFC_CGA_PE.qAr_Reading', 'BFC_CGA_PE.qBTU_Avg', 'BFC_CGA_PE.qBTU_Calc', 'BFC_CGA_PE.qCH4_Reading', 'BFC_CGA_PE.qCO_Reading', 'BFC_CGA_PE.qCO2_Reading', 'BFC_CGA_PE.qH2_Reading', 'BFC_CGA_PE.qN2_Reading', 'BFC_CGA_PE.qO2_Reading')
AND wwRetrievalMode = 'Cyclic'
AND wwVersion = 'Latest'
AND DateTime >= @StartDate
GROUP BY TagName, DATEADD(D, 0, DATEDIFF(D, 0, DateTime))
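On SQL Server 2008 and later, the DATEADD/DATEDIFF idiom can be replaced by a plain cast to date, which truncates the time portion more readably. A sketch against the same History table (the tag list is elided for brevity):

```sql
-- CAST(... AS date) drops the time-of-day, giving one group per tag per day
SELECT TagName,
       CAST([DateTime] AS date) AS [Day],
       AVG(Value)  AS AvgValue,
       AVG(vValue) AS AvgVValue
FROM History
WHERE wwRetrievalMode = 'Cyclic'
  AND wwVersion = 'Latest'
  AND [DateTime] >= @StartDate
GROUP BY TagName, CAST([DateTime] AS date)
ORDER BY TagName, [Day];
```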