Create metric in MicroStrategy that returns Min Date - rdbms

I have a database that maintains a history of item receipts. I would like to create a metric that returns the first receipt date for an item. The challenge I'm having is that null values are possible and are stored as the date 1/1/1900 (on a given date the item existed but had not yet been received). How can I return the next date in the value list when the first date is 1/1/1900?

You can achieve what you want if you build the metric with:
A filter to ignore your specified "NULL-Date"
Level: Set grouping to "Beginning Fact" for your date attribute
Use min(date attribute) as formula
Here is my test version of the metric; "Ttime" is the date attribute:
Min(Ttime) {~+, <|Ttime+} <[date filter]; #2; ->
Definition of the "date filter" used (really only a simple "> [NULL-SURROGATE]"):
Ttime (ID) Greater than 1/1/1900
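For reference, a minimal SQL sketch of what such a metric should roughly push down to the database; the table and column names here are assumed for illustration only:
-- Assumed schema: item_receipt_history(item_id, receipt_date)
SELECT item_id,
       MIN(receipt_date) AS first_receipt_date
FROM   item_receipt_history
WHERE  receipt_date > DATE '1900-01-01'   -- skip the NULL-surrogate rows
GROUP  BY item_id;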

Related

Max Date in Qlikview Expression

I have a QlikView document with data loaded for various dates. The pivot table in the QlikView shows the change in values from the previous day, so for every selected day I need the previous date to pick values from.
So I have
Dates: 31/08/2021, 28/07/2021, 27/07/2021, 25/07/2021
Based on the date selected, I want the previous date. How do I do this in a QlikView expression?
Assuming your field is named DateField you could try this:
=Max({1<DateField = {"<$(=Min(DateField))"}>} DateField)
"<$(=Min(DateField))" will search for all dates less than the [minimum] date selected. The Max expression will then return the greatest date of this set.

Power Query - Subtract the earliest date in one column from the record-specific date in another column

Every month I download a set of data into a report. That data consists of multiple records, and each record has a record-specific date as well as the month-end report date on its data row.
I have used Power Query to upload all of these month-end reports. I want to use Power Query to compare the column of record dates with the earliest date in the column of report dates, to see if anybody has fiddled any data entry. The query table has the following headings:
Record ID | Record Date | Report Date
I've tried adding a custom column using the formula = if Record Date < List.Min(Report Date) then "Old" else "New"
This didn't work, and I've spent ages trying to get a solution. I've also tried using Groups to get the minimum value, but I lose all of the other columns, which I want to keep. Any help really appreciated.
You have to refer to the fields in [], so here [Report Date].
To pick a whole column use Source[Field], so here #"PriorStep"[Report Date].
The List.Min function is not pulling the value as a number, so you can't use < directly.
Insert Number.From in front of that calculation to convert it to a number.
Similarly, add Number.From in front of [Record Date], which is pulling as a date.
Combined code:
#"Added Custom" = Table.AddColumn(#"PriorStep", "Custom", each if Number.From([Record Date])<Number.From(List.Min(#"PriorStep"[Report Date])) then "Old" else "New")

Connecting BigQuery and Google Sheets - DATE parameter issue

Following 1, I started creating a spreadsheet which reads data from BigQuery, but I'm having an issue handling parameters related to date values.
In the first sheet, I created 2 cells with 2 parameters, the start and the end of a date interval, with proper values. Both cells are formatted as a "Date" value.
In the second sheet I configured the BigQuery connector; for this example I'm using a public dataset with dates: bigquery-public-data.utility_eu.date_greg
From the BigQuery connector wizard I added:
"STARTDATE" as "PARAMETERS!B1"
"ENDDATE" as "PARAMETERS!B2"
After this configuration, this is the resulting query:
SELECT
date,
date_str,
date_int
FROM `bigquery-public-data.utility_eu.date_greg`
WHERE date > DATE(#STARTDATE) AND date < DATE(#ENDDATE)
LIMIT 10
I'm getting an error directly from the editor with this message:
> Error BigQuery: No matching signature for function DATE for argument types: INT64. Supported signatures: DATE(TIMESTAMP, [STRING]); DATE(DATETIME); DATE(INT64, INT64, INT64) at [8:14]
As far as I can understand, the "date" cells are retrieved as a number, so the direct parse is not working. After a couple of tests, I understood that the given int value is the number I obtain if I change the cell format to "Number".
If you convert cell value from DATE to NUMBER you get this value:
01/05/2019 -> 43.586
31/05/2019 -> 43.616
What is this number? It is not milliseconds; it increases by 1 for each following day. In order to create the proper query that can parse this int, I need to understand what this int is (of course I could handle the cell as "text" and write the timestamp value directly, but I would prefer to keep the native date format so I can use the built-in calendar).
My consideration (with simple math) is that this number is the number of days since 30/12/1899, but it is very odd (also, every date BEFORE this day is always 0), so I'm asking you directly how to handle this value. Based on my understanding of when the counter starts (30/12/1899), I created this query, which adds the number retrieved from the cell:
SELECT *
FROM `bigquery-public-data.utility_eu.date_greg`
WHERE
date >= DATE_ADD(DATE("1899-12-30"), INTERVAL #DATAINIZIO DAY)
AND date <= DATE_ADD(DATE("1899-12-30"), INTERVAL #DATAFINE DAY)
It is working... but I think I'm doing a workaround that is not the proper way of doing this.
Also, is there any full documentation related to this BigQuery connection provided by Spreadsheet? Besides the presentation in 1, I'm unable to find any specific documentation.
Spreadsheets (Google, Excel, ...) store dates as the number of days passed since a starting date, with a fractional part representing the time of day.
From here: "Excel stores dates and times as a number representing the number of days since 1900-Jan-0, plus a fractional portion of a 24 hour day: ddddd.tttttt . This is called a serial date, or serial date-time."
Now, you have two ways to filter by date in your query:
In the query, you can use DATE_ADD to add your number of days (the cell value) to the base date. (Careful: DATE_ADD takes an INT, and the date value arrives as a float, so it needs prior casting.)
(Preferred) In your spreadsheet, use TEXT(cell, "yyyy-mm-dd") so you can then use DATE() in the BigQuery query; a sketch follows this list.
I use the second method: though you need that extra cell (unless you directly store the date as YYYY-MM-DD), it keeps the query cleaner than having a cast and DATE_ADD in there. It also saves you from the "1904 problem" explained in the link above.
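A minimal sketch of the preferred approach, assuming the STARTDATE/ENDDATE parameters are re-pointed at helper cells containing =TEXT(B1, "yyyy-mm-dd") and =TEXT(B2, "yyyy-mm-dd"); CAST is used here as a conservative choice since it accepts a "yyyy-mm-dd" string (PARSE_DATE('%Y-%m-%d', ...) works too):
SELECT
date,
date_str,
date_int
FROM `bigquery-public-data.utility_eu.date_greg`
WHERE date > CAST(#STARTDATE AS DATE)   -- #STARTDATE now arrives as a 'yyyy-mm-dd' string
AND date < CAST(#ENDDATE AS DATE)
LIMIT 10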
What is this number? It is not milliseconds, it increases by 1 every next day.
This is a so-called serial number, which represents the number of days since the "very beginning".
Google Sheets' date calendar effectively counts from a day zero of 1899-12-30 (serial 1 is 1899-12-31), which is the "very beginning" your own math uncovered.
In order to create the proper query that can parse this int, I need to understand what is this int
Armed with the above info, you can adjust your date calculation to be in sync with what BigQuery expects.
You mentioned that your fields are already in Date format; maybe you are doing extra parsing in your query.
Try to do it without the DATE functions.
Also, I found this other doc; it is not specifically about the connection, but it might be helpful: Getting info from Spreadsheets with BigQuery.

The group options for a date, time or date-time condition field must be a date group

I'm using VS2017 and trying to convert my reports from Delphi to ASP.NET, but I've hit a problem with some of them: my users can change their sort from the GUI, and I need to dynamically adjust the sort in code to match their selection.
To do this I use the following code:
ReportDocument.DataDefinition.Groups[i].ConditionField = ReportDocument.Database.Tables[CrystalReportDatasource].Fields[cField];
However, if cField is a DateField and the original group is a StringField group, I receive the following exception when I try to execute the above statement:
> The group options for a date, time or date-time condition field must be a date group options object.
Any idea how to fix that?
When you group on a date, Crystal needs to know what type of date grouping you wish to apply (e.g. every day, week, or month).
You need to take care of that aspect in code, or simply create a String formula that converts the date column to a string and group on that formula instead of on the raw date column.

Double aggregation in Tableau using LOD expressions

I am using Tableau to create a custom Google Analytics dashboard. I have a custom dimension named author in my Google Analytics view, and I would like to group the date of the first pageview by author and by month, with a counter.
I successfully get the date of the first pageview using MIN([Date]), but I can't figure out how to use LOD expressions to double-aggregate a calculation. I tried the following expression, but Tableau shows an error saying that the argument I'm trying to count is already an aggregation and can no longer be aggregated.
{INCLUDE [Author] : COUNT(MIN([Date]))}
What did I miss?
If you are trying to find the minimum of your [Date] field by your [Author] field, you might want to construct an LOD calculation to find the first date by author, like so:
//Creates calculated field for [First Date by author]
{FIXED [Author] : MIN([Date])}
You also asked about 'double aggregating'. Let's say, for example, that you wanted to find the minimum date by author, but also by a [Topic]. In this case, you could write:
//Creates calculated field for [First Date by author, by Topic]
{FIXED [Author],[PostTopic] : MIN([Date])}
It is also possible to nest FIXED statements within one another. Let's say, for example, that you wanted to show the maximum [PageViews] for a single month by [Author], and also wanted to limit the time span under consideration to the first 3 months that an author was publishing. In this case, you could write:
//Creates calculated field for [Most page views in a single month within first 3 months, by Author]
//Example assumes [Date] is a monthly (not daily) data
{ FIXED [Author], [Date] <= DATEADD('month', 3, [First Date by author] ) : MAX({FIXED [Author],[Date]:SUM([PageViews])})}