Is it possible to access forecast results in a calculated measure? - tableau-api

I am trying to find time-series outliers using Tableau forecasting. I need to compare the actual values against the 95% prediction interval in the forecast results to determine whether a point is an outlier.
I understand I can view the forecast results on the chart, but I want to use the forecast results in a calculated measure. Is there any way to do that? I cannot find any Tableau functions that retrieve forecast results.

Xuefei, it doesn't look like there is a way currently, at least going by the help page: https://help.tableau.com/v2019.1/pro/desktop/en-us/forecast_options.htm. If you haven't already considered it, integration with R is easy, and that way you could just model the series in R (accounting for additive/multiplicative components and trend/cyclicity/seasonality) and access the forecast values from R. Integration with Python is also supposed to be easy, although I haven't tried it myself.
Example of a Tableau calculated field that incorporates R code for a linear regression (this is the formula for the calc field in Tableau):
SCRIPT_REAL("
fv=log(.arg1)
fpri=.arg2
fit=lm(fv~fpri)
exp(fit$fitted)",SUM([Impressions]),SUM([CPM]))
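
The same route can approximate the 95% band that the question asks about. A minimal sketch, assuming the forecast package is installed on the Rserve host Tableau connects to; [Sales] is a stand-in for your measure, and the table calculation should be computed along the date axis:

SCRIPT_REAL("
  library(forecast)                   # assumed installed on the Rserve host
  y   <- .arg1                        # actual values, one per date mark
  fit <- ets(ts(y, frequency = 12))   # exponential smoothing; monthly seasonality assumed
  as.numeric(fitted(fit) + 1.96 * sd(residuals(fit)))  # approximate upper 95% band
", SUM([Sales]))

Saved as a field such as [Upper 95] (a hypothetical name), outliers can then be flagged with SUM([Sales]) > [Upper 95]. Note that this reproduces a forecast-style band rather than reading Tableau's own forecast values, which remain out of reach of calculated fields.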

Related

Row level average calculation tableau

I need to calculate an average in Tableau. One equation was sum(varA * varB) / sum(varB), which I converted into the row-level equation avg((varA * varB) / varB), and it is working fine. But now I have another equation, sum(varA * varB) / total(sum(varB)). Could someone help me convert this one into a row-level average calculation?
This is a classic scenario where you can leverage Level of Detail (LOD) expressions in Tableau.
Read here: https://help.tableau.com/current/pro/desktop/en-us/calculations_calculatedfields_lod_overview.htm
Your expression would look like: AVG({ FIXED [Segment], [Category] : SUM([Sales]) })
You will have greater control over row-level operations using LODs. Please read the link and try it; a sketch for your specific equation follows below.
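
For the equation you asked about, the TOTAL(SUM([varB])) denominator can be replaced with a table-scoped FIXED LOD, which works at row level. A minimal sketch, using your varA/varB placeholder names:

// Row-level field: the FIXED LOD pins the denominator to the grand total
// of varB over the whole data source, replacing TOTAL(SUM([varB])).
([varA] * [varB]) / { FIXED : SUM([varB]) }

Aggregating this field with SUM in the view reproduces sum(varA * varB) / total(sum(varB)).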

Qliksense: Compute median of grouped data

I'm facing an issue in QlikSense, trying to compute some statistical indicators (Percentiles, Quartiles, StdDev, Median etc.) on a dataset which is already grouped by the source.
My dataset is similar to the following, in which I have, for each combination of Week and Customer Age, the total number of purchases:
I want to show the median of Customer Age, but due to the structure of the dataset I can't use the built-in fractile or median functions, since they would produce something different.
Let's suppose I want to calculate the median age of people across all 3 weeks, i.e., I want to know the age of the people who have made 50% of my purchases.
To make the question clearer, here is the histogram:
In this case, the median I want to get is 24-26 years, since 50% of the total population falls within that range.
I found a useful reference here, but I am having trouble writing this formula in QlikSense:
https://mba-lectures.com/statistics/descriptive-statistics/603/relationship-between-quartiles-decile...
Thanks a lot in advance.
[EDIT]: This is my Data Model View:
[EDIT 2]: Here is my qvf with a dataset closer to the original one I'm using. As you can see, I can't get the correct result using your formula. In addition, I would like to use it to plot the trend of the median across weeks, but that doesn't seem to be possible (even with the modified version of the formula I pointed out in the comments).
If you want to calculate a median in such a scenario, you need a weighted median: basically, find the dimension value at which the cumulative share of purchases crosses the 50% mark:
// Weighted median: returns the Customer Age at which the running share
// of purchases first reaches 50% of the total.
Aggr(
    If(
        // the cumulative share up to and including this row reaches 50% ...
        (Rangesum(Above([# Purchases], 0, RowNo()))
            / Sum(TOTAL [# Purchases])) >= 0.5
        and
        // ... while the share of all previous rows is still below 50%
        (Rangesum(Above([# Purchases], 1, RowNo() - 1))
            / Sum(TOTAL [# Purchases])) < 0.5,
        [Customer Age]
    ),
[Customer Age])
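
For the per-week trend from EDIT 2, one possible variant is to partition both the running sum and the total by week. This is only a sketch, under the assumptions that the field is named [Week] and that Above()/RowNo() restart within each Week segment of the Aggr:

// Per-week weighted median (sketch; [Week] is an assumed field name).
Aggr(
    If(
        (Rangesum(Above([# Purchases], 0, RowNo()))
            / Sum(TOTAL <[Week]> [# Purchases])) >= 0.5
        and
        (Rangesum(Above([# Purchases], 1, RowNo() - 1))
            / Sum(TOTAL <[Week]> [# Purchases])) < 0.5,
        [Customer Age]
    ),
[Week], [Customer Age])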

How do I save forecasted values to use in calculations in Tableau?

I have a report that uses a couple of years of historical data and Tableau forecasting to show expected values for the next three months. I would like to be able to save these values, for a few reasons.
I need to do calculations on the forecasted values. Multiple different people will be using this forecast and will need different calculations. Some will need to calculate 70% of the forecasted value, and some will need to calculate different percentages. I'd like to have a parameter for them to enter the percent and be able to create a calculation using the parameter and the forecasted value.
I would like to be able to save forecasted values to later show the difference between what was forecasted and what actually happened. I understand you can export your data from your forecast and import that as another data source to show actuals vs forecast, but I need to do it all automatically with no manual intervention.
Is this possible? I haven't found any way to save the forecasted values or run calculations on them. I am using Tableau 10.5.
Tableau will never save the forecast data for you. It's probably easiest to create the forecast values in the data source and then report on them; it's then very simple to have "actual" values and "forecast" values side by side so you can compare.
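
A minimal sketch of that approach, assuming a scheduled R job appends forecast rows to the file (or table) Tableau reads; all file and column names here are hypothetical:

# Precompute forecast rows outside Tableau so actuals and forecasts
# sit side by side in the data source. Names are hypothetical.
library(forecast)

history <- read.csv("sales_history.csv")              # columns: month (ISO dates), amount
fit <- auto.arima(ts(history$amount, frequency = 12))
fc  <- forecast(fit, h = 3, level = 95)               # next three months, 95% interval

forecast_rows <- data.frame(
  month  = seq(as.Date(max(history$month)), by = "month", length.out = 4)[-1],
  amount = as.numeric(fc$mean),
  lo95   = as.numeric(fc$lower),
  hi95   = as.numeric(fc$upper),
  type   = "forecast"                                 # distinguishes forecast rows from actuals
)
# Write where Tableau connects; parameter-driven calcs like
# [amount] * [Pct Parameter] then work on the forecast rows too.
write.csv(forecast_rows, "sales_with_forecast.csv", row.names = FALSE)

Scheduling this (cron or similar) removes the manual export/import step, and keeping old forecast rows around lets you compare them against actuals later.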

Different Aggregation calculations of a measure using two dimensions in Tableau

It is a Tableau 8.3 Desktop Edition question.
I am trying to aggregate data using two different dimensions. So I want to aggregate twice: first sum over all the rows, and then multiply the results in a cumulative manner (so I can build a graph). How do I do that? OK, too vague; here are some more details:
I have a set of historical data. The columns are the date, the rows are the categories.
Easy part: I would like to sum all the rows.
Hard part: given those sums, I want to build a graph that, for each date, shows the product of all the sums from the earliest date up to that date.
In other words:
Take the sum of all rows, call it x_i, where i is the date.
For each date i find y_i such that y_i = x_0 * x_1 * ... * x_i (if there is missing data, consider it to be one)
Then show a line graph for the y values versus the date.
I have searched for a solution for this and tried to figure it out by myself, but failed.
Thank you very much for your time and help :)
You need n calculated fields (one per column you have), and you have to do the calculation manually:
y_i = sum(field0) * sum(field1) * ... * sum(field_i)
Basically, this is because you cannot iterate over columns. For Tableau, each column represents a different dimension or measure, so it won't consider that there is a logical order among them; it won't assume that column A comes before column B. It will assume A and B are different things.
Tableau works better with tables organized like databases. So if you have year columns, you should reorganize your data: eliminate all those columns and create a single field called 'Date' that identifies the value of your measure for that date. Yes, you will have fewer columns but far more rows; Tableau works better this way (for very good reasons).
Tableau 9.0 lets you do that directly. I have only watched a demo (it was launched yesterday), but I understand there is now an option to select those columns (in the Data Connection tab) and pivot them into a database format.
With that done, you can use the PREVIOUS_VALUE function to help you; a sketch follows below. I'm not with Tableau right now; as soon as I get to it I'll update this with the final answer, unless you take the lead and discover it yourself before then ;)
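
In the meantime, here is a minimal sketch of the cumulative product as a table calculation, assuming the reshaped data has a measure called [Amount] (a stand-in name) and the calculation is set to compute along the date:

// Running product: y_i = SUM(x_i) * y_(i-1). PREVIOUS_VALUE(1) seeds
// the product with 1 on the first row of the partition.
SUM([Amount]) * PREVIOUS_VALUE(1)

Plotting this field against the date gives the y_i = x_0 * x_1 * ... * x_i line described in the question; dates with null sums can be defaulted to 1 with IFNULL(SUM([Amount]), 1), per the question's missing-data rule.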

Non-linear regression models in PostgreSQL using R

Background
I have climate data (temperature, precipitation, snow depth) for all of Canada between 1900 and 2009. I have written a basic website and the simplest page allows users to choose category and city. They then get back a very simple report (without the parameters and calculations section):
The primary purpose of the web application is to provide a simple user interface so that the general public can explore the data in meaningful ways. (A list of numbers is not meaningful to the general public, nor is a website that provides too many inputs.) The secondary purpose of the application is to provide climatologists and other scientists with deeper ways to view the data. (Using too many inputs, of course.)
Tool Set
The database is PostgreSQL with R (mostly) installed. The reports are written using iReport and generated using JasperReports.
Poor Model Choice
Currently, a linear regression model is applied to annual averages of the daily data. The model is calculated within a PostgreSQL function as follows:
SELECT
  regr_slope( amount, year_taken ),      -- slope of the least-squares line
  regr_intercept( amount, year_taken ),  -- intercept of the line
  corr( amount, year_taken )             -- Pearson correlation coefficient
FROM
  temp_regression
INTO STRICT slope, intercept, correlation;
The results are returned to JasperReports using:
SELECT
year_taken,
amount,
year_taken * slope + intercept,  -- the fitted regression line at this year
slope,
intercept,
correlation,
total_measurements
INTO result;
JasperReports calls into PostgreSQL using the following parameterized analysis function:
SELECT
year_taken,
amount,
measurements,
regression_line,
slope,
intercept,
correlation,
total_measurements,
execute_time
FROM
climate.analysis(
$P{CityId},
$P{Elevation1},
$P{Elevation2},
$P{Radius},
$P{CategoryId},
$P{Year1},
$P{Year2}
)
ORDER BY year_taken
This is not an optimal solution, because it gives the false impression that the climate is changing at a slow but steady rate.
Questions
Using functions that take two parameters (e.g., year [X] and amount [Y]), such as PostgreSQL's regr_slope:
What is a better regression model to apply?
What CRAN packages provide such models? (Installable, ideally, using apt-get.)
How can the R functions be called within a PostgreSQL function?
If no such functions exist:
What parameters should I try to obtain for functions that will produce the desired fit?
How would you recommend showing the best fit curve?
Keep in mind that this is a web app for use by the general public. If the only way to analyse the data is from an R shell, then the purpose has been defeated. (I know this is not the case for most R functions I have looked at so far.)
Thank you!
The awesome pl/r package allows you to run R inside PostgreSQL as a procedural language. There are some gotchas, because R likes to think about data in terms of vectors, which is not how an RDBMS thinks. It is still a very useful package, as it gives you R inside PostgreSQL, saving you some of the round trips in your architecture.
And pl/r is apt-get-able for you as it has been part of Debian / Ubuntu for a while. Start with apt-cache show postgresql-8.4-plr (that is on testing, other versions/flavours have it too).
As for the appropriate modeling: that is a whole different ballgame. loess is a fair suggestion for something non-parametric, and you probably also want some sort of dynamic model, either ARMA/ARIMA or lagged regression. The choice of modeling is pretty critical given how politicized the topic is.
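
To make the pl/r route concrete, here is a minimal sketch of a function that returns loess-smoothed values. The function name and signature are illustrative; the column names mirror the temp_regression example above, and pl/r exposes unnamed arguments to the R body as arg1, arg2, ...:

-- A PL/R function: the body between the $$ markers is plain R.
CREATE OR REPLACE FUNCTION climate.loess_fit(float8[], float8[])
RETURNS float8[] AS $$
  # arg1 = years, arg2 = amounts
  fit <- loess(arg2 ~ arg1)   # non-parametric local regression
  fitted(fit)                 # smoothed value for each input year
$$ LANGUAGE plr;

-- Example call, feeding it the series in year order:
SELECT climate.loess_fit(array_agg(year_taken), array_agg(amount))
FROM (SELECT year_taken, amount FROM temp_regression ORDER BY year_taken) t;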
I don't think autoregression is what you want. Non-linear isn't what you want either, because that implies discontinuous data. You have continuous data; it just may not be a straight line. If you're just visualizing, and especially if you don't know what the shape is supposed to be, then loess is what you want.
It's also easy to get a confidence-interval band around the line if you plot the data with ggplot2:
library(ggplot2)                                        # provides qplot and stat_smooth
qplot(x, y, data = df, geom = 'point') + stat_smooth()  # loess fit with a 95% band by default
That will make a nice plot.
If you want a simpler graph in base R:
plot(x, y)                  # scatter of the raw data
lines(loess.smooth(x, y))   # overlay the loess-smoothed curve
May I propose a different solution? Just use PostgreSQL to pull the data, feed it into some R script and finally show the results. The R script may be as complicated as you want as long as the user doesn't have to deal with it.
You may want to have a look at rapache, an Apache module that allows running R scripts in a webpage.
A couple of videos illustrating its use:
Hello world application
Jeffrey Horner's presentation of RApache + links to working apps
In particular, check how the San Francisco Estuary Institute Web Query Tool allows the user to interact with the parameters.
As for the regression, I'm not an expert, so I may be saying something extremely stupid... but wouldn't something like a LOESS regression be OK for this?