I have a bunch of daily change % data. I would like to calculate the cumulative change, which should just be (1 + change) * previous day's value, in a chart in Tableau.
Seems simple enough, right? I can do it in a few seconds in Excel, but I've tried for hours to get it to work in Tableau and cannot do it.
My thought was that I can create a column that is (1+daily change%), then try to do a compound product. However, I can't seem to get it to work.
I can't attach any files here so I pasted the data, along with a column that is "cum change", which is what I would like the calculation to be.
Thank you much in advance!
Date Daily Change Cum Change
4/1/2015 0.47% 1
4/2/2015 0.56% 1.0056
4/3/2015 -0.72% 0.99835968
4/6/2015 -0.56% 0.992768866
4/7/2015 -0.80% 0.984826715
4/8/2015 0.44% 0.989159952
4/9/2015 -0.66% 0.982631497
4/10/2015 0.99% 0.992359549
4/13/2015 0.92% 1.001489256
4/14/2015 0.73% 1.008800128
4/15/2015 0.95% 1.018383729
4/16/2015 0.42% 1.022660941
4/17/2015 0.52% 1.027978778
4/20/2015 0.02% 1.028184373
4/21/2015 0.56% 1.033942206
4/22/2015 0.35% 1.037561004
4/23/2015 -0.34% 1.034033296
4/24/2015 0.18% 1.035894556
4/27/2015 0.61% 1.042213513
4/28/2015 0.46% 1.047007695
4/29/2015 0.94% 1.056849568
Create a calculated field:
IF INDEX() = 1
THEN 1
ELSE
(1 + AVG([Daily Change])) * PREVIOUS_VALUE(1)
END
The condition checking to see if it's the first row of the partition (INDEX() = 1) is necessary to ensure that the first value of the field is a 1. After that, you can just use the self-referential PREVIOUS_VALUE() to get the previous value of this same calculation.
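If you prefer to avoid the self-referential table calculation, an equivalent approach (a sketch only, not tested against your data) is to turn the running product into a running sum of logarithms:
EXP(RUNNING_SUM(LN(1 + AVG([Daily Change]))))
Note that this version compounds from the very first row, so 4/1/2015 would show 1.0047 rather than the 1 in your expected output. Either way, remember to set the table calculation's Compute Using to the Date field.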
I have a list of date ranges for the past 8 quarters, given by the function below:
q) findLastYQuarters:{reverse("d"$(-3*til y)+m),'-1+"d"$(-3*-1+til y)+m:3 bar"m"$x}[currentDate;8]
q) findLastYQuarters
2020.01.01 2020.03.31
2020.04.01 2020.06.30
2020.07.01 2020.09.30
2020.10.01 2020.12.31
2021.01.01 2021.03.31
2021.04.01 2021.06.30
2021.07.01 2021.09.30
2021.10.01 2021.12.31
I need to produce a separate list that labels each item in this list in a specific format; the second list would need to be
1Q20,2Q20,3Q20,4Q20,1Q21,2Q21,3Q21,4Q21
This code needs to be able to run on its own, so how can I take the first list as an input and produce the second list? I thought about casting the latter date in each range as a month and dividing it by 3 to get the quarter, then extracting the year, but I couldn't figure out how to actually implement that. Any advice would be much appreciated!
I'm sure there are many ways to solve this; a function like f, defined below, would do the trick:
q)f:{`$string[1+mod[`month$d;12]%3],'"Q",/:string[`year$d:x[;0]][;2 3]}
q)lyq
2020.01.01 2020.03.31
2020.04.01 2020.06.30
2020.07.01 2020.09.30
2020.10.01 2020.12.31
2021.01.01 2021.03.31
2021.04.01 2021.06.30
2021.07.01 2021.09.30
2021.10.01 2021.12.31
q)f lyq
`1Q20`2Q20`3Q20`4Q20`1Q21`2Q21`3Q21`4Q21
Figured it out.
crop:findLastYQuarters;
crop[0]:crop[0][1];
crop[1]:crop[1][1];
crop[2]:crop[2][1];
crop[3]:crop[3][1];
crop[4]:crop[4][1];
crop[5]:crop[5][1];
crop[6]:crop[6][1];
crop[7]:crop[7][1];
labels:()
labelingFunc:{[r] temp:"." vs string["m"$r]; labels,:enlist((string(("J"$temp[1])%3)),"Q",temp[0][2,3])};
labelingFunc each crop;
labels
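For what it's worth, the same idea can be written a bit more compactly (a sketch based on taking the end date of each range, only lightly checked against the sample above):
quarterLabel:{t:"." vs string"m"$x 1;(string("J"$t 1)%3),"Q",2_t 0}
`$quarterLabel each findLastYQuarters   / should give `1Q20`2Q20 ... `4Q21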
I have two timetables, each with 4 columns, of which the first 2 are of particular interest to me: the first column is a date and the second is an hour.
How can I find which observations (by date and hour) are in timetable 1 but not in timetable 2, and then drop those observations from timetable 1?
So for example, just by looking I realized that timetable1 includes the day 25/05/2015 with hours 1 and 2, but timetable2 does not, so I would like to drop those observations from timetable1.
I tried using the command groups_timetable1 = findgroups(timetable1.Date, timetable1.Hour); but unfortunately this does not help much with distinguishing between observations.
Thank you!
Call ismember to find one set of data in another.
To match whole records (multiple columns taken together as one row), call ismember(..., 'rows').
For example:
baseline=[
100, 2.1
200, 7.5
120, 11.0
];
isin=ismember(baseline,[200, 7.5],'rows');
pos=find(isin)
If you have date/time strings or datetime objects, convert them to numerical values first, for example by calling datenum or posixtime.
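Applied to the question's timetables, a minimal sketch could look like the following (assuming the date and hour are reachable as timetable1.Date and timetable1.Hour; adjust to your actual variable names, and note the row times of a timetable are also available via timetable1.Properties.RowTimes):
% Build numeric [date, hour] keys for both timetables
keys1 = [datenum(timetable1.Date), timetable1.Hour];
keys2 = [datenum(timetable2.Date), timetable2.Hour];
% Keep only the rows of timetable1 whose (date, hour) pair also appears in timetable2
keep = ismember(keys1, keys2, 'rows');
timetable1 = timetable1(keep, :);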
You can use the timetable method innerjoin to do this. Like so:
% Fabricate some data
dates1 = datetime(2015, 5, ones(10,1));
hours1 = (1:10)';
timetable1 = timetable(dates1(:), hours1, rand(10,1), rand(10,1), ...
'VariableNames', {'Hour', 'Price', 'Volume'});
% Subselect a few rows for timetable2
timetable2 = timetable1([1:3, 6:10],:);
% Use innerjoin to pick rows where Time & Hour intersect:
innerjoin(timetable1, timetable2, 'Keys', {'Time', 'Hour'})
By default, the result of innerjoin contains the table variables from both input tables - that may or may not be what you want.
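If you only want timetable1's own variables back, one option (a sketch, not part of the original answer) is to use innerjoin's index outputs to subset timetable1 directly:
% innerjoin also returns the row indices of the left input that matched
[~, il] = innerjoin(timetable1, timetable2, 'Keys', {'Time', 'Hour'});
timetable1Matched = timetable1(il, :);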
This question is a slightly varied version of this one...
Now I'm using Measures instead of Calculated columns and the date is static instead of having it based on a dropdown list.
Here's the Power BI test .pbix file:
https://drive.google.com/open?id=1OG7keqhdvDUDYkFQFMHyxcpi9Zi6Pn3d
This printscreen describes what I'm trying to accomplish:
Basically the date in the P6 Update table is used as a cut date and will be fixed/static. It's imported from an Excel sheet where the user can customize it however they want.
Here's what should happen when a matching row in the Test data table is found for the P6 Update date:
column Earned Daily - must have its value summed with the next row's value, if there is one;
column Earned Cum - must grab the next row's value;
all previous rows should remain intact, that is, their values won't change;
all subsequent rows must have their values set to 0.
So for example:
If P6 Update is 1-May-2018, this is the expected result:
1-May 7,498 52,106
2-May 0 0
If P6 Update is 30-Apr-2018, this is the expected result:
30-Apr 13,173 50,699
1-May 0 0
2-May 0 0
If P6 Update is 29-Apr-2018, this is the expected result:
29-Apr 11,906 44,608
30-Apr 0 0
1-May 0 0
2-May 0 0
and so on...
Hope this makes sense.
This is easier in Excel, but trying to do this in Power BI is making me go nuts.
I will ignore previously asked related questions and start from scratch.
First, create a measure:
Current Earn =
CALCULATE (
SUM( 'Test data'[Value]),
'Test data'[Act Rem] = "Actual Units",
'Test data'[Type] = "Current"
)
This measure will be used in other measures, to save you from typing all these conditions ("Actual Units" and "Current") again and again. It's good practice to re-use measures in other measures: it saves work and makes the code cleaner and easier to refactor.
Create another measure:
Cut Date = SELECTEDVALUE('P6 Update'[Date])
We will use this measure whenever we need the cut-off date. Please note that it does not have to be hard-coded: if the P6 Update table contains a list of dates, you can create a drop-down slicer from them and choose the cut-off date dynamically. The formula will still work properly.
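As a side note (my own addition, not something the question requires): SELECTEDVALUE returns blank when more than one date is in context, so if the P6 Update table could ever hold several rows and you simply want the latest date as the cut-off, a slightly more defensive variant might be:
Cut Date = SELECTEDVALUE ( 'P6 Update'[Date], MAX ( 'P6 Update'[Date] ) )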
Create a third measure:
Next Earn =
VAR Cut_Date = [Cut Date]
VAR Current_Date = MAX ( 'Test data'[Date] )
VAR Next_Date = Current_Date + 1
VAR Current_Earn = [Current Earn]
VAR Next_Earn = CALCULATE ( [Current Earn], 'Test data'[Date] = Next_Date )
RETURN
SWITCH (
TRUE,
Current_Date < Cut_Date, Current_Earn,
Current_Date = Cut_Date, Current_Earn + Next_Earn,
BLANK ()
)
I am not sure if "Next Earn" is a good name for it; hopefully you will find a more intuitive name. The way it works: we save all necessary inputs into variables, and then use the SWITCH function to define the results. Hopefully it's self-explanatory. (Note: if you need 0 instead of blank for dates after the cut date, replace BLANK() with 0.)
Finally, we define a measure for the cumulative earn. It does not require any special logic, because the previous measure takes care of it properly:
Cum Earn =
VAR Current_Date = MAX('Test data'[Date])
RETURN
CALCULATE(
[Next Earn],
FILTER(ALL('Test data'[Date]), 'Test data'[Date] <= Current_Date))
Result:
I'm new to Crystal Reports and use CR v10 with SQL Server 2000.
For the first record, i.e. the Beginning Balance, the calculation is sum(field) from the input date, which I have calculated in the stored procedure as BegDateSum.
But for the rest of the records under a group, the calculation should be previous(balance) + IN + OUT.
Sample has been given:
Date Doc Descrip IN OUT Balance
Group Header-------- Beginning Balance-------------- 50 <---- sum(field) from my input date
3/2/2012 A -1 0 49 <-- (50+(-1)+0)
4/2/2012 B -2 0 47 <-- (49+(-2)+0)
5/2/2012 C 0 3 50
6/2/2012 D -2 3 51
How do I achieve this?
I am not sure whether to use a running total and, if so, how to set it up.
A running total field won't work in this case; running totals are designed to add up (or count, or average, etc.) one field and give you the sub-totals automatically. But we can use some custom formulas that will give the results you need. Assuming that your initial 50 is a static value, you would set a variable to that amount and then add the IN and OUT values as you go along, printing the result at each row.
First, initialize the value in the report header with a formula like:
WhilePrintingRecords;
Global NumberVar Balance;
Balance := 50;
""; //print nothing on the screen
Then, the formula to calculate and show the new balance goes in the details section where the data is:
WhilePrintingRecords;
Global NumberVar Balance;
Balance := Balance + {tableName.IN} + {tableName.OUT};
The last line both calculates the new value, and tells what the result of the formula should be.
If the "50" is calculated somehow, then that will have to be done before the formula that calculates the new balance. If it is based off of the first record read in, you'll want to use a formula that includes If PreviousIsNull({tableName.Balance}) Then ..., that is usually a good indicator of the first record in the data set (unless that field can be null).
I'm looking for a function or solution to the following:
For a chart in SQL Server Reporting Services I need to multiply the values from a column A. For summation I would use =SUM(COLUMN_A) in the chart, but what can I use for multiplication? I was not able to find a solution so far.
Currently I am calculating the value of the stacked column as follows:
=ROUND(SUM(Fields!Value_Is.Value)/SUM(Fields!StartValue.Value),3)
Instead of SUM I need something that multiplies the values.
Something like this:
=ROUND(MULTIPLY(Fields!Value_Is.Value)/MULTIPLY(Fields!StartValue.Value),3)
EDIT #1
Okay, I tried to get this thing running.
The expression for the chart looks like this:
=Exp(Sum(Log(IIf(Fields!Menge_Ist.Value = 0, 10^-306, Fields!Menge_Ist.Value)))) / Exp(Sum(Log(IIf(Fields!Startmenge.Value = 0, 10^-306, Fields!Startmenge.Value))))
If I calculate my 'needs' manually, I should get the following result:
In my SQL report I get the following result:
To make it easier, these are the raw values:
and the chart can be grouped by CW, CQ or CY
(The values in the first pictures are Sum aggregates of the raw values by FertStufe.)
EDIT #2
Tried your expression, which results in this:
Just to make it clear: the values in the column
=Value_IS / Start_Value
in the first picture are multiplied against each other:
0,99474 x 1,0000 x 0,59041 = 0,58730
Diffusion, calendar week 44 sums:
Start value: 1900,00  Value Is: 1890,00  ==  yield: 0,99474
Waffer unbestrahlt, calendar week 44 sums:
Start value: 620,00  Value Is: 620,00  ==  yield: 1,0000
Pellet, calendar week 44 sums:
Start value: 271,00  Value Is: 160,00  ==  yield: 0,59041
yield Diffusion x yield Waffer x yield Pellet = needed value in chart = 0,58730
EDIT #3
The raw values look like this:
The chart is grouped, as in the image, on these fields:
CY (Calendar year), CM (Calendar month), CW (Calendar week)
You can download the data as xls here:
https://www.dropbox.com/s/g0yrzo3330adgem/2013-01-17_data.xls
The expression I use (copied/pasted from the edit window):
=Exp(Sum(Log(Fields!Menge_Ist.Value / Fields!Startmenge.Value)))
I've exported the whole report result to Excel; you can get it here:
https://www.dropbox.com/s/uogdh9ac2onuqh6/2013-01-17_report.xls
It's actually a workaround, but I am pretty sure it is the only solution to this infamous problem :D
This is how I did it:
the product of a set of values equals Exp(∑(Log(X))), so what you should do is:
Exp(Sum(Log(Fields!YourField.Value)))
Who said math was worth nothing? =D
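For the record, the identity behind the trick (valid for strictly positive values) is:
Product(x1, ..., xn) = Exp(Log(x1) + ... + Log(xn))
For example, Exp(Log(2) + Log(3) + Log(4)) = Exp(Log(24)) = 24. That is why SUM combined with Log/Exp can stand in for the product aggregate that SSRS lacks; zeros (and negative values) need special handling, as shown below.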
EDIT:
Corrected the formula.
By the way, it's tested.
Addressing Ian's concern:
Exp(Sum(Log(IIf(Fields!YourField.Value = 0, 10^-306, Fields!YourField.Value))))
The idea is to replace 0 with a very small number. Just an idea.
EDIT:
Based on your updated question this is what you should do:
Exp(Sum(Log(Fields!Value_IS.Value / Fields!Start_Value.Value)))
I just tested the above code and got the result you hoped for.