I want to calculate the total number of simultaneous calls (calls that were active during the same time), so I need to write a DAX formula. I have two dates, the start and end date of each call, so I need a counter that calculates, for each call, the number of simultaneous calls.
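To make the logic concrete, here is a minimal sketch of the overlap count I am after, written in Python with made-up columns and data rather than the DAX I actually need:

```python
import pandas as pd

# Hypothetical call table: one row per call with its start and end time
calls = pd.DataFrame({
    "call_id": [1, 2, 3],
    "start": pd.to_datetime(["2023-01-01 09:00", "2023-01-01 09:10", "2023-01-01 09:40"]),
    "end":   pd.to_datetime(["2023-01-01 09:30", "2023-01-01 09:20", "2023-01-01 09:50"]),
})

def simultaneous(row, df):
    # Two calls overlap when each one starts before the other ends
    overlap = (df["start"] < row["end"]) & (df["end"] > row["start"])
    return int(overlap.sum()) - 1  # exclude the call itself

calls["simultaneous_calls"] = calls.apply(simultaneous, axis=1, df=calls)
print(calls)
```

The DAX version would need to apply the same condition (the other call starts before this call ends and ends after this call starts) as a row-by-row count.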
I am an absolute Tableau beginner, so forgive my lack of proper terminology.
Context
To give some context to the problem, think of the dataset as the balances and current interest rates of two different loans for which we are trying to calculate a weighted average cost of funds at any point in time, while retaining the ability to filter on Program (specific loan).
I have a single dataset that looks like:
The Balance field is used as a running sum, i.e. to get the actual balance as of 4/30/2022, you would sum the column across all Date values on or before 4/30/2022.
The Rate field is the opposite: it represents the discrete interest rate as of the Date. Thus, it cannot be summed.
Each data point is specific to a specific loan, or Program.
So to get the interest rate of Program A as of 4/30/2022, you would simply grab the Rate value of the row where Date = 4/30/2022 and Program = A, or 5.30%. Sums are fine here, since the value of Rate is never repeated for a single Program and Date combo, but we cannot use a running sum.
On the other hand, to get the balance of Program A as of 4/30/2022, you would need to add (running sum) the Balance values for all rows where Date <= 4/30/2022 and Program = A, or 10,000 + -2500 + -2500 + -2500 = 2500.
Problem / Need
I need a report (or whatever it's called in Tableau) with the following:
Date as a column
Measures as rows
This report would NOT include Program as a row or column, but would include it as a filter.
In this report, I need a Weighted Average Cost of Funds measure.
This is effectively the average Rate weighted by the running sum of Balance across the Programs included in the filter, for any given Date in the columns.
In other words, by `Date`: the latest `Rate` for each `Program`, times that `Program`'s running sum of `Balance`, divided by the running sum of `Balance` across all `Program`s included in the filter.
Here's an example in Excel:
Here's an example if we were to exclude Program A:
And here's an example if we were to exclude Program B:
Finally, here's the formulas underneath everything in the Excel example:
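To spell out the math outside of Tableau, here is a rough sketch of the calculation in Python/pandas, using placeholder numbers in the same shape as the dataset above (this is only meant to illustrate the intended result, not the Tableau solution):

```python
import pandas as pd

# Placeholder data shaped like the dataset described above
df = pd.DataFrame({
    "Date":    pd.to_datetime(["2022-03-31", "2022-04-30", "2022-03-31", "2022-04-30"]),
    "Program": ["A", "A", "B", "B"],
    "Balance": [10000, -2500, 20000, -1000],
    "Rate":    [0.0525, 0.0530, 0.0410, 0.0415],
})

# The Program filter: include or exclude programs here
included = df[df["Program"].isin(["A", "B"])]

# Running sum of Balance per Program (rows = Date, columns = Program)
running_balance = (included.pivot_table(index="Date", columns="Program",
                                        values="Balance", aggfunc="sum")
                           .cumsum())

# Rate per Program per Date (never repeated, so a plain sum just picks it up)
rate = included.pivot_table(index="Date", columns="Program",
                            values="Rate", aggfunc="sum")

# Weighted average cost of funds by Date:
# sum(rate * running balance) / sum(running balance) across included Programs
wacf = (rate * running_balance).sum(axis=1) / running_balance.sum(axis=1)
print(wacf)
```

In Tableau this would presumably involve a running-sum table calculation on Balance, but the pandas version above is only there to pin down the target numbers.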
I am trying to create an Excel sheet that calculates average cases per hour based on the start time, end time, and total case count of an order. For example: Employee A takes a 200-case order at 0700 and completes it by 0745. Employee B takes a 350-case order at 0715 and completes it at 0920. Is it possible to create a spreadsheet with formulas that calculate this once the required data is entered?
I'm looking for the pick rate per order, and then I would like to average it over an 8-hour day for each employee.
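To show the arithmetic I have in mind (sketched in Python only to pin down the numbers; the spreadsheet would do the same with the start time, end time, and case count cells):

```python
from datetime import datetime

def cases_per_hour(start, end, cases):
    # Pick rate for one order: case count divided by elapsed hours
    hours = (end - start).total_seconds() / 3600
    return cases / hours

# Examples from above (the date itself is arbitrary)
emp_a = cases_per_hour(datetime(2024, 1, 2, 7, 0),  datetime(2024, 1, 2, 7, 45), 200)  # ~266.7 cases/hr
emp_b = cases_per_hour(datetime(2024, 1, 2, 7, 15), datetime(2024, 1, 2, 9, 20), 350)  # 168.0 cases/hr
print(round(emp_a, 1), round(emp_b, 1))
```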
When I apply Month/Year to Cases or Deaths from my data, the values explode. For Cases it goes from approximately 48 million to over 1 billion, and for Deaths it goes from about 700 thousand to over 22 million. However, when I try the same thing with Initial Claims or the Stringency Index, my values remain correct. I'm trying to find the month-over-month percentage change, by the way, and I'm using the Date column. I only select 2020 and 2021 in the Year filter.
What I'm asking about is Sheet 21.
Link to workbook: https://public.tableau.com/app/profile/nilajah.rivers/viz/CoronaVirusProject_16323687296770/Sheet21
Your problem is that the data points are daily cumulative deaths. If you change the date aggregation to anything other than days, Tableau will default to summing the numbers for all the days in the month. This will give the wrong result, obviously.
If you want to show the correct total deaths or cases regardless of the time aggregation (months, days, weeks etc.) then you could use the New Case or New Death numbers plus a running sum table calculation. This will always give the correct total for the time period.
Table calculations will also allow automatic calculation of the period to period % change from the same data fields.
This is a common problem when working with datasets that ship pre-calculated aggregations. Tableau doesn't need them, since it can dynamically aggregate a field over any given time period, but it is easy to forget which fields hold pre-aggregated data and which hold raw data. Pre-aggregated fields assume a particular time period and can't be used for a different period without disentangling that assumption, which is unnecessary here since you also have the raw data (the daily new deaths/cases).
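A small sketch of the difference, with made-up numbers (pandas here, but the same idea applies to a RUNNING_SUM table calculation in Tableau over the New Death/New Case fields):

```python
import pandas as pd

# Made-up daily data: a pre-aggregated cumulative column and the raw daily column
days = pd.date_range("2020-03-01", periods=6, freq="D")
daily = pd.DataFrame({
    "cumulative_deaths": [10, 25, 45, 70, 100, 140],
    "new_deaths":        [10, 15, 20, 25, 30, 40],
}, index=days)

monthly = daily.resample("M").sum()

# Summing the cumulative column re-counts every earlier day: 390 instead of 140
print(monthly["cumulative_deaths"])

# Summing the raw daily column per month, then running-summing, gives the true 140
print(monthly["new_deaths"].cumsum())
```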
I am using a MATLAB toolbox, specifically, https://uk.mathworks.com/matlabcentral/fileexchange/32882-armax-garch-k-sk-toolbox-estimation-forecasting-simulation-and-value-at-risk-applications
To insert data into the functions, the author defines a data matrix and then uses data(:,3) for the third column, which represents a series.
I would like to do the same, but with data(:,3) lagged by one period.
My question: is there a way to write something in MATLAB that lags the series by one period, so that the lagged series can be inserted into the function?
If I understand correctly, you would like to lag a series by one time period, where the period is whatever frequency the data is collected at; for example, with daily data you would lag the series by one day.
If so, you can use the lagmatrix function.
To provide an example,
LAGGEDX = lagmatrix(data(:,3), 1);  % shift the third column down by one period
This lags your data(:,3) series by one period (one day if it is daily data); you could then insert LAGGEDX in place of data(:,3). Note that lagmatrix fills the position created by the shift with NaN, so the first element of LAGGEDX will be NaN.
Background: I’m doing an analysis of call detail record (CDR) data in order to segment customers with respect to their call duration, time of call (holiday or non-holiday call, business or non-business call), subscriber age group, and gender. The data comes from two tables: cdr (with card_number, service_key, calling, called, start_time, clear_time, and duration columns) and subscriber_detail (with subscriber_name, subscriber_address, DOB, and gender columns).
I have designed the OLAP schema as given below.
Call_date holds the date of the call (year, month, and day). Call_time is the time the call happened, in seconds.
Question: if we keep call_time in seconds, the time dimension has 86,400 distinct values for each day (possibly a curse of dimensionality), so we are thinking of reducing its dimensionality by using a 30-second pulse (telecom operators charge on the basis of pulses, and 30 seconds is the pulse duration in our context). First question: is replacing the time with the pulse duration the best approach? Second: if one subscriber makes more than one call within the range of a single pulse, it may cause a problem, e.g. the first call starts at 21:01:00 and ends at 21:01:05, and the second call starts at 21:01:15 and ends at 21:01:20. How can this type of problem be resolved?
If I were you, I would divide the day into 10-minute slots and use a linked list to store the multiple call durations that fall within a given slot, so the time dimension has only 144 members (which restricts drill-down to 10-minute granularity).
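As a rough illustration of the slot idea (sketched in Python with a hypothetical timestamp; not tied to any particular OLAP tool):

```python
from datetime import datetime

SLOT_MINUTES = 10  # 144 slots per day

def time_slot(ts: datetime) -> int:
    # 10-minute slot index (0..143) for a call start time
    return (ts.hour * 60 + ts.minute) // SLOT_MINUTES

# Both calls from the question start in the same 21:00-21:10 slot (index 126),
# which is why multiple call durations must be stored per slot
print(time_slot(datetime(2023, 1, 1, 21, 1, 0)))   # 126
print(time_slot(datetime(2023, 1, 1, 21, 1, 15)))  # 126
```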
I would keep start_call_time, end_call_time, and elapsed_call_time in seconds.
Having elapsed_time in seconds does not mean the cube needs a dimension with 86,400 members; you could set up a 'ranged/banded' dimension, i.e., a dimension built from intervals instead of instants. This is possible, for example, with icCube (www).
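For instance, the banding could map each elapsed time in seconds to a 30-second interval label, along these lines (a generic sketch, not icCube syntax):

```python
def duration_band(seconds: int, band_size: int = 30) -> str:
    # Map an elapsed call time to an interval member such as "0-30s",
    # so the dimension holds a handful of bands instead of 86,400 instants
    lower = (seconds // band_size) * band_size
    return f"{lower}-{lower + band_size}s"

print(duration_band(5))    # "0-30s"
print(duration_band(65))   # "60-90s"
print(duration_band(125))  # "120-150s"
```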