LibreOffice Calc formula: find the value for given data based on known data

I have a LibreOffice Calc spreadsheet with a table that calculates the cost of two services. I want to calculate the cost of service #2 based on known data. The known data are the rates (0,80 and 0,68; these are fixed) and the total incl. VAT 21%. The variable data are in column C (unknown), and C2 is always equal to C3. Based on the known data, I want to split the "Total incl. VAT" amount into two separate parts: the service #1 cost and the service #2 cost. In particular, I want to know the service #2 amount with VAT (D3 + VAT). Can someone show a formula for this?
+---+------------+---------------+-----------------+----------+-----------------+
| | A | B | C | D | E |
+---+------------+---------------+-----------------+----------+-----------------+
| 1 | services | Rate (eur/m3) | volume, m3 | Sum(eur) | service #2 cost |
| 2 | service #1 | 0,80 | 71,00 | 56,80 | |
| 3 | service #2 | 0,68 | 71,00 | 48,28 | |
| 4 | | | Subtotal: | 105,08 | |
| 5 | | | VAT 21% | 22,07 | |
| 6 | | | Total incl. VAT | 127,15 | D3 value + VAT |
+---+------------+---------------+-----------------+----------+-----------------+
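Since C2 is always equal to C3, both the subtotal and the total incl. VAT split between the two services in the same proportion as the rates, so the VAT never has to be separated out: service #2 incl. VAT = Total incl. VAT × 0,68 / (0,80 + 0,68). A minimal sketch of the formula, assuming the total incl. VAT (127,15) sits in D6 and the result goes in E6:

=D6*0,68/(0,80+0,68)

With the sample numbers this returns about 58,42, the same as D3 plus 21% VAT (48,28 × 1,21 ≈ 58,42). If your locale uses a dot as the decimal separator, write the constants as 0.68 and 0.80 instead.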

Related

How to Decompose Global System Metrics to a Per Endpoint Basis on a Webserver

I'm implementing a metrics system for a backend API at scale and am running into a dilemma: using statsd, the application itself logs request metrics on a per-endpoint basis, but the CPU metrics are at the global server level. Currently each server has 10 threads, meaning 10 requests can be processed at once (yeah, yeah, it's actually serial).
For example, if we have two endpoints, /user and /item, the statsd implementation differentiates statistics (DB/Redis I/O, etc.) per endpoint. However, if we sample Linux system metrics every N seconds, those statistics are not inherently separated by endpoint.
I believe that it would be possible, assuming that your polling time ("N seconds") is small enough and that you have enough diversity within your requests, to decompose the global system metrics to create an estimate at the endpoint level.
Imagine a scenario like this:
note: we'll say a represents a GET to /user and b represents a GET to /item
|------|------|------|------|------|------|------|------|------|------|
| t1 | t2 | t3 | t4 | t5 | t6 | t7 | t8 | t9 | t10 |
|------|------|------|------|------|------|------|------|------|------|
| a | b | b | a | a | b | b | a | b | b |
| b | a | b | | b | a | b | | b | |
| a | b | b | | a | a | b | | a | |
| a | | b | | b | a | a | | a | |
| a | | b | | a | a | b | | | |
| | | | | a | | a | | | |
|------|------|------|------|------|------|------|------|------|------|
At every timestep, t (i.e. t1, t2, etc.), we also take a snapshot of our system metrics. I feel like there should be a way (possibly through a sort of signal decomposition) to estimate the avg load each a/b request takes. Now, in practice I have ~20 routes so it would be far more difficult to get an accurate estimate. But like I said before, provided your requests have enough diversity (but not too much) so that they overlap in certain places like above, it should be at the very least possible to get a rough estimate.
I have to imagine that there is some name for this kind of thing or at the very least some research or naive implementations of this method. In practice, are there any methods that can achieve these kinds of results?
Note: it may be more difficult when considering that requests may bleed over these timesteps, but almost all requests take <250ms. Even if our system stats polling rate is every 5 seconds (which is aggressive), this shouldn't really cause problems. It is also safe to assume that we would be achieving at the very least 50 requests/second on each server, so sparsity of data shouldn't cause problems.
I believe the answer is a sum decomposition through linear equations. If we say that a system metric, for example the CPU, is a function CPU(t), then it is just a matter of solving the following set of equations for the posted example:
4a + b = CPU(t1)
a + 2b = CPU(t2)
5b = CPU(t3)
a = CPU(t4)
4a + 2b = CPU(t5)
4a + b = CPU(t6)
2a + 4b = CPU(t7)
a = CPU(t8)
2a + 2b = CPU(t9)
b = CPU(t10)
Now, there will be more than one way to solve this system (e.g. a = CPU(t8) and a = CPU(t4) may give different values), but if you take the average of a and b across the corresponding solutions, you should get a pretty solid estimate.
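A rough sketch of this in Python/NumPy: rather than averaging the individual solutions, fit a and b against all ten equations at once by least squares. The request counts come from the table above; the CPU readings below are invented purely for illustration.

import numpy as np

# One row per sampling window t1..t10; columns are the number of in-flight
# requests for route a (/user) and route b (/item), read off the table above.
A = np.array([
    [4, 1],   # t1
    [1, 2],   # t2
    [0, 5],   # t3
    [1, 0],   # t4
    [4, 2],   # t5
    [4, 1],   # t6
    [2, 4],   # t7
    [1, 0],   # t8
    [2, 2],   # t9
    [0, 1],   # t10
])

# Hypothetical CPU samples for the same windows (made-up numbers).
cpu = np.array([45.0, 22.0, 41.0, 9.0, 52.0, 46.0, 50.0, 10.0, 26.0, 8.0])

# Least-squares estimate of the average CPU cost per in-flight request, per route.
per_request_cost, *_ = np.linalg.lstsq(A, cpu, rcond=None)
print(dict(zip(["/user", "/item"], per_request_cost)))

With ~20 routes the approach is identical (the matrix just gains more columns), and it stays well-posed as long as the request mix varies enough across windows that the columns are not collinear.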

How to get non-aggregated measures?

I calculate my metrics with SQL and publish the resulting table to Tableau Server. Afterward, I use this data source to create charts and dashboards.
For one analysis, I have already calculated the measures per day with SQL. When I use the resulting table in Tableau, it aggregates these measures with SUM by default. However, I don't want a SUM or AVG of the averages, or a SUM of the percentiles.
What I want is the result I would get if I didn't select the date dimension and didn't GROUP BY date in SQL, as attached below.
Here is the query:
SELECT
    -- date,
    COUNT(DISTINCT id) AS count_of_id,
    AVG(timediff_in_sec) AS avg_timediff,
    PERCENTILE_CONT(0.25) WITHIN GROUP (ORDER BY timediff_in_sec) AS percentile_25,
    PERCENTILE_CONT(0.50) WITHIN GROUP (ORDER BY timediff_in_sec) AS percentile_50
FROM
(
    -- subquery
) AS t1
-- GROUP BY date
Here are the first rows of the resulting table:
+------------+--------------+-------------+---------------+---------------+
| date | avg_timediff | count_of_id | percentile_25 | percentile_50 |
+------------+--------------+-------------+---------------+---------------+
| 10/06/2020 | 61,65186364 | 22 | 8,5765 | 13,3015 |
| 11/06/2020 | 127,2913333 | 3 | 15,6045 | 17,494 |
| 12/06/2020 | 306,0348214 | 28 | 12,2565 | 17,629 |
| 13/06/2020 | 13,2664 | 5 | 11,944 | 13,862 |
| 14/06/2020 | 16,728 | 7 | 14,021 | 17,187 |
| 15/06/2020 | 398,6424595 | 37 | 11,893 | 19,271 |
| 16/06/2020 | 293,6925152 | 33 | 12,527 | 17,134 |
| 17/06/2020 | 155,6554286 | 21 | 13,452 | 16,715 |
| 18/06/2020 | 383,8101429 | 7 | 266,048 | 493,722 |
+------------+--------------+-------------+---------------+---------------+
How can I achieve the desired output above?
Drag those measures into the Dimensions area of the data pane; they will then behave as static dimensions. For your use case you could also just drag the Date field to Rows: aggregating a single value, which is what you have for each date, returns the same value whatever the aggregation type.
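As a quick sanity check with the field names from the query above: 10/06/2020 has exactly one row with avg_timediff = 61,65, so once Date is on Rows, SUM([avg_timediff]), AVG([avg_timediff]), MIN([avg_timediff]) and ATTR([avg_timediff]) all display the same 61,65; the choice of aggregation only changes the label.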

How to use Tableau TOTAL() on WINDOW_SUM()

In my Tableau workbook, I have a calculated field, "Rolling 12 Month Sales", with the formula below, and it is working fine.
WINDOW_SUM(SUM(Sales),-11,0)
Now I am trying to build a Rolling 12 Month Sales % measure.
For this % measure, I am referring to an existing calculation - SUM(Sales)/TOTAL(SUM(Sales)) - which, when cut by various segments, gives me the percentage distribution.
I am trying to get the exact same thing for the Rolling 12 Month Sales % calculation. I tried the following, but Tableau does not allow a table calculation inside TOTAL():
WINDOW_SUM(SUM(Sales),-11,0)/TOTAL(WINDOW_SUM(SUM(Sales),-11,0))
Original Data
+--------+----------+----------+
| Month | Hardware | Software |
+--------+----------+----------+
| Jan-20 | 5000 | 7500 |
| Feb-20 | 6500 | 10000 |
| Mar-20 | 8000 | 10500 |
| Apr-20 | 11000 | 15000 |
| May-20 | 13500 | 21000 |
+--------+----------+----------+
Rolling 2 Months Sum Sales (This is working fine)
+--------+----------+----------+
| Month | Hardware | Software |
+--------+----------+----------+
| Jan-20 | 5000 | 7500 |
| Feb-20 | 11500 | 17500 |
| Mar-20 | 19500 | 28000 |
| Apr-20 | 25500 | 35500 |
| May-20 | 32500 | 46500 |
+--------+----------+----------+
Rolling 2 Months Sum Sales % - Below are the numbers I am trying to achieve.
+--------+----------+----------+
| Month | Hardware | Software |
+--------+----------+----------+
| Jan-20 | 40.00% | 60.00% |
| Feb-20 | 39.66% | 60.34% |
| Mar-20 | 41.05% | 58.95% |
| Apr-20 | 41.80% | 58.20% |
| May-20 | 41.14% | 58.86% |
+--------+----------+----------+
Running out of options!
Best Regards
There should be no need to TOTAL a WINDOW_SUM; I suspect this could be solved with a different Compute Using. But first, I don't fully understand why you're taking the approach you are attempting. Any chance you could show some sample data with the results you expect along the way? For simplicity, it would be easier to use a rolling 2 periods rather than 12 in the example.
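One pattern that usually works here (a sketch, not verified against this workbook): keep the rolling sum as its own calculated field and nest it inside a second WINDOW_SUM instead of TOTAL(), since Tableau lets you set Compute Using separately for each nested table calculation.

// Rolling 2 Month Sales: compute using the Month axis (e.g. Table (down))
WINDOW_SUM(SUM([Sales]), -1, 0)

// Rolling 2 Month Sales %: outer WINDOW_SUM computed across Hardware/Software,
// inner [Rolling 2 Month Sales] still computed along Month
[Rolling 2 Month Sales] / WINDOW_SUM([Rolling 2 Month Sales])

Note that the posted "Rolling 2 Months" table actually covers a 3-month window (Mar-20 = 5000 + 6500 + 8000), which corresponds to WINDOW_SUM(SUM([Sales]), -2, 0); the percentage logic is the same either way.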

Tableau - Operations between two tables from different sources and different granularity

I'm using Tableau Desktop v9.1.13 with two tables from different data sources (one comes from BigQuery and the other from a PostgreSQL DB), and I want to perform a multiplication between two columns of these tables. The first table has one row per transaction and the other has one row for each currency available in the database. I've tried blending and creating a calculated field, but the latter always throws an error. The first table looks like this:
+----------------+--------+----------+
| transaction-id | Value | Currency |
+----------------+--------+----------+
| 123-abc | 120 | BRL |
+----------------+--------+----------+
| 556-fds | 100 | PEN |
+----------------+--------+----------+
| 456-cde | 120000 | COP |
+----------------+--------+----------+
| 789-fgr | 100 | MXN |
+----------------+--------+----------+
And the other table
+----------+--------------+
| Currency | Value in USD |
+----------+--------------+
| COP | 0.0003 |
+----------+--------------+
| BRL | 0.3169 |
+----------+--------------+
| PEN | 0.2958 |
+----------+--------------+
| MXN | 0.0539 |
+----------+--------------+
Now I want to generate a new column called Value_USD that is the product of the transaction value and its currency's USD rate.
+----------------+--------+----------+-----------+
| transaction-id | Value  | Currency | Value_USD |
+----------------+--------+----------+-----------+
| 123-abc        | 120    | BRL      | 38.03     |
+----------------+--------+----------+-----------+
| 556-fds        | 100    | PEN      | 29.58     |
+----------------+--------+----------+-----------+
| 456-cde        | 120000 | COP      | 36.00     |
+----------------+--------+----------+-----------+
| 789-fgr        | 100    | MXN      | 5.39      |
+----------------+--------+----------+-----------+
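One common approach, sketched here with placeholder field and data-source names: make the transactions table the primary source, blend the two sources on Currency, and wrap the secondary field in an aggregate, because blended fields can only be used as aggregates in calculated fields.

// Value_USD: [Currency Rates] stands in for whatever the PostgreSQL source is called
SUM([Value]) * MIN([Currency Rates].[Value in USD])

The result is computed at the level of detail of the view, so keep transaction-id on Rows or Detail to get one converted value per transaction. From Tableau 10 onward a cross-database join on Currency would avoid blending entirely, but that is not an option in 9.1.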

Tableau: calculated field after data is reshaped

I'm trying to wrap my head around how to create a calculated field in Tableau that is calculated after the source data is pivoted. My source data is "long", i.e. normalized, and looks like this:
+---------+---------+-------+
| Company | Measure | Value |
+---------+---------+-------+
| A | Sales | 100 |
+---------+---------+-------+
| A | Exp | -10 |
+---------+---------+-------+
| B | Sales | 200 |
+---------+---------+-------+
| B | Exp | -30 |
+---------+---------+-------+
(Actually every company would have more than two records, but this is simplified)
What I'd like to get out is the following where Net is calculated as Sales + (2 * Exp).
+---------+---------+-------+-------+
| Company | Sales | Exp | Net |
+---------+---------+-------+-------+
| A | 100 | -10 | 80 |
+---------+---------+-------+-------+
| B | 200 | -30 | 140 |
+---------+---------+-------+-------+
I can get the following by simply having Company as my row and Measure as my column and then sum(Value):
+---------+---------+-------+
| Company | Sales | Exp |
+---------+---------+-------+
| A | 100 | -10 |
+---------+---------+-------+
| B | 200 | -30 |
+---------+---------+-------+
But how do I calculate an additional column based on the result of pivoting Measure?
Does this get you what you need?
The crux is creating calculated fields for Exp and Sales like this:
Exp =
if [Measure] = "Exp" then [Value] end
Sales =
if [Measure] = "Sales" then [Value] end
Those become measures you can use as the columns.
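Building on that, a sketch of the Net field (ZN() simply treats a missing Sales or Exp value as zero):

// Net = Sales + (2 * Exp), aggregated per company
ZN(SUM([Sales])) + 2 * ZN(SUM([Exp]))

With the sample data this yields 100 + 2 * (-10) = 80 for company A and 200 + 2 * (-30) = 140 for company B, matching the target table.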