Cognos Calculate Variance Crosstab (Relational) - crosstab

I have a simple crosstab such as this:
Trans |    Pants    |    Shirts   |
      | 2013 | 2014 | 2013 | 2014 |
-----------------------------------
Jan   |   33 |   37 |   41 |   53 |
Feb   |   31 |   33 |   38 |   43 |
Mar   |   26 |   29 |   51 |   56 |
Pants and Shirts belong to the data item: Category
Years belong to the data item: Years
Months belong to the data item: Months
Trans (transactions) belongs to the data item: Trans
Here is what it looks like in Report Studio:
Trans      | <#Category#>          | <#Category#>          |
           | <#Years#> | <#Years#> | <#Years#> | <#Years#> |
------------------------------------------------------------
<#Months#> | <#1234#>  | <#1234#>  | <#1234#>  | <#1234#>  |
I want to be able to calculate the variance of pants and shirts between the years, to get something like this:
Trans |            Pants           |           Shirts           |
      | 2013 | 2014 | YOY Variance | 2013 | 2014 | YOY Variance |
-----------------------------------------------------------------
Jan   |   33 |   37 |        12.12 |   41 |   53 |        29.27 |
Feb   |   31 |   33 |         6.45 |   38 |   43 |        13.16 |
Mar   |   26 |   29 |        11.54 |   51 |   56 |         9.80 |
I've tried inserting a data item for YOY Variance with the expression below just to see if I can even get the 2014 value, but I cannot; for some odd reason it only returns the 2013 values:
Total([Trans] for maximum[Year],[Category],[Months])
Any ideas? Help?

(I'm assuming you don't have a DMR.)
There is no easy/clean way to do this in Cognos. In your query, you'll have to build a calculation for each year in your output. So, something like this for 2013:
total (if ([Years] = 2013) then ([Trans]) else (0))
And basically the same for 2014.
Cut the Trans piece out of your crosstab. Then you'll nest those two calcs under your years. To get rid of all the zeroes or nulls, select the two columns; from the menu, select Data, Suppress, Suppress Columns Only.
Finally, you will drop a calc in next to your Years in the crosstab (not under them). The expression will be ([2014 trans] - [2013 trans]) / [2013 trans] (or whatever you end up naming your calcs); dividing by the earlier year matches your sample output, e.g. (37 - 33) / 33 = 12.12%. Format it as a percent, and you should be good to go.
Told you it was a pain!

Related

stacked bar chart with time series and lines chart as the second axis

I am working with time series data in a CSV file and I want to make a stacked bar chart from it.
My data looks like this:
| Year | Major | Moderate | Sample depth |
| 1900 |    12 |       32 |            5 |
| 1901 |    16 |       25 |            6 |
| 1903 |    18 |       12 |           10 |
| 1908 |    20 |       25 |           21 |
| 1910 |    15 |       32 |           22 |
| 1911 |    26 |       24 |           22 |
| 1913 |    22 |       22 |           25 |
| 1918 |    20 |       35 |           27 |
......
I want to have the values of Major and Moderate in a stacked bar chart and add the sample depth on a second axis.
Can anyone provide me with a script? It would be very much appreciated.
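As a starting point, here is a minimal pandas/matplotlib sketch of one way to do this, assuming the CSV contains exactly the columns shown above (the file name data.csv is a placeholder):

import pandas as pd
import matplotlib.pyplot as plt

# Load the data; "data.csv" is a placeholder for the actual file,
# assumed to have the columns Year, Major, Moderate, Sample depth.
df = pd.read_csv("data.csv")

fig, ax = plt.subplots()

# Stacked bars: Major at the bottom, Moderate stacked on top.
ax.bar(df["Year"], df["Major"], label="Major")
ax.bar(df["Year"], df["Moderate"], bottom=df["Major"], label="Moderate")
ax.set_xlabel("Year")
ax.set_ylabel("Count")

# Second y-axis for the sample depth, drawn as a line.
ax2 = ax.twinx()
ax2.plot(df["Year"], df["Sample depth"], color="black", marker="o", label="Sample depth")
ax2.set_ylabel("Sample depth")

# Combine the legends of both axes into one.
handles1, labels1 = ax.get_legend_handles_labels()
handles2, labels2 = ax2.get_legend_handles_labels()
ax.legend(handles1 + handles2, labels1 + labels2, loc="upper left")

plt.show()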

Difference between periods in crosstabs, Jaspersoft Studio

I'm working with Jaspersoft Studio.
I have a crosstab similar to this one:
+---------+------+------+
| FRUIT | 2014 | 2015 |
+---------+------+------+
| apples | 12 | 85 |
+---------+------+------+
| peaches | 74 | 36 |
+---------+------+------+
| bananas | 54 | 82 |
+---------+------+------+
and I would like to obtain something like this:
+---------+------+------+---------------------+
| FRUIT | 2014 | 2015 | Difference |
| | | | between 2015/2014 |
| | | | (delta) |
+---------+------+------+---------------------+
| apples | 12 | 85 | 73 |
+---------+------+------+---------------------+
| peaches | 74 | 36 | -38 |
+---------+------+------+---------------------+
| bananas | 54 | 82 | 28 |
+---------+------+------+---------------------+
Is it possible?
I mean, is there a way to intercept the value of a field for a single period (e.g. 2014) and use it in a custom calculation?

Window function to achieve running sum that resets in Postgres SQL [duplicate]

I wrote a query that creates two columns: the_day and the amount_raised on that day. Here is what I have:
And I would like to add a column that has a running sum of amount_raised:
Ultimately, I would like the sum column to reset after it reaches 1 million.
The recursive approach is above my pay grade, so if anyone knows a way to reset the sum without creating an entirely new table, please comment (maybe with a RESET function?). Thank you
I'd like to thank Juan Carlos Oropeza for providing a script and SQLFiddle with the test data. George, you should have done that.
The query itself is rather simple.
First, calculate a simple running sum (CTE_RunningSum) and divide it by 1,000,000 to get the number of whole millions.
Then calculate the running sum again, partitioning by the number of millions.
SQL Fiddle
I included the columns RunningSum and Millions in the final result to illustrate how the query works.
WITH
CTE_RunningSum
AS
(
    SELECT
        ID
        ,day_t
        ,collect
        ,SUM(collect) OVER (ORDER BY day_t, id) AS RunningSum
        ,(SUM(collect) OVER (ORDER BY day_t, id)) / 1000000 AS Millions
    FROM myTable
)
SELECT
    ID
    ,day_t
    ,collect
    ,RunningSum
    ,Millions
    ,SUM(collect) OVER (PARTITION BY Millions ORDER BY day_t, id) AS Result
FROM CTE_RunningSum
ORDER BY day_t, id;
Result
| id | day_t | collect | runningsum | millions | result |
|-----|-----------------------------|---------|------------|----------|---------|
| 90 | March, 11 2015 00:00:00 | 69880 | 69880 | 0 | 69880 |
| 13 | March, 25 2015 00:00:00 | 69484 | 139364 | 0 | 139364 |
| 49 | March, 27 2015 00:00:00 | 57412 | 196776 | 0 | 196776 |
| 41 | March, 30 2015 00:00:00 | 56404 | 253180 | 0 | 253180 |
| 99 | April, 03 2015 00:00:00 | 59426 | 312606 | 0 | 312606 |
| 1 | April, 10 2015 00:00:00 | 65825 | 378431 | 0 | 378431 |
| 100 | April, 27 2015 00:00:00 | 60884 | 439315 | 0 | 439315 |
| 50 | May, 11 2015 00:00:00 | 39641 | 478956 | 0 | 478956 |
| 58 | May, 11 2015 00:00:00 | 49759 | 528715 | 0 | 528715 |
| 51 | May, 17 2015 00:00:00 | 32895 | 561610 | 0 | 561610 |
| 15 | May, 19 2015 00:00:00 | 50847 | 612457 | 0 | 612457 |
| 66 | May, 29 2015 00:00:00 | 66332 | 678789 | 0 | 678789 |
| 4 | June, 04 2015 00:00:00 | 46891 | 725680 | 0 | 725680 |
| 38 | June, 09 2015 00:00:00 | 64732 | 790412 | 0 | 790412 |
| 79 | June, 14 2015 00:00:00 | 62843 | 853255 | 0 | 853255 |
| 37 | June, 28 2015 00:00:00 | 54315 | 907570 | 0 | 907570 |
| 59 | June, 30 2015 00:00:00 | 34885 | 942455 | 0 | 942455 |
| 71 | July, 08 2015 00:00:00 | 46440 | 988895 | 0 | 988895 |
| 31 | July, 10 2015 00:00:00 | 39649 | 1028544 | 1 | 39649 |
| 91 | July, 12 2015 00:00:00 | 65048 | 1093592 | 1 | 104697 |
| 57 | July, 14 2015 00:00:00 | 60394 | 1153986 | 1 | 165091 |
| 98 | July, 20 2015 00:00:00 | 34481 | 1188467 | 1 | 199572 |
| 3 | July, 26 2015 00:00:00 | 58672 | 1247139 | 1 | 258244 |
| 95 | August, 19 2015 00:00:00 | 52393 | 1299532 | 1 | 310637 |
| 74 | August, 20 2015 00:00:00 | 37972 | 1337504 | 1 | 348609 |
| 20 | August, 27 2015 00:00:00 | 36882 | 1374386 | 1 | 385491 |
| 2 | September, 07 2015 00:00:00 | 39408 | 1413794 | 1 | 424899 |
| 14 | September, 09 2015 00:00:00 | 40234 | 1454028 | 1 | 465133 |
| 6 | September, 17 2015 00:00:00 | 65957 | 1519985 | 1 | 531090 |
| 93 | September, 29 2015 00:00:00 | 47213 | 1567198 | 1 | 578303 |
| 35 | September, 30 2015 00:00:00 | 49446 | 1616644 | 1 | 627749 |
| 86 | October, 11 2015 00:00:00 | 34291 | 1650935 | 1 | 662040 |
| 75 | October, 12 2015 00:00:00 | 31448 | 1682383 | 1 | 693488 |
| 19 | October, 14 2015 00:00:00 | 48509 | 1730892 | 1 | 741997 |
| 56 | October, 26 2015 00:00:00 | 30072 | 1760964 | 1 | 772069 |
| 48 | October, 28 2015 00:00:00 | 58527 | 1819491 | 1 | 830596 |
| 40 | November, 05 2015 00:00:00 | 67293 | 1886784 | 1 | 897889 |
| 33 | November, 09 2015 00:00:00 | 41944 | 1928728 | 1 | 939833 |
| 34 | November, 11 2015 00:00:00 | 35516 | 1964244 | 1 | 975349 |
| 85 | November, 20 2015 00:00:00 | 43920 | 2008164 | 2 | 43920 |
| 18 | November, 23 2015 00:00:00 | 44925 | 2053089 | 2 | 88845 |
| 62 | December, 24 2015 00:00:00 | 34678 | 2087767 | 2 | 123523 |
| 67 | December, 25 2015 00:00:00 | 35323 | 2123090 | 2 | 158846 |
| 81 | December, 28 2015 00:00:00 | 37071 | 2160161 | 2 | 195917 |
| 54 | January, 02 2016 00:00:00 | 32330 | 2192491 | 2 | 228247 |
| 70 | January, 06 2016 00:00:00 | 47875 | 2240366 | 2 | 276122 |
| 28 | January, 23 2016 00:00:00 | 40250 | 2280616 | 2 | 316372 |
| 65 | January, 25 2016 00:00:00 | 49404 | 2330020 | 2 | 365776 |
| 73 | January, 26 2016 00:00:00 | 65879 | 2395899 | 2 | 431655 |
| 5 | February, 05 2016 00:00:00 | 53953 | 2449852 | 2 | 485608 |
| 32 | February, 11 2016 00:00:00 | 44988 | 2494840 | 2 | 530596 |
| 53 | February, 25 2016 00:00:00 | 68948 | 2563788 | 2 | 599544 |
| 83 | March, 11 2016 00:00:00 | 47244 | 2611032 | 2 | 646788 |
| 8 | March, 25 2016 00:00:00 | 51809 | 2662841 | 2 | 698597 |
| 82 | March, 25 2016 00:00:00 | 66506 | 2729347 | 2 | 765103 |
| 88 | April, 06 2016 00:00:00 | 69288 | 2798635 | 2 | 834391 |
| 89 | April, 14 2016 00:00:00 | 43162 | 2841797 | 2 | 877553 |
| 52 | April, 23 2016 00:00:00 | 47772 | 2889569 | 2 | 925325 |
| 7 | April, 27 2016 00:00:00 | 33368 | 2922937 | 2 | 958693 |
| 84 | April, 27 2016 00:00:00 | 57644 | 2980581 | 2 | 1016337 |
| 17 | May, 17 2016 00:00:00 | 35416 | 3015997 | 3 | 35416 |
| 61 | May, 17 2016 00:00:00 | 64603 | 3080600 | 3 | 100019 |
| 87 | June, 07 2016 00:00:00 | 41865 | 3122465 | 3 | 141884 |
| 97 | June, 08 2016 00:00:00 | 64982 | 3187447 | 3 | 206866 |
| 92 | June, 15 2016 00:00:00 | 58684 | 3246131 | 3 | 265550 |
| 23 | June, 26 2016 00:00:00 | 46147 | 3292278 | 3 | 311697 |
| 46 | June, 30 2016 00:00:00 | 61921 | 3354199 | 3 | 373618 |
| 94 | July, 03 2016 00:00:00 | 55535 | 3409734 | 3 | 429153 |
| 60 | July, 07 2016 00:00:00 | 63607 | 3473341 | 3 | 492760 |
| 45 | July, 20 2016 00:00:00 | 51965 | 3525306 | 3 | 544725 |
| 96 | July, 20 2016 00:00:00 | 46684 | 3571990 | 3 | 591409 |
| 29 | August, 09 2016 00:00:00 | 37707 | 3609697 | 3 | 629116 |
| 69 | August, 11 2016 00:00:00 | 37194 | 3646891 | 3 | 666310 |
| 80 | August, 19 2016 00:00:00 | 62673 | 3709564 | 3 | 728983 |
| 36 | August, 28 2016 00:00:00 | 48237 | 3757801 | 3 | 777220 |
| 39 | August, 29 2016 00:00:00 | 48159 | 3805960 | 3 | 825379 |
| 25 | August, 30 2016 00:00:00 | 60958 | 3866918 | 3 | 886337 |
| 68 | September, 04 2016 00:00:00 | 50167 | 3917085 | 3 | 936504 |
| 55 | September, 08 2016 00:00:00 | 31193 | 3948278 | 3 | 967697 |
| 64 | September, 10 2016 00:00:00 | 31157 | 3979435 | 3 | 998854 |
| 42 | September, 14 2016 00:00:00 | 52878 | 4032313 | 4 | 52878 |
| 43 | September, 15 2016 00:00:00 | 54728 | 4087041 | 4 | 107606 |
| 77 | September, 18 2016 00:00:00 | 65320 | 4152361 | 4 | 172926 |
| 12 | September, 23 2016 00:00:00 | 43597 | 4195958 | 4 | 216523 |
| 30 | September, 26 2016 00:00:00 | 32764 | 4228722 | 4 | 249287 |
| 10 | September, 27 2016 00:00:00 | 47038 | 4275760 | 4 | 296325 |
| 47 | October, 08 2016 00:00:00 | 46280 | 4322040 | 4 | 342605 |
| 26 | October, 10 2016 00:00:00 | 69487 | 4391527 | 4 | 412092 |
| 63 | October, 30 2016 00:00:00 | 49561 | 4441088 | 4 | 461653 |
| 78 | November, 15 2016 00:00:00 | 40138 | 4481226 | 4 | 501791 |
| 27 | November, 27 2016 00:00:00 | 57378 | 4538604 | 4 | 559169 |
| 21 | December, 01 2016 00:00:00 | 35336 | 4573940 | 4 | 594505 |
| 16 | December, 03 2016 00:00:00 | 39671 | 4613611 | 4 | 634176 |
| 22 | December, 13 2016 00:00:00 | 34574 | 4648185 | 4 | 668750 |
| 72 | January, 29 2017 00:00:00 | 55084 | 4703269 | 4 | 723834 |
| 44 | January, 30 2017 00:00:00 | 36742 | 4740011 | 4 | 760576 |
| 24 | February, 01 2017 00:00:00 | 31061 | 4771072 | 4 | 791637 |
| 76 | February, 12 2017 00:00:00 | 35059 | 4806131 | 4 | 826696 |
| 9 | February, 27 2017 00:00:00 | 39767 | 4845898 | 4 | 866463 |
| 11 | February, 28 2017 00:00:00 | 66007 | 4911905 | 4 | 932470 |
I took a look again and couldn't solve it with a window function, so I took the recursive approach.
SQL Fiddle Demo
Sample data: 100 rows with random dates between 2015 and 2017 and amounts between 10k and 70k.
DROP TABLE IF EXISTS "myTable";
CREATE TABLE "myTable" (
id SERIAL PRIMARY KEY,
day_t varchar(255),
collect integer NULL
);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-04-10',65825),('2015-09-07',39408),('2015-07-26',58672),('2015-06-04',46891),('2016-02-05',53953),('2015-09-17',65957),('2016-04-27',33368),('2016-03-25',51809),('2017-02-27',39767),('2016-09-27',47038);
INSERT INTO "myTable" (day_t,collect) VALUES ('2017-02-28',66007),('2016-09-23',43597),('2015-03-25',69484),('2015-09-09',40234),('2015-05-19',50847),('2016-12-03',39671),('2016-05-17',35416),('2015-11-23',44925),('2015-10-14',48509),('2015-08-27',36882);
INSERT INTO "myTable" (day_t,collect) VALUES ('2016-12-01',35336),('2016-12-13',34574),('2016-06-26',46147),('2017-02-01',31061),('2016-08-30',60958),('2016-10-10',69487),('2016-11-27',57378),('2016-01-23',40250),('2016-08-09',37707),('2016-09-26',32764);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-07-10',39649),('2016-02-11',44988),('2015-11-09',41944),('2015-11-11',35516),('2015-09-30',49446),('2016-08-28',48237),('2015-06-28',54315),('2015-06-09',64732),('2016-08-29',48159),('2015-11-05',67293);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-03-30',56404),('2016-09-14',52878),('2016-09-15',54728),('2017-01-30',36742),('2016-07-20',51965),('2016-06-30',61921),('2016-10-08',46280),('2015-10-28',58527),('2015-03-27',57412),('2015-05-11',39641);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-05-17',32895),('2016-04-23',47772),('2016-02-25',68948),('2016-01-02',32330),('2016-09-08',31193),('2015-10-26',30072),('2015-07-14',60394),('2015-05-11',49759),('2015-06-30',34885),('2016-07-07',63607);
INSERT INTO "myTable" (day_t,collect) VALUES ('2016-05-17',64603),('2015-12-24',34678),('2016-10-30',49561),('2016-09-10',31157),('2016-01-25',49404),('2015-05-29',66332),('2015-12-25',35323),('2016-09-04',50167),('2016-08-11',37194),('2016-01-06',47875);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-07-08',46440),('2017-01-29',55084),('2016-01-26',65879),('2015-08-20',37972),('2015-10-12',31448),('2017-02-12',35059),('2016-09-18',65320),('2016-11-15',40138),('2015-06-14',62843),('2016-08-19',62673);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-12-28',37071),('2016-03-25',66506),('2016-03-11',47244),('2016-04-27',57644),('2015-11-20',43920),('2015-10-11',34291),('2016-06-07',41865),('2016-04-06',69288),('2016-04-14',43162),('2015-03-11',69880);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-07-12',65048),('2016-06-15',58684),('2015-09-29',47213),('2016-07-03',55535),('2015-08-19',52393),('2016-07-20',46684),('2016-06-08',64982),('2015-07-20',34481),('2015-04-03',59426),('2015-04-27',60884);
Create a row_number, since the recursion needs consecutive IDs:
CREATE TABLE sortDates as
SELECT day_t,
collect,
row_number() over (order by day_t) rn
FROM "myTable";
Recursive Query
As you can see in the CASE expression, if the previous running total m.collect is bigger than 1 million, the total is reset.
WITH RECURSIVE million(rn, day_t, collect) AS (
    (
        SELECT rn, day_t, collect
        FROM sortDates
        WHERE rn = 1
    )
    UNION
    (
        SELECT s.rn, s.day_t,
               CASE WHEN m.collect > 1000000 THEN s.collect
                    ELSE m.collect + s.collect
               END AS collect
        FROM sortDates s
        JOIN million m
          ON s.rn = m.rn + 1
    )
)
SELECT *
FROM million
WHERE collect > 1000000
Finally, just bring back the rows where you break the 1 million limit.
OUTPUT
| rn | day_t | collect |
|----|------------|---------|
| 19 | 2015-07-10 | 1028544 |
| 41 | 2015-11-23 | 1024545 |
| 62 | 2016-05-17 | 1027511 |
| 82 | 2016-09-15 | 1006441 |

Displaying 2 metrics on a tableau map

I am new to Tableau and I have requirements as below:
I need to create a dashboard with a filter on Paywave or EMV and show a count of Confirmed and Probable on a geo map.
When I select EMV from the quick filter, it should show a count of confirm & probable for that city. I should be able to drill down and see a count of confirm and probable for zip codes as well.
I am not sure how to achieve the above requirements.
As shown below I have fields like:
               |              |        EMV         |      Paywave       |
mrchchant_city | mrch_zipcode | confirm | probable | confirm | probable |
A              | 1001         |      10 |       15 |      20 |       18 |
B              | 1005         |      34 |       67 |      78 |       12 |
C              | 2001         |      24 |       56 |      76 |       45 |
C              | 2001         |      46 |       19 |      63 |       25 |
Please let me know if any information is required from my side.
This will be a lot easier on you if you restructure your data a bit. More often than not, the goal in Tableau is to provide an aggregated summary of the data, rather than showing each individual row. We'll want to group by dimensions (categorical data like "EMV"/"Paywave" or "Confirm"/"Probable"), so this data will be much easier to work with if we get those dimensions into their own columns.
Here's how I personally would go about structuring your table:
+----------------+--------------+---------+----------+-------+-----+
| mrchchant_city | mrch_zipcode | dim1 | dim2 | count | ... |
+----------------+--------------+---------+----------+-------+-----+
| A | 1001 | Paywave | confirm | 20 | ... |
| A | 1001 | Paywave | probable | 18 | ... |
| A | 1001 | EMV | confirm | 10 | ... |
| A | 1001 | EMV | probable | 15 | ... |
| B | 1005 | Paywave | confirm | 78 | ... |
| B | 1005 | Paywave | probable | 12 | ... |
| B | 1005 | EMV | confirm | 34 | ... |
| B | 1005 | EMV | probable | 67 | ... |
| C | 2001 | Paywave | confirm | 76 | ... |
| C | 2001 | Paywave | probable | 45 | ... |
| C | 2001 | EMV | confirm | 24 | ... |
| C | 2001 | EMV | probable | 56 | ... |
| C | 2001 | Paywave | confirm | 63 | ... |
| C | 2001 | Paywave | probable | 25 | ... |
| C | 2001 | EMV | confirm | 46 | ... |
| C | 2001 | EMV | probable | 19 | ... |
| ... | ... | ... | ... | ... | ... |
+----------------+--------------+---------+----------+-------+-----+
(Sorry about the dim1 and dim2, I don't really know what those dimensions represent. You can/should obviously pick a more intuitive nomenclature.)
Once you have a table with columns for your categorical data, it will be simple to filter and group by those dimensions.
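For illustration only, here is a minimal pandas sketch of that reshape, assuming the wide layout shown in the question (the combined column names like "EMV_confirm" are placeholders for however the source columns are actually named):

import pandas as pd

# Wide table as described in the question; column names are placeholders.
wide = pd.DataFrame({
    "mrchchant_city":   ["A", "B", "C", "C"],
    "mrch_zipcode":     [1001, 1005, 2001, 2001],
    "EMV_confirm":      [10, 34, 24, 46],
    "EMV_probable":     [15, 67, 56, 19],
    "Paywave_confirm":  [20, 78, 76, 63],
    "Paywave_probable": [18, 12, 45, 25],
})

# Unpivot the four measure columns into a single "count" column.
long = wide.melt(
    id_vars=["mrchchant_city", "mrch_zipcode"],
    var_name="dim", value_name="count",
)

# Split "EMV_confirm" into dim1 = "EMV" and dim2 = "confirm".
long[["dim1", "dim2"]] = long["dim"].str.split("_", expand=True)
long = long.drop(columns="dim")

print(long)

The resulting long table has one row per city/zip/dim1/dim2 combination, which is the shape suggested above and is straightforward to filter and aggregate in Tableau.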

Summing columns in a row in all rows with Emacs org-mode

I'm using Emacs org-mode to track times worked on various tasks. The last column in the table is the weekly sum for each task:
|------+-----+-----+-----+-----+-----+-------|
| Task | Mon | Tue | Wed | Thu | Fri | Total |
|------+-----+-----+-----+-----+-----+-------|
| Foo | 2 | 3 | 4 | 5 | 6 | 20 |
| Bar | 2 | 3 | 4 | 5 | 7 | 21 |
#+TBLFM: @2$7=vsum($2..$6)::@3$7=vsum($2..$6)
Currently, I have to add a formula for each new row. Is there any way to customise the formula so that it calculates the sums regardless of how many rows there are?
A column formula did the trick, as suggested by fniessen. Here's what I ended up with:
|------+-----+-----+-----+-----+-----+-------|
| Task | Mon | Tue | Wed | Thu | Fri | Total |
|------+-----+-----+-----+-----+-----+-------|
| Foo | 2 | 3 | 4 | 5 | 6 | 20 |
| Bar | 2 | 3 | 4 | 5 | 7 | 21 |
#+TBLFM: $7=vsum($2..$6)
More info in the "Column formulas" and "Field formulas" sections of the docs.
You really should take a closer look at the documentation, and read about "column formulas" (and, even, "row formulas"). A column formula is `$7=...` and is editable via `C-c =`.