Symfony2 Query to find last working date from Holiday Calendar - postgresql

I have a calendar entity in my project that manages the open and close times for each business day of the whole year.
Below are the records for a specific month:
  id   |     today_date      | year | month_of_year | day_of_month | is_business_day
-------+---------------------+------+---------------+--------------+-----------------
 10103 | 2016-02-01 00:00:00 | 2016 |             2 |            1 | t
 10104 | 2016-02-02 00:00:00 | 2016 |             2 |            2 | t
 10105 | 2016-02-03 00:00:00 | 2016 |             2 |            3 | t
 10106 | 2016-02-04 00:00:00 | 2016 |             2 |            4 | t
 10107 | 2016-02-05 00:00:00 | 2016 |             2 |            5 | t
 10108 | 2016-02-06 00:00:00 | 2016 |             2 |            6 | f
 10109 | 2016-02-07 00:00:00 | 2016 |             2 |            7 | f
 10110 | 2016-02-08 00:00:00 | 2016 |             2 |            8 | t
 10111 | 2016-02-09 00:00:00 | 2016 |             2 |            9 | t
 10112 | 2016-02-10 00:00:00 | 2016 |             2 |           10 | t
 10113 | 2016-02-11 00:00:00 | 2016 |             2 |           11 | t
 10114 | 2016-02-12 00:00:00 | 2016 |             2 |           12 | t
 10115 | 2016-02-13 00:00:00 | 2016 |             2 |           13 | f
 10116 | 2016-02-14 00:00:00 | 2016 |             2 |           14 | f
 10117 | 2016-02-15 00:00:00 | 2016 |             2 |           15 | t
 10118 | 2016-02-16 00:00:00 | 2016 |             2 |           16 | t
 10119 | 2016-02-17 00:00:00 | 2016 |             2 |           17 | t
 10120 | 2016-02-18 00:00:00 | 2016 |             2 |           18 | t
I want to get the today_date from 7 working days ago. Suppose today_date is 2016-02-18; then the date 7 working days back is 2016-02-09.

You can use row_number() for this, like so:
SELECT *
FROM (SELECT t.*,
             row_number() OVER (ORDER BY today_date DESC) AS rnk
      FROM Calender t
      WHERE today_date < current_date
        AND is_business_day = 't') ranked
WHERE rnk = 7
This gives you the row of the 7th business day before today's date. Two details worth noting: PostgreSQL requires an alias on the derived table (ranked here), and today itself is excluded with < so that 2016-02-18 yields 2016-02-09; switch to <= and rnk = 8 if today should count as the first working day.
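If you only need the single date rather than the whole row, a minimal sketch without the subquery (same table and filter as above) does the job with OFFSET/LIMIT:
SELECT today_date
FROM Calender
WHERE today_date < current_date
  AND is_business_day = 't'
ORDER BY today_date DESC
OFFSET 6   -- skip the 6 most recent working days
LIMIT 1;   -- keep the 7th one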

I see that you tagged your question with Doctrine, ORM and Datetime. Were you after a QueryBuilder solution? Maybe this is closer to what you want:
$qb->select('c.today_date')
   ->from(Calendar::class, 'c')
   ->where("c.today_date <= :today")
   ->andWhere("c.is_business_day = 't'")
   ->orderBy("c.today_date", "DESC")
   ->setMaxResults(7)
   ->setParameter('today', new \DateTime('now'), \Doctrine\DBAL\Types\Type::DATETIME);
This returns the last 7 working dates; if you only want the 7th one, chain ->setFirstResult(6)->setMaxResults(1) instead.

Related

How to order the result of ROLLUP by each group's total

So I have the following query that produces the following result:
actname | year | tickets
---------------+----------+---------
Join Division | 2016 | 2
Join Division | 2018 | 2
Join Division | 2020 | 3
Join Division | Total | 7 <<<
QLS | 2018 | 2
QLS | 2019 | 1
QLS | Total | 3 <<<
Scalar Swift | 2017 | 3
Scalar Swift | 2018 | 1
Scalar Swift | 2019 | 1
Scalar Swift | Total | 5 <<<
The Selecter | 2017 | 4
The Selecter | 2018 | 4
The Selecter | Total | 8 <<<
The Where | 2016 | 1
The Where | 2017 | 3
The Where | 2018 | 5
The Where | 2020 | 4
The Where | Total | 13 <<<
ViewBee 40 | 2017 | 3
ViewBee 40 | 2018 | 1
ViewBee 40 | Total | 4 <<<
The problem I have is that I want to re-order the results so that the group with the lowest Total comes first, like this:
actname | year | tickets
---------------+----------+---------
QLS | 2018 | 2
QLS | 2019 | 1
QLS | Total | 3 <<<
ViewBee 40 | 2017 | 3
ViewBee 40 | 2018 | 1
ViewBee 40 | Total | 4 <<<
Scalar Swift | 2017 | 3
Scalar Swift | 2018 | 1
Scalar Swift | 2019 | 1
Scalar Swift | Total | 5 <<<
Join Division | 2016 | 2
Join Division | 2018 | 2
Join Division | 2020 | 3
Join Division | Total | 7 <<<
The Selecter | 2017 | 4
The Selecter | 2018 | 4
The Selecter | Total | 8 <<<
The Where | 2016 | 1
The Where | 2017 | 3
The Where | 2018 | 5
The Where | 2020 | 4
The Where | Total | 13 <<<
I'm obtaining the results by using the following GROUP BY:
GROUP BY actname, ROLLUP(year)
which combines all the ticket amounts for the same actname and year.
I can provide the full query if necessary!
Thanks
Using a window function (sum() in this case) you can assign a value to every row of a group (the groups being partitioned by the actname column), so that each actname group carries the same value as its own year = 'Total' row.
Then simply sort by that new column, something like this:
with t(actname, year, tickets) as (
VALUES
('Join Division','2016',2),
('Join Division','2018',2),
('Join Division','2020',3),
('Join Division','Total',7),
('QLS','2018',2),
('QLS','2019',1),
('QLS','Total',3 ),
('Scalar Swift','2017',3),
('Scalar Swift','2018',1),
('Scalar Swift','2019',1),
('Scalar Swift','Total',5 ),
('The Selecter','2017',4),
('The Selecter','2018',4),
('The Selecter','Total',8 ),
('The Where','2016',1),
('The Where','2017',3),
('The Where','2018',5),
('The Where','2020',4),
('The Where','Total',13 ),
('ViewBee 40','2017',3),
('ViewBee 40','2018',1),
('ViewBee 40','Total',4 )
)
SELECT *
FROM (
  SELECT *,
         SUM(CASE WHEN year = 'Total' THEN tickets END) OVER (PARTITION BY actname) AS sm
  FROM t
) tt
ORDER BY sm, year
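If you would rather fold the ordering into the original ROLLUP query instead of post-processing its output, here is a hedged sketch; the base table name ticket_sales is an assumption, and MAX() only picks out the per-act total row if tickets are non-negative:
SELECT actname, year, tickets
FROM (
  SELECT actname,
         COALESCE(year::text, 'Total') AS year,                 -- ROLLUP emits NULL for the total row
         GROUPING(year) AS is_total,                            -- 1 on the rolled-up total row
         SUM(tickets) AS tickets,
         MAX(SUM(tickets)) OVER (PARTITION BY actname) AS act_total
  FROM ticket_sales                                             -- assumed base table
  GROUP BY actname, ROLLUP(year)
) g
ORDER BY act_total, actname, is_total, year;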

Unable to Calculate 7 Day Moving Average due to inconsistent dates

I just noticed that my code below is not actually a 7-day moving average; it is a 7-row moving average. The dates in my table span several months, and because my data flow is inconsistent I can't expect the last 7 rows in the window frame to actually represent a 7-day average. I am trying to iron this out. Thanks.
select date, sales,
avg(sales) over(order by date rows between 6 preceding and current row)
from sales_info
order by date
You can get a bit closer to a true 7-day moving average by using RANGE instead of ROWS in your frame specification. Note that RANGE with an offset such as INTERVAL '6 days' PRECEDING requires PostgreSQL 11 or later.
Read more about window function frames here.
I believe this should work for you:
select date, sales,
       avg(sales) over (order by date range between interval '6 days' preceding and current row)
from sales_info
order by date;
Here's a demonstration with made-up data:
SELECT i,
       t,
       avg(i) OVER (ORDER BY t RANGE BETWEEN INTERVAL '6 days' PRECEDING AND CURRENT ROW)
FROM (
  SELECT i, t
  FROM generate_series('2021-01-01'::timestamp, '2021-02-01'::timestamp, '1 day')
       WITH ORDINALITY AS g(t, i)
) sub;
i | t | avg
----+---------------------+------------------------
1 | 2021-01-01 00:00:00 | 1.00000000000000000000
2 | 2021-01-02 00:00:00 | 1.5000000000000000
3 | 2021-01-03 00:00:00 | 2.0000000000000000
4 | 2021-01-04 00:00:00 | 2.5000000000000000
5 | 2021-01-05 00:00:00 | 3.0000000000000000
6 | 2021-01-06 00:00:00 | 3.5000000000000000
7 | 2021-01-07 00:00:00 | 4.0000000000000000
8 | 2021-01-08 00:00:00 | 5.0000000000000000
9 | 2021-01-09 00:00:00 | 6.0000000000000000
10 | 2021-01-10 00:00:00 | 7.0000000000000000
11 | 2021-01-11 00:00:00 | 8.0000000000000000
12 | 2021-01-12 00:00:00 | 9.0000000000000000
13 | 2021-01-13 00:00:00 | 10.0000000000000000
14 | 2021-01-14 00:00:00 | 11.0000000000000000
15 | 2021-01-15 00:00:00 | 12.0000000000000000
16 | 2021-01-16 00:00:00 | 13.0000000000000000
17 | 2021-01-17 00:00:00 | 14.0000000000000000
18 | 2021-01-18 00:00:00 | 15.0000000000000000
19 | 2021-01-19 00:00:00 | 16.0000000000000000
20 | 2021-01-20 00:00:00 | 17.0000000000000000
21 | 2021-01-21 00:00:00 | 18.0000000000000000
22 | 2021-01-22 00:00:00 | 19.0000000000000000
23 | 2021-01-23 00:00:00 | 20.0000000000000000
24 | 2021-01-24 00:00:00 | 21.0000000000000000
25 | 2021-01-25 00:00:00 | 22.0000000000000000
26 | 2021-01-26 00:00:00 | 23.0000000000000000
27 | 2021-01-27 00:00:00 | 24.0000000000000000
28 | 2021-01-28 00:00:00 | 25.0000000000000000
29 | 2021-01-29 00:00:00 | 26.0000000000000000
30 | 2021-01-30 00:00:00 | 27.0000000000000000
31 | 2021-01-31 00:00:00 | 28.0000000000000000
32 | 2021-02-01 00:00:00 | 29.0000000000000000
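On PostgreSQL 10 and older, where RANGE offsets are not available, a hedged workaround (assuming the same sales_info(date, sales) layout as the question) is a LATERAL subquery that re-aggregates the trailing window for each row:
SELECT s.date,
       s.sales,
       w.avg_7d
FROM sales_info s
CROSS JOIN LATERAL (
  SELECT avg(s2.sales) AS avg_7d                  -- average over the trailing 7 calendar days
  FROM sales_info s2
  WHERE s2.date BETWEEN s.date - INTERVAL '6 days' AND s.date
) w
ORDER BY s.date;
This re-scans the window once per row, so on large tables an index on date matters much more here than it does for the window-function version.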

Calculate LAG variable after filtering in Tableau

I have a dataset with 4 columns: ID (unique identifier of user), Year, Country and Level in this format:
+----+------+---------+-------+
| ID | Year | Country | Level |
+----+------+---------+-------+
| 1 | 2015 | USA | 1 |
| 1 | 2016 | China | 2 |
| 2 | 2015 | China | 2 |
| 2 | 2016 | Russia | 2 |
| 3 | 2015 | Russia | 1 |
| 3 | 2016 | China | 2 |
| 4 | 2015 | USA | 2 |
| 4 | 2016 | USA | 3 |
| 5 | 2014 | China | 1 |
| 5 | 2016 | USA | 2 |
| 6 | 2015 | USA | 1 |
| 6 | 2016 | USA | 2 |
| 7 | 2015 | Russia | 2 |
| 7 | 2016 | China | 3 |
+----+------+---------+-------+
The user will be able to filter the dataset by country.
I want to create a table, driven by the country filter, with a column showing whether a user was in any of the selected countries the previous year, aggregated by the Level variable (alongside other variables that are only affected by the current country filter).
For example, if I select China and USA:
+----+------+---------+-------+-----------------+
| ID | Year | Country | Level | In selection PY |
+----+------+---------+-------+-----------------+
| 1 | 2015 | USA | 1 | No |
| 1 | 2016 | China | 2 | Yes |
| 2 | 2015 | China | 2 | No |
| 3 | 2016 | China | 2 | No |
| 4 | 2015 | USA | 2 | No |
| 4 | 2016 | USA | 3 | Yes |
| 5 | 2014 | China | 1 | No |
| 5 | 2016 | USA | 2 | No |
| 6 | 2015 | USA | 1 | No |
| 6 | 2016 | USA | 2 | Yes |
| 7 | 2016 | China | 3 | No |
+----+------+---------+-------+-----------------+
The aggregated result will be:
+-------+-------------------+-----------------+
| Level | Number of records | In selection PY |
+-------+-------------------+-----------------+
| 1 | 3 | 0 |
| 2 | 6 | 2 |
| 3 | 2 | 1 |
+-------+-------------------+-----------------+
Do you know any way to calculate this aggregated table efficiently? (This would be done on a dataset with millions of rows, with a variable set of countries to be selected.)
I found a solution; I will post it in case it is helpful for someone else.
I changed the Country filter to "Add to Context" and created this variable:
In Selection PY:
if Year = 2016 then
    {fixed [ID]: min(if Year = 2015 then 1 END)}
elseif Year = 2015 then
    {fixed [ID]: min(if Year = 2014 then 1 END)}
elseif Year = 2014 then
    {fixed [ID]: min(if Year = 2013 then 1 END)}
end
This way the In Selection PY variable is dynamically calculated according to the country filter (the context filter is applied before the FIXED expression is computed).
It is only necessary to know in advance which years are stored in the dataset (or add more years to be safe).
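For anyone reproducing the same logic at the database layer, here is a hedged SQL sketch of the aggregation; the table name user_visits and the hard-coded country list stand in for the dataset and the filter selection:
SELECT level,
       COUNT(*) AS number_of_records,
       COUNT(*) FILTER (WHERE in_selection_py) AS in_selection_py
FROM (
  SELECT v.level,
         EXISTS (SELECT 1
                 FROM user_visits p
                 WHERE p.id = v.id
                   AND p.year = v.year - 1
                   AND p.country IN ('China', 'USA')) AS in_selection_py  -- was the user in the selection the previous year?
  FROM user_visits v
  WHERE v.country IN ('China', 'USA')                                    -- the country filter
) x
GROUP BY level
ORDER BY level;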

T-SQL -> filter data 6 months from today, date field in table is YYYYMM

I need some help.
It is currently March 2017.
How do I extract all records from six months ago (i.e., from September 2016) until the end of this year? The dates in my table are stored in YYYYMM format.
Here is my SQL statement:
select columns from budget
where month_number >= DATEADD(MONTH, -6, CURRENT_TIMESTAMP);
the output I am getting is as below:
+------------+-------+--------------+
| month_name | month | month_number |
+------------+-------+--------------+
| January | 1 | 201601 |
| February | 2 | 201602 |
| March | 3 | 201603 |
| April | 4 | 201604 |
| May | 5 | 201605 |
| June | 6 | 201606 |
| July | 7 | 201607 |
| August | 8 | 201608 |
| September | 9 | 201609 |
| October | 10 | 201610 |
| November | 11 | 201611 |
| December | 12 | 201612 |
| January | 1 | 201701 |
| February | 2 | 201702 |
| March | 3 | 201703 |
| April | 4 | 201704 |
| July | 7 | 201707 |
| December | 12 | 201712 |
+------------+-------+--------------+
I am not getting the right output. I am still getting data from Jan 2016 onwards. Please help
Thanks
Alternatively...
Your original filter fails because month_number is an int: when an int is compared to a datetime, SQL Server converts the int to a datetime (interpreting it as days since 1900-01-01), so every YYYYMM value lands centuries in the future and passes the test. Build the cutoff in YYYYMM form instead:
declare @budget table (month_number int)
insert @budget (month_number)
values (201601), (201602), (201702), (201705), (201709)
select * from @budget
where month_number >= (YEAR(DATEADD(MONTH, -6, CURRENT_TIMESTAMP)) * 100)
                    + MONTH(DATEADD(MONTH, -6, CURRENT_TIMESTAMP));
Select *
From Budget
Where month_number>= convert(varchar(6),DATEADD(MONTH, -6, CURRENT_TIMESTAMP),112)
Order By month_number
If SQL Server 2012+, you can use FORMAT:
Select *
From Budget
Where month_number>= format(DATEADD(MONTH, -6, CURRENT_TIMESTAMP),'yyyyMM')
Order By month_number
Returns
month_name month month_number
September 9 201609
October 10 201610
November 11 201611
December 12 201612
January 1 201701
February 2 201702
March 3 201703
April 4 201704
July 7 201707
December 12 201712
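A minimal sketch combining the same arithmetic into a variable, so the cutoff is computed once and the comparison stays int-to-int (the variable name @cutoff is my own):
declare @cutoff int = YEAR(DATEADD(MONTH, -6, CURRENT_TIMESTAMP)) * 100
                    + MONTH(DATEADD(MONTH, -6, CURRENT_TIMESTAMP));

select *
from Budget
where month_number >= @cutoff
order by month_number;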

Window function to achieve running sum that resets in Postgres SQL [duplicate]

I wrote a query that creates two columns: the_day, and the amount_raised on that day. I would like to add a column that has a running sum of amount_raised.
Ultimately, I would like the sum column to reset after it reaches 1 million.
The recursive approach is above my pay grade, so if anyone knows a way to reset the sum without creating an entirely new table, please comment (maybe with a RESET function?). Thank you
I'd like to thank Juan Carlos Oropeza for providing a script and SQLFiddle with the test data. George, you should have done that.
The query itself is rather simple.
First, calculate a simple running sum (CTE_RunningSum) and divide it by 1,000,000 to get the number of whole millions.
Then calculate the running sum again, this time partitioned by the number of millions.
SQL Fiddle
I included the columns RunningSum and Millions in the final result to illustrate how the query works. For example, at id 31 (2015-07-10) the running sum crosses 1,000,000 (RunningSum = 1,028,544), so Millions becomes 1 and the Result column restarts at 39,649.
WITH
CTE_RunningSum
AS
(
SELECT
ID
,day_t
,collect
,SUM(collect) OVER(ORDER BY day_t, id) AS RunningSum
,(SUM(collect) OVER(ORDER BY day_t, id)) / 1000000 AS Millions
FROM myTable
)
SELECT
ID
,day_t
,collect
,RunningSum
,Millions
,SUM(collect) OVER(PARTITION BY Millions ORDER BY day_t, id) AS Result
FROM CTE_RunningSum
ORDER BY day_t, id;
Result
| id | day_t | collect | runningsum | millions | result |
|-----|-----------------------------|---------|------------|----------|---------|
| 90 | March, 11 2015 00:00:00 | 69880 | 69880 | 0 | 69880 |
| 13 | March, 25 2015 00:00:00 | 69484 | 139364 | 0 | 139364 |
| 49 | March, 27 2015 00:00:00 | 57412 | 196776 | 0 | 196776 |
| 41 | March, 30 2015 00:00:00 | 56404 | 253180 | 0 | 253180 |
| 99 | April, 03 2015 00:00:00 | 59426 | 312606 | 0 | 312606 |
| 1 | April, 10 2015 00:00:00 | 65825 | 378431 | 0 | 378431 |
| 100 | April, 27 2015 00:00:00 | 60884 | 439315 | 0 | 439315 |
| 50 | May, 11 2015 00:00:00 | 39641 | 478956 | 0 | 478956 |
| 58 | May, 11 2015 00:00:00 | 49759 | 528715 | 0 | 528715 |
| 51 | May, 17 2015 00:00:00 | 32895 | 561610 | 0 | 561610 |
| 15 | May, 19 2015 00:00:00 | 50847 | 612457 | 0 | 612457 |
| 66 | May, 29 2015 00:00:00 | 66332 | 678789 | 0 | 678789 |
| 4 | June, 04 2015 00:00:00 | 46891 | 725680 | 0 | 725680 |
| 38 | June, 09 2015 00:00:00 | 64732 | 790412 | 0 | 790412 |
| 79 | June, 14 2015 00:00:00 | 62843 | 853255 | 0 | 853255 |
| 37 | June, 28 2015 00:00:00 | 54315 | 907570 | 0 | 907570 |
| 59 | June, 30 2015 00:00:00 | 34885 | 942455 | 0 | 942455 |
| 71 | July, 08 2015 00:00:00 | 46440 | 988895 | 0 | 988895 |
| 31 | July, 10 2015 00:00:00 | 39649 | 1028544 | 1 | 39649 |
| 91 | July, 12 2015 00:00:00 | 65048 | 1093592 | 1 | 104697 |
| 57 | July, 14 2015 00:00:00 | 60394 | 1153986 | 1 | 165091 |
| 98 | July, 20 2015 00:00:00 | 34481 | 1188467 | 1 | 199572 |
| 3 | July, 26 2015 00:00:00 | 58672 | 1247139 | 1 | 258244 |
| 95 | August, 19 2015 00:00:00 | 52393 | 1299532 | 1 | 310637 |
| 74 | August, 20 2015 00:00:00 | 37972 | 1337504 | 1 | 348609 |
| 20 | August, 27 2015 00:00:00 | 36882 | 1374386 | 1 | 385491 |
| 2 | September, 07 2015 00:00:00 | 39408 | 1413794 | 1 | 424899 |
| 14 | September, 09 2015 00:00:00 | 40234 | 1454028 | 1 | 465133 |
| 6 | September, 17 2015 00:00:00 | 65957 | 1519985 | 1 | 531090 |
| 93 | September, 29 2015 00:00:00 | 47213 | 1567198 | 1 | 578303 |
| 35 | September, 30 2015 00:00:00 | 49446 | 1616644 | 1 | 627749 |
| 86 | October, 11 2015 00:00:00 | 34291 | 1650935 | 1 | 662040 |
| 75 | October, 12 2015 00:00:00 | 31448 | 1682383 | 1 | 693488 |
| 19 | October, 14 2015 00:00:00 | 48509 | 1730892 | 1 | 741997 |
| 56 | October, 26 2015 00:00:00 | 30072 | 1760964 | 1 | 772069 |
| 48 | October, 28 2015 00:00:00 | 58527 | 1819491 | 1 | 830596 |
| 40 | November, 05 2015 00:00:00 | 67293 | 1886784 | 1 | 897889 |
| 33 | November, 09 2015 00:00:00 | 41944 | 1928728 | 1 | 939833 |
| 34 | November, 11 2015 00:00:00 | 35516 | 1964244 | 1 | 975349 |
| 85 | November, 20 2015 00:00:00 | 43920 | 2008164 | 2 | 43920 |
| 18 | November, 23 2015 00:00:00 | 44925 | 2053089 | 2 | 88845 |
| 62 | December, 24 2015 00:00:00 | 34678 | 2087767 | 2 | 123523 |
| 67 | December, 25 2015 00:00:00 | 35323 | 2123090 | 2 | 158846 |
| 81 | December, 28 2015 00:00:00 | 37071 | 2160161 | 2 | 195917 |
| 54 | January, 02 2016 00:00:00 | 32330 | 2192491 | 2 | 228247 |
| 70 | January, 06 2016 00:00:00 | 47875 | 2240366 | 2 | 276122 |
| 28 | January, 23 2016 00:00:00 | 40250 | 2280616 | 2 | 316372 |
| 65 | January, 25 2016 00:00:00 | 49404 | 2330020 | 2 | 365776 |
| 73 | January, 26 2016 00:00:00 | 65879 | 2395899 | 2 | 431655 |
| 5 | February, 05 2016 00:00:00 | 53953 | 2449852 | 2 | 485608 |
| 32 | February, 11 2016 00:00:00 | 44988 | 2494840 | 2 | 530596 |
| 53 | February, 25 2016 00:00:00 | 68948 | 2563788 | 2 | 599544 |
| 83 | March, 11 2016 00:00:00 | 47244 | 2611032 | 2 | 646788 |
| 8 | March, 25 2016 00:00:00 | 51809 | 2662841 | 2 | 698597 |
| 82 | March, 25 2016 00:00:00 | 66506 | 2729347 | 2 | 765103 |
| 88 | April, 06 2016 00:00:00 | 69288 | 2798635 | 2 | 834391 |
| 89 | April, 14 2016 00:00:00 | 43162 | 2841797 | 2 | 877553 |
| 52 | April, 23 2016 00:00:00 | 47772 | 2889569 | 2 | 925325 |
| 7 | April, 27 2016 00:00:00 | 33368 | 2922937 | 2 | 958693 |
| 84 | April, 27 2016 00:00:00 | 57644 | 2980581 | 2 | 1016337 |
| 17 | May, 17 2016 00:00:00 | 35416 | 3015997 | 3 | 35416 |
| 61 | May, 17 2016 00:00:00 | 64603 | 3080600 | 3 | 100019 |
| 87 | June, 07 2016 00:00:00 | 41865 | 3122465 | 3 | 141884 |
| 97 | June, 08 2016 00:00:00 | 64982 | 3187447 | 3 | 206866 |
| 92 | June, 15 2016 00:00:00 | 58684 | 3246131 | 3 | 265550 |
| 23 | June, 26 2016 00:00:00 | 46147 | 3292278 | 3 | 311697 |
| 46 | June, 30 2016 00:00:00 | 61921 | 3354199 | 3 | 373618 |
| 94 | July, 03 2016 00:00:00 | 55535 | 3409734 | 3 | 429153 |
| 60 | July, 07 2016 00:00:00 | 63607 | 3473341 | 3 | 492760 |
| 45 | July, 20 2016 00:00:00 | 51965 | 3525306 | 3 | 544725 |
| 96 | July, 20 2016 00:00:00 | 46684 | 3571990 | 3 | 591409 |
| 29 | August, 09 2016 00:00:00 | 37707 | 3609697 | 3 | 629116 |
| 69 | August, 11 2016 00:00:00 | 37194 | 3646891 | 3 | 666310 |
| 80 | August, 19 2016 00:00:00 | 62673 | 3709564 | 3 | 728983 |
| 36 | August, 28 2016 00:00:00 | 48237 | 3757801 | 3 | 777220 |
| 39 | August, 29 2016 00:00:00 | 48159 | 3805960 | 3 | 825379 |
| 25 | August, 30 2016 00:00:00 | 60958 | 3866918 | 3 | 886337 |
| 68 | September, 04 2016 00:00:00 | 50167 | 3917085 | 3 | 936504 |
| 55 | September, 08 2016 00:00:00 | 31193 | 3948278 | 3 | 967697 |
| 64 | September, 10 2016 00:00:00 | 31157 | 3979435 | 3 | 998854 |
| 42 | September, 14 2016 00:00:00 | 52878 | 4032313 | 4 | 52878 |
| 43 | September, 15 2016 00:00:00 | 54728 | 4087041 | 4 | 107606 |
| 77 | September, 18 2016 00:00:00 | 65320 | 4152361 | 4 | 172926 |
| 12 | September, 23 2016 00:00:00 | 43597 | 4195958 | 4 | 216523 |
| 30 | September, 26 2016 00:00:00 | 32764 | 4228722 | 4 | 249287 |
| 10 | September, 27 2016 00:00:00 | 47038 | 4275760 | 4 | 296325 |
| 47 | October, 08 2016 00:00:00 | 46280 | 4322040 | 4 | 342605 |
| 26 | October, 10 2016 00:00:00 | 69487 | 4391527 | 4 | 412092 |
| 63 | October, 30 2016 00:00:00 | 49561 | 4441088 | 4 | 461653 |
| 78 | November, 15 2016 00:00:00 | 40138 | 4481226 | 4 | 501791 |
| 27 | November, 27 2016 00:00:00 | 57378 | 4538604 | 4 | 559169 |
| 21 | December, 01 2016 00:00:00 | 35336 | 4573940 | 4 | 594505 |
| 16 | December, 03 2016 00:00:00 | 39671 | 4613611 | 4 | 634176 |
| 22 | December, 13 2016 00:00:00 | 34574 | 4648185 | 4 | 668750 |
| 72 | January, 29 2017 00:00:00 | 55084 | 4703269 | 4 | 723834 |
| 44 | January, 30 2017 00:00:00 | 36742 | 4740011 | 4 | 760576 |
| 24 | February, 01 2017 00:00:00 | 31061 | 4771072 | 4 | 791637 |
| 76 | February, 12 2017 00:00:00 | 35059 | 4806131 | 4 | 826696 |
| 9 | February, 27 2017 00:00:00 | 39767 | 4845898 | 4 | 866463 |
| 11 | February, 28 2017 00:00:00 | 66007 | 4911905 | 4 | 932470 |
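To see the partitioning trick in miniature, here is a self-contained sketch using a reset limit of 10 instead of 1,000,000:
WITH data(day_t, collect) AS (
  VALUES (1, 4), (2, 5), (3, 3), (4, 6), (5, 2)
),
running AS (
  SELECT day_t, collect,
         SUM(collect) OVER (ORDER BY day_t) AS runningsum,
         SUM(collect) OVER (ORDER BY day_t) / 10 AS tens   -- integer division: number of whole tens
  FROM data
)
SELECT day_t, collect, runningsum, tens,
       SUM(collect) OVER (PARTITION BY tens ORDER BY day_t) AS result
FROM running
ORDER BY day_t;
-- runningsum: 4, 9, 12, 18, 20  ->  tens: 0, 0, 1, 1, 2
-- result:     4, 9 | 3, 9 | 2   (the sum restarts whenever a new "ten" is reached)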
I took another look and couldn't solve it with a window function, so I took the recursive approach.
SQL Fiddle Demo
Sample data: 100 rows, random dates between 2015 and 2017, amounts between 10k and 70k.
DROP TABLE IF EXISTS "myTable";
CREATE TABLE "myTable" (
id SERIAL PRIMARY KEY,
day_t varchar(255),
collect integer NULL
);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-04-10',65825),('2015-09-07',39408),('2015-07-26',58672),('2015-06-04',46891),('2016-02-05',53953),('2015-09-17',65957),('2016-04-27',33368),('2016-03-25',51809),('2017-02-27',39767),('2016-09-27',47038);
INSERT INTO "myTable" (day_t,collect) VALUES ('2017-02-28',66007),('2016-09-23',43597),('2015-03-25',69484),('2015-09-09',40234),('2015-05-19',50847),('2016-12-03',39671),('2016-05-17',35416),('2015-11-23',44925),('2015-10-14',48509),('2015-08-27',36882);
INSERT INTO "myTable" (day_t,collect) VALUES ('2016-12-01',35336),('2016-12-13',34574),('2016-06-26',46147),('2017-02-01',31061),('2016-08-30',60958),('2016-10-10',69487),('2016-11-27',57378),('2016-01-23',40250),('2016-08-09',37707),('2016-09-26',32764);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-07-10',39649),('2016-02-11',44988),('2015-11-09',41944),('2015-11-11',35516),('2015-09-30',49446),('2016-08-28',48237),('2015-06-28',54315),('2015-06-09',64732),('2016-08-29',48159),('2015-11-05',67293);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-03-30',56404),('2016-09-14',52878),('2016-09-15',54728),('2017-01-30',36742),('2016-07-20',51965),('2016-06-30',61921),('2016-10-08',46280),('2015-10-28',58527),('2015-03-27',57412),('2015-05-11',39641);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-05-17',32895),('2016-04-23',47772),('2016-02-25',68948),('2016-01-02',32330),('2016-09-08',31193),('2015-10-26',30072),('2015-07-14',60394),('2015-05-11',49759),('2015-06-30',34885),('2016-07-07',63607);
INSERT INTO "myTable" (day_t,collect) VALUES ('2016-05-17',64603),('2015-12-24',34678),('2016-10-30',49561),('2016-09-10',31157),('2016-01-25',49404),('2015-05-29',66332),('2015-12-25',35323),('2016-09-04',50167),('2016-08-11',37194),('2016-01-06',47875);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-07-08',46440),('2017-01-29',55084),('2016-01-26',65879),('2015-08-20',37972),('2015-10-12',31448),('2017-02-12',35059),('2016-09-18',65320),('2016-11-15',40138),('2015-06-14',62843),('2016-08-19',62673);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-12-28',37071),('2016-03-25',66506),('2016-03-11',47244),('2016-04-27',57644),('2015-11-20',43920),('2015-10-11',34291),('2016-06-07',41865),('2016-04-06',69288),('2016-04-14',43162),('2015-03-11',69880);
INSERT INTO "myTable" (day_t,collect) VALUES ('2015-07-12',65048),('2016-06-15',58684),('2015-09-29',47213),('2016-07-03',55535),('2015-08-19',52393),('2016-07-20',46684),('2016-06-08',64982),('2015-07-20',34481),('2015-04-03',59426),('2015-04-27',60884);
Create a row_number first; the recursion needs consecutive IDs:
CREATE TABLE sortDates as
SELECT day_t,
collect,
row_number() over (order by day_t) rn
FROM "myTable";
Recursive Query
Note the CASE: if the previous running total m.collect is bigger than 1 million, the total resets to the current row's collect.
WITH RECURSIVE million(rn, day_t, collect) AS (
(
SELECT rn, day_t, collect
FROM sortDates
WHERE rn = 1
)
UNION
(
SELECT s.rn, s.day_t, CASE WHEN m.collect > 1000000 THEN s.collect
ELSE m.collect + s.collect
END as collect
FROM sortDates s
JOIN million m
ON s.rn = m.rn + 1
)
)
SELECT *
FROM million
WHERE collect > 1000000
Finally, the outer query keeps just the rows where the 1 million limit is broken.
OUTPUT
| rn | day_t | collect |
|----|------------|---------|
| 19 | 2015-07-10 | 1028544 |
| 41 | 2015-11-23 | 1024545 |
| 62 | 2016-05-17 | 1027511 |
| 82 | 2016-09-15 | 1006441 |