PostgreSQL Crosstab Query With Changing Rows - postgresql

Can someone please help me put this query together?
I have this table:
store name   | status       | orders
-------------+--------------+-------
billys store | new          |     15
billys store | ordered      |     20
billys store | canceled     |      2
johnny store | new          |      5
johnny store | out_of_stock |     20
rosie store  | new          |      6
rosie store  | ordered      |      4
rosie store  | out_of_stock |     10
So as you can see, some stores have some statuses that others don't.
My desired result is the following:
store name   | new | ordered | canceled | out of stock
-------------+-----+---------+----------+-------------
billys store |  15 |      20 |        2 |            0
johnny store |   5 |       0 |        0 |           20
rosie store  |   6 |       4 |        0 |           10
I have tried the following:
SELECT * FROM crosstab(
'SELECT store_name::text as store_name,
status::text as status,
count(*)::int as orders
FROM organizations
INNER JOIN orders ON organization_id = organizations.id
GROUP BY store_name, status
ORDER BY store_name, status'
) x (store_name text, "new" int, "ordered" int)
But this doesn't work: it breaks whenever the next row's status is not the expected value. For example, with 'johnny store', the status after 'new' is not 'ordered' but 'out_of_stock', so the values land in the wrong columns.
I've looked through a bunch of StackOverflow posts but I'm just overall pretty confused. Thank you

We can do this using CASE expressions (conditional aggregation), which avoids sub-queries and the tablefunc extension.
CREATE TABLE organisation (
store_name VARCHAR(25),
status VARCHAR(25),
orders INT);
INSERT INTO organisation VALUES
('billys store', 'new' , 15),
('billys store', 'ordered' , 20),
('billys store', 'canceled' , 2),
('johnny store', 'new' , 5),
('johnny store', 'out_of_stock', 20),
('rosie store' , 'new' , 6),
('rosie store' , 'ordered' , 4),
('rosie store' , 'out_of_stock', 10);
8 rows affected
SELECT store_name,
       SUM(CASE WHEN status='new' THEN orders ELSE 0 END) new_,
       SUM(CASE WHEN status='canceled' THEN orders ELSE 0 END) canceled,
       SUM(CASE WHEN status='ordered' THEN orders ELSE 0 END) ordered,
       SUM(CASE WHEN status='out_of_stock' THEN orders ELSE 0 END) o_o_s
FROM organisation o
GROUP BY store_name;
store_name   | new_ | canceled | ordered | o_o_s
:----------- | ---: | -------: | ------: | ----:
billys store |   15 |        2 |      20 |     0
johnny store |    5 |        0 |       0 |    20
rosie store  |    6 |        0 |       4 |    10
db<>fiddle here
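The conditional-aggregation pivot is portable across SQL engines. As a quick sanity check, here is the same query run through Python's sqlite3 module (sqlite3 is used purely as a stand-in for PostgreSQL here; the SQL itself is unchanged):

```python
import sqlite3

# In-memory database standing in for PostgreSQL; the pivot SQL is portable.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE organisation (store_name TEXT, status TEXT, orders INT)")
conn.executemany(
    "INSERT INTO organisation VALUES (?, ?, ?)",
    [("billys store", "new", 15), ("billys store", "ordered", 20),
     ("billys store", "canceled", 2), ("johnny store", "new", 5),
     ("johnny store", "out_of_stock", 20), ("rosie store", "new", 6),
     ("rosie store", "ordered", 4), ("rosie store", "out_of_stock", 10)])

# One output column per status; statuses a store lacks fall through to ELSE 0.
rows = conn.execute("""
    SELECT store_name,
           SUM(CASE WHEN status = 'new'          THEN orders ELSE 0 END),
           SUM(CASE WHEN status = 'ordered'      THEN orders ELSE 0 END),
           SUM(CASE WHEN status = 'canceled'     THEN orders ELSE 0 END),
           SUM(CASE WHEN status = 'out_of_stock' THEN orders ELSE 0 END)
    FROM organisation
    GROUP BY store_name
    ORDER BY store_name
""").fetchall()
```

Unlike crosstab, this never depends on which statuses appear for which store: missing statuses simply sum to 0.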

It may not have been clear from the link I provided, but the tablefunc extension makes this much easier IMHO. Here is a sample based on your code; you would replace the first query with one that reads from your own tables:
create temporary table myTable (storename text, status text, orders int);
insert into myTable (storename, status, orders)
values
('billys store','new', 15),
('billys store','ordered', 20),
('billys store','canceled', 2),
('johnny store','new', 5),
('johnny store','out_of_stock', 20),
('rosie store','new', 6),
('rosie store','ordered', 4),
('rosie store','out_of_stock', 10);
SELECT * FROM crosstab(
'SELECT storename,
status,
orders
FROM myTable',
'select * from unnest(string_to_array(''new,ordered,canceled,out_of_stock'', '',''))'
) x (storename text, "new" int, "ordered" int, "canceled" int, "out_of_stock" int);
drop table myTable;
Here is DBFiddle demo

Related

Summary for duration per month for each ID for TSQL / sql server 2016

I have a table (##table) with an ID and its status for each day. I need a summary per month(!) of how many days each ID was in which status (##table_result). See below.
This was my approach, but it does not work. How can I sum up the days per ID, per status, for each month?
select item, Cur_Status, convert(varchar(7), s_date, 126) as YM_S_Date,
       lag(s_date) over (order by month(s_date) asc) as Start_date,
       s_date,
       datediff(day, lag(s_date) over (order by month(s_date) asc), s_date) as duration
from ##table
order by item, Start_date
Data:
create table ##table (item nvarchar(30), S_date date, Cur_Status nvarchar(30));
insert into ##table values
('A','2022/01/01','AA'),
('A','2022/01/02','AA'),
('A','2022/01/03','AA'),
('A','2022/01/04','BB'),
('A','2022/01/05','BB'),
('A','2022/01/06','BB'),
('A','2022/01/07','AA'),
('A','2022/01/08','AA'),
('A','2022/01/09','AA'),
('A','2022/01/10','AA'),
('A','2022/01/11','AA'),
('A','2022/01/12','AA'),
('A','2022/01/13','CC'),
('A','2022/01/14','CC'),
('A','2022/01/15','AA'),
('A','2022/01/16','DD'),
('A','2022/01/17','DD'),
('A','2022/01/18','DD'),
('A','2022/01/19','EE'),
('A','2022/01/20','AA'),
('A','2022/01/21','BB'),
('A','2022/01/22','FF'),
('A','2022/01/23','FF'),
('A','2022/01/24','FF'),
('A','2022/01/25','FF'),
('A','2022/01/26','AA'),
('A','2022/01/27','AA'),
('A','2022/01/28','AA'),
('A','2022/01/29','AA'),
('A','2022/01/30','AA'),
('A','2022/01/31','AA'),
('A','2022/02/01','AA'),
('A','2022/02/02','AA'),
('A','2022/02/03','AA'),
('A','2022/02/04','AA'),
('A','2022/02/05','AA'),
('A','2022/02/06','BB'),
('A','2022/02/07','AA'),
('A','2022/02/08','AA'),
('A','2022/02/09','AA'),
('A','2022/02/10','AA'),
('A','2022/02/11','AA'),
('A','2022/02/12','AA'),
('A','2022/02/13','CC'),
('A','2022/02/14','CC'),
('A','2022/02/15','AA'),
('A','2022/02/16','DD'),
('A','2022/02/17','DD'),
('A','2022/02/18','DD'),
('A','2022/02/19','EE'),
('A','2022/02/20','AA'),
('A','2022/02/21','BB'),
('A','2022/02/22','AA'),
('A','2022/02/23','AA'),
('A','2022/02/24','AA'),
('A','2022/02/25','FF'),
('A','2022/02/26','AA'),
('A','2022/02/27','AA'),
('A','2022/02/28','AA'),
('A','2022/03/01','AA'),
('A','2022/03/02','AA'),
('A','2022/03/03','BB'),
('A','2022/03/04','AA'),
('B','2022/01/01','AA'),
('B','2022/01/02','AA'),
('B','2022/01/03','AA'),
('B','2022/01/04','BB'),
('B','2022/01/05','BB'),
('B','2022/01/06','BB'),
('B','2022/01/07','AA'),
('B','2022/01/08','AA'),
('B','2022/01/09','AA'),
('B','2022/01/10','AA'),
('B','2022/01/11','AA'),
('B','2022/01/12','AA'),
('B','2022/01/13','AA'),
('B','2022/01/14','AA'),
('B','2022/01/15','AA'),
('B','2022/01/16','AA'),
('B','2022/01/17','AA'),
('B','2022/01/18','AA'),
('B','2022/01/19','AA'),
('B','2022/01/20','AA'),
('B','2022/01/21','AA'),
('B','2022/01/22','AA'),
('B','2022/01/23','AA'),
('B','2022/01/24','AA'),
('B','2022/01/25','AA'),
('B','2022/01/26','AA'),
('B','2022/01/27','AA'),
('B','2022/01/28','AA'),
('B','2022/01/29','AA'),
('B','2022/01/30','AA'),
('B','2022/01/31','AA'),
('B','2022/02/01','AA'),
('B','2022/02/02','AA'),
('B','2022/02/03','AA'),
('B','2022/02/04','FF'),
('B','2022/02/05','FF'),
('B','2022/02/06','FF'),
('B','2022/02/07','AA'),
('B','2022/02/08','AA'),
('B','2022/02/09','AA'),
('B','2022/02/10','AA'),
('B','2022/02/11','AA'),
('B','2022/02/12','AA'),
('B','2022/02/13','CC'),
('B','2022/02/14','CC'),
('B','2022/02/15','AA'),
('B','2022/02/16','DD'),
('B','2022/02/17','DD'),
('B','2022/02/18','DD'),
('B','2022/02/19','EE'),
('B','2022/02/20','AA'),
('B','2022/02/21','AA'),
('B','2022/02/22','AA'),
('B','2022/02/23','AA'),
('B','2022/02/24','AA'),
('B','2022/02/25','FF'),
('B','2022/02/26','AA'),
('B','2022/02/27','AA'),
('B','2022/02/28','AA'),
('B','2022/03/01','BB'),
('B','2022/03/02','AA'),
('B','2022/03/03','AA'),
('B','2022/03/04','AA'),
('B','2022/03/05','AA'),
('B','2022/03/06','AA'),
('B','2022/03/07','AA'),
('B','2022/03/08','AA'),
('B','2022/03/09','BB'),
('B','2022/03/10','BB'),
('B','2022/03/11','BB'),
('B','2022/03/12','BB'),
('B','2022/03/13','BB'),
('B','2022/03/14','AA'),
('B','2022/03/15','AA'),
('B','2022/03/16','AA'),
('B','2022/03/17','AA'),
('B','2022/03/18','AA'),
('B','2022/03/19','DD'),
('B','2022/03/20','DD'),
('B','2022/03/21','AA'),
('B','2022/03/22','AA'),
('B','2022/03/23','AA'),
('B','2022/03/24','AA'),
('B','2022/03/25','BB'),
('B','2022/03/26','AA'),
('B','2022/03/27','AA'),
('B','2022/03/28','BB'),
('B','2022/03/30','AA'),
('B','2022/03/31','BB'),
('B','2022/04/01','BB'),
('B','2022/04/02','BB'),
('B','2022/04/04','BB'),
('C','2022/04/04','BB'),
('C','2022/04/05','BB'),
('C','2022/04/06','BB'),
('C','2022/04/07','AA'),
('C','2022/04/08','AA'),
('C','2022/04/09','AA'),
('C','2022/04/10','AA'),
('C','2022/04/11','AA'),
('C','2022/04/12','AA'),
('C','2022/04/13','CC'),
('C','2022/04/14','CC'),
('E','2022/04/15','AA'),
('E','2022/04/16','DD'),
('E','2022/04/17','DD'),
('E','2022/04/18','DD'),
('E','2022/04/19','EE'),
('E','2022/04/20','AA'),
('E','2022/04/21','BB'),
('E','2022/04/22','FF'),
('E','2022/04/23','FF'),
('E','2022/04/24','FF'),
('E','2022/04/25','FF'),
('E','2022/04/26','AA'),
('E','2022/04/27','AA'),
('E','2022/04/28','AA'),
('E','2022/04/29','AA'),
('E','2022/04/30','FF'),
('E','2022/05/01','FF'),
('E','2022/05/01','FF')
;
select * from ##table order by item, S_date
Expected result:
create table ##table_Result (item nvarchar(30), Start_date date, End_date date, Cur_Status nvarchar(30), Duration int);
insert into ##table_result values
('A','2022/01/01','2022/01/03','AA','3' ),
('A','2022/01/04','2022/01/06','BB','3' ),
('A','2022/01/07','2022/01/12','AA','4' ),
('A','2022/01/13','2022/01/14','CC','2' ),
('A','2022/01/15','2022/01/15','AA','2' ),
('A','2022/01/16','2022/01/18','DD','2' ),
('A','2022/01/19','2022/01/19','EE','1' ),
('A','2022/01/20','2022/01/20','AA','1' ),
('A','2022/01/21','2022/01/21','BB','1' ),
('A','2022/01/22','2022/01/25','FF','4' ),
('A','2022/01/26','2022/01/31','AA','6' ),
('A','2022/02/01','2022/02/05','AA','5' ),
('A','2022/02/06','2022/02/06','BB','5' ),
('A','2022/02/07','2022/02/12','AA','6' ),
('A','2022/02/13','2022/02/14','CC','2' ),
('A','2022/02/15','2022/02/15','AA','1' ),
('A','2022/02/16','2022/02/18','DD','3' ),
('A','2022/02/19','2022/02/19','EE','1' ),
('A','2022/02/20','2022/02/20','AA','1' ),
('A','2022/02/21','2022/02/21','BB','1' ),
('A','2022/02/22','2022/02/24','AA','1' ),
('A','2022/02/25','2022/02/25','FF','1' ),
('A','2022/02/26','2022/02/28','AA','3' ),
('A','2022/03/01','2022/03/02','AA','2' ),
('A','2022/03/03','2022/03/03','BB','2' ),
('A','2022/03/04','2022/03/04','AA','2' ),
('B','2022/01/01','2022/01/02','AA','2' ),
('B','2022/01/03','2022/01/03','AA','1' ),
('B','2022/01/04','2022/01/06','BB','2' ),
('B','2022/01/07','2022/01/31','AA','25'),
('B','2022/02/01','2022/01/03','AA','3')
Yikes, unless you're really sure you need a global temporary table, you likely should not be using one at all.
Here's a good way to present your demo data and tables:
CREATE TABLE #table (item NVARCHAR(30), S_date DATE, Cur_Status NVARCHAR(30));
INSERT INTO #table (item, S_date, Cur_Status) VALUES
('A','2022/01/01','AA'), ('A','2022/01/02','AA'), ('A','2022/01/03','AA'), ('A','2022/01/04','BB'), ('A','2022/01/05','BB'), ('A','2022/01/06','BB'), ('A','2022/01/07','AA'), ('A','2022/01/08','AA'), ('A','2022/01/09','AA'), ('A','2022/01/10','AA'),
('A','2022/01/11','AA'), ('A','2022/01/12','AA'), ('A','2022/01/13','CC'), ('A','2022/01/14','CC'), ('A','2022/01/15','AA'), ('A','2022/01/16','DD'), ('A','2022/01/17','DD'), ('A','2022/01/18','DD'), ('A','2022/01/19','EE'), ('A','2022/01/20','AA'),
('A','2022/01/21','BB'), ('A','2022/01/22','FF'), ('A','2022/01/23','FF'), ('A','2022/01/24','FF'), ('A','2022/01/25','FF'), ('A','2022/01/26','AA'), ('A','2022/01/27','AA'), ('A','2022/01/28','AA'), ('A','2022/01/29','AA'), ('A','2022/01/30','AA'),
('A','2022/01/31','AA'), ('A','2022/02/01','AA'), ('A','2022/02/02','AA'), ('A','2022/02/03','AA'), ('A','2022/02/04','AA'), ('A','2022/02/05','AA'), ('A','2022/02/06','BB'), ('A','2022/02/07','AA'), ('A','2022/02/08','AA'), ('A','2022/02/09','AA'),
('A','2022/02/10','AA'), ('A','2022/02/11','AA'), ('A','2022/02/12','AA'), ('A','2022/02/13','CC'), ('A','2022/02/14','CC'), ('A','2022/02/15','AA'), ('A','2022/02/16','DD'), ('A','2022/02/17','DD'), ('A','2022/02/18','DD'), ('A','2022/02/19','EE'),
('A','2022/02/20','AA'), ('A','2022/02/21','BB'), ('A','2022/02/22','AA'), ('A','2022/02/23','AA'), ('A','2022/02/24','AA'), ('A','2022/02/25','FF'), ('A','2022/02/26','AA'), ('A','2022/02/27','AA'), ('A','2022/02/28','AA'), ('A','2022/03/01','AA'),
('A','2022/03/02','AA'), ('A','2022/03/03','BB'), ('A','2022/03/04','AA'), ('B','2022/01/01','AA'), ('B','2022/01/02','AA'), ('B','2022/01/03','AA'), ('B','2022/01/04','BB'), ('B','2022/01/05','BB'), ('B','2022/01/06','BB'), ('B','2022/01/07','AA'),
('B','2022/01/08','AA'), ('B','2022/01/09','AA'), ('B','2022/01/10','AA'), ('B','2022/01/11','AA'), ('B','2022/01/12','AA'), ('B','2022/01/13','AA'), ('B','2022/01/14','AA'), ('B','2022/01/15','AA'), ('B','2022/01/16','AA'), ('B','2022/01/17','AA'),
('B','2022/01/18','AA'), ('B','2022/01/19','AA'), ('B','2022/01/20','AA'), ('B','2022/01/21','AA'), ('B','2022/01/22','AA'), ('B','2022/01/23','AA'), ('B','2022/01/24','AA'), ('B','2022/01/25','AA'), ('B','2022/01/26','AA'), ('B','2022/01/27','AA'),
('B','2022/01/28','AA'), ('B','2022/01/29','AA'), ('B','2022/01/30','AA'), ('B','2022/01/31','AA'), ('B','2022/02/01','AA'), ('B','2022/02/02','AA'), ('B','2022/02/03','AA'), ('B','2022/02/04','FF'), ('B','2022/02/05','FF'), ('B','2022/02/06','FF'),
('B','2022/02/07','AA'), ('B','2022/02/08','AA'), ('B','2022/02/09','AA'), ('B','2022/02/10','AA'), ('B','2022/02/11','AA'), ('B','2022/02/12','AA'), ('B','2022/02/13','CC'), ('B','2022/02/14','CC'), ('B','2022/02/15','AA'), ('B','2022/02/16','DD'),
('B','2022/02/17','DD'), ('B','2022/02/18','DD'), ('B','2022/02/19','EE'), ('B','2022/02/20','AA'), ('B','2022/02/21','AA'), ('B','2022/02/22','AA'), ('B','2022/02/23','AA'), ('B','2022/02/24','AA'), ('B','2022/02/25','FF'), ('B','2022/02/26','AA'),
('B','2022/02/27','AA'), ('B','2022/02/28','AA'), ('B','2022/03/01','BB'), ('B','2022/03/02','AA'), ('B','2022/03/03','AA'), ('B','2022/03/04','AA'), ('B','2022/03/05','AA'), ('B','2022/03/06','AA'), ('B','2022/03/07','AA'), ('B','2022/03/08','AA'),
('B','2022/03/09','BB'), ('B','2022/03/10','BB'), ('B','2022/03/11','BB'), ('B','2022/03/12','BB'), ('B','2022/03/13','BB'), ('B','2022/03/14','AA'), ('B','2022/03/15','AA'), ('B','2022/03/16','AA'), ('B','2022/03/17','AA'), ('B','2022/03/18','AA'),
('B','2022/03/19','DD'), ('B','2022/03/20','DD'), ('B','2022/03/21','AA'), ('B','2022/03/22','AA'), ('B','2022/03/23','AA'), ('B','2022/03/24','AA'), ('B','2022/03/25','BB'), ('B','2022/03/26','AA'), ('B','2022/03/27','AA'), ('B','2022/03/28','BB'),
('B','2022/03/30','AA'), ('B','2022/03/31','BB'), ('B','2022/04/01','BB'), ('B','2022/04/02','BB'), ('B','2022/04/04','BB'), ('C','2022/04/04','BB'), ('C','2022/04/05','BB'), ('C','2022/04/06','BB'), ('C','2022/04/07','AA'), ('C','2022/04/08','AA'),
('C','2022/04/09','AA'), ('C','2022/04/10','AA'), ('C','2022/04/11','AA'), ('C','2022/04/12','AA'), ('C','2022/04/13','CC'), ('C','2022/04/14','CC'), ('E','2022/04/15','AA'), ('E','2022/04/16','DD'), ('E','2022/04/17','DD'), ('E','2022/04/18','DD'),
('E','2022/04/19','EE'), ('E','2022/04/20','AA'), ('E','2022/04/21','BB'), ('E','2022/04/22','FF'), ('E','2022/04/23','FF'), ('E','2022/04/24','FF'), ('E','2022/04/25','FF'), ('E','2022/04/26','AA'), ('E','2022/04/27','AA'), ('E','2022/04/28','AA'),
('E','2022/04/29','AA'), ('E','2022/04/30','FF'), ('E','2022/05/01','FF'), ('E','2022/05/01','FF') ;
Now on to the answer. This looks like a recursive CTE (rCTE) to me:
;WITH base AS (
SELECT item, S_date, Cur_Status,
       LAG(Cur_Status,1) OVER (PARTITION BY item ORDER BY S_date) AS prev_Status,
       CASE WHEN Cur_Status = LAG(Cur_Status,1) OVER (PARTITION BY item ORDER BY S_date)
             AND DATEPART(MONTH,S_date) = DATEPART(MONTH,LAG(S_date,1) OVER (PARTITION BY item ORDER BY S_date))
            THEN 1 END AS Counter
FROM #table
), rCTE AS (
SELECT item, S_date, S_Date AS StartDate, Cur_Status, 1 AS Counter, S_date AS StopDate
FROM base
WHERE Counter IS NULL
UNION ALL
SELECT a.item, r.S_date, a.StartDate, a.Cur_Status, a.Counter + r.Counter, r.S_date AS StopDate
FROM rCTE a
INNER JOIN base r
ON a.item = r.item
AND a.Cur_Status = r.Cur_Status
AND a.S_date = DATEADD(DAY,-1,r.S_date)
AND r.Counter IS NOT NULL
)
SELECT item, rCTE.StartDate AS Start_date, MAX(rCTE.StopDate) AS End_Date, rCTE.Cur_Status, MAX(Counter) AS Duration
FROM rCTE
GROUP BY item, rCTE.StartDate, rCTE.Cur_Status
ORDER BY item, End_Date
OPTION (MAXRECURSION 0)
Basically what we're doing here is iterating over all the rows to make groups where they don't naturally exist.
It looks like your expected data is off too, I found this line:
item Start_date End_date Cur_Status Duration
----------------------------------------------------
A 2022-01-07 2022-01-12 AA 4
Should that not be 6?
Here's some of the example output:
item Start_date End_Date Cur_Status Duration
----------------------------------------------------
A 2022-01-01 2022-01-03 AA 3
A 2022-01-04 2022-01-06 BB 3
A 2022-01-07 2022-01-12 AA 6
A 2022-01-13 2022-01-14 CC 2
A 2022-01-15 2022-01-15 AA 1
A 2022-01-16 2022-01-18 DD 3
A 2022-01-19 2022-01-19 EE 1
A 2022-01-20 2022-01-20 AA 1
Edit:
I modified the query to take into account the end of month.
The CASE expression handling the nullable counter is now:
CASE WHEN Cur_Status = LAG(Cur_Status,1) OVER (PARTITION BY item ORDER BY S_date)
      AND DATEPART(MONTH,S_date) = DATEPART(MONTH,LAG(S_date,1) OVER (PARTITION BY item ORDER BY S_date))
     THEN 1 END
and an additional predicate was applied in the rCTE:
AND r.Counter IS NOT NULL
Example results are now:
item Start_date End_Date Cur_Status Duration
----------------------------------------------------
A 2022-01-20 2022-01-20 AA 1
A 2022-01-21 2022-01-21 BB 1
A 2022-01-22 2022-01-25 FF 4
**A 2022-01-26 2022-01-31 AA 6**
**A 2022-02-01 2022-02-05 AA 5**
A 2022-02-06 2022-02-06 BB 1
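The iterate-and-group idea behind the rCTE can be sketched outside SQL as well. A minimal Python sketch (hypothetical subset of the data; assumes rows are sorted by item and date, one row per day): `itertools.groupby` only merges *consecutive* rows, and putting the year-month in the key splits a run at month boundaries, exactly like the rCTE's month check.

```python
from itertools import groupby

# Daily status rows, sorted by item then date (a small hypothetical subset).
rows = [
    ("A", "2022-01-29", "AA"), ("A", "2022-01-30", "AA"), ("A", "2022-01-31", "AA"),
    ("A", "2022-02-01", "AA"), ("A", "2022-02-02", "BB"), ("A", "2022-02-03", "BB"),
]

# groupby merges only consecutive rows with the same key, so a status change
# or a month boundary (date[:7]) starts a new run.
summary = []
for (item, _month, status), grp in groupby(rows, key=lambda r: (r[0], r[1][:7], r[2])):
    run = list(grp)
    summary.append((item, run[0][1], run[-1][1], status, len(run)))
```

The January AA run ends at 2022-01-31 and a fresh AA run starts on 2022-02-01, matching the month-split output above.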
I am sure there is a more elegant way to do this, but it looks like a gaps-and-islands type of problem. So here is a different way to go about it:
SELECT Item, MIN(S_Date) Start_Date, MAX(S_Date) End_Date, Cur_Status,
       DATEDIFF(day, MIN(S_Date), MAX(S_Date)) + 1 Duration
FROM
(
    SELECT *, SUM(CASE WHEN Cur_Status <> LG THEN 1 ELSE 0 END)
                  OVER (PARTITION BY Item, YR, MN ORDER BY S_Date) GRP
    FROM
    (
        SELECT *, MONTH(S_Date) MN, YEAR(S_Date) YR,
               LAG(Cur_Status,1) OVER (PARTITION BY Item ORDER BY S_Date) LG
        FROM #table
    ) X
) Y
GROUP BY Item, GRP, YR, MN, Cur_Status
ORDER BY Item, Start_Date
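A runnable sketch of the same gaps-and-islands grouping, run through Python's sqlite3 module on a small subset of the data (assumptions: the bundled SQLite supports window functions, i.e. version 3.25+, and COUNT(*) stands in for DATEDIFF since there is exactly one row per day):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (item TEXT, s_date TEXT, cur_status TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?, ?)", [
    ("A", "2022-01-26", "AA"), ("A", "2022-01-27", "AA"), ("A", "2022-01-28", "AA"),
    ("A", "2022-01-29", "AA"), ("A", "2022-01-30", "AA"), ("A", "2022-01-31", "AA"),
    ("A", "2022-02-01", "AA"), ("A", "2022-02-02", "AA"), ("A", "2022-02-03", "AA"),
    ("A", "2022-02-04", "AA"), ("A", "2022-02-05", "AA"), ("A", "2022-02-06", "BB"),
])

# A status change (LAG differs) starts a new group; partitioning the running
# SUM by year-month additionally splits runs at month boundaries.
rows = conn.execute("""
    SELECT item, MIN(s_date) AS start_date, MAX(s_date) AS end_date,
           cur_status, COUNT(*) AS duration
    FROM (
        SELECT *, SUM(chg) OVER (PARTITION BY item, ym ORDER BY s_date) AS grp
        FROM (
            SELECT item, s_date, cur_status,
                   substr(s_date, 1, 7) AS ym,
                   CASE WHEN cur_status = LAG(cur_status)
                                          OVER (PARTITION BY item ORDER BY s_date)
                        THEN 0 ELSE 1 END AS chg
            FROM t
        ) x
    ) y
    GROUP BY item, ym, grp, cur_status
    ORDER BY item, start_date
""").fetchall()
```

Note how the AA run spanning January into February comes out as two rows, six days in January and five in February, which is the month split the follow-up comment asks for.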
First of all, thanks for your reply. Yes, you're right, it has to be 6.
I've tried it, but the issue is that I need the duration per month. That means that if a period (start-end) crosses a month boundary, its duration may only be counted up to the end of the month; the remainder should be added to the next month.
For example: A | 2022-01-26 | 2022-02-05 |AA
Result:
A | 2022-01-26 | 2022-01-31 | AA | 6
A | 2022-02-01 | 2022-02-05 | AA | 5

Typeorm order by after distinct on with postgresql

I have a table below:
 id | product_id | price
----+------------+-------
  1 |          1 |   100
  2 |          1 |   150
  3 |          2 |   120
  4 |          2 |   190
  5 |          3 |   100
  6 |          3 |    80
I want to select the cheapest price for each product and sort the results by price.
Expected output:
 id | product_id | price
----+------------+-------
  6 |          3 |    80
  1 |          1 |   100
  3 |          2 |   120
What I try so far:
repository.createQueryBuilder('products')
    .orderBy('products.id')
    .distinctOn(['products.id'])
    .addOrderBy('price')
This query returns the cheapest products but does not sort them, so addOrderBy has no effect. Is there a way to sort the products after distinctOn?
SELECT id,
       product_id,
       price
FROM (SELECT id,
             product_id,
             price,
             DENSE_RANK() OVER (PARTITION BY product_id
                                ORDER BY price ASC) dr
      FROM product) inline_view
WHERE dr = 1
ORDER BY price ASC;
Setup:
postgres=# create table product(id int, product_id int, price int);
CREATE TABLE
postgres=# insert into product values (1,1,100),(2,1,150),(3,2,120),(4,2,190),(5,3,100),(6,3,80);
INSERT 0 6
Output
id | product_id | price
----+------------+-------
6 | 3 | 80
1 | 1 | 100
3 | 2 | 120
(3 rows)
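The DENSE_RANK approach can be checked with Python's sqlite3 module (used here only as a convenient stand-in for PostgreSQL; SQLite also supports window functions from version 3.25 on):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (id INT, product_id INT, price INT)")
conn.executemany("INSERT INTO product VALUES (?, ?, ?)",
                 [(1, 1, 100), (2, 1, 150), (3, 2, 120),
                  (4, 2, 190), (5, 3, 100), (6, 3, 80)])

# Rank rows within each product by price; rank 1 is the cheapest row,
# then the outer ORDER BY sorts the surviving rows by price.
rows = conn.execute("""
    SELECT id, product_id, price
    FROM (SELECT id, product_id, price,
                 DENSE_RANK() OVER (PARTITION BY product_id ORDER BY price) AS dr
          FROM product) inline_view
    WHERE dr = 1
    ORDER BY price
""").fetchall()
```

One design note: DENSE_RANK keeps every row tied at the lowest price; if you want exactly one row per product even on ties, use ROW_NUMBER() instead.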

How to write a select query for displaying data on a table in another way using Postgresql?

I want to write a select query that picks data from a table shown in the first image below, PICTURE_1
1. Table containing data
and displays it like the second image, PICTURE_2
2. Result of the query
About the data: the first picture shows data logged into a table for 2 seconds from 3 IDs (1, 2 & 3), each having 2 sub-IDs (aa & bb). Values and timestamps are also displayed in the picture. The table contains only 3 columns, as shown in PICTURE_1. Could you help me write a query to display the data as shown in the second image using PostgreSQL? You can extract the ID name using a substring function. The language I'm using is plpgsql. Any ideas/logic would also be welcome. Thank you for your time.
Please try this. It splits the name, then shows the row values column-wise using a CTE.
-- PostgreSQL(v11)
WITH cte_t AS (
SELECT LEFT(name, 1) id
, RIGHT(name, POSITION('.' IN REVERSE(name)) - 1) t_name
, value
, time_stamp
FROM test
)
SELECT id
, time_stamp :: DATE "date"
, time_stamp :: TIME "time"
, MAX(CASE WHEN t_name = 'aa' THEN value END) "aa"
, MAX(CASE WHEN t_name = 'bb' THEN value END) "bb"
FROM cte_t
GROUP BY id, time_stamp
ORDER BY date, time, id;
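The same split-and-pivot idea can be sketched with Python's sqlite3 module (an illustration only: SQLite lacks POSITION and REVERSE, so substr/instr do the split here, and the data is a small hypothetical subset):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (name TEXT, value INT, time_stamp TEXT)")
conn.executemany("INSERT INTO test VALUES (?, ?, ?)", [
    ("1.aa", 1, "2021-08-20 10:10:01"), ("2.aa", 2, "2021-08-20 10:10:01"),
    ("1.bb", 4, "2021-08-20 10:10:01"), ("2.bb", 5, "2021-08-20 10:10:01"),
    ("1.aa", 7, "2021-08-20 10:10:02"), ("1.bb", 0, "2021-08-20 10:10:02"),
])

# Split 'name' at the dot, then pivot the aa/bb sub-IDs into columns.
rows = conn.execute("""
    WITH cte_t AS (
        SELECT substr(name, 1, instr(name, '.') - 1) AS id,
               substr(name, instr(name, '.') + 1)    AS t_name,
               value, time_stamp
        FROM test
    )
    SELECT id, time_stamp,
           MAX(CASE WHEN t_name = 'aa' THEN value END) AS aa,
           MAX(CASE WHEN t_name = 'bb' THEN value END) AS bb
    FROM cte_t
    GROUP BY id, time_stamp
    ORDER BY time_stamp, id
""").fetchall()
```

The CASE expressions have no ELSE branch, so each aggregate sees only the matching sub-ID's value within the (id, timestamp) group.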
Please check from url https://dbfiddle.uk/?rdbms=postgres_11&fiddle=6d35047560b3f83e6c906584b23034e9
Check this query dbfiddle
with cte (name, value, timeStamp) as (values
('1.aa', 1, '2021-08-20 10:10:01'),
('2.aa', 2, '2021-08-20 10:10:01'),
('3.aa', 3, '2021-08-20 10:10:01'),
('1.bb', 4, '2021-08-20 10:10:01'),
('2.bb', 5, '2021-08-20 10:10:01'),
('3.bb', 6, '2021-08-20 10:10:01'),
('1.aa', 7, '2021-08-20 10:10:02'),
('2.aa', 8, '2021-08-20 10:10:02'),
('3.aa', 9, '2021-08-20 10:10:02'),
('1.bb', 0, '2021-08-20 10:10:02'),
('2.bb', 1, '2021-08-20 10:10:02'),
('3.bb', 2, '2021-08-20 10:10:02')
), sub_cte as (
select split_name[1] as id, split_name[2] as name, value, tt::date as date, tt::time as time from (
select
regexp_split_to_array(name, '\.') split_name,
value,
to_timestamp(timestamp, 'YYYY-MM-DD HH:MI:SS') as tt
from cte
) foo
)
select id, date, time, a.value as aa, b.value as bb from sub_cte a
left join (
select * from sub_cte where name = 'bb'
) as b using (id, date, time)
where a.name = 'aa'
Result
id | date | time | aa | bb
----+------------+----------+----+----
1 | 2021-08-20 | 10:10:01 | 1 | 4
2 | 2021-08-20 | 10:10:01 | 2 | 5
3 | 2021-08-20 | 10:10:01 | 3 | 6
1 | 2021-08-20 | 10:10:02 | 7 | 0
2 | 2021-08-20 | 10:10:02 | 8 | 1
3 | 2021-08-20 | 10:10:02 | 9 | 2
(6 rows)

First and second time appearing row id in PostgreSQL

Suppose we have a list of ids with dates, and we want to know when each id appeared for the first and the second time. For the first appearance, I have created this query:
SELECT year, mon, COUNT(id) AS sum_first_id
FROM (
SELECT DISTINCT
ON (id) DATE, id
FROM TABLE
GROUP BY 2, 1
) AS foo
GROUP BY 2, 1
ORDER BY 1, 2;
I think that this works. But how could I find when the ids appear for the second time?
Let's say you have the table table_x:
select *
from table_x
order by 1, 2
id | date
----+------------
1 | 2015-06-04
1 | 2015-06-05
1 | 2015-06-14
2 | 2015-06-05
2 | 2015-06-08
2 | 2015-06-10
2 | 2015-06-17
2 | 2015-06-22
(8 rows)
To select the first n elements in each group, use the row_number() function:
select id, date
from (
select id, date, row_number() over (partition by id order by date) rn
from table_x
order by 1, 2
) sub
where rn <= 2
id | date
----+------------
1 | 2015-06-04
1 | 2015-06-05
2 | 2015-06-05
2 | 2015-06-08
(4 rows)
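The ROW_NUMBER pattern is easy to verify; here is a sketch through Python's sqlite3 module (sqlite3 as a stand-in for PostgreSQL; change `rn <= 2` to take the first n appearances):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_x (id INT, date TEXT)")
conn.executemany("INSERT INTO table_x VALUES (?, ?)", [
    (1, "2015-06-04"), (1, "2015-06-05"), (1, "2015-06-14"),
    (2, "2015-06-05"), (2, "2015-06-08"), (2, "2015-06-10"),
    (2, "2015-06-17"), (2, "2015-06-22"),
])

# Number each id's rows by date; keep the first two per id.
rows = conn.execute("""
    SELECT id, date
    FROM (SELECT id, date,
                 ROW_NUMBER() OVER (PARTITION BY id ORDER BY date) AS rn
          FROM table_x) sub
    WHERE rn <= 2
    ORDER BY id, date
""").fetchall()
```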
It does not appear that your query is correct.
SELECT year, mon, COUNT(id) AS sum_first_id -- what is year, mon?
FROM (
SELECT DISTINCT
ON (id) DATE, id
FROM TABLE
GROUP BY 2, 1 -- should be order by 2, 1
) AS foo
GROUP BY 2, 1
ORDER BY 1, 2;

how to efficiently locate a value from one table among values from another table, with SQL

I have a problem in Postgresql which I find even difficult to describe in the title: I have two tables, containing each a range of values very similar but not identical. Suppose I have values like 0, 10, 20, 30, ... in one, and 1, 5, 6, 9, 10, 12, 19, 25, 26, ... in the second one (these are milliseconds). For each value of the second one I want to find the values immediately lower and higher in the first one. So, for the value 12 it would give me 10 and 20. I'm doing it like this :
SELECT s.*, MAX(v1."millisec") AS low_v, MIN(v2."millisec") AS high_v
FROM "signals" AS s, "tracks" AS v1, "tracks" AS v2
WHERE v1."millisec" <= s."d_time"
AND v2."millisec" > s."d_time"
GROUP BY s."d_time", s."field2"; -- this is just an example
And it works, but it is very slow once I process several thousand lines, even with indexes on s."d_time" and v.millisec. So I think there must be a much better way to do it, but I fail to find one. Could anyone help me?
Try:
select s.*,
(select millisec
from tracks t
where t.millisec <= s.d_time
order by t.millisec desc
limit 1
) as low_v,
(select millisec
from tracks t
where t.millisec > s.d_time
order by t.millisec asc
limit 1
) as high_v
from signals s;
Be sure you have an index on tracks.millisec. If you have just created the index, you'll need to ANALYZE the table to take advantage of it.
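A runnable sketch of this correlated-subquery approach through Python's sqlite3 module (a stand-in engine; table and column names follow the question, with only the relevant columns kept):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tracks (millisec INT)")
conn.execute("CREATE TABLE signals (d_time INT)")
conn.executemany("INSERT INTO tracks VALUES (?)", [(0,), (10,), (20,), (30,)])
conn.executemany("INSERT INTO signals VALUES (?)", [(1,), (12,), (25,)])

# For each signal: the greatest track value at or below it, and the
# smallest track value above it. ORDER BY ... LIMIT 1 on an indexed
# column is a cheap top-1 lookup rather than a full aggregate.
rows = conn.execute("""
    SELECT s.d_time,
           (SELECT millisec FROM tracks t
            WHERE t.millisec <= s.d_time
            ORDER BY t.millisec DESC LIMIT 1) AS low_v,
           (SELECT millisec FROM tracks t
            WHERE t.millisec > s.d_time
            ORDER BY t.millisec ASC LIMIT 1) AS high_v
    FROM signals s
    ORDER BY s.d_time
""").fetchall()
```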
Naive (trivial) way to find the preceding and next value.
-- the data (this could have been part of the original question)
CREATE TABLE table_one (id SERIAL NOT NULL PRIMARY KEY
, msec INTEGER NOT NULL -- index might help
);
CREATE TABLE table_two (id SERIAL NOT NULL PRIMARY KEY
, msec INTEGER NOT NULL -- index might help
);
INSERT INTO table_one(msec) VALUES (0), ( 10), ( 20), ( 30);
INSERT INTO table_two(msec) VALUES (1), ( 5), ( 6), ( 9), ( 10), ( 12), ( 19), ( 25), ( 26);
-- The query: find lower/higher values in table one
-- , but with no values between "us" and "them".
--
SELECT this.msec AS this
, prev.msec AS prev
, next.msec AS next
FROM table_two this
LEFT JOIN table_one prev ON prev.msec < this.msec AND NOT EXISTS (SELECT 1 FROM table_one nx WHERE nx.msec < this.msec AND nx.msec > prev.msec)
LEFT JOIN table_one next ON next.msec > this.msec AND NOT EXISTS (SELECT 1 FROM table_one nx WHERE nx.msec > this.msec AND nx.msec < next.msec)
;
Result:
CREATE TABLE
CREATE TABLE
INSERT 0 4
INSERT 0 9
this | prev | next
------+------+------
1 | 0 | 10
5 | 0 | 10
6 | 0 | 10
9 | 0 | 10
10 | 0 | 20
12 | 10 | 20
19 | 10 | 20
25 | 20 | 30
26 | 20 | 30
(9 rows)
Try this:
select * from signals s,
(select millisec low_value,
lead(millisec) over (order by millisec) high_value from tracks) intervals
where s.d_time between low_value and high_value-1
For this type of problem, window functions are ideal; see http://www.postgresql.org/docs/9.1/static/tutorial-window.html
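A sketch of the LEAD-based interval join, run through Python's sqlite3 module (assumptions: sqlite3 stands in for PostgreSQL, and the half-open comparison `>= low AND < high` replaces `BETWEEN low_value AND high_value-1`, which amounts to the same thing for integer milliseconds):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tracks (millisec INT)")
conn.execute("CREATE TABLE signals (d_time INT)")
conn.executemany("INSERT INTO tracks VALUES (?)", [(0,), (10,), (20,), (30,)])
conn.executemany("INSERT INTO signals VALUES (?)", [(1,), (12,), (25,)])

# Turn the sorted track values into [low, high) intervals with LEAD,
# then match each signal to the interval containing it.
rows = conn.execute("""
    SELECT s.d_time, i.low_value, i.high_value
    FROM signals s
    JOIN (SELECT millisec AS low_value,
                 LEAD(millisec) OVER (ORDER BY millisec) AS high_value
          FROM tracks) i
      ON s.d_time >= i.low_value AND s.d_time < i.high_value
    ORDER BY s.d_time
""").fetchall()
```

One caveat: the last interval has a NULL high_value, so signals at or above the largest track value drop out of the join; use a LEFT JOIN or a COALESCE sentinel if those rows must be kept.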