How can I get a new order-sequence column when type is only service - amazon-redshift

I have this transaction table. How can I get a new column with an order sequence when the type is only service or product?
Question

id Type    Date       Sequence
-- ------- ---------- --------
1  Member  2021-02-24 4
1  product 2021-01-03 2
2  service 2022-04-21 5
1  product 2021-02-01 3
2  service 2022-02-16 3
1  Member  2022-02-03 6
1  Service 2021-10-23 5
2  product 2022-01-03 2
1  service 2020-12-16 1
2  product 2022-03-30 4
2  service 2021-12-01 1
1  Member  2022-04-03 7
Result

id Type    Date       Sequence Expected Result
-- ------- ---------- -------- ---------------
1  Member  2021-02-24 4        Null
1  product 2021-01-03 2        2
2  service 2022-04-21 5        5
1  product 2021-02-01 3        3
2  service 2022-02-16 3        3
1  Member  2022-02-03 6        Null
1  Service 2021-10-23 5        4
2  product 2022-01-03 2        2
1  service 2020-12-16 1        1
2  product 2022-03-30 4        4
2  service 2021-12-01 1        1
1  Member  2022-04-03 7        Null

You can try to use a CASE WHEN expression (LOWER handles the mixed-case 'Service' rows):
SELECT *, CASE WHEN LOWER(Type) IN ('service','product') THEN Sequence END AS Newcolumn
FROM T
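
If the goal is to renumber only the service/product rows per id in date order (the expected result maps the 'Service' row whose Sequence is 5 to 4), a window function can be layered on top. A minimal sketch, assuming the same table name T as above:

-- Member rows fall into their own partition and end up NULL via the CASE,
-- so only service/product rows are renumbered 1..n per id by Date.
SELECT id, Type, Date, Sequence,
       CASE WHEN LOWER(Type) IN ('service','product')
            THEN ROW_NUMBER() OVER (
                     PARTITION BY id,
                                  CASE WHEN LOWER(Type) IN ('service','product') THEN 1 ELSE 0 END
                     ORDER BY Date)
       END AS Newcolumn
FROM T;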

Related

Create a range of dates in a pyspark DataFrame

I have the following abstracted DataFrame (my original DF has more than 60 billion lines):
Id Date Val1 Val2
1 2021-02-01 10 2
1 2021-02-05 8 4
2 2021-02-03 2 0
1 2021-02-07 12 5
2 2021-02-05 1 3
My expected output is:
Id Date Val1 Val2
1 2021-02-01 10 2
1 2021-02-02 10 2
1 2021-02-03 10 2
1 2021-02-04 10 2
1 2021-02-05 8 4
1 2021-02-06 8 4
1 2021-02-07 12 5
2 2021-02-03 2 0
2 2021-02-04 2 0
2 2021-02-05 1 3
Basically, what I need is: if Val1 or Val2 changes over a period of time, all the dates between those two dates must carry the values from the previous date (to see this more clearly, look at Id 2).
I know that I can do this in many ways (window function, UDF, ...), but since my original DF has more than 60 billion lines, what is the best approach for this processing?
I think the best approach (performance-wise) is performing an inner join (probably with broadcasting). If you are worried about the number of records, I suggest you run them in batches (split by number of records, by date, or even by a random key). The general idea is to avoid running everything at once.
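A minimal Spark SQL sketch of that join-based gap fill, runnable from PySpark via spark.sql; the temp view name t and the assumption that Date is a date column are mine, and broadcasting/batching would still be applied on top:

-- Assumes the DataFrame is registered as a temp view named t.
WITH bounds AS (
  SELECT Id, MIN(Date) AS d0, MAX(Date) AS d1
  FROM t
  GROUP BY Id
),
calendar AS (
  -- one row per Id per day between its first and last date
  SELECT Id, EXPLODE(SEQUENCE(d0, d1, INTERVAL 1 DAY)) AS Date
  FROM bounds
)
SELECT c.Id,
       c.Date,
       -- take the values of the latest record on or before each calendar day
       MAX(STRUCT(t.Date, t.Val1, t.Val2)).Val1 AS Val1,
       MAX(STRUCT(t.Date, t.Val1, t.Val2)).Val2 AS Val2
FROM calendar c
JOIN t
  ON t.Id = c.Id
 AND t.Date <= c.Date
GROUP BY c.Id, c.Date;

The batching suggested above would then amount to restricting both sides to a slice of Ids or dates per run.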

Mutual followers postgresql query

How can I find mutual followers? I am new to postgres and I am having a hard time constructing this.
followers
u_id f_id
3 1
2 5
4 3
1 2
4 1
1 3
2 8
2 10
1 4
2 4
3 4
4 5
4 8
Example of expected output:
u_id f_id
3 1
1 3
4 3
3 4
...
...
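
One common way to express this in PostgreSQL is a self-join that keeps a row only when the reversed pair also exists; a minimal sketch against the followers table above:

-- Keep (u_id, f_id) only when the reversed pair (f_id, u_id) also exists.
SELECT a.u_id, a.f_id
FROM followers a
JOIN followers b
  ON b.u_id = a.f_id
 AND b.f_id = a.u_id;

Each mutual pair appears twice (once per direction), matching the expected output; a filter such as a.u_id < a.f_id would keep one row per pair instead.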

How to do a simultaneous ascending and descending sort in KDB/Q

In SQL, one can do
SELECT * FROM tbl ORDER BY col1, col2 DESC
In KDB, one can do
`col1 xasc select from tbl
or
`col2 xdesc select from tbl
But how does one sort by col1 ascending then by col2 descending in KDB/Q?
Two sorts. Apply the descending sort on the secondary column first, then the ascending sort on the primary column; because xasc is a stable sort, the descending order of b is preserved within each group of a.
Create example data:
q)show tbl:([]a:10?10;b:10?10;c:10?10)
a b c
-----
8 4 8
1 9 1
7 2 9
2 7 5
4 0 4
5 1 6
4 9 6
2 2 1
7 1 8
8 8 5
Do sorting:
q)`a xasc `b xdesc tbl
a b c
-----
1 9 1
2 7 5
2 2 1
4 9 6
4 0 4
5 1 6
7 2 9
7 1 8
8 8 5
8 4 8

KDB: Aggregate across consecutive rows with common label

I would like to sum across consecutive rows that share the same label. Any very simple ways to do this?
Example: I start with this table...
qty flag
1 OFF
3 ON
2 ON
2 OFF
9 OFF
4 ON
... and would like to generate...
qty flag
1 OFF
5 ON
11 OFF
4 ON
One method:
q)show t:flip`qty`flag!(1 3 2 2 9 4;`OFF`ON`ON`OFF`OFF`ON)
qty flag
--------
1 OFF
3 ON
2 ON
2 OFF
9 OFF
4 ON
q)show result:select sum qty by d:sums differ flag,flag from t
d flag1| qty
----------| ---
1 OFF | 1
2 ON | 5
3 OFF | 11
4 ON | 4
Then to get it in the format you require:
q)`qty`flag#0!result
qty flag
--------
1 OFF
5 ON
11 OFF
4 ON

How do I get a list of all descendant nodes related to each parent node, as rows, in T-SQL?

As a sample, my table (table name: hier) looks like this:
parentID childID
-------- -------
0 1
1 2
1 3
2 5
2 8
3 4
3 6
3 7
4 9
and I want it to output this:
parentID RelatedID
-------- ---------
0 1
0 2
0 3
0 4
0 5
0 6
0 7
0 8
0 9
1 2
1 3
1 4
1 5
1 6
1 7
1 8
1 9
2 5
2 8
3 4
3 6
3 7
3 9
4 9
You can use a recursive CTE to walk down every level of the hierarchy:
WITH cte (parentID, RelatedID) AS
(
    SELECT parentID, childID FROM hier          -- anchor: direct children
    UNION ALL
    SELECT c.parentID, h.childID                -- recurse one level deeper
    FROM cte c
    INNER JOIN hier h ON h.parentID = c.RelatedID
)
SELECT parentID, RelatedID FROM cte
ORDER BY parentID, RelatedID;