Pentaho - How to show the complete date on a line chart's X axis

I'm creating a Pentaho CDE dashboard and I'm having some difficulty configuring how the date is shown in a line chart. The chart below is what I have at the moment:
As you can see, the X axis has the numbers 1-6; they are months. What I want to do is show more information on this axis instead of simply "1"; I want to show "January / 2013", for example, but I have no idea how to achieve this.
My Mondrian schema for the date dimension is this:
<Dimension type="TimeDimension" visible="true" foreignKey="data_id" highCardinality="false" name="Data">
  <Hierarchy name="data" visible="true" hasAll="true">
    <Table name="dimensao_data">
    </Table>
    <Level name="ano" visible="true" column="ano" type="Numeric" uniqueMembers="true" levelType="TimeYears" hideMemberIf="Never">
    </Level>
    <Level name="semestre" visible="true" column="semestre" type="Numeric" uniqueMembers="false" levelType="TimeHalfYears" hideMemberIf="Never" captionColumn="labelSemestre">
    </Level>
    <Level name="quarto" visible="true" column="quarto" type="Numeric" uniqueMembers="false" levelType="TimeQuarters" hideMemberIf="Never" captionColumn="labelQuarto">
    </Level>
    <Level name="mes" visible="true" column="mes" type="Numeric" uniqueMembers="false" levelType="TimeMonths" hideMemberIf="Never" captionColumn="labelMes">
    </Level>
    <Level name="dia" visible="true" column="dia" type="Numeric" uniqueMembers="false" levelType="TimeDays" hideMemberIf="Never">
    </Level>
  </Hierarchy>
</Dimension>
and this is the MDX I'm using to retrieve data for the chart:
SELECT NON EMPTY {[Measures].[valor]} ON COLUMNS,
NON EMPTY CrossJoin({[pagamento.forma].[moeda].MEMBERS}, {[Data.data].[mes].MEMBERS}) ON ROWS
FROM [Vendas]
WHERE {[Empresa.empresa].[MATRIZ]}
NEW INFORMATION
When I use debug mode I can see that Data.data comes back with only the month number, and as a String:
[pvc.LineChart ]: DATA SOURCE SUMMARY
╔═════════╤═════════════════════╤═════════════╤══════════╗
║ Name │ pagamento.forma │ Data.data │ valor ║
╟─────────┼─────────────────────┼─────────────┼──────────╢
║ Label │ │ │ ║
╟─────────┼─────────────────────┼─────────────┼──────────╢
║ Type │ String │ String │ Numeric ║
╟─────────┼─────────────────────┼─────────────┼──────────╢
║ 1 │ "BOLETO BANCARIO" │ "1" │ 10469.15 ║
║ 2 │ "BOLETO BANCARIO" │ "2" │ 16279.45 ║
║ 3 │ "BOLETO BANCARIO" │ "3" │ 16279.45 ║
║ 4 │ "BOLETO BANCARIO" │ "4" │ 5810.3 ║
║ 5 │ "BOLETO BANCARIO" │ "5" │ 16279.45 ║
║ 6 │ "BOLETO BANCARIO" │ "6" │ 5810.3 ║
║ 7 │ "CARTÃO DE CRÉDITO" │ "1" │ 10243.57 ║
║ 8 │ "CARTÃO DE CRÉDITO" │ "2" │ 9178.03 ║
║ 9 │ "CARTÃO DE CRÉDITO" │ "3" │ 10273.08 ║
║ 10 │ "CARTÃO DE CRÉDITO" │ "4" │ 10110.4 ║
║ 11 │ "CARTÃO DE CRÉDITO" │ "5" │ 10366.3 ║
║ 12 │ "CARTÃO DE CRÉDITO" │ "6" │ 10768.75 ║
║ 13 │ "CARTÃO DE DÉBITO" │ "1" │ 15584.84 ║
║ 14 │ "CARTÃO DE DÉBITO" │ "2" │ 12400.53 ║
║ 15 │ "CARTÃO DE DÉBITO" │ "3" │ 13517.65 ║
╟─────────┼─────────────────────┼─────────────┼──────────╢
║ (15/41) │ │ │ ║
╚═════════╧═════════════════════╧═════════════╧══════════╝
So I believe the problem is with the result of Data.data. How can I retrieve the complete date to show in the chart?

There are multiple ways to achieve that:
At the query level:
Define a Measure that holds the info you want to display:
WITH MEMBER [Measures].[Date Label] AS [Data.data].CurrentMember.Caption || " / " || Ancestor([Data.data].CurrentMember, [Data.data].[ano]).Name
This should give you "January / 2013" as the output. Then just pick which columns to pass on to the chart when defining the CDA query.
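Folded into the question's query, it would look roughly like this (a sketch; it simply adds the calculated member to the columns axis):
WITH MEMBER [Measures].[Date Label] AS
  [Data.data].CurrentMember.Caption || " / " || Ancestor([Data.data].CurrentMember, [Data.data].[ano]).Name
SELECT NON EMPTY {[Measures].[valor], [Measures].[Date Label]} ON COLUMNS,
NON EMPTY CrossJoin({[pagamento.forma].[moeda].MEMBERS}, {[Data.data].[mes].MEMBERS}) ON ROWS
FROM [Vendas]
WHERE {[Empresa.empresa].[MATRIZ]}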
At the chart level:
You can change what the chart displays by playing around with the chart's PostFetch callback. Something like:
function(data){
    var results = data.resultset.map(function(d){
        // Tweak the contents of each line of data here.
        // You will want to take the value of d[0] and replace it
        // with something else.
        return d;
    });
    data.resultset = results;
    return data;
}
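For instance, here is a minimal sketch that turns the month number into a label. Which index holds Data.data depends on your CDA output column order (in the data source summary above it is the second column, d[1]), and the year is hard-coded purely for illustration:
function(data){
    // Hypothetical month labels; adapt to your locale or read them from the query.
    var months = ["January", "February", "March", "April", "May", "June",
                  "July", "August", "September", "October", "November", "December"];
    data.resultset = data.resultset.map(function(d){
        // d[1] holds the month number ("1".."6") per the summary above; the year is assumed fixed here.
        d[1] = months[parseInt(d[1], 10) - 1] + " / 2013";
        return d;
    });
    return data;
}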
I prefer to do this kind of thing at the query level, as it makes the dashboard simpler to understand and maintain, but it depends a lot on the specifics.

Related

Transposition using multiple merges (feedback wanted)

I have a solution in danfojs (CodePen below) but would like some feedback on how to improve it; since I'm new to danfo, maybe the same result can be obtained more elegantly.
Essentially I want to do a transposition by merging around twenty DataFrames, but I realized that the merge feature only allows merging two DataFrames at a time, so I implemented a loop; however, it ends up removing extra columns (I played with joins, but this was the best I could get).
The input files (file1...file20) have the following structure:
╔═══╤══════════╤═══════════════════╤═══════════╗
║ │ tag_name │ ts_utc │ tag_value ║
╟───┼──────────┼───────────────────┼───────────╢
║ 0 │ C1 │ Thu Sep 15 2022… │ 1 ║
╟───┼──────────┼───────────────────┼───────────╢
║ 1 │ C1 │ Sat Sep 17 2022… │ 10 ║
╟───┼──────────┼───────────────────┼───────────╢
║ 2 │ C1 │ Sat Sep 17 2022… │ 11 ║
╚═══╧══════════╧═══════════════════╧═══════════╝
...
╔═══╤══════════╤═══════════════════╤═══════════╗
║ │ tag_name │ ts_utc │ tag_value ║
╟───┼──────────┼───────────────────┼───────────╢
║ 0 │ C3 │ Thu Sep 15 2022… │ 3 ║
╟───┼──────────┼───────────────────┼───────────╢
║ 1 │ C3 │ Sat Sep 17 2022… │ 30 ║
╟───┼──────────┼───────────────────┼───────────╢
║ 2 │ C3 │ Sat Sep 17 2022… │ 31 ║
╚═══╧══════════╧═══════════════════╧═══════════╝
And the desired result is:
The tag_value column header becomes the tag_name (C1, C2, ...)
The components are merged by date (ts_utc)
tag_name is replaced by the constant "sourceTag"
╔═══╤═══════════════════╤═══════════╤═════╤════╤════╗
║ │ ts_utc │ tag_name │ C1 │ C2 │ C3 ║
╟───┼───────────────────┼───────────┼─────┼────┼────╢
║ 0 │ Thu Sep 15 2022… │ sourceTag │ 1 │ 2 │ 3 ║
╟───┼───────────────────┼───────────┼─────┼────┼────╢
║ 1 │ Sat Sep 17 2022… │ sourceTag │ 10 │ 20 │ 30 ║
╟───┼───────────────────┼───────────┼─────┼────┼────╢
║ 2 │ Sat Sep 17 2022… │ sourceTag │ 11 │ 21 │ 31 ║
╚═══╧═══════════════════╧═══════════╧═════╧════╧════╝
Sample code:
https://codepen.io/dotmindlabs/pen/xxJwMqZ?editors=1111
Thanks, any feedback is very appreciated.
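Not danfo-specific, but as a reference for the target shape, here is a plain JavaScript sketch of the pivot itself. It assumes each input file's rows are available as plain objects ({ tag_name, ts_utc, tag_value }), which you would extract from the DataFrames and feed back into a new DataFrame at the end:
// frames: hypothetical array of row arrays, one per input file.
function pivotByTimestamp(frames) {
    var byTs = {};
    frames.flat().forEach(function (row) {
        // One output row per distinct ts_utc, with tag_name fixed to "sourceTag".
        var out = byTs[row.ts_utc] || (byTs[row.ts_utc] = { ts_utc: row.ts_utc, tag_name: "sourceTag" });
        // Each tag_value lands in a column named after its original tag_name (C1, C2, ...).
        out[row.tag_name] = row.tag_value;
    });
    return Object.values(byTs);
}
// Usage (hypothetical row data): pivotByTimestamp([file1Rows, file2Rows, file3Rows]);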

Counting the same positional bits in postgresql bitmasks

I am trying to count the bits at each position across multiple bitmasks in PostgreSQL. Here is an example of the problem:
Suppose I have three bitmasks (in binary) like:
011011011100110
100011010100101
110110101010101
Now what I want to do is get the total count of set bits in each separate column, treating the above masks as three rows and multiple columns.
E.g. the first column has a count of 2, the second one a count of 2, the third one a count of 1, and so on...
In actuality I have a total of 30 bits in each bitmask in my database. I want to do it in PostgreSQL. I am open to explaining the problem further if needed.
You could do it by using the get_bit function and a couple of joins:
SELECT sum(bit) FILTER (WHERE i = 0) AS count_0,
       sum(bit) FILTER (WHERE i = 1) AS count_1,
       ...
       sum(bit) FILTER (WHERE i = 29) AS count_29
FROM bits
CROSS JOIN generate_series(0, 29) AS i
CROSS JOIN LATERAL get_bit(b, i) AS bit;
The column with the bit string is b in my example.
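If you would rather get one output row per bit position instead of 30 columns, the same idea can be grouped. A sketch, assuming the same bits table with bit-string column b:
SELECT i AS position, sum(get_bit(b, i)) AS set_bits
FROM bits
CROSS JOIN generate_series(0, 29) AS i
GROUP BY i
ORDER BY i;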
You could use the bitwise AND (&) operator and bigint arithmetic, so long as your bit strings contain 63 bits or fewer:
# create table bmasks (mask bit(15));
CREATE TABLE
# insert into bmasks values ('011011011100110'), ('100011010100101'), ('110110101010101');
INSERT 0 3
# with masks as (
select (2 ^ x)::bigint::bit(15) as mask, x as posn
from generate_series(0, 14) as gs(x)
)
select m.posn, m.mask, sum((b.mask & m.mask > 0::bit(15))::int) as set_bits
from masks m
cross join bmasks b
group by m.posn, m.mask;
┌──────┬─────────────────┬──────────┐
│ posn │ mask │ set_bits │
├──────┼─────────────────┼──────────┤
│ 0 │ 000000000000001 │ 2 │
│ 1 │ 000000000000010 │ 1 │
│ 2 │ 000000000000100 │ 3 │
│ 3 │ 000000000001000 │ 0 │
│ 4 │ 000000000010000 │ 1 │
│ 5 │ 000000000100000 │ 2 │
│ 6 │ 000000001000000 │ 2 │
│ 7 │ 000000010000000 │ 2 │
│ 8 │ 000000100000000 │ 1 │
│ 9 │ 000001000000000 │ 2 │
│ 10 │ 000010000000000 │ 3 │
│ 11 │ 000100000000000 │ 1 │
│ 12 │ 001000000000000 │ 1 │
│ 13 │ 010000000000000 │ 2 │
│ 14 │ 100000000000000 │ 2 │
└──────┴─────────────────┴──────────┘
(15 rows)

Druid TopNMetricSpec - Performance Clarification

Problem statement:
I need to find all the distinct userIds in a huge dataset (~100 million rows), so I am experimenting with TopNMetricSpec so that I get userIds based on the set threshold.
Can anyone help me understand how TopNMetricSpec runs?
If I run the following TopN query with TopNMetricSpec repeatedly, 'n' times, using the same HTTP client,
then I want to know whether it will scan all the records every time when previousStop is set.
Consider the following Data:
┌──────────────────────────┬─────────┬────────┬────────┐
│ __time │ movieId │ rating │ userId │
├──────────────────────────┼─────────┼────────┼────────┤
│ 2015-02-05T00:10:09.000Z │ 2011 │ 3.5 │ 215 │
│ 2015-02-05T00:10:26.000Z │ 38061 │ 3.5 │ 215 │
│ 2015-02-05T00:10:32.000Z │ 8981 │ 2.0 │ 215 │
│ 2015-02-05T00:11:00.000Z │ 89864 │ 4.0 │ 215 │
│ 2015-02-23T23:55:08.000Z │ 56587 │ 1.5 │ 31 │
│ 2015-02-23T23:55:33.000Z │ 51077 │ 4.0 │ 31 │
│ 2015-02-23T23:55:35.000Z │ 49274 │ 4.0 │ 31 │
│ 2015-02-23T23:55:37.000Z │ 30816 │ 2.0 │ 31 │
│ 2015-03-19T14:24:01.000Z │ 5066 │ 5.0 │ 176 │
│ 2015-03-19T14:26:23.000Z │ 6776 │ 5.0 │ 176 │
│ 2015-03-29T16:19:58.000Z │ 2337 │ 2.0 │ 96 │
For example, in the following query:
Initially, I have set previousStop to null and the threshold to two, so it will fetch the first two records (because threshold = 2), viz. 215, 176.
Now I will pass previousStop = 176. The question is: will the broker scan all the records again, or will it just scan from where it stopped after step 1, i.e. 176?
{
  "queryType": "topN",
  "dataSource": "ratings30K",
  "intervals": "2015-02-05T00:00:00.000Z/2015-03-30T00:00:00.000Z",
  "granularity": "all",
  "dimension": "userId",
  "threshold": 2,
  "metric": {
    "type": "inverted",
    "metric": {
      "type": "dimension",
      "ordering": "Numeric",
      "previousStop": null
    }
  }
}
It doesn't quite work the way you describe it. You have two options for what to give as the metric in this query. You can either find the users that have given the highest ratings:
"dimension":"userId",
"metric":"rating"
Or you can sort by user id in ascending order, in which case you can provide a previousStop:
"dimension":"userId",
"metric": {
"type": "dimension",
"ordering": "Numeric",
"previousStop": null
}
(For either of those two options, you can invert the sort order by wrapping the metric in an inverted metric spec.)
But when you sort by rating, you can't give a previousStop value. So if you want pagination that is guaranteed to return all rows, you can't sort by rating.
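With the dimension ordering, pagination then just means feeding the last userId you received back in as previousStop on the next request. A sketch of the follow-up page for the question's query (previousStop set to "176", the last value returned by the first page; everything else is unchanged):
{
  "queryType": "topN",
  "dataSource": "ratings30K",
  "intervals": "2015-02-05T00:00:00.000Z/2015-03-30T00:00:00.000Z",
  "granularity": "all",
  "dimension": "userId",
  "threshold": 2,
  "metric": {
    "type": "inverted",
    "metric": {
      "type": "dimension",
      "ordering": "Numeric",
      "previousStop": "176"
    }
  }
}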

What CTE would generate listed results or is there a better way to aggregate the names in the tables provided

Is there a CTE that can be written to query these tables and generate the listed results? If so, what would it look like?
This is SQL Server 2012 SP2.
create table d ( id int, calldate date)
insert into d
values (3, '2016-08-03'), (4, '2016-08-04'), (5, '2016-08-05'),
(6, '2016-08-06'), (7, '2016-08-07'), (8, '2016-08-08'),
(9, '2016-08-09'), (10, '2016-08-10'), (11, '2016-08-11'),
(12, '2016-08-12'), (13, '2016-08-13'), (14, '2016-08-14'),
(15, '2016-08-15'), (16, '2016-08-16'), (17, '2016-08-17'),
(18, '2016-08-18'), (19 , '2016-08-19' ), (20 , '2016-08-20' ), (21 , '2016-08-21' ), (22 , '2016-08-22' ), (23 , '2016-08-23' ), (24 , '2016-08-24' ), (25 , '2016-08-25' ), (26 , '2016-08-26' ), (27 , '2016-08-27' ), (28 , '2016-08-28' ), (29 , '2016-08-29' ), (30 , '2016-08-30');
create table w ( id int, startdate date, enddate date, holiday varchar(60))
insert into w
values (1, '2016-05-02', '2016-05-08', ''),
(2, '2016-05-09', '2016-05-15', 'Company holiday'),
(3, '2016-05-16', '2016-05-22', 'Company holiday'),
(4, '2016-05-23', '2016-05-29', ''),
(5, '2016-05-30', '2016-06-05', ''), (6, '2016-06-06', '2016-06-12', 'Company holiday'), (7, '2016-06-13', '2016-06-19', ''), (8, '2016-06-20', '2016-06-26', ''), (9, '2016-06-27', '2016-07-03', ''), (10, '2016-07-04', '2016-07-10', 'Company holiday'), (11, '2016-07-11', '2016-07-17', ''), (12, '2016-07-18', '2016-07-24', ''), (13, '2016-07-25', '2016-07-31', ''), (14, '2016-08-01', '2016-08-07', ''), (15, '2016-08-08', '2016-08-14', ''), (16, '2016-08-15', '2016-08-21', ''), (17, '2016-08-22', '2016-08-28', ''), (18, '2016-08-29', '2016-09-04', ''), (19, '2016-09-05', '2016-09-11', '')
create table e (id int, name varchar(100), init varchar(10), IsAManager bit)
insert into e values (2 , 'Alice' , 'AA', 0), (8 , 'Jack ' , 'JM', 1), (9 , 'Ace ' , 'AQ', 0), (10 , 'Mike ' , 'MM', 0), (16 , 'George' ,'GH', 0), (21 , 'Jenny' , 'JL', 1), (22 , 'Bill ' , 'BG', 1), (30 , 'Blank' , ' ', 0)
CREATE TABLE r ( id INT, NAME VARCHAR(50) );
INSERT INTO r VALUES (13, 'Primary'), (22, 'Secondary'), (33, 'Prim Trd'), (44, 'Sec Trd');
create table ca (e_id int, r_id int, d_id int, s_id int)
insert into ca values (8 ,13 ,8 ,13), (8 ,13 ,9 ,13), (8 ,13 ,10 ,13), (8 ,13 ,11 ,13), (8 ,13 ,12 ,13), (8 ,13 ,13 ,13), (8 ,13 ,14 ,13), (10 ,13 ,15 ,13), (10 ,13 ,16 ,13), (10 ,13 ,17 ,13), (10 ,13 ,18 ,13), (10 ,13 ,19 ,13),
(10 ,13 ,20 ,13), (10 ,13 ,21 ,13), (16 ,13 ,22 ,13), (16 ,13 ,23 ,13), (16 ,13 ,24 ,13), (16 ,13 ,25 ,13), (16 ,13 ,26 ,13), (16 ,13 ,27 ,13), (16 ,13 ,28 ,13), (10 ,22 ,8 ,13), (10 ,22 ,9 ,13), (10 ,22 ,10 ,13),
(10 ,22 ,11 ,13), (10 ,22 ,12 ,13), (10 ,22 ,13 ,13), (10 ,22 ,14 ,13), (16 ,22 ,15 ,13), (16 ,22 ,16 ,13), (16 ,22 ,17 ,13), (16 ,22 ,18 ,13), (16 ,22 ,19 ,13), (16 ,22 ,20 ,13), (16 ,22 ,21 ,13), (2 ,22 ,22 ,13),
(2 ,22 ,23 ,13), (2 ,22 ,24 ,13), (2 ,22 ,25 ,13), (2 ,22 ,26 ,13), (2 ,22 ,27 ,13), (2 ,22 ,28 ,13), (30 ,33 ,8 ,13), (30 ,33 ,9 ,13), (30 ,33 ,10 ,13), (30 ,33 ,11 ,13), (30 ,33 ,12 ,13), (30 ,33 ,13 ,13),
(30 ,33 ,14 ,13), (30 ,33 ,15 ,13), (21 ,33 ,16 ,13), (22 ,33 ,17 ,13), (30 ,33 ,18 ,13), (30 ,33 ,19 ,13), (30 ,33 ,20 ,13), (30 ,33 ,21 ,13), (2 ,33 ,22 ,13), (30 ,33 ,23 ,13), (30 ,33 ,24 ,13), (30 ,33 ,25 ,13),
(30 ,33 ,26 ,13), (30 ,33 ,27 ,13), (30 ,33 ,28 ,13), (30 ,44 ,8 ,13), (30 ,44 ,9 ,13), (30 ,44 ,10 ,13), (30 ,44 ,11 ,13), (30 ,44 ,12 ,13), (30 ,44 ,13 ,13), (30 ,44 ,14 ,13), (30 ,44 ,15 ,13), (30 ,44 ,16 ,13),
(30 ,44 ,17 ,13), (30 ,44 ,18 ,13), (30 ,44 ,19 ,13), (30 ,44 ,20 ,13), (30 ,44 ,21 ,13), (2 ,44 ,22 ,13), (2 ,44 ,23 ,13), (21 ,44 ,24 ,13), (21 ,44 ,25 ,13), (21 ,44 ,26 ,13), (30 ,44 ,27 ,13), (30 ,44 ,28 ,13)
CREATE TABLE s ( id INT, specialty VARCHAR(50) );
insert into s values (13 , 'Pediatric') , (28 , 'EMT'), (55 , 'ER'), (62 , 'ICU'), (74 , 'GIU'), (88 , 'BBB')
Table W
║ id │ startdate │ enddate ║
╠════╪════════════╪════════════╣
║ 11 │ 2016-07-11 │ 2016-07-17 ║
║ 12 │ 2016-07-18 │ 2016-07-24 ║
║ 13 │ 2016-07-25 │ 2016-07-31 ║
║ 14 │ 2016-08-01 │ 2016-08-07 ║
║ 15 │ 2016-08-08 │ 2016-08-14 ║
║ 16 │ 2016-08-15 │ 2016-08-21 ║
║ 17 │ 2016-08-22 │ 2016-08-28 ║
║ 18 │ 2016-08-29 │ 2016-09-04 ║
║ 19 │ 2016-09-05 │ 2016-09-11 ║
Table D
║ id │ calldate ║
╠════╪════════════╣
║ 3 │ 2016-08-03 ║
║ 4 │ 2016-08-04 ║
║ 5 │ 2016-08-05 ║
║ 6 │ 2016-08-06 ║
║ 7 │ 2016-08-07 ║
║ 8 │ 2016-08-08 ║
║ 9 │ 2016-08-09 ║
║ 10 │ 2016-08-10 ║
║ 11 │ 2016-08-11 ║
║ 12 │ 2016-08-12 ║
║ 13 │ 2016-08-13 ║
║ 14 │ 2016-08-14 ║
║ 15 │ 2016-08-15 ║
║ 16 │ 2016-08-16 ║
║ 17 │ 2016-08-17 ║
║ 18 │ 2016-08-18 ║
║ 19 │ 2016-08-19 ║
║ 20 │ 2016-08-20 ║
║ 21 │ 2016-08-21 ║
║ 22 │ 2016-08-22 ║
║ 23 │ 2016-08-23 ║
║ 24 │ 2016-08-24 ║
║ 25 │ 2016-08-25 ║
║ 26 │ 2016-08-26 ║
║ 27 │ 2016-08-27 ║
║ 28 │ 2016-08-28 ║
║ 29 │ 2016-08-29 ║
║ 30 │ 2016-08-30 ║
Table E
╔════╤════════╤═════════╤════════════╗
║ id │ name │ initial │ IsAManager ║
╠════╪════════╪═════════╪════════════╣
║ 2 │ Alice │ AA │ 0 ║
║ 8 │ Jack │ JM │ 1 ║
║ 9 │ Ace │ AQ │ 0 ║
║ 10 │ Mike │ MM │ 0 ║
║ 16 │ George │ GH │ 0 ║
║ 21 │ Jenny │ JL │ 1 ║
║ 22 │ Bill │ BG │ 1 ║
║ 30 │ Blank │ │ 0 ║
╚════╧════════╧═════════╧════════════╝
Table R
╔════╤═══════════╗
║ id │ name ║
╠════╪═══════════╣
║ 13 │ Primary ║
║ 22 │ Secondary ║
║ 33 │ Prim Trd ║
║ 44 │ Sec Trd ║
╚════╧═══════════╝
Table S
|ID | Specialty |
|---+-----------|
|13 | Pediatric |
|28 | EMT |
|55 | ER |
|62 | ICU |
|74 | GIU |
|88 | BBB |
Here is the linking table that links these tables together.
Table CA
║ e_id │ r_id │ d_id │ s_id ║
╠══════╪══════╪══════╪══════╣
║ 8 │ 13 │ 8 │ 13 ║
║ 8 │ 13 │ 9 │ 13 ║
║ 8 │ 13 │ 10 │ 13 ║
║ 8 │ 13 │ 11 │ 13 ║
║ 8 │ 13 │ 12 │ 13 ║
║ 8 │ 13 │ 13 │ 13 ║
║ 8 │ 13 │ 14 │ 13 ║
║ 10 │ 13 │ 15 │ 13 ║
║ 10 │ 13 │ 16 │ 13 ║
║ 10 │ 13 │ 17 │ 13 ║
║ 10 │ 13 │ 18 │ 13 ║
║ 10 │ 13 │ 19 │ 13 ║
║ 10 │ 13 │ 20 │ 13 ║
║ 10 │ 13 │ 21 │ 13 ║
║ 16 │ 13 │ 22 │ 13 ║
║ 16 │ 13 │ 23 │ 13 ║
║ 16 │ 13 │ 24 │ 13 ║
║ 16 │ 13 │ 25 │ 13 ║
║ 16 │ 13 │ 26 │ 13 ║
║ 16 │ 13 │ 27 │ 13 ║
║ 16 │ 13 │ 28 │ 13 ║
║ 10 │ 22 │ 8 │ 13 ║
║ 10 │ 22 │ 9 │ 13 ║
║ 10 │ 22 │ 10 │ 13 ║
║ 10 │ 22 │ 11 │ 13 ║
║ 10 │ 22 │ 12 │ 13 ║
║ 10 │ 22 │ 13 │ 13 ║
║ 10 │ 22 │ 14 │ 13 ║
║ 16 │ 22 │ 15 │ 13 ║
║ 16 │ 22 │ 16 │ 13 ║
║ 16 │ 22 │ 17 │ 13 ║
║ 16 │ 22 │ 18 │ 13 ║
║ 16 │ 22 │ 19 │ 13 ║
║ 16 │ 22 │ 20 │ 13 ║
║ 16 │ 22 │ 21 │ 13 ║
║ 2 │ 22 │ 22 │ 13 ║
║ 2 │ 22 │ 23 │ 13 ║
║ 2 │ 22 │ 24 │ 13 ║
║ 2 │ 22 │ 25 │ 13 ║
║ 2 │ 22 │ 26 │ 13 ║
║ 2 │ 22 │ 27 │ 13 ║
║ 2 │ 22 │ 28 │ 13 ║
║ 30 │ 33 │ 8 │ 13 ║
║ 30 │ 33 │ 9 │ 13 ║
║ 30 │ 33 │ 10 │ 13 ║
║ 30 │ 33 │ 11 │ 13 ║
║ 30 │ 33 │ 12 │ 13 ║
║ 30 │ 33 │ 13 │ 13 ║
║ 30 │ 33 │ 14 │ 13 ║
║ 30 │ 33 │ 15 │ 13 ║
║ 21 │ 33 │ 16 │ 13 ║
║ 22 │ 33 │ 17 │ 13 ║
║ 30 │ 33 │ 18 │ 13 ║
║ 30 │ 33 │ 19 │ 13 ║
║ 30 │ 33 │ 20 │ 13 ║
║ 30 │ 33 │ 21 │ 13 ║
║ 2 │ 33 │ 22 │ 13 ║
║ 30 │ 33 │ 23 │ 13 ║
║ 30 │ 33 │ 24 │ 13 ║
║ 30 │ 33 │ 25 │ 13 ║
║ 30 │ 33 │ 26 │ 13 ║
║ 30 │ 33 │ 27 │ 13 ║
║ 30 │ 33 │ 28 │ 13 ║
║ 30 │ 44 │ 8 │ 13 ║
║ 30 │ 44 │ 9 │ 13 ║
║ 30 │ 44 │ 10 │ 13 ║
║ 30 │ 44 │ 11 │ 13 ║
║ 30 │ 44 │ 12 │ 13 ║
║ 30 │ 44 │ 13 │ 13 ║
║ 30 │ 44 │ 14 │ 13 ║
║ 30 │ 44 │ 15 │ 13 ║
║ 30 │ 44 │ 16 │ 13 ║
║ 30 │ 44 │ 17 │ 13 ║
║ 30 │ 44 │ 18 │ 13 ║
║ 30 │ 44 │ 19 │ 13 ║
║ 30 │ 44 │ 20 │ 13 ║
║ 30 │ 44 │ 21 │ 13 ║
║ 2 │ 44 │ 22 │ 13 ║
║ 2 │ 44 │ 23 │ 13 ║
║ 21 │ 44 │ 24 │ 13 ║
║ 21 │ 44 │ 25 │ 13 ║
║ 21 │ 44 │ 26 │ 13 ║
║ 30 │ 44 │ 27 │ 13 ║
║ 30 │ 44 │ 28 │ 13 ║
Note: e.id is the id of the last person who performed a trade (either "Prim Trd" or "Sec Trd") for that week, disregarding id 30 (Blank). If no trades happened for that week, then e.id = 30 (Blank). The initials column is a list of the initials of everyone who did a trade that week; the same goes for the names.
Desired Result Set:
║ startdate │ enddate │ holiday │ specialty │ initials │ rolename │ s.id │ e.id │ name │ WasThisTraded ║
║ 2016-08-08 │ 2016-08-14 │ │ Pediatric │ JM │ Primary │ 13 │ 8 │ Jack │ N ║
║ 2016-08-08 │ 2016-08-14 │ │ Pediatric │ MM │ Secondary │ 13 │ 10 │ Mike │ N ║
║ 2016-08-08 │ 2016-08-14 │ │ Pediatric │ │ Prim Trd │ 13 │ 30 │ Blank │ N ║
║ 2016-08-08 │ 2016-08-14 │ │ Pediatric │ │ Sec Trd │ 13 │ 30 │ Blank │ N ║
║ 2016-08-15 │ 2016-08-21 │ │ Pediatric │ MM │ Primary │ 13 │ 10 │ Mike │ Y ║
║ 2016-08-15 │ 2016-08-21 │ │ Pediatric │ GH │ Secondary │ 13 │ 16 │ George │ N ║
║ 2016-08-15 │ 2016-08-21 │ │ Pediatric │ JL, BG │ Prim Trd │ 13 │ 22 │ Jenny, Bill │ N ║
║ 2016-08-15 │ 2016-08-21 │ │ Pediatric │ JL │ Sec Trd │ 13 │ 30 │ Jenny │ N ║
║ 2016-08-22 │ 2016-08-28 │ │ Pediatric │ GH │ Primary │ 13 │ 16 │ George │ Y ║
║ 2016-08-22 │ 2016-08-28 │ │ Pediatric │ AA │ Secondary │ 13 │ 2 │ Alice │ Y ║
║ 2016-08-22 │ 2016-08-28 │ │ Pediatric │ AA │ Prim Trd │ 13 │ 2 │ Alice │ N ║
║ 2016-08-22 │ 2016-08-28 │ │ Pediatric │ AA, JL │ Sec Trd │ 13 │ 21 │ Alice, Jenny │ N ║
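This is not a full answer, but as a starting point, the per-week, per-role list of initials can be built in SQL Server 2012 with a CTE plus the STUFF/FOR XML PATH pattern. A sketch only, based on the tables above; the WasThisTraded flag and the name-list logic from the note would still need to be layered on top:
;WITH weekly AS (
    SELECT w.startdate, w.enddate, w.holiday,
           s.specialty, s.id AS s_id, ca.r_id, r.name AS rolename,
           e.init
    FROM ca
    JOIN d ON d.id = ca.d_id
    JOIN w ON d.calldate BETWEEN w.startdate AND w.enddate
    JOIN e ON e.id = ca.e_id
    JOIN r ON r.id = ca.r_id
    JOIN s ON s.id = ca.s_id
)
SELECT startdate, enddate, holiday, specialty,
       -- Concatenate the distinct initials for this week and role, skipping the blank employee.
       STUFF((SELECT DISTINCT ', ' + wk2.init
              FROM weekly wk2
              WHERE wk2.startdate = wk.startdate
                AND wk2.r_id = wk.r_id
                AND LTRIM(wk2.init) <> ''
              FOR XML PATH('')), 1, 2, '') AS initials,
       rolename, s_id
FROM weekly wk
GROUP BY startdate, enddate, holiday, specialty, rolename, r_id, s_id
ORDER BY startdate, r_id;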

OnCollisionEnter doesn't trigger

using UnityEngine;
using System.Collections;

public class changedirection : MonoBehaviour {

    void OnCollisionEnter(Collision col)
    {
        if (col.gameObject.name == "soldier")
        {
            GameObject go = col.gameObject;
            Move move = go.GetComponent<Move>();
            move.direction = -1;
        }
    }
}
Both objects that collide are triggers; sorry, I haven't programmed with Unity for over a year.
Static colliders don't collide with each other.
There was a table on docs.unity3d.com that showed when collision and trigger events are fired, but they have updated the site and now I can't find it. I still have it locally, so here it is.
Collision action matrix
Depending on the configurations of the two colliding Objects, a number of different actions can occur. The chart below outlines what you can expect from two colliding Objects, based on the components that are attached to them. Some of the combinations only cause one of the two Objects to be affected by the collision, so keep the standard rule in mind - physics will not be applied to objects that do not have Rigidbodies attached.
Collision detection occurs and messages are sent upon collision
╔═══════════╦══════════╤═══════════╤═══════════╤══════════╤═══════════╤═══════════╗
║ ║ Static │ Rigidbody │ Kinematic │ Static │ Rigidbody │ Kinematic ║
║ ║ Collider │ Collider │ Rigidbody │ Trigger │ Trigger │ Rigidbody ║
║ ║ │ │ Collider │ Collider │ Collider │ Trigger ║
║ ║ │ │ │ │ │ Collider ║
╠═══════════╬══════════╪═══════════╪═══════════╪══════════╪═══════════╪═══════════╣
║ Static ║ │ Y │ │ │ │ ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Rigidbody ║ Y │ Y │ Y │ │ │ ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Kinematic ║ │ Y │ │ │ │ ║
║ Rigidbody ║ │ │ │ │ │ ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Static ║ │ │ │ │ │ ║
║ Trigger ║ │ │ │ │ │ ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Rigidbody ║ │ │ │ │ │ ║
║ Trigger ║ │ │ │ │ │ ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Kinematic ║ │ │ │ │ │ ║
║ Rigidbody ║ │ │ │ │ │ ║
║ Trigger ║ │ │ │ │ │ ║
║ Collider ║ │ │ │ │ │ ║
╚═══════════╩══════════╧═══════════╧═══════════╧══════════╧═══════════╧═══════════╝
Trigger messages are sent upon collision
╔═══════════╦══════════╤═══════════╤═══════════╤══════════╤═══════════╤═══════════╗
║ ║ Static │ Rigidbody │ Kinematic │ Static │ Rigidbody │ Kinematic ║
║ ║ Collider │ Collider │ Rigidbody │ Trigger │ Trigger │ Rigidbody ║
║ ║ │ │ Collider │ Collider │ Collider │ Trigger ║
║ ║ │ │ │ │ │ Collider ║
╠═══════════╬══════════╪═══════════╪═══════════╪══════════╪═══════════╪═══════════╣
║ Static ║ │ │ │ │ Y │ Y ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Rigidbody ║ │ │ │ Y │ Y │ Y ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Kinematic ║ │ │ │ │ │ ║
║ Rigidbody ║ │ │ │ Y │ Y │ Y ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Static ║ │ │ │ │ │ ║
║ Trigger ║ │ Y │ Y │ │ Y │ Y ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Rigidbody ║ │ │ │ │ │ ║
║ Trigger ║ Y │ Y │ Y │ Y │ Y │ Y ║
║ Collider ║ │ │ │ │ │ ║
╟───────────╫──────────┼───────────┼───────────┼──────────┼───────────┼───────────╢
║ Kinematic ║ │ │ │ │ │ ║
║ Rigidbody ║ Y │ Y │ Y │ Y │ Y │ Y ║
║ Trigger ║ │ │ │ │ │ ║
║ Collider ║ │ │ │ │ │ ║
╚═══════════╩══════════╧═══════════╧═══════════╧══════════╧═══════════╧═══════════╝
Layer-Based Collision Detection
In Unity 3.x, Layer-Based Collision Detection was introduced: you can selectively tell Unity GameObjects to collide only with the specific layers they are assigned to.
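Since both objects in the question are triggers, OnCollisionEnter will never fire for them; per the second matrix, trigger messages are what get sent (provided at least one of the objects has a Rigidbody). A sketch of the question's script switched over to OnTriggerEnter:
using UnityEngine;

public class changedirection : MonoBehaviour {

    // Trigger colliders send OnTriggerEnter instead of OnCollisionEnter.
    void OnTriggerEnter(Collider other)
    {
        if (other.gameObject.name == "soldier")
        {
            // Move is the component referenced in the question's script.
            Move move = other.gameObject.GetComponent<Move>();
            move.direction = -1;
        }
    }
}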