I'm logging energy usage data as a counter, which I would like to display as cumulative graphs that reset daily, similar to what was asked here.
I can generate the cumulative value as follows:
SELECT mean("value")
FROM "energy"
WHERE $timeFilter
GROUP BY time($__interval)
and the daily value as well:
SELECT max("value")
FROM "energy"
WHERE $timeFilter
GROUP BY time(1d)
but I cannot subtract one from the other or combine both in one query, because the GROUP BY intervals are different.
(How) is this possible in InfluxDB? I've looked at INTEGRATE() but haven't found a way to make it work.
The data looks like this (example limited to 1 day):
time value
---- ----
2018-12-10T17:00:00Z 7
2018-12-10T18:00:00Z 9
2018-12-10T19:00:00Z 10
2018-12-10T20:00:00Z 11
2018-12-10T21:00:00Z 13
2018-12-10T22:00:00Z 14
2018-12-10T23:00:00Z 15
2018-12-11T00:00:00Z 16
2018-12-11T01:00:00Z 17
2018-12-11T02:00:00Z 20
2018-12-11T03:00:00Z 24
2018-12-11T04:00:00Z 25
2018-12-11T05:00:00Z 26
2018-12-11T06:00:00Z 27
2018-12-11T07:00:00Z 28
2018-12-11T08:00:00Z 29
2018-12-11T09:00:00Z 31
2018-12-11T10:00:00Z 32
2018-12-11T11:00:00Z 33
2018-12-11T12:00:00Z 34
2018-12-11T13:00:00Z 35
2018-12-11T14:00:00Z 36
2018-12-11T15:00:00Z 37
2018-12-11T16:00:00Z 38
2018-12-11T17:00:00Z 39
I can plot the following:
But I want something like:
I found a solution; it's quite simple in the end:
SELECT kaifa - kaifa_fill AS Energy FROM
  (SELECT first(kaifa) AS kaifa_fill FROM energyv2 WHERE $timeFilter GROUP BY time(1d) TZ('Europe/Amsterdam')),
  (SELECT first(kaifa) AS kaifa FROM energyv2 WHERE $timeFilter GROUP BY time($__interval))
fill(previous)
Note that fill(previous) is required to ensure that kaifa_fill and kaifa overlap.
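Conceptually, the outer query subtracts the first counter reading of the day (kaifa_fill, from the GROUP BY time(1d) subquery) from the counter reading of each interval (kaifa), so the plotted value starts at 0 again every day. A minimal sketch of the same idea in plain Python, using a few readings from the sample data in the question and ignoring time zones (illustration only, not part of the query):

from datetime import datetime

# a few of the sample readings from the question: timestamp -> counter value
readings = {
    datetime(2018, 12, 10, 17): 7,
    datetime(2018, 12, 10, 23): 15,
    datetime(2018, 12, 11, 0): 16,
    datetime(2018, 12, 11, 17): 39,
}

# equivalent of first(kaifa) ... GROUP BY time(1d): the first reading of each day
day_first = {}
for ts in sorted(readings):
    day_first.setdefault(ts.date(), readings[ts])

# equivalent of kaifa - kaifa_fill: each reading minus that day's first reading,
# i.e. a cumulative value that resets to 0 at the start of every day
for ts in sorted(readings):
    print(ts, readings[ts] - day_first[ts.date()])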
Example data:
time kaifa kaifa_fill kaifa_kaifa_fill
---- ----- ---------- ----------------
2019-08-03T00:00:00Z 179688195 179688195 0
2019-08-03T01:00:00Z 179746833 179688195 58638
2019-08-03T02:00:00Z 179803148 179688195 114953
2019-08-03T03:00:00Z 179859464 179688195 171269
2019-08-03T04:00:00Z 179914038 179688195 225843
2019-08-03T05:00:00Z 179967450 179688195 279255
2019-08-03T06:00:00Z 179905910 179688195 217715
2019-08-03T07:00:00Z 179847272 179688195 159077
2019-08-03T08:00:00Z 179698065 179688195 9870
2019-08-03T09:00:00Z 179378170 179688195 -310025
2019-08-03T10:00:00Z 179341013 179688195 -347182
2019-08-03T11:00:00Z 179126201 179688195 -561994
2019-08-03T12:00:00Z 179039116 179688195 -649079
2019-08-03T13:00:00Z 178935193 179688195 -753002
2019-08-03T14:00:00Z 178687870 179688195 -1000326
2019-08-03T15:00:00Z 178517762 179688195 -1170433
2019-08-03T16:00:00Z 178409776 179688195 -1278420
2019-08-03T17:00:00Z 178376102 179688195 -1312093
2019-08-03T18:00:00Z 178388875 179688195 -1299320
2019-08-03T19:00:00Z 178780181 179688195 -908015
2019-08-03T20:00:00Z 178928226 179688195 -759969
2019-08-03T21:00:00Z 179065241 179688195 -622954
2019-08-03T22:00:00Z 179183098 179688195 -505098
2019-08-03T23:00:00Z 179306179 179688195 -382016
2019-08-04T00:00:00Z 179306179 179370042 -63863
2019-08-04T00:00:00Z 179370042 179370042 0
2019-08-04T01:00:00Z 179417649 179370042 47607
2019-08-04T02:00:00Z 179464094 179370042 94053
2019-08-04T03:00:00Z 179509960 179370042 139918
2019-08-04T04:00:00Z 179591820 179370042 221779
2019-08-04T05:00:00Z 179872817 179370042 502775
2019-08-04T06:00:00Z 180056278 179370042 686236
2019-08-04T07:00:00Z 179929713 179370042 559671
2019-08-04T08:00:00Z 179514604 179370042 144562
2019-08-04T09:00:00Z 179053049 179370042 -316992
2019-08-04T10:00:00Z 178683225 179370042 -686817
2019-08-04T11:00:00Z 178078269 179370042 -1291773
2019-08-04T12:00:00Z 177650387 179370042 -1719654
2019-08-04T13:00:00Z 177281724 179370042 -2088317
2019-08-04T14:00:00Z 177041367 179370042 -2328674
2019-08-04T15:00:00Z 176807397 179370042 -2562645
2019-08-04T16:00:00Z 176737148 179370042 -2632894
2019-08-04T17:00:00Z 176677349 179370042 -2692693
2019-08-04T18:00:00Z 176690702 179370042 -2679340
2019-08-04T19:00:00Z 176734825 179370042 -2635216
2019-08-04T20:00:00Z 176810300 179370042 -2559742
2019-08-04T21:00:00Z 176866035 179370042 -2504007
2019-08-04T22:00:00Z 176914803 179370042 -2455239
2019-08-04T23:00:00Z 176965893 179370042 -2404149
2019-08-05T00:00:00Z 176965893 177016983 -51090
2019-08-05T00:00:00Z 177016983 177016983 0
Example graph:
Related
For example, let's say that I have an invoice for 1000 CUA (Currency A) and the exchange rate is 1 CUA = 20.20 CUB (Currency B). So I make 10 payments of 2019.90 CUB each.
#    Payment (CUB)   Payment (CUA)   Balance
--   -------------   -------------   -------
0                                    1000.00
1    2019.90         100.00          900.00
2    2019.90         100.00          800.00
3    2019.90         100.00          700.00
4    2019.90         100.00          600.00
5    2019.90         100.00          500.00
6    2019.90         100.00          400.00
7    2019.90         100.00          300.00
8    2019.90         100.00          200.00
9    2019.90         100.00          100.00
10   2019.90         100.00          0.00
Σ    20199.00        1000.00
1000.00 CUA is 20200.00 CUB, but the total of the payments was only 20199.00 CUB.
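A minimal sketch of that rounding effect in Python with Decimal, assuming each CUB payment is converted to CUA and rounded to two decimals on its own:

from decimal import Decimal, ROUND_HALF_UP

rate = Decimal("20.20")            # CUB per CUA
payment_cub = Decimal("2019.90")   # the amount actually paid each time

# converted and rounded in isolation, each payment is credited as a full 100.00 CUA
per_payment_cua = (payment_cub / rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
print(per_payment_cua)                                 # 100.00

# so ten such payments clear the 1000.00 CUA invoice ...
print(per_payment_cua * 10)                            # 1000.00

# ... while the CUB actually received falls 1.00 CUB short of the invoice value
print(Decimal("1000.00") * rate - payment_cub * 10)    # 1.0000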
I have the following dataset:
StartDate EnterDate Order#
---------- ---------- ------
2018-01-01 2018-01-01 1
2018-01-01 2018-01-01 2
2018-01-01 2018-01-02 3
2018-01-02 2018-01-02 4
2018-01-02 2018-01-03 5
2018-01-02 2018-01-03 6
2018-01-03 2018-01-04 7
2018-01-03 2018-01-04 8
2018-01-03 2018-01-04 9
2018-01-03 2018-01-05 10
I need to COUNT the number of dates in each column.
Example output:
Date StartDate EnterDate
---------- --------- ---------
01-01-2018 3 2
01-02-2018 3 2
01-03-2018 4 2
01-04-2018 0 3
01-05-2018 0 1
NULL can be substituted for 0.
You can use a full join to achieve that:
select
    Date      = isnull(t.StartDate, q.EnterDate),
    StartDate = isnull(t.cnt, 0),
    EnterDate = isnull(q.cnt, 0)
from (
    select StartDate, count(*) cnt
    from myTable
    group by StartDate
) t
full join (
    select EnterDate, count(*) cnt
    from myTable
    group by EnterDate
) q on t.StartDate = q.EnterDate
Having the following data in a table:
ID     Category   Value
----   --------   -----
1234   Cat01      V001
1234   Cat02      V002
1234   Cat03      V003
1234   Cat03      V004
1234   Cat03      V005
I want to have the following output:
ID     Cat01   Cat02   Cat03
----   -----   -----   -----
1234   V001    V002    V003
1234   V001    V002    V004
1234   V001    V002    V005
How can this be done in PostgreSQL? As you can see, the values in the Cat01 and Cat02 columns are repeated for each entry in the Cat03 column.
Many thanks for your help!
How about something like this:
SELECT a.id, a.val AS cat01, b.val AS cat02, c.val AS cat03
FROM
  test_pivot AS a,
  test_pivot AS b,
  test_pivot AS c
WHERE
  -- keep rows of the same ID together (the sample only has ID 1234)
  a.id = b.id AND b.id = c.id
  AND a.category = 'Cat01'
  AND b.category = 'Cat02'
  AND c.category = 'Cat03'
I have the following structure, and it is indexed using Whoosh.
timestamp name count(b.name)
------------------- ---- -------------
2010-11-16 10:32:22 John 2
2010-11-16 10:35:12 John 7
2010-11-16 10:36:34 John 1
2010-11-16 10:37:45 John 2
2010-11-16 10:48:26 John 8
2010-11-16 10:55:00 John 9
2010-11-16 10:58:08 John 2
I want to make a query that returns the following structure, so it displays the name frequency every 5 minutes:
timestamp name count(b.name)
------------------- ---- -------------
2010-11-16 10:30:00 John 2
2010-11-16 10:35:00 John 10
2010-11-16 10:40:00 John 0
2010-11-16 10:45:00 John 8
2010-11-16 10:50:00 John 0
2010-11-16 10:55:00 John 11
One possible solution is to introduce an additional field into the index, e.g. timestamp_trimmed: trim each timestamp to its 5-minute interval, save the result into the timestamp_trimmed field, and perform the search grouped by the timestamp_trimmed field.
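A minimal sketch of the trimming step in plain Python (trim_to_5min is a hypothetical helper; storing its result in an extra timestamp_trimmed field of the Whoosh schema and grouping the search on that field is omitted here):

from datetime import datetime

def trim_to_5min(ts):
    """Truncate a timestamp to the start of its 5-minute bucket."""
    return ts.replace(minute=ts.minute - ts.minute % 5, second=0, microsecond=0)

print(trim_to_5min(datetime(2010, 11, 16, 10, 32, 22)))   # 2010-11-16 10:30:00
print(trim_to_5min(datetime(2010, 11, 16, 10, 37, 45)))   # 2010-11-16 10:35:00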
I have a project setup like this:
bin/fizzbuzz-game.pl
lib/FizzBuzz.pm
test/TestFizzBuzz.pm
test/TestFizzBuzz.t
When I run coverage on this, using
perl -MDevel::Cover=-db,/tmp/cover_db test/*.t
... I get the following output:
----------------------------------- ------ ------ ------ ------ ------ ------
File stmt bran cond sub time total
----------------------------------- ------ ------ ------ ------ ------ ------
lib/FizzBuzz.pm 100.0 100.0 n/a 100.0 1.4 100.0
test/TestFizzBuzz.pm 100.0 n/a n/a 100.0 97.9 100.0
test/TestFizzBuzz.t 100.0 n/a n/a 100.0 0.7 100.0
Total 100.0 100.0 n/a 100.0 100.0 100.0
----------------------------------- ------ ------ ------ ------ ------ ------
That is: the totally-uncovered file bin/fizzbuzz-game.pl is not included in the results.
How do I fix this?
Have you checked the documentation? The section on Selecting which files to cover seems most helpful. :) It looks like the +select option is the one you are looking for.
I figured out a work-around for this.
The core of this problem is that the uncovered code in the main file (fizzbuzz-game.pl) is not included in the coverage report, hence the overall percentage is wrong. The underlying problem is that substantial logic resides in the main file instead of in testable modules. This is a code smell (I don't know which one, but I'm pretty sure there is a name for "lots of logic in main()").
By getting rid of this smell, e.g. by moving all substantial code from bin/fizzbuzz-game.pl to lib/FizzBuzzGame.pm, the code can theoretically be tested and can definitely be included in the test run.
The coverage report after this becomes:
----------------------------------- ------ ------ ------ ------ ------ ------
File stmt bran cond sub time total
----------------------------------- ------ ------ ------ ------ ------ ------
lib/FizzBuzz.pm 100.0 100.0 n/a 100.0 0.0 100.0
lib/FizzBuzzGame.pm 75.0 n/a n/a 75.0 100.0 75.0
Total 87.5 100.0 n/a 83.3 100.0 88.9
----------------------------------- ------ ------ ------ ------ ------ ------