Total date calculation for a list of date intervals

I have an array of start-end dates (work experience). What I need is to calculate the total duration of these experiences.
Example 1:
[ {"startDate": "1999-01-01", "endDate": "1999-12-31"}, {"startDate": "2000-01-01", "endDate": "2000-12-31"} ]
This data should produce this result: 2 Years, 0 Months, 0 Days
Example 2:
[ {"startDate": "1999-01-01", "endDate": "1999-02-28"}, {"startDate": "2000-03-01", "endDate": "2000-05-31"} ]
This data should produce: 0 Years 5 Months 0 Days
My current code is like this:
int totalDays = 0;
foreach (var experience in experiences)
{
    // TimeSpan.Days is the whole-day difference; +1 makes the end date inclusive
    totalDays += experience.endDate.Subtract(experience.startDate).Days + 1;
}
int years = totalDays / 365;
totalDays -= years * 365;
int months = totalDays / 30;
totalDays -= months * 30;
int days = totalDays;
There are 2 problems I'm struggling with:
1: I cannot omit leap days. My result for the first example is: 2 Years 0 Months 1 Days (February 29th 2000)
2: I cannot calculate days in months correctly. My result for the second example is: 0 Years 5 Months 1 Days (151 / 30)
Is there a correct way to do this?

Related

Aggregate Postgresql rows into groups based on interval

I have a result set from a CTE that returns the interval between each row insertion like follows:
row | interval
----+------------------------------------------------
  1 | 0 years 0 mons 0 days 1 hours 0 mins 16.0 secs
  2 | 0 years 0 mons 0 days 0 hours 45 mins 42.0 secs
  3 | 0 years 0 mons 0 days 0 hours 0 mins 20.0 secs
  4 | 0 years 0 mons 0 days 5 hours 4 mins 19.0 secs
  5 | 0 years 0 mons 0 days 0 hours 2 mins 32.0 secs
  6 | 0 years 0 mons 0 days 0 hours 0 mins 25.0 secs
  7 | 0 years 0 mons 0 days 0 hours 1 mins 9.0 secs
  8 | 0 years 0 mons 0 days 0 hours 0 mins 25.0 secs
  9 | 0 years 0 mons 0 days 0 hours 0 mins 16.0 secs
 10 | 0 years 0 mons 0 days 2 hours 50 mins 24.0 secs
 11 | 0 years 0 mons 0 days 1 hours 6 mins 49.0 secs
 12 | 0 years 0 mons 0 days 0 hours 4 mins 6.0 secs
 13 | 0 years 0 mons 0 days 0 hours 1 mins 6.0 secs
 14 | 0 years 0 mons 0 days 0 hours 5 mins 48.0 secs
 15 | 0 years 0 mons 0 days 0 hours 3 mins 42.0 secs
 16 | 0 years 0 mons 0 days 0 hours 0 mins 22.0 secs
 17 | 0 years 0 mons 0 days 0 hours 0 mins 30.0 secs
 18 | 0 years 0 mons 0 days 0 hours 0 mins 19.0 secs
 19 | 0 years 0 mons 0 days 0 hours 0 mins 16.0 secs
 20 | 0 years 0 mons 0 days 0 hours 1 mins 55.0 secs
What I need to extract is an aggregation of all rows after a cutoff value, grouped together until another cutoff initiates a new grouping. For example, say the interval cutoff is 1 hour: starting with row 1, rows 1-3 should be grouped together, because row 1 is above the cutoff and begins a new grouping, and rows 2 and 3 are below the cutoff so they join that group. At row 4 a new group would be created because 4 is above the cutoff, and all subsequent rows would be included up to 9. Row 10 would be its own group because 11 is also above the cutoff, and so on.
Thanks very much for any assistance.
create table tmp (x int, delta interval);

insert into tmp (x, delta)
values (1, '0 years 0 mons 0 days 1 hours 0 mins 16.0 secs'),
(2, '0 years 0 mons 0 days 0 hours 45 mins 42.0 secs'),
(3, '0 years 0 mons 0 days 0 hours 0 mins 20.0 secs'),
(4, '0 years 0 mons 0 days 0 hours 4 mins 19.0 secs'),
(5, '0 years 0 mons 0 days 0 hours 2 mins 32.0 secs'),
(6, '0 years 0 mons 0 days 0 hours 0 mins 25.0 secs'),
(7, '0 years 0 mons 0 days 0 hours 1 mins 9.0 secs'),
(8, '0 years 0 mons 0 days 0 hours 0 mins 25.0 secs'),
(9, '0 years 0 mons 0 days 0 hours 0 mins 16.0 secs'),
(10, '0 years 0 mons 0 days 2 hours 50 mins 24.0 secs'),
(11, '0 years 0 mons 0 days 1 hours 6 mins 49.0 secs'),
(12, '0 years 0 mons 0 days 0 hours 4 mins 6.0 secs'),
(13, '0 years 0 mons 0 days 0 hours 1 mins 6.0 secs'),
(14, '0 years 0 mons 0 days 0 hours 5 mins 48.0 secs'),
(15, '0 years 0 mons 0 days 0 hours 3 mins 42.0 secs'),
(16, '0 years 0 mons 0 days 0 hours 0 mins 22.0 secs'),
(17, '0 years 0 mons 0 days 0 hours 0 mins 30.0 secs'),
(18, '0 years 0 mons 0 days 0 hours 0 mins 19.0 secs'),
(19, '0 years 0 mons 0 days 0 hours 0 mins 16.0 secs'),
(20, '0 years 0 mons 0 days 0 hours 1 mins 55.0 secs');
It's unclear to me from your question if 10 consecutive rows with a delta of 10 minutes each should be lumped into one group or two. If you want them all to be one group, you can do something like this:
WITH ranges AS (
  SELECT x as start, lead(x) over (order by x) as "end"
  FROM tmp
  WHERE delta > interval '1 hour'
)
SELECT ranges.start, string_agg(x::text, ', ' ORDER BY x)
FROM tmp, ranges
WHERE tmp.x >= ranges.start
  AND (ranges."end" is null OR tmp.x < ranges."end")
GROUP BY ranges.start, ranges."end"
ORDER BY ranges.start;
start | string_agg
-------+----------------------------------------
1 | 1, 2, 3, 4, 5, 6, 7, 8, 9
10 | 10
11 | 11, 12, 13, 14, 15, 16, 17, 18, 19, 20
First we identify the rows with a delta greater than the cutoff value, which represent the cutoff points, and we use a window function to determine the range. Then it's just a matter of aggregating over those ranges.
However, if you want the groups to be no more than one hour long, then that's trickier. I'm not sure if it can be done without a loop.
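For what it's worth, the fixed-cutoff grouping above can also be expressed with a running count of over-the-cutoff rows acting as a group id; a sketch, assuming the same tmp(x, delta) table:

SELECT min(x) AS start, string_agg(x::text, ', ' ORDER BY x)
FROM (
  SELECT x,
         count(*) FILTER (WHERE delta > interval '1 hour')
                  OVER (ORDER BY x) AS grp
  FROM tmp
) sub
GROUP BY grp
ORDER BY start;

Each row's grp is the number of cutoff rows seen so far, so every row above the cutoff starts a new group and the rows below the cutoff inherit the group of the last cutoff row before them.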

How to convert a random distributed vector to datetime object in MATLAB

I have a vector with randomly distributed values from 0 to 10 in increasing order, e.g. [1 3 4 9 10]. How can I convert this vector to a datetime object with time values between, e.g., November and December, such that these numbers represent the corresponding times in between?
For example, if x = [1 2 3] and I want the time period to be the whole of January, then the output should be [1st January, 15th January, 30th January], according to their relative values.
For example, if x = [0 0.5 9 10] and we have the entire January, then 0 should map to the first day in January and 10 to the last day in January. 0.5 will map to the date at fraction 0.5/10 = 1/20 of the way from the first of January to the last; that date will be approximately 30 * 1 / 20 = 1.5 days into January. In the same way, 9 will be at position 9 / 10 of 30 days, that is 30 * 9 / 10 = 27, i.e. the 27th day of January. So the output should be [1st January, 1.5th January, 27th January, 30th January] in datetime format.
You can use datenum and some basic arithmetic to arrive at the following solution:
formatIn = 'dd.mm.yyyy';
d1 = '01.01.2017'; % user input, should be the earlier date
d2 = '31.01.2017'; % user input, should be the later date
x = [0 0.5 5 7 10]; % user input
d1 = datenum(d1,formatIn);
d2 = datenum(d2,formatIn);
daysAfter_d1 = d2-d1;
x = x/max(x);
addDays = round(daysAfter_d1*x);
interpolatedDates = d1 + addDays;
datestr(interpolatedDates,formatIn)
ans =
01.01.2017
03.01.2017
16.01.2017
22.01.2017
31.01.2017
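If you would rather work with datetime objects directly instead of serial date numbers, here is a rough equivalent sketch (same assumed inputs, MATLAB R2014b or newer; note that datetime uses 'MM' for months in its format string, unlike datenum):

formatIn = 'dd.MM.yyyy';
d1 = datetime('01.01.2017', 'InputFormat', formatIn); % earlier date
d2 = datetime('31.01.2017', 'InputFormat', formatIn); % later date
x  = [0 0.5 5 7 10];
interpolatedDates = d1 + days(round(days(d2 - d1) * x / max(x)))

This produces the same five dates as the datenum version above, but as a datetime array.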

How to calculate the day of the week based on unix time

I know that there are functions/classes in most programming languages to do that, but I would like to know the calculation.
So: How do I get from the unix time in seconds to a day-number (e.g. 0 for Sunday, 1 for Monday etc.)?
Thanks in advance. BTW: this is my first post on Stack Overflow.
The problem you're asking about is reasonably easy, compared to how ridiculously complicated other date/time calculations can be (e.g. Zeller's congruence).
Unix time is defined as the number of seconds elapsed after January 1, 1970, at 00:00 (midnight) UTC.
You can look up a calendar to find that 1970-01-01 was a Thursday. There are 24 * 60 * 60 = 86400 seconds in a day.
Therefore values 0 to 86399 are Thursday, 86400 to 172799 are Friday, 172800 to 259199 are Saturday, etc. These are blocks of 86400 seconds aligned at 0.
Suppose T is your Unix timestamp. Then floor(T / 86400) tells you the number of days after 1970-01-01. 0 = Thursday January 1st; 1 = Friday January 2nd; 2 = Saturday January 3rd; etc.
Add 4 and modulo 7. Now 0 → 4; 1 → 5; 2 → 6; 3 → 0; 4 → 1; 5 → 2; 6 → 3; 7 → 4; 8 → 5; 9 → 6; 10 → 0; etc. This is your final answer.
In summary: day of week = (floor(T / 86400) + 4) mod 7.
(This assumes that you want the day of week in UTC. If you want to calculate it for another time zone, you need to perform some addition or subtraction of hours and minutes on T first.)
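To make the arithmetic concrete, a minimal sketch in C (an illustration only; it assumes you want the day of week in UTC for the current system time):

#include <stdio.h>
#include <time.h>

int main(void) {
    time_t t = time(NULL);                 /* seconds since 1970-01-01 00:00 UTC */
    int day = (int)((t / 86400 + 4) % 7);  /* 0 = Sunday, ..., 6 = Saturday */
    printf("day of week: %d\n", day);
    return 0;
}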
In JavaScript, days of the week are:
0 = Sun
1 = Mon
2 = Tue
3 = Wed
4 = Thu
5 = Fri
6 = Sat
You can use built-in methods:
// Unix epoch, 4 = Thu
new Date(0).getUTCDay()
// Today, 2 = Tue
new Date().getUTCDay()
Or a custom solution (remember to divide getTime() milliseconds by 1000):
// Unix epoch, 4 = Thu
(Math.floor(new Date(0).getTime() / 86400 / 1000) + 4) % 7
// Today, 2 = Tue
(Math.floor(new Date().getTime() / 86400 / 1000) + 4) % 7
Solution (adapted from GeeksforGeeks; the divisions must be integer divisions, so Math.floor is applied to each term):
function dayOfWeek(d, m, y) {
  // 0 = Sun, 1 = Mon, ..., 6 = Sat
  const t = [0, 3, 2, 5, 0, 3, 5, 1, 4, 6, 2, 4];
  if (m < 3) y -= 1;
  return (y + Math.floor(y / 4) - Math.floor(y / 100) + Math.floor(y / 400) + t[m - 1] + d) % 7;
}
// Unix epoch, 4 = Thu
dayOfWeek(1, 1, 1970)
// Today, 2 = Tue
dayOfWeek(7, 12, 2021)
https://www.geeksforgeeks.org/find-day-of-the-week-for-a-given-date/

How to ask turtles to do different actions according to ticks

The time step of my model is one month, and I want to divide the actions: for the first 6 months the turtles will do one thing, and for the second six months another action.
What I know is that if I want the turtles to do the same action every 6 months, I can use:
if ticks mod 6 = 0
Thanks in advance
If I'm understanding your question right, you can just do:
if ticks < 6 [
do-one-action
]
if ticks >= 6 [
do-other-action
]
Edit: Just saw your comments. If you want to alternate actions every 6 ticks, you could do:
if ticks mod 12 < 6 [
do-one-action
]
if ticks mod 12 >= 6 [
do-other-action
]
If ticks tells us the number of months that have passed in the simulation, then ticks mod 12 tells us which month of the current year it is (e.g. 0, 1, 2, ... 11). So ticks mod 12 < 6 says "if we're in the first 6 months of the current year".
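If you prefer a single conditional, ifelse expresses the same thing (a sketch, assuming your two procedures are called do-one-action and do-other-action as above):

ifelse ticks mod 12 < 6
  [ do-one-action ]
  [ do-other-action ]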

Retrieve Month, Day, Hour, Minute from a number in minutes

I want to create a counter that retrieves the number of months, days, hours, and minutes from a given number of minutes. For example, I know that:
60 minutes in an hour
24 hours in a day = 60 x 24 = 1440 minutes
31 days per month
1 month = 24 x 60 x 31 = 44,640 minutes
So if I give, for example, the number 44640, I want to get 1 month 0 days 0 hours 0 minutes; or if I give 44700, I want to get 1 month 0 days 0 hours 60 minutes, or 1 month 0 days 1 hour 0 minutes.
Any help please?
int total_minutes = 44640;
int total_hours = total_minutes / 60;
int minutes = total_minutes % 60;
int total_days = total_hours / 24;
int hours = total_hours % 24;
int months = total_days / 31;
int days = total_days % 31;
printf("%d months, %d days, %02d:%02d\n", months, days, hours, minutes);
But that's misleading, since months are not all 31 days. On average, a month in the Gregorian calendar is 30.436875 days (43829.1 minutes), so you could use that figure. For many applications, such calculations just assume that a month is always 30 days. But if your time interval is anchored to a specific point in time, it might be better to use the date at both ends to determine how many whole months there are between them.
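If you do want the average month length instead of a flat 31 days, a sketch of the same breakdown (the 30.436875-day figure is the Gregorian average mentioned above; whether that convention fits your application is an assumption you have to make):

#include <stdio.h>

int main(void) {
    const double avg_month_minutes = 30.436875 * 24 * 60;  /* = 43829.1 minutes */
    int total_minutes = 44700;

    int months = (int)(total_minutes / avg_month_minutes);
    int rest = total_minutes - (int)(months * avg_month_minutes);
    int days = rest / (24 * 60);
    rest %= 24 * 60;
    int hours = rest / 60;
    int minutes = rest % 60;

    printf("%d months, %d days, %02d:%02d\n", months, days, hours, minutes);
    return 0;
}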