Iterate for loop by hour in MATLAB

I am writing a for loop to average 10 years of hourly measurements made on the hour. The dates of the measurements are recorded as MATLAB datenums.
I am trying to iterate using a step of 0.0417, since that is the datenum value for 1:00 AM on day zero (00/00/00), but it adds a couple of seconds of error on each iteration.
Can anyone recommend a better way for me to iterate by hour?
date = a(:,1);
load = a(:,7);
%loop for each hour of the year
for i=0:0.0417:366
    %set condition
    %condition removes year from current date
    c = date(:)-datenum(year(date(:)),0,0)==i;
    %evaluate condition on load vector and find mean
    X(i,2)=mean(load(c==1));
end

An hour has a duration of 1/24 day, not 0.0417. Use 1/24 and the precision is sufficiently high for a whole year.
For even higher precision, use something like datenum(y,1,1,1:24*365,0,0) to generate all the timestamps directly.
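For illustration, a minimal sketch of both approaches (the year variable y is assumed for the example):

y = 2015;                                      % assumed example year
hours_step = datenum(y,1,1) + (0:24*365-1)/24; % step by exactly 1/24 day
hours_gen  = datenum(y,1,1,0:24*365-1,0,0);    % let datenum do the hour arithmetic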

To avoid error drift entirely, specify the index using integers, and divide the result down inside the loop:
for hour_index = 1:365*24
    hour_datenum = (hour_index - 1) / 24;
end
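Applied to the original problem, the averaging loop might look like this sketch (assuming date and load are column vectors, and comparing against a tolerance rather than with == to sidestep floating-point error; the half-second tolerance is an arbitrary choice):

day_of_year = date - datenum(year(date),0,0); % fractional day within the year
tol = 0.5/86400;                              % half a second, as a fraction of a day
X = zeros(365*24,1);
for hour_index = 1:365*24
    c = abs(day_of_year - (hour_index-1)/24) < tol; % records at this hour of the year
    X(hour_index) = mean(load(c));
end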

Related

How to get monthly totals from linearly interpolated data

I am working with a data set of 10,000s of variables which have been repeatedly measured since the 1980s. The first measurements for each variable are not on the same date, and the variables are measured irregularly: sometimes measurements are only a month apart; in a small number of cases they are decades apart.
I want to get the change in each variable per month.
So far I have a cell array of measurement dates, and a cell array of interpolated rates of change between measurements (each cell represents a single variable in either array, and I've only posted the first 5 cells of each):
DateNumss= {[736614;736641;736669] [736636;736666] 736672 [736631;736659;736685] 736686}
LinearInterpss={[17.7777777777778;20.7142857142857;0] [0.200000000000000;0] 0 [2.57142857142857;2.80769230769231;0]}
How do I get monthly sums of the interpolated change in variable?
i.e.
If the first measurement for a variable is made on January 1st, and the linearly interpolated change between that and the next measurement is 1 per day; and the next measurement is on February 5th and the corresponding linearly interpolated change is 2; then January has a total change of 1*31 (31 days at 1) and February has a total change of 1*5+2*23 (5 days at 1, 23 days at 2).
You first need the serial date numbers at which each month starts.
mat(:,1)=sort(repmat(1980:1989,[1,12])); % year column: each year repeated 12 times
mat(:,2)=repmat(1:12,[1,size(mat,1)/12]); % month column: 1..12 within each year
mat(:,3)=1;                               % first day of each month
monthseps=datenum(mat);                   % serial datenum of every month boundary
This gives you a list of all 120 month boundaries in the eighties.
Now you want, for each month, the change on each day, summed over the month. If you still have the original data this is easier, since you can simply interpolate each day's value in MATLAB. If you only have "LinearInterpss", you need to map it onto the days using interp1 with the method 'previous', which looks up the last rate at or before each day.
v = 1; % process one variable; see the sketch below for looping over all of them
monthTotal = zeros(length(monthseps)-1,1);
for ct = 2:length(monthseps)
    days = monthseps(ct-1):(monthseps(ct)-1); % days in the month
    % assign each day a change value: interp1 with method 'previous' returns
    % the last value of LinearInterpss at or before that day
    vals = interp1(DateNumss{v},LinearInterpss{v},days,'previous');
    monthTotal(ct-1) = sum(vals,'omitnan'); % total change in the month; days outside the measured range give NaN
end
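A hedged sketch of the outer loop over every variable (skipping variables with a single measurement, since interp1 needs at least two sample points; monthlyChange is a name made up here):

nVars = numel(DateNumss);
monthlyChange = nan(length(monthseps)-1,nVars);
for v = 1:nVars
    if numel(DateNumss{v}) < 2, continue; end % interp1 needs >= 2 points
    for ct = 2:length(monthseps)
        days = monthseps(ct-1):(monthseps(ct)-1);
        vals = interp1(DateNumss{v},LinearInterpss{v},days,'previous');
        monthlyChange(ct-1,v) = sum(vals,'omitnan');
    end
end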

MATLAB: Find all values on one date, then filter down to an hour and find average [duplicate]

This question already has answers here:
Counting values by day/hour with timeseries in MATLAB (3 answers)
Closed 6 years ago.
I have a year's worth of data, recorded at one-minute intervals on each day of the year.
The date and time were imported from Excel (in the form 243.981944; adding 42004 puts the serial date in 2015, and formatting it as a date gives 31.8.15 23:34:00).
Importing to MATLAB it becomes
'31/08/2015 23:34:00'
I require the data for each day of the year to be at hourly intervals, so I need to sum the data recorded in each hour and divide by the number of records in that hour, giving me the hourly average.
For some reason the data in August actually increments at two-minute intervals; data for every other month increments at one-minute intervals, i.e.
...
31/07/2015 23:57:00
31/07/2015 23:58:00
31/07/2015 23:59:00
31/08/2015 00:00:00
31/08/2015 00:02:00
31/08/2015 00:04:00
...
I'm not sure how I can find all the values for a specific date and hour in order to work out the averages. I was thinking of using a for loop to find the values on each day, but when I got down to writing code I realised this wouldn't work the way I was thinking.
I presume there must be some kind of function available that would allow the data to be filtered by date and time?
edit:
So I tried the following but I get these errors.
dates is a 520000x1 cell array containing the dates in the format formatIn.
formatIn = 'DD/MM/YYYY HH:MM:SS';
[~,M,D,H] = datevec(dates, formatIn);
Error using cnv2icudf (line 131) Unrecognized minute format.
Format string: DD/MM/YYYY HH:MM:SS.
Error in datevec (line 112) icu_dtformat = cnv2icudf(varargin{isdateformat});
Assuming your data is in a matrix or cell array of strings called A, and your other data is in a vector X. Let's say all the data is in the same year (so we can ignore years):
[~,M,D,H] = datevec(A, 'dd/mm/yyyy HH:MM:SS');
mean_A = accumarray([M, D, H+1], X, [], @mean);
Then data from February will be in
mean_A(2,:,:)
To look at the data, you may find the squeeze() function useful, e.g.
squeeze(mean_A(2,1:10,13:24))
shows the average for the hours after midday (by column) for the first ten days (by row) of February.
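A self-contained sketch of the whole pipeline on synthetic data (all variable names here are made up for the example):

t0 = datenum(2015,1,1);
tnum = (t0 : 1/1440 : t0+7)';               % one week of 1-minute timestamps
A = datestr(tnum,'dd/mm/yyyy HH:MM:SS');    % string timestamps, as in the question
X = rand(size(tnum));                       % fake measurements
[~,M,D,H] = datevec(A,'dd/mm/yyyy HH:MM:SS');
mean_A = accumarray([M, D, H+1], X, [], @mean);
squeeze(mean_A(1,1:7,:))                    % hourly means for the first 7 days of January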
See also:
Counting values by day/hour with timeseries in MATLAB

Group of consecutive minimum values within data - MATLAB

I have daily river flow data for 1975-2009 and I am asked to find the 7 consecutive days within each year that have the smallest flows.
Any advice how to start this? I've only been using MATLAB for a couple weeks.
Thanks!
You could convolve the data with ones(1,7) and look for the minimum, which will yield the starting day of your dry period:
[~,startingDay] = min(conv(flow,ones(1,7),'valid'))
(This is basically a moving average filter without the normalization).
Loop through the years to get each year's result.
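A sketch of that per-year loop (flow and yr are assumed names here: a column vector of daily flows and the year of each observation):

yrs = unique(yr);
startDay = zeros(size(yrs));
for k = 1:numel(yrs)
    f = flow(yr == yrs(k));                           % this year's daily flows
    [~,startDay(k)] = min(conv(f,ones(7,1),'valid')); % driest 7-day window starts here
end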
Start by finding the cumulative sum with cumsum. The difference between cumulative sums 7 days apart gives the total for those 7 days. Then pick the minimum of those.
a = cumsum(flow);
b = a(7:end) - [0; a(1:end-7)]; % total flow in each 7-day window (assumes flow is a column vector)
[m,i] = min(b);
Here m holds the smallest total over 7 consecutive days, and i is the index of the first day of that window.

Timestamp Processing Brain Teaser

I am processing 1Hz timestamps (variable 'timestamp_1hz') from a logger which doesn't log at exactly the same time every second (the spacing varies from 0.984 to 1.094 seconds, but can be 0.5 or several seconds if the logger burps). The 1Hz dataset is used to build a 10-minute-averaged dataset, and each 10-minute interval must have 600 records. Because the logger doesn't log at exactly the same time every second, the timestamp slowly drifts through the 1-second mark. Issues come up when the timestamp crosses the 0 mark, as well as the 0.5 mark.
I have tried various ways to pre-process the timestamps. Timestamps roughly 1 second apart should be considered valid. A few examples:
% simple
% this screws up around half second and full second values
rawseconds = raw_1hz_new(:,6)+(raw_1hz_new(:,7)./1000);
rawsecondstest = rawseconds;
rawsecondstest(:,1) = floor(rawseconds(:,1))+ rawseconds(1,1);
% more complicated
% this screws up if there is missing data, then the issue compounds because k+1 timestamp is dependent on k timestamp
rawseconds = raw_1hz_new(:,6)+(raw_1hz_new(:,7)./1000);
A = diff(rawseconds);
numcheck = rawseconds(1,1);
integ = floor(numcheck);
fract = numcheck-integ;
if fract>0.5
    rawseconds(1,1) = rawseconds(1,1)-0.5;
end
for k=2:length(rawseconds)
    rawsecondstest(k,1) = rawsecondstest(k-1,1)+round(A(k-1,1));
end
I would like to pre-process the timestamps and then compare them to a contiguous 1Hz timestamp vector using 'intersect', in order to find the missing, repeated, etc. data, such as this:
% pull out the time stamp (round to 1hz and convert to serial number)
timestamp_1hz=round((datenum(raw_1hz_new(:,[1:6])))*86400)/86400;
% calculate new start time and end time to find contig time
starttime=min(timestamp_1hz);
endtime=max(timestamp_1hz);
% determine the contig time
contigtime=round([floor(mean([starttime endtime])):1/86400:ceil(mean([starttime endtime]))-1/86400]'*86400)/86400;
% find indices where logger time stamp matches real time and puts
% the indices of a and b
clear Ia Ib Ic Id
[~,Ia,Ib]=intersect(timestamp_1hz,contigtime);
% find indices where there is a value in real time that is not in
% logger time
[~,Ic] = setdiff(contigtime,timestamp_1hz);
% finds the indices that are unique
[~,Id] = unique(timestamp_1hz);
You can download 10 days of the raw_1hz_new timestamps here. Any help or tips would be much appreciated!
The problem you have is that you can't simply match these stamps up against a list of expected times: you could be expecting datapoints at seconds = 1000, 1001, 1002, but if there was an earlier blip you could have entirely legitimate data at 1000.5, 1001.5, 1002.5 instead.
If all you want is a list of valid times/their location in your series, why not just something like (times in seconds):
A = diff(times); % difference between times
n = find(abs(A-1)<0.1) % change 0.1 to whatever your tolerance is
times2 = times(n+1);
times2 should then be a list of all your timestamps where the previous timestamp was approximately 1 second earlier. This works on a small set of fake data I constructed; I didn't try it on yours. (For future reference: it would be more helpful to provide a small subset of your data, e.g. just a few minutes' worth, that you know contains a blip.)
I would then take the list of valid timestamps and split it into 10-minute sections for averaging, counting how many valid timestamps were obtained in each section, as sketched below. If it's working, you should end up with no more than 600 per section, and not much less if the blips are only occasional.
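A minimal sketch of that binning step (times and data are assumed column vectors: seconds, and the matching measurements):

binIdx = floor((times - times(1))/600) + 1;  % 10-minute bin index for each sample
counts = accumarray(binIdx,1);               % valid records per 10-minute bin
avgs = accumarray(binIdx,data,[],@mean);     % 10-minute averages
full_bins = counts == 600;                   % bins with a full complement of records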

Comparing dates and filling in gap times in matlab

I have a data file which contains time data. The list is quite long, 100,000+ points. There is data every 0.1 seconds, and the time stamps look like so:
'2010-10-10 12:34:56'
'2010-10-10 12:34:56.1'
'2010-10-10 12:34:56.2'
'2010-10-10 12:34:56.3'
etc.
Not every 0.1-second interval is necessarily present. I need to check whether a 0.1-second interval is missing, and then insert the missing time into the date vector. Comparing strings seems unnecessarily complicated, so I tried comparing seconds since midnight:
date_nums=datevec(time_stamps);
secs_since_midnight=date_nums(:,4)*3600+date_nums(:,5)*60+date_nums(:,6);
comparison_secs=linspace(0,86400,864000);
res=(ismember(comparison_secs,secs_since_midnight)~=1);
However this approach doesn't work due to rounding errors: the seconds since midnight and the linspace of comparison seconds never quite match up (because of the tenth-of-a-second resolution?). The intent is to later run an FFT on the data associated with the time stamps, so I want the data to be as uniform as possible (the data for the missing intervals will be interpolated). I've considered blocking it into smaller chunks of time and checking the chunks one at a time, but I don't know if that's the best way to go about it. Thanks!
Multiply your numbers-of-seconds by 10 and round to the nearest integer before comparing against your range.
There may be more efficient ways to do this than ismember. (I don't know offhand how clever the implementation of ismember is, but if it's The Simplest Thing That Could Possibly Work then you'll be taking O(N^2) time that way.) For instance, you could use the timestamps that are actually present (as integer numbers of 0.1-second intervals) as indices into an array.
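A sketch of that indicator-array idea (reusing secs_since_midnight from the question, assumed to be a column vector):

ticks = round(secs_since_midnight*10) + 1;   % 0.1-second slot number, 1-based
present = false(864000,1);                   % one slot per 0.1 s of the day
present(ticks) = true;                       % mark the slots that actually occur
missing_slots = find(~present);              % 0.1-second intervals with no record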
Since you're concerned with missing data records and not other timing issues such as a drifting time channel, you could check for missing records by converting the time values to seconds, taking a DIFF, and finding the first differences that are greater than some tolerance. This tells you the indices where the missing records should go. It's then up to you to do something about this. Remember, if you're going to use this list of indices to fill the gaps, process the list in descending index order, since inserting records will otherwise leave the index list out of sync with the data.
>> time_stamps = now:.1/86400:now+1; % Generate test data.
>> time_stamps(randi(length(time_stamps), 10, 1)) = []; % Remove 10 random records.
>> t = datenum(time_stamps); % Convert to date numbers.
>> t = 86400 * t; % Convert to seconds.
>> index = find(diff(t) > 1.999 * 0.1)' + 1 % Find missing records.
index =
30855
147905
338883
566331
566557
586423
642062
654682
733641
806963