Can anyone tell me how I can write a 5 msec timer in MATLAB?
%% Decomposing into sets of 40-byte packets
% While Time < T1 (= 5 msec), keep filling the 40-byte packet
%while (Total_Connection_Time - Running_Time) > 0
for n = 1:length(total_number_of_bytes)   % n = counter to go through the "total_number_of_bytes" matrix
    packets = [];   % 40-byte matrix (packetization phase)
    % check whether the number of bytes in each talkspurt period is <= or > 40 bytes before starting packetization
    if (total_number_of_bytes(n) <= 40)
        k = 40 - total_number_of_bytes(n);   % how many bytes are still needed to complete a 40-byte packet
        packets = [packets, total_number_of_bytes(n) + k];
        total_number_of_bytes(n) = 40;   % new bytes matrix after packetization (bytes borrowed from the next talkspurt to reach a total of 40)
        total_number_of_bytes(n+1) = total_number_of_bytes(n+1) - k;   % bytes are taken from the next talkspurt period in order to get a 40-byte packet
        if total_number_of_bytes(n+1) < 0
            for i = 1:length(total_number_of_bytes)   % loop through the array starting at total_number_of_bytes(n+1)
                total_number_of_bytes(n+1) = total_number_of_bytes(n+1) + total_number_of_bytes(n+1+i);
                total_number_of_bytes(n+1+i) = 0;
                packets = [total_number_of_bytes];
            end
        end
    end
    if (total_number_of_bytes(n) > 40)
        m = total_number_of_bytes(n) - 40;   % because we need 40-byte packets
        packets = [packets, total_number_of_bytes(n) - m];   % append the 40-byte packet
        total_number_of_bytes(n) = 40;
        total_number_of_bytes(n+1) = total_number_of_bytes(n+1) + m;   % the remaining bytes are added to the next talkspurt period
        packets = [total_number_of_bytes];
    end
end
For better accuracy use
java.lang.Thread.sleep(5);
instead of tic/toc; see here for further info.
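A minimal sketch of how that call can pace a loop at roughly 5 ms per iteration (the real period is the sleep time plus whatever the loop body takes):
period_ms = 5;
for k = 1:1000
    % ... do this iteration's work here ...
    java.lang.Thread.sleep(period_ms);   % block for ~5 ms before the next iteration
end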
Tic and toc are getting a bad rap, so I will just post this.
I tried the following:
tic
count = 0;
while toc<0.005
a=randn(10);
count = count+1;
end
toc
Running it ten times, the maximum value of toc was 5.006 ms. The count was around 1000 each time.
This is not the same as your program, but if graphics or GUI are not involved, I think tic and toc can do the job.
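Another option, not mentioned in the answers above, is MATLAB's built-in timer object; a minimal sketch (at a 5 ms period the achievable accuracy is OS-dependent):
t = timer('ExecutionMode', 'fixedRate', ...    % fire at a fixed rate
          'Period', 0.005, ...                 % 5 ms between executions
          'TasksToExecute', 1000, ...          % run 1000 times, then stop
          'TimerFcn', @(~,~) disp(datestr(now, 'HH:MM:SS.FFF')));
start(t);
% ... when done:
% stop(t); delete(t);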
Related
I am working on a project where I read a real-time current signal on an Arduino using a split-core CT. I am able to read the exact AC current and replicate it in the Arduino Serial Plotter using the code below.
void setup() {
  Serial.begin(115200);
}

void loop() {
  Serial.println( (double)(analogRead(A5) - analogRead(A0))*0.009765625 );
}
But I have to do further calculations on it, like FFT and THD, so I am sending that data to MATLAB over serial communication. Below is my MATLAB script, which reads 1000 data samples, stores them in an array, performs the calculations and finally plots the result.
clc; close all;
if ~isempty(instrfind)
    fclose(instrfind);
    delete(instrfind);
end
s1 = serial('COM5','Baudrate',115200);
fopen(s1);
Fs = 1000;
LoS = 100;
T = 1/Fs;
t = (0:LoS-1)*T;
sig = zeros(1,LoS);   % preallocate one slot per sample
str = '';
sen = 0;
for j = 1:LoS
    str = fscanf(s1);
    sen = str2double(str);
    sig(j) = sen;
end
subplot(2,1,1);
axis([0 LoS -4 4]);
plot(t,sig);
xlabel('Counts');
ylabel('Magnitude');
title('Signal');
Y = fft(sig);
P2 = abs(Y/LoS);
P1 = P2(1:LoS/2+1);
P1(2:end-1) = 2*P1(2:end-1);
f = Fs*(0:(LoS/2))/LoS;
subplot(2,1,2);
axis([0 100 0 10]);
plot(f,P1);
title('FFT(Signal)');
xlabel('f (Hz)');
ylabel('|Power(f)|');
fclose(s1);
delete(s1);
clear s1;
The issue is that the frequency of the actual signal is 60 Hz, but my code outputs a peak at 31 Hz. I checked the same code on MATLAB-simulated sinusoids and it gives exact results, but on real data it miscalculates. I implemented the same logic in LabVIEW as well and the result remains 31 Hz. Can anyone pinpoint my mistake? I am really stuck on this.
Thanks for your time.
You should fix the Arduino sample rate to 1000 samples per second.
As Daniel commented, the Arduino is sampling as fast as it can rather than at 1 kHz (1000 samples per second). That is also consistent with the 31 Hz peak: if the true sample rate is roughly 60*1000/31 ≈ 1900 samples per second while MATLAB assumes 1000 samples per second, a 60 Hz tone shows up at about 31 Hz.
You can fix your Arduino loop to sample at 1 kHz by making sure each iteration takes 1000 microseconds.
The loop below reads the time at the beginning and at the end of each iteration; at the end of each iteration, a delay pads the duration out to 1000 us.
void loop() {
  unsigned long start_time_us, delta_time_us;

  //Returns the number of microseconds since the Arduino board began running the current program
  //https://www.arduino.cc/reference/en/language/functions/time/micros/
  start_time_us = micros();

  Serial.println( (double)(analogRead(A5) - analogRead(A0))*0.009765625 );

  //Passed time in microseconds.
  delta_time_us = micros() - start_time_us;

  if (delta_time_us < 1000)
  {
    //Delay the remaining time - complete to 1000us (keep sample rate at 1000 samples per second).
    //https://www.arduino.cc/reference/en/language/functions/time/delaymicroseconds/
    delayMicroseconds((unsigned long)1000 - delta_time_us);
  }
}
Please note: I could not verify the solution because I don't have an Arduino board (or simulator).
Remark: sending samples as text strings is inefficient, and may saturate the serial port.
I suggest you do the following:
Send binary data (using Serial.write instead of Serial.println).
Instead of converting the sample to double before sending it from the Arduino, send the sample in short format (two bytes): send the value (short)(analogRead(A5) - analogRead(A0)).
On the MATLAB side, read 1000 binary samples: sig = fread(s1, 1000, 'int16');
Perform the conversion to double and scale the samples in MATLAB: sig = double(sig) * 0.009765625.
Arduino code:
//Serial.println( (double)(analogRead(A5) - analogRead(A0))*0.009765625 );
short analog_read = (short)(analogRead(A5) - analogRead(A0));
Serial.write((uint8_t*)&analog_read, 2); //Send two bytes in int16 binary format
MATLAB code:
% for j = 1:LoS
% str=fscanf(s1);
% sen=str2double(str);
% sig(j)=sen;
% end
sig = fread(s1, 1000, 'int16'); % Read 1000 analog samples (each sample is int16).
sig = double(sig) * 0.009765625; % Convert 1000 samples to physical values in double format
The above code is more efficient for both Arduino and MATLAB.
Remark: Keep in mind that I didn't test the code - it's just for demonstrating the concept.
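One practical note from my side (not part of the answer above): the MATLAB serial object's input buffer defaults to 512 bytes, while 1000 int16 samples are 2000 bytes, so the buffer likely needs to be enlarged before fopen for the single fread call to work. A sketch:
s1 = serial('COM5','Baudrate',115200);
s1.InputBufferSize = 4096;        % >= 2000 bytes are needed for 1000 int16 samples
fopen(s1);
sig = fread(s1, 1000, 'int16');   % read 1000 binary samples
sig = double(sig) * 0.009765625;  % scale to physical units
fclose(s1); delete(s1); clear s1;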
I have 100 lamps. They are blinking. I observe them for some time. For each lamp I calculate the mean, std and autocorrelation of the intervals between blinks.
Now I need to resample the observed data and keep the permutations where all parameters (mean, std, autocorrelation) are inside some range. The code I have works fine, but it takes too long (a week) for each round of the experiment. I run it on a computing server with 12 cores and 2 Tesla K40m GPUs (details at the end).
My code:
close all
clear all
clc
% open parpool skip error if it was opened
try parpool(24); end
% Sample input. It is faked, just for demo.
% Number of "lamps" and number of "blinks" are similar to real.
NLamps = 10^2;
NBlinks = 2*10^2;
Events = cumsum([randg(9,NLamps,NBlinks)],2); % each row - different "lamp"
DurationOfExperiment=Events(:,end).*1.01;
%% MAIN
% Define parameters
nLags=2; % I need to keep autocorrelation with lags 1-2
alpha=[0.01,0.1]; % range of allowed relative deviation from observed
% parameters should be > 0 to avoid generating original
% sequence
nPermutations=10^2; % In original code 10^5
% Processing of experimental data
DurationOfExperiment=num2cell(DurationOfExperiment);
Events=num2cell(Events,2);
Intervals=cellfun(@(x) diff(x),Events,'UniformOutput',false);
observedParams=cellfun(@(x) fGetParameters(x,nLags),Intervals,'UniformOutput',false);
observedParams=cell2mat(observedParams);
% Constrained shuffling. EXPENSIVE PART!!!
while true
    parfor iPermutation=1:nPermutations
        % Shuffle intervals
        shuffledIntervals=cellfun(@(x,y) fPermute(x,y),Intervals,DurationOfExperiment,'UniformOutput',false);
        % get parameters of shuffled intervals
        shuffledParameters=cellfun(@(x) fGetParameters(x,nLags),shuffledIntervals,'UniformOutput',false);
        shuffledParameters=cell2mat(shuffledParameters);
        % get relative deviation
        delta=abs((shuffledParameters-observedParams)./observedParams);
        % find shuffled Lamps, which are inside alpha range
        MaximumDeviation=max(delta,[],2);
        MinimumDeviation=min(delta,[],2);
        LampID=find(and(MaximumDeviation<alpha(2),MinimumDeviation>alpha(1)));
        % if shuffling of ANY lamp was successful, save these Intervals
        if ~isempty(LampID)
            shuffledIntervals=shuffledIntervals(LampID);
            shuffledParameters=shuffledParameters(LampID,:);
            parsave(LampID,shuffledIntervals,shuffledParameters);
            'DONE'
        end
    end
end
%% FUNCTIONS
function [ params ] = fGetParameters( intervals,nLags )
    % Calculate [mean, std, autocorrelations with lags from 1 to nLags]
    R=nan(1,nLags);
    for lag=1:nLags
        R(lag) = corr(intervals(1:end-lag)',intervals((1+lag):end)','type','Spearman');
    end
    params = [mean(intervals),std(intervals),R];
end
%--------------------------------------------------------------------------
function [ Intervals ] = fPermute( Intervals,Duration )
    % Create a long shuffled time-series
    Time=cumsum([0,datasample(Intervals,numel(Intervals)*3)]);
    % Keep the same duration
    Time(Time>Duration)=[];
    % Calculate Intervals
    Intervals=diff(Time);
end
%--------------------------------------------------------------------------
function parsave( LampID,Intervals,params)
    save([num2str(randi(10^9)),'.mat'],'LampID','Intervals','params')
end
Server specs:
>>gpuDevice()
CUDADevice with properties:
Name: 'Tesla K40m'
Index: 1
ComputeCapability: '3.5'
SupportsDouble: 1
DriverVersion: 8
ToolkitVersion: 8
MaxThreadsPerBlock: 1024
MaxShmemPerBlock: 49152
MaxThreadBlockSize: [1024 1024 64]
MaxGridSize: [2.1475e+09 65535 65535]
SIMDWidth: 32
TotalMemory: 1.1979e+10
AvailableMemory: 1.1846e+10
MultiprocessorCount: 15
ClockRateKHz: 745000
ComputeMode: 'Default'
GPUOverlapsTransfers: 1
KernelExecutionTimeout: 0
CanMapHostMemory: 1
DeviceSupported: 1
DeviceSelected: 1
>> feature('numcores')
MATLAB detected: 12 physical cores.
MATLAB detected: 24 logical cores.
MATLAB was assigned: 24 logical cores by the OS.
MATLAB is using: 12 logical cores.
MATLAB is not using all logical cores because hyper-threading is enabled.
>> system('for /f "tokens=2 delims==" %A in (''wmic cpu get name /value'') do @(echo %A)')
Intel(R) Xeon(R) CPU E5-2630 v2 @ 2.60GHz
Intel(R) Xeon(R) CPU E5-2630 v2 @ 2.60GHz
>> memory
Maximum possible array: 496890 MB (5.210e+11 bytes) *
Memory available for all arrays: 496890 MB (5.210e+11 bytes) *
Memory used by MATLAB: 18534 MB (1.943e+10 bytes)
Physical Memory (RAM): 262109 MB (2.748e+11 bytes)
* Limited by System Memory (physical + swap file) available.
Question:
Is it possible to speed up my calculation? I am thinking about CPU+GPU computing, but I could not figure out how to do it (I have no experience with gpuArray). Moreover, I am not sure it is a good idea; sometimes algorithmic optimisation gives a bigger payoff than parallel computing.
P.S.
The saving step is not the bottleneck: it happens once every 10-30 minutes in the best case.
GPU-based processing is only available on some functions and with the right cards (if I remember correctly).
For the GPU part of your question: MATLAB has a list of functions that you can run on the GPU, and the most expensive part of your code, the function corr, unfortunately isn't on that list.
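For reference, that list can be queried directly in MATLAB (assuming the Parallel Computing Toolbox is installed); a quick check along these lines shows fft on it but not corr:
gpuFuncs = methods('gpuArray');   % built-in functions that accept gpuArray inputs
ismember('fft', gpuFuncs)         % expected: 1 (fft is GPU-enabled)
ismember('corr', gpuFuncs)        % expected: 0 (corr is not, at least in this release)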
If the profiler isn't highlighting bottlenecks - something weird is going on... So I ran some tests on your code above:
nPermutations = 10^0 iteration takes ~0.13 seconds
nPermutations = 10^1 iteration takes ~1.3 seconds
nPermutations = 10^3 iteration takes ~130 seconds
nPermutations = 10^4 probably takes ~1300 seconds
nPermutations = 10^5 probably takes ~13000 seconds
Which is a lot less than a week...
Did I mention that I put a break at the end of your while statement? I couldn't see anywhere in your code where you ever break out of the while loop - I hope for your sake that this isn't the reason your function runs forever....
while true
    parfor iPermutation=1:nPermutations
        % Shuffle intervals
        shuffledIntervals=cellfun(@(x,y) fPermute(x,y),Intervals,DurationOfExperiment,'UniformOutput',false);
        % get parameters of shuffled intervals
        shuffledParameters=cellfun(@(x) fGetParameters(x,nLags),shuffledIntervals,'UniformOutput',false);
        shuffledParameters=cell2mat(shuffledParameters);
        % get relative deviation
        delta=abs((shuffledParameters-observedParams)./observedParams);
        % find shuffled Lamps, which are inside alpha range
        MaximumDeviation=max(delta,[],2);
        MinimumDeviation=min(delta,[],2);
        LampID=find(and(MaximumDeviation<alpha(2),MinimumDeviation>alpha(1)));
        % if shuffling of ANY lamp was successful, save these Intervals
        if ~isempty(LampID)
            shuffledIntervals=shuffledIntervals(LampID);
            shuffledParameters=shuffledParameters(LampID,:);
            parsave(LampID,shuffledIntervals,shuffledParameters);
            'DONE'
        end
    end
    break % You need to break out of the loop at some point,
          % otherwise it would run forever....
end
I found this speech recognition code that I downloaded from a blog. It works fine: it asks you to record sounds to create a dataset, and then you call a function to train the system using neural networks.
I want to use this code to train using my dataset of 20 words that I want to recognise.
Problem:
I have a dataset of 800 files for twenty words i.e. 40 recordings from different people for each word. I used Windows sound recorder to collect the files.
The problem is that in the code the size of the input file is set to ALWAYS be 8000 samples; my dataset, on the other hand, is not constant - some files are 2 seconds long, some are 3, which means there will be a different number of samples in each file.
If the number of samples per input signal varies, it will probably generate errors.
I want to use my files to train the system.
How do I do that?
Code:
clc; clear all;
load('voicetrainfinal.mat');
Fs = 8000;
for l = 1:20
    clear y1 y2 y3;
    display('record voice');
    pause();
    x = wavrecord(Fs,Fs); % wavrecord(n,Fs) records n samples at a sampling rate of Fs
    maxval = max(x);
    if maxval < 0.04
        display('Threshold value is too large!');
    end
    t = 0.04;
    j = 1;
    for i = 1:8000
        if (abs(x(i)) > t)
            y1(j) = x(i);
            j = j+1;
        end
    end
    y2 = y1/(max(abs(y1)));
    y3 = [y2,zeros(1,3120-length(y2))];
    y = filter([1 -0.9],1,y3'); % high-pass filter to boost the high-frequency components
    %% frame blocking
    blocklen = 240; % 30 ms block
    overlap = 80;
    block(1,:) = y(1:240);
    for i = 1:18
        block(i+1,:) = y(i*160:(i*160+blocklen-1));
    end
    w = hamming(blocklen);
    for i = 1:19
        a = xcorr((block(i,:).*w'),12); % finding autocorrelation from lag -12 to 12
        for j = 1:12
            auto(j,:) = fliplr(a(j+1:j+12)); % forming autocorrelation matrix from lag 0 to 11
        end
        z = fliplr(a(1:12)); % forming a column matrix of autocorrelations for lags 1 to 12
        alpha = pinv(auto)*z';
        lpc(:,i) = alpha;
    end
    wavplay(x,Fs);
    X1 = reshape(lpc,1,228);
    a1 = sigmoid(Theta1*[1;X1']);
    h = sigmoid(Theta2*[1;a1]);
    m = max(h);
    p1 = find(h==m);
    if (p1==10)
        P = 0
    else
        P = p1
    end
end
In your code you have:
Fs=8000;
wavrecord(n,Fs) % records n samples at a sampling rate Fs
for i=1:8000
    if(abs(x(i))>t)
        y1(j)=x(i);
        j=j+1;
    end
end
It seems that instead of recording you are going to import your sound file (here for a .wav file):
[y, Fs] = wavread(filename);
Instead of hardcoding the 8000 value you can read the length of your file:
n = length(y);
and then just use that n variable in the for loop:
for i=1:n
    if(abs(x(i))>t)
        y1(j)=x(i);
        j=j+1;
    end
end
The rest of the code seems to be independent of that 8000 value.
If you are worried about having non-constant file lengths: compute n_max, the maximum length of all the audio recordings you have, and for recordings shorter than n_max samples, pad them with zeros so as to make them all n_max long.
n_max = 0;
for file = ["file1" "file2" ... "filen"]
    [y, Fs] = wavread(file);
    n_max = max(n_max, length(y));
end
Then each time you process a sound vector you can pad it with 0 (harmless for you, because 0 means no sound) like so:
y = [y, zeros(1, n_max - length(y))];
n = noOfFiles;
for k = 1:n
    M(k,1:length(filedata{k})) = filedata{k};
end
:P
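Putting the n_max computation and the zero-padding together, a sketch of building a fixed-width training matrix (the file names are placeholders; wavread matches the code above, newer releases would use audioread):
files = {'word1_take1.wav','word1_take2.wav'};   % placeholder list of your 800 recordings
% Pass 1: find the longest recording
n_max = 0;
for k = 1:numel(files)
    [y, Fs] = wavread(files{k});
    n_max = max(n_max, length(y));
end
% Pass 2: zero-pad every recording to n_max and stack them row-wise
M = zeros(numel(files), n_max);
for k = 1:numel(files)
    [y, Fs] = wavread(files{k});
    M(k, 1:length(y)) = y(:).';   % trailing zeros are just silence
end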
I have a file which contains the energy consumption of a house,
one value (watts) every 10 minutes:
10:00 123
10:10 125
10:20 0
...
This means each day has 144 values (rows).
I want to forecast the energy of the next day with an ARX and an ARMAX program. I wrote ARX code in MATLAB, but I can't forecast the next day: my code takes the last 5 consumption values and forecasts the 6th one. How can I forecast the next 144 values (= the day after)?
% ARX Process----------------------------
L=length(u_in)
u_in_ID=u_in;% Input data used for Identification
u_in_vfy=u_in;% Input data used for verification
y_out_ID=y_out;% Output data used for Identification
y_out_vfy=y_out;%Output data used for verification
m=5; %Parameter to be used to generate order of delay for Input, Output and Error
n=length(y_out_ID)-m;
I=eye(n,1)+1;
I(1)=I(1)-1;
A=I; % Initialize Matrix A
Y=y_out_ID((m+1):end); % Defining Y vector
length(Y)
na=1;
% Put output delay 1 to m-na in A matrix
for k=1:1:m-na
    A=[A y_out_ID((m-k+1):(end-k))];
end
% Put "Current Input -- mth delayed Input" to Matrix A
for p=1:1:m
    k=p-1;
    A=[A u_in_ID((m-k+1):(end-k))];
end
A(:,1)=[]; % Delete 1st column of Matrix A, which was used to Initialize it
parsol=inv(A'*A)*A'*Y;
BB=A*parsol;
% Generate Identified Output vector based on previous
% outputs, current and previous Inputs and Parameters solved by Least
% square method
n=length(y_out_vfy)-m;
I=eye(n,1)+1;
I(1)=I(1)-1;
A=I;
for k=1:1:m-na
    A=[A y_out_vfy((m-k+1):(end-k))];
end
for p=1:1:m
    k=p-1;
    A=[A u_in_vfy((m-k+1):(end-k))];
end
A(:,1)=[]; % Delete 1st column of Matrix A, which was used to Initialize it
A;
y_out_sysID=A*parsol;
Can anyone help me?
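For illustration only (this is not from the original post): one common way to get the next 144 values is to apply the identified one-step predictor recursively, feeding each prediction back in as the newest output. The sketch below reuses m, na and parsol from the code above and assumes the next day's inputs are available in a vector u_future (144 values):
nAhead = 144;                          % one day = 144 ten-minute slots
y_hist = y_out(end-(m-na)+1:end);      % last m-na known outputs
y_hist = y_hist(:);
u_hist = u_in(end-(m-1)+1:end);        % last m-1 known inputs
u_hist = u_hist(:);
y_fcst = zeros(nAhead,1);
for t = 1:nAhead
    u_now = u_future(t);               % current (future) input, assumed known
    % regressor row in the same column order used to build A above:
    % [y(t-1) ... y(t-(m-na))  u(t) u(t-1) ... u(t-(m-1))]
    phi = [flipud(y_hist).', u_now, flipud(u_hist).'];
    y_fcst(t) = phi*parsol;
    y_hist = [y_hist(2:end); y_fcst(t)];   % the prediction becomes the newest output
    u_hist = [u_hist(2:end); u_now];
end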
I'm having trouble getting consistent results with the code I am using. I want to run my Arduino for a specific amount of time (say 20 seconds) and collect data from the analog pin with a specific sampling rate (say four samples a second). The code is as follows.
a_pin = 0;
tic;
i = 0;
while toc < 20
    i = i + 1;
    time(i) = toc;
    v(i) = a.analogRead(a_pin);
    pause(.25);
end
Is there a way to set the loop to run for a specific amount of time and, within the loop, sample at a specific rate?
You can try this:
a_pin = 0;
fs = 4;  % sampling frequency (samples per second)
mt = 20; % time for measurements
ind = 1;
nind = 1;
last_beep = 0;
tic;
while toc < mt
    time(ind) = toc;
    v(ind) = a.analogRead(a_pin);
    % wait for the appropriate time for the next measurement
    while (nind == ind)
        nind = floor(toc*fs) + 1;
    end
    ind = nind;
    % beep every second
    if (ceil(toc) > last_beep)
        beep(); % don't know if this one exists, check docs
        last_beep = ceil(toc);
    end
end
A single Arduino analog read command (through MATLAB) takes around 0.04 s at best; in practice I'd allow at least 0.05 s. Two read operations take on the order of 2*0.04 s, in practice more like 0.1 s. I think this is mainly limited by the USB communication speed.
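If you want to check that number on your own setup, a quick measurement sketch (assuming an existing Arduino connection object a, as in the question):
tic;
for k = 1:100
    v = a.analogRead(0);   % one analog read over the USB/serial link
end
t_per_read = toc/100       % average seconds per analogRead call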
I am also new to Arduino, but having implemented real-time EEG analysis with it, in practice I was able to sample 2 analog channels with a sampling frequency between 57 and 108 Hz. It was very variable (calculated through tic/toc), but it is still adequate for real-time processing in my case.
My code uses a while loop, a series of memory updates, digital pin manipulations and a trace plot (drawnow), and it seems to run smoothly enough.
My answer, in short: 0.0283 s for sampling 2 analog inputs in my case.
Cheers