The transmitted signal is in the file ch1norm.csv and the received signal is in ch2norm.csv.
Each contains 4000 samples with a sampling interval of 4e-8 s (40 ns).
The problem is to find the delay between the transmitted and received signals.
I have written the following code, which estimates the delay using both cross-correlation and the finddelay function.
D1=load('F:\PhD\SPS-3\DSO_Data\BPW34_R68K_T68ohm\One-meter\ALL0005\ch1norm.CSV');
D2=load('F:\PhD\SPS-3\DSO_Data\BPW34_R68K_T68ohm\One-meter\ALL0005\ch2norm.CSV');
[C21,lag21] = xcorr(D2(:,2),D1(:,2));
[A,I]=max(abs(C21));
t21 = lag21(I); %Delay using xcorr
delay1=finddelay(D1(:,2),D2(:,2)); %Delay using finddelay function.
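As a quick check of the sign convention, here is a sketch with synthetic pulses (not your data): xcorr(second, first) peaks at a positive lag when the second signal lags the first.

```matlab
% Sanity check of the lag sign: y is x delayed by 3 samples.
x = [zeros(1,5), 1, zeros(1,10)];   % reference pulse at sample 6
y = [zeros(1,8), 1, zeros(1,7)];    % same pulse, delayed by 3 samples
[c, lags] = xcorr(y, x);
[~, i] = max(abs(c));
lags(i)          % returns 3: y arrives 3 samples after x
finddelay(x, y)  % also returns 3, agreeing with xcorr
```

Swapping the argument order flips the sign, so a negative value means the first input lags the second.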
I am getting a delay of 59 samples.
My questions are:
What is the difference between getting a positive delay value and a negative delay value?
How can we evaluate the BER from these sample values?
I am displaying the output rate of production per hour. After every third output there is a waiting time that differs from the waiting time between the first and second outputs, and I don't understand why (Picture 1). Additionally, the output on the Y axis differs from the output/input value of the Source and Sink (Pictures 1 and 2). I am a bit confused by these issues. Further illustration below:
I have created a variable (hourOutput), a data set (dataProductionRate) and an event (productionRateCount) to calculate the output rate. The time plot in Picture 1 reads the data set value as output (dataProductionRate).
Code used in the event: dataProductionRate.add( hourOutput );
Code used in the Sink:
hourOutput++;
Hyber_Facadesink_V++;
agent.TimeOut = time();
variable++;
if (self.count() >= Gen_Item) {
    getEngine().finish(); // end the model execution
}
I am using both MATLAB and LabVIEW for the lab course in control systems engineering, and I want to implement a block diagram (system) in MATLAB and also in LabVIEW.
The front panel shows "Time Response Parametric Data", which contains six parameter fields including settling time, but I also need SettlingMin and SettlingMax, which MATLAB provides via the stepinfo command. I couldn't find a way to get these two parameters in LabVIEW.
Here is the MATLAB code:
clc
clear all
close all
sys1 = tf(10, [1 1])                      % first-order plant
sys2 = tf(1, [2 0.5])
sys_series = series(sys1, sys2)           % cascade of sys1 and sys2
sys_feedback = feedback(sys_series, 0.1)  % inner loop with gain 0.1
sys = series(540, sys_feedback)           % forward-path gain of 540
sys_cl = feedback(sys, 1, -1)             % unity negative feedback
step(sys_cl)                              % plot the step response
stepinfo(sys_cl)                          % time-response parameters
Download link LabVIEW VI
Comparing the help for the LabVIEW VI you are using with the help for the MATLAB function, it seems clear that MATLAB's SettlingMax is the same as LabVIEW's Peak Value in the Time Response Parametric Data cluster, while SettlingMin is the minimum value of the time-response signal after Peak Time.
To get the latter value it looks as if you need to:
- use CD Get Time Response Data to get the time points and response signal as DBL arrays (I assume Input and Output are both 0, since you have only one signal)
- use Search 1D Array with the time-point array and the Peak Time value to get the array index of the peak time
- use Array Subset to select the part of the response array from that index onwards (leave length unwired)
- use Array Max & Min to get the minimum value of this array
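For comparison, the same recipe expressed in MATLAB (a sketch; sys_cl is the closed-loop system from the question, and this follows the "minimum after Peak Time" interpretation above):

```matlab
[y, t] = step(sys_cl);            % sampled step-response signal
[settlingMax, iPeak] = max(y);    % Peak Value / SettlingMax and its index
settlingMin = min(y(iPeak:end));  % minimum of the response after Peak Time
```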
AIM:
I need to synchronize the animation of two plots in Matlab.
PROBLEM:
The data of the two plots has been acquired with a variable sample rate.
SOLUTION:
I converted the time-stamps of the two datasets in duration objects (relative to the beginning of the streaming).
Now I want to plot the two datasets in a for loop.
For each loop, I want to show the datasets samples whose durations are within the elapsed time.
QUESTION:
How do I determine if the duration of a specific sample already happened or not?
CODE EXAMPLE:
Here I simulate and sort 10 random durations (d1) and 1 random elapsed time (et). I want to find which durations are past the elapsed time.
% simulate elapsed time
et = calendarDuration(round(rand(1,6)*10));
% simulate data for plot 1
data_for_plot1 = rand(10,1);
% simulate durations for the samples in plot1
d1 = calendarDuration(sortrows(round(rand(10,6)*10)));
% find index of durations which are before the elapsed time
is_past = (d1-et)>0;
% plot the data
plot(data_for_plot1(is_past))
ERROR MESSAGE:
is_past = (d1-et)>0;
Undefined operator '>' for input arguments of type 'calendarDuration'.
ALTERNATIVE SOLUTIONS:
It's my first time with duration and date objects, and I am hating every bit of it. If you have alternative solutions I would love to hear them.
Mind that the timestamps of data1 come as strings ('yyyy-MM-dd HH:mm:ss.SSS') and the timestamps of data2 come as doubles (e.g. 42.525, meaning 42 s and 525 ms).
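As an alternative (a sketch with made-up timestamps): since only elapsed times matter here, plain duration arrays avoid calendarDuration entirely and support relational operators directly. datetime parses the string format of data1, and seconds converts the doubles of data2.

```matlab
% data1: timestamps as strings -> durations since the first sample
ts1 = datetime({'2021-03-01 10:00:42.525'; '2021-03-01 10:00:43.100'}, ...
               'InputFormat', 'yyyy-MM-dd HH:mm:ss.SSS');
d1 = ts1 - ts1(1);                        % duration array
% data2: timestamps as seconds (double) -> durations since the start
d2 = seconds([42.525; 43.100] - 42.525);
% durations compare directly against an elapsed time
et = seconds(0.3);                        % example elapsed time
is_past = d1 <= et;                       % samples that already happened
```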
Thank you for your help
You can use the split function for this purpose.
Use is_past = split((d1-et),'time') > 0; instead of is_past = (d1-et)>0;
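A minimal sketch of that fix with made-up values: split extracts the time portion of a calendarDuration as a duration, which does support the > operator.

```matlab
et = calendarDuration(0,0,0, 1,30,0);                % elapsed: 1 h 30 min
d1 = calendarDuration(0,0,0, [1;2], [0;15], [0;0]);  % two sample times
is_past = split(d1 - et, 'time') > 0;                % logical index, no error
```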
I have a transfer function that I am trying to use to filter acceleration data.
So far I have been able to use lsim with a sine wave over about 10 seconds and I'm getting the result I expect. However, I cannot work out how to get data to the function in real time.
To clarify, every 0.1 seconds I am receiving an acceleration value from an external program. I need to filter the values to remove high frequency variations in the data. I need this to occur for each data point I receive as I then use the current filtered acceleration value into additional processing steps.
How do I use the transfer function continuously, updating the output value each time new data is received?
This is an example of how to do this with filter:
filter_state = []; % start with an empty filter history
while not_stopped()
    x = get_one_input_sample();
    [y, filter_state] = filter(B, A, x, filter_state);
    process_one_output_sample(y);
end
Note that you need to use the extended form of filter, in which you pass the history of the filter from one iteration to the next using the variable filter_state. B and A are the coefficients of your discrete filter. If your filter is continuous, you need to transform it to a discrete filter first, using c2d or similar. Note that if your filter is very complex, you might need to split the filtering into several stages (e.g. second-order sections) to avoid numerical problems.
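A sketch of the whole chain under assumed values (a first-order continuous low-pass as a stand-in for your transfer function, your 0.1 s sample interval, and random data in place of the incoming samples):

```matlab
sysc = tf(1, [1 1]);                 % example continuous low-pass filter
sysd = c2d(sysc, 0.1, 'tustin');     % discretize at Ts = 0.1 s
[B, A] = tfdata(sysd, 'v');          % coefficient vectors for filter()
filter_state = [];                   % empty filter history to start
for x = randn(1, 50)                 % stand-in for samples arriving live
    [y, filter_state] = filter(B, A, x, filter_state);
    % y is the current filtered value, ready for further processing
end
```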
Solved!
% Function to Generate ECG of heart beat signal for specified duration
%---------------------------------------
function [Heartbeat,t] = ECG_Gen (HR,pulse_width,Amp,duration)
Fs = 48000;                  % 48 kHz sample rate
delay = (60/HR);             % seconds between beats
t = 0 : 1/Fs : duration;     % time vector for the requested duration (s)
d = 0 : delay : duration;    % pulse centre times
Heartbeat = Amp*pulstran(t,d,'tripuls',pulse_width);
I'm having problems outputting my generated heartbeat signal: when I play the signal using sound in MATLAB and measure it on an external heart-rate monitor, I get a different reading from the simulated value. It seems correct only from about 60 bpm to maybe 100 bpm, but I need to include heart rates up to 200 bpm. In other words, I get very unstable output at high bpm.
Change
delay = ((60/HR)/2)-(0.5*pulse_width);
into
delay = 30/HR;
tripuls does not change anything in the time input t, so the pulse width should not be subtracted from the time vector.
You can see this is correct by setting
pulse_width = 60e-4;
% (try)
pulse_width = 60e-10;
% (try again)
You should see that your results slowly go more and more towards the correct HR (provided your external equipment is capable of handling such short pulses).
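One way to verify the generated rate without external equipment is to count the pulses the signal actually contains (a sketch; findpeaks needs the Signal Processing Toolbox, and the values below are examples only):

```matlab
HR = 180; Fs = 48000; dur = 10; pulse_width = 60e-4;
delay = 30/HR;                               % the corrected expression above
t = 0 : 1/Fs : dur;
d = 0 : delay : dur;
x = pulstran(t, d, 'tripuls', pulse_width);  % generated pulse train
[~, locs] = findpeaks(x, 'MinPeakHeight', 0.5);
pulses_per_minute = numel(locs) / dur * 60   % compare against the HR setting
```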