MATLAB signal up-conversion and transmission over the air

I have created an OFDM signal (WLAN) in MATLAB and saved the complex-valued baseband signal to a file. I then transfer the file to a USRP B210 (software-defined radio) and transmit it over the air. Note that in MATLAB I do not need to up-convert the signal to an IF (intermediate frequency), as that is done by the radio device itself.
The receiver down-converts the signal and saves it to a file on local disk. I then process the received file in MATLAB. Everything works perfectly, and the data is decoded correctly.
Now I want to apply a Rayleigh channel (one with delay and Doppler effects over a multipath channel); MATLAB has a built-in function for such a fading channel. With it, my signal transfer still works, with minor degradation in performance (BER), but it degrades further as I increase the delay and Doppler spread.
My question is: when I use the channel model, I am not doing any up-conversion of the baseband signal before passing it to the channel model function. Was that necessary? Applying the channel model at baseband sounds incorrect to me. If so, what other options do I have?

I am not super familiar with this toolbox from MATLAB, but a quick look at the help page for comm.RayleighChannel doesn't show any inputs that depend on the carrier frequency. This implies that the channel model is carrier independent. As such, it can be applied to a complex baseband signal just as readily as to an up-converted passband signal.
rayChan =

  comm.RayleighChannel with properties:

             SampleRate: 100000
             PathDelays: 0
       AveragePathGains: 0
     NormalizePathGains: true
    MaximumDopplerShift: 130
        DopplerSpectrum: [1x1 struct]
       ChannelFiltering: true
    PathGainsOutputPort: false
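If it helps, here is a minimal sketch of applying that channel object directly to the complex baseband samples (this assumes the Communications Toolbox; the sample rate, path delays, gains, and the stand-in waveform are placeholders, not values from your setup):

% Minimal sketch, assuming the Communications Toolbox.
% The channel object operates directly on complex baseband samples.
rayChan = comm.RayleighChannel( ...
    'SampleRate',          20e6, ...               % baseband sample rate (placeholder)
    'PathDelays',          [0 50e-9 120e-9], ...   % example multipath delays in seconds
    'AveragePathGains',    [0 -3 -6], ...          % example average path gains in dB
    'MaximumDopplerShift', 130);                   % Hz

txBaseband = (randn(1e4,1) + 1j*randn(1e4,1))/sqrt(2);  % stand-in for your OFDM waveform
fadedBaseband = rayChan(txBaseband);               % use step(rayChan, txBaseband) on older releases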

Related

Variable Data Length in Simulink

I'm working on a project to help develop a hardware-in-the-loop application for audio processing.
In this project, I receive some configuration from a PC through UDP, and the UDP data package contains something like an "FFT length"; I therefore need to implement a Simulink model that sets the FFT length (or "data length") dynamically.
Here is what I already have (it is a TI C6455 DSK board):
a UDP server, already implemented and packed as a subsystem with one output port (signal), which is the received FFT length
the Simulink ADC and DAC blocks for the board, with a fixed number of samples per frame (e.g. 256 samples per frame and a sample rate of 48 kHz)
an FFT block with the FFT length set to "Inherit FFT length from input"
Now I'm considering using a Buffer block to implement this, but I have two problems:
the Buffer block doesn't have a port that can dynamically change the output length
the Buffer block causes an unstable spectrum output
Can anyone help me solve both problems?
Thanks a lot

DAQ MATLAB toolbox: how to count trigger events without an edge-counter channel and how to output a different value at each successive trigger

I need your help with the session-based interface of the MATLAB DAQ toolbox. I have not been able to find much help in the MathWorks tutorials or examples. I am currently using a USB-6003 DAQ from NI.
Basically, in my system I have 2 analog output channels (ch1 and ch2) and 1 analog input channel (ch3). What I am trying to do is drive the output voltage on ch1 from 0 V to 10 V in steps of 1 V with ch2 held constant, and then repeat the ch1 loop for a different voltage on ch2. As for the analog input ch3, I trigger it some time after triggering ch1. My triggers are generated externally by a function generator.
What I have been struggling with is:
1) How to output a different value on ch1 at each successive trigger event.
2) How, after 11 triggers, I can change ch2's output.
3) How to save the input in a different location between trigger events, so it does not get overwritten by the next event.
My main constraints are:
1) I cannot use an edge-counter channel to count the triggers, because I only have two PFI channels and I need both: one to trigger ch1 and the other to trigger ch3 (I cannot use only one).
2) I cannot use wait or any other software timing function, because I need a high-speed acquisition system (it is for a laser microscope).
3) I need to have at least 2 sessions running in parallel, because my DAQ does not allow simultaneous tasks in the same session.
I have attached a channel's time diagram of what I am trying to do.
[Channels diagram]
Caution
"I need a high speed acquisition system"
USB might not be the right option. Using USB as the control/data transport mechanism is slow compared to other computer I/O, like PCIe or EtherCAT. If, after you get this working, you determine that you need lower latency and jitter, my recommendation is to try CompactRIO and LabVIEW Real-Time.
Compounding the performance issue is the on-demand nature of the USB-6003. While both analog input and analog output are controlled by electrical signals (the start trigger and sample clock) and have their data transferred automatically by the driver, the digital input and counter are only software-timed, which means that reading data isn't automatic and must be prompted by you, the user, with a read command.
Since the only way you can get digital data from a USB-6003 is on-demand, your only option is to wait for it; there is no way to be notified that a new edge has arrived. Other devices (like the PCIe-63xx X Series or cDAQ-940x devices) support digital input change detection, which causes a software event to be sent to the program. If you had one of these devices, then you wouldn't have to wait.
Suggestion
However, if you change your triggering and data strategies a little, I still think you can achieve the kind of I/O you want. You'll then be able to evaluate its speed and reliability to decide if you need to upgrade DAQ hardware.
New triggering and data strategy
The core idea is: instead of keeping the channels on their own "time base", unify them to a single time base and use that to coordinate the voltage updates. By doubling the frequency of your external trigger, all three channels can share the same timing. In other words, both the analog input task and the analog output task use the same external signal as their sample clock.
Double the frequency of the FGEN's trigger signal.
Repeat an analog output sample if the level doesn't need to change.
Throw an analog input sample away if it coincides with an output level change.
The analog output samples would be:
ch1 ch2
0.0 0.0
0.0 0.0
1.0 0.0
1.0 0.0
2.0 0.0
2.0 0.0
0.0 1.0
0.0 1.0
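For illustration, the duplicated sample table above could be built in MATLAB along these lines (a sketch only; the ch2 levels are assumed, since the question doesn't state them):

% Sketch: build the duplicated analog output samples (ch1 sweep, ch2 constant per sweep).
ch2Levels  = 0:1:10;                        % assumed set of ch2 voltages, one per sweep
outputData = [];
for v2 = ch2Levels
    ch1 = repelem((0:1:10)', 2);            % each ch1 level repeated twice (doubled clock)
    ch2 = repmat(v2, numel(ch1), 1);        % ch2 held constant during the sweep
    outputData = [outputData; ch1 ch2];     %#ok<AGROW>
end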
New program strategy
Now that both the analog input and analog output are using the FGEN as their sample clock, the MATLAB routine only needs to prepare the operation and then monitor/feed it. The hardware will be able to generate and acquire without any intervention from the PC, but the PC will need to periodically read analog input data and write more analog output data to keep the driver satisfied.
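In MATLAB's session-based interface, that could look roughly like the sketch below (untested; the device name, channel numbers, PFI terminal, rate, and callbacks are assumptions, and if your device refuses analog input and output in one session, the same external-clock idea applies to two parallel sessions):

% Rough sketch of the session-based setup, assuming NI device 'Dev1' and the FGEN on PFI0.
s = daq.createSession('ni');
addAnalogOutputChannel(s, 'Dev1', 0:1, 'Voltage');           % ch1 and ch2
addAnalogInputChannel(s, 'Dev1', 2, 'Voltage');              % ch3
addClockConnection(s, 'External', 'Dev1/PFI0', 'ScanClock'); % the FGEN drives the scan clock
s.Rate = 2000;                                               % the doubled FGEN rate (placeholder)

queueOutputData(s, outputData);                              % duplicated samples from above
addlistener(s, 'DataAvailable', @(~, evt) disp(evt.Data));   % replace with your own saving logic
addlistener(s, 'DataRequired',  @(src, ~) queueOutputData(src, outputData));
s.startBackground();                                         % hardware runs; the PC only feeds/reads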
I don't know how much of the DAQmx API MATLAB exposes, but you can ask the driver how many samples are left in the device's buffer:
for analog input it is DAQmxGetReadAvailSampPerChan,
for analog output it is DAQmxGetWriteSpaceAvail.
Reference
NI USB-6003 Specifications
http://digital.ni.com/manuals.nsf/websearch/666A752FCC177B0186257CD8006C24C8

Calibration of a soundcard in MATLAB

I have designed a GUI in MATLAB to calibrate my sound card, and I am able to record my input signal. I would now like to calibrate that input.
How do I do that?
My GUI should be able to adapt to different sound cards and report dBV values, hence the calibration is required. Any help would be appreciated.
A: This is a metrology task rather than a programming one.
To get the job done, you need a fully controlled environment in which to re-run a defined-input/known-output experiment.
In principle,
all your devices and your whole setup have to be controlled, i.e.:
your MIC input (acoustic-to-electric converter): while its [dBA] -> [V] conversion is "readable" down the cable path, it is not in itself the important value,
your cable/wire path, which should not be neglected or forgotten,
your sound card's A/D converter,
your pre-calibration audio sound sample,
your pre-calibration test environment,
so that you are able to pre-calibrate your devices for measurements.
The calibration itself is achieved by using the same audio sound sample in the same test environment and measuring it with another device that has been certified by a locally recognised reference authority to have a certain level of precision (a guarantee that its readings will not deviate from the correct/exact values by more than a nationally/internationally recognised precision class allows).
Note: you may want to pre-calibrate your MIC + sound-card A/D setup inside your controlled ("in vitro") environment across a wide range of frequencies, so as to account for frequency-dependent variation along the measurement/conversion path. Your pre-calibration would then yield a calibration curve to be used as an input for your further tests performed "in vivo".
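By way of illustration only, a single-frequency pre-calibration measurement can be reduced to a scale factor in MATLAB along these lines (a sketch; the reference level, sample rate, and recording length are assumptions, and refVrms comes from the certified reference device):

% Sketch: derive a calibration factor from a reference tone measured by a certified meter.
fs      = 48000;                      % sound card sample rate (assumed)
refVrms = 1.0;                        % reference level in volts RMS from the certified device (assumed)

rec = audiorecorder(fs, 16, 1);       % 16-bit, mono recording from the sound card
recordblocking(rec, 5);               % capture 5 s of the reference tone
x = getaudiodata(rec);                % samples normalized to [-1, 1]

measuredRms = sqrt(mean(x.^2));       % RMS in full-scale units
calFactor   = refVrms / measuredRms;  % volts per full-scale unit
% Later measurements: Vrms = calFactor * sqrt(mean(y.^2)); dBV = 20*log10(Vrms).
% Repeating this per test frequency gives the calibration curve mentioned above.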

Receiving the right value when transmitting .dat file using FM radio

I am new to GNU Radio and I'm trying to transmit a value using it and the USRP B210 board.
I used MATLAB to convert the value 0.121 to WAV format, then converted the WAV file to a .dat file using the audio_to_file example in GNU Radio.
When I transmit the .dat file using the B210 and GNU Radio, I receive a WAV file, but when I read that WAV using the MATLAB function audioread() I get a different value.
P.S.
The sample rate for the converted .dat file was 44100 Hz, with 16 bits per sample.
The receiver and transmitter sampling rate is 400 kHz.
I used the fm_tx4.py example from the GNU Radio package for my transmitter.
I used uhd_nbfm_receiver.grc for the receiver.
If you're wondering why your received signal doesn't have the same amplitude as your sent signal, you're missing the very basics of radio communications: since there is no digital line between your transmitter and your receiver, power can go anywhere, and how much of it reaches the receiver depends on a lot of factors, including gain, antennas, distance, matching...
There will be a lot more things that are different on the RX side than they were on the TX side. Your reception has not been time-synchronized, so you might see a phase shift. You don't mention whether the receiver is the same B210, a clock-synchronized one, or a clock-independent one, which means you have the general case, where no two physical clocks can be identical (yes, identical clocks are physically impossible, but you can reduce the error), so you'll generally see some frequency offset, too.
I recommend reading up a bit on basic radio communication theory; I often recommend GNU Radio's pictured introduction and GNU Radio's Suggested Reading page. Michael Ossmann's courses are also well regarded, so you should definitely have a look at them.
Also, your whole data -> WAV -> transmit conversion is completely unnecessary. MATLAB's fread/fwrite functions can read/store the native machine float format that GNU Radio's file_sink/file_source blocks store/read. See the FAQ entry.
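For example, something along these lines round-trips a complex signal through that raw format (a sketch; the file names and the test signal are placeholders — GNU Radio's file_sink/file_source use interleaved 32-bit floats for complex samples):

% Sketch: write/read interleaved float32 I/Q, the format used by file_source/file_sink.
x = exp(1j*2*pi*0.01*(0:9999)).';        % stand-in complex baseband signal

fid = fopen('tx.dat', 'wb');             % file for GNU Radio's file_source
iq  = [real(x).'; imag(x).'];            % interleave as I, Q, I, Q, ...
fwrite(fid, iq(:), 'float32');
fclose(fid);

fid = fopen('tx.dat', 'rb');             % reading a file from file_sink works the same way
raw = fread(fid, Inf, 'float32');
fclose(fid);
y = raw(1:2:end) + 1j*raw(2:2:end);      % back to complex samples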

Why does MATLAB change the sample rate while trying to acquire data?

I am using a DataQ acquisition device in Matlab 32-bit with the Data Acquisition toolbox.
On occasion, when I have my sample rate set to 300, it tells me:
Warning: This hardware could not support the requested value of 300 for SampleRate. SampleRate has been set to 1000.
However, if I set SampleRate to 1000, it sometimes sets it back to 300 with the same error message.
Also, if I make the program return the SampleRate after the warning is displayed and the device has started recording, it is always whatever I set it to, not what the warning claims it was changed to.
Does anyone have an idea how I can find out what the actual sample rate was, or how to keep it from resetting mine? I need to know how many samples there are per second for further calculations.
The problem is not with MATLAB but with the DAQ; I have a similar "problem" with an NI DAQ. The hardware is set to sample at a very high rate to avoid aliasing. You could sample at a higher rate than required and then use the MATLAB command resample to reduce your sampling rate; resample will avoid any aliasing of higher frequencies.
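For example (a sketch; the rates below just match the warning above, and resample requires the Signal Processing Toolbox):

% Sketch: acquire at the rate the hardware actually supports, then resample to what you need.
fsActual = 1000;                          % rate the DAQ really used
fsWanted = 300;                           % rate needed for later calculations
t = (0:fsActual-1).'/fsActual;            % one second of stand-in data
x = sin(2*pi*5*t);                        % 5 Hz test tone
y = resample(x, fsWanted, fsActual);      % anti-alias filters and converts 1000 Hz -> 300 Hz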