Frequency representation using discrete wavelet transformation - matlab

I am trying to use the discrete wavelet transform in MATLAB to represent a song in the frequency domain: perform a decomposition and pick out the singer's frequencies within the song.
The problem is that dwt and the decomposition it produces represent the signal only in the time domain.
How can I represent it in the frequency domain? If the DWT doesn't do that, what should I use?
Thank you

When we say "frequency transform" or talk about "representing frequency", we are usually talking about the Fourier transform, implemented as the DFT, or discrete Fourier transform. Andre is correct in the comments below when he says that the DWT is also a type of frequency transform; however, when we say "represent a song in the frequency domain" it usually means the DFT, not the DWT.
That being said, I don't recommend the DWT for music and sound analysis because the analysis bands are fixed at one-octave, which is simply too wide to do anything meaningful with. There are other techniques related to wavelets that are more effective for audio, but I don't gather from your question that you are using one of them.
In addition to the DFT, which is usually implemented as the FFT, or fast Fourier transform, you may also want to read about the STFT (short-time Fourier transform).
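If it helps, here is a minimal sketch (not from the original answer) of getting a time-frequency view of a song with the short-time Fourier transform in MATLAB; the filename and window settings are placeholders:
[x, fs] = audioread('song.wav');             % 'song.wav' is a placeholder filename
x = mean(x, 2);                              % mix down to mono
win  = round(0.046 * fs);                    % ~46 ms analysis window (assumed)
hop  = round(win/2);                         % 50% overlap
nfft = 2^nextpow2(win);
% The spectrogram is the magnitude of the short-time Fourier transform
spectrogram(x, hamming(win), win - hop, nfft, fs, 'yaxis');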

Related

Fourier transform and filtering the MD trajectory data instead of PCA for dimensionality reduction?

I was using PCA for dimensionality reduction of MD (molecular dynamics) trajectory data from some protein simulations. Basically my data is the xyz coordinates of the protein atoms, which change with time (that means I have a lot of frames of these xyz coordinates). The dimension of this data is something like 20000 frames of 200x3 (atoms by coordinates). I implemented PCA using the princomp command in Matlab.
I was wondering if I can do an FFT on my data. I have experience doing FFT on audio signals (1D signals). Here my data has both time and space in the picture. It must be theoretically possible to apply an FFT to my data and then filter it with an LPF (low-pass filter). But I am unsure.
Can someone give me some direction/code snippets/references towards implementing FFT on my data?
Why do people prefer PCA over FFT and filtering? Is it because of the computational efficiency of the algorithm, or because of the statistical nature of the underlying data?
For the first question "Can someone give me some direction/code snippets/references towards implementing FFT on my data?":
I should say that fft is implemented in Matlab and you do not need to implement it on your own. Also, for your case you should use fftn (see the fft documentation) to transform, apply low-pass filtering designed with designfilt (filter design in Matlab), and then apply ifftn (inverse FFT in Matlab) to invert the transform.
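For illustration, here is a simplified sketch of frequency-domain low-pass filtering along the time axis of the trajectory (the array X, the frame rate fs and the cutoff fc are placeholders, and fft along the time dimension is used here instead of the full fftn):
nFrames = 20000; nAtoms = 200;
X  = randn(nFrames, nAtoms, 3);      % stand-in for the real xyz trajectory
fs = 1;                              % frames per time unit (assumed)
fc = 0.05;                           % cutoff frequency (assumed)
Xf = fft(X, [], 1);                  % FFT along the time dimension
f  = (0:nFrames-1)' * fs / nFrames;  % two-sided frequency axis
keep = (f <= fc) | (f >= fs - fc);   % keep both low-frequency halves
Xf(~keep, :, :) = 0;                 % zero out high-frequency components
Xlp = real(ifft(Xf, [], 1));         % low-pass filtered trajectory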
For the second question "Why people are preferring PCA more often compared to FFT and filtering ...":
I should say that since filtering with the FFT is done in the frequency domain, after filtering you can no longer tell when things happened in time. You can find more details about this drawback in this article:
But, Fourier analysis has also some other serious drawbacks. One of them may be that time information is lost in transforming to the frequency domain. When looking at a Fourier transform of a signal, it is impossible to tell when a particular event has taken place. If it is a stationary signal - this drawback isn't very important. However, most interesting signals contain numerous non-stationary or transitory characteristics: drift, trends, abrupt changes, and beginnings and ends of events. These characteristics are often the most important part of the signal, and Fourier analysis is not suitable in detecting them.

Plotting Acoustic Waveform - Magnitude on a Linear Frequency Scale

I have an acoustic waveform of a Spanish phoneme and I'd like to compute its magnitude spectrum and plot it in dB magnitude on a linear frequency scale. How would I be able to accomplish this in MATLAB?
Thanks
First a quick heads up: At stackoverflow you are expected to show some of your own efforts to solve the problem and then ask for help.
Now to your question:
You can plot the spectrogram using the "spectrogram" Matlab function.
[s,f,t] = spectrogram(x,window,noverlap,f,fs)
Check the details here: https://www.mathworks.com/help/signal/ref/spectrogram.html
For a speech signal you will want to specify the sampling frequency "fs"; you can get that when you read the file using:
[y,Fs] = audioread(filename)
You will probably want to specify the variables "window" and "noverlap", since speech signals can show distinct properties depending on the size of the window (fast phenomena will not be visible with big windows). Typical values are 20 ms windows with 10 ms overlap (select the exact values by considering your sampling frequency and the nearest 2^n value for fast Fourier calculation).
The window size and overlap also matter when you calculate a spectrum. If you apply the FFT to the whole waveform, you will get the "average" spectral information for the sentence. To catch specific phenomena you must use windowing techniques and perform a short-time Fourier analysis.
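For illustration, a minimal sketch of both steps (a whole-waveform magnitude spectrum in dB on a linear frequency scale, then a 20 ms / 10 ms short-time analysis); the filename is a placeholder:
[y, Fs] = audioread('phoneme.wav');         % 'phoneme.wav' is a placeholder
y = y(:,1);                                 % use one channel
% Whole-waveform magnitude spectrum in dB on a linear frequency scale
N = 2^nextpow2(length(y));
Y = fft(y, N);
f = (0:N/2) * Fs / N;
figure;
plot(f, 20*log10(abs(Y(1:N/2+1)) + eps));
xlabel('Frequency (Hz)'); ylabel('Magnitude (dB)');
% Short-time analysis with ~20 ms windows and 10 ms overlap
win  = round(0.020 * Fs);
hop  = round(0.010 * Fs);
nfft = 2^nextpow2(win);
figure;
spectrogram(y, hamming(win), win - hop, nfft, Fs, 'yaxis');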
Alternatively, you can use sptool from the Signal Processing Toolbox; see its documentation.

How to get the low, middle and high frequency components of an image?

I am currently looking into an image processing project and just wondering how to obtain the low, middle and high frequency components of an image? For example, as shown in this picture (I found it by googling, without a detailed description of how it was obtained, but presumably using some filtering).
Also, I came across this post about using the discrete cosine transform (DCT), which can help us get the low and high frequency components of an image. Just wondering how to use the DCT to get the middle frequency component?
Link of DCT
I also have very basic knowledge about filtering. I think there are also Gaussian high/low-pass filters available to use, and also wavelet-based filtering. Just wondering, what are the differences between Gaussian, wavelet and DCT based filtering? Which one should I use?
Typical steps would be:
use a Fourier transform to bring the image into the frequency domain
apply filtering by zeroing out areas of the FFT image
reverse the Fourier transform to bring the image back to the spatial domain
This is a really good example of high/low/mid pass filters in frequency domain: http://paulbourke.net/miscellaneous/imagefilter/
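For illustration, here is a hedged sketch of those three steps in Matlab, with made-up cutoff radii r1 and r2 defining the low, middle and high bands:
img = double(imread('cameraman.tif')) / 255;   % example image from the Image Processing Toolbox
F   = fftshift(fft2(img));                     % centered 2-D spectrum
[rows, cols] = size(img);
[u, v] = meshgrid(1:cols, 1:rows);
d = hypot(u - cols/2, v - rows/2);             % distance from the spectrum center
r1 = 20; r2 = 60;                              % assumed cutoff radii (in pixels)
lowF  = F .* (d <= r1);                        % zero out everything outside each band
midF  = F .* (d > r1 & d <= r2);
highF = F .* (d > r2);
low  = real(ifft2(ifftshift(lowF)));           % back to the spatial domain
mid  = real(ifft2(ifftshift(midF)));
high = real(ifft2(ifftshift(highF)));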
You will want to use Matlab's built-in fft (fast Fourier transform) function. Fourier transforms are an extremely powerful method to filter frequencies. http://www.mathworks.com/help/matlab/ref/fft.html has some great examples of how to use fft. Once you find the frequencies that make up the image, you can take out the undesired frequencies and then inverse Fourier transform to obtain the new image.

Deconvolution of data convolved by a Gaussian response

I have a set of experimental data s(t) which consists of a vector (with 81 points as a function of time t).
From the physics, this is the result of the convolution of the system response e(t) with a probe p(t), which is a Gaussian (actually a laser pulse). In terms of vector, its FWHM covers approximately 15 points in time.
I want to deconvolve this data in Matlab using the convolution theorem: FT{e(t)*p(t)}=FT{e(t)}xFT{p(t)} (where * is the convolution, x the product and FT the Fourier transform).
The procedure itself is no problem: if I suppose a Dirac function as my probe, I recover exactly the initial signal (which makes sense, since measuring a system with a Dirac gives its impulse response).
However, the Gaussian case as a probe, as far as I understood, turns out to be a critical one. When I divide the signal in Fourier space by the FT of the probe, the near-zero tails of the Gaussian highly amplify those frequencies and I completely lose my initial signal instead of getting a deconvolved one.
From your experience, which method could be used here (like a Hamming window or any other windowing technique, or ...)? This looks rather simple, but I did not find any easy way to follow in signal processing and this is not my field.
You have noise in your experimental data, don't you? The problem is then ill-posed (not uniquely solvable) and you need regularization.
If the noise is Gaussian the keywords are Tikhonov regularization or Wiener filtering.
Basically, add a positive regularization factor that acts as a lowpass filter. In your notation the estimate of the true curve e(t) then becomes:
e_est(t) = FT^-1( FT(s) * conj(FT(p)) / ( |FT(p)|^2 + l ) )
with a suitable l > 0.
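For illustration, a self-contained sketch of this regularized division in Matlab; the signal, the Gaussian probe and the factor lambda are all made up:
n      = 81;
t      = (0:n-1)';
p      = exp(-0.5*((t - 40)/6).^2);  p = p / sum(p);  % assumed Gaussian probe
e_true = double(t >= 20 & t <= 30);                   % toy "system response"
s      = real(ifft(fft(e_true) .* fft(p)));           % convolved measurement...
s      = s + 0.01*randn(n, 1);                        % ...plus some noise
lambda = 1e-2;                                        % assumed regularization factor
P = fft(p);  S = fft(s);
E = S .* conj(P) ./ (abs(P).^2 + lambda);             % regularized division
e_est = real(ifft(E));                                % estimate of e(t)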
You're trying to perform deconvolution under the assumption that the filter model is a Gaussian blur.
A few notes on doing deconvolution:
Since your data is real (not synthetic) data, it includes some kind of noise.
Hence it is better to use the Wiener filter (even with the assumption of low-variance noise). Otherwise, the "deconvolution filter" will amplify the noise significantly (as it is basically a high-pass filter).
When doing the division in the Fourier domain, zero-pad the signals to the correct size, or better yet create the Gaussian filter in the time domain with the same number of samples as the signal.
Boundaries will create artifacts; windowing might be useful.
There are many more sophisticated methods for deconvolution that put a more elaborate model on the signal and the noise. If you have more prior information about them, you should look for that kind of framework.
You can always set a threshold on the amplification level for certain frequencies; do that if needed (see the sketch after these notes).
Use as many samples as you can.
I hope this will assist you.
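As an illustration of the amplification-threshold note above (a sketch with made-up data, not a definitive recipe): cap the magnitude of the naive inverse filter 1/FT(p) while keeping its phase.
n = 81;  t = (0:n-1)';
p = exp(-0.5*((t - 40)/6).^2);  p = p / sum(p);  % assumed Gaussian probe
s = randn(n, 1);                                 % stand-in for the measured data
maxGain = 50;                                    % assumed amplification cap
P = fft(p);  S = fft(s);
G = 1 ./ (P + eps);                              % naive inverse filter
G = G .* min(1, maxGain ./ abs(G));              % limit |G| to maxGain
e_est = real(ifft(S .* G));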

Strange artefact in my Fourier transform

I have performed an FFT (fast Fourier transform) on a time-series waveform in Matlab, but I seem to have a weird wave in the Fourier transform plot: although there are spikes, this wave looks like something I'd expect to see only in the time domain. Is there any programming reason why this could happen?
The Fourier transform is quite similar to the Inverse Fourier Transform. A spike in one is a wave in the other. Hence, if you have one outlier datapoint in your series, you'll have a wave component in the frequency domain.
A possible programming-related issue could be an uninitialized data point, e.g. providing 1023 datapoints to a 1024-point FFT.
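A tiny illustration of the spike-becomes-wave point, with made-up data: the DFT of a single spike at sample k is a complex exponential, so its real part is a cosine.
N = 1024;
x = zeros(N, 1);
x(301) = 1;                     % one outlier / spike
X = fft(x);
plot(real(X));                  % a cosine sweeping ~300 cycles across the spectrum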
The fft assumes the signal is periodic so you can get some artefacts if the first and last values differ by enough to make that transition look like a step function. You are frequently better off windowing the data to avoid that phenomenon.
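For illustration, a small sketch (made-up signal and parameters) comparing the magnitude spectrum with and without a Hann window:
fs = 1000;  t = (0:1023)'/fs;
x  = sin(2*pi*50*t + 0.7);           % non-integer number of periods -> leakage
X_raw = abs(fft(x));
X_win = abs(fft(x .* hann(length(x))));
f = (0:length(x)-1)' * fs / length(x);
semilogy(f(1:512), X_raw(1:512), f(1:512), X_win(1:512));
legend('rectangular (no window)', 'Hann window');
xlabel('Frequency (Hz)'); ylabel('|X(f)|');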
Note that the continuous-time Fourier transform of a finite-length signal can have things that look like "spikes" in the frequency domain. See the plots in this post of the continuous-time Fourier transform of a single period of a cosine signal and of ten periods of a cosine signal.
For example, an infinite-extent cosine signal has a simple Fourier transform that's a pair of impulses at +/- the cosine frequency. But if you've only got ten periods of the cosine signal, each impulse is spread into a sinc-shaped lobe (see the plots in the linked post).
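For a rough illustration (made-up frequency and sampling rate), you can reproduce the effect yourself:
f0 = 10;  fs = 1000;
t  = (0:1/fs:10/f0 - 1/fs)';          % exactly ten periods of the cosine
x  = cos(2*pi*f0*t);
N  = 8192;                            % zero-pad to resolve the lobe shape
X  = fftshift(abs(fft(x, N)));
f  = (-N/2:N/2-1)' * fs / N;
plot(f, X); xlim([-40 40]);           % sinc-shaped lobes at +/- 10 Hz
xlabel('Frequency (Hz)'); ylabel('|X(f)|');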
Steve's currently doing a nice series on Fourier transforms on his blog. He's specifically talking about 2D transforms, but you might find his discussion of windowing helpful.