I have a parameter X that is lognormally distributed with mean 15 and standard deviation 0.48. For Monte Carlo simulation in MATLAB, I want to generate 40,000 samples from this distribution. How can this be done in MATLAB?
To generate an MxN matrix of lognormally distributed random numbers with parameters mu and sigma, use lognrnd (Statistics Toolbox):
result = lognrnd(mu,sigma,M,N);
If you don't have the Statistics Toolbox, you can equivalently use randn and then take the exponential. This exploits the fact that, by definition, the logarithm of a lognormal random variable is a normal random variable:
result = exp(mu+sigma*randn(M,N));
The parameters mu and sigma of the lognormal distribution are the mean and standard deviation of the associated normal distribution. To see how the mean and standard deviation of the lognormal distribution are related to the parameters mu and sigma, see the lognrnd documentation.
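For the specific numbers in the question, a minimal sketch without the toolbox might look like the following. It assumes the quoted 0.48 is the standard deviation of X itself (if it is actually the variance, set v = 0.48 directly):
% Convert the desired lognormal mean/std into the parameters of the underlying normal
m = 15;                             % desired mean of X
s = 0.48;                           % desired standard deviation of X (assumed)
v = s^2;                            % desired variance of X
mu = log(m^2/sqrt(v + m^2));        % mean of log(X)
sigma = sqrt(log(1 + v/m^2));       % standard deviation of log(X)
X = exp(mu + sigma*randn(40000,1)); % 40,000 lognormal samples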
To generate random samples, you need the inverse cdf. Once you have it, generating samples is nothing more than 'my_icdf(rand(n, m))'.
First get the cdf (by integrating the pdf) and then invert that function to obtain the inverse cdf.
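As a minimal sketch of that recipe for the lognormal case (mu and sigma here are the parameters of the underlying normal; the values are only illustrative), the inverse cdf can be written with base MATLAB's erfinv, so no toolbox is needed:
% Inverse-transform sampling for Lognormal(mu,sigma) using base MATLAB only
mu = 0; sigma = 1;                                       % illustrative parameters
my_icdf = @(p) exp(mu + sigma*sqrt(2).*erfinv(2*p - 1)); % lognormal inverse cdf
samples = my_icdf(rand(1000,1));                         % draw 1000 samples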
You can convert between the mean and variance of the lognormal distribution and the parameters (mu, sigma) of the associated normal (Gaussian) distribution using the standard formulas, implemented in the code below.
The approach below uses the Probability Distribution Objects introduced in MATLAB 2013a. More specifically, it uses the makedist, random, and pdf functions.
% Notation
% if X~Lognormal(mu,sigma) then E[X] = m & Var(X) = v
m = 15; % Target mean for Lognormal distribution
v = 0.48; % Target variance for Lognormal distribution
getLmuh = @(m,v) log(m/sqrt(1+(v/(m^2))));
getLvarh = @(m,v) log(1 + (v/(m^2)));
mu = getLmuh(m,v);
sigma = sqrt(getLvarh(m,v));
% Generate Random Samples
pd = makedist('Lognormal',mu,sigma);
X = random(pd,1000,1); % Generates a 1000 x 1 vector of samples
You can verify the correctness via the mean and var functions and the distribution object:
>> mean(pd)
ans =
15
>> var(pd)
ans =
0.4800
Generating samples via the inverse transform is also made easy using the icdf (inverse CDF) function.
% Alternate way to generate X~Lognormal(mu,sigma)
U = rand(1000,1); % U ~ Uniform(0,1)
X = icdf(pd,U); % Inverse Transform
The graphic was generated by the following code (MATLAB 2018a).
Xrng = [0:.01:20]';
figure, hold on, box on
h(1) = histogram(X,'DisplayName','Random Sample (N = 1000)');
h(2) = plot(Xrng,pdf(pd,Xrng),'b-','DisplayName','Theoretical PDF');
legend('show','Location','northwest')
title('Lognormal')
xlabel('X')
ylabel('Probability Density Function')
% Options
h(1).Normalization = 'pdf';
h(1).FaceColor = 'k';
h(1).FaceAlpha = 0.35;
h(2).LineWidth = 2;
Related question:
Matlab has the function randn to draw from a normal distribution, e.g.
x = 0.5 + 0.1*randn()
draws a pseudorandom number from a normal distribution of mean 0.5 and standard deviation 0.1.
Given this, is the following Matlab code equivalent to sampling from a normal distribution truncated at 0 and 1?
x = 0.5 + 0.1*randn(); % initial draw
while x <= 0 || x > 1
    x = 0.5 + 0.1*randn();
end
Using MATLAB's Probability Distribution Objects makes sampling from truncated distributions very easy.
You can use the makedist() and truncate() functions to define the distribution object and then truncate it, which prepares the object for the random() function that generates random variates from it.
% MATLAB R2017a
pd = makedist('Normal',0.5,0.1) % Normal(mu,sigma)
pdt = truncate(pd,0,1) % truncated to interval (0,1)
numRows = 1000; numCols = 1;         % example output size
sample = random(pdt,numRows,numCols) % Sample from distribution `pdt`
Once the object is created (here it is pdt, the truncated version of pd), you can use it in a variety of function calls.
To generate samples, random(pdt,m,n) produces a m x n array of samples from pdt.
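For instance (a small illustrative sketch, reusing the pdt object defined above):
mean(pdt)     % mean of the truncated distribution
pdf(pdt,0.5)  % density at 0.5
cdf(pdt,0.75) % P(X <= 0.75) under truncation
icdf(pdt,0.5) % median of the truncated distribution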
Further, if you want to avoid using toolboxes, this answer from @Luis Mendo is correct (demonstration below).
figure, hold on
h = histogram(cr,'Normalization','pdf','DisplayName','@Luis Mendo samples'); % cr holds samples from the rejection method below
X = 0:.01:1;
p = plot(X,pdf(pdt,X),'b-','DisplayName','Theoretical (w/ truncation)');
You need the following steps:
1. Draw a random value u from the uniform distribution on (0,1).
2. Assuming the normal distribution is truncated at a and b, compute
u_bar = F(a)*u + F(b)*(1-u)
3. Apply the inverse of F:
epsilon = F^{-1}(u_bar)
epsilon is then a random value from the truncated normal distribution.
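A minimal MATLAB sketch of these three steps, assuming the Normal(0.5, 0.1) example from the question truncated to (0, 1), and writing F with base MATLAB's erfc/erfinv so no toolbox is required:
mu = 0.5; s = 0.1;                            % parameters of the untruncated normal
a = 0; b = 1;                                 % truncation bounds
n = 1e5;                                      % number of samples
F    = @(x) 0.5*erfc(-(x - mu)./(s*sqrt(2))); % normal cdf
Finv = @(p) mu + s*sqrt(2).*erfinv(2*p - 1);  % normal inverse cdf
u = rand(n,1);                                % step 1: uniform draws
u_bar = F(a).*u + F(b).*(1 - u);              % step 2: map u into [F(a), F(b)]
epsilon = Finv(u_bar);                        % step 3: invert F; truncated normal samples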
Why don't you vectorize? It will probably be faster:
N = 1e5; % desired number of samples
m = .5; % desired mean of underlying Gaussian
s = .1; % desired std of underlying Gaussian
lower = 0; % lower value for truncation
upper = 1; % upper value for truncation
remaining = 1:N;
while remaining
result(remaining) = m + s*randn(1,numel(remaining)); % (pre)allocates the first time
remaining = find(result<=lower | result>upper);
end
I have used a Gaussian process for my prediction. Now assume the predicted values are stored in ypred, of size 1900 x 1. I want to check whether their distribution follows a Gaussian (normal) distribution or not. I need this in order to compare them with the predicted values of other methods (e.g. NN, KNN) and judge which one most closely follows a smooth Gaussian distribution.
How can I do this? It would be better if I could get the result as numerical data. The code is written as follows:
m = mean(ypred); % mean of ypred
s = std(ypred); % stdev of ypred
pd = makedist('Normal','mu',m,'sigma',s); % normal distribution with mu = m and sigma = s
[h,p] = kstest(ypred,'CDF',pd); % one-sample Kolmogorov-Smirnov test against pd
The ypred values are the output obtained from MATLAB's fitrgp. A sample of the ypred values is attached here.
The second figure is a residual QQ plot of the measured and predicted values.
You can perform a one-sample Kolmogorov-Smirnov test:
x = 1 + 2.*randn(1000,1); % just some random normally distributed data; replace it with your actual 1900x1 vector
m = mean(x); % mean of x
s = std(x); % stdev of x
pd = makedist('Normal','mu',m,'sigma',s); % normal distribution with mu = m and sigma = s
[h,p] = kstest(x,'CDF',pd); % one-sample Kolmogorov-Smirnov test against pd
Here p is the p-value of the test and h = 1 if the null hypothesis is rejected at a significance level of 0.05. Since the null hypothesis is "the data follow a normal distribution", h = 0 means that normality is not rejected.
Since x in this example was sampled from a normal distribution, most likely h = 0 and p > 0.05. If you run the above code with
x = 1 + 2.*rand(1000,1); % sampled from uniform distribution
h will most likely be 1 and p < 0.05. Of course, you can write the whole thing as a one-liner to avoid creating m, s, and pd.
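For example, the one-liner could look like this (a sketch assuming x already holds your 1900 x 1 prediction vector):
[h,p] = kstest(x,'CDF',makedist('Normal','mu',mean(x),'sigma',std(x)));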
How can I use MATLAB to plot a univariate normal distribution when it has an unknown mean, but the mean is itself normally distributed with a known mean and variance?
E.g. X ~ N(mean, 4) with mean ~ N(2, 8).
Using the law of total probability, one can write
pdf(x) = int(pdf(x | mean) * pdf(mean) dmean)
So, we can calculate it in Matlab as follows:
% define the constants
sigma_x = 4;
mu_mu = 2;
sigma_mu = 8;
% define the pdf of a normal distribution using the Symbolic Toolbox
% to be able to calculate the integral
syms x mu sigma
pdf(x, mu, sigma) = 1./sqrt(2*pi*sigma.^2) * exp(-(x-mu).^2/(2*sigma.^2));
% calculate the desired pdf
pdf_x(x) = int(pdf(x, mu, sigma_x) * pdf(mu, mu_mu, sigma_mu), mu, -Inf, Inf);
pdfCheck = int(pdf_x, x, -Inf, Inf) % should be one
% plot the desired pdf (green) and N(2, 4) as reference (red)
xs = -40:0.1:40;
figure
plot(xs, pdf(xs, mu_mu, sigma_x), 'r')
hold on
plot(xs, pdf_x(xs), 'g')
Note that I also checked that the integral of the calculated pdf is indeed equal to one, which is a necessary condition for being a pdf.
The green plot is the requested pdf. The red plot is added as reference and represents the pdf for a constant mean (equal to the average mean).
I am trying to generate a random number based off of normal distribution traits that I have (mean and standard deviation). I do NOT have the Statistics and Machine Learning toolbox.
I know one way to do it would be to generate a random number r between 0 and 1 and find the value whose cumulative probability equals r. I can do this by entering the standard normal density
f = @(y) (1/(1*2.50663))*exp(-((y).^2)/(2*1^2))
and solving
r = integral(f,-Inf,z)
for z, and then converting that z-value to the final answer X with the equation
z = (X - mu)/sigma
But as far as I know, there is no MATLAB command that allows you to solve for z when z is the limit of an integral. Is there a way to do this, or is there a better way to randomly generate this number?
You can use the built-in randn function which yields random numbers pulled from a standard normal distribution with a zero mean and a standard deviation of 1. To alter this distribution, you can multiply the output of randn by your desired standard deviation and then add your desired mean.
% Define the distribution that you'd like to get
mu = 2.5;
sigma = 2.0;
% You can create a matrix of values of any size
sz = [10000 1];
value = (randn(sz) * sigma) + mu;
% mean(value)
% 2.4696
%
% std(value)
% 1.9939
If you just want a single number from the distribution, you can use the no-input version of randn to yield a scalar
value = (randn * sigma) + mu;
Just for the fun of it, you can generate a Gaussian random variable using a uniform random generator:
1. Minus the logarithm of a uniform random variable on (0,1) has an exponential distribution.
2. The square root of that has a Rayleigh distribution.
3. Multiply by the cosine (or sine) of an independent uniform random variable on (0,2*pi) and the result is Gaussian. You need to multiply by sqrt(2) to normalize.
The obtained Gaussian variable is normalized (zero mean, unit standard deviation). If you need specific mean and standard deviation, multiply by the latter and then add the former.
Example (normalized Gaussian):
m = 1; n = 1e5; % desired output size
x = sqrt(-2*log(rand(m,n))).*cos(2*pi*rand(m,n));
Check:
>> mean(x)
ans =
-0.001194631660594
>> std(x)
ans =
0.999770464360453
>> histogram(x,41)
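To obtain a specific mean and standard deviation, as described above, scale and shift the normalized samples (a sketch with assumed target values):
mu = 0.5; sigma = 0.1; % desired mean and standard deviation (assumed example values)
m = 1; n = 1e5;        % desired output size
x = mu + sigma*sqrt(-2*log(rand(m,n))).*cos(2*pi*rand(m,n));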
I also have to keep in mind the skewness and the kurtosis of the distribution, and these have to be reflected in the simulated values.
My empirical values are past stock returns (not a standard normal distribution).
Is there an existing package that will do this for me? All the packages I see online only handle the first two moments.
What you're describing is using the method of moments to define a distribution. Such methods have generally fallen out of favor in statistics. However, you can check out pearsrnd, which may work fine depending on your data.
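If you do try the Pearson-system route, a hedged sketch might look like this (the variable returns stands in for your return series; the four sample moments are treated as estimates):
% returns is assumed to hold the vector of past stock returns
mu = mean(returns);
sig = std(returns);
skew = skewness(returns);             % Statistics Toolbox
kurt = kurtosis(returns);             % Statistics Toolbox
r = pearsrnd(mu,sig,skew,kurt,1e4,1); % 10,000 simulated returns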
Instead, I'd suggest directly finding the empirical CDF for the data using ecdf and use that in conjunction with inverse sampling to generate random variates. Here's a basic function that will do that:
function r=empiricalrnd(y,varargin)
%EMPIRICALRND Random values from an empirical distribution
% EMPIRICALRND(Y) returns a single random value from distribution described by the data
% in the vector Y.
%
% EMPIRICALRND(Y,M), EMPIRICALRND(Y,M,N), EMPIRICALRND(Y,[M,N]), etc. return random arrays.
[f,x] = ecdf(y);
r = rand(varargin{:});
r = interp1(f,x,r,'linear','extrap');
You can play with the options for interp1 if you like. And here's a quick test:
% Generate demo data for 100 samples of log-normal distribution
mu = 1;
sig = 1;
m = 1e2;
rng(1); % Set seed to make repeatable
y = lognrnd(mu,sig,m,1);
% Generate 1000 random variates from data
n = 1e3;
r = empiricalrnd(y,n,1);
% Plot CDFs
ecdf(y);
hold on;
ecdf(r);
x = 0:0.1:50;
plot(x,logncdf(x,mu,sig),'g');
legend('Empirical CDF','CDF of sampled data','Actual CDF');