Solving a linear system of equations with two variables in MATLAB

It might seem like a simple question, but I need it. Let's assume we have two equations:
2 * y + x + 1 = 0
and
y - 2 * x = 0
I would like to find their bisectors, which can be calculated from this equation:
|x + 2 * y + 1| / sqrt(1^2 + 2^2) = |-2 * x + y| / sqrt(2^2 + 1^2)
To make a long story short, we only need to solve the system of equations below:
2 * y + x + 1 = -2 * x + y
and
2 * y + x + 1 = 2 * x - y
However, using MATLAB's solve function:
syms x y
eqn1 = 2 * y + x + 1 == -2 * x + y;
eqn2 = 2 * y + x + 1 == 2 * x - y;
[x, y] = solve(eqn1, eqn2, x, y);
will give me:
x = -1/5 and y = -2/5
But I am looking for the resulting line equations, which are:
y = -3 * x - 1 and 3 * y = x - 1
So, does anyone know how I can get the line equations above instead of the intersection point? Thanks.

The following should solve both equations with y on the left-hand side:
y1 = solve(eqn1,y)
y2 = solve(eqn2,y)
Result:
y1 =
- 3*x - 1
y2 =
x/3 - 1/3
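If you want them displayed as full equations rather than bare expressions, one option (a small sketch, assuming x and y are still symbolic and eqn1/eqn2 are defined as in the question) is to wrap the results back into symbolic equations:
syms x y
eqn1 = 2 * y + x + 1 == -2 * x + y;
eqn2 = 2 * y + x + 1 == 2 * x - y;
bisector1 = y == solve(eqn1, y)   % displays y == -3*x - 1
bisector2 = y == solve(eqn2, y)   % displays y == x/3 - 1/3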
As an aside, it would be much faster to solve this system by treating it as a linear system Ax = b rather than using MATLAB's symbolic tools:
A = [1 2; -2 1];
b = [-1; 0];
x = A\b
Result:
x =
-0.2000
-0.4000
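If you prefer the exact fractions over the decimal output, one option (a small sketch, assuming the Symbolic Math Toolbox is available) is to run the same backslash solve on symbolic inputs:
A = [1 2; -2 1];
b = [-1; 0];
x = sym(A)\sym(b)   % returns [-1/5; -2/5] exactly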

Related

large number of linear equations in MATLAB

I want to use solve() to solve a large system of linear equations. The solve() function needs equations and variables. I use a loop to generate the equations, and my variables are contained in a large array. Here is a simplified version of what I am trying to do:
x = sym('x',[1 3])
eqn = sym('eqn', [1,3])
eqn1 = 2*x1 + x2 + x3 - 2 == 0
eqn2 = 2*x2 -x2 -x1- x3 == 3
eqn3 = 2*x2+ x1 + 3*x3 == -10
Y = solve(eqn, x)
MATLAB does not recognize my variable x1. I have solved the same system using the following code:
syms x1 x2 x3
eqn1 = 2*x1 + x2 + x3 == 2
eqn2 = 2*x2 -x2 -x1 - x3 == 3
eqn3 = 2*x2+ x1 + 3*x3 == -10
X = solve([eqn1, eqn2, eqn3], [x1 x2 x3])
structfun(@subs, X)
But this is useless for a very large number of equations. What am I doing wrong?
You don't need symbolic (syms) for that. This is a standard linear system of equations that can be represented as:
Ax = b, where A = [2 1 1; -1 1 -1; 1 2 3], x = [x1; x2; x3], and b = [2; 3; -10]
To solve for x, you would first define
A = [2 1 1; -1 1 -1; 1 2 3]
and
b = [2; 3; -10]
and then solve using
x = A\b
PS. There are some odd things in your question, e.g. in eqn2 = 2*x2 -x2 -x1- x3 == 3 I assume you omitted simplifying this to -x1 + x2 - x3 == 3.
PS2. This is pretty standard MATLAB; you can find a lot of info under the standard mldivide page in the documentation, along with a lot of similar questions here on SO.
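If the system really is large and the equations are generated in a loop, a non-symbolic route (just a sketch of the idea, using the three equations above as a stand-in for a big system) is to fill A and b numerically row by row and then use backslash once:
n = 3;                 % number of unknowns (3 here; could be thousands)
A = zeros(n, n);       % consider sparse(n, n) for very large systems
b = zeros(n, 1);
% one row per equation; in a real problem these assignments would sit in a loop
A(1,:) = [ 2  1  1];   b(1) =   2;   % 2*x1 +   x2 +   x3 =   2
A(2,:) = [-1  1 -1];   b(2) =   3;   %  -x1 +   x2 -   x3 =   3
A(3,:) = [ 1  2  3];   b(3) = -10;   %   x1 + 2*x2 + 3*x3 = -10
x = A\b                % numeric solution, no Symbolic Math Toolbox needed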

Matrix Basis expansion

MATLAB:
I am trying to do a basis expansion of a huge matrix (1000x15).
For example,
X =
x1  x2
 1   4
 2   5
 3   6
I want to build a new matrix.
Y =
x1  x2  x1*x1  x1*x2  x2*x2
 1   4      1      4     16
 2   5      4     10     25
 3   6      9     18     36
Could anyone please suggest an easier way to do this?
% your input
A = [1 4; 2 5; 3 6];
% generate pairs
[p,q] = meshgrid(1:size(A,2), 1:size(A,2));
% only retain unique pairs
ii = tril(p) > 0;
% perform element wise multiplication
res = [A A(:,p(ii)) .* A(:,q(ii))];
Using the one-liner from this answer to get the 2-combinations of the indices, you can generate the matrix without the interleaved ordering with
function Y = columnCombo(Y)
    comb = nchoosek(1:size(Y,2),2);
    Y = [Y, Y.^2, Y(:,comb(:,1)).*Y(:,comb(:,2))];
end
For the interleaved ordering, I came up with this, possibly sub-optimal, solution:
function Y = columnCombo(Y)
    [m,n] = size(Y);
    comb = nchoosek(1:n,2);
    Y = [Y, zeros(m, n + size(comb,1))];
    col = n+1;
    for k = 1:n-1
        Y(:,col) = Y(:,k).*Y(:,k);
        ms = comb(comb(:,1)==k,:);
        ncol = size(ms,1);
        Y(:,col+(1:ncol)) = Y(:,ms(:,1)).*Y(:,ms(:,2));
        col = col + ncol + 1;
    end
    Y(:,end) = Y(:,n).^2;
end
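As a quick sanity check of the interleaved version, using the 3-by-2 example from the question (the commented output is what the ordering described above should produce):
A = [1 4; 2 5; 3 6];
Y = columnCombo(A)
% columns: x1, x2, x1*x1, x1*x2, x2*x2
% Y =
%      1     4     1     4    16
%      2     5     4    10    25
%      3     6     9    18    36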

Numerical solutions of a nonlinear equation with different independent values in matlab

For example, if I have this function: g = t^3 - 5*t^2 + 2
And g = [3 4 6 2 9 10 17 1]
I would like to solve the equation for each g(i) and obtain the resulting t vector.
This might guide you:
>> syms t g %// define symbolic variables
>> y = t^3 - 5*t^2 + 2 - g; %// define y so that equation is: y=0
>> g_data = [3 4 6 2 9 10 17 1]; %// define g values
>> n = 1; %// choose first value. Or use a loop: for n = 1:numel(g_data)
>> s = solve(subs(y, g, g_data(n))) %// substitute g value and solve equation y=0
s =
25/(9*((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3)) + ((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3) + 5/3
5/3 - ((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3)/2 - 25/(18*((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3)) - (3^(1/2)*(25/(9*((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3)) - ((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3))*i)/2
5/3 - ((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3)/2 - 25/(18*((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3)) + (3^(1/2)*(25/(9*((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3)) - ((108^(1/2)*527^(1/2))/108 + 277/54)^(1/3))*i)/2
>> double(s) %// show solutions as floating point values
ans =
5.039377328113847
-0.019688664056924 + 0.445027607060817i
-0.019688664056924 - 0.445027607060817i
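If only numeric roots are needed, a plain numeric alternative (a sketch, not what the answer above uses) is to call roots on the polynomial coefficients, looping over the g values:
g_data = [3 4 6 2 9 10 17 1];
t_roots = complex(zeros(3, numel(g_data)));   % column n will hold the 3 roots for g_data(n)
for n = 1:numel(g_data)
    % t^3 - 5*t^2 + 2 - g == 0  ->  coefficient vector [1 -5 0 2-g]
    t_roots(:, n) = roots([1 -5 0 2 - g_data(n)]);
end
t_roots(:, 1)   % roots for g = 3; compare with double(s) above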

How can I plot 3 equations with 3 variables in MATLAB?

I am trying to plot this system:
x1 - x2 + 3x3 = 8
2x1 - x2 + 4x3 = 11
-x1 + 2x2 - 4x3 = -11
I tried with ezsurf and meshgrid, but I wasn't able to do it.
clc
clear all
close all
A = [1 -1 3; 2 -1 4; -1 2 -4];
B = [8 11 -11]';
C = [A B];
R = rref(C);
% R =
% 1 0 0 1
% 0 1 0 -1
% 0 0 1 2
D = R(:,4); % save the 4th column, which contains the solutions
disp('The solutions of the proposed system are:');
disp(D);
figure(1);
hold on
grid on
syms x y z
eq = x + y + 3*z - 8;
Z = solve(eq,z)
ezsurf('8/3 - y/3 - x/3');
scatter3(D(1),D(2),D(3));
How can I plot this system of equations?
Maybe I'm missing something, but you have 3 unknowns x1, x2 and x3 and 3 equations, so there is a unique solution (provided the determinant of the coefficient matrix is not zero):
>> A = [1 -1 3; 2 -1 4; -1 2 -4];
>> B = [8 11 -11]';
>> x = A\B
x =
1
-1
2
So there is nothing to plot other than a single point?
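If a visual is still wanted, one way (a rough sketch, not part of the answer above) is to draw each equation as a plane x3 = f(x1, x2) and mark the intersection point returned by A\B:
A = [1 -1 3; 2 -1 4; -1 2 -4];
B = [8 11 -11]';
x = A\B;                                   % unique intersection point [1; -1; 2]
[X1, X2] = meshgrid(linspace(-3, 3, 30));  % grid over the x1-x2 plane
figure; hold on; grid on;
for k = 1:3
    % solve row k of A*x = B for x3:  A(k,3)*x3 = B(k) - A(k,1)*x1 - A(k,2)*x2
    X3 = (B(k) - A(k,1)*X1 - A(k,2)*X2) / A(k,3);
    surf(X1, X2, X3, 'FaceAlpha', 0.5, 'EdgeColor', 'none');
end
plot3(x(1), x(2), x(3), 'r.', 'MarkerSize', 25);
xlabel('x1'); ylabel('x2'); zlabel('x3'); view(3);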

Matlab Vectorization: How to avoid this "for" loop?

I have the following matrices:
X = 1 2 3
Y = 4 5 6
A = 1 2 3
    4 5 6
    7 8 9
I want to do:
for each (i,j) in A
    v = A(i,j)*X - Y
    B(i,j) = v * v'
i.e. each element of A is multiplied by the vector X, then Y is subtracted from the resulting vector, and finally we take the inner product of that vector with itself to get a single number.
Can it be done without a for loop?
One thing often forgotten in Matlab: the operator ' takes the conjugate transpose (.' is the ordinary transpose). In other words, A' == conj(trans(A)), whereas A.' == trans(A), which makes a difference if A is a complex matrix.
Ok, let's apply some mathematics to your equations. We have
v = A(i,j)*X - Y
B(i,j) = v * v'
= (A(i,j)*X - Y) * (A(i,j)*X - Y)'
= A(i,j)*X * conj(A(i,j))*X' - Y * conj(A(i,j))*X'
- A(i,j)*X * Y' + Y * Y'
= A(i,j)*conj(A(i,j)) * X*X' - conj(A(i,j)) * Y*X' - A(i,j) * X*Y' + Y*Y'
So a first result would be
B = A.*conj(A) * (X*X') - conj(A) * (Y*X') - A * (X*Y') + Y*Y'
In the case of real matrices/vectors, one has the identities
X*Y' == Y*X'
A == conj(A)
which means you can reduce the expression to
B = A.*A * (X*X') - 2*A * (X*Y') + Y*Y'
= A.^2 * (X*X') - 2*A * (X*Y') + Y*Y'
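Putting that last formula into code and checking it against the original loop (a quick sketch using the example data from the question):
X = [1 2 3];
Y = [4 5 6];
A = [1 2 3; 4 5 6; 7 8 9];
% vectorized formula derived above (real case)
B = A.^2 * (X*X') - 2*A * (X*Y') + Y*Y';
% reference: the original double loop
Bref = zeros(size(A));
for i = 1:size(A,1)
    for j = 1:size(A,2)
        v = A(i,j)*X - Y;
        Bref(i,j) = v * v';
    end
end
isequal(B, Bref)   % should return logical 1 (true)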
An alternative method:
X = [1 2 3]
Y = [4 5 6]
A = [1 2 3; 4 5 6; 7 8 9]
V = bsxfun(@minus, A(:)*X, [4 5 6])
b = sum((V.^2)')
B = reshape(b , 3, 3)
I get the result:
B =
    27     5    11
    45   107   197
   315   461   635
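On newer MATLAB releases (R2016b and later), implicit expansion makes the bsxfun call unnecessary; an equivalent sketch:
X = [1 2 3];  Y = [4 5 6];
A = [1 2 3; 4 5 6; 7 8 9];
V = A(:)*X - Y;                      % implicit expansion replaces bsxfun(@minus, ...)
B = reshape(sum(V.^2, 2), size(A))   % row-wise sums of squares, back to the shape of A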