Matrix multiplication with different dimensions - MATLAB

I need to vectorize the following loop
a = rand(m,n,k);
b = rand(n,k);
c = zeros(m,k);
for ik = 1:k
    c(:,ik) = a(:,:,ik)*b(:,ik);
end
I couldn't find a MATLAB function for doing this, and I think bsxfun with @times does something different. Could you please help me with this?

I think you can use bsxfun as follows (can't test this right now - let me know if this gives you trouble):
c = squeeze(sum(bsxfun(@times, a, b), 2));
The bsxfun will expand the matrix b and then do element-by-element multiplication. The sum operation on the second dimension takes care of the "matrix multiplication" aspect. It is possible that you need to expand b to have an explicit singleton first dimension:
c = squeeze(sum(bsxfun(@times, a, reshape(b, 1, n, k)), 2));
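If you want to sanity-check it against your loop once you can run MATLAB, something like this should do (m, n, k below are just arbitrary small test sizes):
m = 5; n = 4; k = 3;   % arbitrary small test sizes
a = rand(m,n,k);
b = rand(n,k);
c_loop = zeros(m,k);
for ik = 1:k
    c_loop(:,ik) = a(:,:,ik)*b(:,ik);
end
c_vec = squeeze(sum(bsxfun(@times, a, reshape(b, 1, n, k)), 2));
max(abs(c_loop(:) - c_vec(:)))   % should be 0 or on the order of eps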

Related

Multiply each element of a vector by each element of another vector

I have two very big column vectors, A and B, of size ax1 and bx1, respectively. I want to construct a vector C of size (b*a)x1 by computing A(i)*B(j) for each i and j. To illustrate what I mean:
clear
a=10^3;
b=10^3;
A=randn(a,1);
B=randn(b,1);
Ctemp=zeros(a,b);
for i=1:a
    for j=1:b
        Ctemp(i,j)=A(i)*B(j);
    end
end
C=reshape(Ctemp, a*b,1);
Question: is there a more efficient way to obtain C which avoids the double loop? My actual a and b are bigger than 10^3.
This is a simple case of array multiplication that can benefit from implicit (or explicit) expansion:
% Implicit (R2016b and newer):
C = A(:) .* B(:).'; % optionally surround by reshape( ... , [], 1);
% "Explicit" (R2007a and newer):
C = bsxfun( @times, A(:), B(:).' );
From there it's just a matter of reshaping, as you're already doing (D = C(:) or D = C(:).').
You can also calculate the outer product of the vectors, resulting in a matrix of your required terms:
C = A*B'; % Assuming A,B are column vectors here
And reshape the output afterward as stated. Not sure if more efficient though.
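If you want to compare these against your double loop, a quick check could look like the following; the kron line is an extra equivalent one-liner I'm adding here, since kron(B,A) of two column vectors produces exactly the same stacked products in the same order:
a = 4; b = 3;   % small test sizes, just for checking
A = randn(a,1);
B = randn(b,1);
Ctemp = zeros(a,b);
for i = 1:a
    for j = 1:b
        Ctemp(i,j) = A(i)*B(j);
    end
end
C_loop = reshape(Ctemp, a*b, 1);
C1 = reshape(A(:) .* B(:).', [], 1);               % implicit expansion (R2016b+)
C2 = reshape(bsxfun(@times, A(:), B(:).'), [], 1);
C3 = kron(B, A);                                   % the extra one-liner mentioned above
isequal(C_loop, C1, C2, C3)                        % expected true: same products, same order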

Element-by-element matrix multiplication in MATLAB

So I have the following matrices:
A = [1 2 3; 4 5 6];
B = [0.5 2 3];
I'm writing a function in MATLAB that will allow me to multiply a matrix and a vector element-by-element, as long as the number of elements in the vector matches the number of columns of the matrix. A has 3 columns:
1 2 3
4 5 6
B also has 3 elements so this should work. I'm trying to produce the following output based on A and B:
0.5 4 9
2 10 18
My code is below. Does anyone know what I'm doing wrong?
function C = lab11(mat, vec)
C = zeros(2,3);
[a, b] = size(mat);
[c, d] = size(vec);
for i = 1:a
    for k = 1:b
        for j = 1
            C(i,k) = C(i,k) + A(i,j) * B(j,k);
        end
    end
end
end
MATLAB already has functionality to do this in the bsxfun function. bsxfun will take two matrices and duplicate singleton dimensions until the matrices are the same size, then perform a binary operation on the two matrices. So, for your example, you would simply do the following:
C = bsxfun(@times,mat,vec);
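For example, with the A and B from your question:
A = [1 2 3; 4 5 6];
B = [0.5 2 3];
C = bsxfun(@times, A, B);   % C is [0.5 4 9; 2 10 18], as requested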
Referencing MrAzzaman, bsxfun is the way to go with this. However, judging from your function name, this looks like it's homework, so let's stick with what you have originally. As such, you only need to write two for loops. You would use the second for loop to index into both the vector and the columns of the matrix at the same time. The outermost for loop would access the rows of the matrix. In addition, you are referencing A and B, which are variables that don't exist in your code. You are also initializing the output matrix C to be 2 x 3 always. You want this to be the same size as mat. I also removed your checking of the length of the vector because you weren't doing anything with the result.
As such:
function C = lab11(mat, vec)
[a, b] = size(mat);
C = zeros(a,b);
for i = 1:a
    for k = 1:b
        C(i,k) = mat(i,k) * vec(k);
    end
end
end
Take special note of what I did. The outermost for loop accesses the rows of mat, while the innermost loop accesses the columns of mat as well as the elements of vec. Bear in mind that the number of columns of mat needs to be the same as the number of elements in vec. You should probably check for this in your code.
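A minimal sketch of that check, placed at the top of the function, could be:
if size(mat, 2) ~= numel(vec)
    error('Number of columns of mat must match the number of elements in vec.');
end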
If you don't like using the bsxfun approach, one alternative is to take the vector vec and make a matrix out of it that is the same size as mat by stacking the vector vec on top of itself for as many times as there are rows in mat. After this, you can do element-by-element multiplication. You can do this stacking by using repmat, which repeats a vector or matrix a given number of times in any dimension(s) you want. As such, your function would be simplified to:
function C = lab11(mat, vec)
rows = size(mat, 1);
vec_mat = repmat(vec, rows, 1);
C = mat .* vec_mat;
end
However, I would personally go with the bsxfun route. bsxfun basically does what the repmat paradigm does under the hood. Internally, it ensures that both of your inputs have the same size. If it doesn't, it replicates the smaller array / matrix until it is the same size as the larger array / matrix, then applies an element-by-element operation to the corresponding elements in both variables. bsxfun stands for Binary Singleton EXpansion FUNction, which is a fancy way of saying exactly what I just talked about.
Therefore, your function is further simplified to:
function C = lab11(mat, vec)
C = bsxfun(@times, mat, vec);
end
Good luck!

Multiply tensor with vector

A = ones(4,4,4);
b = [1,2,3,4];
I wish to multiply A with b in such a manner that:
ans(:,:,1) == ones(4,4)*b(1);
ans(:,:,2) == ones(4,4)*b(2);
etc.
I think you are looking for the following:
A = ones(4,4,4);
B = 1:4;
C = shiftdim(B,-1);
bsxfun(@times,A,C)
Shiftdim makes sure the vector is placed in the right dimension. Then bsxfun makes sure the vector gets expanded to match the matrix, after which they can be properly multiplied.
If you struggle to understand this function, you may just want to use a loop over the entries of b, as that should allow you to get this result as well.
In addition to Dennis' answer, you can combine permute and bsxfun like this:
bsxfun(@times, A, permute(b,[3 1 2]))
permute shifts the dimension of b so that it lies along the third dimension, and bsxfun makes sure the dimensions match when doing the multiplication.
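As a side note, if you are on R2016b or newer you can drop bsxfun and rely on implicit expansion; a rough equivalent would be:
A = ones(4,4,4);
b = [1,2,3,4];
C = A .* reshape(b, 1, 1, []);   % reshape pushes b onto the third dimension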
I realize that you probably need to tweak this to make it fit your needs. Therefore, if you have a hard time understanding how bsxfun, permute, shiftdim etc. work, don't care about performance and don't intend to use MATLAB the way it's supposed to be used... you can always do it using loops.
C = zeros(size(A));
for ii = 1:numel(b)
    C(:,:,ii) = A(:,:,ii)*b(ii);
end

Multidimensional Array Multiplication in MATLAB

I have the following three arrays in Matlab:
A size: 2xMxN
B size: MxN
C size: 2xN
Is there any way to remove the following loop to speed things up?
D = zeros(2,N);
for i=1:N
    D(:,i) = A(:,:,i) * ( B(:,i) - A(:,:,i)' * C(:,i) );
end
Thanks
Yes, it is possible to do without the for loop, but whether this leads to a speed-up depends on the values of M and N.
Your idea of a generalized matrix multiplication is interesting, but it is not exactly to the point here, because through the repeated use of the index i you effectively take a generalized diagonal of a generalized product, which means that most of the multiplication results are not needed.
The trick to implement the computation without a loop is to a) match matrix dimensions through reshape, b) obtain the matrix product through bsxfun(@times, …) and sum, and c) get rid of the resulting singleton dimensions through reshape:
par = B - reshape(sum(bsxfun(@times, A, reshape(C, 2, 1, N)), 1), M, N);
D = reshape(sum(bsxfun(@times, A, reshape(par, 1, M, N)), 2), 2, N);
par is the value of the inner expression in parentheses, D the final result.
As said, the timing depends on the exact values. For M = 100 and N = 1000000 I find a speed-up by about a factor of two, for M = 10000 and N = 10000 the loop-less implementation is actually a bit slower.
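As a further aside (not benchmarked here), if you have access to R2020b or newer, pagemtimes can express the same page-wise products directly; a sketch of the equivalent computation:
AtC = pagemtimes(A, 'transpose', reshape(C, 2, 1, N), 'none'); % pages A(:,:,i)'*C(:,i), size M-by-1-by-N
par = reshape(B, M, 1, N) - AtC;                               % B(:,i) - A(:,:,i)'*C(:,i)
D = reshape(pagemtimes(A, par), 2, N);                         % A(:,:,i) * par(:,:,i)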
You may find that the following
D=tprod(A,[1 -3 2],B-tprod(A,[-3 1 2],C,[-3 2]),[-3 2]);
cuts the time taken. I did a few tests and found the time was cut in about half.
tprod is available at
http://www.mathworks.com/matlabcentral/fileexchange/16275
tprod requires that A, B and C are full (not sparse).

bsxfun-like for matrix product

I need to multiply a matrix A with n matrices, and get n matrices back. For example, multiply a 2x2 matrix with 3 2x2 matrices stacked as a 2x2x3 MATLAB array. bsxfun is what I usually use for such situations, but it only applies to element-wise operations.
I could do something like:
blkdiag(a, a, a) * blkdiag(b(:,:,1), b(:,:,2), b(:,:,3))
but I need a solution for arbitrary n.
You can reshape the stacked matrices. Suppose you have a k-by-k matrix a and a stack sb of m k-by-k matrices, and you want the product a*sb(:,:,ii) for ii = 1..m. Then all you need is
sza = size(a);
b = reshape( sb, sza(2), [] ); % concatenate all matrices along the second dim
res = a * b;
res = reshape( res, sza(1), [], size(sb,3) ); % stack back to 3d
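A quick sanity check against a plain loop, using sizes like the 2x2x3 example from the question:
a = rand(2,2);
sb = rand(2,2,3);
sza = size(a);
res = reshape( a * reshape(sb, sza(2), []), sza(1), [], size(sb,3) );
ref = zeros(size(res));
for ii = 1:size(sb,3)
    ref(:,:,ii) = a * sb(:,:,ii);
end
max(abs(res(:) - ref(:)))   % should be 0, or at worst on the order of eps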
Your solution can be adapted to arbitrary size using comma-separated lists obtained from cell arrays:
[k m n] = size(B);
Acell = mat2cell(repmat(A,[1 1 n]),k,m,ones(1,n));
Bcell = mat2cell(B,k,m,ones(1,n));
blkdiag(Acell{:}) * blkdiag(Bcell{:});
You could then stack the blocks on a 3D array using this answer, and keep only the relevant ones.
But in this case a good old loop is probably faster:
C = NaN(size(B));
for nn = 1:n
    C(:,:,nn) = A * B(:,:,nn);
end
For large stacks of matrices and/or vectors over which to execute matrix multiplication, speed can start becoming an issue. To avoid re-inventing the wheel, you could simply compile and use the following fast MEX code:
MTIMESX - Mathworks.
As a rule of thumb, MATLAB is often quite inefficient at executing for loops over large numbers of operations which look like they should be vectorizable; I cannot think of a straightforward way of generalising Shai's answer to this case.