Dynamically create numeric matrix from fields of a scalar structure - matlab

In MATLAB, I have a scalar structure S with some fields. Each field contains a numeric vector, and all these vectors have the same size, say n x 1.
Now I would like to create a numeric matrix based on a selection of the fields.
The starting point is a logical mask of size m x 1, where m is the number of fields of S: mask(i) is true if the ith field of S should be included in the matrix, so the matrix size would be n x sum(mask).
Example (in my code, the structure is of course not built this way):
vec = rand(1000,1);
S.f1 = vec;
S.f2 = vec;
S.f3 = vec;
S.f4 = vec;
S.f5 = vec;
mask = [false true true false false]; % 5 elements because S has 5 fields
The expected output would be:
output = [S.f2 S.f3];
But of course, the creation of output should depend dynamically on the fields of S and on mask.
Is there any way to achieve this without using an ugly construction including a filter of the struct field names, loop, etc.?
Thank you very much!
Philip

Here's one way -
fns = fieldnames(S) %// Get all fieldnames
S = rmfield(S,fns(~mask)) %// Remove all fields with masking values as false
Next, to get your numeric array, use struct2array -
output = struct2array(S)
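Putting the two steps together on the question's data — a minimal sketch; the last line uses cell2mat(struct2cell(...)) as a fallback, since struct2array is an undocumented helper that may not be available in every installation:

```matlab
vec = rand(1000,1);
S = struct('f1',vec,'f2',vec,'f3',vec,'f4',vec,'f5',vec);
mask = [false true true false false];

fns = fieldnames(S);
T = rmfield(S, fns(~mask));           % keep only the masked fields
output = cell2mat(struct2cell(T).');  % 1000 x 2, same result as struct2array(T)
```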

You can convert the struct into a cell using struct2cell, and use normal cell indexing to get the fields you want.
a = (1:5).';
s.f1 = a; s.f2 = 2*a; s.f3 = 3*a; s.f4 = 4*a; s.f5 = 5*a;
mask = logical([0 1 1 0 0]); % select f2 and f3, as in the question
c = struct2cell(s);          % note: lowercase s, matching the struct defined above
[c{mask}]
ans =
2 3
4 6
6 9
8 12
10 15

You can do something like:
fns = fieldnames(S);
fns = fns(mask);                       % keep only the selected field names
n = numel(fns);
output = zeros(numel(S.(fns{1})), n);  % preallocate without hard-coding 1000
for i = 1:n
    output(:,i) = S.(fns{i});
end

Related

Check if combination of field values exists in a MATLAB cell array of structs

Suppose I have a cell array containing structs, each with 3 fields.
When I encounter a new struct, I would like to check whether the values in 2 of its 3 fields match those of any struct elements in the array.
cell_array = cell(4,1)
cell_array{1}.Field1 = "ABC"
cell_array{1}.Field2 = 46
cell_array{1}.Field3 = 1648
% Would like to check if fields 1 and 2 match
% any struct in cell_array
new_struct.Field1 = "ABC"
new_struct.Field2 = 46
new_struct.Field3 = 1765
Thank you.
You can use MATLAB's intersect function: it returns the elements common to two arrays. It should then be as simple as:
cell_array = {'ABC', '46', '1648'};
new_array = {'ABC', '46', '1765'};
[C,~,~] = intersect(cell_array,new_array)
disp(C) % C = {'46'; 'ABC'}, a 2x1 cell array (intersect sorts its output)
% Then simply checking the length of C
if length(C) >= 2
% Perform your task
end
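If the fields hold mixed types (strings and numbers), converting everything to char for intersect can be fragile; a direct field-wise comparison with isequal is a possible alternative (a sketch, assuming the structs from the question):

```matlab
cell_array = cell(4,1);
cell_array{1} = struct('Field1', "ABC", 'Field2', 46, 'Field3', 1648);
new_struct = struct('Field1', "ABC", 'Field2', 46, 'Field3', 1765);

match = false;
for k = 1:numel(cell_array)
    s = cell_array{k};
    if ~isempty(s) && isequal(s.Field1, new_struct.Field1) ...
                   && isequal(s.Field2, new_struct.Field2)
        match = true;   % Field1 and Field2 both agree
        break
    end
end
```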

How to make calculations on certain cells (within a table) that meet specific criteria?

I have the following code:
L_sum = zeros(height(ABC),1);
for i = 1:height(ABC)
L_sum(i) = sum(ABC{i, ABC.L(i,4:281)});
end
Here is my table (screenshot omitted):
Problem: My sum function sums the entire row values (col. 4-281) per date whereas I only want those cells to be added whose headers are in the cell array of ABC.L, for any given date.
X = ABC.L{1, 1}; gives a cell array of column names (excerpt shown in a screenshot, omitted here). In the screenshots, a red arrow marked what the sum function currently references (the L entry of the same date), and a green arrow marked what I am trying to reference (the L entry of the previous date).
Thanks for your help
In general, in MATLAB you don't need for-loops for simple operations like selective sums.
Example:
Data=...
[1 2 3;
4 5 6;
7 8 9;
7 7 7];
NofRows=size(Data,1);
RowsToSum=3:NofRows;
ColToSum=[1,3];
% sum second dimension 2d array
Result=sum(Data(RowsToSum,ColToSum), 2)
% table mode
DataTable=array2table(Data);
Result2=sum(DataTable{RowsToSum,ColToSum}, 2)
To do that you need to first extract the columns you want to sum, and then sum them:
% some arbitrary data:
ABC = table;
ABC.L{1,1} = {'aa','cc'};
ABC.L{2,1} = {'aa','b'};
ABC.L{3,1} = {'aa','d'};
ABC.L{4,1} = {'b','d'};
ABC{1:4,2:5} = magic(4);
ABC.Properties.VariableNames(2:5) = {'aa','b','cc','d'}
% summing the correct columns:
L_sum = zeros(height(ABC),1);
col_names = ABC.Properties.VariableNames; % just to make things shorter
for k = 1:height(ABC)
% the following 'cellfun' compares each column to the values in ABC.L{k},
% and returns a cell array of the result for each of them, then
% 'cell2mat' converts it to logical array, and 'any' combines the
% results for all elements in ABC.L{k} to one logical vector:
col_to_sum = any(cell2mat(...
cellfun(@(x) strcmp(col_names,x),ABC.L{k},...
'UniformOutput', false).'),1);
% then a logical indexing is used to define the columns for summation:
L_sum(k) = sum(ABC{k,col_to_sum});
end
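The cellfun/strcmp construction above can also be written with ismember, which returns the same logical vector in a single call (a sketch using the same ABC table and col_names as above):

```matlab
for k = 1:height(ABC)
    col_to_sum = ismember(col_names, ABC.L{k});  % true for columns named in ABC.L{k}
    L_sum(k) = sum(ABC{k, col_to_sum});
end
```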

How to convert a struct to a matrix

I have a 10 x 10 struct with four fields a,b,c,d.
How do I convert this struct to a 10 x 10 matrix with entries only from the field a?
You can rely on the fact that str.a returns a comma-separated list. We can therefore concatenate the values together and reshape the resulting array to be the same size as the input struct.
% If a contains scalars
out = reshape([str.a], size(str));
% If a contains non-scalar arrays, collect them into a cell array instead
out = reshape({str.a}, size(str));
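A small worked sketch of the scalar case, using a 2x3 struct array:

```matlab
str = struct('a', {1 2 3; 4 5 6}, 'b', 0);  % 2x3 struct array
out = reshape([str.a], size(str))
% out =
%      1     2     3
%      4     5     6
```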
One liner solution
res = cellfun(@(strctObj) strctObj.a, str, 'UniformOutput', false);
Further explanation
Define a one-line function which extract the a value.
getAFunc = @(strctObj) strctObj.a;
then use MATLAB's cellfun to apply it to your cell array and extract the matrix:
res = cellfun(getAFunc, strctCellObj, 'UniformOutput', false);
Example
%initializes input
N=10;
str = cell(N,N);
for t=1:N*N
str{t}.a = rand;
str{t}.b = rand;
str{t}.c = rand;
str{t}.d = rand;
end
%extracts output matrix
res = cellfun(@(strctObj) strctObj.a, str, 'UniformOutput', false);

Matlab: Create a matrix from a row vector according to specifications

I am working in MATLAB. I have a row vector in and a scalar fuzzy_no. I want to create a matrix output of size fuzzy_no x (numel(in)-fuzzy_no), such that the ith column of output contains elements i:i+fuzzy_no-1 of the row vector in.
In other words I want to implement the following loop without using loops
n = numel(in);
output = zeros(fuzzy_no,n-fuzzy_no);
for i = 1:size(output,2)
output(:,i) = in(1,i:i+fuzzy_no-1);
end
Note that in your example the last element from in is missing in the output. Assuming you want all the elements, you could use indexing like so:
[ii, jj] = meshgrid(1:fuzzy_no, 0:n-fuzzy_no);
output = in(ii+jj)
Or you could use the slightly more satisfying hankel built-in:
output = hankel(in(1:fuzzy_no), in(fuzzy_no:end))
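For example, with in = 1:6 and fuzzy_no = 3, the hankel call produces the sliding windows as columns:

```matlab
in = 1:6;
fuzzy_no = 3;
output = hankel(in(1:fuzzy_no), in(fuzzy_no:end))
% output =
%      1     2     3     4
%      2     3     4     5
%      3     4     5     6
```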
Try this -
n = numel(in);
lim1 = n-fuzzy_no
t1 = bsxfun(@times,in',ones(1,lim1)) %//'
uind = triu(ones(size(t1)),1)>0
lind = [zeros(fuzzy_no,lim1) ; tril(ones([size(t1,1)-fuzzy_no lim1]))]>0
t1(uind | lind)=[];
output = reshape(t1,fuzzy_no,n-fuzzy_no)

Removing a random number of columns from a matrix

I need to take away a random number of columns from an arbitrarily large matrix, I've put my attempt below, but I'm certain that there is a better way.
function new = reduceMatrices(original, colsToTakeAway)
a = colsToTakeAway(1);
b = colsToTakeAway(2);
c = colsToTakeAway(3);
x = original(1:a-1);
y = original(a+1:b-1);
z = original(b+1:c-1);
if c == size(original, 2);
new = [x,y,z];
elseif (c+1) == size(original, 2);
new = [x,y,z,c+1]
else
new = [x,y,z,c+1:size(original, 2)];
end
Here's one approach. First, generate a row vector of random numbers with numcols elements, where numcols is the number of columns in the original matrix:
rc = rand(1,numcols)
Next make a vector of 1s and 0s from this, for example
lv = rc>0.75
which will produce something like
0 1 1 0 1
and you can use Matlab's logical indexing feature to write
original(:,lv)
which will return only those columns of original which correspond to the 1s in lv.
It's not entirely clear from your question how you want to make the vector of column selections, but this should give you some ideas.
function newM = reduceMatrices(original, colsToTakeAway)
% define the columns to keep := cols \ colsToTakeAway
colsToKeep = setdiff(1:size(original,2), colsToTakeAway);
newM = original(:, colsToKeep);
end
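A usage sketch, combining the setdiff version with randperm to pick a random number of random columns (the sizes are illustrative):

```matlab
original = rand(4, 8);
nRemove = randi(size(original,2) - 1);                % drop between 1 and 7 columns
colsToTakeAway = randperm(size(original,2), nRemove); % which columns to drop
newM = reduceMatrices(original, colsToTakeAway);
size(newM)   % 4 x (8 - nRemove)
```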