Pop-up windows asking the user for input / saving their work (MATLAB)

I am writing a program and I need some help. It starts by asking this question:
A = questdlg('What would you like to do?','Artificial Neural Network',...
'Train','Test','Exit','Exit');
Then, depending on what the user chooses, it asks certain questions and does certain things:
if strcmp(A,'Train')
    B = questdlg('Would you like to create a new network or add to the already trained data?',...
        '!','Create','Add','Exit','Exit');
    if strcmp(B,'Create')
        %add as many text files as he wants to - need to figure out how I
        %can extract the data from them though
        [fname,dirpath] = uigetfile('*.txt','Select a txt file','MultiSelect',...
            'on');
    elseif strcmp(B,'Add')
        %choose what type it is
        D = listdlg('PromptString','What colour is it?',...
            'SelectionMode','single', 'ListString',...
            {'Strawberry','Orange',...
            'Chocolate','Banana','Rose'}, 'Name','Select Ice Cream',...
            'ListSize',[230 130]);
        %and then whatever choice he chooses it will feed it to the main
        %function. For example if he chooses Orange then it will go to the
        %second part of the training, if he chooses Rose then to the fifth
        %one, and so on.
    else % 'Exit'
        disp('Exit')
    end
end
So the things I want help with are:
How can the user, once he has imported the txt files into MATLAB, use them to run the program? and
How can the user add more choices to the listdlg, so that when he picks one, the program automatically goes to the corresponding step of the code?
Any help would be appreciated!
Thanks!! :)
PS: Sorry for the long post!

With uigetfile etc. you only get the filename and path. To get the data you have to load the file:
For mat-files use:
TMW: load mat-files
For other files use:
TMW: load data from file
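For example, here is a minimal sketch combining uigetfile with importdata (it assumes the txt files contain plain numeric columns; for more complicated layouts use textscan with an explicit format):
[fname, dirpath] = uigetfile('*.txt', 'Select txt files', 'MultiSelect', 'on');
if isequal(fname, 0)
    return                       % user pressed Cancel
end
if ischar(fname)
    fname = {fname};             % a single selection comes back as a char, not a cell
end
data = cell(1, numel(fname));
for k = 1:numel(fname)
    data{k} = importdata(fullfile(dirpath, fname{k}));   % numeric matrix per file
end
Each data{k} can then be passed on to the training part of your program.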

To open a file in MATLAB, you can use uigetfile. To save a file, you can use uiputfile. These open the standard file dialog boxes for opening and saving files. With 'MultiSelect' turned on, the result is a cell array of file names; you can then use textscan to read the data from the individual files.
You should use switch-case. On selecting one of the choices, you can train the neural network accordingly. The training should preferably be written in separate m-files or different subfunctions for readability.
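As a rough sketch of that idea (the disp calls stand in for your own training subfunctions, which are not shown here):
flavours = {'Strawberry','Orange','Chocolate','Banana','Rose'};
D = listdlg('PromptString','What colour is it?','SelectionMode','single', ...
    'ListString',flavours,'Name','Select Ice Cream','ListSize',[230 130]);
if isempty(D)
    disp('Exit')                 % user pressed Cancel
    return
end
switch flavours{D}               % dispatch on the selected name, not the index
    case 'Orange'
        disp('Running the second part of the training...')   % call your Orange subfunction here
    case 'Rose'
        disp('Running the fifth part of the training...')    % call your Rose subfunction here
    otherwise
        disp(['No training routine defined yet for ' flavours{D}])
end
To let the user add more choices, you only need to extend the flavours cell array and add a matching case; switching on the name rather than the index keeps the existing branches working when the list grows.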

Related

pixelLabelDatastore from loaded image in workspace

I have multiple small *.mat files, each containing 4 input images (template{1:4} and a second channel template2{1:4}) and 4 output images (region_of_interests{1:4}, binarized 'mask' images), to train a deep neural network.
I basically followed an example on Mathworks, which suggests using a function (in this example @matreader) to read in custom file formats.
However ...
It seems impossible to load multiple images from one *.mat file using any load function, as it only allows one output, and imageDatastore doesn't seem to allow loading data from the workspace. How could this be achieved?
Similarly, it seems impossible to load a pixelLabelDatastore from a workspace variable. As a workaround I ended up saving the contents of my *.mat file to an image (using imwrite, saving to save_dir) and re-loading it from there (in this case the function doesn't even allow loading *.mat files). (How) can this be achieved without re-saving the file as an image?
Here is my failed attempt to do so:
%main script
image_dir = pwd; %location of *.mat files
save_dir = [pwd '/a/']; %location of saved output masks
imds = imageDatastore(image_dir,'FileExtensions','.mat','ReadFcn',@matreader); %load template (input) images
pxds = pixelLabelDatastore(save_dir,{'nothing','something'},[0 255]);%load region_of_interests (output) image
%etc, etc, go on to train network
%matreader function, save as separate file
function data=matreader(filename)
in=1; %drop the 3 other images stored in template{1:4}
load(filename); %loads template and template2, containing 4x input images each
data=cat(3,template{in},template2{in}); %concatenate 2 template input images in 3rd dimension
end
%generate example data for this question, will save into a file 'example.mat' in workspace
for ind=1:4
template{ind}=rand([200,400]);
template2{ind}=rand([200,400]);
region_of_interests{ind}=rand([200,400])>.5;
end
save('example','template','template2','region_of_interests')
You should be able to achieve this using the standard load and save functions. Have a look at this code:
image_dir = pwd;
save_dir = pwd;
imds = imageDatastore(image_dir,'FileExtensions',{'.jpg','.tif'});
pxds = pixelLabelDatastore(save_dir,{'nothing','something'},[0 255]);
save('images.mat','imds', 'pxds')
clear
load('images.mat') % gives you the variables "imds" and "pxds" directly -> might overwrite previous variables
tmp = load('images.mat'); % saves all variables in a struct, access it via tmp.imds and tmp.pxds
If you only want to select the variables you want to load use:
load('images.mat','imds') % loads "imds" variable
load('images.mat','pxds') % loads "pxds" variable
load('images.mat','imds','pxds') % loads both variables
EDIT
Now I get the problem, but I fear this is not how it is going to work. The idea behind the Datastore objects is that they are used when the data as a whole is too big to fit in memory, but every little piece is small enough to fit. You can then use the Datastore object to easily process and read multiple files on disk.
This means for you: simply save your images not as one big *.mat file but as multiple small *.mat files that each contain only one image.
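A rough sketch of that splitting step, reusing the example variables from the question:
s = load('example.mat');                            % contains template, template2, region_of_interests
for in = 1:numel(s.template)
    img  = cat(3, s.template{in}, s.template2{in}); % 2-band input image
    mask = s.region_of_interests{in};               % corresponding label mask
    save(sprintf('image_%02d.mat', in), 'img', 'mask');
end
Each small file then holds exactly one image, which is what the datastore objects expect.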
EDIT 2
Is it strictly necessary to use an imageDatastore for this task? If not, you can use something like the following:
image_dir = pwd;
matFiles = dir(fullfile(image_dir, '*.mat'));
for i=1:length(matFiles)
data = load(matFiles(i).name);
img = convertMatToImage(data); % write custom function which converts the mat input to your image
% or something like this:
% for j=1:4
% img(:,:,j) = cat(3,template{j},template2{j});
% end
% process image
end
Another alternative would be to create an "image" in your matreader that does not only have 2 bands, but instead stacks all bands (all templates) on top of each other into one "datacube". Then, in a second step, after iterating over all the small mat files and reading them, you split the single images back out of the one bigger datacube.
It would look something like this:
function data=matreader(filename)
load(filename); % loads template and template2
data = [];
for in=1:4
    data = cat(3, data, template{in}, template2{in}); % stack all 8 bands into one datacube
end
end
In your main file, you then simply have to split the data back into 4 pieces, as sketched below.
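A rough sketch of that splitting step, assuming the matreader above stacks the bands in pairs:
cube = read(imds);                       % 8-band datacube returned by matreader
imgs = cell(1, 4);
for in = 1:4
    imgs{in} = cube(:, :, 2*in-1:2*in);  % recover the in-th 2-band image
end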
I have never tested it but maybe it is possible to return a cell instead of a matrix?
function data=matreader(filename)
load(filename);
data = cell(1,4);
for in=1:4
data{in}=cat(3,template{in},template2{in});
end
end
Not sure if this would work.
However, the right way to go forward from here really depends on how you plan to use the images from imds and whether it is really necessary to use an imageDatastore.

MATLAB: making a histogram plot from csv files read and put into cells?

Unfortunately I am not too tech proficient and only have a basic MATLAB/programming background...
I have several csv data files in a folder, and would like to make a histogram plot of all of them simultaneously in order to compare them. I am not sure how to go about doing this. Some digging online gave a script:
d=dir('*.csv'); % return the list of csv files
for i=1:length(d)
m{i}=csvread(d(i).name); % put into cell array
end
The problem is I cannot now simply write histogram(m(i)) command, because m(i) is a cell type not a csv file type (I'm not sure I'm using this terminology correctly, but MATLAB definitely isn't accepting the former).
I am not quite sure how to proceed. In fact, I am not sure what exactly is the nature of the elements m(i) and what I can/cannot do with them. The histogram command wants a matrix input, so presumably I would need a 'vector of matrices' and a command which plots each of the vector elements (i.e. matrices) on a separate plot. I would have about 14 altogether, which is quite a lot and would take a long time to load, but I am not sure how to proceed more efficiently.
Generalizing the question:
I will later be writing a script to reduce the noise and smooth out the data in the csv file, and binarise it (the csv files are for noisy images with vague shapes, and I want to distinguish these shapes by setting a cut-off for the pixel intensity/value in the csv matrix, so as to create a binary image showing these shapes). Ideally, I would like to apply this to all of the images in my folder at once so I can sift out which images are best for analysis. So my question is, how can I run a script on all of the csv files in my folder so that I can compare them all at once? I presume whatever technique I use for the histogram plots can apply to this too, but I am not sure.
It should probably be better to write a script which:
-makes a histogram plot and/or runs the binarising script for each csv file in the folder
-and puts all of the images into a new, designated folder, so I can sift through these.
I would greatly appreciate pointers on how to do this. As I mentioned, I am quite new to programming and am getting overwhelmed when looking at suggestions, seeing various different commands used to apparently achieve the same thing- reading several files at once.
The function csvread natively returns a matrix. I am not sure, but it is possible that if some elements inside the csv file are not numbers, MATLAB automatically makes a cell array out of the output. Since I don't know the structure of your csv files, I recommend trying out some similar functions (readtable, xlsread):
M = readtable(d(i).name) % Reads table like data, most recommended
M = xlsread(d(i).name) % Excel like structures, but works also on similar data
Try them out and let me know if it worked. If not, please upload a file sample.
The function csvread(filename)
always returns a numerical matrix M and will never return a cell array.
If you have textual data inside the .csv file, it will give you an error, since it expects numerical data only. The only reason I can see for using a cell array when reading the files is if the dimensions of the individual matrices read from each file differ, for example the first .csv file contains data organised as 3xA and the second .csv file contains data organised as 2xB, so you can place them all into a single structure.
However, it is still possible to use histogram on a cell array, by extracting an element as an array instead of extracting it as a cell element.
If M is a cell array, there are two options for extracting the data:
M(i) and M{i}. M(i) gives you the cell element and cannot be used for histogram, whereas M{i} returns the element in its initial form, which is a numerical matrix.
TL;DR use histogram(M{i}) instead of histogram(M(i)).
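Putting that together with the loop from the question, a minimal sketch for comparing all files in one figure could look like this (the 4-by-4 subplot grid is an assumption based on the roughly 14 files mentioned):
d = dir('*.csv');                    % list of csv files in the current folder
m = cell(1, numel(d));
figure
for i = 1:numel(d)
    m{i} = csvread(d(i).name);       % numeric matrix per file
    subplot(4, 4, i)                 % adjust the grid to the number of files
    histogram(m{i}(:))               % (:) flattens the matrix into one vector of values
    title(d(i).name, 'Interpreter', 'none')
end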

How can I increase the speed of this xlsread for loop?

I have made a script containing a for loop that selects columns from 533 different Excel files and places them into matrices so that they can be compared. However, the process is taking too long (it ran for 3 hours yesterday and wasn't even halfway through!!).
I know xlsread is naturally slow, but does anyone know how I can make my script run faster? The script is below, thanks!!
%Split the data into g's and h's
CRNum = 533; %Number of Carrington Rotation files
A(:,1) = xlsread('CR1643.xlsx','A:A'); % Set harmonic coefficient columns
A(:,2) = xlsread('CR1643.xlsx','B:B');
B(:,1) = xlsread('CR1643.xlsx','A:A');
B(:,2) = xlsread('CR1643.xlsx','B:B');
for k = 1:CRNum
textFileName = ['CR' num2str(k+1642) '.xlsx'];
A(:,k+2) = xlsread(textFileName,'C:C'); %for g
B(:,k+2) = xlsread(textFileName,'D:D'); %for h
end
Don't use xlsread if you want to go through a loop, because it opens and then closes an Excel server each time you call it, which is time consuming. Instead, before the loop, use actxserver to open Excel, do what you want, and finally close the actxserver after your loop. For a good example of using actxserver, search for "Read Spreadsheet Data Using Excel as Automation Server" in the MATLAB help.
Also take a look at readtable, which works faster than xlsread but returns a table instead.
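If the sheets contain plain numeric columns with no header row (an assumption here), the readtable version of the loop could look roughly like this:
CRNum = 533;
for k = 1:CRNum
    fileName = ['CR' num2str(k+1642) '.xlsx'];
    T = readtable(fileName, 'ReadVariableNames', false);  % one read per file
    A(:,k+2) = T{:,3};               % column C (the g coefficients)
    B(:,k+2) = T{:,4};               % column D (the h coefficients)
end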
The most obvious improvement seems to be to load the files only partially, if possible. However, if that is not an option, try whether it helps to open each file only once (read everything you need, then assign it):
M = xlsread(textFileName,'C:D'); % one read per file
A(:,k+2) = M(:,1); B(:,k+2) = M(:,2);
Also check how much you are reading in each time: if you read in many rows from the first file, you make the first dimension of A big, and then you have to fill it each time you read a file.
As an extra: a small but simple improvement can be found at the start. Don't use four xlsread calls, but use one and then assign the variables based on the result.
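For instance, a small sketch of that first read (keeping the original layout, where A and B both start from columns A and B):
init = xlsread('CR1643.xlsx','A:B'); % one call instead of four
A = init;
B = init;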
As mentioned in this post, the easiest thing to change would be to read in basic import mode, which skips things like formulas and macros in Excel and lets you read a simple table more quickly. For example, you can use:
xlsread('CR1643.xlsx',1,'A:A','basic')
This resulted in a decrease in load time from about 22 seconds to about 1 second for me when I tested it on an 11,000-by-7 Excel sheet.

importing data with a very complex structure into matlab

I have some data files that I would like to load into MATLAB. Unfortunately, they have a quite complex structure - at least compared to what I am used to. You should be able to download an old example of this here: https://www.dropbox.com/s/vbh6kl334c5zg1s/fn1_2.out (it opens fine in Notepad or WordPad).
They are data files based on synchrotron data in which the raw data, the regularized "raw" data and the (indirect) Fourier transformed data plus the fit to the data are all listed. There are furthermore some statistics from the Fourier transformation.
I just need to quote the results from the statistics in my paper, so while it would be nice to plot some of the results, it is not strictly necessary. I do, however, need the raw and regularized data together with the fit, and the Fourier transformed data.
My problem
In the data file, the results from the statistical analysis are shown before the data I need, but the size of the columns from the statistical analysis varies from data file to data file. This means that I cannot just include the statistics in the header unless I manually change the number of header lines for each file I import. I need to analyse groups of 5 data files together, and I would need to analyse around 30 files this time, so I would like to avoid that if possible. In the future I will again need to load this kind of data files - so even if changing the number of header lines 30 times does not sound bad, it would be nice to be able to do it automatically.
Possible solution
Both the raw and regularized data together with the fit, as well as the Fourier transformed data, are preceded by a specific line that tells me that after this line and a blank/empty line, the data begins.
So I thought that maybe I could use regular expressions to tell MATLAB to ignore everything until it sees this specific line, ignore this line and one more, and then import the data.
I googled and found this topic where regular expressions are used: Trying to parse a fairly complex text file
But I am new to regular expressions and the code suggested is a bit complex for me. I can gather that he uses named capture, but I am not quite sure I understand how he uses it and whether I can adapt it to my needs. I have checked the official MATLAB documentation, but their examples are somewhat simpler :) (http://www.mathworks.se/help/matlab/matlab_prog/regular-expressions.html#bqm94nz-1)
Sorry for writing such a long post. Any suggestions on how to proceed with this problem will be greatly appreciated.
/Martin
EDIT
The code I have used, based on the link in the comment:
fileName = 'data.dat';
inputfile = fopen(fileName);
% Ignore all until we see one that just consists of this:
startString = ' R P(R) ERROR';
mydata = [];
while 1
tline = fgetl(inputfile);
% Break if we hit end of file, or the start marker
if ~ischar(tline) || strcmp(tline, startString)
break
end
data = sscanf(tline, '%f', 3 );
mydata(end+1,:) = data;
end
fclose(inputfile);
When I run the code I get the error:
Subscripted assignment dimension mismatch.
mydata(end+1,:) = data;
Any suggestions will be greatly appreciated, and my apologies for the strange layout / leaving the link in the comment. I am not allowed to include more than two links in a post and I cannot add a new answer yet - both due to my low rep :)
Since the blocks are separated by at least two new lines, you can use that to split the text into blocks and analyse them individually. Try this code:
fileH = fopen('fn1_2.out');
content = fscanf(fileH, '%c', inf);
fclose(fileH);
splitstring = regexp(content, '\r\n\r\n', 'split');
blocks = regexp(splitstring, '\d\.\d{4}.*\r\n.*\d\.\d{4}','match');
numericBlocksIdx = find(cellfun(@(x) ~isempty(x), blocks));
numericBlocks = splitstring(numericBlocksIdx);
Now numericBlocks{1}, numericBlocks{2}, ... contain the tables that you are interested in. Note that for some tables the headers are also included, because they are not separated by two new lines. From here you can use functions like textscan to read the data into matrices.
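For example, a small sketch converting one block into a numeric matrix (this assumes the block starts directly with the numbers and has three columns, as in the R / P(R) / ERROR table; blocks that still carry a header line would need it stripped first, e.g. with textscan's 'HeaderLines' option):
vals = textscan(numericBlocks{1}, '%f %f %f');  % three columns: R, P(R), ERROR
tbl  = [vals{1}, vals{2}, vals{3}];             % N-by-3 numeric matrix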

MATLAB Saving and Loading Feature Vectors

I am trying to load feature vectors into classifiers such as a k-nearest neighbors classifier.
I have my code for GLCM, so I get contrast, correlation, energy, homogeneity in numbers (feature vectors).
My question is, how can I save every set of feature vectors from all the training images? I have seen somewhere that people had a .set file to load into classifiers (maybe it is a special case for the particular classifier toolbox).
load 'mydata.set';
for example.
I suppose it does not have to be a .set file.
I'd just need a way to store all the feature vectors from all the training images in a separate file that can be loaded.
I've googled and found this, which may be useful, but I am not entirely sure.
Thanks for your time and help in advance.
Regards.
If you arrange your feature vectors as the columns of an array called X, then just issue the command
save('some_description.mat','X');
Alternatively, if you want the save file to be readable, say in ASCII, then just use this instead:
save('some_description.txt', 'X', '-ASCII');
Later, when you want to re-use the data, just say
var = {'X'}; % <-- You can modify this if you want to load multiple variables.
load('some_description.mat', var{:});
load('some_description.txt', var{:}); % <-- Use this if you saved to .txt file.
Then the variable named 'X' will be loaded into the workspace and its columns will be the same feature vectors you computed before.
You will want to replace the some_description part of each file name above and instead use something that allows you to easily identify which data set's feature vectors are saved in the file (if you have multiple data sets). Your array of feature vectors may also be called something besides X, so you can change the name accordingly.
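For example, a minimal sketch of building such an X from GLCM features with graycomatrix/graycoprops (the *.png file pattern and the use of these two functions are assumptions, since your own GLCM code is not shown):
imgFiles = dir('*.png');                         % training images (placeholder pattern)
X = zeros(4, numel(imgFiles));                   % one 4-element feature vector per column
for k = 1:numel(imgFiles)
    I = imread(imgFiles(k).name);
    if size(I, 3) == 3, I = rgb2gray(I); end     % GLCM needs a grayscale image
    stats = graycoprops(graycomatrix(I));        % Contrast, Correlation, Energy, Homogeneity
    X(:, k) = [stats.Contrast; stats.Correlation; stats.Energy; stats.Homogeneity];
end
save('training_features.mat', 'X');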