Multiple-index integer linear program - MATLAB

I have a bounded integer linear program with more than 1000 decision variables. The decision variable has five indices, x_{ijkzf}, and the constraints involve one, two, or more summations. Is there any easy trick for extracting, constructing and preparing the coefficient matrix? It is not easy to build the matrices that the integer linear programming function expects.

This can be done conveniently in GAMS, which can read data from Excel and handles multi-dimensional matrices easily. Using the Table statement you can import a table into GAMS, and you can also read directly from an Excel file. For a multidimensional table, the indices can be laid out hierarchically in the Excel sheet. For example, for A(i,j,k), put the i values in the first row of the sheet; under the cell i=1 enter A(1,j,k) as a 2-D matrix, under i=2 enter A(2,j,k), and so on.
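For completeness, if you want to stay in MATLAB, one common trick is to flatten the five indices into a single linear index with sub2ind, so that x_{ijkzf} becomes one long decision vector for intlinprog. A minimal sketch, with assumed dimensions and one illustrative constraint family (not your actual model):
% Assumed dimensions of the five indices (replace with your own)
nI = 5; nJ = 4; nK = 3; nZ = 2; nF = 2;
nVars = nI*nJ*nK*nZ*nF;                    % one variable per (i,j,k,z,f)
% Map (i,j,k,z,f) to a single column of the coefficient matrix
idx = @(i,j,k,z,f) sub2ind([nI nJ nK nZ nF], i, j, k, z, f);
% Illustrative constraints: sum over j,k of x_{ijkzf} <= 10 for every (i,z,f)
nCons = nI*nZ*nF;
A = sparse(nCons, nVars);
b = 10*ones(nCons, 1);                     % assumed right-hand side
r = 0;
for i = 1:nI
    for z = 1:nZ
        for f = 1:nF
            r = r + 1;
            for j = 1:nJ
                for k = 1:nK
                    A(r, idx(i,j,k,z,f)) = 1;   % coefficient of x_{ijkzf}
                end
            end
        end
    end
end
% x = intlinprog(fObj, 1:nVars, A, b) then optimises the flattened vector,
% and reshape(x, [nI nJ nK nZ nF]) recovers the five-index form.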

Related

Simulink: Vector summation and saving the output to workspace

I cannot solve a very simple problem in Simulink: summing two equal-sized vectors and writing the result to the MATLAB workspace.
This trivial operation takes one line in MATLAB but seems to be a real problem in Simulink.
I have two vectors of the same size, e.g. 10x1, and I want the summation result in the workspace with the same size (10x1).
I have already used the 'Sum' block for this, and even my own function with element-wise summation, but I think the problem is that the Simulink 'To Workspace' block always concatenates outputs along either the 1st or the 3rd dimension, so the size of the output does not match the size of the inputs.
I cannot find any solution on the web and would really appreciate your help!
I didn't notice that the vectors are saved column-wise by the 'To Workspace' block. Did you try adding "(:)" in your code to get the result as a single column?
As far as I know, storing the data in columns (10x1) is faster than in rows (1x10). Maybe that is the reason you get columns instead of rows.
https://www.mathworks.com/matlabcentral/answers/216512-which-is-faster-a-row-vector-or-a-column-vector-can-anyone-answer-me-please
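One workaround is to accept whatever shape the 'To Workspace' block produces and reshape it back in the workspace after the run. A hedged sketch, assuming the logged variable is named simout and comes back with the 10 elements along one dimension and time along a third:
% squeeze drops the singleton dimension left by the logging format
v = squeeze(simout);         % e.g. 10x1xN (or 1x10xN) becomes 10xN
lastSum = v(:, end);         % the 10x1 summation result at the final time step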

How to extract columns with a for loop?

Question
If there is an MxN matrix, how can I extract all of the data column by column?
I am interested in passing each column to a function and plotting it.
If A(:) is used, the whole matrix is merged into a single column (I remember that is what this command is for), but that does not help me here.
Matlab arrays use the indexing partOfArray = array(rows, columns). The variables rows and columns can each be a vector (including a scalar, which is a vector of length 1) or :, which Matlab interprets as 'all' (e.g. array(:,columns) is all rows of the selected columns).
Alternatively, Matlab also allows linear indexing, in which array(aNumber) counts from array(1,1) down the rows of each column before moving to the next column (column-major order). For example, if array is 2x4, array(5) is equivalent to array(1,3). When you call A(:), Matlab interprets that as linear indexing over all elements of the array, hence merging the matrix into a single column.
To access each column vector in a for loop, in this case printing it out, use:
A = magic(4)
numColumnsInA = size(A,2);
for i=1:numColumnsInA
disp(A(:,i))
end
You can find more information about indexing in Matlab here: Array Indexing
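To go a step further and actually pass each column to a plotting function, as the question asks, the same loop pattern works. A minimal sketch:
A = magic(4);
figure; hold on
for k = 1:size(A, 2)
    plot(A(:, k))            % each column drawn as its own line
end
hold off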

How to apply tsne() to MATLAB tabular data?

I have a 33000 x 1975 table in MATLAB, which obviously requires dimensionality reduction before any further analysis. The 1975 columns are the features and the rows are instances of the data. I tried using the tsne() function on the MATLAB table, but it seems tsne() only works on numeric arrays. So the question is whether there is a way to apply tsne to my MATLAB table. The table contains both numeric and string data types, so table2array() does not work for converting the table to a numeric array.
Moreover, it seems from the MathWorks documentation, using the fisheriris dataset as an example, that tsne() takes the feature columns as its argument. So I would need to separate the predictors from the responses, which shouldn't be a problem. But it is initially confusing how to proceed with tsne. Any suggestions in this regard would be highly appreciated.
You can probably use table indexing using {} to get out the data that you want. Here's a simple example adapted from the tsne reference page:
load fisheriris
% Make a table where the first variable is the species name,
% and the other variables are the measurements
data = table(species, meas(:,1), meas(:,2), meas(:,3), meas(:,4))
% Use {} indexing on 'data' to extract a numeric matrix, then
% call 'tsne' on that
Y = tsne(data{:, 2:end});
% plot as per example.
gscatter(Y(:,1),Y(:,2),data.species)
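If you don't want to enumerate the numeric columns by hand (the real table has 1975 of them), vartype can pick them out for you. A hedged sketch, reusing the 'data' table from above:
X = data{:, vartype('numeric')};   % extract only the numeric variables as a matrix
Y = tsne(X);
gscatter(Y(:,1), Y(:,2), data.species)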

Getting Values from Excel Table on Matlab

I've been trying to simulate a chemical reactor in MATLAB. For this to work I need to get some values from Excel tables of thermodynamic properties. Each table is for one chemical species and the first row is always the temperature.
I would like to input a temperature so that MATLAB searches the temperature values for it and returns, for example, the associated enthalpy.
I used xlsread to load the tables as matrices, but I am having trouble telling MATLAB what to do next.
I've been told to try
[row, ~] = find(temp<=data(:,2)); %temp is the given temperature
but this just returns
row = 1, 2, ...
I would also like to be able to handle intermediate temperatures that are not explicit in the table, it could be some kind of interpolation or just getting the nearest value.
Any help will be truly appreciated.
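A hedged sketch of the usual interp1-based lookup, assuming (purely for illustration) that xlsread returns the temperatures in column 1 and the enthalpies in column 2 of a hypothetical file:
data = xlsread('species.xlsx');        % hypothetical file name
T = data(:, 1);                        % temperature column (assumed)
H = data(:, 2);                        % enthalpy column (assumed)
temp = 350;                            % query temperature, same units as the table
h = interp1(T, H, temp, 'linear');     % interpolates between tabulated rows
% use 'nearest' instead of 'linear' to get the closest tabulated value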

Visualizing a large matrix in matlab

I have a huge sparse matrix (1,000 x 1,000,000) that I cannot load in MATLAB (not enough RAM).
I want to visualize this matrix to get an idea of its sparsity and of the differences between the values.
Because of the memory constraints, I want to proceed as follows:
1- Divide the matrix into 4 matrices
2- Load each matrix in MATLAB and visualize it so that the colours give an idea of the values (and of the zeros in particular)
3- "Stick" the 4 images together to get a global picture of the original matrix
(i) Is it possible to load "part of a matrix" in MATLAB?
(ii) For the visualization tool, I read about spy (and daspect). However, this function only shows the non-zero locations, regardless of their values. Is there a way to add a colour code?
(iii) How can I "stick" the plots together to make one?
If your matrix is sparse, then the current method of storing it (as a full matrix in a text file) seems very inefficient, and certainly makes loading it into MATLAB very hard. However, I suspect that as long as it is sparse enough, it can still be loaded into MATLAB as a sparse matrix.
The traditional way of doing this would be to load it all in at once, then convert to sparse representation. In your case, however, it would make sense to read in the text file, one line at a time, and convert to a MATLAB sparse matrix on-the-fly.
You can find out if this is possible by estimating the sparsity of your matrix, and using this to see if the whole thing could be loaded into MATLAB's memory as a sparse matrix.
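As a rough way to make that estimate, for real double data MATLAB's sparse format needs about 16 bytes per non-zero (value plus row index) plus 8 bytes per column pointer; a back-of-the-envelope sketch with an assumed density:
% Rough memory estimate for a real double sparse matrix (64-bit MATLAB)
nRows = 1e3; nCols = 1e6;
density = 0.01;                                     % assumed fraction of non-zeros
bytes = 16*nRows*nCols*density + 8*(nCols + 1);     % values + row indices, plus column pointers
fprintf('Estimated sparse storage: %.1f MB\n', bytes/2^20)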
Try something like: (untested code!)
% initialise an empty sparse matrix (num_rows, num_cols assumed known)
sparse_matrix = sparse(num_rows, num_cols);
row_num = 1;
fid = fopen(filename);
% read each line of the text file in turn
while ~feof(fid)
    % sscanf on one line at a time avoids fscanf reading the whole file at once
    this_line = sscanf(fgetl(fid), '%f');
    % add row to sparse matrix (note transpose: sscanf returns a column vector)
    sparse_matrix(row_num, :) = this_line';
    row_num = row_num + 1;
end
fclose(fid);
% visualise using spy
spy(sparse_matrix)
Visualisation
With regards to visualisation: visualising a sparse matrix like this via a tool like imagesc is possible, but I believe it may internally create the full matrix – maybe someone can confirm if this is true or not. If it does, then it's going to cause you memory problems.
All spy is really doing is plotting in 2D the locations of the non-zero elements. You can fairly easily write your own spy function, which can have different coloured or sized points depending on the values at each location. See this answer for some examples.
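As a hedged sketch of such a hand-rolled spy: find returns the non-zero triplets without densifying the matrix, and scatter can colour each point by its value:
[r, c, v] = find(sparse_matrix);      % rows, columns and values of the non-zeros
scatter(c, r, 4, v, 'filled')         % one small dot per non-zero, coloured by value
axis ij                               % put row 1 at the top, matching spy's orientation
colorbar                              % show the value-to-colour mapping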
Saving sparse matrices
As I say above, the way your matrix is saved is pretty inefficient – for a matrix where only 10% of the entries are non-zero, around 95% of your text file will be zeros or spaces. I don't know where this data has come from, but if you have any control over its creation (e.g. it comes from another program you have written) it would make much more sense to save only the non-zero elements in the format row_idx, col_idx, value.
You can then use spconvert to import the sparse matrix directly.
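A minimal sketch, assuming the triplets are saved in a plain-text file with one 'row col value' line per non-zero:
D = load('nonzeros.txt');   % hypothetical file of [row col value] triplets
S = spconvert(D);           % build the sparse matrix directly from the triplets
spy(S)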
One of the simplest methods (if you can actually store the full sparse matrix in RAM) is to use gnuplot to visualize the sparsity pattern.
I was able to 'spy' matrices of 10-20 GB using gnuplot without problems. But make sure you use the png or jpeg formats for the output image. Note that you don't need the values of the non-zero entries, only the integer pairs (row, col). Then plot them with: plot "row_col.dat" using 1:2 with points.
This takes the rows as the x axis and the columns as the y axis and plots the non-zero entries. It is very easy to do, and it is the most scalable solution I know. gnuplot works at decent speed even for very large datasets (>10 GB of [row, col] pairs), while MATLAB just hangs (with due respect).
I use imagesc() to visualise arrays. It scales the values in the array onto the full range of the current colormap, then plots the array like a bitmap image (and of course you can change the colormap to make it easier to see detail).
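A minimal usage sketch:
imagesc(A)            % map the range of values in A onto the current colormap
colorbar              % show which colour corresponds to which value
colormap(parula)      % parula is the default; any colormap works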