Efficiently handle nonlinear models in Matlab (not Simulink)

I wanted to ask if there is a framework for nonlinear models in Matlab, similar to the one for linear dynamical systems (ss, lsim, connect, etc.). I need to create some examples for a control theory lecture and would like to compare linear and nonlinear systems.
For example, I've got a nonlinear model of a simple inverted pendulum, plus functions for the Jacobians of the dynamics and the output map. I then implement some controllers (state feedback, LQR, ...) at a stationary point and compare the linear and nonlinear closed loops (using ODE solvers like ode15s). I know how to do this for a single model, but I'd like to be able to switch models and stationary points easily. I already made some simple data structures and functions for automated plotting, controller synthesis and interconnections.
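For context, a stripped-down version of what I currently do for a single model looks roughly like this (the pendulum parameters, weights and names are just placeholders):

    % Pendulum about the upright equilibrium, state x = [theta; omega].
    g = 9.81; l = 1; d = 0.1;                          % made-up parameters
    f  = @(x, u) [x(2); g/l*sin(x(1)) - d*x(2) + u];   % nonlinear dynamics
    xs = [0; 0]; us = 0;                               % stationary point
    A  = [0, 1; g/l*cos(xs(1)), -d];                   % Jacobian at (xs, us)
    B  = [0; 1];
    K  = lqr(A, B, eye(2), 1);                         % Control System Toolbox
    % Nonlinear closed loop under the linear feedback u = us - K*(x - xs):
    fcl = @(t, x) f(x, us - K*(x - xs));
    [t, x] = ode15s(fcl, [0, 10], xs + [0.5; 0]);
    plot(t, x(:, 1))                 % compare against lsim of (A - B*K)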
However, this took me quite some time, and it's not easily adapted to other models with possibly different dimensions. So I wondered if there is an already implemented framework similar to what exists for linear systems, with functions like lsim for computing responses or connect for creating interconnections, where you can give names to inputs and outputs, etc.
I'm happy about any hints on how to work with nonlinear models more efficiently :)

Related

Is there any software that implements the multiple-output Gaussian process?

I am trying to implement Bayesian optimization using Gaussian process regression, and I want to try a multiple-output GP first.
There are many packages that implement GPs, like the fitrgp function in MATLAB and the ooDACE toolbox.
But I didn't find any available package that implements the so-called multiple-output GP, that is, a Gaussian process model that predicts vector-valued functions.
So, is there any software implementing the multiple-output Gaussian process that I can use directly?
I am not sure my answer will help you, as you seem to be looking for Matlab libraries.
However, you can do co-kriging in R with gstat. See http://www.css.cornell.edu/faculty/dgr2/teach/R/R_ck.pdf or https://github.com/cran/gstat/blob/master/demo/cokriging.R for more details about usage.
The lack of tools for cokriging is partly due to the relative difficulty of using it. You need more assumptions than for simple kriging: in particular, you have to model the dependence between the cokriged outputs via a cross-covariance function (https://stsda.kaust.edu.sa/Documents/2012.AGS.JASA.pdf). The covariance matrix is much bigger, and you still need to make sure that it is positive definite, which can become quite hard depending on your covariance functions...
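If you do end up staying in Matlab, one crude baseline is to fit an independent GP per output with fitrgp; this ignores exactly the cross-covariances that cokriging models, but it is sometimes good enough (toy data below, purely for illustration):

    X = rand(100, 3);                          % toy inputs
    Y = [sin(X(:,1)), cos(X(:,2))];            % two outputs, fitted separately
    Xq = rand(10, 3);                          % query points
    Ypred = zeros(size(Xq, 1), size(Y, 2));
    for k = 1:size(Y, 2)                       % one independent GP per output
        mdl = fitrgp(X, Y(:, k), 'KernelFunction', 'squaredexponential');
        Ypred(:, k) = predict(mdl, Xq);
    end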

How to compare GAMS vs Matlab in optimization

Is it possible to use Matlab instead of GAMS for optimization problems? How do they compare? In other words, can every problem solved with GAMS be solved with some Matlab toolbox? And finally, what optimization tools are there in Matlab?
Matlab and GAMS are very different in how they approach modeling. GAMS is organized around the concept of equations (essentially, an optimization model is a collection of equations). This holds for LP, MIP, MINLP and other types of models. These equations largely resemble how you would write things down in math. Matlab views an optimization model (LP/MIP) as a matrix (or two matrices, depending on whether we deal with equalities or inequalities). You have to translate your constraints into these one or two matrices by populating them. Depending on the model this can be a difficult task. For structured models it is not so bad, but for large, complex models the GAMS approach is much more natural and convenient.
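To make the contrast concrete, here is how a toy LP has to be encoded in Matlab's matrix form with linprog (Optimization Toolbox); every constraint becomes a row of a matrix (the numbers are made up):

    % min f'*x  subject to  A*x <= b,  lb <= x
    f  = [-1; -2];                       % objective coefficients
    A  = [1, 1; 1, -1];                  % each inequality is one matrix row
    b  = [4; 2];
    lb = [0; 0];
    x  = linprog(f, A, b, [], [], lb);   % no equalities, no upper bounds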
NLP problems in GAMS are just like LPs: equation-based. GAMS uses automatic differentiation, so there is no need to write gradients, and it targets large-scale NLP problems. Matlab's NLP solvers instead work on user-supplied functions and are mostly suited to smaller problems; gradients are provided by the user.
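For illustration, a small Matlab NLP where the gradient has to be coded by hand (a toy Rosenbrock problem with bound constraints; in GAMS the derivatives would come for free):

    % Objective returning both the value and a hand-written gradient.
    rosen = @(x) deal( ...
        100*(x(2) - x(1)^2)^2 + (1 - x(1))^2, ...                         % f(x)
        [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)); 200*(x(2) - x(1)^2)]); % grad
    opts = optimoptions('fmincon', 'SpecifyObjectiveGradient', true);
    x = fmincon(rosen, [2; 2], [], [], [], [], [0; 0], [], [], opts);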
GAMS supports many solvers. MATLAB has an Optimization Toolbox, but its solvers are largely targeted at small and medium-sized models. Having said that, many state-of-the-art solvers have a Matlab interface (e.g. Cplex, Gurobi).
Not all solvers available under GAMS are directly callable from Matlab but many are (sometimes using external toolboxes).

Matlab versus simulation products such as ANSYS and COMSOL

This may be the wrong place to ask this, but I can't find a better place on the SE network.
I've briefly worked with both Matlab and Ansys, and from what I have learnt/can gather, Matlab is a programming environment that has functions that perform common math, visualization and analysis operations. You primarily write programs in a textual fashion (.m files) or use Simulink to generate flow graphs (model-based development). Ansys, on the other hand, is primarily a simulation environment where quite a lot can be done simply with the GUI (3D models, physics domains, configuration, display settings), and you can add equations at various points in the simulation engine in order to modify the simulation flow.
What I understand is cursory and only serves as an overview. Can anyone give me a suitable real-world comparison between Matlab and Ansys (or any other simulation product such as COMSOL) that would allow us to understand when to use which, and the weaknesses of each system?
I haven't used Ansys, but Ansys is often compared with Comsol, and I've used Comsol and Matlab for years.
Matlab:
A programming language and the environment that runs it, which means it can do anything that any other programming language can do. What are its highlights, compared to other languages?
Hundreds of built-in functions to work with matrices. For example, in one project I needed to do simple matrix algebra (add, multiply, scale matrices), and also needed singular value decomposition. SVD is not something you could write in 50 lines of code, so I needed a ready-made library. At the time I used a library for Java, and wrote my own code for representing matrices and doing matrix algebra on them. That's a few hundred lines of code. Had I used Matlab, it would have been about ten lines of code, because all of it is there; I would only have needed to type help svd to find out how to use it. However, if you don't need any of that, stay away from Matlab at all costs! There are much better languages that are free.
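For instance, the SVD-plus-matrix-algebra part of that project shrinks to a handful of built-ins (toy matrix, just to show the point):

    M = rand(5, 3);               % toy data matrix
    [U, S, V] = svd(M);           % singular value decomposition, built in
    Mrec = U * S * V';            % reconstruction; equals M up to round-off
    C = M' * M + 2 * eye(3);      % the matrix algebra is equally terse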
Great to use as a calculator that is always open on the desktop, and can do back-of-the-envelope style calculations.
Plotting graphs. Many academics recommend Matlab as the tool of choice for producing publication-quality graphics. These can be exported as PDF and imported into Inkscape for further editing. The best thing is that the commands for plotting a graph can be put into a script file, and parts of it changed later as needed, which saves a lot of work compared to manually drawing a graph (imagine you wanted to change the axes or the symbols used to present the data points).
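A minimal script in that spirit (data and labels are placeholders); rerunning it after editing any line regenerates the figure:

    t = linspace(0, 2*pi, 200);
    plot(t, sin(t), 'k-', t, cos(t), 'k--')
    xlabel('t'); ylabel('amplitude');
    legend('sin', 'cos', 'Location', 'best');
    print(gcf, '-dpdf', 'figure1.pdf')    % export, e.g. for Inkscape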
Personally, I also use it for curve-fitting. It has many toolboxes, one of which is a neat tool that allows me to find equations that model a set of data points.
Comsol:
Specialised tool for solving partial differential equations (PDEs) on complicated domains using the finite element method (FEM). This might sound obscure, but many real-world engineering needs reduce to this. Such things as:
Finding loads, stresses and strains in civil engineering structures with complicated real-world geometry (what happens when there is gusty wind blowing onto a building or bridge?)
How do currents flow in particular conductive objects?
Chemical reactions in various industrial reactors.
What is the power efficiency of a generator (magnet spinning in coil) design?
How to place aircon outlets in a nontrivially-shaped room to achieve both good temperature distribution and good efficiency?
Comsol, like any other FEM tool that can work with arbitrary equations, can do multiphysics, which means, for example, that one could solve for the chemistry of a battery as well as the temperature and pressure, and for how those feed back into the chemical reaction (speeding it up or slowing it down). Unlike tools where you must supply all the equations yourself, Comsol already contains most of what is needed to solve typical problems; the physics just need to be selected and applied to the geometry, which is also built inside Comsol. Equations of arbitrary description can still be introduced on top.
The mathematical descriptions of how these physical systems behave are the PDEs.
Once Comsol has finished solving a problem, the data can be exported for post-processing into Matlab, which has much more versatile tools for manipulating data and making various plots.

Essential philosophy behind Support Vector Machine

I am studying Support Vector Machines (SVM) by reading a lot of material. However, it seems that most of it focuses on how to classify the input 2D data by mapping it using several kernels such as linear, polynomial, RBF / Gaussian, etc.
My first question is, can SVM handle high-dimensional (n-D) input data?
According to what I found, the answer is YES!
If my understanding is correct:
(1) the n-D input data is constructed in a Hilbert space;
(2) that data is then simplified by some approach (such as PCA?) that combines it / projects it back onto a 2D plane, so that
(3) the kernel methods can map it into an appropriate shape, such that a line or curve can separate it into distinct groups.
This means most of the guides/tutorials focus on step (3). But some toolboxes I've checked cannot plot the input data if it has more than 2 dimensions. How can the data be projected to 2D afterwards?
If there is no projection of data, how can they classify it?
My second question is: is my understanding correct?
My first question is, can SVM handle high-dimensional (n-D) input data?
Yes. I have dealt with data where n > 2500 when using LIBSVM software: http://www.csie.ntu.edu.tw/~cjlin/libsvm/. I used linear and RBF kernels.
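If you'd rather stay inside Matlab than use LIBSVM, roughly the same kind of experiment looks like this with fitcsvm (Statistics and Machine Learning Toolbox; the data here is synthetic). Note that nothing in it ever projects to 2-D:

    X = randn(200, 500);                        % 200 samples, 500 features
    y = sign(X(:, 1) + 0.1*randn(200, 1));      % labels driven by one feature
    mdl = fitcsvm(X, y, 'KernelFunction', 'rbf', 'KernelScale', 'auto');
    cv  = crossval(mdl, 'KFold', 5);
    err = kfoldLoss(cv)                          % cross-validated error rate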
My second question is, is my understanding correct?
I'm not entirely sure on what you mean here, so I'll try to comment on what you said most recently. I believe your intuition is generally correct. Data is "constructed" in some n-dimensional space, and a hyperplane of dimension n-1 is used to classify the data into two groups. However, by using kernel methods, it's possible to generate this information using linear methods and not consume all the memory of your computer.
I'm not sure if you've seen this already, but if you haven't, you may be interested in some of the information in this paper: http://pyml.sourceforge.net/doc/howto.pdf. I've copied and pasted a part of the text that may appeal to your thoughts:
A kernel method is an algorithm that depends on the data only through dot-products. When this is the case, the dot product can be replaced by a kernel function which computes a dot product in some possibly high dimensional feature space. This has two advantages: First, the ability to generate non-linear decision boundaries using methods designed for linear classifiers. Second, the use of kernel functions allows the user to apply a classifier to data that have no obvious fixed-dimensional vector space representation. The prime example of such data in bioinformatics are sequences, either DNA or protein, and protein structure.
It would also help if you could explain what "guides" you are referring to. I don't think I've ever had to project data onto a 2-D plane before, and it doesn't make sense to do so anyway for data with a ridiculous number of dimensions (or "features", as they are called in LIBSVM). Using suitable kernel methods should be enough to classify such data.

Fitting two-dimensional curves in Matlab

There's a function in the Curve Fitting Toolbox called cftool that lets you fit curves to 1-D data. Is there anything for 2-D data?
Jerry suggested two very good choices. There are other options though, if you want a more formulaic form for the model.
The Curve Fitting Toolbox, in the current version, allows you to fit surfaces to data, not just curves.
Or fit a 2-D polynomial model, using a tool like polyfitn.
Or you can use nonlinear regression, if you have a model in mind. The Optimization Toolbox will help you there, with lsqnonlin or lsqcurvefit, either of which can fit 2-D (or higher) models (see the sketch after this answer). Or, if you have the stats toolbox, then try nlinfit.
Perhaps you might like a tool to fit Radial Basis Functions.
Neural nets are another way to fit data, so use the Neural Network Toolbox.
So there are many ways to model surfaces, depending on your interests, your knowledge of a likely form for the model, and what toolboxes you have or might choose to download. A very big factor in your model choice is your goal for the model. What will you do with it? How will it be used?
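As a sketch of the nonlinear-regression route mentioned above (the model form, data, and starting values are all made up), lsqcurvefit handles 2-D inputs by passing both coordinates as columns of xdata:

    % Fit z = p1 * exp(-((x - p2)^2 + (y - p3)^2) / p4) to noisy samples.
    model = @(p, xy) p(1) * exp(-((xy(:,1) - p(2)).^2 + (xy(:,2) - p(3)).^2) / p(4));
    [xg, yg] = meshgrid(linspace(-2, 2, 20));
    xy = [xg(:), yg(:)];                                 % each row is one (x, y) point
    z  = model([1, 0.3, -0.2, 0.5], xy) + 0.05*randn(size(xy, 1), 1);
    p  = lsqcurvefit(model, [0.5, 0, 0, 1], xy, z);      % fitted parameters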
You seem to be looking for griddata. You might also want to look at gridfit.
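For completeness, a minimal griddata call (toy scattered samples, interpolated onto a regular grid):

    x = rand(100, 1); y = rand(100, 1);
    v = sin(2*pi*x) .* cos(2*pi*y);            % toy scattered samples
    [xq, yq] = meshgrid(linspace(0, 1, 50));
    vq = griddata(x, y, v, xq, yq);            % interpolate onto the grid
    surf(xq, yq, vq)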