When using a genetic algorithm to find the global minimum of an objective function, the algorithm returns only the best result (the smallest value of the objective function).
Is it possible for the genetic algorithm to return multiple good local minima that it found? In my problem, I would like to get roughly the best 100 results and sort them by the value of the objective function.
I couldn't find this option on the MathWorks documentation site.
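One possible direction (a sketch, not a documented "best 100" option): ga can return its final population and the corresponding scores as extra output arguments, which can then be sorted by objective value. Note that this yields the best individuals of the final generation, not necessarily 100 distinct local minima; objfun and nvars are placeholders:

    % Sketch: recover ga's final population and scores, then keep the
    % best candidates sorted by objective value. objfun and nvars are
    % placeholders for your problem.
    [x, fval, exitflag, output, population, scores] = ga(@objfun, nvars);
    [sortedScores, order] = sort(scores);      % ascending objective values
    nKeep = min(100, numel(sortedScores));     % population may be smaller
    bestPoints = population(order(1:nKeep), :);
    bestValues = sortedScores(1:nKeep);

Increasing PopulationSize in the ga options would make at least 100 candidates available to sort.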
I am performing a numerical optimization where I try to find the parameters of a statistical model that best match certain moments of the data. There are 6 parameters in total that I need to find. I have written a MATLAB function that takes the parameters as input and returns the sum of squared deviations from the empirical moments as output. I use the fminsearch function to find the parameters, and it gives me a solution.
However, I am unsure whether this is really a global minimum. What checks could I do to ensure the numerical solution is correct? Plotting the function is challenging due to the high dimensionality. Any general advice on solving this type of problem is also appreciated.
You are describing the difficulties of a global optimization problem.
As mentioned in one of the comments, fminsearch() and the related function fminunc() will return a local minimum. They provide no guarantee that you will get a global minimum.
A simple way to check whether the answer you get really is a global minimum is to run the optimization multiple times from various starting points. If all runs converge to the same value, it might be the global minimum. If some run finds an answer with a lower error value, then the previous answer was not the global minimum.
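For example, a minimal multi-start sketch (objfun is a placeholder for your sum-of-squared-deviations function of a 1-by-6 parameter vector; the starting-point scale is an assumption to adapt to your problem):

    % Multi-start sanity check: run fminsearch from several random
    % starting points and keep the best result found.
    nStarts = 50;
    nParams = 6;
    bestFval = Inf;
    for k = 1:nStarts
        x0 = 10*randn(1, nParams);           % random start (adjust scale)
        [x, fval] = fminsearch(@objfun, x0);
        if fval < bestFval
            bestFval = fval;                 % lowest value seen so far
            bestX = x;
        end
    end
    fprintf('Best value over %d starts: %g\n', nStarts, bestFval);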
The only way to be completely sure that you have the global minimum is to know whether your function is convex (i.e., your function has only a single minimum). This has to be established analytically.
If that cannot be done analytically, there are many global optimization methods you may want to consider, including some available in MATLAB's Global Optimization Toolbox.
I am trying to implement Bayesian optimization using Gaussian process regression, and I want to try a multiple-output GP first.
There are many software packages that implement GPs, such as the fitrgp function in MATLAB and the ooDACE toolbox.
But I didn't find any available software that implements the so-called multiple-output GP, that is, a Gaussian process model that predicts vector-valued functions.
So, are there any software packages implementing multiple-output Gaussian processes that I can use directly?
I am not sure my answer will help you, as you seem to be searching for MATLAB libraries.
However, you can do co-kriging in R with gstat. See http://www.css.cornell.edu/faculty/dgr2/teach/R/R_ck.pdf or https://github.com/cran/gstat/blob/master/demo/cokriging.R for more details about usage.
The lack of tools for co-kriging is partly due to the relative difficulty of using it. You need more assumptions than for simple kriging: in particular, you must model the dependence between the co-kriged outputs via a cross-covariance function (https://stsda.kaust.edu.sa/Documents/2012.AGS.JASA.pdf). The covariance matrix is much bigger, and you still need to make sure that it is positive definite, which can become quite hard depending on your covariance functions.
I have a function fun(x,y,z) where, say, x=1:10, y=50:60, z=100:105. Which optimization method can I use (and how) to get the minimum of this function, for example, where (x,y,z)=(3,52,101)? I am working in MATLAB.
Thank you for any help.
Algorithms
There are many algorithms out there that you can use for direct search optimization, such as Nelder-Mead, Particle Swarm, Genetic Algorithm, etc.
I believe Nelder-Mead is a simplex optimization method, which is used by the fminsearch function in MATLAB.
There is also the Genetic Algorithm, which comes with the MATLAB Global Optimization Toolbox. You may want to give that a try as well.
Particle Swarm Optimization (PSO) is another direct search method you can use. However, there is no official toolbox for the Particle Swarm method built by MathWorks. The good news is that there are quite a few PSO toolboxes developed by other people. I personally have used this one and am quite happy with its performance. Its syntax is similar to the Genetic Algorithm functions that come with the Global Optimization Toolbox.
Discrete Optimization
Since you are looking for the set of integer values of x, y, and z corresponding to the minimum objective function value, I would add a step at the beginning of the objective function that rounds the variables to the closest integers and then feeds them to your main function fun(x,y,z), as sketched below. This way you discretize your search space.
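A minimal sketch of that rounding wrapper (fun is your existing function and must be in scope; the wrapper name is made up):

    % Wrapper that snaps the optimizer's continuous iterates to the
    % nearest integers before evaluating the real objective fun(x,y,z).
    function f = roundedObjective(v)
        v = round(v);                  % discretize the search space
        f = fun(v(1), v(2), v(3));     % evaluate the original function
    end

You would then hand @roundedObjective to ga (or another direct search solver) with lower bounds [1 50 100] and upper bounds [10 60 105], matching your variable ranges.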
I hope my answer helps.
I want to optimize a multi-variable function with the patternsearch function in MATLAB. The function requires lower and upper boundaries and searches within those boundaries over a continuous domain.
However, I have a discrete set of values in an Excel file and would like the algorithm to search within this discrete domain instead of the continuous one.
Is this possible with patternsearch?
Maybe I don't understand your question correctly, but if you have a (discrete and finite) set of values, why don't you compute the function's value at each of these points and return the minimum?
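For instance, a brute-force sketch (the file name is a placeholder, and f is assumed to be a function handle that accepts one candidate value; on releases before R2019a, xlsread can stand in for readmatrix):

    % Evaluate the objective at every candidate value and keep the best.
    xs = readmatrix('candidates.xlsx');   % one candidate value per row
    vals = arrayfun(f, xs);               % objective at each candidate
    [fmin, idx] = min(vals);
    xbest = xs(idx);                      % best discrete value found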
In short, no. That is not what patternsearch is intended for. Optimization techniques for discrete and continuous search spaces are, as one would expect, quite different.
If you're looking for an approximate answer, however, it is possible to use spline, polyfit, etc. to arrive at an approximate continuous function for your data and then apply patternsearch to it, as sketched below.
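For example, a rough one-variable sketch (xData and yData are assumed to hold the tabulated values from the Excel file; patternsearch requires the Global Optimization Toolbox):

    % Fit a continuous interpolant to the tabulated data, then run
    % patternsearch on it within the data range.
    pp = spline(xData, yData);                  % cubic spline fit
    obj = @(x) ppval(pp, x);                    % continuous objective
    lb = min(xData);  ub = max(xData);          % stay inside the data
    x0 = (lb + ub)/2;                           % starting point
    xopt = patternsearch(obj, x0, [], [], [], [], lb, ub);

Keep in mind the minimizer of the interpolant need not coincide with any of your tabulated points.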
If you provide greater detail about your problem, I or someone else may be able to suggest a more suitable way of working with your data.
The best optimization tool for this is the genetic algorithm. It comes with MATLAB's Global Optimization Toolbox and allows for the optimization of both continuous and discrete variables at the same time.
In the genetic algorithm, variables that are integers have to be declared as such (via the IntCon argument of ga); undeclared variables are continuous by default.
Check the Global Optimization Toolbox guide for information on how it works: http://it.mathworks.com/help/pdf_doc/gads/gads_tb.pdf.
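A minimal sketch of such a mixed-integer ga call (the objective, bounds, and choice of integer variables are placeholders for your problem):

    % Mixed-integer ga: IntCon lists the indices of the integer
    % variables; all remaining variables stay continuous.
    nvars = 3;
    lb = [0 0 0];  ub = [10 10 1];    % hypothetical bounds
    IntCon = [1 2];                   % variables 1 and 2 are integers
    x = ga(@objfun, nvars, [], [], [], [], lb, ub, [], IntCon);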
I'm using the genetic algorithm from the MATLAB Global Optimization Toolbox with SimEvents, in order to implement a mixed-integer optimization that uses simulation outputs to evaluate the fitness function. My model is pretty similar to the one described in this video from the MathWorks website:
http://www.mathworks.it/videos/optimizing-manufacturing-production-processes-68961.html
Reading the documentation, I found that ga can solve constrained problems only if such constraints are linear inequalities. The constraints are supposed to be written as functions of the problem's variables, which in this case are the numbers of resources used during the simulation.
I would like, instead, to set a constraint that takes into account another simulation output (e.g., the drain utilization); that is, I want to minimize
objfun = backlog*10000 + cost
where backlog is a simulation output (obtained using simOut.get), subject to the following constraint:
drain_utilization > 0.7
where drain_utilization is another simulation output (again, obtained using simOut.get).
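In code, the setup looks roughly like this (model name, stop time, and signal names are hypothetical placeholders):

    % Hypothetical sketch of the quantities described above.
    simOut = sim('production_model', 'StopTime', '86400');
    backlog = simOut.get('backlog');                      % output to minimize
    cost = simOut.get('cost');
    drain_utilization = simOut.get('drain_utilization');  % constrained output
    objfun = backlog*10000 + cost;    % minimize, subject to
                                      % drain_utilization > 0.7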
Is this possible, or is this feature not supported by the Global Optimization Toolbox?
Thank you in advance, and forgive me for any improper terminology; I'm new to the Global Optimization Toolbox.