Bicubic Interpolation in Modelica

I have a set of data on irregular grids, and I need to perform interpolation to find f(x,y). I have implemented bilinear interpolation following the algorithm on Wikipedia, but it is not accurate enough. I would like to implement either bicubic interpolation or bicubic spline interpolation. I have found an algorithm for bicubic interpolation, but it requires the derivatives fx, fy and fxy, which makes my code more complicated. Are there any models already available for bicubic or bicubic spline interpolation? If not, are there at least models to calculate fx, fy and fxy? Any kind of solution would be very helpful.

According to https://trac.modelica.org/Modelica/ticket/1153#comment:11, it would take little effort to add bicubic interpolation (on a regular grid) to CombiTable2D of the Modelica Standard Library. This would then be implemented as an external object, e.g. in C.
Check https://github.com/diazona/interp2d/blob/master/bicubic.c for a C implementation of bicubic interpolation, including the derivatives.
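For prototyping outside Modelica, bicubic spline interpolation on a regular grid - including the derivatives fx, fy and fxy that the question asks about - can be sketched in Python with SciPy (a stand-in for experimentation, not a Modelica model; the grid and test function here are made up):

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Regular grid (bicubic interpolation assumes this, as the ticket notes)
x = np.linspace(0.0, 4.0, 9)
y = np.linspace(0.0, 4.0, 9)
X, Y = np.meshgrid(x, y, indexing="ij")
Z = np.sin(X) * np.cos(Y)          # sample data on the grid

# kx=ky=3 gives a bicubic spline; the derivatives come for free
spline = RectBivariateSpline(x, y, Z, kx=3, ky=3)

f   = spline(1.3, 2.1)[0, 0]               # interpolated value
fx  = spline(1.3, 2.1, dx=1)[0, 0]         # d/dx
fy  = spline(1.3, 2.1, dy=1)[0, 0]         # d/dy
fxy = spline(1.3, 2.1, dx=1, dy=1)[0, 0]   # mixed derivative
```

Since the spline object already exposes dx/dy evaluation, there is no need to estimate fx, fy and fxy separately with finite differences.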

Related

How to obtain damping ratio with Curve Fitting Toolbox

I have some experimental data from an oscillating system (time domain) and I would like to get an approximation of the damping ratio (zeta). I have already tried the half-power bandwidth method with the vibrationdata Matlab package, but I would like to compare the result with another method.
I found several methods in this paper: www.vce.at/sites/default/files/uploads/downloads/2007_damping_estimation.pdf and I would like to try the curve fitting method with the Curve Fitting Toolbox of Matlab (R2014b) (chapter 2.2.2, Curve fitting, in the paper).
This is the first time I have used this Toolbox with a custom equation, so I do not really know how to apply it to my sample data.
In particular, I do not know how to write the paper's equation in the Curve Fitting Toolbox, since it involves a differential equation.
Can anyone help me with this?
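One common way around the differential-equation problem is to fit the closed-form free response of an underdamped single-degree-of-freedom system directly, instead of the ODE itself. The sketch below does this in Python/SciPy rather than the Curve Fitting Toolbox; the model form (A, zeta, wn, phi) and the synthetic data are assumptions standing in for your measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def damped_osc(t, A, zeta, wn, phi):
    # Closed-form free response of an underdamped SDOF system
    wd = wn * np.sqrt(1.0 - zeta**2)
    return A * np.exp(-zeta * wn * t) * np.cos(wd * t + phi)

# Synthetic "measured" data (replace with your time-domain samples)
t = np.linspace(0.0, 10.0, 500)
y = damped_osc(t, 1.0, 0.05, 2 * np.pi, 0.0)
y += 0.01 * np.random.default_rng(0).normal(size=t.size)

# Initial guess matters for the frequency; estimate wn roughly first
p0 = [1.0, 0.1, 6.2, 0.0]
popt, _ = curve_fit(damped_osc, t, y, p0=p0)
A_fit, zeta_fit, wn_fit, phi_fit = popt
```

The same closed-form model can be typed into the Curve Fitting Toolbox as a custom equation, which avoids having to express the differential equation itself.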

Polynomial curve fit not fitting with defaults

The documentation I am reading for curve fitting doesn't seem that complicated, but clearly I am doing something wrong.
Given x,y data, when I try to fit a polynomial of degree 3, the data points are seemingly ignored. Can someone explain both how to fix this, and what this curve is actually calculating, as opposed to what I think it should be calculating?
Data: http://pastebin.com/4EXu0FSv
You are most likely using the wrong regression model or interval (or points). Curve fitting is a very complex topic and cannot be solved by a single recipe. Have a read of the Mathworks site about the Curve Fitting Toolbox here.
However, I would not fit a third-order polynomial to this data. I would be more inclined to fit a positive reciprocal function - see if that gives you a better fit.
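To illustrate the point about model choice: a reciprocal model y = a + b/x is still linear in its coefficients, so it can be fitted by ordinary least squares and compared against the cubic. The sketch below uses Python/NumPy with synthetic reciprocal-shaped data as a stand-in for the pastebin data:

```python
import numpy as np

# Synthetic data with a reciprocal shape (stand-in for the question's data)
x = np.linspace(0.5, 10.0, 50)
y = 2.0 + 3.0 / x

# Degree-3 polynomial fit (what the question attempted)
p3 = np.polyfit(x, y, 3)
resid_poly = y - np.polyval(p3, x)

# Reciprocal model y = a + b/x: linear in a and b, so plain least squares works
Ab, *_ = np.linalg.lstsq(np.column_stack([np.ones_like(x), 1.0 / x]), y,
                         rcond=None)
resid_recip = y - (Ab[0] + Ab[1] / x)

print(np.abs(resid_poly).max(), np.abs(resid_recip).max())
```

When the data really has a 1/x shape, the reciprocal model's residuals collapse to rounding error while the cubic's stay visibly large, which is one quick way to tell the model is wrong rather than the fitting routine.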

CV: Difference between MATLAB and OpenCV camera calibration techniques

I calibrated a camera with a checkerboard pattern using OpenCV and MATLAB. I got 0.489 and 0.187 for the mean re-projection errors in OpenCV and MATLAB respectively. From the looks of it, MATLAB is more precise, but my adviser feels both MATLAB and OpenCV use the same Bouguet algorithm and should report the same error (or close to it). Is that so? Can someone explain the difference between the MATLAB and OpenCV camera calibration methods?
Thanks!
Your adviser is correct in that both MATLAB and OpenCV use essentially the same calibration algorithm. However, MATLAB uses the Levenberg-Marquardt non-linear least squares algorithm for the optimization (see documentation), whereas OpenCV uses gradient descent. I would guess that this accounts for most of the difference in the reprojection errors.
Additionally, MATLAB and OpenCV use different algorithms for checkerboard detection.

Fitting sigmoid to data

There are many curve fitting and interpolation tools like polyfit (or even this nice logfit toolbox I found here), but I can't seem to find anything that will fit a sigmoid function to my x-y data.
Does such a tool exist or do I need to make my own?
If you have the Statistics Toolbox installed, you can use nonlinear regression with nlinfit:
sigfunc = @(A, x)(A(1) ./ (A(2) + exp(-x)));
A0 = ones(2, 1); %// Initial values fed into the iterative algorithm (one per coefficient)
A_fit = nlinfit(x, y, sigfunc, A0);
Here sigfunc is just an example of a sigmoid function, and A is the vector of fitting coefficients.
nlinfit, and especially gatool, are big hammers for this problem. A sigmoid is not a specific function. Most commonly it is taken to be the same as the logistic function (also often the most efficient to calculate):
y = 1./(1+exp(-x));
or a generalized logistic. But all manner of curves can have sigmoidal shapes. If you know if your data corresponds to one in particular, fitting can be improved and more efficient methods can be applied. For example, the error function (erf) has a sigmoidal shape and shows up in the CDF of the normal distribution. If you know that your data is the result of a Gaussian process (i.e., the data is the CDF) and you have the Stats toolbox, you can use the normfit function. This function is based on maximum likelihood estimation (MLE). If you end up needing to write a custom fitting function - say, for performance reasons - I'd investigate MLE techniques for the particular form of sigmoid that you'd like to fit.
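To make the generalized-logistic option concrete, here is a nonlinear least-squares fit of a three-parameter logistic, sketched in Python/SciPy rather than MATLAB (the parameterization L, k, x0 and the synthetic data are assumptions for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, L, k, x0):
    # Generalized logistic: amplitude L, steepness k, midpoint x0
    return L / (1.0 + np.exp(-k * (x - x0)))

# Synthetic sigmoidal data (replace with your x-y data)
x = np.linspace(-5.0, 5.0, 100)
y = logistic(x, 2.0, 1.5, 0.3)
y += 0.02 * np.random.default_rng(1).normal(size=x.size)

popt, _ = curve_fit(logistic, x, y, p0=[1.0, 1.0, 0.0])
L_fit, k_fit, x0_fit = popt
```

This is the same idea as the nlinfit answer above, just with an explicit midpoint and amplitude, which usually makes the initial guess easier to pick by eye from a plot.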
I would suggest you use MATLAB's Global Optimization Toolbox, and in particular the Genetic Algorithm Solver, which you can use for your problem by optimizing (= finding the best fit for your data) the sigmoid function's parameters through genetic algorithm. It has a GUI that is easy to use.
You can open the Genetic Algorithm Solver's GUI by calling gatool.

Linear least-squares fit with constraint - any ideas?

I have a problem where I am fitting a high-order polynomial to (not very) noisy data using linear least squares. Currently I'm using polynomial orders around 15 - 25, which work surprisingly well: the dependence is very nearly linear, but the accuracy of modelling the 'very nearly' is critical. I'm using Matlab's polyfit() function, and (obviously) normalising the x-data. This generally works fine, but I have come across an issue with some recent datasets: the fitted polynomial has extrema within the x-data interval. For the application I'm working on this is a no-no. The polynomial model must have no stationary points over the x-interval.
So I need to add a constraint to the least-squares problem: the derivative of the fitted polynomial must be strictly positive over a known x-range (or strictly negative - this depends on the data but a simple linear fit will quickly tell me which it is.) I have had a quick look at the available optimisation toolbox functions, but I admit I'm at a loss to know how to go about this. Does anyone have any suggestions?
[I appreciate there are probably better models than polynomials for this data, but in the short term it isn't feasible to change the form of the model]
[A closing note: I have finally got the go-ahead to replace this awful polynomial model! I am going to adopt a nonparametric approach, spline smoothing, using the excellent SPLINEFIT code by Jonas Lundgren. This has the advantage that I'm already using a spline model in the end-user application, so I already have C# code available to evaluate a spline model]
You could use cftool and exclude the offending data points with its exclude-data-points option.
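The constraint itself (p'(x) >= 0 over the x-interval) is linear in the polynomial coefficients, so the problem is a least-squares fit with linear inequality constraints - in MATLAB that points at lsqlin from the Optimization Toolbox. A sketch of the same idea in Python/SciPy, enforcing the derivative constraint on a dense sample grid (degree, data and grid density here are assumptions; the question uses orders 15 - 25):

```python
import numpy as np
from scipy.optimize import minimize

# Nearly linear, slightly noisy data (stand-in for the question's datasets)
rng = np.random.default_rng(2)
x = np.linspace(-1.0, 1.0, 40)
y = x + 0.05 * x**3 + 0.01 * rng.normal(size=x.size)

deg = 7                        # kept modest here for the sketch
V = np.vander(x, deg + 1)      # least-squares design matrix

def sse(c):
    r = V @ c - y
    return r @ r

# Enforce p'(x) >= 0 on a dense grid spanning the x-interval
xg = np.linspace(x.min(), x.max(), 200)
cons = {"type": "ineq", "fun": lambda c: np.polyval(np.polyder(c), xg)}

c0 = np.polyfit(x, y, deg)     # unconstrained fit as the starting point
res = minimize(sse, c0, constraints=[cons], method="SLSQP")
c_fit = res.x
```

Sampling the derivative constraint on a grid is a pragmatic relaxation: it does not guarantee positivity between grid points, but with a grid much denser than the data it is usually adequate, and a strictly positive lower bound (p'(x) >= eps) can tighten it further.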