How to obtain damping ratio with Curve Fitting Toolbox - matlab

I have some experimental data from an oscillating system (time domain) and I would like to get an approximation of the damping ratio (zeta). I have already tried the half-power bandwidth method with the vibrationdata MATLAB package, but I would like to compare the result with another method.
I found several methods in this paper: www.vce.at/sites/default/files/uploads/downloads/2007_damping_estimation.pdf and I would like to try the curve fitting method with the Curve Fitting Toolbox of MATLAB (R2014b) (chapter 2.2.2, Curve fitting, in the paper).
This is the first time I have used this toolbox with a custom equation, so I do not really know how to set it up with my sample data.
In particular, I do not know how to write the paper's equation in the Curve Fitting Toolbox, as it involves a differential equation...
Can anyone help me with this?
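For reference, here is a minimal sketch of one way a custom decay model could be set up with the toolbox, assuming the measured free decay can be approximated by an exponentially decaying sinusoid; the model form, variable names, and start values below are illustrative, not taken from the paper:

% Sketch only, not a verified solution: fit a damped free-decay response with a
% custom model in the Curve Fitting Toolbox and read off the damping ratio.
% Assumed model: y(t) = A*exp(-zeta*wn*t)*cos(wn*sqrt(1-zeta^2)*t + phi).
% t and y are the measured time and response vectors; start values are placeholders.
ft = fittype('A*exp(-zeta*wn*x)*cos(wn*sqrt(1-zeta^2)*x + phi)', ...
             'independent', 'x', 'coefficients', {'A', 'zeta', 'wn', 'phi'});
opts = fitoptions(ft);
opts.StartPoint = [max(abs(y)), 0.05, 2*pi*10, 0];   % amplitude, zeta, wn [rad/s], phase
damped_fit = fit(t(:), y(:), ft, opts);
zeta_est = damped_fit.zeta;                          % estimated damping ratio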

Related

How does MATLAB SimBiology calculate the pharmacokinetic (PK) non-compartmental (NCA) area under the curve (AUC)?

I can't seem to find the answer to my question in SimBiology's documentation.
Does anyone know how Matlab calculates the non-compartmental AUC? Does it use the linear / log-linear trapezoidal rule? How many points does it use to extrapolate the curve to infinity? Does it use log or linear extension?
Many thanks!
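Not an answer to what SimBiology does internally, but for comparison, a generic linear-trapezoidal NCA calculation with terminal-slope extrapolation looks roughly like this (variable names and the number of tail points are illustrative):

% Generic linear-trapezoidal AUC with extrapolation to infinity; not necessarily
% what SimBiology does internally. t, c are time and concentration vectors.
auc_last = trapz(t, c);                            % linear trapezoidal AUC(0 -> t_last)
n_tail   = 3;                                      % points used to estimate lambda_z (illustrative)
p        = polyfit(t(end-n_tail+1:end), log(c(end-n_tail+1:end)), 1);
lambda_z = -p(1);                                  % terminal elimination rate constant
auc_inf  = auc_last + c(end) / lambda_z;           % extrapolate the tail to infinity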

Polynomial curve fit not fitting with defaults

The documentation I am reading for curve fitting doesn't seem to be that complicated, but clearly I am doing something wrong.
Given x-y data and trying to fit a polynomial of degree 3, the data points are seemingly ignored. Can someone explain both how to fix this and what this curve is actually calculating, as opposed to what I think it should be calculating?
Data: http://pastebin.com/4EXu0FSv
You most likely are using the wrong regression model or interval (or points). Curve fitting is a very complex topic and cannot be solved in one simple way. Have a read of the MathWorks page about the Curve Fitting Toolbox here.
However, I would not fit a third-order polynomial to this data. I would be more inclined to fit a positive reciprocal function - see if that gives you a better fit.
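A rough sketch of what fitting such a reciprocal model could look like with the toolbox (the exact model form and the coefficient names a, b are illustrative):

% Sketch: fit a reciprocal model y = a + b/x rather than a cubic polynomial.
% x, y are the data from the pastebin link; a and b are illustrative coefficient names.
ft  = fittype('a + b/x', 'independent', 'x', 'coefficients', {'a', 'b'});
rec = fit(x(:), y(:), ft, 'StartPoint', [mean(y), 1]);
plot(rec, x, y)                      % visual check of the fit against the data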

Fitting sigmoid to data

There are many curve fitting and interpolation tools like polyfit (or even this nice logfit toolbox I found here), but I can't seem to find anything that will fit a sigmoid function to my x-y data.
Does such a tool exist or do I need to make my own?
If you have the Statistics Toolbox installed, you can use nonlinear regression with nlinfit:
sigfunc = @(A, x)(A(1) ./ (A(2) + exp(-x)));
A0 = ones(1, 2); % initial values for the two coefficients, fed into the iterative algorithm
A_fit = nlinfit(x, y, sigfunc, A0);
Here sigfunc is just an example for a sigmoid function, and A is the vector of the fitting coefficients.
nlinfit, and especially gatool, are big hammers for this problem. A sigmoid is not a specific function. Most commonly it is taken to be the same as the logistic function (also often the most efficient to calculate):
y = 1./(1+exp(-x));
or a generalized logistic. But all manner of curves can have sigmoidal shapes. If you know that your data corresponds to one in particular, fitting can be improved and more efficient methods can be applied. For example, the error function (erf) has a sigmoidal shape and shows up in the CDF of the normal distribution. If you know that your data is the result of a Gaussian process (i.e., the data is the CDF) and you have the Stats toolbox, you can use the normfit function. This function is based on maximum likelihood estimation (MLE). If you end up needing to write a custom fitting function - say, for performance reasons - I'd investigate MLE techniques for the particular form of sigmoid that you'd like to fit.
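For example, a small sketch of fitting an erf-shaped sigmoid (i.e. a normal CDF) to x-y data with nlinfit; the parameter names and starting guesses here are illustrative:

% Sketch: fit y = 0.5*(1 + erf((x - mu)/(sigma*sqrt(2)))), i.e. a normal CDF,
% with nlinfit (Statistics Toolbox). p(1) is the location, p(2) the scale.
cdfmodel = @(p, x) 0.5 * (1 + erf((x - p(1)) ./ (p(2) * sqrt(2))));
p0    = [mean(x), std(x)];                 % rough starting guesses
p_fit = nlinfit(x, y, cdfmodel, p0);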
I would suggest you use MATLAB's Global Optimization Toolbox, and in particular the Genetic Algorithm Solver, which you can use for your problem by optimizing (= finding the best fit for your data) the sigmoid function's parameters through a genetic algorithm. It has a GUI that is easy to use.
You can open the Genetic Algorithm Solver's GUI with the gatool command.
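For completeness, a programmatic sketch of the same idea without the GUI, using ga from the Global Optimization Toolbox (the objective and the bounds are illustrative, not tuned):

% Sketch: minimise the sum of squared residuals of the sigmoid from the answer
% above with ga(). x, y are the data; the bounds [-10, 10] are illustrative.
sse  = @(A) sum((y - A(1) ./ (A(2) + exp(-x))).^2);   % objective: sum of squared errors
A_ga = ga(sse, 2, [], [], [], [], [-10 -10], [10 10]);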

Beginners issue in polynomial curve fitting [Part 1]

I have just started learning modeling techniques based on regression models and have been going through the MATLAB Curve Fitting Toolbox and SO. I have some fundamental doubts and am unable to proceed further. I have a single vector with k = 100 data points which I want to fit to an AR model, an MA model, and an ARMA model in turn, to see which is better suited. Starting with an AR(p) model of the form y(k+1) = a*y(k) + b*y(k-1), the command
coeff = polyfit(x,y,d)
will fit a polynomial of degree, say, d = 1 with p coefficients indicating the order of the model (AR(p)). But I just have one set of data, which is the recording of the angular moment. So what will go as the first parameter (x) of the function signature, i.e. what will be x and y? Also, what if the linear models are not good enough, so that I have to select a nonlinear model? Can somebody please guide me, with code snippets, through the steps of fitting, checking for overfitting, residual calculation, etc.?
x is likely to be k (index of y). And the whole code:
c = polyfit(1:length(y), y, d);
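Note that polyfit fits a polynomial in the sample index rather than the AR(2) recursion from the question; a direct least-squares sketch for that recursion (variable names are illustrative) could look like this:

% Sketch: least-squares estimate of a, b in y(k+1) = a*y(k) + b*y(k-1).
% y is the column vector of k = 100 data points from the question.
Y     = y(3:end);                      % targets y(k+1)
Phi   = [y(2:end-1), y(1:end-2)];      % regressors [y(k), y(k-1)]
theta = Phi \ Y;                       % ordinary least squares
a_hat = theta(1);
b_hat = theta(2);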
MATLAB has a Curve Fitting Toolbox. You could use it to try different nonlinear fits in the GUI to get some intuition.
If you want steps, there is a great Coursera Machine Learning course. The beginning of that course covers linear regression, and I recommend you spend at least a few hours on it.

Linear least-squares fit with constraint - any ideas?

I have a problem where I am fitting a high-order polynomial to (not very) noisy data using linear least squares. Currently I'm using polynomial orders around 15 - 25, which work surprisingly well: the dependence is very nearly linear, but the accuracy of modelling the 'very nearly' is critical. I'm using Matlab's polyfit() function, and (obviously) normalising the x-data. This generally works fine, but I have come across an issue with some recent datasets. The fitted polynomial has extrema within the x-data interval. For the application I'm working on this is a no-no. The polynomial model must have no stationary points over the x-interval.
So I need to add a constraint to the least-squares problem: the derivative of the fitted polynomial must be strictly positive over a known x-range (or strictly negative - this depends on the data but a simple linear fit will quickly tell me which it is.) I have had a quick look at the available optimisation toolbox functions, but I admit I'm at a loss to know how to go about this. Does anyone have any suggestions?
[I appreciate there are probably better models than polynomials for this data, but in the short term it isn't feasible to change the form of the model]
[A closing note: I have finally got the go-ahead to replace this awful polynomial model! I am going to adopt a nonparametric approach, spline smoothing, using the excellent SPLINEFIT code by Jonas Lundgren. This has the advantage that I'm already using a spline model in the end-user application, so I already have C# code available to evaluate a spline model]
You could use cftool and its option to exclude data points.
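Another possible route, sketched only and assuming the Optimization Toolbox is available: set the fit up as a constrained linear least-squares problem with lsqlin, enforcing a positive derivative on a grid over the x-range. The degree, grid size, and margin below are illustrative.

% Sketch: constrained polynomial least squares via lsqlin, minimising ||V*c - y||
% subject to the derivative being >= a small positive margin on a grid.
% x, y are the (normalised) data; coefficients are in increasing powers of x.
n      = 15;                                  % polynomial degree (illustrative)
xg     = linspace(min(x), max(x), 200)';      % grid where the derivative is constrained
margin = 1e-6;                                % enforce p'(x) >= margin (strictly increasing)

V = zeros(numel(x), n + 1);                   % Vandermonde matrix for the fit
D = zeros(numel(xg), n + 1);                  % derivative design matrix on the grid
for j = 0:n
    V(:, j + 1) = x(:).^j;
    if j > 0
        D(:, j + 1) = j * xg.^(j - 1);
    end
end

% lsqlin solves min ||C*z - d||^2 subject to A*z <= b; here A = -D, b = -margin
c    = lsqlin(V, y(:), -D, -margin * ones(numel(xg), 1));
yfit = V * c;                                 % evaluate the constrained fit at the data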