Algorithm to "trace" sequential points into bezier curves - trace

I have a sequential collection of points in X,Y and I'd like to "trace" these into a set of Bézier curves. Could any open source bitmap-to-vector tracing algorithm or library be used for this?

This depends on what you want to accomplish. If you want a 'best fit' curve, or at least a rough approximation, you should use a B-spline; a B-spline will fit itself 'inside' the points it is given. For going through the points in question I would generally use a Catmull-Rom spline, which, given points 1, 2, 3, will pass through point 2 with slope equal to the slope between points 1 and 3. Note that each Catmull-Rom segment is itself a cubic, so it can be converted exactly into a cubic Bézier segment if you need Bézier output.
Sample Code:
http://willperone.net/Code/spline.php
Explanation of the algorithm:
http://steve.hollasch.net/cgindex/curves/catmull-rom.html
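For illustration, here is a minimal sketch of evaluating one Catmull-Rom segment in MATLAB; the helper name and the layout of P (an n-by-2 matrix of points, with i indexing an interior point) are my own assumptions, not taken from the linked sample code.
function q = catmullrom(P, i, t)
% Evaluate the Catmull-Rom segment between P(i,:) and P(i+1,:) at t in [0,1],
% using neighbours P(i-1,:) and P(i+2,:) to set the tangents.
p0 = P(i-1,:); p1 = P(i,:); p2 = P(i+1,:); p3 = P(i+2,:);
q = 0.5 * (2*p1 + (-p0 + p2)*t + ...
           (2*p0 - 5*p1 + 4*p2 - p3)*t^2 + ...
           (-p0 + 3*p1 - 3*p2 + p3)*t^3);
end
At t = 0 the tangent is 0.5*(p2 - p0), which is exactly the "slope between points 1 and 3" behaviour described above.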

You want to use piecewise B-spline curves rather than Béziers if you want the curve to pass through an existing set of points.
There's tons of code on the web for doing this.
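As one concrete MATLAB example, here is a minimal sketch using cscvn from the Curve Fitting Toolbox, which builds a parametric cubic spline through a sequence of points; pts is assumed to be a 2-by-n matrix of your X,Y coordinates.
pp = cscvn(pts);                        % interpolating parametric cubic spline through the points
fnplt(pp)                               % plot the resulting curve
hold on, plot(pts(1,:), pts(2,:), 'o')  % overlay the original points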

This is an older question, but I found it because I needed an algorithm for autotracing coordinates as they are being drawn, and came across this SO post through Google. It seems that for this particular question no one has mentioned Potrace (there is a small Wikipedia article on it here), which is pretty much literally what the original question was asking for; it is open source, with several ports available, and the papers that describe how it works are freely available as well.

Related

Curve fitting, but I want to guarantee only one inflection point

I often find myself fitting a scatter plot, knowing that the 'true fit' should have only one inflection point. Any ideas for forcing a fit that will obey this?
I am using MATLAB and Microsoft Excel.
Many thanks
Option 1:
I like to use spline smoothing with the Akaike information criterion. While it is a hyper-parametric fit with a large number of analytic candidate inflection points, the smoothed data at the sample points tends to reveal only what is actually within the data.
If your data doesn't have an inflection point, this is indicated; if it does, it is usually captured. An important statistical cousin of this idea is the "non-informative prior".
Try slides 30-31 here: link.
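As a rough sketch of the idea (my own illustration, not taken from the slides): fit a smoothing spline, then count sign changes of the second derivative at the sample points to see how many inflection points the smoothed trend actually exhibits. This assumes the Curve Fitting Toolbox and picks the smoothing parameter by hand rather than by AIC.
f = fit(x(:), y(:), 'smoothingspline', 'SmoothingParam', 0.99);
[~, d2] = differentiate(f, x(:));     % second derivative at the sample points
nInflect = sum(diff(sign(d2)) ~= 0);  % sign changes ~ inflection points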
Option 2:
If you have an older version of MATLAB, you can specify the exact model easily in the "cftool" (not the same as sftool) and then generate an m-file that shows how to incorporate the fit into your own script. Pick a model appropriate to your data.
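For example (my own illustration of "pick a model appropriate to your data"): a cubic polynomial has exactly one inflection point, at x = -b/(3a), whenever a is nonzero, so fitting one is a simple way to honour the single-inflection constraint. This sketch assumes the Curve Fitting Toolbox and data vectors x, y.
ft  = fittype('a*x^3 + b*x^2 + c*x + d');            % one inflection point if a ~= 0
mdl = fit(x(:), y(:), ft, 'StartPoint', [1 1 1 1]);  % nonlinear least squares
plot(mdl, x, y)                                      % fitted curve over the data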

Data interpolation over a non-homogeneous surface

In my project I deal with large data surfaces.
At a certain point, I have a line across the data, and I need the values of the points along that line.
The grid is non-homogeneous; it doesn't go from n to m with fixed steps or anything like that.
Let's illustrate!
In the figure, the 2D projection of my data can be seen. Each point also carries three other data values. I defined an arbitrary red line of the form y = ax + b; a and b are known.
How can I define, say, 50 points on the line that have not only the x and y coordinates (which is straightforward) but also interpolated values of the three data fields from the surrounding points?
I know it is not an easy question, but I can't seem to step forward even a bit.
PS: note that I DON'T want code written for me, just the idea of how to achieve my objective.
You could use a tool like TriScatteredInterp, which will triangulate the 2-D domain and then interpolate a list of points along your line. griddata is also an option.
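A minimal sketch of that approach, assuming the scattered sites are in vectors x, y with three data fields v1, v2, v3, and that a and b come from the question's line. (TriScatteredInterp has since been superseded by scatteredInterpolant in newer MATLAB releases.)
xi = linspace(min(x), max(x), 50).';         % 50 sample points along the line
yi = a*xi + b;
F1 = TriScatteredInterp(x(:), y(:), v1(:));  % one interpolant per data field
F2 = TriScatteredInterp(x(:), y(:), v2(:));
F3 = TriScatteredInterp(x(:), y(:), v3(:));
vals = [F1(xi,yi), F2(xi,yi), F3(xi,yi)];    % 50-by-3 interpolated values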
I have a toolbox for problems like this (of course.) It allows me to build a triangulation of the non-convex domain in the (x,y) plane. Then it can form a completely general slice through that surface, interpolating in z also as it does so. The result will be a 1-manifold, in this case a piecewise linear function along that path in (x,y,z). While those tools are not posted on the file exchange, they are available for the person willing to invest the time to learn to use them.
If the surface you describe is a completely general one in 3-D, which might be fairly complex, then you might need a CRUST-based tool to define that surface triangulation. These can be found online too. Once a triangulation is available, my tools can then be used to slice them. (Sorry, I never did finish that piece.)
What I did was to define several points along the line and then check, for each of them, which quadrilateral it lies in using MATLAB's inpolygon function (not the fastest way, but it takes less than 2 seconds).
Then I created a triangular plane in each matched quadrilateral using x, y, and z (or the other data fields), achieving a linear interpolation between the data.
Finally, I removed all the points that are 0 or NaN.
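A rough sketch of the point-location step described above; qx and qy are assumed to be 4-by-nQuads matrices holding the corner coordinates of each quadrilateral, one column per quad, and xi, yi are the sample points on the line (the triangular interpolation itself is omitted).
hit = zeros(size(xi));                        % containing quad for each point
for k = 1:size(qx, 2)
    in = inpolygon(xi, yi, qx(:,k), qy(:,k)); % which sample points fall in quad k
    hit(in) = k;                              % record the containing quad
end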

calculate distance between curves

I need to calculate some kind of distance between two curves.
These are general curves, and may not be functions; that is, some values of x may be mapped to more than one value.
EDIT
The curves are given as a list of X,Y pairs, and the logical curve is the line passing through all the points in the order given. A typical data set will include about 1000 points.
As noted, the curve may not be a function, but it is usually similar to one.
This issue is what prevents using interp1 or the Curve Fitting Toolbox (in MATLAB).
The distance measure I was thinking of is the area of the region between the curves, but any reasonable alternative is OK.
EDIT
A sample illustration of the two curves, and the area I want to compute.
A MATLAB solution is preferred, but other languages are also fine.
If you have functions of the type y = f(x) that are defined over the same domain, then a common way to find the "distance" is to use the L2 norm, as explained here: http://en.wikipedia.org/wiki/L2_norm#p-norm. This is simply the square root of the integral of the squared difference between the functions. If you have parametric curves then you cannot directly employ this approach.
If the L2 norm is not good enough for your requirements, you will need to provide a more concrete definition of what you mean by "distance". If you are unclear as to what you need, try taking a look at the different types of mathematical norms and see whether any of the commonly used ones fit (e.g. the L1 norm or the uniform norm); the Wikipedia link above is a good starting point. If the L2 norm is good enough, then you need a way to calculate the integral; there are many numerical integration techniques out there, and I suggest Google is your friend here (or a good numerical analysis textbook).
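For single-valued curves on a shared domain, a minimal MATLAB sketch of the L2 distance, assuming data vectors x1, y1 and x2, y2 with each x strictly increasing:
xq = linspace(max(min(x1), min(x2)), min(max(x1), max(x2)), 1000);
f1 = interp1(x1, y1, xq);            % resample both onto a common grid
f2 = interp1(x2, y2, xq);
L2 = sqrt(trapz(xq, (f1 - f2).^2));  % L2 norm of the difference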
If you do have parametric-type curves then this is very nontrivial. Using the "area" between the curves is not a good idea, as there is no clear way to define that area, and it becomes even more complicated in the general case where the curves may self-intersect. If your curves are parametrized in the same way, you could try a very crude measurement: evaluate points on each curve at equally spaced values over the parameter range, then compute the distance between each pair and take the average as a notion of "closeness". That is, partition your parameter range into a set {u_0, ..., u_n}, evaluate curve1(u_i) and curve2(u_i) for each i to generate a set of paired points, and sum (or average) the Euclidean distances between each pair.
This is very crude though, and if the parametrizations differ it won't be much use.
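A sketch of that crude measure, using arc length as a common parametrization; C1 and C2 are assumed to be n-by-2 matrices of ordered X,Y points with no repeated points.
n  = 200;                                          % number of comparison samples
s1 = [0; cumsum(hypot(diff(C1(:,1)), diff(C1(:,2))))];
s2 = [0; cumsum(hypot(diff(C2(:,1)), diff(C2(:,2))))];
u  = linspace(0, 1, n).';
P1 = interp1(s1/s1(end), C1, u);                   % resample by arc-length fraction
P2 = interp1(s2/s2(end), C2, u);
d  = mean(hypot(P1(:,1)-P2(:,1), P1(:,2)-P2(:,2)));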
You need to define what you mean by distance between the curves. If it is the closest approach between two general curves, then it becomes quite difficult to solve the problem.
If the "curves" are not even representable as single valued functions of x, then it becomes more complex yet.
Merely telling us that you need to define "some kind of distance" is too broad of a statement to be on-topic here, and it says that you have not yet thought out the problem you wish to solve.
If all you are willing to tell us is that the curves are two totally general parametric curves, which may be closed or not, or they may not even lie over the same domain, then the question becomes so totally ill-posed as to be impossible to answer. What is the area between two curves in that case?
If the curves are defined over the SAME support, then subtracting them and integration of the absolute value or square of the difference will be adequate. But you have already told us that these "curves" may be multi-valued. In that case, it is essentially impossible to do what you are asking.

Efficiently visualise large quantities of points with MATLAB

I have a set of 3D points numbering around 1 million. I am looking to visualise these with MATLAB.
I have tried the following functions:
plot3
scatter3
But they are both very sluggish. Is there a more efficient way to visualise this level of points in matlab? Maybe a way to mesh the points?
If not can anyone suggest a plug-in or even a different program for visualising 3D points?
You're going to run into efficiency issues no matter what plugin/program you use if you want all million+ points to show up in a plot. My suggestion would be to downsample. Use the plot3 or scatter3 function on every other point, or every nth point, until you get a figure that is not sluggish. As long as the variance in your data isn't astronomical, downsampling a little bit shouldn't affect the overall distribution of points (given that you have a million+ points). And any software that is able to display that much data without being sluggish is most likely downsampling/binning or using some interpolation technique to do so (so you might as well have control over it).
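A minimal downsampling sketch, assuming the points are stored in an N-by-3 matrix pts:
n   = 10;                 % keep 1 of every 10 points; tune until responsive
sub = pts(1:n:end, :);
plot3(sub(:,1), sub(:,2), sub(:,3), '.', 'MarkerSize', 1)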
fscatter3, from the File Exchange, does what you like.
Is there a specific reason to actually have it display that many points?
I Googled around a bit and found some people who have had similar issues (someone suggested Avizo as an alternate program but I've never used it):
http://www.mathworks.com/matlabcentral/newsreader/view_thread/308948
mathworks.com/matlabcentral/newsreader/view_thread/134022 (not clickable because I don't have enough rep to post more than two links)
An alternate solution would be to generate a histogram if you're more interested in the density of the data:
http://blogs.mathworks.com/videos/2010/01/22/advanced-making-a-2d-or-3d-histogram-to-visualize-data-density/
If you know beforehand roughly the coordinates of the feature you are looking for, try passing the cloud through a simple pass-through filter, which essentially crops your point cloud. I.e., if you know that the feature is at an x-coordinate > 5, remove all points with x-coordinate < 5.
This filter could, for the first coordinate, be realized as
data = data(data(:,1) > 5, :);   % keep rows whose x-coordinate exceeds 5
provided that your 3D data is stored in an n-by-3 matrix, one point per row.
This, together with downsampling, could help you out. If you still find the performance lacking, consider using something like the PCD viewer in the Point Cloud Library (PCL); see halfway down the page at
http://pointclouds.org/documentation/overview/visualization.php
It is a standalone app you can launch from MATLAB. I find its performance far better than the sluggish MATLAB plotting tools.
For anyone who is interested: I ended up finding a point cloud visualiser called CloudCompare. It is extremely fast and allows selection and segmentation as well as filtering on point clouds.

How to make a smooth plot in MATLAB

I have about 100 data points which mostly satisfy a certain function (but some points are off). I would like to plot all those points as a smooth curve, but the problem is that the points are not uniformly distributed. So is there any way to get a smooth curve? I am thinking of interpolating some points in between, but the only way that comes to mind is to linearly insert artificial points between pairs of data points. That would show a pretty weird shape (with sharp corners). So, any better ideas? Thanks.
If you know more or less what the actual curve should be, you can try to fit that curve to your points (e.g. using polyfit). Depending on how many points are off and how far, you can get by with least squares regression (which is fairly easy to get working). If you have too many outliers (or they are much too large/small), you can also try robust regression (e.g. least absolute deviation fitting) using the robustfit function.
If you can manually determine the outliers, you can also fit a curve through the other points to get better results or even use interpolation methods (e.g. interp1 in MATLAB) on those points to get a smoother curve.
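As a sketch of both fits mentioned above (a cubic model is assumed purely for illustration; robustfit requires the Statistics Toolbox):
p  = polyfit(x, y, 3);        % ordinary least-squares cubic
yl = polyval(p, x);
X  = [x(:) x(:).^2 x(:).^3];  % predictors for the robust fit
b  = robustfit(X, y(:));      % b(1) is the intercept robustfit adds by default
yr = b(1) + X*b(2:end);       % robust fitted values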
If you know which function describes your data, robust fitting (using, e.g. ROBUSTFIT, or the new convenient functions LINEARMODEL and NONLINEARMODEL with the robust option) is a good way to go if there are outliers in your data.
If you don't know the function that describes your data, but want a smooth trendline that is little affected by outliers, SMOOTHN from the File Exchange does an excellent job in my experience.
Have you looked at the use of smoothing splines? They are like interpolating splines, but with the knot points and coefficients chosen to minimise a least-squares error function. There is an excellent implementation available from MATLAB Central which I have used successfully.
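For reference, a minimal smoothing-spline sketch using csaps from the Curve Fitting Toolbox (the MATLAB Central implementation mentioned above may differ in its interface):
p  = 0.995;                          % smoothing parameter in (0,1]; closer to 1 = less smoothing
pp = csaps(x, y, p);                 % piecewise-polynomial smoothing spline
xq = linspace(min(x), max(x), 500);
plot(x, y, 'o', xq, fnval(pp, xq), '-')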