Smoothing algorithm, 2.5D - triangulation

The picture below shows a triangular surface mesh. Its vertices lie exactly on the surface of the original 3D object, but the straight edges and faces of course have some geometric error wherever the original surface bends, so I need an algorithm to estimate the smooth original surface.
Details: I have a height field of (a projectable part of) this surface (a 2.5D triangulation where each x,y pair has a unique height z), and I need to compute the height z of arbitrary x,y pairs, for example the z-value of the point in the image where the cursor points.
If it were a 2D problem I would use cubic splines, but for surfaces I'm not sure what the best solution is.

As commented by #Darren, what you need are patches.
They can be bilinear patches, biquadratic patches, Coons patches, or others.
A quick search did not turn up many references, but these links should help:
an overview: http://www.cs.cornell.edu/Courses/cs4620/2013fa/lectures/17surfaces.pdf
something more technical: https://www.doc.ic.ac.uk/~dfg/graphics/graphics2010/GraphicsHandout05.pdf
The concept is that you calculate splines along the edges (a height function with respect to the straight edge segment itself) and then blend them across the surface delimited by the edges.
The patch is responsible for the blending, meaning that inside any face the height is a function of the point's position coordinates within the face and of the values of the spline segments defined on the edges of that face.
As far as I know this approach is quite easy to use on a quadrilateral mesh (because it is easy to decide along which sequences of edges to build the splines), while I am not sure how to apply it if you are forced to work with an actual triangulation.
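To illustrate the blending on a single quadrilateral face, here is a minimal MATLAB sketch of a bilinearly blended Coons patch for the height only; the four edge functions zb, zt, zl, zr are hypothetical stand-ins for the splines you would actually fit along the face's edges.
zb = @(u) 0.0 + 0.20*sin(pi*u);          % bottom edge height (v = 0), hypothetical spline
zt = @(u) 1.0 + 0.10*sin(pi*u);          % top edge height    (v = 1)
zl = @(v) v  + 0.15*sin(pi*v);           % left edge height   (u = 0)
zr = @(v) v  - 0.10*sin(pi*v);           % right edge height  (u = 1)
z00 = zb(0); z10 = zb(1); z01 = zt(0); z11 = zt(1);   % corner heights (edges must agree here)
coons = @(u,v) (1-v).*zb(u) + v.*zt(u) + (1-u).*zl(v) + u.*zr(v) ...
      - ((1-u).*(1-v)*z00 + u.*(1-v)*z10 + (1-u).*v*z01 + u.*v*z11);
[u, v] = meshgrid(linspace(0, 1, 30));
surf(u, v, coons(u, v));                 % blended height over the unit face
The corner correction term subtracts the bilinear interpolation of the corner heights, so the patch reproduces all four edge splines exactly.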


Is it possible to find the depth of an internal point of an object using stereo images (or any other method)?

I have an image of a robot with yellow markers, as shown below.
The yellow points shown are the markers. Two cameras, placed at an offset of 90 degrees, are used to view the robot, which bends in between the cameras. A crude schematic of the setup is here:
https://i.stack.imgur.com/aVyDq.png
Using the two cameras I am able to get the 3D coordinates of the yellow markers. But I also need the 3D coordinates of the central point of the robot, as shown.
I need to find the 3D position of the red marker points, which lie inside the cylindrical robot. Firstly, is this even feasible? If yes, what method can I use to achieve it?
As a bonus, is there any literature on finding the 3D location of such internal points that I can refer to? (I searched, but could not find anything similar to my problem.)
A theoretical solution is welcome as well (as long as it finds the central point within a reasonable error), which I can later translate to code.
If you know the actual dimensions, or at least the shape (e.g. a perfect circle), of the white bands, then yes, it is feasible.
You need to do the following steps, which are quite nontrivial, and which I won't work out here:
Optional but strongly recommended: calibrate your camera and undistort the images.
1. Find the equation of the projection of a 3D circle into a 2D camera, for any given rotation. You can simplify this by assuming the white band is completely horizontal. You want some function that takes the parameters that define a circle, plus a rotation (a sketch of this forward model is given after these steps).
2. Find all white bands in the image, segment them, and make them horizontal (rotate them).
3. Fit the points of each corrected white band to the equation from step 1. That should give you the parameters of the circle in 3D (radius, angle), if you wrote the equation correctly.
4. Now that you have an analytic equation of the actual circle (the equation from step 1 with the parameters from step 3), you can map any point of this circle (e.g. its center) to an image location. Remember to undo the rotation from step 2.
This requires an understanding of curve fitting, some analytic geometry, and decent coding skills. It is not trivial, but it will give you a highly accurate solution.
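To make step 1 concrete, here is a minimal forward-model sketch in MATLAB (not the fitting itself); the focal length f, circle radius r, tilt theta and distance d are all hypothetical values. The fit in step 3 would then search for the parameters whose projection best matches the segmented band pixels.
f = 800;  r = 0.05;  theta = deg2rad(30);  d = 1.0;   % hypothetical camera and circle parameters
t = linspace(0, 2*pi, 200);
C = [r*cos(t); r*sin(t); zeros(1, numel(t))];          % circle in its own plane
R = [1 0 0; 0 cos(theta) -sin(theta); 0 sin(theta) cos(theta)];   % tilt about the x axis
P = R*C;  P(3,:) = P(3,:) + d;                         % place the circle in front of the camera
u = f * P(1,:) ./ P(3,:);                              % pinhole projection
v = f * P(2,:) ./ P(3,:);
plot(u, v);  axis equal                                % the circle projects to an ellipse-like conic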
For an inaccurate solution:
Find the end points of the white bands.
Draw the line connecting the end points.
Choose the center as the midpoint of this line (sketched below).
This will be inaccurate because choosing end points carries more error than fitting an equation to all the points, and it ignores the cone-shaped view of the camera and the 3D geometry.
But it may be good enough for what you want.
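A minimal sketch of this rough estimate, assuming P1 and P2 are the triangulated 3D coordinates of the two visible end points of one band:
P1 = [x1; y1; z1];            % 3D coordinates of one end point (assumed already triangulated)
P2 = [x2; y2; z2];            % 3D coordinates of the other end point
center = (P1 + P2) / 2;       % midpoint of the chord as a crude estimate of the band centre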
I have been able to extract the midpoint by fitting an ellipse to the arc visible to the camera. The centroid of the ellipse is the required midpoint.
There will be wrong ellipses as well, which can be ignored. The steps to extract the ellipse were:
Extract the markers
Binarise and skeletonise
Fit ellipse to the arc (found a matlab function for this)
Get the centroid of the ellipse
marker_th = 0.35;                          % threshold on the value channel (chosen empirically)
hsv_img = rgb2hsv(im);
bin = hsv_img(:,:,3) > marker_th;          % binarise
skel = bwskel(bin);                        % skeletonise
stats = regionprops(skel, 'all');          % get the pixel list of each skeleton arc
for i = 1:numel(stats)
    el = fit_ellipse(stats(i).PixelList(:,1), stats(i).PixelList(:,2));
    ellipse_draw(el.a, el.b, -el.phi, el.X0_in, el.Y0_in, 'g');
end
fit_ellipse and ellipse_draw are external MATLAB functions found online.

Extrapolation delaunay triangulation

Figure 1 below shows a typical Delaunay triangulation (blue); it has a boundary line (the black rectangle).
Each vertex in the Delaunay triangulation has a height value, so I can calculate the height anywhere inside the convex hull. I am trying to figure out a method to calculate the height out to the boundary line (some sort of extrapolation).
There are two things associated with this task:
Triangulating up to the boundary line
Figuring out the height at the newly created triangle vertices
Has anybody come across this issue?
Figure 1: Delaunay triangulation (blue) inside the rectangular boundary (black).
I'd project the convex hull points of the triangulation onto the visible box segments and then insert the four box corners and the projected points into the triangulation.
There is no unique correct way to assign heights to the new points. One easy and stable method (sketched in the code below) is to assign to each new point the height of the closest visible convex hull vertex. Be careful with extrapolation: triangles at the convex hull tend to have unstable slopes, see the large triangles in front of the terrain image below. Their projection onto the xy plane has almost zero area, but due to the height differences they are large and almost perpendicular to the xy plane.
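A rough MATLAB sketch of this recipe (all variable names are assumptions): column vectors x, y, z hold the data points, and xmin, xmax, ymin, ymax are the box limits. The projected points keep the height of the hull vertex they came from, and each corner takes the height of its nearest hull vertex.
k  = convhull(x, y);  k = k(1:end-1);        % hull vertex indices (drop the repeated first vertex)
hx = x(k);  hy = y(k);  hz = z(k);
px = hx;  py = hy;                           % project each hull vertex onto its nearest box edge
[~, side] = min([hx - xmin, xmax - hx, hy - ymin, ymax - hy], [], 2);
px(side == 1) = xmin;   px(side == 2) = xmax;
py(side == 3) = ymin;   py(side == 4) = ymax;
cx = [xmin; xmax; xmax; xmin];   cy = [ymin; ymin; ymax; ymax];
cz = zeros(4, 1);
for i = 1:4
    [~, j] = min(hypot(hx - cx(i), hy - cy(i)));
    cz(i) = hz(j);                           % corner height = height of nearest hull vertex
end
X = [x; px; cx];   Y = [y; py; cy];   Z = [z; hz; cz];   % duplicates, if any, would need removing
tri = delaunay(X, Y);                        % retriangulate everything
trisurf(tri, X, Y, Z);                       % extended surface out to the box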
I've had some luck with the following approach:
Find the segment on the convex hull that is closest to the extrapolation point.
If I can drop a perpendicular onto that segment, interpolate between the two vertices of the segment.
If I cannot construct the perpendicular, just use the closest vertex.
This approach results in a continuous surface, but it does not provide first-derivative continuity (a minimal sketch is given below).
You can find some code that might be helpful at TriangularFacetInterpolator.java. Look for the interpolateWithExteriorSupport method.
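For comparison, here is a minimal MATLAB sketch of the nearest-segment rule above for a single query point; hx, hy, hz (ordered hull vertices and their heights, with the first vertex repeated at the end) and qx, qy (the query point outside the hull) are assumed inputs.
best = inf;  zq = NaN;                            % zq will hold the extrapolated height
for k = 1:numel(hx) - 1
    ax = hx(k);    ay = hy(k);                    % hull segment A-B
    bx = hx(k+1);  by = hy(k+1);
    t = ((qx-ax)*(bx-ax) + (qy-ay)*(by-ay)) / ((bx-ax)^2 + (by-ay)^2);
    if t >= 0 && t <= 1                           % perpendicular foot lands on the segment
        px = ax + t*(bx-ax);   py = ay + t*(by-ay);
        z  = (1-t)*hz(k) + t*hz(k+1);             % interpolate between the two vertices
    elseif t < 0
        px = ax;  py = ay;  z = hz(k);            % otherwise fall back to the closest vertex
    else
        px = bx;  py = by;  z = hz(k+1);
    end
    d = hypot(qx - px, qy - py);
    if d < best
        best = d;  zq = z;                        % keep the value from the closest segment
    end
end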

Matlab circle packing on a sphere

I am trying to pack circles of the same radius on a sphere using MATLAB. I have figured out how to pack circles and spheres in a box or a cube, but I just can't figure out where to start when packing them on the surface of a sphere. I would love any hints or ideas on how to do this.
Wish you all the best of luck
This is a really hard problem; I'm not sure a generic solution exists.
To obtain a perfectly regular distribution of circles, take any regular polyhedron and put a circle at each vertex. The regular polyhedron with the most vertices, the dodecahedron, has 20 of them, so with this method you would be able to fit at most 20 circles on a sphere. If you are willing to have a tiling that is not perfectly regular, you can use shapes such as that of a buckyball (the same pattern as a soccer ball).
Other ways to distribute points evenly around a sphere have been proposed.
A friend of mine proposed a way that allowed him to easily index each point. It was based on the icosahedron (dual of the dodecahedron, it has 20 faces), and tiled each triangular face. See his paper for more details.
That paper has references to some other methods known at that time. I think the most useful one is the one where the sphere is cut by parallel planes into strips of equal width measured along the surface (so the cutting planes themselves are not spaced evenly), and each strip can then be tiled by circles evenly. This leads to one circle at the top, one at the bottom, and rows of circles in between. The distribution is not perfectly even, and the spacing depends on the ratio of the circle size to the sphere size.
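A rough MATLAB sketch of that strip idea, plotting only the circle centres; the sphere radius R and circle radius rc are hypothetical inputs.
R = 1;   rc = 0.15;                        % hypothetical sphere and circle radii
dth = 2*rc / R;                            % angular width of one strip (arc length 2*rc)
th  = dth : dth : pi - dth;                % polar angles of the ring centres
figure;  hold on
plot3(0, 0,  R, 'o');  plot3(0, 0, -R, 'o');          % one circle at each pole
for t = th
    rho = R * sin(t);                      % radius of the ring at polar angle t
    n   = max(1, floor(pi * rho / rc));    % how many circles fit around this ring
    phi = 2*pi*(0:n-1) / n;
    plot3(rho*cos(phi), rho*sin(phi), R*cos(t)*ones(1, n), 'o');   % circle centres on the ring
end
axis equal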

Detecting overlapped elliptical regions in image (MATLAB)

I have multiple plants in a single binary image. How would I identify each leaf in the image, assuming that each leaf is approximately elliptical?
example input: http://i.imgur.com/BwhLVmd.png
I was thinking a good place to start would be finding the tip of each leaf and then getting the center of each plant. Then I could fit curves starting from the tips and going toward the center. I've been looking online and saw something involving a watershed method, but I do not know where to begin with that idea.
You should be aware that these things are tricky to get working robustly - there will always be a failure case.
This said, I think your idea is not bad.
You could start as follows:
Identify the boundary curve of each plant (i.e. pixels with both foreground and background in their neighbourhood).
Compute the centroid of each plant.
Convert each plant boundary to a polar coordinate system, with the centroid as the origin. This amounts to setting up a coordinate system with the distance of each boundary curve point on the Y axis and the angle on the X axis.
In this representation of the boundary curve, try to identify maxima; these are the tips of the leaves. You will probably need to do some smoothing. Use the parts of the curve before and after each maximum to start fitting your ellipses or some other shape (see the sketch below).
Generally, a polar coordinate system is always useful for analysing stuff that's roughly circular.
To fit your ellipses, once you have a rough initial position, I would probably try an EM-style approach.
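A rough MATLAB sketch of the boundary/centroid/polar steps above for a single plant; BW is assumed to be a binary mask containing just one plant, and the smoothing window and peak picking (findpeaks needs the Signal Processing Toolbox) are placeholders you would have to tune.
B = bwboundaries(BW);                    % boundary pixels of the plant
b = B{1};                                % [row, col] along the outer boundary curve
s = regionprops(BW, 'Centroid');
c = s(1).Centroid;                       % centroid, [x, y]
[theta, r] = cart2pol(b(:,2) - c(1), b(:,1) - c(2));   % boundary in polar coordinates
[theta, k] = sort(theta);                % order boundary points by angle
r = smoothdata(r(k), 'gaussian', 15);    % smooth the radius signal (window is a guess)
[~, tips] = findpeaks(r);                % maxima of r(theta) ~ leaf tips
plot(theta, r);  hold on
plot(theta(tips), r(tips), 'rv');        % mark the detected tips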
I would do something like this (I is your binary image):
I = bwmorph(bwmorph(I, 'bridge'), 'clean');   % bridge small gaps, remove isolated pixels
SK = bwmorph(I, 'skel', Inf);                 % skeletonise each plant
endpts = bwmorph(SK, 'endpoints');            % skeleton end points ~ leaf tips
props = regionprops(I, 'All');                % per-plant properties (centroids etc.)
Then connect every segment from the centroids listed in props.Centroid to the elements of endpts; that should give you your leaves (petals?).
A bit of filtering is probably necessary; bwmorph is your friend. Have fun!

How to find normals to an edge in an image

I am doing some work related to eye images.
I did edge detection on it. The edge is like a curve and is not continuous. I have to assume it is continuous and find normals to that curve. How do I find the normals using MATLAB?
You can see the image below.
I want to find the normals to the upper curve.
I hope that I was clear enough.
Even though it seems unintuitive, the edge direction reported at every pixel (the gradient direction) is a pretty good estimate of the normal. This would be the simplest solution, because it doesn't involve any curve fitting.
In MATLAB, you can find pixel-wise edge directions using the Sobel filter:
[BW, thresh, gv, gh] = edge(I, 'sobel');   % gv, gh are the vertical and horizontal gradient components
edgeDir = atan2(gv, gh);                   % gradient direction, i.e. the normal, at every pixel
This gives you the directions as angles in radians.
You may want to consider fitting a curve (MSE-based or with some other criterion) to the data. I believe a second-order polynomial will do well for the upper curve, and once you have a model you can calculate the tangent and normal at each point.
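A minimal sketch of that idea, assuming xe and ye hold the pixel coordinates of the upper edge curve:
p  = polyfit(xe, ye, 2);            % second-order fit y = p(1)*x^2 + p(2)*x + p(3)
dp = polyder(p);                    % derivative of the fitted polynomial
x0 = linspace(min(xe), max(xe), 20);
y0 = polyval(p, x0);                % sample points on the fitted curve
m  = polyval(dp, x0);               % tangent slope at those points
nx = -m ./ hypot(m, 1);             % unit normal, perpendicular to the tangent (1, m)
ny =  1 ./ hypot(m, 1);
quiver(x0, y0, nx, ny);             % draw the normals along the curve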
As Zaphod recommended, the normal is perpendicular to the edge. You don't need to do curve fitting; you can use back projection to identify the focal point of the curve.
Start at each edge point along the curve and draw a line from the curve in the direction of the normal. Draw the line by incrementing the value of each pixel the line passes through. Once you do this for all the edges, you would hope to find two pixels with higher values than the rest, one for each of your curves. You should then know, by their locations, which is the focal point of each curve.
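A rough sketch of that voting scheme, reusing edgeDir from the snippet above; xe, ye (integer edge pixel coordinates), the march length of 100 pixels, and marching in a single direction are all assumptions to adjust.
acc = zeros(size(I,1), size(I,2));              % accumulator image
for k = 1:numel(xe)
    a = edgeDir(ye(k), xe(k));                  % normal direction at this edge pixel
    for t = 1:100                               % march along the normal (may also need the opposite direction)
        xi = round(xe(k) + t*cos(a));
        yi = round(ye(k) + t*sin(a));
        if xi >= 1 && xi <= size(I,2) && yi >= 1 && yi <= size(I,1)
            acc(yi, xi) = acc(yi, xi) + 1;      % increment every pixel the line crosses
        end
    end
end
[~, idx] = max(acc(:));
[cy, cx] = ind2sub(size(acc), idx);             % strongest cell ~ focal point of one curve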