Converting 3D point clouds to range image - matlab

I have many 3D point clouds gathered by a Velodyne sensor, e.g. (x, y, z) in meters.
I'd like to convert these 3D point clouds to range images.
First, I have the transformation from Cartesian to spherical coordinates:
r = sqrt(x*x + y*y + z*z)
azimuth angle = atan2(x, z)
elevation angle = asin(y/r)
Now, how can I convert the 3D points to a range image using this transformation in MATLAB?
There are about 180,000 points in total, and I want an 870*64 range image.
The azimuth angle range is (-180° to 180°) and the elevation angle range is (-15° to 15°).

Divide up your azimuth and elevation into M and N ranges respectively. Now you have M*N "bins" (M = 870, N = 64).
Then (per bin) accumulate a histogram of points that project into that bin.
Finally, pick a representative value from each bin for the final range image. You could pick the average value (noisy, fast) or fit some distribution and then use that to pick the value (more precise, slow).
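A minimal MATLAB sketch of this binning (assuming x, y and z are column vectors of the point coordinates in meters, and using the asker's angle conventions from the question) could look like this:
M = 870;                                % azimuth bins
N = 64;                                 % elevation bins
r  = sqrt(x.^2 + y.^2 + z.^2);          % range per point
az = atan2d(x, z);                      % azimuth in degrees, (-180, 180]
el = asind(y ./ r);                     % elevation in degrees, roughly -15..15

% Map each angle to a bin index (clamped to the valid range).
ci = min(max(ceil((az + 180) / 360 * M), 1), M);
ri = min(max(ceil((el +  15) /  30 * N), 1), N);

% Representative value per bin: here the mean range; empty bins stay 0.
rangeImage = accumarray([ri, ci], r, [N, M], @mean, 0);
Using @mean in accumarray is the fast-but-noisy option; swapping in another accumulation function gives a more robust per-bin statistic.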

The pointcloud2image code available on the MATLAB File Exchange can help you convert a point cloud (in x,y,z format) directly to a 2D raster image.

Related

Fastest way to compute distance between a point with an orientation to the boundaries of a region

I'm facing the following problem: I'm tracking a pixel that moves on a map. It has a position and an orientation. If its orientation is theta, I have the following angles at which to calculate the distance: [theta-90, theta, theta+90, theta+180], as the image shows for theta = 0 or theta = pi*n...
I calculated it in two ways:
x = [x, y, theta];
res = size(map);
v1 = 1:res(1);
v2 = 1:res(2);
% candidate points stepped outward from the pixel in direction theta (degrees)
a_x = floor(x(1) + v1.*cos(x(3)*pi/180));
a_y = floor(x(2) + v2.*sin(x(3)*pi/180));
a = [a_x' a_y'];
First, using the boundaries of the binary map:
p=intersect(a,boundaries,'rows');
and second, using the binary map itself:
v=zeros(res(1),res(2));
a_x=a_x(a_x>=1 & a_x<=res(1));
a_y=a_y(a_y>=1 & a_y<=res(2));
a_x=a_x(1:min(length(a_y),length(a_x)));
a_y=a_y(1:min(length(a_y),length(a_x)));
v(sub2ind(size(v),a_x,a_y))=1;
p=find(v&map==1);
Then I calculated the shortest distance between the point and the boundaries found by each of the two methods:
x=repmat(x(1:2)',size(p,1),1);
d=min(sqrt((p(:,1)-x(:,1)).^2+(p(:,2)-x(:,2)).^2));
I'm doing this for a large image (1920x1080), and I repeat it for many points with different orientations (2000 points), so it is slow. I would like to know if there is any alternative to compute this faster.
Thank you!

Calculating a spiral in MATLAB

We have logarithmic spirals circling around the centre of the coordinate system:
x = e^(b*θ) * cos(θ)
y = e^(b*θ) * sin(θ)
where e^(b*θ) is the distance between a point on the spiral and the centre, and θ is the angle between the x-axis and the line connecting that point to the origin.
Consider a spiral where the angle is θ ∈ [0, 10π] and the parameter is b = 0.1. By sampling points on the spiral (i.e. the angle θ) densely enough, calculate the circumference with a relative precision better than 1%. Draw the spiral!
I'm preparing for a (MATLAB) test and I'm stuck with this exercise. Please help, any hint is appreciated.
Start by computing a list of x,y values for your range of theta and your value of b. For more accurate results, have your theta increment in smaller steps (I chose 5000 samples arbitrarily). Then it's simply a matter of computing the distance between each pair of consecutive points and summing them up.
t = linspace(0,10*pi,5000);
b = 0.1;
x = exp(b*t).*cos(t);
y = exp(b*t).*sin(t);
result = sum(sqrt((x(2:end) - x(1:end-1)).^2 + (y(2:end)-y(1:end-1)).^2))
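If you also want to demonstrate that the 1% relative-precision requirement is met (a sketch, not part of the original answer), one cheap approach is to keep doubling the number of samples until the arc-length estimate changes by less than 1%:
b = 0.1;
n = 100;                          % start with a coarse sampling
prev = -inf;
while true
    t = linspace(0, 10*pi, n);
    x = exp(b*t).*cos(t);
    y = exp(b*t).*sin(t);
    curr = sum(sqrt(diff(x).^2 + diff(y).^2));
    if abs(curr - prev)/curr < 0.01   % relative change below 1%
        break
    end
    prev = curr;
    n = 2*n;                      % refine the sampling and try again
end
fprintf('circumference ~ %.4f with %d samples\n', curr, n);
plot(x, y); axis equal;           % draw the spiral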

convert pixel coordinates to map coordinates

I have an image A of dimension p x q. I know the UTM coordinates of A(1,1) and A(p,q) and the pixel size in meters.
How can I convert pixel coordinates to map coordinates in MATLAB?
% UTM coordinates of the pixel centres; pixel (1,1) lands on the known corner
Xsize = (0:p-1)*PixelSizeInMeter + UTM_x_onA11;
Ysize = (0:q-1)*PixelSizeInMeter + UTM_y_onA11;
figure;
surface(Xsize, Ysize, A.');   % transpose so the p-by-q image matches the axes
Now you can plot your map using Xsize and Ysize. Since UTM is a Cartesian grid, life's quite easy: generate the correct number of elements, multiply by the grid size, and add the corner's coordinates to shift the plot to the correct location.
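The same idea gives a small helper for converting an arbitrary pixel (i, j) to UTM coordinates (an assumption here: the first index runs along UTM x and the second along UTM y, consistent with the code above):
pixel2utm = @(i, j) [UTM_x_onA11 + (i-1)*PixelSizeInMeter, ...
                     UTM_y_onA11 + (j-1)*PixelSizeInMeter];
corner = pixel2utm(1, 1);   % recovers the known UTM coordinates of A(1,1)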

Approximating relative angle between two line segments on sphere surface

I am in need of an idea! I want to model the vascular network on the eye in 3D. I have made statistics on the branching behaviour in relation to vessel diameter, length etc. What I am stuck at right now is the visualization:
The eye is approximated as a sphere E with center at the origin C = [0, 0, 0] and a radius r.
What I want to achieve is that based on the following input parameters, it should be able to draw a segment on the surface/perimeter of E:
Input:
Cartesian position of previous segment ending: P_0 = [x_0, y_0, z_0]
Segment length: L
Segment diameter: d
Desired angle relative to the previous segment: a (1)
Output:
Cartesian position of resulting segment ending: P_1 = [x_1, y_1, z_1]
What I do now, is the following:
From P_0, generate a sphere with radius L, representing all the points we could possibly draw to with the correct length. This set is called pool.
Limit pool to only include points with a distance to C between r*0.95 and r, so only the points around the perimeter of the eye are included.
Select only the point that would generate a relative angle (2) closest to the desired angle a.
The problem is that whatever angle a I desire is not what is actually measured by the dot product. Say I want an angle of 0 (i.e. the new segment follows the same direction as the previous one); what I actually get is an angle of around 30 degrees because of the curvature of the sphere. I guess what I want is more like the 2D angle seen when looking at the branching point from a direction orthogonal to the sphere. Please take a look at the screenshots below for a visualization.
Any ideas?
(1) The reason for this is that the child node with the greatest diameter usually follows the path of the previous segment, whereas smaller child nodes tend to branch off at different angles.
(2) Calculated by acos(dot(v1/norm(v1), v2/norm(v2)))
Screenshots explaining the problem:
Yellow line: previous segment
Red line: "new" segment to one of the points (not necessarily the correct one)
Blue x'es: Pool (text=angle in radians)
I will restate the problem with my own notation:
Given two points P and Q on the surface of a sphere centered at C with radius r, find a new point T such that the angle of the turn from PQ to QT is A and the length of QT is L.
Because the segments are small in relation to the sphere, we will use a locally-planar approximation of the sphere at the pivot point Q. (If this isn't an okay assumption, you need to be more explicit in your question.)
You can then compute T as follows.
// First compute an aligned orthonormal basis {U,V,W}.
// - {U,V} should be a basis for the plane tangent at Q.
// - W should be normal to the plane tangent at Q.
// - U should be in the direction PQ in the plane tangent at Q
W = normalize(Q - C)
U = normalize(Q - P)
U = normalize(U - W * dotprod(W, U))
V = normalize(crossprod(W, U))
// Next compute the next point S in the plane tangent at Q.
// In a regular plane, the parametric equation of a unit circle
// centered at the origin is:
// f(A) = (cos A, sin A) = (1,0) cos A + (0,1) sin A
// We just do the same thing, but with the {U,V} basis instead
// of the standard basis {(1,0),(0,1)}.
S = Q + L * (U cos A + V sin A)
// Finally project S onto the sphere, obtaining the segment QT.
T = C + r * normalize(S - C)
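A MATLAB transcription of this recipe could look as follows (a sketch; C, r, P, Q, the length L and the turn angle A in radians are assumed to be defined as above, with C, P, Q as 1x3 vectors):
normalize = @(v) v / norm(v);

W = normalize(Q - C);                 % normal of the tangent plane at Q
U = normalize(Q - P);                 % direction of the previous segment
U = normalize(U - W * dot(W, U));     % projected into the tangent plane at Q
V = cross(W, U);                      % already unit length: W and U are orthonormal

S = Q + L * (U*cos(A) + V*sin(A));    % step of length L in the tangent plane
T = C + r * normalize(S - C);         % project back onto the sphere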

How to interpolate ECEF coordinates on an WGS84 ellipsoid

Is there a direct method (not involving converting the coordinates to lat/lon) to interpolate between two ECEF coordinates (x, y, z) so that the interpolated point lies on the WGS84 ellipsoid? The original two points are computed from geodetic coordinates.
Interpolating on a sphere seems obvious, but I can't seem to derive a solution for the ellipsoid.
Thank you in advance.
Let's assume you have two points, p0(x,y,z) and p1(x,y,z), and want to interpolate some p(t), where t ∈ [0.0, 1.0], between the two.
You can:
Rescale your ellipsoid to a sphere
Simply like this:
const double mz=6378137.00000/6356752.31414; // equatorial/polar Earth radius ratio (dimensionless)
p0.z*=mz;
p1.z*=mz;
Now you have Cartesian coordinates referring to a spherical Earth model.
Interpolate
A simple linear interpolation would do:
p(t) = p0 + (p1 - p0)*t
but of course you also need to normalize to the Earth's curvature, so:
r0 = |p0|
r1 = |p1|
p(t) = p0 + (p1 - p0)*t
r(t) = r0 + (r1 - r0)*t
p(t) *= r(t)/|p(t)|
where |p0| means the length of the vector p0.
Rescale back to the ellipsoid
By dividing by the same value:
p(t).z /= mz
This is simple and cheap, but the interpolated path will not have a linear time scale.
Here is a C++ example:
void XYZ_interpolate(double *pt, double *p0, double *p1, double t)
{
    const double  mz = 6378137.00000/6356752.31414;   // equatorial/polar radius ratio
    const double _mz = 6356752.31414/6378137.00000;   // inverse ratio
    double p[3], r, r0, r1;
    // compute the spherical radii of the input points (z rescaled to the sphere)
    r0 = sqrt((p0[0]*p0[0])+(p0[1]*p0[1])+(p0[2]*p0[2]*mz*mz));
    r1 = sqrt((p1[0]*p1[0])+(p1[1]*p1[1])+(p1[2]*p1[2]*mz*mz));
    // linear interpolation of radius and position (z rescaled to the sphere)
    r    = r0   +(r1   -r0   )*t;
    p[0] = p0[0]+(p1[0]-p0[0])*t;
    p[1] = p0[1]+(p1[1]-p0[1])*t;
    p[2] =(p0[2]+(p1[2]-p0[2])*t)*mz;
    // correct the radius and rescale z back to the ellipsoid
    r /= sqrt((p[0]*p[0])+(p[1]*p[1])+(p[2]*p[2]));
    pt[0] = p[0]*r;
    pt[1] = p[1]*r;
    pt[2] = p[2]*r*_mz;
}
And a preview: the yellow squares are the input p0, p1 Cartesian coordinates, and the white curve is the interpolated path for t ∈ [0.0, 1.0].
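For completeness, a MATLAB transcription of the same approach (a sketch; assuming p0 and p1 are 1x3 ECEF vectors in meters and t is in [0, 1]):
function pt = xyz_interpolate(p0, p1, t)
    mz = 6378137.00000 / 6356752.31414;    % equatorial/polar radius ratio
    % rescale z so the ellipsoid becomes a sphere
    q0 = [p0(1:2), p0(3)*mz];
    q1 = [p1(1:2), p1(3)*mz];
    % linearly interpolate position and radius, then push back onto the sphere
    r = norm(q0) + (norm(q1) - norm(q0))*t;
    q = q0 + (q1 - q0)*t;
    q = q * (r / norm(q));
    % rescale z back to the ellipsoid
    pt = [q(1:2), q(3)/mz];
end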