Backprojection from projection matrix using MATLAB

I have a 256x256 projection matrix. Each row is a projection taken at equally spaced angles. I need to reconstruct the original image with backprojection using MATLAB, and I am not really familiar with MATLAB. Can you suggest any code samples or algorithms? I've found some similar code, but I couldn't reconstruct the original image using it.

This should be relatively simple with the iradon command, if you have the Image Processing Toolbox. If you don't, it will be a bit tougher because you need to roll your own version of that. Apparently you can't use this, but for what it's worth I get an image if I use:
I = iradon(Pteta', linspace(0,179,size(Pteta,1)));
So, how can you do this yourself? I'll try to help you along the way without giving you the answer - this is homework after all!
First, think about your 0-degree projection. Imagine the axis you're projecting onto has units 1 to 256. Now imagine backprojecting these coordinates across your image; it would look something like this:
Similarly, a 90-degree projection would look like this:
Cool, we can get these matrices by using [X, Y] = meshgrid(1:256);, but what about off-axis projections? Just think of the distance along some angled line, like converting polar/Cartesian coordinates:
theta = 45; % projection angle in degrees
t = X*cosd(theta) + Y*sind(theta);
And it works!
There is a problem, though! Notice the values now go above 350? The image is also somewhat off-center. The coordinates now exceed the length of our projections because the diagonal of a square is longer than its side. I'll leave it to you to figure out how to resolve this, but note that the final image will be smaller than the projections are long, and you may need to use different units (-127 to 128 instead of 1 to 256).
Now you can just index your projections at those positions to backproject the actual values across the image. Here we have a second problem, though, because the values are not integers! We could just round them (this is called nearest-neighbor interpolation), but it doesn't give the best results.
proj = Pteta(angle, :);   % the row of Pteta corresponding to this projection angle (theta, in degrees)
% add projection filtering here for filtered backprojection
t = X*cosd(theta) + Y*sind(theta);
% do some rounding/interpolating to make t all integer indices into proj
imagesc(proj(t));
For our off-center version, that gives us this image, or something similar:
Now you just need to do this for every angle, and add them all up.
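If you want a reference point for that final step, a rough unfiltered backprojection loop might look like the sketch below. It assumes the rows of Pteta cover 0-179 degrees in equal steps (an assumption about your data) and uses simple rounding plus clamping rather than proper interpolation:
nAngles = size(Pteta, 1);
thetas  = linspace(0, 180, nAngles + 1);  thetas(end) = [];  % assumed angular sampling
[X, Y]  = meshgrid(-127:128);                                % centred pixel coordinates
img     = zeros(256);
for k = 1:nAngles
    proj = Pteta(k, :);                               % (filter here for filtered backprojection)
    t    = X*cosd(thetas(k)) + Y*sind(thetas(k));     % distance along the projection axis
    idx  = min(max(round(t) + 128, 1), 256);          % round, shift to 1..256, clamp to the detector
    img  = img + proj(idx);                           % smear this projection across the image
end
imagesc(img); axis image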

Related

matlab scatter plot time series

I would like to look at two-dimensional data as a time series. The first idea I had was to use a scatter plot, where you can easily explore it timepoint-to-timepoint. Is there a function I could use for this? I looked at scatter3, but it can only plot perfectly cubic data, not data like below:
e.g.
data=rand(5,5,3);
scatter3(data(1,:,:),data(:,1,:),data(:,:,1)) %throws an error
thanks
edit: Originally I had something like >this< in mind
scatter3 seems to be for 3D plots, but you say your data is 2D.
For a simple time-series graph you could presumably even just use plot:
figure
nPoints = 25;
dataX = 1:nPoints;
dataY = rand(1,nPoints);
plot(dataX,dataY, 'o-')
However, the example you give in your link looks like something else, so it seems like scatter (rather than scatter3) might be what you're after. Maybe something like this?
figure
nPoints = 25;
dataX = 1:nPoints;
dataY = rand(1,nPoints);
dataArea = rand(1,nPoints)*100;
dataColours = rand(nPoints,3);
scatter(dataX,dataY,dataArea,dataColours)
EDIT:
I think I understand better what you mean now (sorry, I didn't see the buttons at the bottom of the link), but correct me if I'm wrong. So you have a set of XY coordinates for multiple objects at different points in time, and ideally you want to plot how the XY coordinates of each object (in two dimensions) change over time (a third dimension). Your initial approach with scatter3 was to try to make a simple 3D graph, but ideally you want a 2D graph that is either animated or interactive, so that the time point being displayed can be changed?
Going back to your original question, I think the issue with your attempt to use scatter3 (plot3 might be useful too) is that I'm not sure what your dummy data represents. You created data as a 5x5x3 matrix, and I assume that might represent 25 data points at 3 different time intervals? However, which values would be the X and which the Y coordinates? It would work with something like the following, where each variable holds the X/Y/Z coordinates of 6 objects (columns) at 5 different time points (rows):
myX = rand(5,6,1);
myY = rand(5,6,1);
% I'm making each time point increase linearly.
myZ = repmat((1:size(myX,1))', 1, size(myX,2));
plot3(myX, myY, myZ, 'o-')
grid on
Note I find the default dimensions it treats as X, Y and Z unintuitive (e.g. the Z dimension is the vertical dimension), but one could swap the input arguments around to overcome this.
However, especially if you have a lot of points, I'm not sure how clear a graph like this will be, especially compared to the example in your link.
Instead it seems like you ideally want the XY coordinates of all objects to be plotted for only one time point at a time, plus a way to cycle through the time points sequentially. This seems trickier, and maybe someone else will be able to answer better than I have. A couple more questions, though, that might be useful:
How much do you care about the smoothness of the transition? In the example link, the circles move smoothly from one position to another rather than just jumping/teleporting between points.
Ideally, do you want a function that produces an 'animation', cycling through all the time points from beginning to end, or a way of manually specifying/changing which time point is displayed? If the former, maybe this function would be useful (though I haven't tried it myself yet): https://uk.mathworks.com/matlabcentral/fileexchange/42305-multicomet. A rough sketch of a basic animation loop follows below.
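For the animated option, here is a very rough sketch of the idea, assuming a hypothetical coords array laid out as coords(timePoint, object, 1:2); adapt the indexing to your own data:
nTime = 5;  nObj = 6;
coords = rand(nTime, nObj, 2);                      % dummy XY data
figure
h = scatter(coords(1, :, 1), coords(1, :, 2), 100, 'filled');
axis([0 1 0 1])
for t = 2:nTime
    set(h, 'XData', coords(t, :, 1), 'YData', coords(t, :, 2));  % move the existing points
    title(sprintf('Time point %d', t))
    drawnow
    pause(0.5)                                      % crude control of the animation speed
end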

Rotating a template to match an edge

(I use MATLAB R2015a.) I have a plot of the edge points of an object (obtained using edge detection), and I have a plot of the template of the object. I want to rotate the template until it matches the detected edge points. (Figure link included: solid blue is the template, red dots are the edge-detected points; the rotation is subtle, but it's there.)
I plan to rotate the template in a loop about the centroid through different values of theta (which I know how to do), and have the code stop executing when the template matches the edge (which is what I want to know how to do), returning the corresponding theta.
The number of points making up the template and the number making up the edges are not the same, so splitting the plots into 3 lines and 1 (half) ellipse and directly comparing them does not work.
Using regionprops 'orientation' does not give the expected result for each frame because of the way the edges are being detected in each frame. (I can elaborate more on this if required)
I have intentionally plotted points using plot, rather than keeping the edge as a BW image because, otherwise, I'm having to round off indices while creating the template, and for my application, I cannot afford to lose precision like that.
I'm not lazy, I don't want somebody to just code it up for me. None of my ideas worked and I'm unable to think any differently, so perhaps somebody with a fresh mind and more experience in Matlab will have some idea.
Assuming both the image and the template are bitmaps (i.e. the template isn't given as 3 lines + half of an ellipse), and you know the position and size of the template, just not the angle:
For each edge point, find the closest point on the template and sum all of these distances, or perhaps the sum of the square roots of the distances, or some other metric that weights many slightly-off points more heavily than a few gross outliers: many points will be slightly wrong if the rotation is slightly wrong, while the few points that are completely wrong are noise to ignore. So, something like:
minDist  = inf;
minAngle = NaN;
for t = 1 : length(thetas)
    templatePoints = ...; % calculate template points rotated by thetas(t) about the centroid
    currentDist = 0;
    for i = 1 : size(edges, 1)
        edge = edges(i, :);          % assuming one (x, y) edge point per row
        mind = inf;                  % distance from this edge point to the closest template point
        for j = 1 : size(templatePoints, 1)
            d = sqrt(sum((edge - templatePoints(j, :)).^2));
            if (d < mind)
                mind = d;
            end
        end
        currentDist = currentDist + mind;   % accumulate the closest-point distances
    end
    if (currentDist < minDist)
        minDist  = currentDist;
        minAngle = thetas(t);
    end
end
When you have completed the full circle of template rotations, select the rotation with the lowest total distance.
This procedure might be a bit problematic, because the template might be rotated by 10.5 degrees while you are stepping in increments of 1 degree, so you will not end up with the truly optimal angle. Plus you will try tons of completely wrong angles, slowing you down.
But to find the optimal angle you can change the angle step: say, first try every 10 degrees, then every 1 degree around the minimum, then every 0.1 degrees, etc. Or use an optimization method. Gradient descent should work fine if you don't have too much noise; use simulated annealing for noisy images. Use the same code as above; currentDist is a good enough optimization objective, and it should be as close to 0 as possible.
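As a rough sketch of that coarse-to-fine search, assuming a hypothetical scoreRotation(theta) helper that computes currentDist for a given angle:
bestTheta = 0;
step  = 10;                      % start with a coarse 10-degree step
range = 180;                     % search the full circle of angles first
while step >= 0.1
    thetas = bestTheta - range : step : bestTheta + range;
    scores = arrayfun(@scoreRotation, thetas);     % scoreRotation is your distance metric
    [~, k] = min(scores);
    bestTheta = thetas(k);
    range = step;                % narrow the window around the current best angle
    step  = step / 10;           % refine: 10 deg -> 1 deg -> 0.1 deg
end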
If the template size or position is unknown too, you should definitely use simulated annealing; gradient descent will almost surely get stuck in a local minimum. Use code similar to the above to calculate the difference between the edges and the template, and pass all of the unknown parameters to the method.
One option is to render an image of your result for each rotation, but do not make it binary; make it grayscale. There are ways of doing this so that the "maximum interpolated edge" lies where your continuous function is (such as Xiaolin Wu's line algorithm, for example).
As you are matching an image, you are not losing precision, but rather bringing the precision of your target down to the same level as your image.
Once you have this, for each rotation, use a metric to evaluate how well the two grayscale (yes, not binary) images match, e.g. the correlation coefficient, mutual information, the universal quality index (UQI), etc.
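A minimal sketch of that comparison loop, assuming a hypothetical grayTemplate(theta) helper that returns the grayscale rendering of the rotated template, a grayscale rendering grayEdges of the detected edges, and corr2 from the Image Processing Toolbox:
scores = zeros(size(thetas));
for k = 1:numel(thetas)
    T = grayTemplate(thetas(k));        % grayscale rendering of the template at this angle
    scores(k) = corr2(grayEdges, T);    % correlation coefficient between the two images
end
[~, best] = max(scores);
bestTheta = thetas(best);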

Align 2 sets of 3D points in Unity by using Singular Value Decomposition method (SVD)

I want to translate a set of reference points on a contour to a set of corresponding target points. There are 8 points in total on each contour.
In order to calculate the rotation and translation vector, I was using the Math.NET Numerics library to perform the SVD calculation. The idea came from this URL (pages 3-7):
But somehow I noticed that the transformation done using the result of the SVD calculation seems inaccurate. The result is shown below:
The transform is supposed to move the reference points as close as possible to the target points, but as highlighted, it moves far away from the target point.
In addition, I also did a simple test whereby I calculated the centroid of each contour and took the difference (TargetCentroid - RefCentroid = translation vector). The final transformation result is the same as when going through the SVD.
Did I do something wrong? Can anyone suggest a better solution to transform the reference points to the target points?
Edit:
1. Garment transformation from reference model to various target models
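For reference, the standard SVD-based rigid alignment (the Kabsch/Procrustes procedure that such notes usually describe) goes roughly like the sketch below, written here in MATLAB-style code since the exact Math.NET calls are not shown; refPts and tgtPts are illustrative names for N-by-3 matrices of corresponding points:
refC = mean(refPts, 1);                  % centroids of both point sets
tgtC = mean(tgtPts, 1);
H = (refPts - refC)' * (tgtPts - tgtC);  % 3x3 cross-covariance matrix
[U, ~, V] = svd(H);
R = V * U';                              % rotation
if det(R) < 0                            % guard against a reflection
    V(:, 3) = -V(:, 3);
    R = V * U';
end
t = tgtC' - R * refC';                   % translation
aligned = (R * refPts' + t)';            % reference points mapped onto the target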
This seems like an overcomplicated solution to the problem.
If you have the target points, you can just Lerp the given points to their corresponding target points.
Or, if the target is the same mesh but at a different scale and rotation as in the picture, you can just Lerp the transform values, scale and rotation respectively, without the need to go over all the points individually.
Using Vector3.Lerp
Edit:
Additionally, lerping will cause all the points to reach their targets at the same time, which is, in most cases, the desired behavior.

Plot normal vector field 3D

I am implementing the algorithm for Photometric Stereo where I have already calculated the normals from a set of images with different light directions.
How can I plot the normal vector field in matlab? I have a matrix of normals of size (N x 3).
I'm afraid you have left out a step. You need to retrieve the depth map from the surface normals, and then you can start plotting. To see how to do this, you can check out section 4 of the following paper:
http://www.wisdom.weizmann.ac.il/~vision/photostereo/Photometric%20Stereo%20with%20General%20Unknown%20Lighting%20-%20BasriJacobsKemelmacher_ijcv06.pdf
There are other resources on the web too; I don't know of any built-in function in any Matlab library, but I don't have the Computer Vision toolbox, so who knows?
I suspect you are looking for quiver3.
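If the normals correspond to an H-by-W image grid (so N = H*W), a minimal quiver3 sketch could look like this; normals, H and W are assumed names/dimensions, and the reshape must match however your rows are ordered:
[X, Y] = meshgrid(1:W, 1:H);
U  = reshape(normals(:, 1), H, W);   % x-components of the normals
V  = reshape(normals(:, 2), H, W);   % y-components
Wz = reshape(normals(:, 3), H, W);   % z-components
quiver3(X, Y, zeros(H, W), U, V, Wz)
axis equal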
You need to present the normals field as a gradient field; then you can use MATLAB's quiver function. In the gradient field, the previously normalized triple {pn, qn, rn} of the data is presented in such a way as to render its third component always equal to one (at least in theory).
I mean that with rn = 1, or rather with R = 1, you actually need only the {P, Q} components to display the contents of the gradient field with the ordinary 2D quiver function. Thus the gradient vector is quite different and distinct from the normals field, because, pointwise:
P = pn/sqrt(pn^2 + qn^2 + rn^2), and Q = qn/sqrt(pn^2 + qn^2 + rn^2).
However, you don't need double for loops running over the X and Y directions, because the pointwise calculation of the gradient field from the normals can be written directly as:
P = pn./(pn.^2 + qn.^2 + rn.^2).^(1/2); and Q = qn./(pn.^2 + qn.^2 + rn.^2).^(1/2);
You can see as well:
http://www.mathworks.com/matlabcentral/fileexchange/authors/126090/
Briefly, the gradient field always represents the slopes in the X and Y directions while descending exactly one height unit along the Z axis of the 3D surface retrieved with, for instance, a Photometric Stereo algorithm. That is why the third component in the quiver visualization is always equal to one (i.e. R = 1) and practically irrelevant.
Last month I posted some code for the simplest Photometric Stereo methods on the MathWorks pages, once I had some time available to tidy up my own MATLAB code.

MATLAB Quiver - Tiny arrows

I am trying to plot x and y velocities using quiver function in MATLAB.
I have x, y, u and v arrays (with their usual meanings), each with dimension 100x100.
As a result, my quiver plot is so dense that I cannot see the arrows unless I zoom in.
Somewhat like this: quiver not drawing arrows just lots of blue, matlab
Take a look at my plot:
Is there any way to make the quiver plot less dense (and with bigger arrows)? I am planning to clip the x-axis range to 0-4, but is there anything I can do apart from that?
I cannot make my mesh less dense because of accuracy concerns. I am, however, willing to ignore some fine data points if that's required to make the plot look better.
You can plot a reduced number of arrows by plotting, for example (assuming your data are in arrays):
quiver(x(1:2:end,1:2:end),y(1:2:end,1:2:end),u(1:2:end,1:2:end),v(1:2:end,1:2:end))
where the 2 in this example means we plot only a quarter as many arrows. You can of course change it, as long as you change all of the 2's so that the arrays are all appropriately sized.
If you want to change the length of the arrows there are two options. Firstly, you can use the scale argument, e.g. quiver(x,y,u,v,2), to scale the arrows by the amount specified; or you can normalise the velocities if you want all the arrows to have the same length. You do lose information doing that, because you can no longer compare the magnitude of the velocity by looking at the arrows, but it may be useful in some situations. You can do this by dividing u and v both by sqrt(u.^2+v.^2) at the points you wish to plot arrows at.
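For example, a rough sketch of that normalisation (it assumes u and v have already been subsampled as above, and the final 0.5 is just an arbitrary scale factor):
mag = sqrt(u.^2 + v.^2);
mag(mag == 0) = 1;                  % avoid dividing by zero where the velocity vanishes
quiver(x, y, u./mag, v./mag, 0.5)   % all arrows now have the same length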
Hope that helps and sets everything out nicely.
You need to make your interval value a bit larger in order to make your matrix more sparse.
This is very dense:
1:0.0001:100
This is very sparse:
1:1:100
EDIT:
If you have the Image Processing Toolbox you can use the imresize function to reduce the matrix resolution:
newMat = imresize(oldMat, newSize);
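For instance, something like this might do for the quiver plot in question (a sketch; newSize is an assumed target resolution for the 100x100 grids):
newSize = [25 25];                  % assumed reduced resolution
quiver(imresize(x, newSize), imresize(y, newSize), ...
       imresize(u, newSize), imresize(v, newSize))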
And if you don't have the toolbox, then you can resize in a similar manner to this example, using interp2 interpolation:
orgY = 1:size(oldMat,1);
orgX = 1:size(oldMat,2);
[orgX, orgY] = meshgrid(orgX, orgY);
newY = linspace(1, size(oldMat,1), newHeight);
newX = linspace(1, size(oldMat,2), newWidth);
[newX, newY] = meshgrid(newX, newY);
newMat = interp2(orgX, orgY, oldMat, newX, newY);
And thanks to @David, if you want to just strip out some individual points you can simply do:
xPlot=x(1:2:end)