I want to rectify an image with perspective distortion. I have the coordinates of the corners, and I also have an algorithm that performs what I need, but it executes really slowly. It uses the 'imtransform' and 'maketform' functions, for which MATLAB now has faster replacements. I tried to swap them in but couldn't get it right. Any help would be appreciated.
Here are the images to make this question clearer:
Input Image with known Coordinates(x,y):
and Desired Output:
This process takes about 2 seconds each time it runs; I need to replace it with the newer MATLAB functions but I couldn't get it to work.
The old algorithm was:
% X and Y hold the clockwise corner coordinates of the input image; x and y hold the corresponding corner coordinates of the desired output
A=zeros(8,8);
A(1,:)=[X(1),Y(1),1,0,0,0,-1*X(1)*x(1),-1*Y(1)*x(1)];
A(2,:)=[0,0,0,X(1),Y(1),1,-1*X(1)*y(1),-1*Y(1)*y(1)];
A(3,:)=[X(2),Y(2),1,0,0,0,-1*X(2)*x(2),-1*Y(2)*x(2)];
A(4,:)=[0,0,0,X(2),Y(2),1,-1*X(2)*y(2),-1*Y(2)*y(2)];
A(5,:)=[X(3),Y(3),1,0,0,0,-1*X(3)*x(3),-1*Y(3)*x(3)];
A(6,:)=[0,0,0,X(3),Y(3),1,-1*X(3)*y(3),-1*Y(3)*y(3)];
A(7,:)=[X(4),Y(4),1,0,0,0,-1*X(4)*x(4),-1*Y(4)*x(4)];
A(8,:)=[0,0,0,X(4),Y(4),1,-1*X(4)*y(4),-1*Y(4)*y(4)];
v=[x(1);y(1);x(2);y(2);x(3);y(3);x(4);y(4)];
u=A\v;
% our transfer function
U=reshape([u;1],3,3)';
w=U*[X';Y';ones(1,4)]; % sanity check: map the source corners through U
w=w./(ones(3,1)*w(3,:)); % normalize the homogeneous coordinate
T=maketform('projective',U');
% apply the transform to rectify (flatten) the image
P2=imtransform(I,T,'XData',[1 n],'YData',[1 m]);
If it helps, here is how I generated the A matrix and the U matrix:
Out Link
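For reference, the rows of A follow the standard direct linear transform construction (this is just what the code above implements): with coefficients $u = (a, b, c, d, e, f, g, h)$, each corner correspondence $(X_i, Y_i) \to (x_i, y_i)$ satisfies

$$x_i = \frac{aX_i + bY_i + c}{gX_i + hY_i + 1}, \qquad y_i = \frac{dX_i + eY_i + f}{gX_i + hY_i + 1},$$

and clearing the denominators gives the two linear rows per corner stacked in A, so u = A\v solves for the eight unknowns and U = [a b c; d e f; g h 1].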
Using the built-in MATLAB functions (fitgeotrans, imref2d, and imwarp), the following code runs in 0.06 seconds on my laptop:
% read the image
im = imread('paper.jpg');
tic
% set the moving points := the original image control points
x = [1380;2183;1282;422];
y = [727;1166;2351;1678];
movingPoints = [x,y];
% set the fixed points := the desired image control points
xfix = [1;1000;1000;1];
yfix = [1;1;1000;1000];
fixedPoints = [xfix,yfix];
% generate geometric transform
tform = fitgeotrans(movingPoints,fixedPoints,'projective');
% generate reference object (full desired image size)
R = imref2d([1000 1000]);
% warp image
outputImage = imwarp(im,tform,'OutputView',R);
toc
% show image
imshow(outputImage);
I want to convert an image from Cartesian to polar coordinates and use it as an OpenGL texture.
So I used MATLAB, referring to the two articles below.
Link 1
Link 2
My code is essentially the same as the answer in Link 2.
% load image
img = imread('my_image.png');
% convert pixel coordinates from cartesian to polar
[h,w,~] = size(img);
[X,Y] = meshgrid((1:w)-floor(w/2), (1:h)-floor(h/2));
[theta,rho] = cart2pol(X, Y);
Z = zeros(size(theta));
% show pixel locations (subsample to get less dense points)
XX = X(1:8:end,1:4:end);
YY = Y(1:8:end,1:4:end);
tt = theta(1:8:end,1:4:end);
rr = rho(1:8:end,1:4:end);
subplot(121), scatter(XX(:),YY(:),3,'filled'), axis ij image
subplot(122), scatter(tt(:),rr(:),3,'filled'), axis ij square tight
% show images
figure
subplot(121), imshow(img), axis on
subplot(122), warp(theta, rho, Z, img), view(2), axis square
The result was exactly what I wanted, and I was very satisfied except for one thing: the area circled in red in the picture just below. Considering that the opposite side (circled in blue) is not empty, I think this part should also be filled. Because this part is empty, there is a problem when using the result as a texture.
And I wonder how I can fill this part. Thank you.
(There are small differences from Link 2's answer code, such as degrees vs. radians and the axis values, but I don't think they matter.)
Those issues you show in your question happen because your algorithm is wrong.
What you did (push):
throw a grid on the source image
transform those points
try to plot these colored points and let MATLAB do some magic to make it look like a dense picture
Do it the other way around (pull):
throw a grid on the output
transform that backwards
sample the input at those points
The distinction is called "push" (into the output) vs. "pull" (from the input). Only pull gives proper results.
Very little MATLAB code is necessary. You just need pol2cart and interp2, and a meshgrid.
With interp2 you get to choose the interpolation (linear, cubic, ...). Nearest-neighbor interpolation leaves visible artefacts.
im = im2single(imread("PQFax.jpg"));
% center of polar map, manually picked
cx = 10 + 409/2;
cy = 7 + 413/2;
% output parameters
radius = 212;
dRho = 1;
dTheta = 2*pi / (2*pi * radius);
Thetas = pi/2 - (0:dTheta:2*pi);
Rhos = (0:dRho:radius);
% polar mesh
[Theta, Rho] = meshgrid(Thetas, Rhos);
% transform...
[Xq,Yq] = pol2cart(Theta, Rho);
% translate to sit on the circle's center
Xq = Xq + cx;
Yq = Yq + cy;
% sample image at those points
Ro = interp2(im(:,:,1), Xq,Yq, "cubic");
Go = interp2(im(:,:,2), Xq,Yq, "cubic");
Bo = interp2(im(:,:,3), Xq,Yq, "cubic");
Vo = cat(3, Ro, Go, Bo);
Vo = imrotate(Vo, 180);
imshow(Vo)
The other way around (get a "torus" from a "ribbon") is quite similar. Throw a meshgrid on the torus space, subtract center, transform from cartesian to polar, and use those to sample from the "ribbon" image into the "torus" image.
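A minimal sketch of that reverse direction, reusing the un-rotated ribbon (Ro/Go/Bo) and the radius/dRho/dTheta values from above; the output size and the zero fill outside the disc are my own choices:
ribbon = cat(3, Ro, Go, Bo);                       % ribbon image: rho along rows, theta along columns
outSz = 2*radius + 1;                              % square output ("torus" space)
[Xo, Yo] = meshgrid(1:outSz, 1:outSz);
[ThetaO, RhoO] = cart2pol(Xo - radius - 1, Yo - radius - 1);   % center-relative polar coordinates
cols = 1 + mod(pi/2 - ThetaO, 2*pi) / dTheta;      % angle  -> ribbon column (matches Thetas above)
rows = 1 + RhoO / dRho;                            % radius -> ribbon row    (matches Rhos above)
torus = zeros(outSz, outSz, 3, 'single');
for c = 1:3
    torus(:,:,c) = interp2(ribbon(:,:,c), cols, rows, "cubic", 0);   % points beyond the radius become 0
end
imshow(torus)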
I'm more familiar with OpenCV than with MATLAB. Perhaps MATLAB has something like OpenCV's warpPolar(), or a generic remap(). In any case, the operation is trivial to do entirely "by hand" but there are enough supporting functions to take the heavy lifting off your hands (interp2, pol2cart, meshgrid).
1.- The white arcs show that the polar-to-Cartesian translation used introduces significant errors.
2.- Reversing the following script solves your question.
It's a script that goes from Cartesian to polar without introducing errors or ignoring input data, which is what happens when such wide white arcs show up in an apparently correct translation.
clear all;clc;close all
clc,cla;
format long;
A=imread('shaffen dass.jpg');
[sz1 sz2 sz3]=size(A);
szx=sz2;szy=sz1;
A1=A(:,:,1);A2=A(:,:,2);A3=A(:,:,3); % working with binary maps or grey scale images this wouldn't be necessary
figure(1);imshow(A);
hold all;
Cx=floor(szx/2);Cy=floor(szy/2);
plot(Cx,Cy,'co'); % plot the image centre; note it is not where the circles are centred
Rmin=80;Rmax=400; % radius search range for imfindcircles
[centers, radii]=imfindcircles(A,[Rmin Rmax],... % outer circle
'ObjectPolarity','dark','Sensitivity',0.9);
h=viscircles(centers,radii);
hold all; % inner circle
[centers2, radii2]=imfindcircles(A,[Rmin Rmax],...
'ObjectPolarity','bright');
h=viscircles(centers2,radii2);
% L=floor(.5*(radii+radii2)); % this is NOT the length X that should have the resulting XY morphed graph
L=floor(2*pi*radii); % expected length of the morphed graph
cx=floor(.5*(centers(1)+centers2(1))); % coordinates of averaged circle centres
cy=floor(.5*(centers(2)+centers2(2)));
plot(cx,cy,'r*'); % check avg centre circle is not aligned to figure centre
plot([cx 1],[cy 1],'r-.');
t=[45:360/L:404+1-360/L]; % with a step of 1 we would only get 360 points, but we need L points
% with an angle step of 1/L you would wait over a minute for the for loop to finish
R=radii+5;x=R*sind(t)+cx;y=R*cosd(t)+cy; % build outer perimeter
hL1=plot(x,y,'m'); % axis equal;grid on;
% hold all;
% plot(hL1.XData,hL1.YData,'ro');
x_ref=hL1.XData;y_ref=hL1.YData;
% Sx=zeros(ceil(R),1);Sy=zeros(ceil(R),1);
Sx={};Sy={};
for k=1:1:numel(hL1.XData)
Lx=floor(linspace(x_ref(k),cx,ceil(R)));
Ly=floor(linspace(y_ref(k),cy,ceil(R)));
% plot(Lx,Ly,'go'); % check
% plot([cx x(k)],[cy y(k)],'r');
% L1=unique([Lx;Ly]','rows');
Sx=[Sx Lx'];Sy=[Sy Ly'];
end
sx=cell2mat(Sx);sy=cell2mat(Sy);
[s1 s2]=size(sx);
B1=uint8(zeros(s1,s2));
B2=uint8(zeros(s1,s2));
B3=uint8(zeros(s1,s2));
for n=1:1:s2
for k=1:1:s1
B1(k,n)=A1(sx(k,n),sy(k,n));
B2(k,n)=A2(sx(k,n),sy(k,n));
B3(k,n)=A3(sx(k,n),sy(k,n));
end
end
C=uint8(zeros(s1,s2,3));
C(:,:,1)=B1;
C(:,:,2)=B2;
C(:,:,3)=B3;
figure(2);imshow(C);
The resulting image:
3.- Let me know if you'd like some assistance writing the pol-cart version from this script.
Regards
John BG
I have a set of points that I want to connect sequentially. Suppose the points are (A1,A2,A3,...A9); I want to connect A1 to A2, A2 to A3 and so on and finally connect A9 to A1.
All I need is a function that would help me connect A1 to A2; I could do the rest using for loops.
I know connecting two points is a question that has been asked here several times before but I couldn't find the answer I required. Several of the solutions suggest using "plot" and "line" but these functions overlay the results on the image and don't actually make any changes to the original image.
I did try them out and managed to save the resulting figure using the "saveas" and "print" functions, but the image doesn't get saved in the proper format, and there are a lot of problems with the parameters of these functions. Besides, I don't really want to save the image; it's just an unnecessary overhead I was willing to accept if it got me the desired image with the lines.
I've also tried "imline" to draw lines but it seems to be interactive.
This particular question reflects my problem perfectly, but when I ran the code snippets given as solutions, all of them produced only a set of dots in the resulting image.
I tried the above-mentioned code from the link with this image that I found here.
A dotted line was the output for all three code snippets in the link above.
For example, I ran the first code like this:
I = imread('berries_copy.png');
grayImage=rgb2gray(I);
img =false(size(grayImage,1), size(grayImage,2));
I wrote the above piece of code just to get a black image for the following operations:
x = [500 470]; % x coordinates
y = [300 310]; % y coordinates
nPoints = max(abs(diff(x)), abs(diff(y)))+1; % Number of points in line
rIndex = round(linspace(y(1), y(2), nPoints)); % Row indices
cIndex = round(linspace(x(1), x(2), nPoints)); % Column indices
index = sub2ind(size(img), rIndex, cIndex); % Linear indices
img(index) = 255; % Set the line points to white
imshow(img); % Display the image
This is the resulting image for the above code as well as the other two, which, as you can see, is just a few points on a black background and isn't the desired output.
I changed the code and used the "plot" function instead to get this output, which is what I want. Is there any way I can change the dotted output into a solid line?
Or, if anyone could suggest a simple function or method that would draw a line from A1 to A2 and actually change the input image, I'd be grateful. (I really hope this is just me being a novice rather than MATLAB not having a simple function to draw a line in an image.)
P.S. I don't have the Computer Vision Toolbox and, if possible, I'd like a solution that doesn't involve it.
Your first problem is that you are creating a blank image the same size as the image you load with this line:
img =false(size(grayImage,1), size(grayImage,2));
When you add the line to it, you get a black image with a white line on it, as expected.
Your second problem is that you are trying to apply a solution for grayscale intensity images to an RGB (Truecolor) image, which requires you to modify the data at the given indices for all three color planes (red, green, and blue). Here's how you can modify the grayscale solution from my other answer:
img = imread('berries_copy.png'); % Load image
[R, C, D] = size(img); % Get dimension sizes, D should be 3
x = [500 470]; % x coordinates
y = [300 310]; % y coordinates
nPoints = max(abs(diff(x)), abs(diff(y)))+1; % Number of points in line
rIndex = round(linspace(y(1), y(2), nPoints)); % Row indices
cIndex = round(linspace(x(1), x(2), nPoints)); % Column indices
index = sub2ind([R C], rIndex, cIndex); % Linear indices
img(index) = 255; % Modify red plane
img(index+R*C) = 255; % Modify green plane
img(index+2*R*C) = 255; % Modify blue plane
imshow(img); % Display image
imwrite(img, 'berries_line.png'); % Save image, if desired
And the resulting image (note the white line above the berry in the bottom right corner):
You can use the Bresenham algorithm. It has of course already been implemented, and you can find it here: Bresenham optimized for MATLAB. This algorithm selects the pixels that approximate a line.
A simple example, using your variable names, could be:
I = rgb2gray(imread('peppers.png')); % load a sample image as grayscale
A1 = [1 1]; % line start point
A2 = [40 40]; % line end point
[x, y] = bresenham(A1(1),A1(2),A2(1),A2(2)); % pixel coordinates along the line
ind = sub2ind(size(I),x,y); % convert to linear indices into the image
I(ind) = 255; % set those pixels to white
imshow(I)
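To connect the points sequentially and close the polygon as the question asks, you could then loop over consecutive pairs. A minimal sketch (the pts list here is just made-up example coordinates):
pts = [10 10; 40 200; 150 120];              % hypothetical vertex list, one point per row
for k = 1:size(pts,1)
    p1 = pts(k,:);
    p2 = pts(mod(k, size(pts,1)) + 1, :);    % wrap around so the last vertex connects to the first
    [x, y] = bresenham(p1(1), p1(2), p2(1), p2(2));
    I(sub2ind(size(I), x, y)) = 255;         % draw this edge into the image
end
imshow(I)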
You can use imshow to display an image and then use plot to draw lines and save the figure. Check the code below:
I = imread('peppers.png') ;
imshow(I)
hold on
[m,n,p] = size(I) ;
%% Get random points A1, A2, ..., A9
N = 9 ;
x = (n-1)*rand(1,N)+1 ;
y = (m-1)*rand(1,N)+1 ;
P = [x; y]; % coordinates / points
c = mean(P,2); % mean/ central point
d = P-c ; % vectors connecting the central point and the given points
th = atan2(d(2,:),d(1,:)); % angle above x axis
[th, idx] = sort(th); % sorting the angles
P = P(:,idx); % sorting the given points
P = [P P(:,1)]; % add the first at the end to close the polygon
plot( P(1,:), P(2,:), 'k');
saveas(gcf,'image.png')
Using Matlab, I'm trying to calculate and plot points which are contained in z-slices of this shape:
It is a pyramid at a 45deg angle with the top removed.
When I take a slice at z=198 (a bit below the datatip above), I get a weird artifact:
Here is my code snippet doing the checking:
% Setup alphaShape as "alpha" ....
points = 1e3 * [0.3223 0.1761 0.3299
0.5752 1.2034 -0.0618
-0.0172 1.2034 -0.0618
0.2357 0.1761 0.3299
0.3223 0.0671 0.2209
0.5752 0.4588 -0.8064
-0.0172 0.4588 -0.8064
0.2357 0.0671 0.2209];
alpha = alphaShape(points);
% Then, check if coordinates on slice are in or outside alpha
z = 198;
width = 550;
depth = 450;
inout = zeros(width,depth);
for qx = 1:width
for qy = 1:depth
in = inShape(alpha,qx,qy,z);
if (in)
inout(qx,qy) = 1;
end
end
end
% Plot
figure; contourf(inout.'); % Not sure why the transpose is needed?
Can anyone point out what is causing the extra "peg" sticking out of my slice result? Or how to fix my code to remove it?
EDIT: Using plot(alpha), the square top is not connected to the base the way I thought:
My initial plot was done with:
K = convhull(points);
figure;trisurf(K,points(:,1),points(:,2),points(:,3),'Facecolor','red','FaceAlpha',0.1);
The issue was how I created my alphaShape.
Using the built-in convhull() function and inhull() from the MATLAB File Exchange gave the results I desired.
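For completeness, a rough sketch of what that fix might look like, assuming inhull.m from the File Exchange is on the path and reusing points, width, depth, and z from the snippet above (check the exact inhull argument list against its documentation):
[qx, qy] = meshgrid(1:width, 1:depth);            % all (x,y) positions in the slice
queryPts = [qx(:), qy(:), z*ones(numel(qx),1)];   % lift them onto the z-slice
tess = convhull(points);                          % triangular facets of the convex hull (built-in)
inout = inhull(queryPts, points, tess);           % test each slice point against the hull (File Exchange)
inout = reshape(inout, size(qx));
figure; contourf(double(inout));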
How to count circle objects in a bright image using MATLAB?
The input image is:
The imfindcircles function can't find any circles in this image.
Based on well-known image processing techniques, you can write your own processing tool:
img = imread('Mlj6r.jpg'); % read the image
imgGray = rgb2gray(img); % convert to grayscale
sigma = 1;
imgGray = imgaussfilt(imgGray, sigma); % filter the image (we will take derivatives, which are sensitive to noise)
imshow(imgGray) % show the image
[gx, gy] = gradient(double(imgGray)); % take the first derivative
[gxx, gxy] = gradient(gx); % take the second derivatives
[gxy, gyy] = gradient(gy); % take the second derivatives
k = 0.04; %0.04-0.15 (see wikipedia)
blob = (gxx.*gyy - gxy.*gxy - k*(gxx + gyy).^2); % Harris corner detector (high second derivatives in two perpendicular directions)
blob = blob .* (gxx < 0 & gyy < 0); % keep only locations where both second derivatives are negative (local intensity maxima)
figure
imshow(blob) % show the blobs
blobThreshold = 1;
circles = imregionalmax(blob) & blob > blobThreshold; % find local maxima and apply a threshold
figure
imshow(imgGray) % show the original image
hold on
[X, Y] = find(circles); % find the position of the circles
plot(Y, X, 'w.'); % plot the circle positions on top of the original figure
nCircles = length(X)
This code counts 2710 circles, which is probably a slight (but not too bad) overestimate.
The following figure shows the original image with the circle positions indicated as white dots. Some false detections occur at the border of the object. You can try adjusting the constants sigma, k, and blobThreshold to obtain better results. In particular, a higher k may be beneficial. See Wikipedia for more information about the Harris corner detector.
I have two 3D images, and I need to register them using "lsqcurvefit". I know that I can use "imregister", but I want to implement my own registration with "lsqcurvefit" in MATLAB. My images follow a Gaussian distribution. It is not well documented how I should set this up; can anyone help me in detail?
Image registration is a repeated process of mapping the source image to the target image using, e.g., an affine transform. I want to use intensity-based registration, and I use all the voxels of my image; therefore, I need to fit these two images as closely as possible.
Thanks
Here's an example of how to do point-wise image registration using lsqcurvefit. Basically, you make a function that takes an affine matrix and a set of points (parameters first, then data, which is the order lsqcurvefit passes them in; we're just going to use the translation and rotation parts, but you can use skew and scaling if desired) and returns a new set of points. There's probably a built-in function for this already, but it's only two lines, so it's easy to write. That function is:
function TformPts = TransformPoints(TransformMatrix, StartCoordinates)
TformPts = TransformMatrix*StartCoordinates;
Here's a script that generates some points, rotates and translates them by a random angle and vector, then uses the TransformPoints function as the model for lsqcurvefit to fit the transformation matrix needed for the registration. After that, it's just a matrix multiplication to generate the registered set of points. If we did this all correctly, the red circles (original data) will line up with the black stars (shifted-then-registered points) very well when the code below is run.
% 20 random points in x and y between 0 and 100
% row of ones pads out third dimension
pointsList = [100*rand(2, 20); ones(1, 20)];
rotateTheta = pi*rand(1); % add rotation, in radians
translateVector = 10*rand(1,2); % add translation, up to 10 units here
% 2D transformation matrix
% last row pads out third dimension
inputTransMatrix = [cos(rotateTheta), -sin(rotateTheta), translateVector(1);
sin(rotateTheta), cos(rotateTheta), translateVector(2);
0 0 1];
% Transform starting points by this matrix to make an array of shifted
% points.
% For point-wise registration, pointsList represents points from one image,
% shiftedPoints points from the other image
shiftedPoints = inputTransMatrix*pointsList;
% Add some random noise
% Remove this line if you want the registration to be exact
shiftedPoints(1:2,:) = shiftedPoints(1:2,:) + rand(2, size(shiftedPoints, 2)); % add noise to x and y only, keeping the homogeneous row equal to 1
% Plot starting sets of points
figure(1)
plot(pointsList(1,:), pointsList(2,:), 'ro');
hold on
plot(shiftedPoints(1,:), shiftedPoints(2,:), 'bx');
hold off
% Fitting routine
% Make some initial, random guesses
initialFitTheta = pi*rand(1);
initialFitTranslate = [2, 2];
guessTransMatrix = [cos(initialFitTheta), -sin(initialFitTheta), initialFitTranslate(1);
sin(initialFitTheta), cos(initialFitTheta), initialFitTranslate(2);
0 0 1];
% fit = lsqcurvefit(@fcn, initialGuess, referencePoints, shiftedPoints)
fitTransMatrix = lsqcurvefit(@TransformPoints, guessTransMatrix, pointsList, shiftedPoints);
% Un-shift second set of points by fit values
fitShiftPoints = fitTransMatrix\shiftedPoints;
% Plot it up
figure(1)
hold on
plot(fitShiftPoints(1,:), fitShiftPoints(2,:), 'k*');
hold off
% Display start transformation and result fit
disp(inputTransMatrix)
disp(fitTransMatrix)
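Since the question also asks about intensity-based registration over all voxels, here is a rough, untested sketch of how the same lsqcurvefit pattern might be applied to image intensities directly. This is separate from the point-wise example above; the 2D rotation-plus-translation parameterization, the sample images, and the variable names are my own assumptions:
% p = [theta, tx, ty]; the model warps the moving image and lsqcurvefit matches its intensities to the fixed image
fixed  = im2double(imread('cameraman.tif'));             % example reference image
moving = imrotate(fixed, 5, 'bilinear', 'crop');         % synthetically misaligned copy
warpFun = @(p, img) imwarp(img, ...
    affine2d([cos(p(1)) sin(p(1)) 0; -sin(p(1)) cos(p(1)) 0; p(2) p(3) 1]), ...
    'OutputView', imref2d(size(img)));                   % rigid 2D warp driven by the parameter vector
p0 = [0, 0, 0];                                          % initial guess: identity transform
pFit = lsqcurvefit(warpFun, p0, moving, fixed);          % fit warped moving intensities to the fixed image
registered = warpFun(pFit, moving);
figure; imshowpair(fixed, registered);                   % visual check of the registration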