Translate Braille into text with MATLAB (after filtering)

As shown in the picture above, I already have the X,Y locations of the Braille dots. However, I don't know how to divide the image into 3x2 (three rows by two columns) cells that match the average spacing of the dots, so that each cell corresponds to one Braille character (a possible approach is sketched after the code below). Please let me know.
If there are coding approaches other than MATLAB that would work, please recommend them as well.
```
centroids =
1.0e+03 *
0.1515 0.1635
0.2419 0.0806
0.3619 0.2506
0.3624 0.0804
0.3640 0.1652
0.4492 0.2505
0.4503 0.1661
0.4518 0.0822
0.5705 0.1669
0.5748 0.0806
0.6598 0.2502
0.9878 0.0809
1.0748 0.0796
1.0768 0.1646
1.1979 0.1630
1.2830 0.2476
1.2855 0.0772
1.4911 0.1603
1.6120 0.0774
1.6141 0.1608
1.7034 0.2460
1.8217 0.0756
1.8272 0.2453
1.9147 0.1622
2.0306 0.1578
2.1241 0.2462
```
These are my centroid coordinates. However, they apply only to this particular picture; the coordinate values change whenever a different picture is used.
```matlab
I = imread('sample1.png'); % read the image
H = imgaussfilt(I, 2); % Gaussian blur
T = rgb2gray(H); % convert to grayscale
Q = imadjust(T,[0.5 0.8],[]); % adjust contrast
BW = imbinarize(Q); % binarize
bw2 = imcomplement(BW); % invert black/white
BW2 = bwareaopen(bw2, 800); % remove objects smaller than 800 pixels
s = regionprops(BW2,'centroid'); % get the centroid of each object
centroids = cat(1,s.Centroid);
imshow(BW2)
hold on
plot(centroids(:,1),centroids(:,2),'ro')
hold off
```
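Not a definitive solution, just a minimal sketch of one way to group the detected centroids into 3x2 Braille cells. The dot pitch is estimated from the three dot rows themselves rather than hard-coded, so the same code should still work when another picture changes the coordinate values. It assumes a single horizontal line of Braille, the `centroids` variable from the code above, and kmeans from the Statistics and Machine Learning Toolbox; note that a cell whose dots all sit in its right-hand column would still need the absolute cell grid to be placed correctly.

```matlab
xy = centroids;

% Rows: a Braille line always has 3 dot rows, so cluster y into 3 groups.
rowIdx = kmeans(xy(:,2), 3, 'Replicates', 5);
rowY   = accumarray(rowIdx, xy(:,2), [3 1], @mean);   % mean y of each cluster
[rowY, rowOrder] = sort(rowY);                        % top, middle, bottom
[~, rowOfDot]    = ismember(rowIdx, rowOrder);        % 1 = top ... 3 = bottom

% In standard Braille the horizontal dot pitch equals the vertical one,
% so the spacing of the three rows gives the expected spacing between dots.
pitch = mean(diff(rowY));

% Columns: sort by x and start a new dot column wherever the gap is large.
[xs, order]     = sort(xy(:,1));
colOfDot        = zeros(size(xy,1), 1);
colOfDot(order) = cumsum([1; diff(xs) > pitch/2]);    % dot columns, left to right

% Pair dot columns into 2-column cells: a gap of about one pitch means the two
% columns belong to the same cell; a clearly larger gap starts a new cell.
colX      = accumarray(colOfDot, xy(:,1), [], @mean);
cellOfCol = cumsum([1; diff(colX) > 1.3*pitch]);

% Build a 3x2 logical dot pattern for every cell (word spaces are ignored here).
nCells = max(cellOfCol);
cells  = false(3, 2, nCells);
for k = 1:size(xy,1)
    c       = cellOfCol(colOfDot(k));
    leftCol = find(cellOfCol == c, 1);                % first dot column of the cell
    cells(rowOfDot(k), colOfDot(k) - leftCol + 1, c) = true;
end
% Each cells(:,:,i) can now be looked up in a Braille table to decode a character.
```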

Related

How to find the corners of a rotated object in MATLAB?

I want to find the corners of objects.
I tried the following code:
```matlab
Vstats = regionprops(BW2,'Centroid','MajorAxisLength','MinorAxisLength',...
'Orientation');
u = [Vstats.Centroid];
VcX = u(1:2:end);
VcY = u(2:2:end);
[VcY id] = sort(VcY); % sorting regions by vertical position
VcX = VcX(id);
Vstats = Vstats(id); % permute according sort
Bv = Bv(id);
Vori = [Vstats.Orientation];
VRmaj = [Vstats.MajorAxisLength]/2;
VRmin = [Vstats.MinorAxisLength]/2;
% find corners of vertebrae
figure,imshow(BW2)
hold on
% C = corner(VER);
% plot(C(:,1), C(:,2), 'or');
C = cell(size(Bv));
Anterior = zeros(2*length(C),2);
Posterior = zeros(2*length(C),2);
for i = 1:length(C) % for each region
cx = VcX(i); % centroid coordinates
cy = VcY(i);
bx = Bv{i}(:,2); % edge points coordinates
by = Bv{i}(:,1);
ux = bx-cx; % move to the origin
uy = by-cy;
[t, r] = cart2pol(ux,uy); % translate in polar coodinates
t = t - deg2rad(Vori(i)); % unrotate
for k = 1:4 % find corners (look each quadrant)
fi = t( (t>=(k-3)*pi/2) & (t<=(k-2)*pi/2) );
ri = r( (t>=(k-3)*pi/2) & (t<=(k-2)*pi/2) );
[rp, ip] = max(ri); % find farthest point
tc(k) = fi(ip); % save coordinates
rc(k) = rp;
end
[xc,yc] = pol2cart(tc+1*deg2rad(Vori(i)) ,rc); % de-rotate, translate in cartesian
C{i}(:,1) = xc + cx; % return to previous place
C{i}(:,2) = yc + cy;
plot(C{i}([1,4],1),C{i}([1,4],2),'or',C{i}([2,3],1),C{i}([2,3],2),'og')
% save coordinates :
Anterior([2*i-1,2*i],:) = [C{i}([1,4],1), C{i}([1,4],2)];
Posterior([2*i-1,2*i],:) = [C{i}([2,3],1), C{i}([2,3],2)];
end
```
My input image is:
I got the following output image.
The bottommost object in the image is not detected properly. How can I correct the code? It fails to work for a rotated image.
You can get all the points from the image, use k-means clustering to partition the points into 8 groups, and once the partitioning is done you have the points in hand and can pick whichever points you want.
```matlab
rgbImage = imread('your image') ;
%% crop out the unwanted white background from the image
grayImage = min(rgbImage, [], 3);
binaryImage = grayImage < 200;
binaryImage = bwareafilt(binaryImage, 1);
[rows, columns] = find(binaryImage);
row1 = min(rows);
row2 = max(rows);
col1 = min(columns);
col2 = max(columns);
% Crop
croppedImage = rgbImage(row1:row2, col1:col2, :);
I = rgb2gray(croppedImage) ;
%% Get the white regions
[y,x,val] = find(I) ;
%% use kmeans clustering
[idx,C] = kmeans([x,y],8) ;
%%
figure
imshow(I) ;
hold on
for i = 1:8
xi = x(idx==i) ; yi = y(idx==i) ;
id1=convhull(xi,yi) ;
coor = [xi(id1) yi(id1)] ;
[id,c] = kmeans(coor,4) ;
plot(coor(:,1),coor(:,2),'r','linewidth',3) ;
plot(c(:,1),c(:,2),'*b')
end
```
Now we are able to capture the regions; the boundary/convex-hull points are in hand, and you can do whatever math you want with them.
Did you solve the problem? I looked into it and it seems that the rotation given by 'regionprops' is off. To fix that I've prepared a quick solution: I dilated the image to close the gaps, found the 4 most distant peaks of each vertebra, and then checked whether each peak is on the left or the right of the centerline (which I obtained by extrapolating from the sorted centroids). This method seems to work for this particular problem.
```matlab
BW2 = rgb2gray(Image);
BW2 = imbinarize(BW2);
%dilate and erode will help to remove extra features of the vertebra
se = strel('disk',4,4);
BW2_dilate = imdilate(BW2,se);
BW2_erode = imerode(BW2_dilate,se);
sb = bwboundaries(BW2_erode);
figure
imshow(BW2)
hold on
centerLine = [];
corners = [];
for bone = 1:length(sb)
x0 = sb{bone}(:,2) - mean(sb{bone}(:,2));
y0 = sb{bone}(:,1) - mean(sb{bone}(:,1));
%save the position of the centroid
centerLine = [centerLine; [mean(sb{bone}(:,1)) mean(sb{bone}(:,2))]];
[th0,rho0] = cart2pol(x0,y0);
%make sure that the indexing starts at the dip, not at the corner
lowest_val = find(rho0==min(rho0));
rho1 = [rho0(lowest_val:end); rho0(1:lowest_val-1)];
th00 = [th0(lowest_val:end); th0(1:lowest_val-1)];
y1 = [y0(lowest_val:end); y0(1:lowest_val-1)];
x1 = [x0(lowest_val:end); x0(1:lowest_val-1)];
%detect corners, using smooth data to remove noise
[pks,locs] = findpeaks(smooth(rho1));
[pksS,idS] = sort(pks,'descend');
%4 most pronounced peaks are where the corners are
edgesFndCx = x1(locs(idS(1:4)));
edgesFndCy = y1(locs(idS(1:4)));
edgesFndCx = edgesFndCx + mean(sb{bone}(:,2));
edgesFndCy = edgesFndCy + mean(sb{bone}(:,1));
corners{bone} = [edgesFndCy edgesFndCx];
end
[~,idCL] = sort(centerLine(:,1),'descend');
centerLine = centerLine(idCL,:);
%extrapolate the spine centerline
yDatExt= 1:size(BW2_erode,1);
extrpLine = interp1(centerLine(:,1),centerLine(:,2),yDatExt,'spline','extrap');
plot(centerLine(:,2),centerLine(:,1),'r')
plot(extrpLine,yDatExt,'r')
%find edges to the left, and to the right of the centerline
for bone = 1:length(corners)
x0 = corners{bone}(:,2);
y0 = corners{bone}(:,1);
for crn = 1:4
xCompare = extrpLine(y0(crn));
if x0(crn) < xCompare
plot(x0(crn),y0(crn),'go','LineWidth',2)
else
plot(x0(crn),y0(crn),'ro','LineWidth',2)
end
end
end
```

Issue with the step in the code. Assistance required

Good evening,
I have code that works just fine, as follows:
```matlab
r=1.6;
M=0.000207;
D=-0.0256;
kappa=0.5;
gamma=20;%\W\km
Pp=0.2;
L=462.5;
SQ=sqrt(-(0.5*D)^2 +M);
z=0:0.1:L;
A=kappa*SQ.*(z-L);
B=((sqrt(M)/r)+(D/2))/(SQ);
C=EA(B);
E=SQ*cot(A+C);
Pminus=E-(D/2);
Pplus=M./Pminus;
P0=Pplus+Pminus+D;
CST1=Pplus.*Pminus;
CST2=P0-Pplus-Pminus;
dP0dz=kappa.*P0.*(Pplus-Pminus);
figure
set(gca,'fontsize',18)
plot(z,Pplus*1000, 'b',z,Pminus*1000,'r',z,P0*1000,'k','Linewidth',4);
xlabel('z(m)')
ylabel('Power (mW)')
legend('P_+','P_-','P_0','D','M','r','Orientation','horizontal')
%str = sprintf('D = %.4f , M = %.9f, and r =%.3f ',D,M,r);
%title(str);
xlim([0 L])
N=gamma.*Pp;%gamma Pp with units \km
y1=0.5*kappa.*1000*(Pplus+Pminus);
y2 =y1./N;
figure
hax=axes;
[ax,p1,p2] = plotyy(z,y1,z,y2,'plot','plot');
set(ax,{'ycolor'},{'k';'k'})
set(ax,{'fontsize'},{18;18})
set(p1,'linewidth',4)% to change the first line
set(p2,'linewidth',4) % to change the second line
set(p1,'Color','b')% to change the first line
set(p2,'Color','b') % to change the second line
%str = sprintf('D = %.6f ',D);
%title(str);
ylabel(ax(1),'K_{SBS}(km^{-1})','fontsize',18,...
'Color','k') % label left y-axis
ylabel(ax(2),'K_{SBS}( \gamma P )','fontsize',18,...
'Color','k') % label right y-axis
xlabel(ax(2),'z(m)','fontsize',18,...
'Color','k')% label x-axis
set(ax(1),'XLim',[0 L])
set(ax(2),'XLim',[0 L])
set(ax(1),'YLim',[min(y1) max(y1)])
set(ax(2),'YLim',[min(y2) max(y2)])
zz1=(max(y1)-min(y1))./4;
zz2=(max(y2)-min(y2))./4;
set(ax(1),'YTick',min(y1):zz1: max(y1) )
set(ax(2),'YTick',min(y2):zz2: max(y2) )
legend('K_{SBS}(km^{-1})')
% figure
% set(gca,'fontsize',18)
% plot(z,1000*dP0dz,'Linewidth',4);
% xlabel('z(m)')
% ylabel('dP_0/dz')
% xlim([0 L])
% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
N=gamma.*Pp;%\km
lamdazero=1550;
lamdapump=linspace(1535,1560,100000);
lamdasignal=1545;
beta3=0.06;
beta4=-1*10^-4;
c=2*pi*299792458/1000;
A0=(1./lamdapump) -(1./lamdazero);
B0=(1./lamdapump) -(1./lamdasignal);
Third0=beta3.*(c.^3).*A0.*(B0.^2);
Fourth0=beta4.*(1./2).*c.^4.*(A0.^2).*(B0.^2);
Fourorder=c.^4.*beta4.*(1/12).*(B0).^4;
deltabeta=Third0+Fourth0+Fourorder;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%NO SBS
total0=deltabeta+(2.*N);
g0=((sqrt((N.^2)-(total0./2).^2)));
Gain0=((((N./(g0)).^2).*(sinh(g0.*L/1000)).^2));
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
g0=1.1*10^-11;
Aeff=1.1*10^-11;
PSBS=0.016;
SBS=(g0.*PSBS.*1000)./Aeff;
totalsbs=deltabeta+(2.*N)-SBS/2;
gsbs=((sqrt((N.^2)-(totalsbs./2).^2)));
Gainsbs=((((N./(gsbs)).^2).*(sinh(gsbs.*L/1000)).^2));
n=35;
first=round(1*(L/n));
step=round(L/n);
last=L;
for jj=1:length(deltabeta)
M = eye(2);
for ii = first:step:last;
tsv=deltabeta(jj)+(2*N)-(y1(z==(ii-round(0.5*(L/n)))));
gsv=((sqrt((N.^2)-(tsv./2).^2)));
F1=1i*tsv;
F2=2*gsv;
F3=F1./F2;
F4=1i*(N./gsv);
M1=cosh(gsv.*(step/1000))+(F3.*sinh(gsv.*(step/1000)));
M2=F4.*sinh(gsv.*(step/1000));
M3=-F4.*sinh(gsv.*(step/1000));
M4=cosh(gsv.*(step/1000))-(F3.*sinh(gsv.*(step/1000)));
M=M*[M1 M2;M3 M4];
TeRm=P0(z==ii)/P0(z==(ii-step));
M=[sqrt(TeRm) 0;0 1]*M;
end
Gv(1,jj)=M(1,1).*conj(M(1,1));
end
FigHandle = figure;
figureh=plot(lamdapump,10*log10(1+Gain0),'b',lamdapump,10*log10(1+Gainsbs),'g',lamdapump,10*log10(Gv),'r','Linewidth',5);
hold on
xlabel('\lambda_P (nm)','Fontsize',18)
ylabel('Signal Gain (dB)','Fontsize',18)
legend('Without SBS',' With SBS ideal ',' With SBS actual')
% title('Gain curves with SBS','fontsize',18)
set(gca,'XTick',1536:3:1560)
set(gca,'fontsize',18)
ylim([0 12])
hold off
```
My issue is that when I change the parameter L to a value smaller than one, along with some changes to the other parameters:
```matlab
r=1.8;
M=0.80322;
D=-1.69;
kappa=700/4.6;
gamma=5285;%\W\km
Pp=7.6;
L=0.045333;
SQ=sqrt(-(0.5*D)^2 +M);
z=0:0.00001:L;
A=kappa*SQ.*(z-L);
B=((sqrt(M)/r)+(D/2))/(SQ);
C=EA(B);
E=SQ*cot(A+C);
Pminus=E-(D/2);
Pplus=M./Pminus;
P0=Pplus+Pminus+D;
CST1=Pplus.*Pminus;
CST2=P0-Pplus-Pminus;
dP0dz=kappa.*P0.*(Pplus-Pminus);
figure
set(gca,'fontsize',18)
plot(z,Pplus*1000, 'b',z,Pminus*1000,'r',z,P0*1000,'k','Linewidth',4);
xlabel('z(m)')
ylabel('Power (mW)')
legend('P_+','P_-','P_0','D','M','r','Orientation','horizontal')
%str = sprintf('D = %.4f , M = %.9f, and r =%.3f ',D,M,r);
%title(str);
xlim([0 L])
N=gamma.*Pp;%gamma Pp with units \km
y1=0.5*kappa.*1000*(Pplus+Pminus);
y2 =y1./N;
figure
hax=axes;
[ax,p1,p2] = plotyy(z,y1,z,y2,'plot','plot');
set(ax,{'ycolor'},{'k';'k'})
set(ax,{'fontsize'},{18;18})
set(p1,'linewidth',4)% to change the first line
set(p2,'linewidth',4) % to change the second line
set(p1,'Color','b')% to change the first line
set(p2,'Color','b') % to change the second line
%str = sprintf('D = %.6f ',D);
%title(str);
ylabel(ax(1),'K_{SBS}(km^{-1})','fontsize',18,...
'Color','k') % label left y-axis
ylabel(ax(2),'K_{SBS}( \gamma P )','fontsize',18,...
'Color','k') % label right y-axis
xlabel(ax(2),'z(m)','fontsize',18,...
'Color','k')% label x-axis
set(ax(1),'XLim',[0 L])
set(ax(2),'XLim',[0 L])
set(ax(1),'YLim',[min(y1) max(y1)])
set(ax(2),'YLim',[min(y2) max(y2)])
zz1=(max(y1)-min(y1))./4;
zz2=(max(y2)-min(y2))./4;
set(ax(1),'YTick',min(y1):zz1: max(y1) )
set(ax(2),'YTick',min(y2):zz2: max(y2) )
legend('K_{SBS}(km^{-1})')
% figure
% set(gca,'fontsize',18)
% plot(z,1000*dP0dz,'Linewidth',4);
% xlabel('z(m)')
% ylabel('dP_0/dz')
% xlim([0 L])
% %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
N=gamma.*Pp;%\km
lamdazero=1556;
lamdapump=linspace(1300,1800,100000);
lamdasignal=1536;
beta3=1.26;
beta4=0;
c=2*pi*299792458/1000;
A0=(1./lamdapump) -(1./lamdazero);
B0=(1./lamdapump) -(1./lamdasignal);
Third0=beta3.*(c.^3).*A0.*(B0.^2);
Fourth0=beta4.*(1./2).*c.^4.*(A0.^2).*(B0.^2);
Fourorder=c.^4.*beta4.*(1/12).*(B0).^4;
deltabeta=Third0+Fourth0+Fourorder;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%NO SBS
total0=deltabeta+(2.*N);
g0=((sqrt((N.^2)-(total0./2).^2)));
Gain0=((((N./(g0)).^2).*(sinh(g0.*L/1000)).^2));
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
g0=0.7*10^-9;
Aeff=2.3*10^-12;
PSBS=Pp*4/57.5;
SBS=(g0.*PSBS.*1000)./Aeff;
totalsbs=deltabeta+(2.*N)-SBS/2;
gsbs=((sqrt((N.^2)-(totalsbs./2).^2)));
Gainsbs=((((N./(gsbs)).^2).*(sinh(gsbs.*L/1000)).^2));
n=35;
first=(1*(L/n));
step=(L/n);
last=L;
for jj=1:length(deltabeta)
M = eye(2);
for ii = first:step:last;
tsv=deltabeta(jj)+(2*N)-(y1(z==(ii-(0.5*(L/n)))));
gsv=((sqrt((N.^2)-(tsv./2).^2)));
F1=1i*tsv;
F2=2*gsv;
F3=F1./F2;
F4=1i*(N./gsv);
M1=cosh(gsv.*(step/1000))+(F3.*sinh(gsv.*(step/1000)));
M2=F4.*sinh(gsv.*(step/1000));
M3=-F4.*sinh(gsv.*(step/1000));
M4=cosh(gsv.*(step/1000))-(F3.*sinh(gsv.*(step/1000)));
M=M*[M1 M2;M3 M4];
TeRm=P0(z==ii)/P0(z==(ii-step));
M=[sqrt(TeRm) 0;0 1]*M;
end
Gv(1,jj)=M(1,1).*conj(M(1,1));
end
FigHandle = figure;
figureh=plot(lamdapump,10*log10(1+Gain0),'b',lamdapump,10*log10(1+Gainsbs),'g',lamdapump,10*log10(Gv),'r','Linewidth',5);
hold on
xlabel('\lambda_P (nm)','Fontsize',18)
ylabel('Signal Gain (dB)','Fontsize',18)
legend('Without SBS',' With SBS ideal ',' With SBS actual')
title('Gain curves with SBS','fontsize',18)
%set(gca,'XTick',1300:10:1800)
set(gca,'fontsize',18)
%ylim([0 12])
hold off
```
The code does not work. The reason, in my opinion, is that the round function rounds the step to zero. If I remove the round function, it still does not work. Any thoughts on how I can make my code work again?
Is there any way to work around the round function? I need to take n steps, but I cannot do so because L is very small, and I need L to stay small.
My apologies for the long code and explanations. Feel free to ask!
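Not the original code, just a minimal sketch of one possible workaround: step over array indices instead of physical z values, so nothing has to be rounded and the exact z == value comparisons (which silently return empty when the step is not an exact multiple of the z grid) are avoided. It assumes the variables z, y1, P0, deltabeta, N and n from the second version of the code are already in the workspace.

```matlab
% Sketch only: index-based stepping; assumes z, y1, P0, deltabeta, N, n exist.
segLen = floor((numel(z) - 1) / n);        % samples per segment (integer by construction)
Gv = zeros(1, numel(deltabeta));
for jj = 1:numel(deltabeta)
    M = eye(2);
    for seg = 1:n
        iPrev = 1 + (seg - 1)*segLen;      % index of the segment start
        iEnd  = 1 + seg*segLen;            % index of the segment end
        iMid  = round((iPrev + iEnd)/2);   % index of the segment midpoint
        dz    = z(iEnd) - z(iPrev);        % physical segment length (same units as z)
        tsv = deltabeta(jj) + 2*N - y1(iMid);
        gsv = sqrt(N.^2 - (tsv/2).^2);
        F3  = 1i*tsv./(2*gsv);
        F4  = 1i*N./gsv;
        M1  =  cosh(gsv*dz/1000) + F3*sinh(gsv*dz/1000);
        M2  =  F4*sinh(gsv*dz/1000);
        M3  = -F4*sinh(gsv*dz/1000);
        M4  =  cosh(gsv*dz/1000) - F3*sinh(gsv*dz/1000);
        M   = [sqrt(P0(iEnd)/P0(iPrev)) 0; 0 1] * M * [M1 M2; M3 M4];
    end
    Gv(1,jj) = M(1,1)*conj(M(1,1));
end
```

Because each segment is an integer number of samples, the loop stays valid no matter how small L is.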

How to extend the lines of two polyfit results from either side to intersect and get a combined fit line

I am trying to get a combined fit line made from two linear polyfits, one from each side (they should intersect); here is the picture of the fit lines:
I am trying to make the two fit (blue) lines intersect and produce a combined fit line as shown in the picture below:
Note that the crest can occur anywhere, so I cannot assume it is in the center.
Here is the code that creates the first plot:
```matlab
xdatPart1 = R;
zdatPart1 = z;
n = 3000;
ln = length(R);
[sX,In] = sort(R,1);
sZ = z(In);
xdatP1 = sX(1:n,1);
zdatP1 = sZ(1:n,1);
n2 = ln - 3000;
xdatP2 = sX(n2:ln,1);
zdatP2 = sZ(n2:ln,1);
pp1 = polyfit(xdatP1,zdatP1,1);
pp2 = polyfit(xdatP2,zdatP2,1);
ff1 = polyval(pp1,xdatP1);
ff2 = polyval(pp2,xdatP2);
xDat = [xdatPart1];
zDat = [zdatPart1];
axes(handles.axes2);
cla(handles.axes2);
plot(xdatPart1,zdatPart1,'.r');
hold on
plot(xdatP1,ff1,'.b');
plot(xdatP2,ff2,'.b');
xlabel(['R ',units]);
ylabel(['Z ', units]);
grid on
hold off
```
Below's a rough implementation with no curve fitting toolbox. Although the code should be self-explanatory, here's an outline of the algorithm:
1. We generate some data.
2. We estimate the intersection point by smoothing the data and finding the location of the maximum value.
3. We fit a line to each side of the estimated intersection point.
4. We compute the intersection of the fitted lines using the fitted equations.
5. We use mkpp to construct a function handle to an "evaluable" piecewise polynomial.

The output, ppfunc, is a function handle of one variable that you can use just like any regular function.
Now, this solution is not optimal in any sense (such as MMSE, LSQ, etc.) but as you will see in the comparison with the result from MATLAB's toolbox, it's not that bad!
```matlab
function ppfunc = q40160257
%% Define the ground truth:
center_x = 6 + randn(1);
center_y = 78.15 + 0.01 * randn(1);
% Define a couple of points for the left section
leftmost_x = 0;
leftmost_y = 78.015 + 0.01 * randn(1);
% Define a couple of points for the right section
rightmost_x = 14.8;
rightmost_y = 78.02 + 0.01 * randn(1);
% Find the line equations:
m1 = (center_y-leftmost_y)/(center_x-leftmost_x);
n1 = getN(leftmost_x,leftmost_y,m1);
m2 = (rightmost_y-center_y)/(rightmost_x-center_x);
n2 = getN(rightmost_x,rightmost_y,m2);
% Print the ground truth:
fprintf(1,'The line equations are: {y1=%f*x+%f} , {y2=%f*x+%f}\n',m1,n1,m2,n2)
%% Generate some data:
NOISE_MAGNITUDE = 0.002;
N_POINTS_PER_SIDE = 1000;
x1 = linspace(leftmost_x,center_x,N_POINTS_PER_SIDE);
y1 = m1*x1+n1+NOISE_MAGNITUDE*randn(1,numel(x1));
x2 = linspace(center_x,rightmost_x,N_POINTS_PER_SIDE);
y2 = m2*x2+n2+NOISE_MAGNITUDE*randn(1,numel(x2));
X = [x1 x2(2:end)]; Y = [y1 y2(2:end)];
%% See what we have:
figure(); plot(X,Y,'.r'); hold on;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% Estimating the intersection point:
MOVING_AVERAGE_PERIOD = 10; % Play around with this value.
smoothed_data = conv(Y, ones(1,MOVING_AVERAGE_PERIOD)/MOVING_AVERAGE_PERIOD, 'same');
plot(X, smoothed_data, '-b'); ylim([floor(leftmost_y*10) ceil(center_y*10)]/10);
[~,centerInd] = max(smoothed_data);
fprintf(1,'The real intersection is at index %d, the estimated is at %d.\n',...
N_POINTS_PER_SIDE, centerInd);
%% Fitting a polynomial to each side:
p1 = polyfit(X(1:centerInd),Y(1:centerInd),1);
p2 = polyfit(X(centerInd+1:end),Y(centerInd+1:end),1);
[x_int,y_int] = getLineIntersection(p1,p2);
plot(x_int,y_int,'sg');
pp = mkpp([X(1) x_int X(end)],[p1; (p2 + [0 x_int*p2(1)])]);
ppfunc = @(x)ppval(pp,x);
plot(X, ppfunc(X),'-k','LineWidth',3)
legend('Original data', 'Smoothed data', 'Computed intersection',...
'Final piecewise-linear fit');
grid on; grid minor;
%% Comparison with the curve-fitting toolbox:
if license('test','Curve_Fitting_Toolbox')
ft = fittype( '(x<=-(n2-n1)/(m2-m1))*(m1*x+n1)+(x>-(n2-n1)/(m2-m1))*(m2*x+n2)',...
'independent', 'x', 'dependent', 'y' );
opts = fitoptions( 'Method', 'NonlinearLeastSquares' );
% Parameter order: m1, m2, n1, n2:
opts.StartPoint = [0.02 -0.02 78 78];
fitresult = fit( X(:), Y(:), ft, opts);
% Comparison with what we did above:
fprintf(1,[...
'Our solution:\n'...
'\tm1 = %-12f\n\tm2 = %-12f\n\tn1 = %-12f\n\tn2 = %-12f\n'...
'Curve Fitting Toolbox'' solution:\n'...
'\tm1 = %-12f\n\tm2 = %-12f\n\tn1 = %-12f\n\tn2 = %-12f\n'],...
m1,m2,n1,n2,fitresult.m1,fitresult.m2,fitresult.n1,fitresult.n2);
end
%% Helper functions:
function n = getN(x0,y0,m)
% y = m*x+n => n = y0-m*x0;
n = y0-m*x0;
function [x_int,y_int] = getLineIntersection(p1,p2)
% m1*x+n1 = m2*x+n2 => x = -(n2-n1)/(m2-m1)
x_int = -(p2(2)-p1(2))/(p2(1)-p1(1));
y_int = p1(1)*x_int+p1(2);
```
The result (sample run):
```
Our solution:
m1 = 0.022982
m2 = -0.011863
n1 = 78.012992
n2 = 78.208973
Curve Fitting Toolbox' solution:
m1 = 0.022974
m2 = -0.011882
n1 = 78.013022
n2 = 78.209127
```
Zoomed in around the intersection:

How to project Velodyne point clouds onto an image? (KITTI Dataset)

Here is my code to project Velodyne points into the images:
```matlab
cam = 2;
frame = 20;
% compute projection matrix velodyne->image plane
R_cam_to_rect = eye(4);
[P, Tr_velo_to_cam, R] = readCalibration('D:/Shared/training/calib/',frame,cam)
R_cam_to_rect(1:3,1:3) = R;
P_velo_to_img = P*R_cam_to_rect*Tr_velo_to_cam;
% load and display image
img = imread(sprintf('D:/Shared/training/image_2/%06d.png',frame));
fig = figure('Position',[20 100 size(img,2) size(img,1)]); axes('Position',[0 0 1 1]);
imshow(img); hold on;
% load velodyne points
fid = fopen(sprintf('D:/Shared/training/velodyne/%06d.bin',frame),'rb');
velo = fread(fid,[4 inf],'single')';
% remove every 5th point for display speed
velo = velo(1:5:end,:);
fclose(fid);
% remove all points behind image plane (approximation)
idx = velo(:,1)<5;
velo(idx,:) = [];
% project to image plane (exclude luminance)
velo_img = project(velo(:,1:3),P_velo_to_img);
% plot points
cols = jet;
for i=1:size(velo_img,1)
col_idx = round(64*5/velo(i,1));
plot(velo_img(i,1),velo_img(i,2),'o','LineWidth',4,'MarkerSize',1,'Color',cols(col_idx,:));
end
```
where the readCalibration function is defined as:
```matlab
function [P, Tr_velo_to_cam, R_cam_to_rect] = readCalibration(calib_dir,img_idx,cam)
% load 3x4 projection matrix
P = dlmread(sprintf('%s/%06d.txt',calib_dir,img_idx),' ',0,1);
Tr_velo_to_cam = P(6,:);
R_cam_to_rect = P(5,1:9);
P = P(cam+1,:);
P = reshape(P ,[4,3])';
Tr_velo_to_cam = reshape(Tr_velo_to_cam ,[3,4])';
R_cam_to_rect = reshape(R_cam_to_rect ,[3,3])';
end
```
But here is the result:
What is wrong with my code? I changed the "cam" variable from 0 to 3 and none of them worked. You can find a sample of the calibration file in this link:
How to understand KITTI camera calibration files
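The project helper is not shown in the question; as an assumption (it is not code from the post), here is a minimal sketch of the usual KITTI-devkit-style projection it is expected to perform: append a homogeneous 1, multiply by the 3x4 projection matrix, and divide by the third component.

```matlab
% Hypothetical sketch of the `project` helper used above (an assumption, not
% code from the post); would be saved as project.m.
function p_img = project(p_velo, P)
% p_velo : Nx3 points in Velodyne coordinates
% P      : 3x4 projection matrix (here P_velo_to_img)
p_hom = [p_velo, ones(size(p_velo,1),1)]';   % 4xN homogeneous points
p_cam = P * p_hom;                           % 3xN projected points
p_img = (p_cam(1:2,:) ./ p_cam(3,:))';       % Nx2 pixel coordinates
end
```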
I fixed it by myself. Here is the modification in the readCalibration function:
```matlab
Tr_velo_to_cam = P(6,:);
Tr_velo_to_cam = reshape(Tr_velo_to_cam ,[4,3])';
Tr_velo_to_cam = [Tr_velo_to_cam;0 0 0 1];
```
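For readability, here is the readCalibration function with that modification applied, assembled from the snippets above (a sketch, not re-tested against the KITTI files):

```matlab
function [P, Tr_velo_to_cam, R_cam_to_rect] = readCalibration(calib_dir,img_idx,cam)
% load the calibration file as a numeric matrix (the raw result is kept in C
% instead of being reused as P, purely for readability)
C = dlmread(sprintf('%s/%06d.txt',calib_dir,img_idx),' ',0,1);
% 3x4 velodyne-to-camera transform, padded to 4x4 (the fix from above)
Tr_velo_to_cam = reshape(C(6,:), [4,3])';
Tr_velo_to_cam = [Tr_velo_to_cam; 0 0 0 1];
% 3x3 rectifying rotation
R_cam_to_rect = reshape(C(5,1:9), [3,3])';
% 3x4 projection matrix of the requested camera
P = reshape(C(cam+1,:), [4,3])';
end
```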

Issues with imgIdx in DescriptorMatcher mexopencv

My idea is simple here. I am using mexopencv and trying to see whether any object present in my current frame matches any image stored in my database. I am using the OpenCV DescriptorMatcher to train on my images.
Here is a snippet I wish to build on top of; it does one-to-one image matching using mexopencv and can also be extended to an image stream.
```matlab
function hello
detector = cv.FeatureDetector('ORB');
extractor = cv.DescriptorExtractor('ORB');
matcher = cv.DescriptorMatcher('BruteForce-Hamming');
train = [];
for i=1:3
train(i).img = [];
train(i).points = [];
train(i).features = [];
end;
train(1).img = imread('D:\test\1.jpg');
train(2).img = imread('D:\test\2.png');
train(3).img = imread('D:\test\3.jpg');
for i=1:3
frameImage = train(i).img;
framePoints = detector.detect(frameImage);
frameFeatures = extractor.compute(frameImage , framePoints);
train(i).points = framePoints;
train(i).features = frameFeatures;
end;
for i = 1:3
boxfeatures = train(i).features;
matcher.add(boxfeatures);
end;
matcher.train();
camera = cv.VideoCapture;
pause(3);%Sometimes necessary
window = figure('KeyPressFcn',@(obj,evt)setappdata(obj,'flag',true));
setappdata(window,'flag',false);
while(true)
sceneImage = camera.read;
sceneImage = rgb2gray(sceneImage);
scenePoints = detector.detect(sceneImage);
sceneFeatures = extractor.compute(sceneImage,scenePoints);
m = matcher.match(sceneFeatures);
%{
%Comments in
img_no = m.imgIdx;
img_no = img_no(1);
%I am planning to do this based on the fact that
%on a perfect match imgIdx a 1xN will be filled
%with the index of the training
%example 1,2 or 3
objPoints = train(img_no+1).points;
boxImage = train(img_no+1).img;
ptsScene = cat(1,scenePoints([m.queryIdx]+1).pt);
ptsScene = num2cell(ptsScene,2);
ptsObj = cat(1,objPoints([m.trainIdx]+1).pt);
ptsObj = num2cell(ptsObj,2);
%This is where the problem starts here, assuming the
%above is correct , Matlab yells this at me
%index exceeds matrix dimensions.
end [H,inliers] = cv.findHomography(ptsScene,ptsObj,'Method','Ransac');
m = m(inliers);
imgMatches = cv.drawMatches(sceneImage,scenePoints,boxImage,boxPoints,m,...
'NotDrawSinglePoints',true);
imshow(imgMatches);
%Comment out
%}
flag = getappdata(window,'flag');
if isempty(flag) || flag, break; end
pause(0.0001);
end
```
Now the issue here is that imgIdx is a 1xN matrix, and it contains the indices of the different training images, which is obvious. Only on a perfect match is imgIdx completely filled with the matched image's index. So how do I use this matrix to pick the right image index? Also, in these two lines I get an "index exceeds matrix dimensions" error:
```matlab
ptsObj = cat(1,objPoints([m.trainIdx]+1).pt);
ptsObj = num2cell(ptsObj,2);
```
This is obvious, since while debugging I could clearly see that m.trainIdx is larger than objPoints, i.e. I am accessing points that I should not, hence the index exceeds the dimensions.
There is scant documentation on the use of imgIdx, so I need help from anybody who has knowledge of this subject.
These are the images I used.
Image1
Image2
Image3
1st update after @Amro's response:
With the ratio of min distance to distance at 3.6, I get the following response.
With the ratio of min distance to distance at 1.6, I get the following response.
I think it is easier to explain with code, so here it goes :)
```matlab
%% init
detector = cv.FeatureDetector('ORB');
extractor = cv.DescriptorExtractor('ORB');
matcher = cv.DescriptorMatcher('BruteForce-Hamming');
urls = {
'http://i.imgur.com/8Pz4M9q.jpg?1'
'http://i.imgur.com/1aZj0MI.png?1'
'http://i.imgur.com/pYepuzd.jpg?1'
};
N = numel(urls);
train = struct('img',cell(N,1), 'pts',cell(N,1), 'feat',cell(N,1));
%% training
for i=1:N
% read image
train(i).img = imread(urls{i});
if ~ismatrix(train(i).img)
train(i).img = rgb2gray(train(i).img);
end
% extract keypoints and compute features
train(i).pts = detector.detect(train(i).img);
train(i).feat = extractor.compute(train(i).img, train(i).pts);
% add to training set to match against
matcher.add(train(i).feat);
end
% build index
matcher.train();
%% testing
% lets create a distorted query image from one of the training images
% (rotation+shear transformations)
t = -pi/3; % -60 degrees angle
tform = [cos(t) -sin(t) 0; 0.5*sin(t) cos(t) 0; 0 0 1];
img = imwarp(train(3).img, affine2d(tform)); % try all three images here!
% detect fetures in query image
pts = detector.detect(img);
feat = extractor.compute(img, pts);
% match against training images
m = matcher.match(feat);
% keep only good matches
%hist([m.distance])
m = m([m.distance] < 3.6*min([m.distance]));
% sort by distances, and keep at most the first/best 200 matches
[~,ord] = sort([m.distance]);
m = m(ord);
m = m(1:min(200,numel(m)));
% naive classification (majority vote)
tabulate([m.imgIdx]) % how many matches each training image received
idx = mode([m.imgIdx]);
% matches with keypoints belonging to chosen training image
mm = m([m.imgIdx] == idx);
% estimate homography (used to locate object in query image)
ptsQuery = num2cell(cat(1, pts([mm.queryIdx]+1).pt), 2);
ptsTrain = num2cell(cat(1, train(idx+1).pts([mm.trainIdx]+1).pt), 2);
[H,inliers] = cv.findHomography(ptsTrain, ptsQuery, 'Method','Ransac');
% show final matches
imgMatches = cv.drawMatches(img, pts, ...
train(idx+1).img, train(idx+1).pts, ...
mm(logical(inliers)), 'NotDrawSinglePoints',true);
% apply the homography to the corner points of the training image
[h,w] = size(train(idx+1).img);
corners = permute([0 0; w 0; w h; 0 h], [3 1 2]);
p = cv.perspectiveTransform(corners, H);
p = permute(p, [2 3 1]);
% show where the training object is located in the query image
opts = {'Color',[0 255 0], 'Thickness',4};
imgMatches = cv.line(imgMatches, p(1,:), p(2,:), opts{:});
imgMatches = cv.line(imgMatches, p(2,:), p(3,:), opts{:});
imgMatches = cv.line(imgMatches, p(3,:), p(4,:), opts{:});
imgMatches = cv.line(imgMatches, p(4,:), p(1,:), opts{:});
imshow(imgMatches)
```
The result:
Note that since you did not post any testing images (in your code you are taking input from the webcam), I created one by distorting one of the training images and using it as a query image. I am using functions from certain MATLAB toolboxes (imwarp and such), but those are non-essential to the demo and you could replace them with equivalent OpenCV ones...
I must say that this approach is not the most robust one. Consider using other techniques, such as the bag-of-words model, which OpenCV already implements.