I searched for face detection code for MATLAB for my project.
I found one:
http://people.kyb.tuebingen.mpg.de/kienzle/fdlib/fdlib.htm
I downloaded the source code, but it didn't work; I got this error from MATLAB:
??? Undefined function or method 'fdmex' for input arguments of type
'uint8'.
Error in ==> tinytest at 10 s = fdmex(x', threshold);
The main script is:
x = imread('geeks.jpg');
% decision threshold.
% change this to a smaller value, if too many false detections occur.
% change it to a larger value, if faces are not recognized.
% a reasonable range is -10 ... 10.
threshold = 0;
imagesc(x); hold on; colormap gray;
s = fdmex(x', threshold);
for i = 1:size(s,1)
    h = rectangle('Position', [s(i,1)-s(i,3)/2, s(i,2)-s(i,3)/2, s(i,3), s(i,3)], ...
        'EdgeColor', [1,0,0], 'LineWidth', 2);
end
axis equal;
axis off
Can you find the error?
Usually when I see a uint8 error and a grayscale image, it's a red flag to me that I need to do
colorImg = imread('imageName.jpg');
% Even if the image looks grayscale, a PNG or JPEG will almost always
% load as a color (M-by-N-by-3) image.
img = rgb2gray(colorImg);
If you look at the img output, you will notice it is now a single 2-D matrix rather than a 3-channel color array. :)
If that doesn't work, hopefully macduff's answer will; mine just seems easier, if it actually does fix it. :)
Depending on your version of MATLAB, it looks like fdlib comes with a .dll; rename it to .mexw32 or whatever your host machine desires. You can determine this by running:
>> mexext
mexw32
at the MATLAB command prompt. Use that MEX extension and rename fdmex.dll to fdmex.mexw32 (or whatever mexext returns), and it should run flawlessly.
If I run it on my Windows XP machine, I get this beautiful picture:
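If it helps, the rename can also be done from inside MATLAB itself; a minimal sketch (fdmex.dll is the file name from the question, and the target name is built from whatever mexext reports on your machine):
% rename the pre-built MEX binary so this MATLAB finds it for the current platform
movefile('fdmex.dll', ['fdmex.' mexext]);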
However, if you do not have a 32 bit machine, the author of the software writes on the link in the question:
Please note that all builds were optimized for Intel Pentium CPUs. If
you would like to run it on a different platform, or have any other
questions, please let me know.
He has a link to his profile and email, so I recommend contacting him for a 64 bit version of the executable.
If you have a recent version of MATLAB with the Computer Vision System Toolbox installed, you can use the vision.CascadeObjectDetector System object to detect faces in images.
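A minimal sketch of that approach, using the geeks.jpg image from the question (the detector loads a frontal-face model by default):
% requires the Computer Vision System Toolbox
I = imread('geeks.jpg');
detector = vision.CascadeObjectDetector();     % default frontal-face model
bboxes = step(detector, I);                    % one [x y width height] row per detection
out = insertObjectAnnotation(I, 'rectangle', bboxes, 'Face');
figure; imshow(out); title('Detected faces');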
I am using the BNT toolbox, a big library written in MATLAB for inference in Bayesian networks.
I had to add this toolbox to the MATLAB path. But after doing that I can't use the default legend function any more.
I think that this library might have its own legend function, shadowing the default one. How can I manually tell MATLAB that I want the original one and not the one in the new toolbox?
Tried in MATLAB R2018b and R2020a.
EDIT: to reproduce it:
When I run the test script (testscript1 below), it shows the lines and the legend.
https://github.com/bayesnet/bnt is the toolbox I talked about. I downloaded it, unzipped it, and added it to my path with Home -> Set path -> Add folder with subfolders.
When I run the script now, it shows the lines but not the legend.
NOTE: when I tried another way of plotting (see testscript2), the legend shows up again, so this is a working workaround.
Testscript1: (location: C:\Users\TomDe\Downloads\FullBNT-1.0.7\bnt\own\testscript1.m)
x = linspace(0,pi);
y1 = cos(x);
plot(x,y1)
hold on
y2 = cos(2*x);
plot(x,y2)
legend('cos(x)','cos(2x)')
Testscript2
% Some other code
tiledlayout(2,1)
nexttile
plot(inputPath)
hold on
plot(sensorPath)
plot(inputInference)
hold off
title('The Input sequence and sensor readings ')
legend('Path', 'sensor', 'Inference')
You can check that the toolbox is indeed shadowing legend with the which function:
>> which legend -all
It's generally a bad idea to shadow MATLAB's own functions, so I highly suggest you avoid this problem in the first place: create a MATLAB package and place the source code of this toolbox in there.
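For illustration only, here is how package-qualified names sidestep the clash; this is a self-contained toy (the +mytoolbox folder and its legend.m are made up, not BNT itself, and BNT's own internal calls would also need updating if you actually packaged it):
% file: +mytoolbox/legend.m  (the folder containing +mytoolbox must be on the path)
function legend(varargin)
% stand-in for the toolbox's shadowing legend
disp('toolbox legend called');
end

% elsewhere:
mytoolbox.legend();   % explicitly calls the package version
legend('cos(x)');     % still resolves to MATLAB's built-in legend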
For demonstration purposes only, I'll show how to call the real legend.m:
>> wd = pwd;
>> cd 'C:\Program Files\MATLAB\R2020a\toolbox\matlab\scribe\'
>> legend(...)
>> cd(wd);
this being the location of the file on a MATLAB R2020a install.
There are two things you can do:
If you always want to use the default legend and never the one in the toolbox: pass the -end option to your addpath call when adding the BNT toolbox directory, so that its folders appear at the end of the path (see the sketch after the function below). MATLAB finds functions by looking through the path directories in turn, so directories earlier on the path take precedence.
If you want to use both versions of legend and choose which one to use: write a little support function that removes the BNT toolbox from your path, calls legend, then adds the toolbox back in. Such a function looks like this (save it as original_legend.m somewhere on your path, then call it the same way you'd call legend, just using this new name instead):
function out = original_legend(varargin)
    rmpath /path/to/bnt/toolbox      % temporarily hide the toolbox's legend
    out = legend(varargin{:});       % this now resolves to MATLAB's own legend
    addpath /path/to/bnt/toolbox     % restore the toolbox
end
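For the first option, a minimal sketch, assuming the toolbox sits where the question's test script does:
% add BNT (with all its subfolders) at the END of the search path,
% so MATLAB's own legend is still found first
addpath(genpath('C:\Users\TomDe\Downloads\FullBNT-1.0.7'), '-end');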
I have developed a GUI with MATLAB's App Designer. To pick up mouse clicks, I set the ButtonDownFcn callback for the image I have plotted. Then inside the callback I read hit.IntersectionPoint:
ax = app.UIAxes;
ih = imagesc(I, 'Parent', ax);
ih.ButtonDownFcn = {@im_ButtonDownFcn, app};

function im_ButtonDownFcn(im, hit, app)
mouse_pos = flip(round(hit.IntersectionPoint(1:2))); % hit.IntersectionPoint is [NaN NaN NaN] on the failing machine
On my computer everything works, but on a colleague's computer, it returns NaNs. As far as we can tell, the differences are:
Mine (works):     Windows 10, USB mouse
Colleague (NaN):  macOS, laptop trackpad
Does anyone know if either of these could be a factor in IntersectionPoint returning NaNs? Or have another suggestion that we could troubleshoot?
I have read here that IntersectionPoint is only available from R2018a onwards. My colleague uses R2017b, but when I tested on R2017b on my computer, IntersectionPoint still worked.
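One thing we could try while troubleshooting (only a sketch, not a confirmed fix): read the click position from the parent axes' CurrentPoint property instead of the event's IntersectionPoint:
function im_ButtonDownFcn(im, hit, app)
% CurrentPoint also reports the click in data coordinates
% (first row, columns 1:2 are x and y)
cp = get(ancestor(im, 'axes'), 'CurrentPoint');
mouse_pos = flip(round(cp(1, 1:2)));
end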
I'm trying to get this MATLAB example from the MathWorks site to work with Octave 4.0.0:
http://www.mathworks.com/help/vision/examples/motion-based-multiple-object-tracking.html
I made a GitHub repository for what I have so far, which is mostly a re-formatted version of the MathWorks link above:
https://github.com/MicrocontrollersAndMore/Matlab_Octave_Multiple_Object_Tracking
The only changes I have made so far are:
-Made a separate main.m file to run multiObjectTracking.m
-Since I don't have the 'atrium.avi' file used by MathWorks, I changed the VideoFileReader line in the code to use '768x576.avi', which is included with OpenCV ('768x576.avi' is also uploaded to the GitHub repo linked above)
-Minor spacing and comment changes
-added "pkg load image;" at the beginning of main.m and multiObjectTracking.m, in the few test Octave computer vision programs I did this seemed to be necessary, otherwise I would get an error to the effect of "library image has been installed but not loaded"
Currently when I run the program I get the following error:
error: 'vision' undefined near line 38 column 18
error: called from
multiObjectTracking>setupSystemObjects at line 38 column 16
multiObjectTracking at line 14 column 7
main at line 14 column 1
In other words, in the function:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function obj = setupSystemObjects()
% initialize Video I/O, create objects for reading a video from a file, drawing the tracked objects in each frame, and playing the video
obj.reader = vision.VideoFileReader('768x576.avi'); % create a video file reader
obj.videoPlayer = vision.VideoPlayer('Position', [20, 400, 700, 400]); % create two video players, one to display the video,
obj.maskPlayer = vision.VideoPlayer('Position', [740, 400, 700, 400]); % and one to display the foreground mask
% Create System objects for foreground detection and blob analysis
% The foreground detector is used to segment moving objects from the background. It outputs a binary mask, where the pixel value
% of 1 corresponds to the foreground and the value of 0 corresponds to the background
obj.detector = vision.ForegroundDetector('NumGaussians', 3, 'NumTrainingFrames', 40, 'MinimumBackgroundRatio', 0.7);
% Connected groups of foreground pixels are likely to correspond to moving objects. The blob analysis System object is used to find such groups
% (called 'blobs' or 'connected components'), and compute their characteristics, such as area, centroid, and the bounding box.
obj.blobAnalyser = vision.BlobAnalysis('BoundingBoxOutputPort', true, 'AreaOutputPort', true, 'CentroidOutputPort', true, 'MinimumBlobArea', 400);
end
The 'vision' object is not recognized.
It is my understanding that 'vision' is part of a MATLAB toolbox, but I am unable to confirm this since I don't have access to MATLAB.
So here are my questions so far:
-Is there an Octave equivalent of the 'vision' object?
-What other differences should I be aware of to get this MATLAB program running under Octave?
I have been attempting to use the following site:
http://www.peterkovesi.com/matlabfns/
but have not been very successful so far at getting those examples working, or at using them as a guide for the MATLAB-to-Octave translation I'm attempting.
Any assistance from Octave experts, or from those who have gotten computer vision working in both MATLAB and Octave, would be greatly appreciated.
Those are MATLAB functions from the Computer Vision System Toolbox.
As a general rule, Octave falls short of matching MATLAB's toolboxes, and where it does have something comparable, you need to install the corresponding Octave package separately.
The site you linked does not seem to provide support for the vision object functionality either.
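If it helps, a quick way to see what an Octave install does provide (standard Octave pkg commands; they will not give you the vision System objects, but they show what image-processing functionality is available):
pkg list                        % show installed Octave Forge packages
pkg describe -verbose image     % list the functions the image package provides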
I am trying to train a cascade object detector using the built-in function in MATLAB (Computer Vision System Toolbox). However, the following message came up after running the command:
Error using trainCascadeObjectDetector (line 245)
Error reading instance 1 from image 2, bounding box possibly out of image bounds.
I don't understand why the bounding box can be out of bounds. All the parameters for my positive images are set up correctly (starting point x and y, width, and height; I used createMask(h) to create a mask, took the minimum x and y coordinates as the starting point, and max minus min in each dimension as the width and height), and the negative images (as far as I know) are just images without any setup needed.
Has anyone ever run into the same problem? How did you solve it?
EDIT:
Here's the code. I don't have the toolbox tool for creating the training data struct, so I wrote one myself:
positive_samples = struct;
list = dir('my_folder_name_which_I_took_out');
L = length(list)-3; % set L to be the length of the image list
for i = 1:length(list)
    positive_samples(i).imageFilename = list(i).name;
end
positive_samples(:,1) = []; % the first 3 entries do not contain file names
positive_samples(:,1) = [];
positive_samples(:,1) = [];
for j = 1:1
    imshow(positive_samples(j).imageFilename);
    title(positive_samples(j).imageFilename);
    h = imrect;
    h1 = createMask(h);
    I = imread(positive_samples(j).imageFilename);
    [le, wi, hi] = size(I);
    tempmat = [];
    count = 1;
    for l = 1:le
        for m = 1:wi
            if h1(l,m) == 1
                tempmat(count,1) = l;
                tempmat(count,2) = m;
                count = count+1;
            end
        end
    end
    positive_samples(j).objectBoundingBoxes(1,1) = min(tempmat(:,1));
    positive_samples(j).objectBoundingBoxes(1,2) = min(tempmat(:,2));
    positive_samples(j).objectBoundingBoxes(1,3) = max(tempmat(:,2))-min(tempmat(:,2));
    positive_samples(j).objectBoundingBoxes(1,4) = max(tempmat(:,1))-min(tempmat(:,1));
    imtool close all
end
trainCascadeObjectDetector('animalfinder.xml', positive_samples, 'my_negative_folder_name', 'FalseAlarmRate', 0.2, 'NumCascadeStages', 3);
Sorry if it's messy...
I did not run the code because I don't own the toolbox, but the following lines look very suspicious:
positive_samples(j).objectBoundingBoxes(1,1)=min(tempmat(:,1));
positive_samples(j).objectBoundingBoxes(1,2)=min(tempmat(:,2));
positive_samples(j).objectBoundingBoxes(1,3)=max(tempmat(:,2))-min(tempmat(:,2));
positive_samples(j).objectBoundingBoxes(1,4)=max(tempmat(:,1))-min(tempmat(:,1));
I would expect (objectBoundingBoxes is in [x y width height] format, where x indexes columns, but your tempmat stores row indices in column 1 and column indices in column 2):
positive_samples(j).objectBoundingBoxes(1,1)=min(tempmat(:,2));
positive_samples(j).objectBoundingBoxes(1,2)=min(tempmat(:,1));
positive_samples(j).objectBoundingBoxes(1,3)=max(tempmat(:,2))-min(tempmat(:,2));
positive_samples(j).objectBoundingBoxes(1,4)=max(tempmat(:,1))-min(tempmat(:,1));
Some suggestions to shorten your code (they are not related to the problem):
You can shorten lines 4 to 9 to a single line, avoiding the loop: [positive_samples(1:L).imageFilename] = list(4:end).name;
And this loop can be replaced as well:
tempmat=[];
count=1;
for l=1:le
for m=1:wi
if h1(l,m)==1
tempmat(count,1)=l;
tempmat(count,2)=m;
count=count+1;
end
end
end
Shorter and faster code:
[y,x]=find(h1);
tempmat=[y x];
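Putting the two suggestions together, the bounding-box part of the loop could look roughly like this (a sketch in the same spirit, not tested against the toolbox):
[y, x] = find(h1);   % row (y) and column (x) indices of the masked pixels
positive_samples(j).objectBoundingBoxes = ...
    [min(x), min(y), max(x)-min(x), max(y)-min(y)];   % [x y width height]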
There is a better way to label your positive samples: the Computer Vision System Toolbox includes the Training Image Labeler app as of release R2014a. If you do not have R2014a, you should try the Cascade Training GUI app.
I always thought that, in MATLAB, a graphics handle X whose HandleVisibility property is set to anything other than 'on' will not show up when using findobj(h) or get(h, 'Children'), with h being the parent of X. However, this seems to be true only under Windows and not under Linux. I am using MATLAB R2011b both on Debian 6.0.6 (squeeze) and Windows 7. If I run the following code under Windows:
figure;plot(randn(1,1000));
h = get(gcf, 'Children');
Then I get a single handle in h, which corresponds to the axes that contain my random plot. This is what I would expect. However, if I run exactly the same code on Linux, h contains an array of 10 handles. Indeed, most of those handles are just UI elements whose HandleVisibility property is set to off. For instance:
get(h(end), 'Type') % returns: 'uitoolbar'
get(h(end), 'HandleVisibility') % returns 'off'
Is there a reason for this apparently inconsistent behavior? Can this be reproduced by others? In case it is relevant, the Debian server that I am using runs Sun Java 1.6.0_26, which is not Debian's default (OpenJDK).
I was unable to reproduce this on either R2011a or R2012b with Sun Java.
One workaround might be to filter based on visibility:
visibleChildren = findobj(get(h,'children'),'HandleVisibility','on')
Sounds like something specific to your install.