How to interpret the error while calibrating a camera - MATLAB

I'm trying to calibrate a camera using MATLAB.
Is the reprojection error the distance between a pattern keypoint detected in a calibration image and the corresponding world point projected into the same image?
The app shows an Overall Mean Error (for example 0.86 pixels), and that's what I want to understand.
Is 0 the best possible result?
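For context, a minimal sketch of where that number comes from (assuming a cameraParams object returned by MATLAB's estimateCameraParameters; the error is measured in pixels, so lower is better and 0 would mean every reprojected point landed exactly on its detected keypoint):

% Minimal sketch, assuming cameraParams came from estimateCameraParameters.
meanErr = cameraParams.MeanReprojectionError   % the "Overall Mean Error" in pixels
showReprojectionErrors(cameraParams);          % per-image error bar chart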

Related

Matlab: Reprojection error is high compared to calibration reprojection error

I calibrated stereo cameras using the MATLAB Stereo Camera Calibrator app, and my reprojection error was nearly 0.19 pixels. But when I used the same calibration parameters to find the 3D coordinates of an object, the reprojection error was much higher, nearly 7-8 pixels. What I'm doing is finding the center of the object in the left and right images, say (xl,yl) and (xr,yr) respectively, and then using these points to compute the reprojection error. What am I doing wrong? Why is the reprojection error so high compared to the calibration reprojection error?
Thanks.
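One hedged guess at the cause: MATLAB's triangulate does not account for lens distortion, so raw image points should be undistorted first. A minimal sketch, assuming stereoParams from the Stereo Camera Calibrator and the object centers (xl, yl) and (xr, yr) from the question:

% Sketch: undistort the raw object centers before triangulating, since
% triangulate does not compensate for lens distortion itself.
ptLeft  = undistortPoints([xl yl], stereoParams.CameraParameters1);
ptRight = undistortPoints([xr yr], stereoParams.CameraParameters2);
[worldPoint, reprojErr] = triangulate(ptLeft, ptRight, stereoParams);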

Does distance from camera to calibration pattern affect calibration parameters?

I'm trying to use a stereo camera to measure the distance from the cameras to a dynamic object (a moving car, for example). I used a checkerboard pattern of 7 by 8 squares with a square size of 89 millimeters (~3.5 inches). The distance from the camera to the pattern was 212 centimeters (~83.5 inches). I'm using Python and OpenCV.
My questions are:
Does the distance from the pattern to the camera significantly affect the calibration parameters? It is stated in one of MATLAB's examples that the distance from the camera to the pattern during calibration should be the same as the distance of the object to be measured.
Should I use a bigger board and increase the camera-to-pattern distance to get more accurate results for my application?
I think that the specific distance you use for the calibration shouldn't really matter. What does matter is that you take as many different images of your checkerboard as possible, at least 15. The checkerboard should be moved so that you cover the whole camera field, and it should also be imaged at different out-of-plane orientations. Having a checkerboard with more squares should also be beneficial, as this means more corner points per image. The size of the squares shouldn't make a difference.
On the other hand, camera calibration should be performed with a fixed focus, which also shouldn't change after the calibration. So, in practice, I guess this forces you to perform the calibration at a distance similar to the one that will be used later in the experiment.
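As a rough sketch of that workflow in MATLAB (the asker uses OpenCV, but the steps map one-to-one; the file names are placeholders):

% Sketch of the workflow described above; imageFileNames stands in for
% 15+ views covering the whole field at varied out-of-plane angles.
imageFileNames = {'calib01.png', 'calib02.png', 'calib03.png'};
[imagePoints, boardSize] = detectCheckerboardPoints(imageFileNames);
squareSize = 89;  % millimeters, as in the question
worldPoints = generateCheckerboardPoints(boardSize, squareSize);
cameraParams = estimateCameraParameters(imagePoints, worldPoints, ...
    'WorldUnits', 'millimeters');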

Verify that camera calibration is still valid

How do you determine that the intrinsic and extrinsic parameters you have calculated for a camera at time X are still valid at time Y?
My idea would be:
1. Use a known calibration object (a chessboard) and place it in the camera's field of view at time Y.
2. Calculate the chessboard corner points in the camera's image (at time Y).
3. Define one of the chessboard corner points as the world origin and calculate the world coordinates of all remaining chessboard corners based on that origin.
4. Relate the coordinates from step 3 to the camera coordinate system.
5. Use the parameters calculated at time X to calculate the image points of the points from step 4.
6. Calculate the distances between the points from step 2 and the points from step 5.
Is that a sensible way to go about it? I'd eventually like to implement it in MATLAB and later possibly OpenCV. I think I'd know how to do steps 1)-2) and step 6). Maybe someone can give a rough implementation for steps 2)-5). In particular, I'm unsure how to relate the "chessboard-world-coordinate-system" to the "camera-world-coordinate-system", which I believe I would have to do.
Thanks!
If you have a single camera you can easily follow the steps from this article:
Evaluating the Accuracy of Single Camera Calibration
For achieving step 2, you can use the detectCheckerboardPoints function from MATLAB:
[imagePoints, boardSize, imagesUsed] = detectCheckerboardPoints(imageFileNames);
Assuming that you are talking about stereo cameras: for stereo pairs, imagePoints(:,:,:,1) are the points from the first set of images, and imagePoints(:,:,:,2) are the points from the second set. The output contains M [x y] coordinates, each representing a point where square corners were detected on the checkerboard. The number of points the function returns depends on boardSize, which indicates the number of squares detected. The function detects the points with sub-pixel accuracy.
As you can see in the following image, the points are estimated relative to the first point, which covers your third step.
[Image from this page at MathWorks: detected checkerboard points with the origin and axis directions marked.]
You can consider point 1 as the origin of your coordinate system (0,0). The directions of the axes are shown in the image, and you know the distance between each pair of points (in world coordinates), so it is just a matter of depth estimation.
To find a transformation matrix between the points in the world coordinate system and the points in the camera coordinate system, you should collect a set of corresponding points and perform an SVD to estimate the transformation (a sketch follows below).
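A minimal sketch of that SVD step (the Kabsch method), assuming hypothetical N-by-3 matrices worldPts and camPts of corresponding points:

% Sketch: estimate the rigid transform (R, t) mapping worldPts onto camPts.
cw = mean(worldPts, 1);                          % centroid of world points
cc = mean(camPts, 1);                            % centroid of camera points
H  = bsxfun(@minus, worldPts, cw)' * bsxfun(@minus, camPts, cc);
[U, ~, V] = svd(H);                              % SVD of the cross-covariance
R = V * U';                                      % rotation
if det(R) < 0                                    % guard against a reflection
    V(:, 3) = -V(:, 3);
    R = V * U';
end
t = cc' - R * cw';                               % camPts' ~ R * worldPts' + t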
But,
I would re-estimate the parameters of the camera and compare them with the initial parameters from time X. This is easier if you have saved the images that were used when calibrating the camera at time X. By repeating the calibration process using those images you should get very similar results if the camera calibration is still valid.
Edit: why do you need the set of images used in the calibration process at time X?
You have a set of images used for the first calibration, right? To recalibrate the camera you need to use a new set of images. But for checking the previous calibration, you can use the previous images. If the parameters of the camera have changed, there will be an error between the re-estimation and the first estimation. This can be used for evaluating the validity of the calibration, not for recalibrating the camera.
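A sketch of the comparison step (paramsAtX is the saved calibration from time X, and paramsNow is the re-estimation, obtained as in the detectCheckerboardPoints snippet above; both are hypothetical names):

% Hypothetical check: compare the re-estimated parameters against the saved
% ones; large drifts suggest the old calibration is no longer valid.
focalDrift     = abs(paramsNow.FocalLength    - paramsAtX.FocalLength)
principalDrift = abs(paramsNow.PrincipalPoint - paramsAtX.PrincipalPoint)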

Matlab/OpenCV stereo vision distance measurement is not accurate

After stereo calibration, when I run the MATLAB example for stereo depth estimation (SDE), the distances are wrong: at about 2 meters, it always reports the distance as less than 1 m.
And my 3D scene reconstruction looks cone-shaped instead of like the real scene.
The disparity map is very noisy (non-smooth), but it resembles the scene.
If I 'feed' the SDE script the example file instead of webcam input, it runs okay and all looks great; when I feed it from two webcams ('Logitech HD Pro Webcam C920'), that's when I get the bad results above, beginning with a rough disparity map.
I've tried many different calibration attempts, from just a few images up to about 60, with MATLAB's checkerboard pattern at different angles (never > 45) and a distance to the cameras of about 8 to 20'. The camera lenses are always 3.8175" apart and are mounted to the top edge of a laptop. I followed MATLAB's recommended workflow.
What am I doing wrong in calibration?
Matlab R2015a.
Laptop Windows 7 64-bit
Checkerboard pattern is 37" x 27"
JUST DISCOVERED PROBLEM:
I was creating the disparity map with this:
disparityMap = disparity(frameLeftGray, frameRightGray);
However, my camera #1 is on the right. MATLAB says the default disparity range is [0 64], and with cam #1 on the right it should be [-128 0], but that changes the disparity map to a uniform blue.
I got it working. (1) The left/right ordering of the calibration, the images, and the detection data structures must match. (2) Use millimeters for the checkerboard square size; inches cause a malfunction, because everything else is in millimeters.
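For reference, a sketch of the corrected call (variable names from the snippet above; the range shown is illustrative and must suit your baseline and scene depth):

% Sketch of the fix: frameLeftGray must really be the left camera of the
% calibrated pair; an explicit range can be passed if the default [0 64]
% does not cover the scene. The range span must be divisible by 16.
disparityMap = disparity(frameLeftGray, frameRightGray, ...
    'DisparityRange', [0 128]);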

Stereo matching

I am using the Camera Calibration Toolbox for Matlab. After calibration I have the intrinsic and extrinsic parameters of the stereo camera system. Next, I would like to determine the distance between the camera system and an object. To get this information, I used the stereo_triangulation function included in the Toolbox. Its inputs are two matrices containing the pixel coordinates of corresponding points in the left and right images.
I tried to get the coordinates of the correspondences using the Basic Block Matching method described in Matlab's help for Stereo Vision.
The resolution of my images is 1280x960 pixels. I know that the biggest disparity is around 520 pixels, so I set the maximum of the disparity range to 520. But then determining the coordinates takes ages; it is not usable in practice. Calculating a disparity map is much faster with Matlab's disparity() function, but I want the step before that - the coordinates of the correspondences.
Please, can you suggest how I can get the coordinates efficiently with Matlab?
Disparity and 3D are related by simple formulas (see below), so the time for calculating 3D data and a disparity map should be the same. The notation is:
f - focal length in pixels,
B - separation between cameras,
u, v - row and column in the system centered on the middle of the image,
d - disparity,
x, y, z - 3D coordinates.
z=f*B/d;
x=z*u/f;
y=z*v/f;
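A minimal sketch applying these formulas to a whole disparity map (f, B, and disparityMap are assumed known; u and v are the centered row and column, as defined above):

% Sketch: convert a disparity map to 3D with the formulas above.
[H, W] = size(disparityMap);
[c, r] = meshgrid(1:W, 1:H);       % column and row index of every pixel
u = r - H/2;                       % row, centered on the middle of the image
v = c - W/2;                       % column, centered on the middle of the image
z = f * B ./ disparityMap;         % depth
x = z .* u / f;
y = z .* v / f;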
1280x960 is too large a resolution for any correlation-based stereo to work in real time. Think about it: you have to loop over a 2D image, over a 2D correlation window, and over the range of disparities. This means 5 nested loops! I don't work with Matlab anymore, but I know that it is quite slow.
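And if you specifically need the correspondence coordinates rather than the map, they follow directly from the definition of disparity (a sketch; (r, c) is a pixel in the left image):

% Sketch: read a correspondence off the disparity map. A left-image pixel
% (r, c) with disparity d matches the right-image pixel (r, c - d).
d = disparityMap(r, c);
if isfinite(d) && d >= 0           % skip pixels flagged as unreliable
    leftPoint  = [c, r];           % [x y] in the left image
    rightPoint = [c - d, r];       % [x y] of the match in the right image
end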