Control a digital camera from MATLAB

I have a digital camera that doesn't support taking photos from the computer.
Can I control this camera with MATLAB, or do I need a digital camera that supports remote control in order to drive it from MATLAB?
I just want to take a photo from MATLAB (the camera is connected to the computer via USB).

If you want to control cameras with MATLAB, you need the Image Acquisition Toolbox. In addition, the camera you want to connect must be supported by the toolbox.
You may want to check out
http://www.mathworks.com/products/imaq/
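If your camera is supported, taking a photo is only a few lines. A minimal sketch, assuming the `'winvideo'` adaptor and device ID 1 (use `imaqhwinfo` to find what your system actually exposes):

```matlab
% Grab a single frame from a USB camera with the Image Acquisition Toolbox.
% Adaptor name ('winvideo') and device ID (1) are assumptions -- check
% imaqhwinfo for the adaptors and devices installed on your machine.
info = imaqhwinfo;                 % enumerate available adaptors
vid  = videoinput('winvideo', 1);  % open the first winvideo device
img  = getsnapshot(vid);           % capture one frame as an image array
imwrite(img, 'photo.jpg');         % save it to disk
delete(vid);                       % release the camera
```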

If your camera also has an A/V output (as on many Sony digital cameras), you can use a USB TV tuner to capture the analog signal and then use MATLAB to grab the video/images from the camera in real time. Works for me!

Related

MATLAB - Run webcam parallel to processing

Hello and thank you in advance.
I am working on a MATLAB algorithm, using the computer vision toolbox, to detect objects from a live camera feed, displaying frames with bounding boxes on a deployable video player.
Due to limitations of my hardware, the detection will be slower than the maximum FPS delivered by the camera.
Now I'd like to display the webcam feed at full speed rather than waiting for the detection to finish, so that the output video stays fluent and detections are inserted whenever they become available.
Is there a way?
My first approach was to use the parfeval function to run the detection in parallel, but it failed due to my lack of knowledge of how to pass the frame to the detector and insert the resulting bounding boxes into the frame whenever they are finished.
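One way to structure this with parfeval is to keep the display loop running at full rate and only poll the detection future, reusing the last known boxes until a new result arrives. A sketch, assuming a hypothetical detector function `detectFcn(frame)` that returns an M-by-4 matrix of bounding boxes (swap in your own detector):

```matlab
% Asynchronous detection pattern: display every frame immediately,
% overlay the most recent finished detection result.
cam    = webcam;                          % requires the webcam support package
player = vision.DeployableVideoPlayer;
pool   = gcp;                             % start or reuse a parallel pool
fut    = parfeval(pool, @detectFcn, 1, snapshot(cam));  % first async detection
bboxes = zeros(0, 4);                     % last known detections

while isOpen(player)
    frame = snapshot(cam);                % always grab the newest frame
    if strcmp(fut.State, 'finished')      % did the detection complete?
        bboxes = fetchOutputs(fut);       % collect its bounding boxes
        fut = parfeval(pool, @detectFcn, 1, frame);  % launch the next one
    end
    if ~isempty(bboxes)
        frame = insertShape(frame, 'Rectangle', bboxes);  % stale but recent boxes
    end
    step(player, frame);                  % display at full frame rate
end
```

The boxes lag the video by one detection interval, but the display never stalls, which is usually the acceptable trade-off for a live preview.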

Using Vision Caltech Camera Calibration toolbox for MATLAB

I am currently using the camera calibration toolbox from Caltech Vision: http://www.vision.caltech.edu/bouguetj/calib_doc/htmls/example.html
It has been good so far and I was able to obtain the camera parameters of my camera and everything using the checkered board.
So now I have been taking pictures with my camera and I want to undistort them using the calibration results I obtained from the caltech calibration toolbox.
The parameters are saved in a .mat file but I cannot find a way to use them on other images I took.
Anyone knows how to do that?
Thanks!
There is an easier way to calibrate cameras in MATLAB itself, using the Computer Vision System Toolbox. That toolbox also includes the undistortImage function, which is what you are looking for.
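For reference, here is a sketch of the native MATLAB workflow end to end, from checkerboard images to undistorting a new photo. Folder and file names are placeholders, and the 25 mm square size is an assumption you must replace with your board's actual measurement:

```matlab
% Calibrate from a folder of checkerboard images, then undistort a photo.
files = imageDatastore('calibImages').Files;          % checkerboard shots
[imagePoints, boardSize] = detectCheckerboardPoints(files);
squareSize  = 25;                                     % mm -- measure your board
worldPoints = generateCheckerboardPoints(boardSize, squareSize);
cameraParams = estimateCameraParameters(imagePoints, worldPoints);

I = imread('myPhoto.jpg');                            % a new image from the same camera
J = undistortImage(I, cameraParams);                  % remove lens distortion
imshowpair(I, J, 'montage');                          % compare before/after
```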

Matlab Camera Calibration: saving chessboard calibration images

I'm currently trying to calibrate a camera and use those data to calibrate a projector. The first steps of this process are to print a chessboard, generate chessboard calibration images, and use them to calibrate the camera. The MATLAB documentation is quite thorough, but it doesn't mention how one can generate one's own chessboard calibration images. I can only assume this is a fairly simple thing to do, but I'm new to MATLAB and haven't figured it out yet, so any help would be greatly appreciated.
You generate the calibration images by taking pictures of the chessboard with your camera.
Also, if you have a recent version of MATLAB with the Computer Vision System Toolbox, try the Camera Calibrator app.
https://sites.google.com/site/visheshpanjabihomepage/codes
The second set of codes will help you collect the images for the camera calibration toolbox by Dr. Bouguet (the link is given on that page).
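Collecting the images yourself is also straightforward with the webcam support package. A minimal sketch, where the image count, file names, and prompt flow are all placeholders to adapt:

```matlab
% Capture a set of chessboard calibration images interactively.
cam = webcam;                        % requires the MATLAB webcam support package
numImages = 20;                      % placeholder -- 10-20 views is typical
for k = 1:numImages
    % Reposition/tilt the printed chessboard between shots for good coverage.
    input(sprintf('Move the board, then press Enter to capture %d/%d: ', ...
          k, numImages), 's');
    img = snapshot(cam);
    imwrite(img, sprintf('calib_%02d.png', k));
end
clear cam                            % release the camera, then run cameraCalibrator
```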

Camera Calibration on MATLAB

I am very new to camera calibration and I have been trying to work with the camera calibration app from MATLAB's computer vision toolbox.
So I followed the steps they suggested on the website and so far so good, I was able to obtain the intrinsic parameters of the camera.
So now I am kind of confused about what I should do with the "cameraParameters" object that was created when the calibration was done.
So my questions are:
(1) What should I do with the cameraParameters object that was created?
(2) How do I use this object when I am using the camera to capture images of something?
(3) Do I need the checker board around each time I capture images for my experiment?
(4) Should the camera be placed at the same spot each time?
I am sorry if those questions are really beginner level, camera calibration is new to me and I was not able to find my answers.
Thank you so much for your help.
I assume you are working with just one camera, so only the intrinsic parameters of the camera are in play.
(1), (2): Once your camera is calibrated, you use these parameters to undistort the image. Cameras don't capture images exactly as they are in reality, because the lenses distort them a bit, and the calibration parameters are used to correct the images. More on Wikipedia.
About when you need to recalibrate the camera (3): if you set up the camera and don't change its focus, you can keep using the same calibration parameters, but once you change the focal distance a recalibration is necessary.
(4) As long as you don't change the focal distance and you are not using a stereo camera system, you can move your camera freely.
What you are looking for are two separate calibration steps: alignment of the depth image to the color image, and conversion from depth to a point cloud. Both functions are provided by the Windows SDK, and there are MATLAB wrappers that call these SDK functions. You may want to do your own calibration only if you are not satisfied with the manufacturer's calibration data stored on the Kinect. Usually the error is within 1-2 pixels in the 2D alignment, and 4 mm in 3D.
When you calibrate a single camera, you can use the resulting cameraParameters object for several things. First, you can remove the effects of lens distortion using the undistortImage function, which takes cameraParameters as an input. There is also a function called extrinsics, which you can use to locate your calibrated camera in the world relative to some reference object (e.g., a checkerboard). Here's an example of how you can use a single camera to measure planar objects.
A depth sensor, like the Kinect, is a bit of different animal. It already gives you the depth in real units at each pixel. Additional calibration of the Kinect is useful if you want a more precise 3D reconstruction than what it gives you out of the box.
It would generally be helpful if you could tell us more about what it is you are trying to accomplish with your experiments.
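To make the undistort/extrinsics pairing above concrete, here is a sketch of locating the camera relative to a checkerboard seen in one image. It assumes `cameraParams` came from the Camera Calibrator app, and the file name and 25 mm square size are placeholders:

```matlab
% Locate a calibrated camera relative to a checkerboard in one image.
I = undistortImage(imread('scene.png'), cameraParams);   % correct distortion first
[imagePoints, boardSize] = detectCheckerboardPoints(I);
worldPoints = generateCheckerboardPoints(boardSize, 25); % 25 mm squares (assumed)
[R, t] = extrinsics(imagePoints, worldPoints, cameraParams);
% R (3x3) and t (1x3) map world coordinates into the camera's frame;
% pointsToWorld(cameraParams, R, t, imagePoints) goes the other way,
% which is the basis for measuring planar objects in real units.
```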

iPhone - detecting motion with gyroscope/accelerometer

I'm trying to detect a swinging motion with an iPhone 4 using the gyro/accelerometer. I searched for some posts on SO about this, but couldn't find anything specific to my issues.
Do I need to do any sort of calibration for data from the gyroscope/accelerometer?
Anyone think of how I would measure a swinging motion?
Thanks!
1: Most iPhone games using the accelerometer don't do any calibration, but not all iPhones are the same; there is some variation in accelerometer calibration. You could add manual or automatic calibration to your program. If, however, detecting a swinging motion is all you want, calibration is not necessary.
2: Apple provides a nice sample app in the iPhone SDK that graphs accelerometer motion. You can download and build it to see the measurements for the motion you want, then write code to detect similar accelerometer readings.
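One practical way to design that detection logic is to log accelerometer samples from the device and prototype the threshold offline, e.g. in MATLAB. A sketch, assuming `accel` is an N-by-3 matrix of logged (ax, ay, az) samples in g; the sample rate, smoothing window, and 0.8 g threshold are all assumptions to tune against your own recordings:

```matlab
% Prototype swing detection on logged accelerometer data.
fs  = 100;                                      % Hz -- assumed logging rate
mag = sqrt(sum(accel.^2, 2));                   % acceleration magnitude per sample
mag = mag - mean(mag);                          % remove the gravity/bias offset
smoothMag = movmean(abs(mag), round(0.1 * fs)); % ~100 ms smoothing window
isSwing   = any(smoothMag > 0.8);               % sustained peak above ~0.8 g
```

Once a threshold and window work on the logs, the same comparison is easy to port into the iPhone app's accelerometer callback.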