Does anybody know how to set a custom resolution for a custom iOS camera using AVFoundation (AVCaptureStillImageOutput)?
I know you can select a preset using AVCaptureSession, but I need an output resolution of 920x920 (not provided by any preset). Currently I use AVCaptureSessionPresetHigh and resize the UIImage afterwards, but that seems wasteful and adds a lot of unnecessary processing.
You can't set an arbitrary output resolution directly, but what you can do is iterate through all the AVCaptureDeviceFormats looking for the one closest to your resolution.
To get a complete list of all the available formats, just query the capture device using its -formats property.
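As a minimal sketch (mirroring Apple's frame-rate example below; the nearest-dimension scoring and the 920px target are my assumptions, and you would still need to crop to square afterwards since sensor formats are not square):

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceFormat *bestFormat = nil;
int32_t bestDelta = INT32_MAX;
for (AVCaptureDeviceFormat *format in device.formats) {
    CMVideoDimensions dims =
        CMVideoFormatDescriptionGetDimensions(format.formatDescription);
    // Score each format by how far its smaller side is from the 920px target.
    int32_t delta = abs(MIN(dims.width, dims.height) - 920);
    if (delta < bestDelta) {
        bestDelta = delta;
        bestFormat = format;
    }
}
if (bestFormat && [device lockForConfiguration:NULL]) {
    device.activeFormat = bestFormat;
    [device unlockForConfiguration];
}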
This example from Apple shows how to pick the best frame rate:
- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    for ( AVCaptureDeviceFormat *format in [device formats] ) {
        for ( AVFrameRateRange *range in format.videoSupportedFrameRateRanges ) {
            if ( range.maxFrameRate > bestFrameRateRange.maxFrameRate ) {
                bestFormat = format;
                bestFrameRateRange = range;
            }
        }
    }
    if ( bestFormat ) {
        if ( [device lockForConfiguration:NULL] == YES ) {
            device.activeFormat = bestFormat;
            // Min and max durations are both set to minFrameDuration on
            // purpose: this locks the device at its highest supported rate.
            device.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
            device.activeVideoMaxFrameDuration = bestFrameRateRange.minFrameDuration;
            [device unlockForConfiguration];
        }
    }
}
Please check out one of my old questions that I answered myself; it may help you:
Capture picture from video using AVFoundation
I'm confused!
I'm trying to manually adjust the exposure to fit a CGPoint at the center of the preview. I take the device object and use setExposureMode: and setExposurePointOfInterest: to do the manipulation. The first thing I do is check whether the exposure mode is supported by the device; if it is not supported, I return, and if it is, I set the values. My confusion stems from the fact that [device isExposureModeSupported:exposureMode] returns NO. But it is supported! I have an iPhone 5c. If I ignore the return statement, I do not receive any errors.
- (void)device:(AVCaptureDevice *)device exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point
{
    BOOL exposureModeSupported = [device isExposureModeSupported:exposureMode];
    if (!exposureModeSupported)
        return;
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        [device setExposurePointOfInterest:point];
        CALayer *exposeRect = [CALayer layer];
        exposeRect.frame = CGRectMake(self.center.x-30, self.center.y-30, 60, 60);
        exposeRect.borderColor = [UIColor whiteColor].CGColor;
        exposeRect.borderWidth = 2;
        exposeRect.name = @"exposeRect";
        [self.previewLayer addSublayer:exposeRect];
        [NSTimer scheduledTimerWithTimeInterval: 1
                                         target: self
                                       selector: @selector(dismissExposeRect)
                                       userInfo: nil
                                        repeats: NO];
        [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        [device unlockForConfiguration];
    }
}
How can I check if exposure mode is supported if I can't trust the value returned?
I ended up AND-ing the check, but I'm not sure this is the correct way to check. The condition now looks like this:
if (![device isExposurePointOfInterestSupported] && ![device isExposureModeSupported:exposureMode])
    return;
Has anyone else come across this and does anyone know how to properly handle this?
Thank you in advance.
Yes, you should check for exposurePointOfInterestSupported AND isExposureModeSupported:.
In your case, you are checking whether the AVCaptureExposureMode passed as an argument to your method is supported, but you then set the exposure mode to AVCaptureExposureModeContinuousAutoExposure, which is not necessarily the mode you tested.
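A minimal sketch of the corrected check, assuming (as in your code) that you always intend to apply continuous auto-exposure at the tapped point:

AVCaptureExposureMode mode = AVCaptureExposureModeContinuousAutoExposure;
if (![device isExposurePointOfInterestSupported] ||
    ![device isExposureModeSupported:mode]) {
    return;
}
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // Set the point of interest before the mode so the new mode uses it.
    [device setExposurePointOfInterest:point];
    [device setExposureMode:mode];
    [device unlockForConfiguration];
}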
I guess nobody wanted to chime in on this one. I ended up AND-ing the check; I'm not sure it is the correct way to check, but it worked.
I have an OpenCV-based application that processes images.
I need a clean image to process the data. When I'm in an area with fluorescent light, some banding appears in the image. On Android I solved that problem by setting the camera parameter ANTIBANDING_50HZ (here is the reference), and the image looks and processes fine.
But in Apple's reference I cannot find a way to avoid this problem. I have been adjusting some options to improve the image, but they do not remove the banding.
My camera is configured using this code:
- (BOOL) setupCaptureSessionParameters
{
    NSLog(@"--- Configure Camera options...");
    /*
     * Create capture session with optimal size for OpenCV processing
     */
    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPreset640x480;
    AVCaptureDevice *cameraBack = [self videoDeviceWithPosition:AVCaptureDevicePositionBack];
    if ([cameraBack lockForConfiguration:nil])
    {
        NSLog(@"lockForConfiguration...");
        // No autofocus
        if ([cameraBack isFocusModeSupported:AVCaptureFocusModeLocked])
        {
            cameraBack.focusMode = AVCaptureFocusModeLocked;
        }
        // Always focus on the center of the image
        if ([cameraBack isFocusPointOfInterestSupported])
        {
            cameraBack.focusPointOfInterest = CGPointMake(0.5, 0.5);
        }
        // Auto-expose in case there are significant lighting changes
        if ([cameraBack isExposurePointOfInterestSupported])
        {
            cameraBack.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        }
        // Auto-adjust white balance in case the user aims at a reflective surface
        if ([cameraBack isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance])
        {
            cameraBack.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
        }
        // Restrict focus to the far range only
        if ([cameraBack isAutoFocusRangeRestrictionSupported])
        {
            cameraBack.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionFar;
        }
        // Choose the best frame rate for the preset
        AVCaptureDeviceFormat *bestFormat = nil;
        AVFrameRateRange *bestFrameRateRange = nil;
        for (AVCaptureDeviceFormat *format in [cameraBack formats])
        {
            for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges)
            {
                if (range.maxFrameRate > bestFrameRateRange.maxFrameRate)
                {
                    bestFormat = format;
                    bestFrameRateRange = range;
                }
            }
        }
        if (bestFormat)
        {
            cameraBack.activeFormat = bestFormat;
            cameraBack.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
            cameraBack.activeVideoMaxFrameDuration = bestFrameRateRange.maxFrameDuration;
        }
        [cameraBack unlockForConfiguration];
        NSLog(@"unlockForConfiguration!");
        return YES;
    }
    return NO;
}
Follow Apple's documentation about CIFilter.
Android and iOS are very different when it comes to camera parameters. Android allows you to customize a camera parameter to avoid banding; on iOS, on the other hand, you need to work with the CIFilter class. Android and iOS work in different directions here.
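As a hedged sketch (the filter choice and parameter values are assumptions; removing banding may need experimentation with other filters), a Core Image noise-reduction pass over a captured frame could look like this:

// pixelBuffer is assumed to come from your capture output callback.
CIImage *input = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIFilter *filter = [CIFilter filterWithName:@"CINoiseReduction"];
[filter setValue:input forKey:kCIInputImageKey];
[filter setValue:@0.02 forKey:@"inputNoiseLevel"]; // assumed starting value
[filter setValue:@0.40 forKey:@"inputSharpness"];  // assumed starting value
CIImage *output = filter.outputImage;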
I am new to AVFoundation and I am trying to implement a video camera with it. Here is my basic setup. Basically, when you tap a button it calls the showCamera method. In there I create the session, then add an audio input and a video input, and then add the video output.
Where in here do I add the AVCaptureConnection, and how do I do it? Is there a tutorial that shows how to use the connections? Any help is appreciated.
- (IBAction) showCamera
{
    //Add the camView to the current view, on top of the controller
    [[[[[UIApplication sharedApplication] delegate] self] window] addSubview:camView];
    session = [[AVCaptureSession alloc] init];
    //Set preset on session to make the recording scale high
    if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }
    // Add inputs and outputs.
    NSArray *devices = [AVCaptureDevice devices];
    //Iterate over all devices on the phone
    for (AVCaptureDevice *device in devices)
    {
        if ([device hasMediaType:AVMediaTypeVideo])
        {
            if ([device position] == AVCaptureDevicePositionBack)
            {
                //Add rear video input to session
                [self addRearCameraInputToSession:session withDevice:device];
            }
        }
        else if ([device hasMediaType:AVMediaTypeAudio])
        {
            //Add microphone input to session
            [self addMicrophoneInputToSession:session withDevice:device];
        }
        else
        {
            //Show error that your phone does not have a camera
        }
    }
    //Add movie output
    [self addMovieOutputToSession:session];
    //Construct preview layer
    [self constructPreviewLayerWithSession:session onView:camView];
}
You don't add AVCaptureConnections manually. When you have both an input and an output added to the AVCaptureSession object, the connections are automatically created for you. Quoth the documentation:
When an input or an output is added to a session, the session greedily forms connections between all the compatible capture inputs’ ports and capture outputs.
Unless you need to disable one of the automatically-created connections, or change the videoMirrored or videoOrientation properties, you shouldn't have to worry about them at all.
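If you do need to adjust one of those properties, you can fetch the connection from the output after adding your inputs and outputs (a sketch; movieOutput stands in for your AVCaptureMovieFileOutput):

AVCaptureConnection *connection =
    [movieOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}
if (connection.isVideoMirroringSupported) {
    connection.videoMirrored = NO;
}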
Take a look at the following URLs...
Documentation from Apple:
An Article:
Video Recording using AVFoundation Framework iPhone?
I think it will help you....
I would like to write a camera application where you record video using the iPhone's camera, but I can't find a way to alter the frame rate of the recorded video. For example, I'd like to record at 25 frames per second instead of the default 30.
Is it possible to set this frame rate in any way, and if yes, how?
You can use AVCaptureConnection's videoMaxFrameDuration and videoMinFrameDuration properties. See http://developer.apple.com/library/ios/#DOCUMENTATION/AVFoundation/Reference/AVCaptureConnection_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40009522
Additionally, there is an SO question that addresses this (with a good code example):
I want to throttle video capture frame rate in AVCapture framework
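A minimal sketch using those connection properties (note they were later deprecated in favor of AVCaptureDevice's activeVideoMin/MaxFrameDuration; videoOutput is assumed to be your AVCaptureVideoDataOutput):

AVCaptureConnection *connection =
    [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoMinFrameDurationSupported) {
    // A frame duration of 1/25 s caps capture at 25 fps.
    connection.videoMinFrameDuration = CMTimeMake(1, 25);
}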
As far as I could tell, you can't set the FPS for recording. Look at the WWDC 2010 video for AVFoundation. It seems to suggest that you can but, again, as far as I can tell, that only works for capturing frame data.
I'd love to be proven wrong, but I'm pretty sure that you can't. Sorry!
You will need AVCaptureDevice.h.
Here is working code (note that it calls lockForConfiguration: on self, so it appears to be meant for a category on AVCaptureDevice):
- (void)attemptToConfigureFPS
{
    NSError *error;
    if (![self lockForConfiguration:&error]) {
        NSLog(@"Could not lock device %@ for configuration: %@", self, error);
        return;
    }
    AVCaptureDeviceFormat *format = self.activeFormat;
    double epsilon = 0.00000001;
    int desiredFrameRate = 30;
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        NSLog(@"Pre Minimum frame rate: %f Max = %f", range.minFrameRate, range.maxFrameRate);
        if (range.minFrameRate <= (desiredFrameRate + epsilon) &&
            range.maxFrameRate >= (desiredFrameRate - epsilon)) {
            NSLog(@"Setting Frame Rate.");
            self.activeVideoMaxFrameDuration = (CMTime){
                .value = 1,
                .timescale = desiredFrameRate,
                .flags = kCMTimeFlags_Valid,
                .epoch = 0,
            };
            self.activeVideoMinFrameDuration = (CMTime){
                .value = 1,
                .timescale = desiredFrameRate,
                .flags = kCMTimeFlags_Valid,
                .epoch = 0,
            };
            // self.activeVideoMinFrameDuration = self.activeVideoMaxFrameDuration;
            // NSLog(@"Post Minimum frame rate: %f Max = %f", range.minFrameRate, range.maxFrameRate);
            break;
        }
    }
    [self unlockForConfiguration];
    // Audit the changes
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        NSLog(@"Post Minimum frame rate: %f Max = %f", range.minFrameRate, range.maxFrameRate);
    }
}
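A hedged usage sketch, assuming the method above lives in a category on AVCaptureDevice (the category name here is hypothetical):

@interface AVCaptureDevice (FrameRateConfiguration)
- (void)attemptToConfigureFPS;
@end

// Elsewhere, after the session is set up:
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[camera attemptToConfigureFPS];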
I am trying hard to emulate the basic functionality of the built-in camera app. Thus far I have become stuck on the 'tap to focus' feature.
I have a UIView from which I collect UITouch events when a single finger taps on the UIView. The following method is called, but the camera focus and the exposure are unchanged.
-(void)handleFocus:(UITouch*)touch
{
    if( [camera lockForConfiguration:nil] )
    {
        CGPoint location = [touch locationInView:cameraView];
        if( [camera isFocusPointOfInterestSupported] )
            camera.focusPointOfInterest = location;
        if( [camera isExposurePointOfInterestSupported] )
            camera.exposurePointOfInterest = location;
        [camera unlockForConfiguration];
        [cameraView animFocus:location];
    }
}
'camera' is my AVCaptureDevice & it is non-nil. Can someone perhaps tell me where I am going wrong?
Clues & boos all welcome.
M.
This snippet might help you... There is a CamDemo provided by Apple floating around which allows you to focus, change exposure while tapping, set flash, swap cameras and more; it emulates the camera app pretty well. Not sure if you'll be able to find it since it was part of WWDC, but if you leave an email address in the comments I can email you the sample code...
- (void) focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = [[self videoInput] device];
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        } else {
            id delegate = [self delegate];
            if ([delegate respondsToSelector:@selector(acquiringDeviceLockFailedWithError:)]) {
                [delegate acquiringDeviceLockFailedWithError:error];
            }
        }
    }
}
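One more thing worth checking, and a likely reason the original handleFocus: code has no visible effect: focusPointOfInterest expects normalized device coordinates from (0,0) to (1,1), not view coordinates. With an AVCaptureVideoPreviewLayer the conversion is built in on iOS 6+ (previewLayer is assumed to be yours):

CGPoint tapPoint = [touch locationInView:cameraView];
CGPoint devicePoint =
    [previewLayer captureDevicePointOfInterestForPoint:tapPoint];
[self focusAtPoint:devicePoint];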