iPhone Camera App - how to stay in Camera view?

I've implemented the camera inside my app with the default showsCameraControls = YES, and the issue I am having is that when the user confirms the image is a keeper, the camera is dismissed with [self.delegate didFinishWithCamera]. I would like to remain in the Camera view until the user is done taking photos. Without [self.delegate didFinishWithCamera], the app hangs after the user confirms they want to keep the photo and never returns to the live camera feed. How do I remain in the camera view? Your help is appreciated!
@implementation PHFPhotoOverlayVC

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        self.imagePickerController = [[UIImagePickerController alloc] init];
        self.imagePickerController.delegate = self;
    }
    return self;
}

- (void)setupImagePicker:(UIImagePickerControllerSourceType)sourceType
{
    self.imagePickerController.sourceType = sourceType;
    if (sourceType == UIImagePickerControllerSourceTypeCamera)
    {
        self.imagePickerController.mediaTypes =
            [NSArray arrayWithObjects:(NSString *)kUTTypeImage, nil];
        self.imagePickerController.showsCameraControls = YES;
#if false
        if ([[self.imagePickerController.cameraOverlayView subviews] count] == 0)
        {
            CGRect overlayViewFrame = self.imagePickerController.cameraOverlayView.frame;
            CGRect newFrame = CGRectMake(0.0,
                                         CGRectGetHeight(overlayViewFrame) -
                                             self.view.frame.size.height - 10.0,
                                         CGRectGetWidth(overlayViewFrame),
                                         self.view.frame.size.height + 10.0);
            self.view.frame = newFrame;
            [self.imagePickerController.cameraOverlayView addSubview:self.view];
        }
#endif
    }
}

#pragma mark -
#pragma mark UIImagePickerControllerDelegate

- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    self.imagePickerController.mediaTypes =
        [NSArray arrayWithObjects:(NSString *)kUTTypeImage, nil];
    self.imagePickerController.showsCameraControls = YES;
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];
    if (self.delegate)
        [self.delegate didTakePicture:image];
    [self.delegate didFinishWithCamera];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self.delegate didFinishWithCamera];
}

@end

To achieve this, you will have to implement your own method for taking a picture, since the default button causes the view to dismiss itself.
Obviously, you can do this by setting showsCameraControls = NO, but then you'd have to recreate all the other functionality that you still want.
I've never done this, but it should be possible to leave showsCameraControls = YES and use the cameraOverlayView to simply layer an identical "Take Picture" button over the existing one. If you do that, you just need to have that button call the -takePicture method on your UIImagePickerController instance.
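As a rough, untested sketch of that idea (the frame values, the overlayShutterTapped selector, and the exact interaction with the default controls are all assumptions on my part; you would need to line the button up with the stock shutter on your device):

// e.g. inside -setupImagePicker:, after configuring the picker.
// Hedged sketch: float a transparent button over the stock shutter while
// keeping showsCameraControls = YES. Whether -takePicture co-exists cleanly
// with the default controls is exactly the part I'm unsure about above.
UIButton *shutterProxy = [UIButton buttonWithType:UIButtonTypeCustom];
CGRect pickerBounds = self.imagePickerController.view.bounds;
shutterProxy.frame = CGRectMake(CGRectGetMidX(pickerBounds) - 35.0,
                                CGRectGetMaxY(pickerBounds) - 90.0,
                                70.0, 70.0); // assumed position, tune per device
[shutterProxy addTarget:self
                 action:@selector(overlayShutterTapped)
       forControlEvents:UIControlEventTouchUpInside];
// Using the button itself as the overlay leaves the rest of the default
// controls (flash, cancel, camera flip) tappable.
self.imagePickerController.cameraOverlayView = shutterProxy;

- (void)overlayShutterTapped
{
    // Triggers the normal didFinishPickingMediaWithInfo: callback without
    // dismissing the picker, so the live camera feed stays on screen.
    [self.imagePickerController takePicture];
}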
Hopefully that's enough, feel free to comment and ask for any clarification.

You're going to need to implement a custom camera - check out Apple's SquareCam example.
You'll be using AVFoundation and will be able to implement any kind of custom behavior when a photo is taken (including simply staying on the camera screen). You'll even have to save the photo to the Photo Library yourself (the SquareCam example shows you how to do this). This also means that if you need the crop/resize controls, you'll have to create them yourself, as well as a picker view (grid gallery view) if you want the user to be able to review their photos after taking them.
It's probably intermediate-level stuff - it took me a few days to implement - but if your app is in any way focused on photos, this is definitely the way to go. It gives you complete control of the camera UI and behavior.
Oh, and you'll have to implement the flash button, shutter button, shutter effect, tap to focus, and anything else yourself too. AVFoundation basically gives you a straight pipe to the camera lens (figuratively).
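To give a feel for the scale of it, here is a minimal, untested sketch of the AVFoundation still-capture pipeline from that era (the session and stillOutput properties are placeholder names of my own, not anything taken from SquareCam):

#import <AVFoundation/AVFoundation.h>

// Assumes this view controller declares:
//   @property (nonatomic, strong) AVCaptureSession *session;
//   @property (nonatomic, strong) AVCaptureStillImageOutput *stillOutput;
- (void)setupCaptureSession
{
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetPhoto;

    NSError *error = nil;
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [self.session addInput:input];

    self.stillOutput = [[AVCaptureStillImageOutput alloc] init];
    if ([self.session canAddOutput:self.stillOutput]) [self.session addOutput:self.stillOutput];

    // The preview layer replaces the picker's built-in camera view.
    AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    preview.frame = self.view.bounds;
    preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:preview];

    [self.session startRunning];
}

- (void)captureStillImage
{
    AVCaptureConnection *connection = [self.stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef buffer, NSError *captureError) {
            if (!buffer) return;
            NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
            UIImage *image = [UIImage imageWithData:jpeg];
            // The session keeps running, so the preview stays live - no dismissal.
            // Saving `image` to the photo library (as SquareCam does) is up to you.
        }];
}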

Related

Entering UIImagePickerController editing mode

I use a UIImagePickerController (photoPicker) to take pictures with the camera and to edit those pictures (allowsEditing = YES). The user can switch between Camera and Library with a button in photoPicker.view, but I want to remove this button when photoPicker is in editing mode.
In photoLibrarySourceType the picker uses a push to show editing mode, so it works with this method:
- (void)navigationController:(UINavigationController *)navigationController willShowViewController:(UIViewController *)viewController animated:(BOOL)animated
{
    if (navigationController == photoPicker) {
        NSLog(@"navigation ");
        if ([navigationController.viewControllers indexOfObject:viewController] == 1)
        {
            // If second UIViewController, set your overlay.
            NSLog(@"editing");
        }
    }
}
But in cameraSourceType, there is no navigation between camera and editing mode.
I came up with a little hack and it works for me. I'm still a little new, so please optimize at will, as I couldn't find another way to do it.
The basic idea is to create an invisible button over the default camera button that performs a method which calls -takePicture on the UIImagePickerController.
Here is my sample code:
@implementation myViewController
{
    UIImagePickerController *myImagePickerController;
    UIButton *fakeCameraButton;
}

- (IBAction)showImagePicker:(id)sender
{
    // accessible camera validations implied
    myImagePickerController = [[UIImagePickerController alloc] init];
    myImagePickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
    myImagePickerController.showsCameraControls = YES;
    myImagePickerController.delegate = self;
    myImagePickerController.allowsEditing = YES;

    CGFloat myViewHeight = 100; // play around with the positions to get your desired frame size; this worked for me
    UIView *myView = [[UIView alloc] initWithFrame:CGRectMake(0,
                                                              myImagePickerController.view.frame.size.height - myViewHeight,
                                                              myImagePickerController.view.frame.size.width,
                                                              myViewHeight)];
    // position it over the default camera button
    fakeCameraButton = [[UIButton alloc] initWithFrame:CGRectMake(myView.frame.size.width / 2 - 30,
                                                                  myView.frame.size.height / 2 - 15,
                                                                  60, 60)];
    [fakeCameraButton addTarget:self action:@selector(cameraButtonPressed) forControlEvents:UIControlEventTouchUpInside];
    [fakeCameraButton setTitle:@"" forState:UIControlStateNormal];
    [myView addSubview:fakeCameraButton];
    myImagePickerController.cameraOverlayView = myView;
    [self presentViewController:myImagePickerController animated:YES completion:nil];
}

- (void)cameraButtonPressed
{
    // Make sure you check that the camera is ready before this method.
    [myImagePickerController takePicture];
    // You are in editing/preview mode here if allowsEditing = YES.
    // You can remove all your custom overlay views with:
    myImagePickerController.cameraOverlayView = nil;
}

@end
Like I said before, just make sure you check that the camera is ready. You can find answers here on Stack Overflow on how to do it.
This code may suck, so edits or alternative solutions are welcome.
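On the "camera is ready" point, one commonly suggested check (my own addition, not part of the answer above) is to keep the fake button disabled until the picker's underlying AVFoundation session reports that it has started running:

#import <AVFoundation/AVFoundation.h>   // link AVFoundation for the notification name

// In showImagePicker:, before presenting the picker:
fakeCameraButton.enabled = NO;
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(cameraIsReady:)
                                             name:AVCaptureSessionDidStartRunningNotification
                                           object:nil];

- (void)cameraIsReady:(NSNotification *)notification
{
    // UIImagePickerController drives an AVCaptureSession internally, so this
    // fires once the live feed is up and -takePicture is safe to call.
    fakeCameraButton.enabled = YES;
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:AVCaptureSessionDidStartRunningNotification
                                                  object:nil];
}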

ZBar API Embedded Scanner Blur issue

I am using the ZBar iPhone SDK in one of my projects (iOS SDK 5.1, Xcode 4.4.1, and a device running iOS 5.1.1). I am using the embedded scanner from the examples provided in the SDK itself.
Now the issue I am facing is that I successfully scan a bar code and move to another view controller (using a navigation controller). When I come back (pop the second view controller), the scanner, i.e. the ZBarReaderView, doesn't scan subsequent bar codes; in fact, the overlay shows a blurred image of the scanned barcode and it is never able to scan properly.
This is all I have implemented. In BarScannerViewController.h I have declared
ZBarReaderView *readerView;
with the property
@property (nonatomic, retain) IBOutlet UIImageView *imgvScannedBarCode;
This is connected to one of the views in the xib.
Finally, I set up the required methods as follows:
- (void)viewDidLoad {
    [super viewDidLoad];
    // the delegate receives decode results
    readerView.readerDelegate = self;
    [readerView start];
}

- (void)viewDidAppear:(BOOL)animated {
    // run the reader when the view is visible
    [activityIndicatorScanning startAnimating];
    [readerView start];
}

- (void)viewWillDisappear:(BOOL)animated {
    [activityIndicatorScanning stopAnimating];
    [readerView stop];
}
With all this set up, when I scan any bar code (say EAN123) for the first time, I get the callback in
- (void)readerView:(ZBarReaderView *)view
    didReadSymbols:(ZBarSymbolSet *)syms
         fromImage:(UIImage *)img
{
    // do something useful with results
    ZBarSymbol *symbol = nil;
    for (symbol in syms) {
        barCodeFound = YES;
        break;
    }
    // EXAMPLE: do something useful with the barcode data
    NSLog(@"%@", symbol.data);
}
but on subsequent runs (after I push a view and come back to this screen) I get a blurred view.
Am I missing something here? Any help, suggestions, or comments would be appreciated.
Here's the code that I use to start (and endlessly restart) the scanner. Interestingly, I note that I never stop the scan, but it works very reliably.
- (void)startScan
{
    ZBarReaderViewController *reader = [ZBarReaderViewController new];
    reader.readerDelegate = self;
    ZBarImageScanner *scanner = reader.scanner;
    [scanner setSymbology:ZBAR_I25
                   config:ZBAR_CFG_ENABLE
                       to:0];
    // present and release the controller
    [self presentViewController:reader animated:YES completion:nil]; // Modal
    [reader release];
}
I was able to solve the blur issue by reconfiguring the SDK in my project. I followed the embedded scanner example provided with the ZBar SDK. I guess I missed some essential settings while configuring it earlier.

UIAlertView Rendering Error

I have been working on an app for a couple of months now, but have finally run into an issue that I can't solve myself, and can't find anything on the internet to help.
I am using several normal UIAlertViews in my app. Some have 2 buttons, some have 3 buttons, and a couple have 2 buttons and a text field. However, all have the same issue. When you call [someAlertView show]; the alert view appears as normal, but then suddenly its graphics context seems to get corrupted, as you can see from the screenshot.
This happens on both iPhone and iPad simulators (both 5.0 and 5.1), and happens on an iPad and iPhone4S device as well.
The image showing through is whatever happens to be behind the alertView.
The Alert still works, I can click the buttons, type in the text field, then when it dismisses the delegate methods are called correctly and everything goes back to normal. When the alertView appears again, the same thing happens.
The view behind the alert is a custom UIScrollView subclass with a content size of approximately 4000 pixels by 1000 with a UIImage as the background. The png file is mostly transparent, so is only about 80kB in memory size, and the phone is having no issues rendering it - the scroll view is still fully responsive and not slow.
It also has a CADisplayLink timer attached to it as part of the subclass. I have tried disabling this just before the alertView is shown, but it makes no difference so I am doubtful that is the issue.
This app is a partial rewrite of one I made for a university project, and that one could display UIAlertViews over the top of a scrollView of the same size and subclass without issue. The difference between this app and that one is that in my old app, I had subclassed UIAlertView to add extra things such as a pickerView, however I decided that I didn't like the way it looked so moved everything out of the alert and am just sticking with a standard UIAlertView.
This is how the alertView in the screenshot is called:
- (IBAction)loadSimulation:(id)sender {
    importAlert = [[UIAlertView alloc] initWithTitle:@"Load Simulation" message:@"Enter Filename:" delegate:self cancelButtonTitle:@"Cancel" otherButtonTitles:@"Load", nil];
    [importAlert setAlertViewStyle:UIAlertViewStylePlainTextInput];
    [importAlert showPausingSimulation:self.simulationView]; // Calling [importAlert show]; makes no difference.
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        [self hideOrganiser]; // Not an issue, as the problem occurs on iPad as well.
    }
}
This is the UIAlertView category that adds the ability to stop the scroll view's CADisplayLink:
@interface UIAlertView (pauseDisplayLink)
- (void)showPausingSimulation:(UILogicSimulatorView *)simulationView;
@end

@implementation UIAlertView (pauseDisplayLink)
- (void)showPausingSimulation:(UILogicSimulatorView *)simulationView {
    [simulationView stopRunning];
    [simulationView removeDisplayLink]; // The displayLink needs to be removed from the run loop, otherwise it will keep going in the background and get corrupted.
    [self show];
}
@end
I get no memory warnings when this happens, so I am doubtful it is due to lack of resources.
Has anyone come across an issue like this before? If you need further information I can try to provide it, but I am limited in what code I can post. Any help would be appreciated, I've been trying to solve this for two weeks and can't figure it out.
Edit:
It appears that it is not the AlertView at all (or rather it is not just the alertView), as the problem goes away when I remove the scroll view behind it, so there must be some issue between the two. This is the code for my UIScrollView subclass:
.h file:
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>
@class ECSimulatorController;
@interface UILogicSimulatorView : UIScrollView {
    CADisplayLink *displayLink;
    NSInteger _updateRate;
    ECSimulatorController *_hostName;
}

@property (nonatomic) NSInteger updateRate;
@property (nonatomic, strong) ECSimulatorController *hostName;

- (void)removeDisplayLink;
- (void)reAddDisplayLink;
- (void)displayUpdated:(CADisplayLink *)timer;
- (void)startRunning;
- (void)stopRunning;
- (void)refreshRate:(NSInteger)rate;
- (void)setHost:(id)host;
- (void)setMinimumNumberOfTouches:(NSInteger)touches;
- (void)setMaximumNumberOfTouches:(NSInteger)touches;
@end
.m file:
#import "UILogicSimulatorView.h"
#import "ECSimulatorController.h"
#import <QuartzCore/QuartzCore.h>
#implementation UILogicSimulatorView
#synthesize updateRate = _updateRate;
#synthesize hostName = _hostName;
- (void)reAddDisplayLink {
[displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode]; //allows the display link to be re-added to the run loop after having been removed.
}
- (void)removeDisplayLink {
[displayLink removeFromRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode]; //allows the display link to be removed from the Run loop without deleting it. Removing it is essential to prevent corruption between the games and the simulator as both use CADisplay link, and only one can be in the run loop at a given moment.
}
- (void)startRunning {
[self refreshRate:self.updateRate];
[displayLink setPaused:NO];
}
- (void)refreshRate:(NSInteger)rate {
if (rate > 59) {
rate = 59; //prevent the rate from being set too an undefined value.
}
NSInteger frameInterval = 60 - rate; //rate is the number of frames to skip. There are 60FPS, so this converts to frame interval.
[displayLink setFrameInterval:frameInterval];
}
- (void)stopRunning {
[displayLink setPaused:YES];
}
- (void)displayUpdated:(CADisplayLink*)timer {
//call the function that the snakeController host needs to update
[self.hostName updateStates];
}
- (void)setHost:(ECSimulatorController*)host;
{
self.hostName = host; //Host allows the CADisplay link to call a selector in the object which created this one.
}
- (id)initWithFrame:(CGRect)frame
{
//Locates the UIScrollView's gesture recogniser
if(self = [super initWithFrame:frame])
{
[self setMinimumNumberOfTouches:2];
displayLink = [CADisplayLink displayLinkWithTarget:self selector:#selector(displayUpdated:)]; //CADisplayLink will update the logic gate states.
self.updateRate = 1;
[displayLink setPaused:YES];
}
return self;
}
- (void)setMinimumNumberOfTouches:(NSInteger)touches{
for (UIGestureRecognizer *gestureRecognizer in [self gestureRecognizers])
{
if([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]])
{
//Changes the minimum number of touches to 'touches'. This allows the UIPanGestureRecogniser in the object which created this one to work with one finger.
[(UIPanGestureRecognizer*)gestureRecognizer setMinimumNumberOfTouches:touches];
}
}
}
- (void)setMaximumNumberOfTouches:(NSInteger)touches{
for (UIGestureRecognizer *gestureRecognizer in [self gestureRecognizers])
{
if([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]])
{
//Changes the maximum number of touches to 'touches'. This allows the UIPanGestureRecogniser in the object which created this one to work with one finger.
[(UIPanGestureRecognizer*)gestureRecognizer setMaximumNumberOfTouches:touches];
}
}
}
#end
Well, I have managed to come up with a solution to this. Really, it is probably just masking the issue rather than finding the root cause, but at this point I will take it.
First some code:
@interface UIView (ViewCapture)
- (UIImage *)captureView;
- (UIImage *)captureViewInRect:(CGRect)rect;
@end

@implementation UIView (ViewCapture)
- (UIImage *)captureView {
    return [self captureViewInRect:self.frame];
}

- (UIImage *)captureViewInRect:(CGRect)rect
{
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.layer renderInContext:context];
    UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenShot;
}
@end
- (void)showPausingSimulation:(UILogicSimulatorView *)simulationView {
    [simulationView stopRunning];
    UIView *superView = simulationView.superview;
    CGPoint oldOffset = simulationView.contentOffset;
    for (UIView *subview in simulationView.subviews) {
        // offset subviews so they appear when the content offset is (0,0)
        CGRect frame = subview.frame;
        frame.origin.x -= oldOffset.x;
        frame.origin.y -= oldOffset.y;
        subview.frame = frame;
    }
    simulationView.contentOffset = CGPointZero; // set the offset to (0,0)
    UIImage *image = [simulationView captureView]; // capture the frame of the scroll view
    simulationView.contentOffset = oldOffset; // restore the old offset
    for (UIView *subview in simulationView.subviews) {
        // restore the original positions of the subviews
        CGRect frame = subview.frame;
        frame.origin.x += oldOffset.x;
        frame.origin.y += oldOffset.y;
        subview.frame = frame;
    }
    [simulationView setHidden:YES];
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:simulationView.frame];
    [imageView setImage:image];
    [imageView setTag:999];
    [superView addSubview:imageView];
    [imageView setHidden:NO];
    superView = nil;
    imageView = nil;
    image = nil;
    [self show];
}

- (void)dismissUnpausingSimulation:(UILogicSimulatorView *)simulationView {
    UIView *superView = simulationView.superview;
    UIImageView *imageView = (UIImageView *)[superView viewWithTag:999];
    [imageView removeFromSuperview];
    imageView = nil;
    superView = nil;
    [simulationView setHidden:NO];
    [simulationView startRunning];
}
Then I modified the dismiss delegate method in my class to have this line:
- (void)alertView:(UIAlertView *)alertView didDismissWithButtonIndex:(NSInteger)buttonIndex {
    [alertView dismissUnpausingSimulation:self.simulationView];
    ...
When the alert view is called, but before it is shown, I need to hide the simulator to prevent it corrupting the alert. However, just hiding it is ugly, as all that is visible behind the alert is then an empty view.
To fix this, I first make a UIImage from the simulator view's graphics context. I then create a UIImageView with the same frame as the simulator and set the UIImage as its image.
I then hide the simulator view (curing the alert issue) and add my new UIImageView to the simulator's superview. I also set the tag of the image view so I can find it later.
When the alert dismisses, the image view is then recovered based on its tag, and removed from its superview. The simulator is then unhidden.
The result is that the rendering issue is gone.
I know it's too late for an answer to this question, but lately I experienced this very same issue.
My Case:
I added a couple of custom UIViews with background images and some controls to the scroll view, with a shadow effect. I had also set the shadowOffset.
The Solution:
After some step-by-step analysis, I found out that setting shadowOpacity caused the rendering problem for me. When I commented out that line of code, the UIAlertView went back to its normal appearance.
More:
To make sure, I created a new project mimicking the original UI with shadowOpacity set, but it didn't cause the rendering problem as I expected, so I am not sure about the root cause. For me it was shadowOpacity.
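If you want to keep the shadow rather than dropping it, one mitigation worth trying (my suggestion, not something from the answer above; customView is a placeholder for the shadowed view) is to give the layer an explicit shadowPath, so Core Animation doesn't have to recompute the shadow from the view's contents on every composite:

#import <QuartzCore/QuartzCore.h>

// customView is a placeholder for the shadowed view added to the scroll view.
customView.layer.shadowOpacity = 0.6f;
customView.layer.shadowOffset = CGSizeMake(0.0f, 2.0f);
// Pre-computing the path keeps the shadow cheap to render, which avoids the
// offscreen passes that can interact badly with views layered on top.
customView.layer.shadowPath =
    [UIBezierPath bezierPathWithRect:customView.bounds].CGPath;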

Basic description of how to record video in iOS 4

I was curious if anybody could give me a very brief description of how to make an app record video in iOS 4. I know how to do all the media handling using the OS 3 method of using UIImagePickerController, but I do not know if that is still available in iOS 4, and if not, could someone please give me a very brief description of how to do it using the new method? (No code required, but it is more than welcome.)
-Thank you!
It's pretty straightforward.
I just made a view-based app with a single button on the interface to test this. The button's action is - (IBAction)shootButtonPressed;
You have to check if the device supports video recording and then configure the image picker controller to only shoot video. This code will only work on an actual device.
In the main view controller header, I made it conform to two protocols: UIImagePickerControllerDelegate and UINavigationControllerDelegate
Then I implemented the button press method like this:
- (IBAction)shootButtonPressed;
{
    BOOL canShootVideo = [UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera];
    if (canShootVideo) {
        UIImagePickerController *videoRecorder = [[UIImagePickerController alloc] init];
        videoRecorder.sourceType = UIImagePickerControllerSourceTypeCamera;
        videoRecorder.delegate = self;
        NSArray *mediaTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
        NSArray *videoMediaTypesOnly = [mediaTypes filteredArrayUsingPredicate:[NSPredicate predicateWithFormat:@"(SELF contains %@)", @"movie"]];
        BOOL movieOutputPossible = (videoMediaTypesOnly != nil);
        if (movieOutputPossible) {
            videoRecorder.mediaTypes = videoMediaTypesOnly;
            [self presentModalViewController:videoRecorder animated:YES];
        }
        [videoRecorder release];
    }
}
You also have to implement two more methods to handle when a movie's shot & chosen and when the user cancels the video camera picker.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self dismissModalViewControllerAnimated:YES];
    // save the movie or something here, pretty much the same as doing a photo
    NSLog(@"movie captured %@", info);
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    // process the cancellation of movie picker here
    NSLog(@"Capture cancelled");
}
Super easy.
For more details, see the Multimedia Programming Guide --> About Audio and Video --> Using Video --> Recording and Editing Video. It's in the Apple Docs, although a little scattered for my taste.
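For the "save the movie or something here" step in the delegate above, a minimal sketch (assuming you just want the clip in the Saved Photos album) could look like this:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self dismissModalViewControllerAnimated:YES];
    // UIImagePickerControllerMediaURL holds the file URL of the recorded movie.
    NSString *moviePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
    if (moviePath && UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
        // Passing nil/NULL for the last three arguments saves silently; supply
        // a target and selector if you want a completion callback.
        UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, NULL, NULL);
    }
}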

Detecting when camera's iris is open on iPhone

For a custom camera overlay I need to find out when the iris is opened, because my overlay is always shown while the iris is still closed (and then animating open).
Any ideas?
You can listen for the PLCameraViewIrisAnimationDidEndNotification notification. Since this is not officially documented, you might be in violation of the Apple TOS, but I think so long as you write your code so that it's defensive against the possibility that the name or contract of this notification might change (so in the future you might not get the event) you'll probably be ok. In other words, use a timer or other technique to ensure that the thing you want done when the iris is open will definitely happen eventually even if you never get the notification...
Trivial example without the defensive programming. (Of course, you can register an interest only for this specific notification as well, see the docs for the notification center.)
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(notificationCallback:)
                                              name:nil
                                            object:nil];

- (void)notificationCallback:(NSNotification *)notification {
    if ([[notification name] isEqualToString:@"PLCameraViewIrisAnimationDidEndNotification"]) {
        NSLog(@"Iris open");
        // we don't need to listen any more
        [[NSNotificationCenter defaultCenter] removeObserver:self];
    }
}
It seems that PLCameraViewIrisAnimationDidEndNotification no longer gets posted in iOS 5.
I can't figure out a suitable way to detect when the iris has finished opening; there must be another option besides using a 3-second timer.
Check here: https://devforums.apple.com/message/561008#561008
I have a view controller (ALImagePickerController) which holds, initializes and presents the UIImagePickerController as a child view controller (I have another child view controller for presenting the taken image, which is not shown here), and I present the ALImagePickerController modally when I want to use the camera. So in that view controller's viewDidAppear I add an animation to bring in the camera overlay gracefully as the shutter animation disappears.
@interface ALImagePickerController ()
@property (nonatomic) UIImagePickerController *cameraController;
@property (nonatomic) CameraOverlayView *overlayView;
....
@end

@implementation ALImagePickerController
....
- (void)viewDidLoad {
    [super viewDidLoad];
    [UIApplication sharedApplication].statusBarHidden = YES;

    self.cameraController = [UIImagePickerController new];
    self.cameraController.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.cameraController.delegate = self;
    self.cameraController.allowsEditing = NO;
    self.cameraController.showsCameraControls = NO;
    ....
    self.overlayView = [CameraOverlayView new];
    ....
    self.overlayView.alpha = 0;
    self.cameraController.cameraOverlayView = self.overlayView;
    ....
    // add as child view controller
    [self addChildViewController:self.cameraController];
    [self.view addSubview:self.cameraController.view];
    [self.cameraController didMoveToParentViewController:self];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [UIApplication sharedApplication].statusBarHidden = NO;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // smoothly bring in the overlay as the native camera shutter animation opens.
    [UIView animateWithDuration:0.2 delay:0.3 options:UIViewAnimationOptionCurveEaseOut animations:^{
        self.overlayView.alpha = 1.f;
    } completion:nil];
}
....
@end
The way I solved this problem is I initialize all the elements with the hidden property set to YES, then call a 3-second delayed selector after I call the camera, where I set all the elements to hidden = NO. It's not an ideal solution but it seems to work, and any lag after the iris is opened is negligible.
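In code, that hack boils down to something like this (overlayControls, imagePicker and revealOverlay are placeholder names I'm assuming, not the answerer's actual identifiers):

// Start with the overlay elements hidden, present the camera, then reveal
// them after a delay long enough for the iris animation to finish.
self.overlayControls.hidden = YES;
[self presentModalViewController:self.imagePicker animated:YES];
[self performSelector:@selector(revealOverlay) withObject:nil afterDelay:3.0];

- (void)revealOverlay
{
    // Any lag after the iris has opened is negligible in practice.
    self.overlayControls.hidden = NO;
}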
You should already know when the camera is ready to take a picture. At least the way I use a custom camera overlay, I init the view with something like self.sourceType = UIImagePickerControllerSourceTypeCamera; and the other usual setup, and the camera is ready (or "iris is open") at that point.
In summary, if one is using a custom camera overlay the way I am used to using it, one will know when the iris is open because it is under your control.