I'm trying to overlay OpenGL ES content, using a GLKView, on top of a live video feed from the camera using a UIImagePickerController. I've read several tutorials, books, and posts but still can't find the answer. For your reference, some of the posts I've read are here, here and here.
In viewDidLoad I'm doing the following:
// Create an OpenGL ES 2.0 context and provide it to the view
// NOTE: viewOverlay is of class GLKView and a property of the view controller
viewOverlay.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
viewOverlay.opaque = NO;
viewOverlay.backgroundColor=[UIColor clearColor];
// Make the new context current
[EAGLContext setCurrentContext:viewOverlay.context];
// Create a base effect that provides standard OpenGL ES 2.0
// Shading Language programs and set constants to be used for
// all subsequent rendering
self.baseEffect = [[GLKBaseEffect alloc] init];
self.baseEffect.useConstantColor = GL_TRUE;
self.baseEffect.constantColor = GLKVector4Make(
1.0f, // Red
0.0f, // Green
0.0f, // Blue
1.0f);// Alpha
// Set the background color stored in the current context
glClearColor(0.0f, 0.0f, 0.0f, 0.0f); // background color
And in viewDidAppear I'm doing this:
UIImagePickerController *uip;
uip = [[UIImagePickerController alloc] init];
uip.sourceType = UIImagePickerControllerSourceTypeCamera;
uip.showsCameraControls = NO;
uip.toolbarHidden = YES;
uip.navigationBarHidden = YES;
uip.wantsFullScreenLayout = YES;
uip.cameraViewTransform = CGAffineTransformScale(uip.cameraViewTransform, CAMERA_TRANSFORM, CAMERA_TRANSFORM);
[viewOverlay addSubview:uip.view];
[viewOverlay sendSubviewToBack:uip.view];
[viewOverlay bringSubviewToFront:viewOverlay];
If I step through the program, I can see the OpenGL objects get rendered once. But when the UIImagePickerController's view is added as a subview, it's the only thing on screen. If I comment out the last three lines, the OpenGL objects are rendered, but of course the camera isn't visible.
I'd like the camera's video image to be rendered behind the OpenGL objects I'm drawing, creating an augmented reality effect. Any and all help would be greatly appreciated!
Mike
What do you want your camera to do - just display a camera view for an AR effect, or actually use/manipulate the image data too? If you need anything other than a simple AR display, use AVFoundation.
I've had loads of trouble with this before and read the same posts you have, so I offer two solutions:
1) Use a Container View
In your storyboard file, create a UIViewController with a UIView for the camera and a Container View on top with a segue connected to a GLKViewController. Link your main UIViewController to your camera code and link your GLKViewController to your OpenGL ES code. Both the UIView and the GLKView inside the container will load and run separately when the UIViewController is visible.
2) App delegate hack
Set up your UIImagePickerController inside the app delegate, specifically in this method:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
Then add the picker's view as a subview of your window property, as sketched below. This makes sure that your camera gets loaded onto your window before any other views and keeps it at the bottom of the layer stack.
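A rough sketch of that setup (the cameraPicker property name is my own placeholder; keep a strong reference to the picker somewhere so its view stays alive):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // ... set up self.window.rootViewController (with a transparent GLKView on top) as usual ...

    // Hypothetical property on the app delegate, so the picker isn't deallocated
    self.cameraPicker = [[UIImagePickerController alloc] init];
    self.cameraPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.cameraPicker.showsCameraControls = NO;
    self.cameraPicker.navigationBarHidden = YES;
    self.cameraPicker.toolbarHidden = YES;

    // Put the live camera view at the very bottom of the window's view hierarchy,
    // underneath the (transparent) views of your view controllers
    [self.window addSubview:self.cameraPicker.view];
    [self.window sendSubviewToBack:self.cameraPicker.view];

    [self.window makeKeyAndVisible];
    return YES;
}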
Related
I am trying to make a drawing application following this tutorial: http://www.effectiveui.com/blog/2011/12/02/how-to-build-a-simple-painting-app-for-ios/. Instead of drawing on the entire screen, I want to draw only on a UIView that sits inside another UIView (what I call a nested UIView).
My code is currently on github: https://github.com/sammy0025/SimplePaint
The only changes I made to the original tutorial code are: renaming some class prefixes, enabling ARC (so no deallocs), using storyboards (which works fine with the original code), and changing this code in the main view controller implementation file (SPViewController.m):
- (void)viewDidLoad
{
    [super viewDidLoad];
    nestedView.backgroundColor = [UIColor colorWithWhite:0 alpha:0]; // This is to make the nested view transparent
    SPView *paint = [[SPView alloc] initWithFrame:nestedView.bounds]; // original code: SPView *paint = [[SPView alloc] initWithFrame:self.view.bounds];
    [nestedView addSubview:paint]; // original code: [self.view addSubview:paint];
}
My question is how do I make sure that I only draw inside the nested UIView?
Add clipping to your subview. E.g. self.bounds would cover the whole area of the view:
UIBezierPath *p = [UIBezierPath bezierPathWithRect:self.bounds];
[p addClip];
I believe clipping for a view is initially based on its bounds and that you can shrink it by adding a new clip. There might be a difference between iOS and OS X here, though.
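As a rough sketch, assuming your painting view draws with Core Graphics in drawRect: (the inset values are only illustrative), the clip would go at the top of the drawing code:
- (void)drawRect:(CGRect)rect
{
    // Confine all subsequent drawing to an inset region of the view
    UIBezierPath *clip = [UIBezierPath bezierPathWithRect:CGRectInset(self.bounds, 10.0, 10.0)];
    [clip addClip];

    // ... existing drawing code goes here and is clipped to that region ...
}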
Rather than using an SPView inside a nested view, consider just changing the frame of the SPView to match what you want. This should solve your problem of only allowing drawing within a given rect.
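For example (the frame values here are only illustrative), in SPViewController's viewDidLoad:
SPView *paint = [[SPView alloc] initWithFrame:CGRectMake(20, 80, 280, 280)];
[self.view addSubview:paint];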
I think you're seeing issues because of your storyboard configuration. I don't know much about storyboards, but I was able to get this to work programmatically by throwing out the storyboard's view and building my own in viewDidAppear:
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    UIView *newView = [[UIView alloc] initWithFrame:self.view.bounds];
    newView.backgroundColor = [UIColor greenColor];
    SPView *newPaint = [[SPView alloc] initWithFrame:CGRectInset(self.view.bounds, 40, 40)];
    [newView addSubview:newPaint];
    self.view = newView;
}
I'd like to invoke the camera and display a live image in a small preview window (similar to below) that is embedded in a standard view controller. The code below creates the live, reduced camera image, but I cannot see the other objects in the NIB file. Thoughts appreciated.
imagePicker = [[UIImagePickerController alloc] init];
//Setting the control source type as the Camera device.
imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
//Hiding the default camera controls.
imagePicker.showsCameraControls = NO;
imagePicker.navigationBarHidden = YES;
//Picking only the rear camera.
imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceRear;
//Turning the camera flash off.
imagePicker.cameraFlashMode = UIImagePickerControllerCameraFlashModeOff;
// Make camera view partial screen:
imagePicker.cameraViewTransform = CGAffineTransformScale(imagePicker.cameraViewTransform, 0.5, 0.5);
// add subView
[self.view addSubview:imagePicker.view];
[imagePicker viewWillAppear:YES];
[imagePicker viewDidAppear:YES];
// Show the picker:
[self presentModalViewController:imagePicker animated:YES];
Without running the code, it looks like you are transforming the live image, not changing the size of the view, so the original full-size view is showing on top of your other views.
Have you tried using cameraOverlayView to overlay the view controller's view on top of the live image?
cameraOverlayView
The custom view to display on top of the default image picker interface.
@property (nonatomic, retain) UIView *cameraOverlayView
Discussion: You can use an overlay view to present a custom view hierarchy on top of the default image picker interface. The image picker layers your custom overlay view on top of the other image picker views and positions it relative to the screen coordinates. If you have the default camera controls set to be visible, incorporate transparency into your view, or position it to avoid obscuring the underlying content.
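A minimal sketch of that approach, assuming "overlay" is whatever view hierarchy you want drawn above the live camera image:
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO;

UIView *overlay = [[UIView alloc] initWithFrame:[UIScreen mainScreen].bounds];
overlay.backgroundColor = [UIColor clearColor]; // transparent so the camera shows through
// ... add your labels, buttons, preview chrome, etc. to overlay here ...

picker.cameraOverlayView = overlay;
[self presentModalViewController:picker animated:YES];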
In my iPad app I have a view controller with a small table view. When you tap on the table view it opens a modal view controller that is a larger and more refined version of the small table view. I would like to animate a pre-rendered image of the large view controller: scale it down to the size of the small table view, zoom it up to full screen size, and then replace the image with the "real" view controller.
Something like:
LargeViewController* lvc = [[LargeViewController alloc] init];
[self presentModalViewController:lvc byZoomingFromRect:CGRectMake(50,50,200,300)];
I know you can produce an image from a view:
- (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, [[UIScreen mainScreen] scale]);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
But how do I make the view controller draw itself (offscreen) so I can take its view's image and scale it in an animation to fill the screen?
Thanks in advance.
I suppose you want to create your very own animation. Last month I played around with something like that. My solution was adding a custom view (maybe taken from a view controller) to the current view as an overlay. This works with layers, too.
First, fetch the image from your "future" view controller, like you did in your code example above. Normally the view controller's content should be available while rendering to the context.
Now you have the image; any manipulation of it is up to you.
Add the image to a UIImageView. This image view can be added as a subview or as a layer, giving you a surface where you can freely draw above your actual user interface. Sometimes you have to move the layer or view around so that it perfectly overlays your view; this depends on your view setup. If you are dealing with table views, adding a subview is not that easy, so it's better to use the layer.
After all the work is done, present the new view controller without animation, so that it appears immediately.
Then remove the layer or view from your parent view and clean up.
This sounds complicated, but once you've done it you have a template for it. In "WWDC 2011, Session 309: Introducing Interface Builder Storyboarding" Apple introduced custom segues, where you'll find a mechanism for exactly what you want to do. The code below is cut out of an older project and is somewhat messy, but it should show the principle:
- (void)animate {
    LargeViewController *lvc = [[LargeViewController alloc] init];

    // Render the future view controller's view into an image
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, lvc.view.opaque, [[UIScreen mainScreen] scale]);
    [lvc.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Create an image view to display your "zoomed" image
    UIImageView *displayView = [[UIImageView alloc] initWithFrame:self.view.frame];

    // Add your image to the view
    displayView.image = img;

    // Insert the view above your actual view; adjust the coordinates in the
    // frame property of displayView if the overlay is misaligned
    [self.view addSubview:displayView];
    // alternatively you can use the layer
    // [self.view.layer addSublayer:displayView.layer];

    // draw the image view
    [displayView setNeedsDisplay];

    // Do something in the background. You may create your own
    // construction, e.g. using a timer
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSDate *now = [NSDate date];
        NSTimeInterval animationDuration = 3.;
        NSTimeInterval t = -[now timeIntervalSinceNow];
        while (t < animationDuration) {
            t = -[now timeIntervalSinceNow];
            // Do some animation here, by manipulating the image
            // or the displayView.
            // <calculate animation>, do something with img.
            // You have exact timing information in t,
            // so you can set the scale factor derived from t.
            // You don't have to use a UIImageView: you can create your own view
            // and do the work in drawRect:. Whatever you do, the results will
            // appear in the view if you added a subview,
            // or in a layer if you are using the layer.
            dispatch_sync(dispatch_get_main_queue(), ^{
                // display the result
                displayView.image = img;
                [displayView setNeedsDisplay];
            });
        }
        // now the animation is done, present the real view controller
        // (back on the main queue) and clean up
        dispatch_async(dispatch_get_main_queue(), ^{
            [self presentModalViewController:lvc animated:NO];
            [displayView removeFromSuperview];
        });
    });
}
Perhaps you could use something like
CGAffineTransform tr = CGAffineTransformScale(lvc.view.transform, 0.5, 0.5);
to embed a scaled-down version of the view in your parent view controller, then present lvc modally and restore the scale when the user taps the view.
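A rough sketch of that idea (the scale factor, center point, and timing are only illustrative):
LargeViewController *lvc = [[LargeViewController alloc] init];
lvc.view.frame = self.view.bounds;

// Start shrunk down over the small table view
lvc.view.transform = CGAffineTransformMakeScale(0.25, 0.25);
lvc.view.center = CGPointMake(150, 200); // roughly where the small table view sits
[self.view addSubview:lvc.view];

[UIView animateWithDuration:0.3
                 animations:^{
                     lvc.view.transform = CGAffineTransformIdentity;
                     lvc.view.center = self.view.center;
                 }
                 completion:^(BOOL finished) {
                     // Swap in the real modal presentation once the zoom is done
                     [lvc.view removeFromSuperview];
                     [self presentModalViewController:lvc animated:NO];
                 }];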
UIKit takes care of most of this for you. While jbat100's solution could be made to work too, you should be able to do this simply by setting lvc's initial frame to the smaller rect you want to start out at; then, when you set the frame to its full size, the implicit animation for changing the frame will handle the zooming animation for you. Each UIView has a CALayer that its content is drawn in, and that layer has several implicit animations set up to animate changes to certain properties, such as the frame or position. Here is my untested stab at it:
.
.
    lvc.view.frame = CGRectMake(50, 50, 200, 300);
    [self performSelector:@selector(setFrameToFullScreen) withObject:nil afterDelay:0];
}

- (void)setFrameToFullScreen {
    lvc.view.frame = [UIScreen mainScreen].bounds;
}
The performSelector:withObject:afterDelay: call will cause setFrameToFullScreen to be called on the next run loop cycle. If you don't do something like that, then only the final frame will be used: the system won't recognize the change in the frame and apply its implicit animation to the view's layer.
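Putting those pieces together, a sketch might look like this (lvc is assumed to be an ivar so both methods can see it; if the implicit animation doesn't fire for your setup, the explicit animation block in setFrameToFullScreen produces the same zoom):
- (void)zoomToLargeController {
    lvc = [[LargeViewController alloc] init];
    [self presentModalViewController:lvc animated:NO];

    // Start at the small table view's rect...
    lvc.view.frame = CGRectMake(50, 50, 200, 300);
    [self performSelector:@selector(setFrameToFullScreen) withObject:nil afterDelay:0];
}

- (void)setFrameToFullScreen {
    // ...and grow to full screen on the next run loop cycle
    [UIView animateWithDuration:0.3 animations:^{
        lvc.view.frame = [UIScreen mainScreen].bounds;
    }];
}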
Is it possible to implement a smooth transition when the app loads, from the launch image to the first view?
The default behavior is on/off, with an immediate change: the launch image appears, then it instantaneously disappears to let the main view controller take place. I'd like to implement some fading or zooming in or out.
Is this possible?
Thank you!
There's no framework support, but you can get that result if you do it yourself, manually. Depending on what your launch image is and what your UI looks like, you can do it in different ways, but basically: make your first view controller load and display your Default.png image in an image view when it loads up, then animate a fade-out of that image to reveal your actual UI.
Modified Dancreek's answer to do it all in AppDelegate application:didFinishLaunchingWithOptions. I like this because the code is guaranteed to only run at app start, and it's not polluting any of the view controllers.
It's very simple:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // set up your root view and stuff....
    //.....(do whatever else you need to do)...

    // show the main window, overlay with splash screen + alpha dissolve...
    UIImageView *splashScreen = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Default.png"]];
    [self.window addSubview:splashScreen];
    [self.window makeKeyAndVisible];

    [UIView animateWithDuration:0.3
                     animations:^{ splashScreen.alpha = 0.0; }
                     completion:^(BOOL finished) {
                         [splashScreen removeFromSuperview];
                     }];

    return YES;
}
You are in luck; I just did this a few minutes ago. You need a splash screen: an image view on your first view showing exactly the same image as the Default.png the device loads. Then have your app dismiss it with a fade animation triggered from viewDidAppear:
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self performSelector:@selector(killSplashScreen) withObject:nil afterDelay:1.0];
}

- (void)killSplashScreen {
    [UIView animateWithDuration:0.5 animations:^{ splashScreen.alpha = 0.0; } completion:NULL];
}
We often use something called "splashView" to do this. It was written by Shannon Appelcline and is available under a Creative Commons license. You will have to do some Googling to find it.
//
// splashView.h
// version 1.1
//
// Created by Shannon Appelcline on 5/22/09.
// Copyright 2009 Skotos Tech Inc.
//
// Licensed Under Creative Commons Attribution 3.0:
// http://creativecommons.org/licenses/by/3.0/
// You may freely use this class, provided that you maintain these attribute comments
//
// Visit our iPhone blog: http://iphoneinaction.manning.com
//
I am trying to rotate my object like shaking dice.
Please suggest the simplest way to implement this in my iPhone application.
Any kind of sample code or documentation would be appreciated.
http://www.amazon.com/OpenGL-SuperBible-Comprehensive-Tutorial-Reference/dp/0321498828
If you want to do this without using OpenGL, you can use a UIImageView and set a series of animation images. By creating a series of images you can create a "rolling dice" effect. Here is an example of how to do this:
// create a UIImageView
UIImageView *rollDiceImageMainTemp = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"rollDiceAnimationImage1.png"]];

// position and size the UIImageView
rollDiceImageMainTemp.frame = CGRectMake(0, 0, 100, 100);

// create an array of images that will represent your animation (in this case the array contains 2 images but you will want more)
NSArray *savingHighScoreAnimationImages = [NSArray arrayWithObjects:
                                           [UIImage imageNamed:@"rollDiceAnimationImage1.png"],
                                           [UIImage imageNamed:@"rollDiceAnimationImage2.png"],
                                           nil];

// set the new UIImageView to a property in your view controller
self.viewController.rollDiceImageMain = rollDiceImageMainTemp;

// release the UIImageView that you created with alloc and init to avoid a memory leak
[rollDiceImageMainTemp release];

// set the animation images, duration, and repeat count on your UIImageView
[self.viewController.rollDiceImageMain setAnimationImages:savingHighScoreAnimationImages];
[self.viewController.rollDiceImageMain setAnimationDuration:2.0];
[self.viewController.rollDiceImageMain setAnimationRepeatCount:3];

// start the animation
[self.viewController.rollDiceImageMain startAnimating];

// show the new UIImageView
[self.viewController.view addSubview:self.viewController.rollDiceImageMain];
You can modify the creation and setup, including the size, position, number of images, duration, and repeat count, as needed. I've used this feature of UIImageView to create simple animation effects and it works great!
Please let me know if you try this and if it works for your needs. Also, post any follow up questions.
Bart
Nice example, but do you think it's possible to control the rotation with a touch gesture?