iPhone UIImagePickerController: show camera preview on top of OpenGL viewport

I have a question about the UIImagePickerController class reference, in photo-camera mode.
I'm developing an OpenGL game, and I need to add a photo-camera feature to it: the game should open a window (say 200x200 pixels, in the middle of the screen) that displays a real-time camera preview IN FRONT OF THE GL VIEWPORT. So the camera preview must sit in a 200x200 window in front of the GL viewport that displays the game.
I have some questions:
- The main problem is that I'm having difficulty opening the UIImagePickerController window in front of the GL viewport (normally the UIImagePickerController window covers the whole iPhone screen).
- What is the best way to capture an image buffer periodically in order to run some operations on it, like face detection (we have a library that does this on a bitmap image)?
- Could Apple reject this approach? Is it even possible to use the camera this way (a camera preview window that partially overlaps an OpenGL viewport)?
- Finally, is it possible to avoid showing the camera shutter? I'd like to initialize the camera without the shutter sound and the shutter animation.
This is a screenshot:
http://www.powerwolf.it/temp/UIImagePickerController.jpg

If you want to do something more custom than what Apple intended for UIImagePickerController, you'll need to use the AV Foundation framework instead. The camera input can be piped into a layer or view. Here is an example that will get you halfway there (it is intended for frame capture); you could adapt it for face detection by taking sample images on a timer. As long as you use these public APIs, it will be accepted in the App Store. There are a bunch of augmented-reality applications that use similar techniques.
http://developer.apple.com/library/ios/#qa/qa2010/qa1702.html
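QA1702 is written in Objective-C, but the idea carries over directly. Below is a rough sketch of the AV Foundation approach in Swift; the class, queue, and view names are made up, and a real app would also need camera-permission handling and error checks. The preview layer gives you the 200x200 window over the GL view, the video data output gives you frames for face detection, and because nothing is "captured" there is no shutter sound or animation.

```swift
import AVFoundation
import UIKit

final class CameraOverlayController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private var frameCounter = 0

    // Call this once, passing the view that hosts your GL content.
    func start(over glView: UIView) {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Live preview in a 200x200 window in the middle of the GL view.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = CGRect(x: glView.bounds.midX - 100,
                                    y: glView.bounds.midY - 100,
                                    width: 200, height: 200)
        glView.layer.addSublayer(previewLayer)

        // Raw frames for face detection arrive on this delegate queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()   // requires NSCameraUsageDescription in Info.plist
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        frameCounter += 1
        // Sample roughly once a second (at ~30 fps) instead of every frame.
        guard frameCounter % 30 == 0,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Convert `pixelBuffer` to a bitmap here and hand it to your face-detection library.
        _ = pixelBuffer
    }
}
```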

Related

Apply custom camera filters on live camera preview - Swift

I'm looking to make a native iPhone iOS application in Swift 3/4 that uses the live preview of the back-facing camera and lets users apply filters, like in the built-in Camera app. The idea is to create my own filters by adjusting hue/RGB/brightness levels, etc. Eventually I want to create a hue slider that lets users filter for specific colours in the live preview.
All of the answers I came across for similar problems were posted more than two years ago, and I'm not sure they still provide the relevant, up-to-date solution I'm looking for.
I'm not looking to take a photo and then apply a filter afterwards. I'm looking for the same functionality as the native Camera app. To apply the filter live as you are seeing the camera preview.
How can I create this functionality? Can this be achieved using AVFoundation? AVKit? Can this functionality be achieved with ARKit perhaps?
Yes, you can apply image filters to the camera feed by capturing video with the AVFoundation Capture system and using your own renderer to process and display video frames.
Apple has a sample code project called AVCamPhotoFilter that does just this, and shows multiple approaches to the process, using Metal or Core Image. The key points are to:
- Use AVCaptureVideoDataOutput to get live video frames.
- Use CVMetalTextureCache or CVPixelBufferPool to make the video pixel buffers accessible to your favorite rendering technology.
- Draw the textures using Metal (or OpenGL or whatever), with a Metal shader or Core Image filter doing the pixel processing on the GPU during your render pass.
BTW, ARKit is overkill if all you want to do is apply image processing to the camera feed. ARKit is for when you want to know about the camera’s relationship to real-world space, primarily for purposes like drawing 3D content that appears to inhabit the real world.
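For a concrete starting point, here is a minimal sketch of that capture-plus-filter pipeline in Swift. The class and property names are made up, and for simplicity it renders the filtered frames into a UIImageView rather than the Metal-backed view AVCamPhotoFilter uses, which is the more efficient path for production.

```swift
import AVFoundation
import CoreImage
import UIKit

final class FilterCameraViewController: UIViewController,
                                        AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let ciContext = CIContext()                   // reuse; creating one per frame is slow
    private let hueFilter = CIFilter(name: "CIHueAdjust")!
    private let previewImageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        previewImageView.frame = view.bounds
        previewImageView.contentMode = .scaleAspectFill
        view.addSubview(previewImageView)

        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        // In a real app, start the session off the main thread and add
        // NSCameraUsageDescription to Info.plist.
        session.startRunning()
    }

    // Called for every video frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        hueFilter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        hueFilter.setValue(2.0, forKey: kCIInputAngleKey)  // a hue-slider value would go here
        guard let result = hueFilter.outputImage,
              let cgImage = ciContext.createCGImage(result, from: result.extent) else { return }
        DispatchQueue.main.async {
            self.previewImageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```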

Is there a way to render pixels directly on iPhone?

I want to port a game I've made which renders the screen itself at 50 fps (it doesn't use OpenGL).
What is the best way to port this to the iPhone?
I was reading about Framebuffer Objects. Is this a good approach to render a buffer of pixels to the screen at high speeds?
The fastest way to get pixels on the screen is via OpenGL.
We'd need more info about how your game currently renders to the screen, but I don't see how FBOs will help, as they're usually used for getting a copy of the render buffer, i.e. for creating a screen recording or compositing custom textures on the fly.
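Neither answer spells out the OpenGL route, so here is one possible sketch of it: stream the CPU-rendered pixel buffer into a texture each frame and draw it as a full-screen quad with GLKit. All class and property names are illustrative, and whether per-frame texture uploads keep up with 50 fps on a given device is something you would have to profile.

```swift
import GLKit
import OpenGLES

final class PixelViewController: GLKViewController {
    private let texWidth = 512, texHeight = 512   // power-of-two backing texture
    private var pixels = [UInt8](repeating: 0, count: 512 * 512 * 4)  // RGBA, written by the game
    private var textureID: GLuint = 0
    private let effect = GLKBaseEffect()          // simple built-in textured shader

    // Interleaved x, y, u, v for a full-screen quad (triangle strip).
    private let quad: [GLfloat] = [
        -1, -1, 0, 1,
         1, -1, 1, 1,
        -1,  1, 0, 0,
         1,  1, 1, 0,
    ]

    override func viewDidLoad() {
        super.viewDidLoad()
        let glView = view as! GLKView             // GLKViewController creates a GLKView by default
        glView.context = EAGLContext(api: .openGLES2)!
        EAGLContext.setCurrent(glView.context)
        preferredFramesPerSecond = 50             // match the game's 50 fps

        glGenTextures(1, &textureID)
        glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_NEAREST)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_NEAREST)
        glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, GLsizei(texWidth), GLsizei(texHeight),
                     0, GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), pixels)

        effect.texture2d0.name = textureID
        effect.texture2d0.enabled = GLboolean(GL_TRUE)
    }

    override func glkView(_ view: GLKView, drawIn rect: CGRect) {
        // The game would write its new frame into `pixels` before this point;
        // then the whole buffer is re-uploaded into the existing texture.
        glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
        glTexSubImage2D(GLenum(GL_TEXTURE_2D), 0, 0, 0, GLsizei(texWidth), GLsizei(texHeight),
                        GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), pixels)

        glClear(GLbitfield(GL_COLOR_BUFFER_BIT))
        effect.prepareToDraw()

        // Draw the textured full-screen quad from client-side vertex arrays.
        quad.withUnsafeBytes { raw in
            let stride = GLsizei(4 * MemoryLayout<GLfloat>.size)
            let base = raw.baseAddress!
            glEnableVertexAttribArray(GLuint(GLKVertexAttrib.position.rawValue))
            glVertexAttribPointer(GLuint(GLKVertexAttrib.position.rawValue), 2,
                                  GLenum(GL_FLOAT), GLboolean(GL_FALSE), stride, base)
            glEnableVertexAttribArray(GLuint(GLKVertexAttrib.texCoord0.rawValue))
            glVertexAttribPointer(GLuint(GLKVertexAttrib.texCoord0.rawValue), 2,
                                  GLenum(GL_FLOAT), GLboolean(GL_FALSE), stride,
                                  base + 2 * MemoryLayout<GLfloat>.size)
            glDrawArrays(GLenum(GL_TRIANGLE_STRIP), 0, 4)
        }
    }
}
```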
If I ever need to create an app where I have to access the pixels directly and don't have direct access to the hardware, I use SDL, as it just requires you to create a surface, and from there you can manipulate the pixels directly. As far as I'm aware you can use SDL on the iPhone, and maybe even accelerate it using OpenGL too.

How to create a distorted screen

I want to show a distorted image as an error page for my application. If possible, this could be a screenshot of the home screen with some graphical distortion. Is this possible?
Thank you.
As Daniel A. White's comment mentions, this probably will cause your application to be rejected from the App Store, but it can be accomplished in many ways. I think this technique would be acceptable if your own interface appeared broken, but not acceptable if you made any iOS supplied looks appear broken.
You could just use your favorite image editor (e.g. Photoshop) to distort a screenshot and display it in a separate UIView. The image would be static, though; it couldn't react to the contents of your program's interface.
If your interface is drawn with OpenGL ES 2.0, you could draw your regular interface to a texture, then use that texture as input to another GLSL program that applies the distortion.
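If you'd rather generate the distortion at run time than in Photoshop, one alternative to the GLSL route is to snapshot your own view hierarchy and push it through a Core Image distortion filter. A rough sketch follows; the function name, filter choice, and parameters are purely illustrative.

```swift
import UIKit
import CoreImage

func showDistortedErrorOverlay(over view: UIView) {
    // 1. Snapshot the current contents of your own view hierarchy
    //    (not the home screen; your app can only see its own UI).
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let snapshot = renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
    }

    // 2. Distort the snapshot with Core Image.
    guard let input = CIImage(image: snapshot) else { return }
    let filter = CIFilter(name: "CITwirlDistortion")!
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(CIVector(x: input.extent.midX, y: input.extent.midY),
                    forKey: kCIInputCenterKey)
    filter.setValue(150, forKey: kCIInputRadiusKey)

    let context = CIContext()
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: input.extent) else { return }

    // 3. Put the distorted image on top of everything as the "error page".
    let overlay = UIImageView(frame: view.bounds)
    overlay.image = UIImage(cgImage: cgImage)
    view.addSubview(overlay)
}
```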

Overlay layer on photo, and export result at higher resolution than the native display

I want to take a photo with the device's camera, overlay a CALayer on top of it, and export the photo at the native resolution of the device's camera.
The best solution I could come up with so far was to overlay the CALayer on top of the UIView and capture the current view state using [CALayer renderInContext:]. However, the result of this is at the device's native screen resolution, not the original resolution of the camera image.
Any better suggestions?
Thanks as always
What you actually want to do in this situation is draw the image offscreen using Core Graphics. It's slightly more complicated, but it's the best solution I have found.
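As a rough sketch of that offscreen-drawing idea (the function and parameter names here are hypothetical): render into a bitmap context sized to the photo rather than to the screen, draw the full-resolution camera image first, then scale the overlay layer up from screen coordinates and render it into the same context.

```swift
import UIKit

// Composite an overlay CALayer onto a camera photo at the photo's native resolution.
func composite(photo: UIImage, overlayLayer: CALayer) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = 1                                   // work in photo pixels, not screen points
    let renderer = UIGraphicsImageRenderer(size: photo.size, format: format)
    return renderer.image { ctx in
        // 1. The full-resolution camera image is the background.
        photo.draw(in: CGRect(origin: .zero, size: photo.size))

        // 2. Scale from the overlay's on-screen size up to photo coordinates,
        //    then render the layer into the same offscreen context.
        let scaleX = photo.size.width / overlayLayer.bounds.width
        let scaleY = photo.size.height / overlayLayer.bounds.height
        ctx.cgContext.scaleBy(x: scaleX, y: scaleY)
        overlayLayer.render(in: ctx.cgContext)
    }
}
```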

Rendering a splash screen on the iPhone using OpenGL ES

I want to render a splash screen on the iPhone whilst using an Open GL view. The iPhone screen as we know is 320x480, which is not a power of 2.
Before I enter the world of chopping the texture up and rendering sub-parts, or embedding the screen image in a larger texture page, I was wondering whether there is another way.
Is it possible to overlay another view that I could render to using Core Graphics functions? Or is it possible to render to an OpenGL surface using Core Graphics functions?
What would you recommend?
Cheers
Rich
It's entirely possible to write code that creates a 512x512 texture, loads the image into it, and then renders only a portion of that texture (by mapping it onto a polygon and altering the texture-mapping UV coordinates).
This method is best for static images only; you couldn't really perform pixel-by-pixel real-time updates this way, because updating the texture via OpenGL ES is currently too slow.
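For reference, a small sketch of the texture-coordinate arithmetic for a 320x480 image packed into the corner of a 512x512 power-of-two texture (the variable names are just illustrative):

```swift
// The 320x480 screen image occupies only part of the 512x512 texture,
// so the quad samples just that sub-rectangle via its UV coordinates.
let imageWidth: Float = 320, imageHeight: Float = 480
let textureSize: Float = 512
let uMax = imageWidth / textureSize        // 0.625
let vMax = imageHeight / textureSize       // 0.9375

// Interleaved x, y, u, v for a full-screen quad (triangle strip);
// no chopping or sub-part rendering is needed.
let splashQuad: [Float] = [
    -1, -1, 0,    vMax,
     1, -1, uMax, vMax,
    -1,  1, 0,    0,
     1,  1, uMax, 0,
]
```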
I would recommend that you read Apple's Human Interface Guidelines for iPhone, especially the several parts where they warn you over and over not to make splash screens.