Can the exposure time be manually adjusted for an iOS camera?

I want fine-grained control over the exposure of the iPhone/iPod touch camera. I would like to take a series of photos with decreasing exposure times to obtain a sequence of images (for HDR reconstruction). Is this possible?
If not, what's the next best thing? It seems you can set a point of interest in the image for the auto-exposure. Perhaps I could search for a dark/light region of the image and then use this exposurePointOfInterest to adjust the exposure, but this seems like a very indirect solution that is also error-prone. If anybody has tried an alternative, I'd like to hear about that as well.

iOS gives you control over frame durations via videoMinFrameDuration and videoMaxFrameDuration, and exposure times vary with the frame rate and frame duration. By setting the minimum and maximum frame duration to a particular value you lock the frame rate, and that in turn constrains the exposure time. This is also a very indirect way of controlling it, but maybe it helps your case. An example would look like this:
// conn is the AVCaptureConnection of your video output;
// CAPTURE_FRAMES_PER_SECOND is assumed to be defined elsewhere, e.g. 30.
if (conn.isVideoMinFrameDurationSupported)
    conn.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (conn.isVideoMaxFrameDurationSupported)
    conn.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);

Since you would have to change the shutter speed of the camera, this unfortunately does not appear to be possible and, more importantly, is against the HIG:
Changing the behavior of iPhone external hardware is a violation of
the iPhone Developer Program License Agreement. Applications must
adhere to the iPhone Human Interface Guidelines as outlined in the
iPhone Developer Program License Agreement section 3.3.7
Related article: Apple Removes Camera+ iPhone App From The App Store After Developer Reveals Hack To Enable Hidden Feature.
If it can be done programmatically, instead of with the hardware, you might have a chance, but then it's just an effect applied to an image, not a true long-exposure picture.
There are some simulated slow shutter apps that do get approved like Slow Shutter or Magic Shutter.
Related article: New iPhone Camera App “Magic Shutter” Hits The App Store.

This is supported since iOS 8:
http://developer.xamarin.com/guides/ios/platform_features/intro_to_manual_camera_controls/
Have a look at AVCaptureExposureModeCustom and CaptureDevice.LockExposure...
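For reference, a minimal native (Swift) sketch of what that looks like. It assumes `device` is an AVCaptureDevice you have already obtained for your capture session, and it keeps error handling to a minimum:
import AVFoundation

// Minimal sketch (iOS 8+): set a custom exposure duration on `device`.
func setExposureDuration(_ seconds: Double, on device: AVCaptureDevice) {
    guard device.isExposureModeSupported(.custom) else { return }
    do {
        try device.lockForConfiguration()
        // Clamp the requested duration to what the active format allows.
        var duration = CMTimeMakeWithSeconds(seconds, preferredTimescale: 1_000_000_000)
        duration = max(duration, device.activeFormat.minExposureDuration)
        duration = min(duration, device.activeFormat.maxExposureDuration)
        // Keep the current ISO; pass an explicit value if you want to vary gain too.
        device.setExposureModeCustom(duration: duration,
                                     iso: AVCaptureDevice.currentISO,
                                     completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
Called in a loop with decreasing durations, capturing a still after each change, this would give you the bracketed sequence the question asks about.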

I tried to do this for my motion-activated camera app (Pocket Sentry) and I found that it is not possible to do this AND get approved in the App Store.

I have been trying to do this myself. I think it's possible only by using the exposure point of interest property. I am detecting the dark and bright spots and then adjusting the point accordingly.
Please refer to: Detecting bright/dark points on iPhone screen
Does anyone know a better way to do this?
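In the meantime, here is a minimal sketch of that point-of-interest approach, assuming `device` is your active AVCaptureDevice and `point` is already normalized to AVFoundation's 0–1 coordinate space:
import AVFoundation
import CoreGraphics

// Bias the auto-exposure toward a chosen point; the new point only takes
// effect once the exposure mode is (re)applied.
func biasExposure(toward point: CGPoint, on device: AVCaptureDevice) {
    guard device.isExposurePointOfInterestSupported,
          device.isExposureModeSupported(.continuousAutoExposure) else { return }
    do {
        try device.lockForConfiguration()
        device.exposurePointOfInterest = point
        device.exposureMode = .continuousAutoExposure
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}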

I am not too sure, but you could try using the AVFoundation framework to build the camera app, following Apple's sample code:
AVCam Sample Code
Then try to leverage the exposureMode property of the class:
exposureMode Class Reference

Related

How to improve camera quality in ARKit

I am building an ARKit app where we want to be able to take a photo of the scene. I am finding the image quality of the ARCamera view is not good enough to take photos with on an iPad Pro.
Standard camera image:
ARCamera image:
I have seen an Apple forum post that mentions this could be iPad Pro 10.5 specific and is related to fixed lens position (https://forums.developer.apple.com/message/262950#262950).
Is there a public way to change this setting?
Alternatively, I have tried to use AVCaptureSession to take a normal photo and apply it to sceneView.scene.background.contents to switch out the blurred image for a higher-res image at the point the photo is taken, but I can't get AVCapturePhotoOutput to work with ARKit.
Update: Congrats to whoever filed feature requests! In iOS 11.3 (aka "ARKit 1.5"), you can control at least some of the capture settings. And you now get 1080p with autofocus enabled by default.
Check ARWorldTrackingConfiguration.supportedVideoFormats for a list of ARConfiguration.VideoFormat objects, each of which defines a resolution and frame rate. The first in the list is the default (and best) option supported on your current device, so if you just want the best resolution/framerate available you don't have to do anything. (And if you want to step down for performance reasons by setting videoFormat, it's probably better to do that based on array order rather than hardcoding sizes.)
Autofocus is on by default in iOS 11.3, so your example picture (with a subject relatively close to the camera) should come out much better. If for some reason you need to turn it off, there's a switch for that.
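For completeness, that switch looks roughly like this (a sketch assuming `sceneView` is your ARSCNView):
import ARKit

// iOS 11.3+: autofocus is on by default; flip it off if you prefer the old
// fixed-focus behaviour or it causes problems for your scene.
func disableAutofocus(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.isAutoFocusEnabled = false
    sceneView.session.run(configuration)
}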
There's still no API for changing the camera settings for the underlying capture session used by ARKit.
According to engineers back at WWDC, ARKit uses a limited subset of camera capture capabilities to ensure a high frame rate with minimal impact on CPU and GPU usage. There's some processing overhead to producing higher quality live video, but there's also some processing overhead to the computer vision and motion sensor integration systems that make ARKit work — increase the overhead too much, and you start adding latency. And for a technology that's supposed to show users a "live" augmented view of their world, you don't want the "augmented" part to lag camera motion by multiple frames. (Plus, on top of all that, you probably want some CPU/GPU time left over for your app to render spiffy 3D content on top of the camera view.)
The situation is the same between iPhone and iPad devices, but you notice it more on the iPad just because the screen is so much larger — 720p video doesn't look so bad on a 4-5" screen, but it looks awful stretched to fill a 10-13" screen. (Luckily you get 1080p by default in iOS 11.3, which should look better.)
The AVCapture system does provide for taking higher resolution / higher quality still photos during video capture, but ARKit doesn't expose its internal capture session in any way, so you can't use AVCapturePhotoOutput with it. (Capturing high resolution stills during a session probably remains a good feature request.)
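One workaround not covered above: you can't attach AVCapturePhotoOutput, but you can read the pixel buffer of the frame ARKit is already giving you. A rough sketch, limited to the session's video resolution and ignoring orientation handling:
import ARKit
import CoreImage
import UIKit

// Grab the current camera frame from a running ARSession as a UIImage.
// Resolution is capped at the session's video format (720p/1080p), so this
// is not a substitute for a real high-resolution still.
func snapshotCurrentFrame(of session: ARSession) -> UIImage? {
    guard let frame = session.currentFrame else { return nil }
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)   // note: still in sensor (landscape) orientation
}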
I had to look for a while to figure out how to set the config, so maybe it will help somebody. The simplest form is:
configuration.videoFormat = ARWorldTrackingConfiguration.supportedVideoFormats[1]
The snippet below instead picks the format with the highest resolution; you can change it so that it picks by most fps, etc.:
if let videoFormat = ARWorldTrackingConfiguration.supportedVideoFormats
    .sorted(by: { ($0.imageResolution.width * $0.imageResolution.height) < ($1.imageResolution.width * $1.imageResolution.height) })
    .last {
    configuration.videoFormat = videoFormat
}

What is the difference between AVCapture and the default camera of the iPhone?

My app uses AVCapture to capture images; this was my supervisor's idea. But I have researched on the internet and can't find any information about the difference between AVCapture and the default camera of the iPhone or iPod (tap to focus, camera quality, ...). Please tell me what the advantages of the AVFoundation framework are.
With AVCaptureSession you can give your recorder a lot more functionality. You can customize nearly every aspect of the recording session, and you can even get the raw data straight from the camera. The code can get quite complex, however, and nothing is taken care of for you.
With the iOS default image capture controller you will be stuck with a few presets, and you will only have a little bit of camera functionality. But it is really simple to implement.
Updated with a link to Apple's sample code:
If you want to see how to use AVFoundation to do your camera recording, you will probably like this app from Apple.
Like I said, you will have to do everything manually, so be prepared for a fair amount of work.
AVCam demo app by Apple
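To make the contrast concrete, here is a rough sketch of the minimal wiring AVCaptureSession needs before you can even take a photo (using the modern Swift API; the function name is just illustrative):
import AVFoundation

// Minimal capture pipeline: session + camera input + photo output.
// Everything after this (preview layer, capture call, delegate) is also on you.
func makePhotoSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.sessionPreset = .photo

    guard let camera = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: camera),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    let photoOutput = AVCapturePhotoOutput()
    guard session.canAddOutput(photoOutput) else { return nil }
    session.addOutput(photoOutput)

    session.startRunning()
    return session
}
With UIImagePickerController, by contrast, you just set sourceType to .camera and present it: far less code, but also far less control.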

iOS 4.2, looking for a way to manipulate the iPhone 4 camera's focus distance

I am working on an AR project, and we want to manipulate the focus distance of the iPhone 4 camera. Is this even possible? So far, we've found only toggling and auto focusing as options, listed here: http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html%23//apple_ref/occ/instm/AVCaptureDevice/isAdjustingFocus
Thanks in advance for any tips! :)
Regarding the API, it seems that the only supported actions are:
- check whether AF is supported on the device (iPhone 3GS and 4 only, I think)
- enable/disable AF
- set the point of interest, which is NOT a distance, but only a point in the camera view
Certainly not what you want to do.
Might be supported in private API... but that would not pass the validation process.
A workaround might be to see how much the pixels shift as the user moves slightly, to get a sense of how distant different parts of the image are, and then set the AF point to a region of the image that is either closer or further away based on that.
But, also file a Radar ticket requesting access to specify the focus distance if possible - if enough people ask Apple will add it to the API.
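For reference, those controls look roughly like this in code (a sketch; `device` is your AVCaptureDevice, and the point is a normalized view coordinate, not a distance):
import AVFoundation
import CoreGraphics

// You can pick where to focus and how (auto vs. continuous), but not how far.
func focus(at point: CGPoint, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = point
        }
        if device.isFocusModeSupported(.autoFocus) {
            device.focusMode = .autoFocus   // one-shot refocus at that point
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}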

iPhone demo help: anyone know of a faster screen capture alternative to UIGetScreenImage()?

I'm working on an iPhone app that I'm going to be demo'ing to a live audience soon.
I'd really like to demo the app live over VGA to a projector, rather than show screenshots.
I bought a VGA adapter for iPhone, and have adapted Rob Terrell's TVOutManager to suit my needs. Unfortunately, the frame rate after testing on my television at home just isn't that good - even on an iPhone 4 (perhaps 4-5 frames per second, it varies).
I believe the reason for this slowness is that the main routine I'm using to capture the device's screen (which is then being displayed on an external display) is UIGetScreenImage(). This routine, which is no longer allowed to be part of shipping apps, is actually quite slow. Here's the code I'm using to capture the screen (FYI mirrorView is a UIImageView):
CGImageRef cgScreen = UIGetScreenImage();
self.mirrorView.image = [UIImage imageWithCGImage:cgScreen];
CGImageRelease(cgScreen);
Is there a faster method I can use to capture the iPhone's screen and achieve a better frame rate (shooting for 20+ fps)? It doesn't need to pass Apple's app review - this demo code won't be in the shipping app. If anyone knows of any faster private APIs, I'd really appreciate the help!
Also, the above code is being executed using a repeating NSTimer which fires every 1.0/desiredFrameRate seconds (currently every 0.1 seconds). I'm wondering if instead wrapping those calls in a block and using GCD or an NSOperationQueue might be more efficient than having the NSTimer invoke my updateTVOut obj-c method that currently contains those calls. Would appreciate some input on that too - some searching seems to indicate that obj-c message sending is somewhat slow compared to other operations.
Finally, as you can see above, the CGImageRef that UIGetScreenImage() returns is being turned into a UIImage and then that UIImage is being passed to a UIImageView, which is probably resizing the image on the fly. I'm wondering if the resizing might be slowing things down even more. Ideas of how to do this faster?
Have you looked at Apple's recommended alternatives to UIGetScreenImage? From the "Notice regarding UIGetScreenImage()" thread:
Applications using UIGetScreenImage() to capture images from the camera should instead use AVCaptureSession and related classes in the AV Foundation Framework. For an example, see Technical Q&A QA1702, "How to capture video frames from the camera as images using AV Foundation". Note that use of AVCaptureSession is supported in iOS4 and above only.
Applications using UIGetScreenImage() to capture the contents of interface views and layers should instead use the -renderInContext: method of CALayer in the QuartzCore framework. For an example, see Technical Q&A QA1703, "Screen capture in UIKit applications".
Applications using UIGetScreenImage() to capture the contents of OpenGL ES based views and layers should instead use the glReadPixels() function to obtain pixel data. For an example, see Technical Q&A QA1704, "OpenGL ES View Snapshot".
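As a rough illustration of the second option (QA1703), capturing a UIKit view hierarchy with -renderInContext: looks something like this; note that it will not pick up OpenGL ES or video content:
import UIKit

// Render a view's layer tree into a UIImage on the main thread.
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.isOpaque, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    view.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}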
New solution: get an iPad 2 and mirror the output! :)
I don't know how fast this is, but it's worth a try ;)
CGImageRef screenshot = [[UIApplication sharedApplication] _createDefaultImageSnapshot];
[myVGAView.layer setContents:(id)screenshot];
where _createDefaultImageSnapshot is a private API. (Since it's for a demo... it's OK, I suppose.)
and myVGAView is a normal UIView.
If you get CGImageRefs, then just pass them to the contents of a layer; it's lighter and should be a little bit faster (but just a little bit ;) )
I haven't got the solution you want (simulating video mirroring), but you can move your views to the external display. This is what I do, and there is no appreciable impact on the frame rate. However, since the view is no longer on the device's screen, you obviously can no longer directly interact with it or see it. If you have something like a game controlled with the accelerometer, this shouldn't be a problem; something touch-based, however, will require some work. What I do is have an alternative view on the device when the primary view is external. For me this is a 2D control view to "command" the normal 3D view. If you have a game, you could perhaps create an alternative input view to control the game with (virtual buttons/joystick etc.). It really depends on what you have as to how best to work around it.
Not having jailbroken a device myself I can't say for sure, but I am under the impression that a jailbroken device can essentially enable video mirroring (like they use at the Apple demos...). If true, that is likely your easiest route if all you want is a demo.
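A rough sketch of the external-display approach described above, assuming you keep the returned window alive in a property and using the pre-iOS 13 UIWindow.screen API that matches the era of this answer:
import UIKit

// Put a view controller's content on the second (external) screen, if present.
func attachToExternalDisplay(rootViewController: UIViewController) -> UIWindow? {
    guard UIScreen.screens.count > 1 else { return nil }   // no adapter connected
    let externalScreen = UIScreen.screens[1]
    let window = UIWindow(frame: externalScreen.bounds)
    window.screen = externalScreen
    window.rootViewController = rootViewController
    window.isHidden = false
    return window
}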

Getting Framerate Performance on iPhone

I've just come off the PSP, where performance testing was easy. You just turned off 'vsync' and printed out the framerate, then changed something and saw whether the frame rate went up or down...
Is there any way to do the same thing on the iPhone? How do you turn vsync off? The Instruments tool is next to useless; its chief problem is that running it adversely affects the performance of the app! Also, the frame rate it reports is extremely sporadic.
I don't want any fancy tool that reports call trees and time spent in each function. I just want an unrestricted frame rate and some way to see what it is. Is there a high-precision counter that you can use on the iPhone? Something like QueryPerformanceCounter in Windows?
Also, is there any way to somehow KILL background processes so you know they can't affect the performance, perhaps solving the sporadic frame rate problem?
Profile your app with Instruments and use the Core Animation instrument. It gives a frame rate.
You're taking the try-something-and-measure approach. Very indirect. It's easy to tell exactly what is taking the time; it doesn't depend on what else is going on and doesn't require learning a new tool. All you need is a debugger that you can interrupt.
You can't kill background processes on the iPhone. That would make it possible for a buggy or malicious app to interfere with the phone function and the needs of all other functions on the iPhone are subordinated to the phone.
Try QuartzDebug or OpenGL Profiler.
Use Instruments to get the frame rate.
To do this, profile your app (click and hold the Run button in Xcode and choose Profile). Make sure you are running your app on a device. Choose the OpenGL ES Analysis template and look at the data displayed under Core Animation frames per second.
You want to aim for 60fps.