Accessing iOS 6 new APIs for camera exposure and shutter speed

On Apple's iOS 6.0 feature page, it used to say
Take advantage of the built-in camera’s advanced features. New APIs let you control focus, exposure, and region of interest. You can also access and display faces with face detection APIs, and leverage hardware-enabled video stabilization.
This text has since been removed, and I can't find new methods in the API for controlling exposure. In the AVCaptureDevice class under "Exposure Settings" there is no new property or method for iOS 6.0. Do you know where I can find the new exposure features in the API?

It's true that there is an -exposureMode property on AVCaptureDevice, but that's only for setting the mode (off/auto/continuous), not the actual f-stop, shutter speed, or ISO. Camera apps that provide "exposure" control all seem to do it through post-processing.
However, it seems there are undocumented APIs in the framework to do this. Check out the full headers for AVCaptureDevice.h (via a class-dump) and note the following methods:
- (void)setManualExposureSupportEnabled:(BOOL)arg1;
- (BOOL)isManualExposureSupportEnabled;
- (void)setExposureGain:(float)arg1;
- (float)exposureGain;
- (void)setExposureDuration:(struct { long long x1; int x2; unsigned int x3; long long x4; })arg1;
- (struct { long long x1; int x2; unsigned int x3; long long x4; })exposureDuration;
- (void)setExposureMode:(int)arg1;
- (int)exposureMode;
- (BOOL)isExposureModeSupported:(int)arg1;
My guess is that gain is the equivalent of f-stop (the aperture is fixed), and duration is shutter speed. I wonder if these are used for the iPhone 5's low-light boost mode.
You can also use otool to poke around and try to piece together the symbols. There's likely a new constant in exposureMode for enabling manual control, and exposureDuration seems like it has flags too. When calling these, make sure to use the new -isExposureModeSupported: and also call -respondsToSelector: to check compatibility.
As always, using private APIs is frowned upon by Apple and is cause for rejection from the App Store. There might be ways around this, such as hiding the calls using -performSelector: or objc_msgSend with ROT13'd strings or something, since I'm pretty sure they only do static analysis on the app binary.

I've managed to 'trick' the camera into running a shorter exposure time, but I suspect it will only be of use to those doing similar (macro) image acquisition. I first set up the AVCaptureDevice to use AVCaptureExposureModeContinuousAutoExposure and set the flash to torch mode. I then call -unlockForConfiguration and set up a key-value observer to watch for adjustingExposure to finish. I then re-lock the device, flip to AVCaptureExposureModeLocked, and turn off the torch. This has the effect of brute-force setting a shorter shutter speed than the camera would select on the un-illuminated scene. By playing with the torch level I can set any relative shutter speed value I want (it would of course be best to leave the torch on, but in my application it produces glare on the subject). Again, this only really works when your object distance is very close (less than, say, 6 inches), but it has allowed me to eliminate hand-shake blurring in my close-up images. The downside is that the images are darker, since I don't have a way of spoofing the camera gain, but that's not a problem in my particular application.
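Here is a minimal Swift sketch of that torch trick, assuming you already hold a configured AVCaptureDevice (the class and helper names are mine, not from the original post):

import AVFoundation

// Sketch of the torch trick described above. `camera` is assumed to be
// your already-selected capture device; error handling is minimal.
final class ShortExposureTrick {
    private let camera: AVCaptureDevice
    private var observation: NSKeyValueObservation?

    init(camera: AVCaptureDevice) {
        self.camera = camera
    }

    func lockShortExposure() throws {
        try camera.lockForConfiguration()
        if camera.isExposureModeSupported(.continuousAutoExposure) {
            camera.exposureMode = .continuousAutoExposure
        }
        if camera.hasTorch { camera.torchMode = .on }
        camera.unlockForConfiguration()

        // Watch for auto-exposure to settle on the torch-brightened scene...
        observation = camera.observe(\.isAdjustingExposure, options: [.new]) { device, change in
            guard change.newValue == false else { return }
            // ...then freeze the (now shorter) exposure and kill the torch.
            if (try? device.lockForConfiguration()) != nil {
                if device.isExposureModeSupported(.locked) {
                    device.exposureMode = .locked
                }
                if device.hasTorch { device.torchMode = .off }
                device.unlockForConfiguration()
            }
        }
    }
}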

It looks like they've updated that linked text—there's no mention of new APIs for exposure:
Use powerful new features of the built-in camera. New APIs support real-time video stabilization, an improved LED flash, and face detection and display. You can get reports of dropped frames during capture and leverage new utilities to map UI touches to focus and exposure commands. And apps that support iPhone 5 can take advantage of low light boost mode.
There is an opt-in low-light boost mode for iPhone 5, detailed here by Jim Rhoades (and in this developer forum post, log-in required).
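For reference, opting in looks roughly like this (a Swift sketch using the public AVCaptureDevice properties; `device` is assumed to be your capture device):

import AVFoundation

// Sketch: opt in to the iPhone 5 low-light boost mode (iOS 6+).
func enableLowLightBoost(on device: AVCaptureDevice) throws {
    guard device.isLowLightBoostSupported else { return }
    try device.lockForConfiguration()
    device.automaticallyEnablesLowLightBoostWhenAvailable = true
    device.unlockForConfiguration()
}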

As a follow-up to Michael Grinich's excellent information, I found that there is an order dependency on some of the calls in the private API. To use "manual" exposure controls, you have to enable them before you set the mode, like so:
#define AVCaptureExposureModeManual 3

NSError *error = nil;
if ([captureDevice lockForConfiguration:&error]) {
    captureDevice.manualExposureSupportEnabled = YES;
    if ([captureDevice isExposureModeSupported:AVCaptureExposureModeManual]) {
        captureDevice.exposureMode = AVCaptureExposureModeManual;
        captureDevice.exposureGain = ...;
        captureDevice.exposureDuration = {...};
    }
    [captureDevice unlockForConfiguration];
}
All of this is demonstrated in iOS-ManualCamera.

Starting with iOS 8.0, this is finally possible.
See setExposureModeCustomWithDuration:ISO:completionHandler: and related APIs in the Apple documentation.
Here is an article discussing how to use the APIs.
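A short Swift sketch of how that public API is typically used (the helper name and the clamping logic are my own; the AVCaptureDevice calls are the documented ones):

import AVFoundation

// Sketch: manual shutter speed + ISO via the public iOS 8+ API.
// `device` is assumed to be the AVCaptureDevice you already obtained.
func setManualExposure(on device: AVCaptureDevice, seconds: Double, iso: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    guard device.isExposureModeSupported(.custom) else { return }

    // Clamp the requested values to what the active format supports.
    let format = device.activeFormat
    let requested = CMTimeMakeWithSeconds(seconds, preferredTimescale: 1_000_000)
    let duration = min(max(requested, format.minExposureDuration), format.maxExposureDuration)
    let sensitivity = min(max(iso, format.minISO), format.maxISO)

    device.setExposureModeCustom(duration: duration, iso: sensitivity) { _ in
        // Called when the first frame with the new settings is captured.
    }
}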


Adjust camera focus in ARKit

I want to adjust the device's physical camera focus while in augmented reality. (I'm not talking about the SCNCamera object.)
In an Apple Dev forum post, I've read that autofocus would interfere with ARKit's object detection, which makes sense to me.
Now, I'm working on an app where users will be close to the object they're looking at. With the camera's default focus, everything looks very blurry when you are closer to an object than about 10 cm.
Can I adjust the camera's focus before initializing the scene, or preferably while in the scene?
20.01.2018
Apparently, there's still no solution to this problem. You can read more about it in this reddit post and this developer forum post covering private API workarounds and other (unhelpful) info.
25.01.2018
@AlexanderVasenin provided a useful update pointing to Apple's documentation. It shows that ARKit supports not just focusing but also autofocusing as of iOS 11.3.
See my usage sample below.
As stated by Alexander, iOS 11.3 brings autofocus to ARKit.
The corresponding documentation site shows how it is declared:
var isAutoFocusEnabled: Bool { get set }
You can access it this way:
var configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = true // or false
However, as it is true by default, you should not even have to set it manually unless you choose to opt out.
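Note that the configuration only takes effect once you run it on the session. Continuing the snippet above, and assuming a standard ARSCNView outlet named sceneView (my name, not from the post):

import ARKit

// Run (or re-run) the session with the focus setting applied.
let configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = true // the default since iOS 11.3
sceneView.session.run(configuration)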
UPDATE: Starting from iOS 11.3, ARKit supports autofocus, and it's enabled by default (more info). Manual focusing is still not available.
Prior to iOS 11.3, ARKit supported neither manual focus adjustment nor autofocus.
Here is Apple's reply on the subject (Oct 2017):
ARKit does not run with autofocus enabled as it may adversely affect plane detection. There is an existing feature request to support autofocus and no need to file additional requests. Any other focus discrepancies should be filed as bug reports. Be sure to include the device model and OS version. (source)
There is another thread on the Apple forums where a developer claims he was able to adjust focus by calling the AVCaptureDevice.setFocusModeLocked(lensPosition:completionHandler:) method on the private AVCaptureDevice used by ARKit, and that it appears not to affect tracking. Though the method itself is public, ARKit's AVCaptureDevice is not, so using this hack in production would most likely result in App Store rejection.
if #available(iOS 16.0, *) {
    // This property is nil on devices that aren't equipped with an ultra-wide camera.
    if let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera {
        do {
            try device.lockForConfiguration()
            // Configure your focus mode here, e.g. a fixed lens position:
            // device.setFocusModeLocked(lensPosition: 0.8)
            // You need to set isAutoFocusEnabled on your
            // ARWorldTrackingConfiguration accordingly at the same time.
            device.unlockForConfiguration()
        } catch {
            // Handle the configuration error.
        }
    }
} else {
    // Fallback on earlier versions.
}
Use the configurableCaptureDeviceForPrimaryCamera property; it is available only on iOS 16 and later.
Documentation: ARKit › ARConfiguration › configurableCaptureDeviceForPrimaryCamera

iOS determine if VoiceOver is still talking

Is there a way to determine whether VoiceOver is currently announcing, and when it stops? I've tried UIAccessibilityVoiceOverStatusChanged, but my understanding is that this only fires when you switch VoiceOver on or off. Any help would be greatly appreciated. Thanks.
We use otherAudioIsPlaying. The problem is that some apps running in the background (some pedometer monitors, etc.) seem to turn on audio and never release it, so even though nothing is actually being spoken or played, otherAudioIsPlaying always returns 1 until you remove the other application from the background. So now not only can you not play music, but you have no idea that another application in the background will break this test. Apple really needs to provide an API to determine whether VoiceOver is currently speaking.
These are all the accessibility booleans and functions that I found in the documentation:
UIAccessibilityPostNotification
UIAccessibilityIsVoiceOverRunning
UIAccessibilityIsMonoAudioEnabled
UIAccessibilityIsClosedCaptioningEnabled
UIAccessibilityRegisterGestureConflictWithZoom
I don't think that there are any booleans to do what you are talking about.
You could use the audio session's "OtherAudioIsPlaying" property to check if another system process is using the audio hardware at the moment. It should be "true" if VoiceOver is speaking and "false" if not.
Actually this might not work properly if the user is playing music in the background. But most users running VoiceOver will usually not have any other audio enabled permanently, since it makes it harder to understand what VoiceOver is saying.
Here is a usage example:
UInt32 otherAudioIsPlaying;
UInt32 propertySize = sizeof(otherAudioIsPlaying);
AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, &propertySize, &otherAudioIsPlaying);
if (otherAudioIsPlaying) {
    // Another application is generating sound output (including VoiceOver),
    // but it might also be any other app (like the iPod app).
}
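The AudioSessionGetProperty C API above has long been deprecated; a roughly equivalent modern Swift check (with the same caveat: any background audio also trips it) might look like this:

import AVFoundation

// True if any other process (possibly VoiceOver, possibly the Music app)
// is currently producing audio output.
if AVAudioSession.sharedInstance().isOtherAudioPlaying {
    // VoiceOver may be speaking, but this cannot distinguish it
    // from any other background audio source.
}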

How to disable iOS System Sounds

I am working on an iPad app that connects with an accessory that plays sound. When the iPad is connected to the accessory, I would like to mute all system sounds but allow other sounds (iPod).
Part of the reason for this is that the accessory is such that it is intended to be used during a live performance. Clearly it would be annoying to have e-mail, alert, or any other system sound running through and amplified (crazy loud).
I have looked at using AVAudioSession (read Audio Sessions to learn more) and tried all of the audio session categories. None of these categories will mute the system sounds; they only allow you to mute application sounds (iPod), which is not useful for my purposes.
I also found docs on "System Sound Services", but this only allows you to play system sounds. There is no API here to disable system sounds while your app is running.
A final note: we have made it easy to adjust the iPad level (volume) by including the MPVolumeView, but we expect the user to want to play iPod music. If, while playing iPod music (or music from another app), an e-mail comes through, you'd be amazed how LOUD / ANNOYING that e-mail suddenly becomes when going through our accessory. It's even possible it could damage equipment. :D
It is possible to change the system sounds volume (which turns out to be the ringer, by the way) using AVSystemController. However, AVSystemController lives in the private Celestial framework. Since that framework is referenced by UIKit, it is still possible to use this class without directly referencing the framework.
Apple prohibits using private APIs, so that alone makes this a bad idea. Given my circumstances, I think they might make an exception, BUT I will likely abandon this course, since after taking it I realized that it didn't fix my problem. It does indeed mute the sounds, but as soon as I plug in to my accessory, the system sounds come out at max volume even though the ringer volume is set to 0. This leads me to believe the answer to solving my problem is in the MFi documentation.
Anyhow, here is how to change the ringer using the private framework / API (which will get your app rejected without some kind of special permission).
short answer:
[[AVSystemController sharedAVSystemController] setVolumeTo:0 forCategory:@"Ringtone"];
answer without having to directly reference the Celestial framework / AVSystemController.h:
- (void)setSystemVolumeLevelTo:(float)newVolumeLevel
{
    Class avSystemControllerClass = NSClassFromString(@"AVSystemController");
    id avSystemControllerInstance = [avSystemControllerClass performSelector:@selector(sharedAVSystemController)];
    NSString *soundCategory = @"Ringtone";
    NSInvocation *volumeInvocation = [NSInvocation invocationWithMethodSignature:
                                      [avSystemControllerClass instanceMethodSignatureForSelector:
                                       @selector(setVolumeTo:forCategory:)]];
    [volumeInvocation setTarget:avSystemControllerInstance];
    [volumeInvocation setSelector:@selector(setVolumeTo:forCategory:)];
    [volumeInvocation setArgument:&newVolumeLevel atIndex:2];
    [volumeInvocation setArgument:&soundCategory atIndex:3];
    [volumeInvocation invoke];
}
Using the MediaPlayer framework, we can set the level of the SYSTEM sound:
[[MPMusicPlayerController applicationMusicPlayer] setVolume:0];
The best you can do is encourage your users to go into airplane mode.

Turn off display in iPhone OS (iOS)

Is there a way to programmatically turn off the display in iOS? Not just turning the brightness down, but off, like the Phone app does. I am happy to use a private API, since this is for personal use.
Thanks!
You can turn off the display by enabling proximity monitoring. It will automatically turn off the screen, as in the Phone app, when you place the phone near your ear or put a finger over the IR sensor at the top of the phone.
[UIDevice currentDevice].proximityMonitoringEnabled = YES;
You can do this (obviously using private APIs, of course):
On iOS 5:
#include <stdio.h>
#include <dlfcn.h>
int (*SBSSpringBoardServerPort)() = (int (*)())dlsym(RTLD_DEFAULT, "SBSSpringBoardServerPort");
int port = SBSSpringBoardServerPort();
void (*SBDimScreen)(int _port,BOOL shouldDim) = (void (*)(int _port,BOOL shouldDim))dlsym(RTLD_DEFAULT, "SBDimScreen");
and then use
SBDimScreen(port,YES);
whenever you want to dim, and
SBDimScreen(port,NO);
whenever you want to undim.
On iOS 6:
void (*BKSDisplayServicesSetScreenBlanked)(BOOL blanked) = (void (*)(BOOL blanked))dlsym(RTLD_DEFAULT, "BKSDisplayServicesSetScreenBlanked");
and then use:
BKSDisplayServicesSetScreenBlanked(1); // 1 to dim, 0 to undim
"Dim" here means totally turn off the screen. This is what the system uses when e.g. a proximity event occurs while in a call.
The only way I know of, public or private, is using the power button.
You might look at -[UIApplication setProximitySensingEnabled:(BOOL)] or -[UIApplication setIdleTimerDisabled:YES]; this might lead to something useful.
Have you tried:
[[UIScreen mainScreen] setBrightness: yourvalue];
SO question 8936999: iPhone: How can we programmatically change the brightness of the screen?
Proximity doesn't work on all devices. There's a much simpler solution to this problem without resorting to private APIs.
Swift
UIScreen.main.wantsSoftwareDimming = true
UIScreen.main.brightness = 0.0
Without wantsSoftwareDimming, the backlight will never completely turn off.
The docs have this cautionary sentence:
The default value is false. Enabling it may cause a loss in performance.
I do not think there is any way to turn off the display (simulating the iPhone sleep button) other than changing the brightness.
This link might help.

iPhone Proximity Sensor

Can the iPhone SDK take advantage of the iPhone's proximity sensors? If so, why hasn't anyone taken advantage of them? I could picture a few decent uses.
For example, in a racing game, you could put your finger on the proximity sensor to go instead of taking up screen real estate with your thumb. Of course, if this were your only option, then iPod touch users wouldn't be able to use the application.
Does the proximity sensor tell how close you are, or just that something is in front of it?
There is a public API for this. -[UIApplication setProximitySensingEnabled:(BOOL)] will turn the feature on. BTW, it doesn't seem to be using the light sensor, because proximity sensing would tweak out in a dark room.
However, the API call basically blanks the screen when you hold the phone up to your face. Not useful for interaction, sadly.
Assuming you mean the sensor that shuts off the screen when you hold it to your ear, I'm pretty sure that is just an infrared sensor inside the ear speaker. If you start the phone app (you don't have to be making a call) and hold something to cast a shadow over the ear speaker, you can make the display shut off.
When you asked this question it was not accessible via the public API, but you can now access the sensor's state via UIDevice's proximityState property. However, it wouldn't be that useful for games, since it is only an on/off thing, not a near/far measure. Plus, it's only available on the iPhone and not the iPod touch.
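For illustration, reading that state in Swift looks like this (a minimal sketch using the current UIDevice API):

import UIKit

// Proximity is exposed only as an on/off state, not a distance.
UIDevice.current.isProximityMonitoringEnabled = true
if UIDevice.current.proximityState {
    // Something is near the sensor; the screen is usually blanked now.
}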
Evidently the proximity sensor will never turn on if the status bar is in landscape orientation.
i.e., if you call:
[UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeLeft;
You will no longer get the proximity:ON notifications.
This definitely happens on OS 3.0, I can't test it on a 2.X device since I don't have one with a proximity sensor.
This seems like a bug.
The proximity sensor works via measuring IR reflectance. If you hold the iPhone up to a webcam, you can see a small, pulsing IR LED.
There's a lot of confusion between the proximity sensor and the ambient light sensor. The iPhone has both. The Touch does not have a proximity sensor, making it a poor choice for user input. It would be a bad idea anyway since Apple isn't obligated to locate it in the same place in future devices; you aren't supposed to know or care where it is.
The proximity sensor works by pulsing an infrared LED and measuring the amount of reflectance. You can see this using your iSight camera (most digital cameras are sensitive to IR.) Just launch Photo Booth, initiate a call (or play a voicemail) on the phone and point it at your iSight camera. Note the flashing light next to the earpiece; cover it with your finger and the screen will go black.
The ambient light sensor's API is evidently private at this point.
Just to update, this is possible.
UIDevice *device = [UIDevice currentDevice];

// Turn on proximity monitoring.
[device setProximityMonitoringEnabled:YES];

// To determine if proximity monitoring is available, attempt to enable it.
// If the value of the proximityMonitoringEnabled property remains NO,
// proximity monitoring is not available.
BOOL proximitySupported = [device isProximityMonitoringEnabled];

// Register for proximity notifications.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(proximityChanged:)
                                             name:UIDeviceProximityStateDidChangeNotification
                                           object:device];
As benzado points out, you can use:
// Returns a BOOL, YES if device is proximate
[device proximityState];
There is no public API for this.
In iPhone 3.0 there is official support for the proximity sensor. Have a look at UIDevice proximityMonitoringEnabled in the docs.
If you aren't aiming for the App Store, you can read my articles here on getting access to those:
Proximity Sensor: http://iphonedevwiki.net/index.php/AppleProxShim
Ambient Light Sensor: http://iphonedevwiki.net/index.php/AppleISL29003
I've encountered this problem too. It took me a long time to figure out the real reason why the proximity sensor was not working: when the orientation is UIInterfaceOrientationLandscapeLeft or UIInterfaceOrientationLandscapeRight, the proximity sensor does not work, while in portrait mode it works well. My iPhone is an iPhone 4S (iOS SDK 5.0).
Those proximity sensors are basically a matrix of conductors. The vertical "wires" are tracks on one side of a thin sheet of insulator; the horizontal ones are on the other side. The intersections function as capacitors. Your finger carries an electrostatic charge, so the capacitance of each junction varies with proximity. FETs amplify the signal, and biasing sets a threshold. In practice the circuit is more complex than that, because it has to detect a relative change and reject noise.
But anyway, what the sensor grid tells you is that a field effect has been sensed, and that the field effect is characteristic of an object about the size of a fingertip resting on the surface of the display. The centroid of the capacitive disturbance is computed (probably by hardware) and the coordinates are (presumably) reported as numbers on a port, most likely brought to the attention of the device OS by an interrupt. In something as sexy as an iPhone there's probably a buffer of the last dozen or so positions so it can work out direction and speed. Probably these are also computed by hardware and presented as numbers on the same port.
#Dipak Patel & #Coderer
You can download working code at
http://spazout.com/google_cheats_independent_iphone_developers_screwed
It has a working implementation of proximityStateChanged a undocumented method in UIApplication.
Hope this helps.
To turn the screen off, it's conceivable that more than one sensor is used to figure out whether the screen should be turned off. The IR proximity sensor described by Cryptognome, in conjunction with the touch-screen sensor described by Peter Wone, could work out whether the iPhone is being held close to your face (or something else with a slight electric charge) or just very close to something inanimate.