iPhone Proximity Sensor

Can the iPhone SDK take advantage of the iPhone's proximity sensor? If so, why hasn't anyone taken advantage of it? I could picture a few decent uses.
For example, in a racing game, you could put your finger on the proximity sensor to go instead of taking up screen real estate with your thumb. Of course, if this were the only option, iPod touch users wouldn't be able to use the application.
Does the proximity sensor tell how close you are, or just that something is in front of it?

There is a public API for this. -[UIApplication setProximitySensingEnabled:(BOOL)] will turn the feature on. BTW, it doesn't seem to be using the ambient light sensor, because proximity sensing would misbehave in a dark room.
However, the API call basically just blanks the screen when you hold the phone up to your face. Not useful for interaction, sadly.
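For reference, that older call looked roughly like this (a one-line sketch; it was later superseded by the UIDevice API covered further down):

// Early-SDK approach: enable proximity sensing app-wide.
// This only blanks the screen when something is near the sensor; it does not report a distance.
[[UIApplication sharedApplication] setProximitySensingEnabled:YES];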

Assuming you mean the sensor that shuts off the screen when you hold it to your ear, I'm pretty sure that is just an infrared sensor inside the ear speaker. If you start the phone app (you don't have to be making a call) and hold something to cast a shadow over the ear speaker, you can make the display shut off.
When you asked this question it was not accessible via the public API. You can now access the sensor's state via UIDevice's proximityState property. However, it wouldn't be that useful for games, since it is only an on/off thing, not a near/far measure. Plus, it's only available on the iPhone and not the iPod touch.

The proximity sensor works via measuring IR reflectance. If you hold the iPhone up to a webcam, you can see a small, pulsing IR LED.

There's a lot of confusion between the proximity sensor and the ambient light sensor. The iPhone has both; the iPod touch does not have a proximity sensor, which makes the proximity sensor a poor choice for user input. It would be a bad idea anyway, since Apple isn't obligated to locate it in the same place in future devices; you aren't supposed to know or care where it is.
The proximity sensor works by pulsing an infrared LED and measuring the amount of reflectance. You can see this using your iSight camera (most digital cameras are sensitive to IR.) Just launch Photo Booth, initiate a call (or play a voicemail) on the phone and point it at your iSight camera. Note the flashing light next to the earpiece; cover it with your finger and the screen will go black.
The ambient light sensor's API is evidently private at this point.

Just to update, this is possible.
UIDevice *device = [UIDevice currentDevice];

// Turn on proximity monitoring
[device setProximityMonitoringEnabled:YES];

// To determine whether proximity monitoring is available, attempt to enable it.
// If the value of the proximityMonitoringEnabled property remains NO, proximity
// monitoring is not available on this device.
BOOL proximitySupported = [device isProximityMonitoringEnabled];

// Register for proximity notifications
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(proximityChanged:)
                                              name:UIDeviceProximityStateDidChangeNotification
                                            object:device];
As benzado points out, you can use:
// Returns a BOOL, YES if device is proximate
[device proximityState];
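A matching notification handler might look like this (a minimal sketch; proximityChanged: is just the selector registered above):

- (void)proximityChanged:(NSNotification *)notification {
    UIDevice *device = notification.object;
    if (device.proximityState) {
        // Something is close to the sensor (the system has blanked the screen)
    } else {
        // Nothing near the sensor
    }
}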

There is no public API for this.

In iPhone OS 3.0 there is official support for the proximity sensor. Have a look at UIDevice's proximityMonitoringEnabled property in the docs.

If you aren't aiming for the App Store, you can read my articles here on getting access to those:
Proximity Sensor: http://iphonedevwiki.net/index.php/AppleProxShim
Ambient Light Sensor: http://iphonedevwiki.net/index.php/AppleISL29003

Evidently the proximity sensor will never turn on if the status bar is in landscape orientation.
i.e. if you call:
[UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeLeft;
you will no longer get proximity:ON notifications.
This definitely happens on OS 3.0; I can't test it on a 2.x device since I don't have one with a proximity sensor.
This seems like a bug.

I've encountered this problem too. It took me a long time to figure out the real reason why the proximity sensor was not working. When the orientation is UIInterfaceOrientationLandscapeLeft or UIInterfaceOrientationLandscapeRight, the proximity sensor does not work, while in portrait mode it works well. My iPhone is an iPhone 4S (iOS SDK 5.0).

Those proximity sensors are basically a matrix of conductors. The vertical "wires" are tracks on one side of a thin sheet of insulator; the horizontal ones are on the other side. The intersections function as capacitors. Your finger carries an electrostatic charge, so the capacitance of each junction varies with proximity. FETs amplify the signal and biasing sets a threshold. In practice the circuit is more complex than that because it has to detect a relative change and reject noise.
But anyway, what the sensor grid tells you is that a field effect has been sensed, and that field effect is characteristic of an object about the size of a fingertip resting on the surface of the display. The centroid of the capacitive disturbance is computed (probably by hardware) and the coordinates are (presumably) reported as numbers on a port, most likely brought to the attention of the device OS by an interrupt. In something as sexy as an iPhone there's probably a buffer of the last dozen or so positions so it can work out direction and speed. Probably these are also computed by hardware and presented as numbers on the same port.

@Dipak Patel & @Coderer
You can download working code at
http://spazout.com/google_cheats_independent_iphone_developers_screwed
It has a working implementation of proximityStateChanged, an undocumented method in UIApplication.
Hope this helps.

To turn the screen off, it's conceivable that more than one sensor is used to figure out whether the screen should be turned off or not. The IR proximity sensor described by Cryptognome, in conjunction with the touch-screen sensor described by Peter Wone, could work out whether the iPhone is being held close to your face (or something else with a slight electric charge) or whether it's just very close to something inanimate.

Related

iPhone headphone output - mono control left / right ear

I'm working on an iOS 6 and above GPS navigation app for creating and following routes. At the moment I've integrated the OpenEARS framework to provide text-to-speech directions to the user.
I have set up an AVAudioSession and overridden the audio category to headphones. I'm looking for some advice on limiting my audio output to the left or right headphone channel depending on which physical direction the user must travel to reach the destination.
This is my first foray into audio on iOS, and I'm happy to change from OpenEARS if someone knows how to accomplish this with another TTS library.
Thanks, Ben
For anyone it may help in the future...
After spending hours looking at Core Audio and AudioToolbox methods for balancing, I stumbled over the pan property on AVAudioPlayer. OpenEARS, and ultimately Flite, uses AVAudioPlayer to play the converted audio file. Cheers!
From the AVAudioPlayer documentation:
The audio player's stereo pan position.
@property float pan
Discussion: By setting this property you can position a sound in the stereo field. A value of –1.0 is full left, 0.0 is center, and 1.0 is full right.
Availability: Available in iOS 4.0 and later.
Declared in: AVAudioPlayer.h
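As a rough illustration of the pan property itself (using a standalone AVAudioPlayer; the file path and pan value are placeholders, not part of the original answer):

#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
NSURL *speechURL = [NSURL fileURLWithPath:@"/path/to/spoken_direction.wav"]; // hypothetical output of the TTS step
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:speechURL error:&error];

// -1.0 = full left, 0.0 = center, 1.0 = full right
player.pan = -1.0f; // e.g. the next turn is to the left
[player play];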

Accessing iOS 6 new APIs for camera exposure and shutter speed

On Apple's iOS 6.0 feature page, it used to say
Take advantage of the built-in camera’s advanced features. New APIs let you control focus, exposure, and region of interest. You can also access and display faces with face detection APIs, and leverage hardware-enabled video stabilization.
This text has since been removed, and I can't find new methods in the API for controlling exposure. In the AVCaptureDevice class, under "Exposure Settings", there is no new property or method for iOS 6.0. Does anyone know where I can find the new exposure features in the API?
It's true that there is an -exposureMode property on AVCaptureDevice, but that's only for setting the mode (off/auto/continuous) and not the actual f-stop, shutter speed, or ISO. Camera apps that provide "exposure" control all seem to do it through post-processing.
However, it seems there are undocumented APIs in the framework to do this. Check out the full headers for AVCaptureDevice.h (via a class-dump) and note the following methods:
- (void)setManualExposureSupportEnabled:(BOOL)arg1;
- (BOOL)isManualExposureSupportEnabled;
- (void)setExposureGain:(float)arg1;
- (float)exposureGain;
- (void)setExposureDuration:(struct { long long x1; int x2; unsigned int x3; long long x4; })arg1;
- (struct { long long x1; int x2; unsigned int x3; long long x4; })exposureDuration;
- (void)setExposureMode:(int)arg1;
- (int)exposureMode;
- (BOOL)isExposureModeSupported:(int)arg1;
My guess is that gain is the equivalent of f-stop (the aperture is fixed), and duration is shutter speed. I wonder if these are used for the iPhone 5's low-light boost mode.
You can also use otool to poke around and try to piece together the symbols. There's likely a new constant in exposureMode for enabling manual control, and exposureDuration seems like it has flags too. When calling these, make sure to use the new -isExposureModeSupported: and also call -respondsToSelector: to check compatibility.
As always, using private APIs is frowned upon by Apple and is cause for rejection from the App Store. There might be ways around this, such as hiding the calls using -performSelector: or objc_msgSend with ROT13'd strings or something, since I'm pretty sure they only do static analysis on the app binary.
I've managed to 'trick' the camera into running a shorter exposure time, but I suspect it will only be of use to those doing similar (macro) image captures. I first set up the AVCaptureDevice to use AVCaptureExposureModeContinuousAutoExposure and set the flash to torch mode. I then call unlockForConfiguration and set up a key-value observer to watch for adjustingExposure to finish. I then re-lock the device, flip to AVCaptureExposureModeLocked, and turn off the torch.
This has the effect of brute-force setting a shorter shutter speed than the camera would select for the un-illuminated scene. By playing with the torch level I can set any relative shutter speed value I want (it would be best of course to leave the torch on, but in my application it produces glare on the subject).
Again, this only really works when your object distance is very close (less than, say, 6 inches), but it has allowed me to eliminate hand-shake blurring in my close-up images. The downside is that the images are darker since I don't have a way of spoofing the camera gain, but that's not a problem in my particular application.
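A rough sketch of that sequence, assuming self.captureDevice holds the AVCaptureDevice already in use (the method name primeShortExposure and the error handling are illustrative only):

- (void)primeShortExposure {
    AVCaptureDevice *device = self.captureDevice;
    if ([device lockForConfiguration:nil]) {
        // Let continuous auto-exposure meter the torch-lit scene.
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
            device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        if ([device isTorchModeSupported:AVCaptureTorchModeOn])
            device.torchMode = AVCaptureTorchModeOn;
        [device unlockForConfiguration];
    }
    // Watch for the auto-exposure adjustment to finish (remember to removeObserver: later).
    [device addObserver:self forKeyPath:@"adjustingExposure"
                options:NSKeyValueObservingOptionNew context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    AVCaptureDevice *device = (AVCaptureDevice *)object;
    if ([keyPath isEqualToString:@"adjustingExposure"] && ![device isAdjustingExposure]) {
        if ([device lockForConfiguration:nil]) {
            // Freeze the (now shorter) exposure, then turn the torch back off.
            device.exposureMode = AVCaptureExposureModeLocked;
            device.torchMode = AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    }
}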
It looks like they've updated that linked text—there's no mention of new APIs for exposure:
Use powerful new features of the built-in camera. New APIs support real-time video stabilization, an improved LED flash, and face detection and display. You can get reports of dropped frames during capture and leverage new utilities to map UI touches to focus and exposure commands. And apps that support iPhone 5 can take advantage of low light boost mode.
There is an opt-in low-light boost mode for iPhone 5, detailed here by Jim Rhoades (and in this developer forum post, log-in required).
As a follow-up to Michael Grinich's excellent information, I found that there is an order dependency on some of the calls in the private API. To use "manual" exposure controls, you have to enable them before you set the mode, like so:
#define AVCaptureExposureModeManual 3

NSError *error = nil;
if ([captureDevice lockForConfiguration:&error]) {
    captureDevice.manualExposureSupportEnabled = YES;
    if ([captureDevice isExposureModeSupported:AVCaptureExposureModeManual]) {
        captureDevice.exposureMode = AVCaptureExposureModeManual;
        captureDevice.exposureGain = ...;
        captureDevice.exposureDuration = {...};
    }
    [captureDevice unlockForConfiguration];
}
All of this is demonstrated in iOS-ManualCamera.
Starting with iOS 8.0, this is now finally possible.
See setExposureModeCustomWithDuration etc. in the Apple documentation.
Here is an article discussing how to use the APIs.
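A minimal sketch of the iOS 8 API mentioned above (the duration and ISO values here are arbitrary examples; real code should stay within the ranges exposed by the device's activeFormat):

#import <AVFoundation/AVFoundation.h>

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    CMTime duration = CMTimeMake(1, 100);     // 1/100 s shutter speed (example value)
    float iso = device.activeFormat.minISO;   // lowest supported ISO, as an example
    [device setExposureModeCustomWithDuration:duration
                                          ISO:iso
                            completionHandler:^(CMTime syncTime) {
        // The custom exposure settings have been applied to the capture stream.
    }];
    [device unlockForConfiguration];
}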

iPhone motion data blocked by Compass Calibration setting

My app has an augmented reality camera view which uses CMMotionManager to find heading relative to true north. However, I am finding that if the user switches OFF Location Services / System Settings / Compass Calibration, then the device motion data stops returning sensible values (in particular the gravity vector values), and makes the app useless. Quite a few people do this because they believe doing so saves battery life.
Does anyone know exactly what this setting does to the device or to CMMotionManager?
How can my app determine what it is set to?
Does setting it OFF necessarily mean that CMMotionManager won't work?
You can check the true heading value in your heading delegate. If Compass Calibration is off in Settings, then this value will stay static and equal to -1 every time. So you can check in the heading delegate:
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
    if (newHeading.trueHeading == -1) {
        // Compass Calibration is off in Settings
    } else {
        // Calibration is on
    }
}
Hope this works, as it works for me.

Using Device Motion instead of the Accelerometer, Gyroscope and Magnetometer

If I'm on iOS 4, not using a specific handler, and needing the same update interval, what is the difference between using Device Motion and using the specific accelerometer, gyroscope and magnetometer updates?
What happens if one of those three sensors is not available on the device when I use Device Motion?
I mean, what will deviceMotionAvailable return if one of those sensors is not available?
In iOS 4 the magnetometer is not yet included in the device motion API but is handled by CLLocationManager (this changed in iOS 5). So if you have a gyro and an accelerometer, deviceMotionAvailable will return true independent of the magnetometer. On the other hand, if the gyro is missing you will always get false and you need to stay with accelerometerData.
Because Device Motion has one timestamp for both sensors, you will get reliably interpolated values for both of them. Otherwise Device Motion wouldn't be able to do sensor fusion, which is its main advantage and the reason it is the preferred way.
You cannot rely on a fixed frequency for CLLocationManager; didUpdateHeading is called whenever the system 'thinks' it should be. To normalise the different time coordinates between CLLocationManager and CMDeviceMotion, you can have a look at NSTimeInterval to unix timestamp.
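A minimal sketch of the availability check described above (the fallback branch and the update interval are just illustrative choices):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *motionManager = [[CMMotionManager alloc] init];

if (motionManager.deviceMotionAvailable) {
    // Gyro + accelerometer are present, so sensor fusion is available.
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error) {
        // attitude, rotationRate and userAcceleration all share one timestamp.
        NSLog(@"roll = %f", motion.attitude.roll);
    }];
} else if (motionManager.accelerometerAvailable) {
    // No gyro on this device: fall back to raw accelerometer data.
    motionManager.accelerometerUpdateInterval = 1.0 / 60.0;
    [motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                         withHandler:^(CMAccelerometerData *data, NSError *error) {
        NSLog(@"x = %f", data.acceleration.x);
    }];
}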

Turning off functionality depending on device: iPhone or iPod touch

I have what I thought was a relatively simple question, but I cannot find an answer to it yet. I have an iPhone app that uses GPS on one of its screens. I want to disable this screen in code when the app loads, so that it is disabled when an iPod touch is being used. This way the app can still be useful on an iPod touch, since there is a lot of functionality an iPod touch user can use.
Thanks.
You can get there with @Aaron's answer, but that's not the way to do it. Use [CLLocationManager locationServicesEnabled] to tell whether you can determine the user's location. This is a lot more robust than making decisions based on the device model.
To get the device info:
http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIDevice_Class/Reference/UIDevice.html
NSString *deviceType = [UIDevice currentDevice].model;
I think if you are just checking for GPS, then you will need to query CLLocationManager to see whether location services are on or off.
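A minimal sketch combining both suggestions (the shouldEnableGPSScreen method name is just illustrative; the capability check is the one that matters):

#import <CoreLocation/CoreLocation.h>
#import <UIKit/UIKit.h>

- (BOOL)shouldEnableGPSScreen {
    // The model string is useful mainly for logging/diagnostics, e.g. @"iPhone" or @"iPod touch".
    NSString *deviceType = [UIDevice currentDevice].model;
    NSLog(@"Running on: %@", deviceType);

    // YES when location services are available and enabled on this device.
    return [CLLocationManager locationServicesEnabled];
}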