iPhone motion data blocked by Compass Calibration setting

My app has an augmented reality camera view which uses CMMotionManager to find the heading relative to true north. However, I am finding that if the user switches OFF Location Services > System Services > Compass Calibration, the device motion data stops returning sensible values (in particular the gravity vector), which makes the app useless. Quite a few people switch it off because they believe doing so saves battery life.
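For context, here is a minimal sketch (Swift) of the kind of setup described above; the update interval is arbitrary and the exact configuration in the app may of course differ:
import CoreMotion

let motionManager = CMMotionManager()
if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    // The xTrueNorthZVertical reference frame needs the magnetometer plus
    // location data, which is presumably where Compass Calibration comes in.
    motionManager.startDeviceMotionUpdates(using: .xTrueNorthZVertical, to: .main) { motion, _ in
        guard let motion = motion else { return }
        print("gravity:", motion.gravity)   // CMAcceleration (x, y, z)
        print("heading:", motion.heading)   // degrees from true north (iOS 11+)
    }
}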
Does anyone know exactly what this setting does to the device or to CMMotionManager?
How can my app determine what it is set to?
Does setting it OFF necessarily mean that CMMotionManager won't work?

You can check the true heading value in your heading delegate. If Compass Calibration is off in Settings, this value will always be -1. So you can check in the heading delegate:
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    if (newHeading.trueHeading == -1)
    {
        // Compass Calibration in Settings is off
    }
    else
    {
        // Compass Calibration is on
    }
}
Hope this works, as it does for me.

Adjust camera focus in ARKit

I want to adjust the device's physical camera focus while in augmented reality. (I'm not talking about the SCNCamera object.)
In an Apple Dev forum post, I've read that autofocus would interfere with ARKit's object detection, which makes sense to me.
Now, I'm working on an app where the users will be close to the object they're looking at. The camera's default focus makes everything look very blurry when you are closer to an object than about 10 cm.
Can I adjust the camera's focus before initializing the scene, or preferably while in the scene?
Update 20.01.2018:
Apparently, there's still no solution to this problem. You can read more about it in this reddit post and this developer forum post covering private API workarounds and other (unhelpful) info.
Update 25.01.2018:
@AlexanderVasenin provided a useful update pointing to Apple's documentation. It shows that as of iOS 11.3, ARKit supports not just focusing but also autofocusing.
See my usage sample below.
As stated by Alexander, iOS 11.3 brings autofocus to ARKit.
The corresponding documentation site shows how it is declared:
var isAutoFocusEnabled: Bool { get set }
You can access it this way:
let configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = true // or false
However, as it is true by default, you should not even have to set it manually, unless you choose to opt out.
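In practice, opting out and running the session might look like this (a minimal sketch; it assumes a sceneView of type ARSCNView already set up in your view controller):
import ARKit

let configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = false   // opt out of the default autofocus
sceneView.session.run(configuration)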
UPDATE: Starting from iOS 11.3 ARKit supports autofocusing, and it's enabled by default (more info). Manual focusing is still not available.
Prior to iOS 11.3, ARKit supported neither manual focus adjustment nor autofocusing.
Here is Apple's reply on the subject (Oct 2017):
ARKit does not run with autofocus enabled as it may adversely affect plane detection. There is an existing feature request to support autofocus and no need to file additional requests. Any other focus discrepancies should be filed as bug reports. Be sure to include the device model and OS version. (source)
There is another thread on the Apple forums where a developer claims he was able to adjust focus by calling the AVCaptureDevice.setFocusModeLocked(lensPosition:completionHandler:) method on the private AVCaptureDevice used by ARKit, and it appears this does not affect tracking. Though the method itself is public, ARKit's AVCaptureDevice is not, so using this hack in production would most likely result in App Store rejection.
if #available(iOS 16.0, *) {
    // This property is nil on devices that aren't equipped with an ultra-wide camera.
    if let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera {
        do {
            try device.lockForConfiguration()
            // Configure your focus mode here.
            // You need to change ARWorldTrackingConfiguration().isAutoFocusEnabled at the same time.
            device.unlockForConfiguration()
        } catch {
            // Handle the failure to lock the device for configuration.
        }
    }
} else {
    // Fallback on earlier versions
}
Use the configurableCaptureDeviceForPrimaryCamera property, which is only available on iOS 16 or later.
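For illustration, one possible way to fill in the "configure your focus mode" placeholder above (a hedged sketch; device is the capture device obtained from configurableCaptureDeviceForPrimaryCamera, and the lens position value is arbitrary):
do {
    try device.lockForConfiguration()
    if device.isLockingFocusWithCustomLensPositionSupported {
        // Lock the lens at a fixed position; pick a value in 0...1 that suits your
        // scene (see AVCaptureDevice.lensPosition for how the range is defined).
        device.setFocusModeLocked(lensPosition: 0.5, completionHandler: nil)
    }
    device.unlockForConfiguration()
} catch {
    print("Could not lock the capture device for configuration: \(error)")
}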
Documentation: ARKit > Configuration Objects > ARConfiguration > configurableCaptureDeviceForPrimaryCamera

How to get the amount of available-light in the area surrounding the iPhone using Objective C? [duplicate]

I notice on my iPhone that after a few seconds in direct sunlight, the screen adjusts to become brighter, dimmer, etc. I was wondering if there is a way to interact with this sensor?
I have an application which is used outside. When you go into direct light, it becomes very difficult to see the screen for a few moments, before it adjusts. And even then, it's not always as bright as I'd like it to be. I would like to implement a high-contrast skin for outdoor viewing, and a low-contrast one for indoor viewing.
Is it possible to read light sensor data, and if so, how do I extract these sensor values?
I would assume there is a light sensor, however, as the camera knows when to use the flash.
On the other hand, here is a different idea (maybe a silly one): using the brightness of the device's screen you can get some measure of the external light conditions, from roughly 0.12 (dark) to 0.99 (bright).
The next line will get that value; give it a try, shining light on and off over the device to see different values.
NSLog(#"Screen Brightness: %f",[[UIScreen mainScreen] brightness]);
Obviously the Automatic Brightness feature must be turned on in Settings for this to work.
To read the ambient light sensor data, you need to use IOHID in the IOKit framework.
http://iphonedevwiki.net/index.php/AppleISL29003
http://iphonedevwiki.net/index.php/IOKit.framework
However, this requires private headers, so if you use it, Apple probably won't let your app into the app store.
I continually ask the iOS forums whether there will be support for ambient light sensor readings in the future, but to no avail.
You can actually use the camera to do this, which is independent of the user's screen brightness settings (and works even if Automatic Brightness is OFF).
You read the Brightness Value from the video frames' metadata as I explain in this Stack Overflow answer.
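Here is a rough sketch of that approach in Swift (the CameraLightMeter class name is illustrative; it assumes an AVCaptureSession is already configured to deliver video frames to this delegate):
import AVFoundation
import ImageIO

final class CameraLightMeter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Pull the EXIF metadata attached to the frame and read BrightnessValue.
        guard let attachments = CMCopyDictionaryOfAttachments(allocator: nil,
                                                              target: sampleBuffer,
                                                              attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any],
              let exif = attachments[kCGImagePropertyExifDictionary as String] as? [String: Any],
              let brightness = exif[kCGImagePropertyExifBrightnessValue as String] as? Double
        else { return }
        // Larger values mean a brighter scene; thresholds have to be tuned empirically.
        print("EXIF BrightnessValue: \(brightness)")
    }
}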
Try using GSEventSetBacklightLevel(), which requires <GraphicsServices/GraphicsServices.h>. This is how one can programmatically adjust the brightness levels. There is also a get variant, so I think that may have the information you're after.
For Swift 5, here is how to use brightness-change detection, which indirectly gives you the luminosity of the surroundings:
/// A view controller (you can use any UIView or AnyObject)
class MyViewController: UIViewController {
    /// Remove observers on deinit
    deinit {
        removeObservers()
    }

    // MARK: - Observer management helpers

    /// Add my observers to the view controller
    func addObservers() {
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(onScreenBrightnessChanged(_:)),
                                               name: UIScreen.brightnessDidChangeNotification,
                                               object: nil)
    }

    /// Clean up observers
    func removeObservers() {
        NotificationCenter.default.removeObserver(self)
    }

    /// Set up the view
    override func viewDidLoad() {
        super.viewDidLoad()
        // Add my observers to the view controller
        addObservers()
    }

    /// Handles brightness changes
    @objc func onScreenBrightnessChanged(_ sender: Notification) {
        // Tweak as needed: 0.5 is a good value for me
        let isDark = UIScreen.main.brightness < 0.5 // brightness is in 0...1
        // Do whatever you want with the `isDark` flag: here I turn the headlights
        // on when it's dark (`vehicle` stands for whatever your app controls)
        vehicle.turnOnTheHeadlights(isDark)
    }
}
For iOS 14 and above, Apple has provided SensorKit (https://developer.apple.com/documentation/sensorkit/srsensor/3377673-ambientlightsensor ) for explicit access to all kinds of sensors and system logs (call logs, message logs, etc.). In addition to the raw lux value, you can also get the chromaticity of the ambient light and the orientation relative to the device sensor.
(From https://developer.apple.com/documentation/sensorkit/srambientlightsample )
Measuring Light Level

var chromaticity: SRAmbientLightSample.Chromaticity
A coordinate pair that describes the sample's light brightness and tint.

struct SRAmbientLightSample.Chromaticity
A coordinate pair that describes light brightness and tint.

var lux: Measurement<UnitIlluminance>
An object that describes the sample's luminous flux.

var placement: SRAmbientLightSample.SensorPlacement
The light's location relative to the sensor.

enum SRAmbientLightSample.SensorPlacement
Directional values that describe light-source location with respect to the sensor.
However, you need to request approval from Apple for such an app to be accepted and published on the App Store.
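For reference, a heavily hedged sketch of the SensorKit flow in Swift (it requires the SensorKit reader entitlement and Apple's approval, samples only become available after a holding period, and you should check the documentation for the exact delegate callbacks):
import SensorKit

final class AmbientLightReader: NSObject, SRSensorReaderDelegate {
    private let reader = SRSensorReader(sensor: .ambientLightSensor)

    func start() {
        reader.delegate = self
        SRSensorReader.requestAuthorization(sensors: [.ambientLightSensor]) { error in
            guard error == nil else { return }
            self.reader.startRecording()   // begin logging samples to the SensorKit store
        }
    }

    // Delegate callback delivering fetched samples in response to an SRFetchRequest.
    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        if let sample = result.sample as? SRAmbientLightSample {
            print("Ambient light:", sample.lux, "placement:", sample.placement)
        }
        return true
    }
}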

Why does my heading not change using Core Location?

I'm using a WiFi iPad 1st gen with iOS 5.1.1
I have set up Core Location similar to this blog post:
http://blog.objectgraph.com/index.php/2012/01/10/how-to-create-a-compass-in-iphone/
In:
- (void)locationManager:(CLLocationManager*)manager
didUpdateHeading:(CLHeading*)heading
I log the true heading value:
NSLog(#"heading: %f", manager.heading.trueHeading);
The result is always "true heading: -1.000000" no matter which direction I point the iPad.
I've rotated the device on all 3 axis and enabled location services in settings.
Any ideas on why this doesn't work? Does heading reporting not work on an iPad 1st gen?
Here is what helped in my case:
Select Location Services (under Privacy settings)
Select System Services at the bottom of the list of apps (in Location Services)
Turn Compass Calibration ON
It seems that this way, the trueHeading values no longer turn out to be -1!
To get valid heading data you must also configure the CLLocationManager to deliver location updates. From the Apple documentation:
Note: If you want heading objects to contain valid data for the
trueHeading property, your location manager object should also be
configured to deliver location updates. You can start the delivery of
these updates by calling the location manager object’s
startUpdatingLocation method.
Thus call startUpdatingLocation
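In modern Swift, a minimal sketch of that configuration (assuming self conforms to CLLocationManagerDelegate):
import CoreLocation

let locationManager = CLLocationManager()
locationManager.delegate = self                   // self: CLLocationManagerDelegate
locationManager.requestWhenInUseAuthorization()   // required on iOS 8 and later
locationManager.startUpdatingLocation()           // needed for valid trueHeading values
locationManager.startUpdatingHeading()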
Note that 'heading' refers to the direction a defined device axis is pointing. The 'orientation', which you can read with [UIDevice currentDevice].orientation, tells you the tilt of the device.
The -1 means the compass heading is invalid. You probably need to perform compass calibration.

iphone - Using the Device motion instead of Accelerometer, Gyroscope and Magnetometer

If I'm on iOS 4, not using a specific handler, and needing the same update interval, what is the difference between using Device Motion and the specific Accelerometer, Gyroscope and Magnetometer updates?
What happens if one of those three sensors is not available on the device when I use Device Motion?
I mean, what will deviceMotionAvailable return if one of those sensors is not available?
In iOS 4 the magnetometer is not yet included in the device motion API but is handled by CLLocationManager (this changed in iOS 5). So if you have a gyro and an accelerometer, deviceMotionAvailable will return true independent of the magnetometer. On the other hand, if the gyro is missing you will always get false and you need to stay with accelerometerData.
Because Device Motion carries one timestamp for both sensors, you get reliable, interpolated values for both. Without that, Device Motion wouldn't be able to do sensor fusion, which is the main advantage and the reason this is the preferred way.
You cannot rely on a fixed frequency for CLLocationManager. didUpdateHeading is called whenever the system 'thinks' it should be. To normalise the different time coordinates between CLLocationManager and CMDeviceMotion, you can have a look at NSTimeInterval to unix timestamp.
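Here is a sketch of that availability check and fallback (in Swift, though the question targets iOS 4; the update intervals are arbitrary):
import CoreMotion

let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.accelerometerUpdateInterval = 1.0 / 60.0

if motionManager.isDeviceMotionAvailable {
    // Gyro and accelerometer present: fused, interpolated values with one timestamp.
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        print("gravity:", motion.gravity, "rotation:", motion.rotationRate)
    }
} else if motionManager.isAccelerometerAvailable {
    // No gyro: stay with raw accelerometer data.
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let data = data else { return }
        print("acceleration:", data.acceleration)
    }
}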

iPhone Proximity Sensor

Can the iPhone SDK take advantage of the iPhone's proximity sensors? If so, why hasn't anyone taken advantage of them? I could picture a few decent uses.
For example, in a racing game, you could put your finger on the proximity sensor to go instead of taking up screen real-estate with your thumb. Of course though, if this was your only option, then iPod touch users wouldn't be able to use the application.
Does the proximity sensor tell how close you are, or just that something is in front of it?
There is a public API for this. -[UIApplication setProximitySensingEnabled:(BOOL)] will turn the feature on. BTW, it doesn't seem to be using the light sensor, because proximity sensing would tweak out in a dark room.
However, the API call basically blanks the screen when you hold the phone up to your face. Not useful for interaction, sadly.
Assuming you mean the sensor that shuts off the screen when you hold it to your ear, I'm pretty sure that is just an infrared sensor inside the ear speaker. If you start the phone app (you don't have to be making a call) and hold something to cast a shadow over the ear speaker, you can make the display shut off.
When you asked this question it was not accessible via the public API. You can now access the sensor's state via UIDevice's proximityState property. However, it wouldn't be that useful for games, since it is only an on/off thing, not a near/far measure. Plus, it's only available on the iPhone and not the iPod touch.
The proximity sensor works via measuring IR reflectance. If you hold the iPhone up to a webcam, you can see a small, pulsing IR LED.
There's a lot of confusion between the proximity sensor and the ambient light sensor. The iPhone has both. The Touch does not have a proximity sensor, making it a poor choice for user input. It would be a bad idea anyway since Apple isn't obligated to locate it in the same place in future devices; you aren't supposed to know or care where it is.
The proximity sensor works by pulsing an infrared LED and measuring the amount of reflectance. You can see this using your iSight camera (most digital cameras are sensitive to IR.) Just launch Photo Booth, initiate a call (or play a voicemail) on the phone and point it at your iSight camera. Note the flashing light next to the earpiece; cover it with your finger and the screen will go black.
The ambient light sensor's API is evidently private at this point.
Just to update, this is possible.
UIDevice *device = [UIDevice currentDevice];

// Turn on proximity monitoring
[device setProximityMonitoringEnabled:YES];

// To determine if proximity monitoring is available, attempt to enable it.
// If the value of the proximityMonitoringEnabled property remains NO, proximity
// monitoring is not available.

// Detect whether device supports proximity monitoring
BOOL proxySupported = [device isProximityMonitoringEnabled];

// Register for proximity notifications
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(proximityChanged:)
                                              name:UIDeviceProximityStateDidChangeNotification
                                            object:device];
As benzado points out, you can use:
// Returns a BOOL, YES if device is proximate
[device proximityState];
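For completeness, the same thing in Swift (a sketch; the ProximityObserver class name is just illustrative):
import UIKit

final class ProximityObserver {
    private var token: NSObjectProtocol?

    func start() {
        let device = UIDevice.current
        device.isProximityMonitoringEnabled = true   // stays false on hardware without the sensor (e.g. iPod touch)
        guard device.isProximityMonitoringEnabled else { return }

        token = NotificationCenter.default.addObserver(forName: UIDevice.proximityStateDidChangeNotification,
                                                       object: device,
                                                       queue: .main) { _ in
            // proximityState is a simple on/off flag, not a distance measurement.
            print("Something near the sensor: \(device.proximityState)")
        }
    }

    deinit {
        if let token = token { NotificationCenter.default.removeObserver(token) }
    }
}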
There is no public API for this.
In iPhone OS 3.0 there is official support for the proximity sensor. Have a look at UIDevice's proximityMonitoringEnabled property in the docs.
If you aren't aiming for the AppStore, you can read my articles here on getting access to those:
Proximity Sensor: http://iphonedevwiki.net/index.php/AppleProxShim
Ambient Light Sensor: http://iphonedevwiki.net/index.php/AppleISL29003
Evidently the proximity sensor will never turn on if the status bar is in landscape orientation.
i.e. if you call:
[UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeLeft;
You will no longer get proximity:ON notifications.
This definitely happens on OS 3.0, I can't test it on a 2.X device since I don't have one with a proximity sensor.
This seems like a bug.
I've encountered this problem too. It took me a long time to figure out the real reason why the proximity sensor was not working. When the orientation is UIInterfaceOrientationLandscapeLeft or UIInterfaceOrientationLandscapeRight, the proximity sensor does not work; in portrait mode it works well. My iPhone is an iPhone 4S (iOS SDK 5.0).
Those proximity sensors are basically a matrix of conductors. The vertical "wires" are tracks on one side of a thin sheet of insulator, the horizontal ones are on the other side. The intersections function as capacitors. Your finger carries an electrostatic charge, so capacitance of each junction varies with proximity. FETs amplify the signal and biasing sets a threshold. In practice the circuit is more complex than that because it has to detect a relative change and reject noise.
But anyway, what the sensor grid tells you is that a field effect has been sensed, and that field effect is characteristic of an object about the size of a fingertip resting on the surface of the display. The centroid of the capacitive disturbance is computed (probably by hardware) and the coordinates are (presumably) reported as numbers on a port, most likely brought to the attention of the device OS by an interrupt. In something as sexy as an iPhone there's probably a buffer of the last dozen or so positions so it can work out direction and speed. Probably these are also computed by hardware and presented as numbers on the same port.
@Dipak Patel & @Coderer
You can download working code at
http://spazout.com/google_cheats_independent_iphone_developers_screwed
It has a working implementation of proximityStateChanged, an undocumented method in UIApplication.
Hope this helps.
To turn the screen off, it's conceivable that more than one sensor is used to figure out whether the screen should be turned off or not. The IR proximity sensor described by Cryptognome, in conjunction with the touch screen sensor described by Peter Wone, could work out whether the iPhone is being held close to your face (or something else with a slight electric charge) or is just very close to something inanimate.