I'm using a WiFi iPad 1st gen with iOS 5.1.1
I have set up Core Location similar to this blog post:
http://blog.objectgraph.com/index.php/2012/01/10/how-to-create-a-compass-in-iphone/
In:
- (void)locationManager:(CLLocationManager*)manager
didUpdateHeading:(CLHeading*)heading
I log the true heading value:
NSLog(@"true heading: %f", heading.trueHeading);
The result is always "true heading: -1.000000" no matter which direction I point the iPad.
I've rotated the device on all 3 axes and enabled Location Services in Settings.
Any ideas on why this doesn't work? Does heading reporting not work on an iPad 1st gen?
Here is what helped in my case:
Select Location Services (under privacy settings)
Select System Services at the bottom of the list of apps (within Location Services)
Turn Compass Calibration ON
It seems that this way, the trueHeading values no longer come out as -1!
To get valid true-heading data you must also configure the CLLocationManager to deliver location updates. From the Apple documentation:
Note: If you want heading objects to contain valid data for the
trueHeading property, your location manager object should also be
configured to deliver location updates. You can start the delivery of
these updates by calling the location manager object’s
startUpdatingLocation method.
Thus, call startUpdatingLocation as well.
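A minimal sketch of that setup (assuming ARC, and that the class adopts CLLocationManagerDelegate and keeps a strong reference to the manager):
self.locationManager = [[CLLocationManager alloc] init];
self.locationManager.delegate = self;
self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
// Heading updates alone only give magneticHeading; location updates are
// needed before trueHeading becomes valid.
[self.locationManager startUpdatingLocation];
[self.locationManager startUpdatingHeading];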
Note that 'heading' refers to the direction that a defined device axis is pointed. The 'orientation', which you can read with [UIDevice currentDevice].orientation, tells you how the device is tilted.
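For illustration, the two are read quite differently (the heading value here is assumed to come from the heading delegate's CLHeading parameter):
// Tilt of the device (portrait, landscape, face up, ...)
UIDeviceOrientation tilt = [UIDevice currentDevice].orientation;
// Direction the device's reference axis points, in degrees
CLLocationDirection direction = newHeading.trueHeading;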
The -1 means the compass heading is invalid. You probably need to perform compass calibration.
I am working with CLLocationManager and can get the user's current location, but I can't determine the user's status, i.e. whether the user is running, walking, or stopped. How can I detect this programmatically on the iPhone while the user is moving?
Use the speed attribute of CLLocation, assuming GPS is enabled and accuracy is set to best. speed is in m/s, so roughly:
running: loc.speed > 7/3.6 (faster than about 7 km/h)
walking: slower than about 6 km/h
stopped is harder: no location updates arrive (measure that with a timer)
But you cannot distinguish between "stopped" and "no GPS available" (e.g. underground or indoors). That would need the accelerometer if the state has to be determined in real time.
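A rough sketch of that idea in the iOS 5-era location delegate; the walking threshold and the handling of invalid speeds are my own illustrative additions:
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation
{
    // speed is in m/s; a negative value means it could not be determined
    if (newLocation.speed < 0) {
        return;
    }
    if (newLocation.speed > 7.0 / 3.6) {
        // faster than about 7 km/h: treat as running
    } else if (newLocation.speed > 1.0 / 3.6) {
        // roughly 1-6 km/h: treat as walking
    } else {
        // (almost) no movement: likely stopped, but confirm with a timer
        // in case updates simply stop arriving (no GPS available)
    }
}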
My app has an augmented reality camera view which uses CMMotionManager to find the heading relative to true north. However, I am finding that if the user switches OFF Location Services / System Services / Compass Calibration, the device motion data stops returning sensible values (in particular the gravity vector), which makes the app useless. Quite a few people do this because they believe it saves battery life.
Does anyone know exactly what this setting does to the device or to CMMotionManager?
How can my app determine what it is set to?
Does setting it OFF necessarily mean that CMMotionManager won't work?
You can check the trueHeading value in your heading delegate. If Compass Calibration is turned off in Settings, this value stays at -1 every time, so you can check in the heading delegate:
if (newHeading.trueHeading < 0)
{
    // trueHeading is -1 (invalid): Compass Calibration in Settings is off
}
else
{
    // Calibration is on and the value is valid
}
Hope this works, as it did for me.
Thanks and Regards
Abhishek Goyal
If I'm on iOS 4, not using a specific handler, and needing the same update interval, what is the difference between using Device Motion and using separate accelerometer, gyroscope and magnetometer updates?
What happens if one of those three sensors is not available on the device and I use Device Motion?
I mean, what will deviceMotionAvailable return if one of those sensors is not available?
In iOS 4 the magnetometer is not yet included in the Device Motion API but is handled by CLLocationManager (this changed in iOS 5). So if you have a gyro and an accelerometer, deviceMotionAvailable will return YES regardless of the magnetometer. On the other hand, if the gyro is missing you will always get NO and you need to stay with accelerometerData.
Because Device Motion carries one timestamp for both sensors, you get reliably interpolated values for both. Otherwise Device Motion couldn't do sensor fusion, which is the main reason it is the preferred way.
You cannot rely on a fixed frequency for CLLocationManager: didUpdateHeading is called whenever the system 'thinks' it should be. To normalise the different time coordinates between CLLocationManager and CMDeviceMotion, have a look at NSTimeInterval to unix timestamp.
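A sketch of that availability check on iOS 4 (the queue choice and the empty handlers are just placeholders for illustration):
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
if (motionManager.deviceMotionAvailable) {
    // Gyro and accelerometer are present; on iOS 4 the magnetometer is handled by CLLocationManager.
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                       withHandler:^(CMDeviceMotion *motion, NSError *error) {
        // Fused attitude, rotation rate and user acceleration share one timestamp.
    }];
} else if (motionManager.accelerometerAvailable) {
    // No gyro: fall back to raw accelerometer data.
    [motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMAccelerometerData *data, NSError *error) {
        // Accelerometer-only path.
    }];
}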
I'm having a problem with iPhone/iPad compass development.
The trueHeading taken from the CLHeading always gives me the value -1, and I'm stuck here. Here is my code:
self.locationManager = [[[CLLocationManager alloc] init] autorelease];
self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
self.locationManager.delegate = self;
self.locationManager.headingFilter = 0.5; // 0.5 degrees
[self.locationManager startUpdatingHeading];
I also found out something: when I open the Maps app or the Compass app, which use location services, the trueHeading value suddenly reads correctly. I wonder what the cause is, any idea? It happens on both the iPhone 4 and the iPad.
It also happens whenever I turn Location Services off in Settings and re-enable it: the app becomes unable to read the correct trueHeading value. I wonder if that's because Location Services cannot be enabled by the app I'm creating?
Anyway, thanks in advance.
---My Solution---
see below.
To stop the heading from continually returning -1.000000, don't JUST call startUpdatingHeading; call startUpdatingLocation along with it. This helps.
Try using this...
CLLocationManager *locationManager = [[CLLocationManager alloc] init];
locationManager.delegate=self;
locationManager.desiredAccuracy=kCLLocationAccuracyBestForNavigation;
// Start heading updates.
if ([CLLocationManager headingAvailable] && [CLLocationManager locationServicesEnabled])
{
locationManager.headingFilter = kCLHeadingFilterNone;
[locationManager startUpdatingHeading];
}
and after doing this the CLLocationManager delegate method gets called:
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
// headingAccuracy is in degrees; a negative value means the heading is invalid.
[lblAccuracy setText:[NSString stringWithFormat:@"%.1f°", newHeading.headingAccuracy]];
}
But this code only works on a device, not in the simulator...
Happy coding..
---My Solution---
What I did was add [self.locationManager startUpdatingLocation] before (or after) [self.locationManager startUpdatingHeading]; (for when Location Services has been turned off and re-enabled in Settings). I'm not sure this is a good solution, but it's what I did to make it work; if you have a better solution please share.
I had some trouble with the location manager myself and found out that for me it helped to unplug the iPhone from the computer when testing. Somehow the calibration alert only popped up after unplugging the iPhone.
I had this same problem. I moved startUpdatingHeading into a button action, then moved it back to where the CLLocationManager is allocated -- where it had been working fine -- and it started returning only -1.
I rebooted my iPad and it started working again. Hopefully it stays that way.
Edit: Nope, it didn't stay that way. I had to use startUpdatingLocation too. Won't this wear down the battery though? I set desiredAccuracy to kCLLocationAccuracyThreeKilometers, because I am not using location data anyway.
A TRUE reading requires knowing the magnetic variation (declination) for the place where you are using the compass. From the previous discussion, it appears that the function that derives the true direction from the magnetic direction needs your location in order to obtain the variation value. If you don't want to use GPS location information in your code, I suggest reading the magnetic heading and correcting the value yourself. First obtain the variation for the desired location, then apply the formula T = M ± V, where T is the true direction, M is the compass's magnetic reading and V is the variation. Use "+" for East and "-" for West. I found that the following web site provides the variation (magnetic declination) for any location: http://www.geomag.nrcan.gc.ca/calc/mdcal-eng.php.
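A tiny sketch of that correction; the declination constant is a hypothetical value you would look up for your location on the site above (already signed: positive for East, negative for West):
#include <math.h>

// Hypothetical local declination in degrees, looked up manually for your location.
static const double kLocalDeclinationDegrees = 4.5;

double trueFromMagnetic(double magneticHeadingDegrees) {
    // T = M + V; wrap the result back into the 0..360 range
    return fmod(magneticHeadingDegrees + kLocalDeclinationDegrees + 360.0, 360.0);
}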
When location services are off, the didUpdateHeading delegate method returns only the magnetic heading. You can use it according to your needs. According to the Apple docs:
To begin the delivery of heading-related events, assign a delegate to
the location manager object and call its startUpdatingHeading method.
If location updates are also enabled, the location manager returns
both the true heading and magnetic heading values. If location updates
are not enabled, the location manager returns only the magnetic
heading value.
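So in the heading delegate you can fall back to the magnetic value when the true one is not valid; a minimal sketch:
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
    // trueHeading is negative when it cannot be determined (e.g. no location updates)
    CLLocationDirection direction = (newHeading.trueHeading >= 0) ? newHeading.trueHeading
                                                                  : newHeading.magneticHeading;
    // use 'direction' as needed
}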
Working on this problem now. I can get updates from Core Motion when I use SpriteKit; the point is being able to call a function continuously, e.g. once a frame (1/60th of a second) or every few frames. Without SpriteKit, the documentation says to receive the updates in a handler block, which I assume runs on a queue of its own that is up to you to manage.
There's an algorithm for converting the magnetometer readings to actual degrees relative to true north. Picture a graph that looks like the time-domain function of alternating current and you'll see that interpolating the data is a simple matter of applying Maxwell's equations. Here's an example from Honeywell.
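As a very rough illustration (ignoring the tilt compensation the Honeywell note deals with, and with the caveat that the sign and axis conventions depend on how the device is held), the basic step is an arctangent of the horizontal field components:
#import <CoreMotion/CoreMotion.h>
#include <math.h>

// Tilt-uncompensated heading estimate from raw magnetometer readings, device held flat.
static double roughHeadingDegrees(CMMagneticField field) {
    double radians = atan2(field.y, field.x);
    return fmod(radians * 180.0 / M_PI + 360.0, 360.0);
}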
Can the iPhone SDK take advantage of the iPhone's proximity sensors? If so, why hasn't anyone taken advantage of them? I could picture a few decent uses.
For example, in a racing game, you could put your finger on the proximity sensor to go instead of taking up screen real-estate with your thumb. Of course though, if this was your only option, then iPod touch users wouldn't be able to use the application.
Does the proximity sensor tell how close you are, or just that something is in front of it?
There is a public API for this. -[UIApplication setProximitySensingEnabled:(BOOL)] will turn the feature on. BTW, it doesn't seem to be using the light sensor, because proximity sensing would tweak out in a dark room.
However, the API call basically blanks the screen when you hold the phone up to your face. Not useful for interaction, sadly.
Assuming you mean the sensor that shuts off the screen when you hold it to your ear, I'm pretty sure that is just an infrared sensor inside the ear speaker. If you start the phone app (you don't have to be making a call) and hold something to cast a shadow over the ear speaker, you can make the display shut off.
When you asked this question it was not accessible via the public API. You can now access the sensor's state via UIDevice's proximityState property. However, it wouldn't be that useful for games, since it is only an on/off thing, not a near/far measure. Plus, it's only available on the iPhone and not the iPod touch.
Evidently the proximity sensor will never turn on if the status bar is in landscape orientation.
i.e. if you call:
[UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeLeft;
You will no longer get the proximity:ON notifications.
This definitely happens on OS 3.0, I can't test it on a 2.X device since I don't have one with a proximity sensor.
This seems like a bug.
The proximity sensor works via measuring IR reflectance. If you hold the iPhone up to a webcam, you can see a small, pulsing IR LED.
There's a lot of confusion between the proximity sensor and the ambient light sensor. The iPhone has both. The Touch does not have a proximity sensor, making it a poor choice for user input. It would be a bad idea anyway since Apple isn't obligated to locate it in the same place in future devices; you aren't supposed to know or care where it is.
The proximity sensor works by pulsing an infrared LED and measuring the amount of reflectance. You can see this using your iSight camera (most digital cameras are sensitive to IR.) Just launch Photo Booth, initiate a call (or play a voicemail) on the phone and point it at your iSight camera. Note the flashing light next to the earpiece; cover it with your finger and the screen will go black.
The ambient light sensor's API is evidently private at this point.
Just to update, this is possible.
UIDevice *device = [UIDevice currentDevice];
// Turn on proximity monitoring
[device setProximityMonitoringEnabled:YES];
// To determine if proximity monitoring is available, attempt to enable it.
// If the value of the proximityMonitoringEnabled property remains NO, proximity
// monitoring is not available.
// Detect whether device supports proximity monitoring
BOOL proxySupported = [device isProximityMonitoringEnabled];
// Register for proximity notifications
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(proximityChanged:) name:UIDeviceProximityStateDidChangeNotification object:device];
As benzado points out, you can use:
// Returns a BOOL, YES if device is proximate
[device proximityState];
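For completeness, the notification handler registered above might look something like this (the selector name just matches the one registered in the snippet):
- (void)proximityChanged:(NSNotification *)notification {
    UIDevice *device = notification.object;
    if (device.proximityState) {
        // Something is close to the sensor; the system has blanked the screen.
    } else {
        // Nothing is near the sensor.
    }
}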
There is no public API for this.
In iPhone OS 3.0 there is official support for the proximity sensor. Have a look at UIDevice's proximityMonitoringEnabled in the docs.
If you aren't aiming for the AppStore, you can read my articles here on getting access to those:
Proximity Sensor: http://iphonedevwiki.net/index.php/AppleProxShim
Ambient Light Sensor: http://iphonedevwiki.net/index.php/AppleISL29003
I've encountered this problem too. It took me a long time to figure out the real reason why the proximity sensor was not working: when the orientation is UIInterfaceOrientationLandscapeLeft or UIInterfaceOrientationLandscapeRight the proximity sensor does not work, while in portrait mode it works fine. My iPhone is an iPhone 4S (iOS SDK 5.0).
Those touch sensors are basically a matrix of conductors. The vertical "wires" are tracks on one side of a thin sheet of insulator, the horizontal ones are on the other side. The intersections function as capacitors. Your finger carries an electrostatic charge, so the capacitance of each junction varies with proximity. FETs amplify the signal and biasing sets a threshold. In practice the circuit is more complex than that because it has to detect a relative change and reject noise.
But anyway, what the sensor grid tells you is that a field effect has been sensed, and that field effect is characteristic of an object about the size of a fingertip resting on the surface of the display. The centroid of the capacitive disturbance is computed (probably by hardware) and the coordinates are (presumably) reported as numbers on a port, most likely brought to the attention of the device OS by an interrupt. In something as sexy as an iPhone there's probably a buffer of the last dozen or so positions so it can work out direction and speed. Probably these are also computed by hardware and presented as numbers on the same port.
@Dipak Patel & @Coderer
You can download working code at
http://spazout.com/google_cheats_independent_iphone_developers_screwed
It has a working implementation of proximityStateChanged, an undocumented method in UIApplication.
Hope this helps.
To turn the screen off, it's conceivable that more than one sensor is used to figure out whether the screen should be turned off. The IR proximity sensor described by Cryptognome, in conjunction with the touch-screen sensor described by Peter Wone, could work out whether the iPhone is being held close to your face (or something else with a slight electric charge) or is just very close to something inanimate.