I want to create a compass app that functions exactly like the current iPhone one! Can somebody point me in the right direction, please? I am new at this!
How to create a Compass in iOS:
Step 1: Create a project (e.g. CompassExample) and include frameworks
#import <CoreLocation/CoreLocation.h>
#import <QuartzCore/QuartzCore.h>
ViewController.h Steps:
Step 2: In the .h file, adopt the CLLocationManagerDelegate protocol and create the Location Manager object (the @synthesize line goes in the .m file):
CLLocationManager *locationManager;
<CLLocationManagerDelegate>
@property (nonatomic,retain) CLLocationManager *locationManager;
@synthesize locationManager;
IBOutlet UIImageView *compassImage;
Step 3: Download this compass image
ViewController.m Steps:
Step 4: In the .m file's viewDidLoad method, initialize the location manager.
locationManager=[[CLLocationManager alloc] init];
locationManager.desiredAccuracy = kCLLocationAccuracyBest;
locationManager.headingFilter = 1;
locationManager.delegate=self;
[locationManager startUpdatingHeading];
Step 5: In the .m file, implement the delegate method.
First, you need to convert degrees to radians (e.g. 360° = 2π ≈ 6.28 rad).
Second, multiply by -1 so the needle rotates in the opposite direction from the way you twist your phone.
Third, apply the rotation with a Core Animation function. In this example, I'm rotating from the current value to the new value with a duration of 0.5 seconds, as you can see below.
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading{
// Convert Degree to Radian and move the needle
float oldRad = -manager.heading.trueHeading * M_PI / 180.0f;
float newRad = -newHeading.trueHeading * M_PI / 180.0f;
CABasicAnimation *theAnimation;
theAnimation=[CABasicAnimation animationWithKeyPath:@"transform.rotation"];
theAnimation.fromValue = [NSNumber numberWithFloat:oldRad];
theAnimation.toValue=[NSNumber numberWithFloat:newRad];
theAnimation.duration = 0.5f;
[compassImage.layer addAnimation:theAnimation forKey:@"animateMyRotation"];
compassImage.transform = CGAffineTransformMakeRotation(newRad);
NSLog(@"%f (%f) => %f (%f)", manager.heading.trueHeading, oldRad, newHeading.trueHeading, newRad);
}
That’s it! Of course, you have to deploy the app on your device to check the direction.
Credit for this code goes to kiichi; you can download the project at the GitHub link.
I am trying to get it so that when you rotate an iOS 7 map, the annotations rotate along with the camera heading. Imagine I had pin annotations that must point north at all times.
This seems simple at first; there should be an MKMapViewDelegate callback for getting the camera rotation, but there isn't.
I've tried using the map delegates to then query the map view's camera.heading object but firstly these delegates only seem to be called once before and once after a rotation gesture:
- (void)mapView:(MKMapView *)mapView regionWillChangeAnimated:(BOOL)animated
- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated
I also tried using KVO on the camera.heading object but this doesn't work and the camera object seems to be some kind of proxy object that only updates once the rotation gesture is complete.
My most successful method so far is to add a rotation gesture recogniser to calculate a rotation delta and use this with the camera heading reported at the beginning of the region change delegate. This works to a point, but in iOS 7 you can 'flick' your rotation gesture and it adds velocity which I can't seem to track. Is there any way to track the camera heading in real time?
- (void)mapView:(MKMapView *)mapView regionWillChangeAnimated:(BOOL)animated
{
heading = self.mapView.camera.heading;
}
- (void)rotationHandler:(UIRotationGestureRecognizer *)gesture
{
if(gesture.state == UIGestureRecognizerStateChanged) {
CGFloat headingDelta = (gesture.rotation * (180.0/M_PI) );
headingDelta = fmod(headingDelta, 360.0);
CGFloat newHeading = heading - headingDelta;
[self updateCompassesWithHeading:newHeading];
}
}
- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated
{
[self updateCompassesWithHeading:self.mapView.camera.heading];
}
Apple unfortunately does not give real-time updates to any map information. Your best bet is to set up a CADisplayLink and update whatever you need to when it changes. Something like this:
@property (nonatomic) CLLocationDirection previousHeading;
@property (nonatomic, strong) CADisplayLink *displayLink;
- (void)setUpDisplayLink
{
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkFired:)];
[self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];
}
- (void)displayLinkFired:(id)sender
{
double difference = ABS(self.previousHeading - self.mapView.camera.heading);
if (difference < .001)
return;
self.previousHeading = self.mapView.camera.heading;
[self updateCompassesWithHeading:self.previousHeading];
}
I have an application in which the user tracks his/her route when jogging or cycling, so I need accurate locations so the user's routes are drawn correctly.
But I have one problem with this:
locManager = [[CLLocationManager alloc] init];
[locManager setDesiredAccuracy:kCLLocationAccuracyBestForNavigation];
[locManager setDelegate:self];
[locManager startUpdatingLocation];
in viewDidLoad. With this setup, didUpdateToLocation is called multiple times even when I don't move the device at all, and a very strange route gets drawn on the map.
I just can't understand why this happens, whether I am doing something wrong or missing something.
Thanks.
I use locationManager.distanceFilter = 500; // meters (or so)
to prevent multiple calls from happening. Just remember to set this BEFORE you start updating your location.
You can set the distanceFilter of the location manager; hope this may help you:
locationManager=[[CLLocationManager alloc] init];
locationManager.delegate=self;
locationManager.desiredAccuracy=kCLLocationAccuracyNearestTenMeters;
locationManager.distanceFilter=10.0;
[locationManager startUpdatingLocation];
When you first start location services, you'll generally see multiple location updates come whether you're moving or not. If you examine the horizontalAccuracy of the locations as they come in, you'll see that while it's "warming" up it will show a series of locations with greater and greater accuracy (i.e. smaller and smaller horizontalAccuracy values) until it reaches quiescence.
You could disregard those initial locations until horizontalAccuracy falls below a certain value. Or, better, during start up, you could disregard the previous location if (a) the distance between a new location and the old location is less than the horizontalAccuracy of the old location and (b) if the horizontalAccuracy of the new location is less than that of the prior location.
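For the first option, a minimal hedged sketch of the check you could add at the top of your didUpdateLocations callback (the 20 m threshold is an assumption you would tune):
// Sketch: skip fixes until the reported accuracy is good enough.
CLLocation *location = [locations lastObject];
if (location.horizontalAccuracy < 0 || location.horizontalAccuracy > 20.0) {
return; // still warming up (or no fix); wait for a better reading
}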
For example, let's assume you're maintaining an array of CLLocation objects, as well as a reference to the last drawn path:
@property (nonatomic, strong) NSMutableArray *locations;
@property (nonatomic, weak) id<MKOverlay> pathOverlay;
Furthermore, let's assume your location update routine is just adding to the array of locations and then indicating that the path should be redrawn:
- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
NSLog(@"%s", __FUNCTION__);
CLLocation* location = [locations lastObject];
[self.locations addObject:location];
[self addPathToMapView:self.mapView];
}
The addPathToMapView method can then remove the second-to-last location if it's less accurate than the last one and if the distance between them is less than the most recent location's accuracy.
- (void)addPathToMapView:(MKMapView *)mapView
{
NSInteger count = [self.locations count];
// let's see if we should remove the penultimate location
if (count > 2)
{
CLLocation *lastLocation = [self.locations lastObject];
CLLocation *previousLocation = self.locations[count - 2];
// if the very last location is more accurate than the previous one
// and if distance between the two of them is less than the accuracy,
// then remove that `previousLocation` (and update our count, appropriately)
if (lastLocation.horizontalAccuracy < previousLocation.horizontalAccuracy &&
[lastLocation distanceFromLocation:previousLocation] < lastLocation.horizontalAccuracy)
{
[self.locations removeObjectAtIndex:(count - 2)];
count--;
}
}
// now let's build our array of coordinates for our MKPolyline
CLLocationCoordinate2D coordinates[count];
NSInteger numberOfCoordinates = 0;
for (CLLocation *location in self.locations)
{
coordinates[numberOfCoordinates++] = location.coordinate;
}
// if there is a path to add to our map, do so
MKPolyline *polyLine = nil;
if (numberOfCoordinates > 1)
{
polyLine = [MKPolyline polylineWithCoordinates:coordinates count:numberOfCoordinates];
[mapView addOverlay:polyLine];
}
// if there was a previous path drawn, remove it
if (self.pathOverlay)
[mapView removeOverlay:self.pathOverlay];
// save the current path
self.pathOverlay = polyLine;
}
Bottom line, just get rid of locations that are less accurate than the next one you have. You could get even more aggressive in the pruning process if you want, though there are tradeoffs; hopefully this illustrates the idea.
startUpdatingLocation will continuously update the user's location even when the location does not change. You just need to structure your app to handle these continuous updates according to your needs.
Try reading Apple's documentation on this subject. It is confusing at first but try anyway.
http://developer.apple.com/library/ios/#documentation/UserExperience/Conceptual/LocationAwarenessPG/CoreLocation/CoreLocation.html#//apple_ref/doc/uid/TP40009497-CH2-SW1
I think startMonitoringForRegion:desiredAccuracy: is what you need.
For an example, see the following GitHub link.
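As a rough, hedged sketch of that API (the coordinate, radius and identifier are made-up values; note this method was deprecated in iOS 6 in favour of startMonitoringForRegion:):
// Hypothetical sketch: monitor a circular region instead of continuous updates.
CLLocationCoordinate2D center = CLLocationCoordinate2DMake(51.5074, -0.1278); // example coordinate
CLRegion *region = [[[CLRegion alloc] initCircularRegionWithCenter:center radius:100.0 identifier:@"trackedRegion"] autorelease];
[locManager startMonitoringForRegion:region desiredAccuracy:kCLLocationAccuracyNearestTenMeters];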
Try this Breadcrumb sample code provided by Apple:
http://developer.apple.com/library/ios/#samplecode/Breadcrumb/Introduction/Intro.html
Add this,
[locManager stopUpdatingLocation];
into your mapView:didUpdateUserLocation: delegate method.
Review the following code snippet:
-(void)mapView:(MKMapView *)mapView didUpdateUserLocation:(MKUserLocation *)userLocation
{
[_locationManager stopUpdatingLocation];
}
locationManager.startUpdatingLocation() fetches the location continuously, and the didUpdateLocations method gets called several times.
Just set locationManager.distanceFilter before calling locationManager.startUpdatingLocation().
Setting it to 200 works fine for me:
locationManager.distanceFilter = 200
locationManager.startUpdatingLocation()
I started playing around with Cocos2D and I figured out how to do a sprite animation with sprite sheets.
Now I have a little robot walking around on my screen. But I am wondering how I can put one animation into another.
For example, my robot has a walking animation and I want to move his arms separately. But his shoulders will rise and fall during walking and the arms should move relative to his shoulder position.
My walk animation has five sprites and I have nine different arm positions. Now, I could add the arms to every single walking sprite for all nine arm positions, but then I would have 45 images instead of 14.
That's a good question (finally!).
What I would do is separate your character into different sprites (you already did this) that are animatable on their own.
Then, whenever a frame of that animation is presented, I would execute a block of code that modifies the position of the arms animation so it matches the shoulders.
To execute that block of code, you need Cocos2D 1.1 or later, since that is where CCAnimationFrame was added. However, those frames can only execute code via an NSNotification, so I made an improvement that lets us set a block on a frame; that block is executed whenever the frame is displayed.
Just find CCAnimation.h and modify the CCAnimationFrame interface to look like this:
typedef void(^FrameBlock)(CCSprite *sprite);
/** CCAnimationFrame
A frame of the animation. It contains information like:
- sprite frame name
- # of delay units.
- offset
@since v1.1
*/
@interface CCAnimationFrame : NSObject <NSCopying>
{
CCSpriteFrame* spriteFrame_;
float delayUnits_;
NSDictionary *userInfo_;
FrameBlock frameBlock;
}
/** CCSpriteFrameName to be used */
@property (nonatomic, readwrite, retain) CCSpriteFrame* spriteFrame;
/** how many units of time the frame takes */
@property (nonatomic, readwrite) float delayUnits;
/** A CCAnimationFrameDisplayedNotification notification will be broadcasted when the frame is displayed with this dictionary as UserInfo. If UserInfo is nil, then no notification will be broadcasted. */
@property (nonatomic, readwrite, retain) NSDictionary *userInfo;
/** If the block is not NULL, it will be executed when the frame becomes visible **/
@property (nonatomic, readwrite, copy) FrameBlock frameBlock;
/** initializes the animation frame with a spriteframe, number of delay units and a notification user info */
-(id) initWithSpriteFrame:(CCSpriteFrame*)spriteFrame delayUnits:(float)delayUnits userInfo:(NSDictionary*)userInfo;
-(id) initWithSpriteFrame:(CCSpriteFrame*)spriteFrame delayUnits:(float)delayUnits block:(FrameBlock)block;
@end
And then open CCAnimation.m and make sure the CCAnimationFrame implementation looks like this:
@implementation CCAnimationFrame
@synthesize spriteFrame = spriteFrame_, delayUnits = delayUnits_, userInfo=userInfo_;
@synthesize frameBlock;
-(id) initWithSpriteFrame:(CCSpriteFrame *)spriteFrame delayUnits:(float)delayUnits userInfo:(NSDictionary*)userInfo
{
if( (self=[super init]) ) {
self.spriteFrame = spriteFrame;
self.delayUnits = delayUnits;
self.userInfo = userInfo;
}
return self;
}
-(id) initWithSpriteFrame:(CCSpriteFrame*)spriteFrame delayUnits:(float)delayUnits block:(FrameBlock)block{
self = [self initWithSpriteFrame:spriteFrame delayUnits:delayUnits userInfo:nil];
if(self){
[self setFrameBlock:block];
}
return self;
}
-(void) dealloc
{
CCLOGINFO( @"cocos2d: deallocing %@", self);
[spriteFrame_ release];
[userInfo_ release];
[super dealloc];
}
-(id) copyWithZone: (NSZone*) zone
{
CCAnimationFrame *copy = [[[self class] allocWithZone: zone] initWithSpriteFrame:[[spriteFrame_ copy] autorelease] delayUnits:delayUnits_ userInfo:[[userInfo_ copy] autorelease] ];
return copy;
}
-(NSString*) description
{
return [NSString stringWithFormat:@"<%@ = %08X | SpriteFrame = %08X, delayUnits = %0.2f >", [self class], self, spriteFrame_, delayUnits_ ];
}
@end
Then, when creating the animation frames, add a block to them in which you set the position of the arms to match the shoulders.
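For illustration, a minimal sketch of creating such a frame with the block-based initializer above (armSprite, walkFrame1 and the shoulder offset are hypothetical names and values for your own setup):
// Hypothetical sketch: reposition the arm whenever this walk frame is shown.
FrameBlock syncArms = ^(CCSprite *bodySprite) {
armSprite.position = ccpAdd(bodySprite.position, ccp(12, 30)); // offset to this frame's shoulder
};
CCAnimationFrame *frame = [[[CCAnimationFrame alloc] initWithSpriteFrame:walkFrame1 delayUnits:1 block:syncArms] autorelease];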
I hope it helps.
I've got quite a lot of pins to put on my map, so I think it would be a nice idea to cluster those annotations. I'm not really sure how to achieve this on the iPhone; I was able to work something out with Google Maps and some JavaScript examples. But the iPhone uses its MKMapView and I have no idea how to cluster annotations there.
Any ideas or frameworks that you know and are good? Thanks.
You don't necessarily need to use a 3rd party framework because since iOS 4.2, MKMapView has a method called - (NSSet *)annotationsInMapRect:(MKMapRect)mapRect which you can use to do your clustering.
Check out the WWDC11 Session video 'Visualizing Information Geographically with MapKit'. About halfway through, it explains how to do it. But I'll summarize the concept for you:
Use two maps (the second map is never added to the view hierarchy)
The second map contains all annotations (again, it's never drawn)
Divide the map area into a grid of squares
Use the -annotationsInMapRect: method to get annotation data from the invisible map
The visible map builds its annotations from this data from the invisible map
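As a rough, hedged sketch of that grid-query step (allAnnotationsMapView is assumed to be the hidden map holding every annotation; a fuller Apple-derived implementation appears in a later answer):
// Hypothetical sketch: for one grid square, pull annotations from the hidden map
// and surface a single representative on the visible map.
MKMapRect visible = self.mapView.visibleMapRect;
MKMapRect gridRect = MKMapRectMake(MKMapRectGetMinX(visible), MKMapRectGetMinY(visible), visible.size.width / 4.0, visible.size.height / 4.0);
NSSet *bucket = [self.allAnnotationsMapView annotationsInMapRect:gridRect];
if (bucket.count > 0) {
[self.mapView addAnnotation:[bucket anyObject]]; // one pin stands in for the whole bucket
}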
Fortunately, you don't need third-party frameworks anymore. iOS 11 has native clustering support.
You need to implement the mapView:clusterAnnotationForMemberAnnotations: method.
Get more details in the Apple example: https://developer.apple.com/sample-code/wwdc/2017/MapKit-Sample.zip
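As a minimal, hedged sketch of that iOS 11 API (the "bike" clustering identifier and the marker view setup are illustrative, not taken from the linked sample):
// In your MKMapViewDelegate (iOS 11+): give annotation views a clustering identifier.
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation {
if ([annotation isKindOfClass:[MKClusterAnnotation class]]) {
return nil; // let MapKit supply a default cluster view
}
MKMarkerAnnotationView *view = (MKMarkerAnnotationView *)[mapView dequeueReusableAnnotationViewWithIdentifier:@"bike"];
if (view == nil) {
view = [[MKMarkerAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:@"bike"];
}
view.clusteringIdentifier = @"bike"; // annotations sharing this identifier are clustered together
return view;
}
// Optionally customise the cluster annotation created for a group of member annotations.
- (MKClusterAnnotation *)mapView:(MKMapView *)mapView clusterAnnotationForMemberAnnotations:(NSArray<id<MKAnnotation>> *)memberAnnotations {
return [[MKClusterAnnotation alloc] initWithMemberAnnotations:memberAnnotations];
}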
Since this is a very common problem and I needed a solution, I wrote a custom subclass of MKMapView which supports clustering. Then I made it available open source! You can get it here: https://github.com/yinkou/OCMapView.
It manages the clustering of the annotations and you can handle their views by yourself.
You don't have to do anything but copy the OCMapView folder to your project, create an MKMapView in your nib and set its class to OCMapView. (Or create and delegate it in code like a regular MKMapView.)
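For the in-code route, a hedged sketch (since OCMapView subclasses MKMapView, it should be creatable like any map view; the frame and delegate are placeholders):
// Hypothetical sketch: create OCMapView programmatically like a regular MKMapView.
OCMapView *clusteredMapView = [[OCMapView alloc] initWithFrame:self.view.bounds];
clusteredMapView.delegate = self;
[self.view addSubview:clusteredMapView];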
By using the Apple demo code it's easy to implement the clustering concept in our own code. Reference link
We can simply use the following code for clustering.
Steps to implement clustering
Step 1: The important thing is that for clustering we use two map views (allAnnotationsMapView and mapView); one of them (allAnnotationsMapView) is used only as a reference and never displayed.
@property (nonatomic, strong) MKMapView *allAnnotationsMapView;
@property (nonatomic, strong) IBOutlet MKMapView *mapView;
In viewDidLoad
_allAnnotationsMapView = [[MKMapView alloc] initWithFrame:CGRectZero];
Step 2: Add all annotations to _allAnnotationsMapView; in the code below, _photos is the annotations array.
[_allAnnotationsMapView addAnnotations:_photos];
[self updateVisibleAnnotations];
Step 3: Add the methods below for clustering; here PhotoAnnotation is the custom annotation.
MapViewDelegate methods
- (void)mapView:(MKMapView *)aMapView regionDidChangeAnimated:(BOOL)animated {
[self updateVisibleAnnotations];
}
- (void)mapView:(MKMapView *)aMapView didAddAnnotationViews:(NSArray *)views {
for (MKAnnotationView *annotationView in views) {
if (![annotationView.annotation isKindOfClass:[PhotoAnnotation class]]) {
continue;
}
PhotoAnnotation *annotation = (PhotoAnnotation *)annotationView.annotation;
if (annotation.clusterAnnotation != nil) {
// animate the annotation from its old container's coordinate, to its actual coordinate
CLLocationCoordinate2D actualCoordinate = annotation.coordinate;
CLLocationCoordinate2D containerCoordinate = annotation.clusterAnnotation.coordinate;
// since it's displayed on the map, it is no longer contained by another annotation,
// (We couldn't reset this in -updateVisibleAnnotations because we needed the reference to it here
// to get the containerCoordinate)
annotation.clusterAnnotation = nil;
annotation.coordinate = containerCoordinate;
[UIView animateWithDuration:0.3 animations:^{
annotation.coordinate = actualCoordinate;
}];
}
}
}
Clustering handling methods
- (id<MKAnnotation>)annotationInGrid:(MKMapRect)gridMapRect usingAnnotations:(NSSet *)annotations {
// first, see if one of the annotations we were already showing is in this mapRect
NSSet *visibleAnnotationsInBucket = [self.mapView annotationsInMapRect:gridMapRect];
NSSet *annotationsForGridSet = [annotations objectsPassingTest:^BOOL(id obj, BOOL *stop) {
BOOL returnValue = ([visibleAnnotationsInBucket containsObject:obj]);
if (returnValue)
{
*stop = YES;
}
return returnValue;
}];
if (annotationsForGridSet.count != 0) {
return [annotationsForGridSet anyObject];
}
// otherwise, sort the annotations based on their distance from the center of the grid square,
// then choose the one closest to the center to show
MKMapPoint centerMapPoint = MKMapPointMake(MKMapRectGetMidX(gridMapRect), MKMapRectGetMidY(gridMapRect));
NSArray *sortedAnnotations = [[annotations allObjects] sortedArrayUsingComparator:^(id obj1, id obj2) {
MKMapPoint mapPoint1 = MKMapPointForCoordinate(((id<MKAnnotation>)obj1).coordinate);
MKMapPoint mapPoint2 = MKMapPointForCoordinate(((id<MKAnnotation>)obj2).coordinate);
CLLocationDistance distance1 = MKMetersBetweenMapPoints(mapPoint1, centerMapPoint);
CLLocationDistance distance2 = MKMetersBetweenMapPoints(mapPoint2, centerMapPoint);
if (distance1 < distance2) {
return NSOrderedAscending;
} else if (distance1 > distance2) {
return NSOrderedDescending;
}
return NSOrderedSame;
}];
PhotoAnnotation *photoAnn = sortedAnnotations[0];
NSLog(@"lat long %f %f", photoAnn.coordinate.latitude, photoAnn.coordinate.longitude);
return sortedAnnotations[0];
}
- (void)updateVisibleAnnotations {
// This value controls the number of off-screen annotations that are displayed.
// A bigger number means more annotations, less chance of seeing annotation views pop in but decreased performance.
// A smaller number means fewer annotations, more chance of seeing annotation views pop in but better performance.
static float marginFactor = 2.0;
// Adjust this roughly based on the dimensions of your annotations views.
// Bigger numbers more aggressively coalesce annotations (fewer annotations displayed but better performance).
// Numbers too small result in overlapping annotations views and too many annotations on screen.
static float bucketSize = 60.0;
// find all the annotations in the visible area + a wide margin to avoid popping annotation views in and out while panning the map.
MKMapRect visibleMapRect = [self.mapView visibleMapRect];
MKMapRect adjustedVisibleMapRect = MKMapRectInset(visibleMapRect, -marginFactor * visibleMapRect.size.width, -marginFactor * visibleMapRect.size.height);
// determine how wide each bucket will be, as a MKMapRect square
CLLocationCoordinate2D leftCoordinate = [self.mapView convertPoint:CGPointZero toCoordinateFromView:self.view];
CLLocationCoordinate2D rightCoordinate = [self.mapView convertPoint:CGPointMake(bucketSize, 0) toCoordinateFromView:self.view];
double gridSize = MKMapPointForCoordinate(rightCoordinate).x - MKMapPointForCoordinate(leftCoordinate).x;
MKMapRect gridMapRect = MKMapRectMake(0, 0, gridSize, gridSize);
// condense annotations, with a padding of two squares, around the visibleMapRect
double startX = floor(MKMapRectGetMinX(adjustedVisibleMapRect) / gridSize) * gridSize;
double startY = floor(MKMapRectGetMinY(adjustedVisibleMapRect) / gridSize) * gridSize;
double endX = floor(MKMapRectGetMaxX(adjustedVisibleMapRect) / gridSize) * gridSize;
double endY = floor(MKMapRectGetMaxY(adjustedVisibleMapRect) / gridSize) * gridSize;
// for each square in our grid, pick one annotation to show
gridMapRect.origin.y = startY;
while (MKMapRectGetMinY(gridMapRect) <= endY) {
gridMapRect.origin.x = startX;
while (MKMapRectGetMinX(gridMapRect) <= endX) {
NSSet *allAnnotationsInBucket = [self.allAnnotationsMapView annotationsInMapRect:gridMapRect];
NSSet *visibleAnnotationsInBucket = [self.mapView annotationsInMapRect:gridMapRect];
// we only care about PhotoAnnotations
NSMutableSet *filteredAnnotationsInBucket = [[allAnnotationsInBucket objectsPassingTest:^BOOL(id obj, BOOL *stop) {
return ([obj isKindOfClass:[PhotoAnnotation class]]);
}] mutableCopy];
if (filteredAnnotationsInBucket.count > 0) {
PhotoAnnotation *annotationForGrid = (PhotoAnnotation *)[self annotationInGrid:gridMapRect usingAnnotations:filteredAnnotationsInBucket];
[filteredAnnotationsInBucket removeObject:annotationForGrid];
// give the annotationForGrid a reference to all the annotations it will represent
annotationForGrid.containedAnnotations = [filteredAnnotationsInBucket allObjects];
[self.mapView addAnnotation:annotationForGrid];
for (PhotoAnnotation *annotation in filteredAnnotationsInBucket) {
// give all the other annotations a reference to the one which is representing them
annotation.clusterAnnotation = annotationForGrid;
annotation.containedAnnotations = nil;
// remove annotations which we've decided to cluster
if ([visibleAnnotationsInBucket containsObject:annotation]) {
CLLocationCoordinate2D actualCoordinate = annotation.coordinate;
[UIView animateWithDuration:0.3 animations:^{
annotation.coordinate = annotation.clusterAnnotation.coordinate;
} completion:^(BOOL finished) {
annotation.coordinate = actualCoordinate;
[self.mapView removeAnnotation:annotation];
}];
}
}
}
gridMapRect.origin.x += gridSize;
}
gridMapRect.origin.y += gridSize;
}
}
By following the above steps we can achieve clustering on the map view; it is not necessary to use any third-party code or framework. Please check the Apple sample code here. Please let me know if you have any doubts about this.
Have you looked at ADClusterMapView ? https://github.com/applidium/ADClusterMapView
It does precisely this.
I just wanted to cluster pins, showing only their count. The following one,
https://www.cocoacontrols.com/controls/qtree-objc fits my expectations.
I recently forked off of ADClusterMapView mentioned in another answer and resolved many, if not all, of the issues associated with the project. It's a kd-tree algorithm and animates the clustering.
It's available open source here https://github.com/ashare80/TSClusterMapView
Try this framework (XMapView.framework); it now supports iOS 8.
This framework doesn't require you to change your current project structure, and it can be used directly with your MKMapView. There is a zip file that gives you an example clustering 200 pins at once. After testing it on an iPod I found it very smooth.
http://www.xuliu.info/xMapView.html
This library supports:
clustering different categories
clustering all categories
setting up your own cluster radius and so on
hide or show certain categories
individually handle and control each pin in the map
There is a pretty cool and well maintained library for both Objective-C and Swift here: https://github.com/bigfish24/ABFRealmMapView
It does clustering really well and also handles large amounts of points due to its integration with Realm.
I am developing an application that computes the distance travelled by a person. I am testing it on an iPad (device, OS 3.2). My iPad is using WiFi to get the current location. The results are highly inaccurate even though I have filtered the values. I don't know if GPS will give me accurate results. I am pasting the entire code below. Please verify the code and, in case of errors, let me know. It would be very helpful if someone could test the code on an iPhone (3G) or iPad (3G); if that's not possible, then just check the logic. Also, I want to compute the calories burnt: is there any formula to do so? I have made a simple view-based project and used a distance label in the nib file to show the distance value, but the distance is updating at a very rapid rate; please correct it.
// iPacometerViewController.h
@interface iPacometerViewController : UIViewController <CLLocationManagerDelegate> {
CLLocationManager *locationManager;
CLLocation *oldLocat;
CLLocation *newLocat;
IBOutlet UILabel *distanceLabel;
}
@property(nonatomic,assign)IBOutlet UILabel *distanceLabel;
@property(nonatomic,retain)CLLocationManager *locationManager;
@property(nonatomic,retain)CLLocation *oldLocat;
@property(nonatomic,retain)CLLocation *newLocat;
-(void)computeDistanceFrom:(CLLocation *)oldL tO:(CLLocation *)newL;
@end
// iPacometerViewController.m
#import "iPacometerviewController.h"
@implementation iPacometerViewController
static double distance = 0.0;
@synthesize locationManager;
@synthesize oldLocat;
@synthesize newLocat;
@synthesize distanceLabel;
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
[super viewDidLoad];
//initializing location manager
locationManager =[[CLLocationManager alloc]init];
locationManager.delegate = self;
locationManager.distanceFilter = 150.0f;
locationManager.desiredAccuracy = kCLLocationAccuracyBest;
[locationManager startUpdatingLocation];
oldLocat = [[CLLocation alloc]init];
newLocat = [[CLLocation alloc]init];
}
- (void)locationManager:(CLLocationManager *)manager
didUpdateToLocation:(CLLocation *)newLocation
fromLocation:(CLLocation *)oldLocation
{
if (newLocation.horizontalAccuracy < 0 || -[newLocation.timestamp timeIntervalSinceNow] > 60.0) return; // reading is invalid or data is too long ago, don't use it
NSLog(#"oldd %#",oldLocation);
self.oldLocat = oldLocation;
self.newLocat = newLocation;
if(oldLocat!=nil)
{
[self computeDistanceFrom:oldLocat tO:newLocat];
}
}
-(void)computeDistanceFrom:(CLLocation *)oldL tO:(CLLocation *)newL
{
NSLog(#"oldd %#",oldL);
NSLog(#"new %#",newL);
CLLocationDistance currentDistance = [oldL distanceFromLocation:newL];
NSLog(#"you have travel=%f",currentDistance);
distance = distance + currentDistance;
double distanceInKm = distance/1000;
NSString *distanceLabelValue = [NSString stringWithFormat:@"%1.2f Kms",distanceInKm];
distanceLabel.text = distanceLabelValue;
}
- (void)didReceiveMemoryWarning {
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
- (void)dealloc {
//[mapView release];
[oldLocat release];
[newLocat release];
[locationManager release];
[super dealloc];
}
@end
Your logic is fine. It's the iPhone's logic that is bad.
There are two main issues that I see when writing this sort of code.
1) What I call poisoned points. These are cached points that the iPhone reports erroneously. You can check for them by dropping any points where the timestamp is the same as or earlier than the latest point. It may also be worth recording a frequency of visits to a point. The poisoned points tend to be visited time and time again (maybe 5 times) each time as a jump from your real track. If you can spot them, you can rule them out.
2) Accuracy - especially height changes. Look at the horizontal and vertical accuracy figures returned with each point. And look at the height. If they are inaccurate, then the velocity and therefore the distance travelled will be too. One cause of bad distances is if one end of your pair has an altitude and the other does not: the "not" gets classed as zero, so if you are 200m above sea level at the time, you have just "travelled" 200m without moving!
To improve accuracy, you may be better off writing your own great circle distance algorithm (or even simple Pythagoras given the distances are small) which ignores height, or just doing some better filtering on the points you use.
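As a hedged illustration of those two checks (the stale-timestamp test and the 50 m accuracy threshold are assumptions to tune; lastAcceptedLocation is a property you would keep yourself):
// Sketch: drop "poisoned" (stale) points and very inaccurate readings before accumulating distance.
- (BOOL)shouldUseLocation:(CLLocation *)newLocation {
// 1) poisoned/cached points: timestamp not newer than the last point we accepted
if (self.lastAcceptedLocation != nil && [newLocation.timestamp timeIntervalSinceDate:self.lastAcceptedLocation.timestamp] <= 0) {
return NO;
}
// 2) accuracy: ignore readings with no fix or a very poor horizontal accuracy
if (newLocation.horizontalAccuracy < 0 || newLocation.horizontalAccuracy > 50.0) {
return NO;
}
return YES;
}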
I'm struggling to find a reliable way of measuring location, distance and speed with the iPhone (3GS), and the iPad or iPod Touch is just going to be worse, I guess.
If you can drop me a mail at andrew dot hawken at fiftyeggs dot co dot uk I will send you my raw iPhone logging software. It would be great if you can run it for a couple of trips and let me have the results - I'm trying to solve the same problem, but only have GPS datasets to work on.