Optimization for MKAnnotations and Core Data - iPhone

I have some locations (in this case >3000) stored with Core Data. When the map opens, I fetch the locations and store them in an array. Each time the map view's region changes, I call a function that calculates which annotations are visible in the current visibleMapRect and filters them by pixel distance. (I know there are more complex optimizations, like quadtrees, but I'd rather not implement one right now unless it's absolutely necessary.)
This is my code:
// locations is an array of NSManagedObjects
for (int i = 0; i < [locations count]; i++)
{
    // typed managed object subclass for faster access, valueForKey: takes ages...
    LocationEntity *thisLocation = [locations objectAtIndex:i];
    CLLocationCoordinate2D coord = CLLocationCoordinate2DMake([thisLocation.latitude doubleValue],
                                                              [thisLocation.longitude doubleValue]);
    // mapRect is mapView.visibleMapRect
    BOOL isOnScreen = MKMapRectContainsPoint(mapRect, MKMapPointForCoordinate(coord));
    BOOL hasEnoughDistance = YES;
    if (isOnScreen)
    {
        CGPoint cgp = [mapView convertCoordinate:coord toPointToView:mapView];
        // compare the distance to already accepted annotations
        for (int idx = 0; idx < [annotations count] && hasEnoughDistance; idx++)
        {
            CGPoint cgp_prev = [mapView convertCoordinate:[[annotations objectAtIndex:idx] coordinate]
                                            toPointToView:mapView];
            if (getDist(cgp, cgp_prev) < dist) hasEnoughDistance = NO;
        }
    }
    if (isOnScreen && hasEnoughDistance)
    {
        // if it's OK, create the annotation, add it to an array,
        // and after the loop add them all to the map
    }
}
The map is freezing for a few seconds after each zoom/movement.
I checked with Time Profiler, and simply obtaining the coordinates sometimes takes a whole second, sometimes just 0.1, even though the coordinates are indexed attributes in my model... Also, lines like this seem to take ages:
CGPoint cgp = [mapView convertCoordinate:coord toPointToView:mapView];
Any suggestions on how I could calculate the pixel/point distance between two annotations/coordinates without going through this function? Or any optimization suggestions for Core Data?
Thanks :)

OK, I sort of missed the "not having them too close" bit of your explanation. The conversion between the coordinate systems is very slow. The way you can alleviate it is to precompute the coordinates into map points with MKMapPointForCoordinate and store them persistently - they depend only on the coordinates. Then you can quickly calculate the distance between the map points of two annotations, scale it by the current zoom level of the map, and this will quite closely match the actual distance on the screen. It should be accurate enough and will be much faster.
I would also recommend calculating the squared distance and comparing it against the squared dist threshold. You save a lot by skipping the sqrt().
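As a minimal sketch of both ideas together - assuming a and b are two LocationEntity objects with hypothetical mapPointX/mapPointY attributes, precomputed once via MKMapPointForCoordinate, and dist is your screen-point threshold from the question:

// Points-per-map-point scale for the current zoom level:
double scale = mapView.bounds.size.width / mapView.visibleMapRect.size.width;

// Convert the threshold into map-point units and square it once up front:
double minMapDist = dist / scale;
double minMapDistSq = minMapDist * minMapDist;

// Squared distance in map-point space; no convertCoordinate:toPointToView:
// and no sqrt() per pair:
double dx = [a.mapPointX doubleValue] - [b.mapPointX doubleValue];
double dy = [a.mapPointY doubleValue] - [b.mapPointY doubleValue];
BOOL hasEnoughDistance = (dx * dx + dy * dy) >= minMapDistSq;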
If you still get bogged down in getDist() (or getSqDist()), you could either go for a k-d tree or use the Accelerate framework to do the calculations. I've done the latter when I needed to calculate distances between many points, and the speedup was very good. But the details of that are another cup of tea. Let me know if you need any help with it.
The fact that your coordinates are indexed would only help if you actually searched for annotations by the coordinates, so it won't help if you just look through all of them.
A way of dealing with long loading times from Core Data would be to make your annotations as lightweight as possible, storing only the coordinates and map points. You could then fetch the rest of the annotation data lazily, as needed. This could be done with the proxy pattern.
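A minimal sketch of such a lightweight annotation (the class name and properties here are illustrative, not from your code):

#import <MapKit/MapKit.h>
#import <CoreData/CoreData.h>

// Carries only what the map needs; everything else is faulted in lazily
// from Core Data via the objectID, which acts as the proxy handle.
@interface LightweightAnnotation : NSObject <MKAnnotation>
@property (nonatomic, assign) CLLocationCoordinate2D coordinate;
@property (nonatomic, assign) MKMapPoint mapPoint; // precomputed once
@property (nonatomic, strong) NSManagedObjectID *objectID;
@end

@implementation LightweightAnnotation
@end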
One more thing: fast enumeration might be faster and is better practice as well, so use
for (LocationEntity *thisLocation in locations)
instead of
for (int i = 0; i < [locations count]; i++)

Related

Put MKPointAnnotation on MKDirections at a specific distance

I'm using MKDirections to draw a route from point A to point B, and I'm looking to put an MKPointAnnotation at a specific distance from the starting point.
For example, I have an MKMapView with an MKDirections route drawn from LA to NY. I would then like to place a pin at the 10-mile marker for the route the user has decided to take.
Any thoughts on how to find the lat/long along the chosen route at a specific distance?
Thanks.
- (void)showDirections:(MKDirectionsResponse *)response
{
    NSInteger iDistance = 0;
    BOOL bPointinStep = NO;
    self.response = response;
    for (MKRoute *route in _response.routes)
    {
        NSLog(@"route.distance: %.0f", route.distance);
        responseNumber = responseNumber + 1;
        [_mapView addOverlay:route.polyline level:MKOverlayLevelAboveRoads];
        for (MKRouteStep *step in route.steps)
        {
            iDistance = iDistance + step.distance;
            NSLog(@"iDistance: %ld", (long)iDistance);
            NSLog(@"step.distance: %.0f", step.distance);
            NSLog(@"%@", step.instructions);
            if (iProgress < iDistance && !bPointinStep) {
                NSLog(@"pin point is on this step");
                bPointinStep = YES;
            }
        }
    }
}
This works, in that I'm able to determine which step the pin should be placed on, but my question is how to determine where within the step.
I'll explain my approach to solve a similar problem.
You can use the property step.polyline.coordinate. It gives you the latitude and longitude of the end point of each step of your route. When your variable iDistance exceeds the 10 miles, the end point of this step (or the end point of the previous step) gives an approximate coordinate of what you are looking for. In any case you will have the segment containing the 10-mile distance.
To obtain a more accurate coordinate, you can perform some simple trigonometric operations between the end point of this segment and the end point of the previous one to locate the exact spot at the target distance, and put the MKPointAnnotation there.
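For example, a rough linear interpolation between the two endpoints of that segment (startCoord, endCoord, distanceIntoSegment, and segmentLength are hypothetical names; the distances are in meters):

// Fraction of the way along the segment where the marker falls (0..1).
double t = distanceIntoSegment / segmentLength;

// Linear interpolation is a good enough approximation for short steps;
// it is not a true geodesic interpolation.
CLLocationCoordinate2D pinCoord = CLLocationCoordinate2DMake(
    startCoord.latitude  + t * (endCoord.latitude  - startCoord.latitude),
    startCoord.longitude + t * (endCoord.longitude - startCoord.longitude));

MKPointAnnotation *pin = [[MKPointAnnotation alloc] init];
pin.coordinate = pinCoord;
[_mapView addAnnotation:pin];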
I hope it helps you.

Searching Core Data with the map's visibleRegion

I have several thousand locations stored in CoreData and I would like to search for locations that are within a Google Maps visibleRegion. I was previously doing a search with a bounding box but the addition of the bearing feature breaks this type of query. I have several ideas but this must be a common problem with some well thought out solutions. I'd be interested to see if any solutions use geohashes.
This is my query that breaks when the bearing is not due north.
NSPredicate *predicate = [NSPredicate predicateWithFormat:@"(lat > %f AND lat < %f AND lng > %f AND lng < %f)",
    [self.googleMap.projection visibleRegion].nearLeft.latitude,
    [self.googleMap.projection visibleRegion].farLeft.latitude,
    [self.googleMap.projection visibleRegion].nearLeft.longitude,
    [self.googleMap.projection visibleRegion].nearRight.longitude
];
You can calculate an axis-aligned bounding box of the visible region, and then use that to look up your locations. Some of them will still be outside of the actual visible area, but at least you'll filter out most of them.
The GMSCoordinateBounds class in GMSCoordinateBounds.h can be used to make this easier:
GMSMapView *_mapView = ...;
GMSCoordinateBounds *bounds =
    [[GMSCoordinateBounds alloc] initWithRegion:[_mapView.projection visibleRegion]];
CLLocationCoordinate2D northEast = bounds.northEast;
CLLocationCoordinate2D southWest = bounds.southWest;
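From there, a minimal sketch of the Core Data fetch, reusing the lat/lng attribute names from your question:

// Axis-aligned box from the GMSCoordinateBounds above. Note this simple
// form does not handle a box that spans the 180th meridian.
NSPredicate *predicate =
    [NSPredicate predicateWithFormat:@"lat > %f AND lat < %f AND lng > %f AND lng < %f",
        southWest.latitude, northEast.latitude,
        southWest.longitude, northEast.longitude];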
Note also that there is currently a bug with visibleRegion being too large; see here:
https://code.google.com/p/gmaps-api-issues/issues/detail?id=5107
See here for a workaround to that problem:
Google Maps iOS SDK: How do I get accurate latitude and longitude coordinates from a camera's visibleRegion?

Finding closest match of Latitude & Longitude in a list of Lat/Longs

I'm creating an iPhone app with weather lookup for particular locations, and I have the following problem that I'm not sure of the best way to tackle.
I have the latitude and longitude of a location and want to find the closest lat/long match from a list of 5000+ locations.
The 5000+ locations come from a JSON feed from the Met Office DataPoint API and are in the form of an NSArray of NSDictionaries; each NSDictionary includes id, lat, long and name.
I want to match my location to the nearest location from the Met Office list and grab the id key's value.
Many thanks in advance.
I'm assuming you're using CLLocation objects in this...
- (CLLocation *)closestLocationToLocation:(CLLocation *)currLocation
{
    CLLocationDistance minDistance = DBL_MAX; // initialize so the first comparison is well-defined
    CLLocation *closestLocation = nil;
    for (CLLocation *location in arrayOfLocations) {
        CLLocationDistance distance = [location distanceFromLocation:currLocation];
        if (distance <= minDistance || closestLocation == nil) {
            minDistance = distance;
            closestLocation = location;
        }
    }
    // closestLocation is now the location from your array which is closest to
    // the current location, or nil if there are no locations in your array.
    return closestLocation;
}
There may be a quicker way of doing this but this will get it done.
EDITED to use CLLocation functions
I did a similar thing once (finding all lat/lon objects surrounding a point within a maximum radius) and used the formula given here:
http://www.movable-type.co.uk/scripts/latlong.html
However, that was quite time consuming, so I sort of "boxed" the objects first. Based on the calculation above (inverted, of course) I calculated the latitude and longitude of those coordinates north, west, south and east that lay at exactly the maximum distance. With those max and min values (for lat and lon) I queried all objects in question, and only for those did I calculate the exact distance and include or exclude them from the list of results.
However, so far that does not exactly match your problem, but I tried to speed up the calculations even further. For that I said to myself that I do not need the exact distance from the searched object to mine; it is enough to know whether it is closer than one of the box's coordinates. And that part is the one that corresponds well to your question:
Your case could be much easier. Assuming that the locations in question (the closest ones) are near the one location which you are trying to match, all this complex math may not play a role. You do not need the exact distance. What you need is the closest one. For that I would assume that the earth is flat and that the distances between longitudes (or latitudes) are linear. That is not true, of course, but it should be good enough to figure out which of them is the closest.
Going from there you could use Pythagoras:
Distance = sqrt(sqr(difference-in-lat) + sqr(difference-in-lon))
For the mere purpose of comparing the distances and finding the shortest, you could even replace the time-consuming square root with a much faster square operation:
Square-Of-Distance = sqr(difference-in-lat) + sqr(difference-in-lon)
Then compare the various Square-Of-Distance values rather than the Distance. The result will be the same but much faster.
BTW, that was a PHP project; that's why I cannot provide sample code but can only explain the algorithm.
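Translated into Objective-C for your case, a minimal sketch of that flat-earth comparison over the Met Office array (assuming the lat, long, and id dictionary keys from your question, and a CLLocationCoordinate2D named target):

NSDictionary *closest = nil;
double bestSq = DBL_MAX;
for (NSDictionary *entry in locations) {
    double dLat = [[entry objectForKey:@"lat"] doubleValue] - target.latitude;
    double dLon = [[entry objectForKey:@"long"] doubleValue] - target.longitude;
    double sq = dLat * dLat + dLon * dLon; // squared distance, no sqrt needed
    if (sq < bestSq) {
        bestSq = sq;
        closest = entry;
    }
}
NSString *stationId = [closest objectForKey:@"id"]; // the id key value you were after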
I'd suggest something like this:
NSMutableArray *tempArray = [NSMutableArray new];
for (NSMutableDictionary *location in yourArrayOfLocations) {
    // CLLocation is an object, so it must be allocated; the dictionary
    // values are NSNumbers, so unbox them first.
    CLLocation *coord = [[CLLocation alloc]
        initWithLatitude:[[location objectForKey:@"latitude"] doubleValue]
               longitude:[[location objectForKey:@"longitude"] doubleValue]];
    // distanceFromLocation: returns a plain double; box it before storing.
    [location setValue:[NSNumber numberWithDouble:[usersLocation distanceFromLocation:coord]]
                forKey:@"distance"];
    [tempArray addObject:location];
    [coord release];
}
// Now sort the array
NSArray *sortedArray = [tempArray sortedArrayUsingComparator:^(id o1, id o2) {
    NSDictionary *location1 = (NSDictionary *)o1;
    NSDictionary *location2 = (NSDictionary *)o2;
    return [[location1 objectForKey:@"distance"] compare:[location2 objectForKey:@"distance"]];
}];
[tempArray release];
Now you have an array ordered by distance. You can use the object at index 0, as it is the closest to the user's position.
Good Luck!

iPhone SDK: Collision detection, does it have to be a rectangle?

I am making a basic platform game for the iPhone and I have encountered a problem with my collision detection.
if (CGRectIntersectsRect(player.frame, platform.frame))
    pos2 = CGPointMake(0.0, 0.0);
else
    pos2 = CGPointMake(0.0, 10.0);
The collision detection is there to stop in-game gravity from applying while the player is on a platform. The problem is that the collision detection uses the rectangle around the player. Is there any way to do collision detection for the actual shape of an image (with transparency) rather than the rectangle around it?
You'll have to program this on your own, and beware that pixel-by-pixel collision is probably too expensive for the iPhone. My recommendation is to write a Collidable protocol (called an interface in most other programming languages), give it a collidedWith:(id<Collidable>)c method, and then implement that for any object that you want to allow collision for. Then you can write case-by-case collision logic. Similarly, you could make a big superclass that has all the information you'd need for collision (in your case either an X, Y, width, and height, or an X, Y, and a pixel data array) and a collidesWith method. Either way you can write a bunch of different collision methods - if you're only doing pixel collision for a few things, it won't be much of a performance hit. Typically, though, it's better to do bounding box collision or some other collision based on geometry, as it is significantly faster.
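A minimal sketch of that protocol (the method names beyond collidedWith: are illustrative):

#import <UIKit/UIKit.h>

@protocol Collidable <NSObject>
// Cheap test first; only fall back to expensive per-pixel checks when needed.
- (CGRect)boundingBox;
- (BOOL)collidedWith:(id<Collidable>)other;
@end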
The folks over at metanetsoftware made some great tutorials on collision techniques, among them axis-separation collision and grid-based collision, the latter of which sounds like it would be more viable for your game. If you want to stick with brute-force collision detection, however (checking every object against every other object), then making a bounding box that is simply smaller than the image is typically the proper way to go. This is how many successful platformers did it, including Super Mario Brothers. You might also consider weighted bounding boxes - that is, one bounding box for one type of object and a different-sized one for others. In Mario, for example, you have a larger box with which to hit coins than you do enemies.
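For example, the smaller bounding box is a one-line change with CGRectInset (the inset values here are arbitrary):

// Shrink the player's collision rect by a few points on each side, so that
// near-misses at the sprite's transparent edges no longer count as hits.
CGRect hitBox = CGRectInset(player.frame, 4.0, 8.0);
if (CGRectIntersectsRect(hitBox, platform.frame))
    pos2 = CGPointMake(0.0, 0.0);
else
    pos2 = CGPointMake(0.0, 10.0);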
Now, even though I've warned you to do otherwise, I'll oblige you and put in how to do pixel-based collision. You're going to want to access the pixel data of your CGImage, then iterate through all the pixels to see if this image shares a location with any other image. Here's some code for it.
for (int i = 0; i < [objects count]; i++)
{
    MyObject *obj1 = [objects objectAtIndex:i];
    // Compare every object against every other object.
    for (int j = i + 1; j < [objects count]; j++)
    {
        MyObject *obj2 = [objects objectAtIndex:j];
        // Store whether or not we've collided.
        BOOL collided = NO;
        // First, do bounding box collision. We don't want to bother checking
        // pixels unless we are within each other's bounds.
        if (obj1.x + obj1.imageWidth >= obj2.x &&
            obj2.x + obj2.imageWidth >= obj1.x &&
            obj1.y + obj1.imageHeight >= obj2.y &&
            obj2.y + obj2.imageHeight >= obj1.y)
        {
            // We want to iterate only along the object with the smallest image.
            // This way, the collision checking will take the least time possible.
            MyObject *check = (obj1.imageWidth * obj1.imageHeight < obj2.imageWidth * obj2.imageHeight) ? obj1 : obj2;
            // Go through the pixel data of the two objects.
            for (int x = check.x; x < check.x + check.imageWidth && !collided; x++)
            {
                for (int y = check.y; y < check.y + check.imageHeight && !collided; y++)
                {
                    if ([obj1 pixelIsOpaqueAtX:x andY:y] && [obj2 pixelIsOpaqueAtX:x andY:y])
                    {
                        collided = YES;
                    }
                }
            }
        }
    }
}
I made pixelIsOpaque take a global coordinate rather than a local one, so when you implement that part you have to be careful to subtract the object's x and y from it again, or you'll be checking outside the bounds of your image.
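A rough sketch of that method (assuming the object keeps a CGImageRef in self.image and its top-left position in self.x/self.y; a real implementation should cache the decoded pixel buffer rather than redraw per query):

- (BOOL)pixelIsOpaqueAtX:(int)x andY:(int)y
{
    // Convert from global (screen) coordinates to image-local coordinates.
    int localX = x - self.x;
    int localY = y - self.y;
    size_t width = CGImageGetWidth(self.image);
    size_t height = CGImageGetHeight(self.image);
    if (localX < 0 || localY < 0 || localX >= (int)width || localY >= (int)height)
        return NO;

    // Render just the pixel of interest into a known 1x1 RGBA buffer.
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    // Offset the draw so the wanted pixel lands at the context origin
    // (Core Graphics' origin is the bottom-left corner).
    CGContextDrawImage(context,
        CGRectMake(-localX, -(double)(height - 1 - localY), width, height),
        self.image);
    CGContextRelease(context);

    return pixel[3] > 0; // alpha channel: nonzero means not fully transparent
}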

How to calculate the distance between two coordinates in Objective-C?

As the title says: how? I have tried the code from Google Earth, but it seems like the result differs from Google Maps' calculation. Below is the code I used:
- (double)GetDistance:(double)lat1 long1:(double)lng1 la2:(double)lat2 long2:(double)lng2
{
    //NSLog(@"latitude1: %.7f, longitude1: %.7f, latitude2: %.7f, longitude2: %.7f", lat1, lng1, lat2, lng2);
    double radLat1 = [self rad:lat1];
    double radLat2 = [self rad:lat2];
    double a = radLat1 - radLat2;
    double b = [self rad:lng1] - [self rad:lng2];
    double s = 2 * asin(sqrt(pow(sin(a/2), 2) + cos(radLat1) * cos(radLat2) * pow(sin(b/2), 2)));
    s = s * EARTH_RADIUS;
    s = round(s * 10000) / 10000;
    return s;
}

- (double)rad:(double)d
{
    return d * 3.14159265 / 180.0;
}
The EARTH_RADIUS value is 6378.138 (km).
Using this function with two given coordinates, the result comes out as 4.5 km,
but when I use Google Maps to get directions between the same two coordinates, it shows the distance as about 8 km.
Can anyone help point out the problem with my code?
Since this is tagged iPhone, why not use the built-in distance function rather than rolling your own? location1 and location2 are CLLocation objects.
CLLocationDistance distance = [location1 getDistanceFrom:location2];
Here is a simple code (supposing you just have latitude and longitude of the two points)
CLLocation *startLocation = [[CLLocation alloc] initWithLatitude:startLatitude longitude:startLongitude];
CLLocation *endLocation = [[CLLocation alloc] initWithLatitude:endLatitude longitude:endLongitude];
CLLocationDistance distance = [startLocation distanceFromLocation:endLocation]; // aka double
Don't forget to add the MapKit framework to your project, and import MapKit in your file:
#import <MapKit/MapKit.h>
Google Maps is likely to be giving you the driving distance, whereas the great circle equation you have listed is going to be the straight line surface distance. If there was a straight line surface road directly from point A to point B, Google Maps would likely give you the same distance as the equation you have there.
Since getDistanceFrom: is deprecated, try using
[newLocation distanceFromLocation:oldLocation];
instead.
You should be able to use the google API directly to calculate either great circle distance or driving distance depending on your application needs.
See GLatLong::distanceFrom and GDirections::getDistance.