I am showing an image in a UIImageView and I'd like to convert geographic coordinates (latitude/longitude) to x/y values so I can show cities on this image.
This is what I tried based on my research:
CGFloat height = mapView.frame.size.height;
CGFloat width = mapView.frame.size.width;
int x = (int) ((width/360.0) * (180 + 8.242493)); // Mainz lon
int y = (int) ((height/180.0) * (90 - 49.993615)); // Mainz lat
NSLog(#"x: %i y: %i", x, y);
PinView *pinView = [[PinView alloc]initPinViewWithPoint:x andY:y];
[self.view addSubview:pinView];
which gives me x = 167 and y = 104, but for this example the values should be x = 73 and y = 294.
mapView is my UIImageView, just for clarification.
So my second try was to use MapKit:
CLLocationCoordinate2D coord = CLLocationCoordinate2DMake(49.993615, 8.242493);
MKMapPoint point = MKMapPointForCoordinate(coord);
NSLog(#"x is %f and y is %f",point.x,point.y);
But this gives me some really strange values:
x = 140363776.241755 and y = 91045888.536491.
So do you have an idea what I have to do to get this working?
Thanks so much!
To make this work you need to know 4 pieces of data:
Latitude and longitude of the top left corner of the image.
Latitude and longitude of the bottom right corner of the image.
Width and height of the image (in points).
Latitude and longitude of the data point.
With that info you can do the following:
// Latitude/longitude of the image's top-left and bottom-right corners.
// These roughly box Germany - use the actual values appropriate to your image
double minLat = 54.8;   // top-left latitude
double minLong = 5.5;   // top-left longitude
double maxLat = 47.2;   // bottom-right latitude
double maxLong = 15.1;  // bottom-right longitude
// Map image size (in points)
CGSize mapSize = mapView.frame.size;
// Determine the map scale (points per degree)
double xScale = mapSize.width / (maxLong - minLong);
double yScale = mapSize.height / (maxLat - minLat);
// Latitude and longitude of city
double spotLat = 49.993615;
double spotLong = 8.242493;
// position of map image for point
CGFloat x = (spotLong - minLong) * xScale;
CGFloat y = (spotLat - minLat) * yScale;
If x or y is negative or greater than the image's size, the point is off the map.
This simple solution assumes the map image uses a basic cylindrical projection in which all lines of latitude and longitude are straight and evenly spaced (an equirectangular / plate carrée projection); a true Mercator image spaces the latitude lines non-linearly, so the y calculation would need the Mercator formula instead.
Edit:
To convert an image point back to a coordinate, just reverse the calculation:
double pointLong = pointX / xScale + minLong;
double pointLat = pointY / yScale + minLat;
where pointX and pointY represent a point on the image in screen points. (0, 0) is the top left corner of the image.
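For convenience, here is a minimal sketch (my own helper functions, not part of the original answer) that wraps both directions of the conversion. The corner constants are the hypothetical Germany bounds used above; substitute the real corners of your image:
#import <UIKit/UIKit.h>
#import <CoreLocation/CoreLocation.h>
// Hypothetical corner coordinates of the map image (top-left and bottom-right).
static const double kTopLeftLat = 54.8, kTopLeftLong = 5.5;
static const double kBottomRightLat = 47.2, kBottomRightLong = 15.1;
// Latitude/longitude to a point on the image.
static CGPoint pointForCoordinate(double lat, double lon, CGSize mapSize) {
double xScale = mapSize.width / (kBottomRightLong - kTopLeftLong);
double yScale = mapSize.height / (kBottomRightLat - kTopLeftLat);
return CGPointMake((lon - kTopLeftLong) * xScale, (lat - kTopLeftLat) * yScale);
}
// A point on the image back to latitude/longitude.
static CLLocationCoordinate2D coordinateForPoint(CGPoint point, CGSize mapSize) {
double xScale = mapSize.width / (kBottomRightLong - kTopLeftLong);
double yScale = mapSize.height / (kBottomRightLat - kTopLeftLat);
return CLLocationCoordinate2DMake(point.y / yScale + kTopLeftLat, point.x / xScale + kTopLeftLong);
}
For the Mainz example, CGPoint mainz = pointForCoordinate(49.993615, 8.242493, mapView.frame.size); would then give the pin position.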
EDIT: I believe my issue is that this code works for integer zoom levels, but I would like it to work for fractional (float) zoom levels.
I have an iOS app in which the user can switch between a RouteMe-based map and a MapKit-based map.
When they switch sources, I would like to be able to show the exact same area in one as in the other. However, I can't figure out how to make them match because RouteMe and MapKit use different data structures to describe the map bounds.
Here is some code that gets it somewhat close, but not exact. This code comes from: http://troybrant.net/blog/2010/01/set-the-zoom-level-of-an-mkmapview/
I'm not sure whether this code should be fixed, or possibly I am overlooking a much easier solution. The code executes starting with the last method listed:
#define MERCATOR_OFFSET 268435456
#define MERCATOR_RADIUS 85445659.44705395
#pragma mark -
#pragma mark Map conversion methods
- (double)longitudeToPixelSpaceX:(double)longitude {
return round(MERCATOR_OFFSET + MERCATOR_RADIUS * longitude * M_PI / 180.0);
}
- (double)latitudeToPixelSpaceY:(double)latitude {
return round(MERCATOR_OFFSET - MERCATOR_RADIUS * log((1 + sin(latitude * M_PI / 180.0)) / (1 - sin(latitude * M_PI / 180.0))) / 2.0); // double-precision log/sin avoid the rounding error that logf/sinf introduce
}
- (double)pixelSpaceXToLongitude:(double)pixelX {
return ((round(pixelX) - MERCATOR_OFFSET) / MERCATOR_RADIUS) * 180.0 / M_PI;
}
- (double)pixelSpaceYToLatitude:(double)pixelY {
return (M_PI / 2.0 - 2.0 * atan(exp((round(pixelY) - MERCATOR_OFFSET) / MERCATOR_RADIUS))) * 180.0 / M_PI;
}
- (MKCoordinateSpan)coordinateSpanWithMapView:(MKMapView *)mapView
centerCoordinate:(CLLocationCoordinate2D)centerCoordinate
andZoomLevel:(NSInteger)zoomLevel {
// convert center coordinate to pixel space
double centerPixelX = [self longitudeToPixelSpaceX:centerCoordinate.longitude];
double centerPixelY = [self latitudeToPixelSpaceY:centerCoordinate.latitude];
// determine the scale value from the zoom level
NSInteger zoomExponent = 20 - zoomLevel;
double zoomScale = pow(2, zoomExponent);
// scale the map’s size in pixel space
CGSize mapSizeInPixels = mapView.bounds.size;
double scaledMapWidth = mapSizeInPixels.width * zoomScale;
double scaledMapHeight = mapSizeInPixels.height * zoomScale;
// figure out the position of the top-left pixel
double topLeftPixelX = centerPixelX - (scaledMapWidth / 2);
double topLeftPixelY = centerPixelY - (scaledMapHeight / 2);
// find delta between left and right longitudes
CLLocationDegrees minLng = [self pixelSpaceXToLongitude:topLeftPixelX];
CLLocationDegrees maxLng = [self pixelSpaceXToLongitude:topLeftPixelX + scaledMapWidth];
CLLocationDegrees longitudeDelta = maxLng - minLng;
// find delta between top and bottom latitudes
CLLocationDegrees minLat = [self pixelSpaceYToLatitude:topLeftPixelY];
CLLocationDegrees maxLat = [self pixelSpaceYToLatitude:topLeftPixelY + scaledMapHeight];
CLLocationDegrees latitudeDelta = -1 * (maxLat - minLat);
// create and return the lat/lng span
MKCoordinateSpan span = MKCoordinateSpanMake(latitudeDelta, longitudeDelta);
return span;
}
- (void)setCenterCoordinate:(CLLocationCoordinate2D)centerCoordinate
zoomLevel:(NSUInteger)zoomLevel
animated:(BOOL)animated {
// use the zoom level to compute the region
MKCoordinateSpan span = [self coordinateSpanWithMapView:self
centerCoordinate:centerCoordinate
andZoomLevel:zoomLevel];
MKCoordinateRegion region = MKCoordinateRegionMake(centerCoordinate, span);
// set the region like normal
[self setRegion:region animated:animated];
}
Unfortunately this is a limitation of the Google Maps API, which only accepts integer values when setting the map's zoom level: Apple's MapKit code calls the underlying Google Maps APIs when you set an MKMapView's displayed area, and the result – no matter which MapKit method you use to set the area – is a map zoomed out to the nearest integer zoom level.
Troy Brant's code takes you full circle, and puts a layer above the MapKit APIs that allows you to set the zoom level directly… but ultimately you don't have precise control over the area displayed by an MKMapView, unless the zoom level of your desired map happens to be an integer.
Several variations on this question have appeared on Stack Overflow (e.g., MKMapView setRegion "snaps" to predefined zoom levels? and MKMapView show incorrectly saved region), but so far no one has come up with a programmatic way to make a map with a non-integer zoom level, and I suspect it'd take cooperation between Google and Apple to ever make it happen.
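If it helps to at least compare the two maps numerically, you can invert the math above to get the fractional zoom level that corresponds to the region MapKit actually displays. A rough sketch (untested), reusing the longitudeToPixelSpaceX: helper and the same 20-level constants:
- (double)zoomLevelForMapView:(MKMapView *)mapView {
MKCoordinateRegion region = mapView.region;
// width of the visible region in Mercator pixel space
double leftPixelX = [self longitudeToPixelSpaceX:region.center.longitude - region.span.longitudeDelta / 2.0];
double rightPixelX = [self longitudeToPixelSpaceX:region.center.longitude + region.span.longitudeDelta / 2.0];
double scaledMapWidth = rightPixelX - leftPixelX;
// invert zoomScale = scaledMapWidth / viewWidth and zoomExponent = 20 - zoomLevel
double zoomScale = scaledMapWidth / mapView.bounds.size.width;
double zoomExponent = log(zoomScale) / log(2.0);
return 20.0 - zoomExponent;
}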
I want to know the radius of the visible area on the iPhone screen. As I zoom out and in, the visible area changes, so I want to know the radius of that particular area. How can I do it?
It's not the radius that you need.
You need to use the region property of the mapView.
Check out the Apple docs; it is pretty clear from those.
Go through this tutorial, it will help you a lot:
icode blog mapkit demo
Specifically, you need to set something like this:
MKCoordinateSpan span = [self coordinateSpanWithMapView:self centerCoordinate:centerCoordinate andZoomLevel:zoomLevel];
MKCoordinateRegion region = MKCoordinateRegionMake(centerCoordinate, span);
[self setRegion:region animated:animated];
where the span can be calculated as follows (this relies on the longitudeToPixelSpaceX: / latitudeToPixelSpaceY: helper methods shown earlier):
- (MKCoordinateSpan)coordinateSpanWithMapView:(MKMapView *)mapView
centerCoordinate:(CLLocationCoordinate2D)centerCoordinate
andZoomLevel:(NSUInteger)zoomLevel
{
// convert center coordinate to pixel space
double centerPixelX = [self longitudeToPixelSpaceX:centerCoordinate.longitude];
double centerPixelY = [self latitudeToPixelSpaceY:centerCoordinate.latitude];
// determine the scale value from the zoom level
NSInteger zoomExponent = 20 - zoomLevel;
double zoomScale = pow(2, zoomExponent);
// scale the map’s size in pixel space
CGSize mapSizeInPixels = mapView.bounds.size;
double scaledMapWidth = mapSizeInPixels.width * zoomScale;
double scaledMapHeight = mapSizeInPixels.height * zoomScale;
// figure out the position of the top-left pixel
double topLeftPixelX = centerPixelX - (scaledMapWidth / 2);
double topLeftPixelY = centerPixelY - (scaledMapHeight / 2);
// find delta between left and right longitudes
CLLocationDegrees minLng = [self pixelSpaceXToLongitude:topLeftPixelX];
CLLocationDegrees maxLng = [self pixelSpaceXToLongitude:topLeftPixelX + scaledMapWidth];
CLLocationDegrees longitudeDelta = maxLng - minLng;
// find delta between top and bottom latitudes
CLLocationDegrees minLat = [self pixelSpaceYToLatitude:topLeftPixelY];
CLLocationDegrees maxLat = [self pixelSpaceYToLatitude:topLeftPixelY + scaledMapHeight];
CLLocationDegrees latitudeDelta = -1 * (maxLat - minLat);
// create and return the lat/lng span
MKCoordinateSpan span = MKCoordinateSpanMake(latitudeDelta, longitudeDelta);
return span;
}
Cheers :)
I might be misunderstanding the question, but isn't it as simple as:
- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated {
CGFloat latD = mapView.region.span.latitudeDelta;
CGFloat lngD = mapView.region.span.longitudeDelta;
NSLog(#"This is the latitude delta of the visible map: %f", latD);
NSLog(#"This is the longitude delta of the visible map: %f", lngD);
}
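If you really do want a radius in metres, one option is to approximate it as half the diagonal of the visible region. A rough sketch (my own helper, untested):
- (CLLocationDistance)approximateVisibleRadiusOfMapView:(MKMapView *)mapView {
MKCoordinateRegion region = mapView.region;
CLLocationCoordinate2D center = region.center;
// one corner of the visible region
CLLocationCoordinate2D corner = CLLocationCoordinate2DMake(center.latitude + region.span.latitudeDelta / 2.0, center.longitude + region.span.longitudeDelta / 2.0);
CLLocation *centerLocation = [[CLLocation alloc] initWithLatitude:center.latitude longitude:center.longitude];
CLLocation *cornerLocation = [[CLLocation alloc] initWithLatitude:corner.latitude longitude:corner.longitude];
// distance from the center to the corner, i.e. half the diagonal, in metres
CLLocationDistance radius = [centerLocation distanceFromLocation:cornerLocation];
[centerLocation release];
[cornerLocation release];
return radius;
}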
I have a view with a point on its center.
I have an angle in degrees (or radian, that's not the problem).
I have a circle whose center is the center of the view and whose radius is R.
I'd like to draw:
something (let's say an image) at the point on the circle that sits at the given angle from the vertical position;
an arc from the vertical position above the center, where it intersects the circle, to that point.
How may I do that?
I think you could calculate the image position with the following (note that with cos/sin used this way, the angle is measured from the positive x-axis, i.e. the 3 o'clock position, not from the vertical):
CGPoint center = self.view.center;
float x = radius * cos(angle);
float y = radius * sin(angle);
CGPoint newPoint = CGPointMake(center.x + x, center.y + y);
Let me know if it worked.
As for drawing the arc, you have two points: newPoint calculated above (on the circle, depending on the angle) and the point above the center where the vertical intersects the circle, which is calculated easily:
CGPoint pointAboveCenter = CGPointMake(center.x, center.y - radius); // y decreases upwards in UIKit
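To actually draw the arc, one option is UIBezierPath's arc constructor. A sketch, assuming the center, radius and angle variables from the snippet above and a drawRect: context; angles are in radians measured from the positive x-axis, and -M_PI_2 is straight up in UIKit's flipped coordinate system:
UIBezierPath *arc = [UIBezierPath bezierPathWithArcCenter:center
radius:radius
startAngle:-M_PI_2 // 12 o'clock
endAngle:angle
clockwise:YES]; // flip this if the arc sweeps the wrong way for your angle convention
arc.lineWidth = 2.0;
[[UIColor redColor] setStroke]; // any stroke colour
[arc stroke];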
I'm trying to implement a small radar that plots targets based on latitude and longitude coordinates, similar to the radar in the Layar AR iPhone app. I have the compass and locationManager working to get the lat/lon, heading and distance between two points. However, I'm having trouble plotting the points onto the x-y plane. Could you point me in the right direction (so to speak)?
This is the method that I am using to plot but the results are not correct:
-(void) addTargetIndicatorWithHeading:(float)heading andDistance:(float)distance{
//draw target indicators
//need to convert radians and distance to cartesian coordinates
float radius = 50;
float x0 = 0.0;
float y0 = 0.0;
//convert heading from radians to degrees
float angle = heading * (180/M_PI);
//x-y coordinates
float x1 = (x0 + radius * sin(angle));
float y1 = (y0 + radius * cos(angle));
TargetIndicator *ti = [[TargetIndicator alloc] initWithFrame:CGRectMake(x1, y1, 5, 5)];
[self addSubview:ti];
[ti release];
}
I guess the problem lies in the present view's origin coordinate not being added to your coordinates.
Just modify your x1 and y1 by adding the origin.x and origin.y of the current view to which you add ti as a subview.
I figured out what was wrong, but I don't know the reasoning behind it. First, I should not have been converting the radians to degrees. That gives me the correct positioning, but rotated 180 degrees. So to fix it, I subtract the heading (in radians) from pi.
Here is the solution:
-(void) addTargetIndicatorWithHeading:(float)heading andDistance:(float)distance{
//draw target indicators
//need to convert radians and distance to cartesian coordinates
float radius = 50;
//origin offset
float x0 = 50.0;
float y0 = 50.0;
//keep the heading in radians and mirror it (M_PI - heading) to fix the 180 degree rotation
float angle = M_PI - heading;
float x1 = (x0 + radius * sin(angle));
float y1 = (y0 + radius * cos(angle));
TargetIndicator *ti = [[TargetIndicator alloc] initWithFrame:CGRectMake(x1, y1, 5, 5)];
[self addSubview:ti];
[ti release];
}
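The reason this works, as far as I can tell: UIKit's y-axis points down, so a compass heading measured clockwise from north should map to x = x0 + radius * sin(heading) and y = y0 - radius * cos(heading). Since sin(M_PI - h) = sin(h) and cos(M_PI - h) = -cos(h), using angle = M_PI - heading with the original formulas produces exactly that. A small sketch of the equivalent direct form (my own helper, not from the original post):
// Map a compass heading in radians (clockwise from north) and a distance from the
// radar center (x0, y0) to a point in UIKit's y-down coordinate system.
static CGPoint radarPointForHeading(float heading, float radius, float x0, float y0) {
// sin gives the east/west component, cos the north/south component;
// the y term is subtracted because UIKit's y-axis increases downwards
return CGPointMake(x0 + radius * sinf(heading), y0 - radius * cosf(heading));
}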
How does one get the 4 coordinates for a UIImageView?
I know the CGRect can be obtained and the origin.x and origin.y, but how can all 4 corners be found?
EDIT: I am rotating the UIImageViews, that's why I asked :P
You can add the rectangle's width and height to its origin to get the coordinates of the other 3 corners.
CGRect rect = view.bounds;
CGPoint topLeft = rect.origin;
CGPoint topRight = CGPointMake(rect.origin.x + rect.size.width, rect.origin.y);
CGPoint bottomLeft =CGPointMake(rect.origin.x, rect.origin.y + rect.size.height);
CGPoint bottomRight = CGPointMake(rect.origin.x + rect.size.width,
rect.origin.y + rect.size.height);
Then you could use CGPointApplyAffineTransform to get the transformed coordinates of them under your specified transform.
CGPoint center = view.center;
CGAffineTransform transf = CGAffineTransformMakeTranslation(-rect.size.width/2,
-rect.size.height/2);
transf = CGAffineTransformConcat(transf, view.transform);
// concat (not CGAffineTransformTranslate, which would prepend the translation)
// so the move to the view's center happens last
transf = CGAffineTransformConcat(transf, CGAffineTransformMakeTranslation(center.x, center.y));
topLeft = CGPointApplyAffineTransform(topLeft, transf);
//...
(note: not tested.)
This is my solution. self is a subclass of UIImageView and self.transform is the transform applied to it; leftTopPoint, rightTopPoint, rightBottomPoint and leftBottomPoint are the view's untransformed corner points (in the superview's coordinate system), computed beforehand:
CGAffineTransform transform = CGAffineTransformMakeTranslation(-center.x, -center.y);
transform = CGAffineTransformConcat(transform, self.transform);
CGAffineTransform transform1 = CGAffineTransformMakeTranslation(center.x, center.y);
transform = CGAffineTransformConcat(transform, transform1);
leftTopPoint = CGPointApplyAffineTransform(leftTopPoint, transform);
rightTopPoint = CGPointApplyAffineTransform(rightTopPoint, transform);
rightBottomPoint = CGPointApplyAffineTransform(rightBottomPoint, transform);
leftBottomPoint = CGPointApplyAffineTransform(leftBottomPoint, transform);
You can get the size.width and size.height. Adding those to the x and y will give you the other coordinates.
Whilst these are (of course) relative to the superview, you can use the frame property to obtain a CGRect containing the origin and size of the UIImageView. You can then simply add the relevant size to the relevant origin point to obtain the full set of coordinates.
See the frame section in the UIView class reference for more information.
Construct a rotation matrix (http://en.wikipedia.org/wiki/Rotation_matrix). You should calculate the initial positions of the corners relative to the point which is the center of rotation, store those positions in an array, and keep them all the time. You then calculate the new positions by plugging the angle into a 2x2 rotation matrix and multiplying it with the initial positions, as sketched below.
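A sketch of that approach (a hypothetical helper using the standard 2x2 rotation matrix):
// Rotate a corner point about a given center by `angle` radians.
static CGPoint rotateCornerAboutCenter(CGPoint corner, CGPoint center, CGFloat angle) {
// express the corner relative to the center of rotation
CGFloat dx = corner.x - center.x;
CGFloat dy = corner.y - center.y;
// apply the 2x2 rotation matrix [cos -sin; sin cos]
CGFloat rx = dx * cos(angle) - dy * sin(angle);
CGFloat ry = dx * sin(angle) + dy * cos(angle);
// translate back
return CGPointMake(center.x + rx, center.y + ry);
}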
Well, given you know the angle of rotation, this is the maths to get the y coordinate of the top right corner:
sin(angle of rotation) = height difference in y (yDiff) / width
Therefore if you're rotating the rectangle by 10 deg and it has a width of 20pt:
Sin 10 = yDiff / 20
Which means you can do this:
yDiff = Sin 10 * 20
This gives you the difference in y between the y coordinate of the origin and the y coordinate of the top right corner. Add this value to the current y origin of your rectangle to get the actual y coordinate of your top right corner. The next step is to use Pythagoras on your width and the yDiff to get the xDiff, and do the same (add it to the x coordinate) to get the x coordinate of your right-hand corner. I hope this makes sense.
Now you just need to do it again for each of the other corners. Imagine, if you will, that the rectangle has been rotated through 90 degrees: you can reapply the same logic, except that x is y and vice versa. :) etc
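A quick numeric check of that example (hypothetical values: a 20 pt wide rectangle rotated by 10 degrees):
double angle = 10.0 * M_PI / 180.0; // sin() expects radians
double width = 20.0;
double yDiff = sin(angle) * width; // ~3.47 pt
double xDiff = sqrt(width * width - yDiff * yDiff); // ~19.70 pt, by Pythagoras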