Strange translations of lat/long coordinates to screen position - iphone

I have a map in my app that displays 2 kinds of annotations: locations and clusters of locations. When I zoom in on a cluster, it expands to show the locations contained within it. When these locations are added to the map, the coordinate of their parent cluster is stored in the annotation object. What I want to do is have an animation so that, when these locations are added, they spread out from their parent cluster's location. Here is my code for mapView:didAddAnnotationViews:
- (void)mapView:(MKMapView *)mapView didAddAnnotationViews:(NSArray *)views
{
    for (MKAnnotationView *aV in views)
    {
        if ([aV.annotation isKindOfClass:[PostLocationAnnotation class]] &&
            ((PostLocationAnnotation *)aV.annotation).hasParent)
        {
            CLLocationCoordinate2D startCoordinate = ((PostLocationAnnotation *)aV.annotation).parentCoordinate;
            CGPoint startPoint = [ffMapView convertCoordinate:startCoordinate toPointToView:self.view];
            CGRect endFrame = aV.frame;
            aV.frame = CGRectMake(startPoint.x, startPoint.y, aV.frame.size.width, aV.frame.size.height);
            [UIView beginAnimations:nil context:NULL];
            [UIView setAnimationDuration:1.00];
            [UIView setAnimationCurve:UIViewAnimationCurveEaseInOut];
            [aV setFrame:endFrame];
            [UIView commitAnimations];
            ((PostLocationAnnotation *)aV.annotation).hasParent = NO;
        }
    }
}
This seems to make the map points fly in from some location far outside the focus of the view. Through debugging I have found that there is a massive discrepancy between the values of endFrame.origin and startPoint: for one particular aV, endFrame.origin comes up as (32657, 21781) (both CGFloats) and startPoint comes up as (159.756256, 247.213226) (again, both CGFloats). I am assuming that endFrame.origin is the correct value, as the points are ending up where I want them; they're just coming from somewhere far away. I've used the convertCoordinate:toPointToView: method in a lot of other parts of my code and had no problems. I've also tried multiplying the X and Y values of startPoint by various values, but a single coefficient doesn't hold for every value of startPoint. Any idea what's going on?

The annotation view frames appear to be in an internal coordinate space that depends on the current zoom level and is offset from screen coordinates. To compensate for this, offset startPoint by annotationVisibleRect.origin.
Also, when calling convertCoordinate:toPointToView:, I think it's safer to convert to the map view's own coordinates instead of self.view, since the map view might be a different size than the container view.
Try the following changes:
CGPoint startPoint = [ffMapView convertCoordinate:startCoordinate
                                    toPointToView:ffMapView];
// BTW, using the passed mapView parameter instead of referencing the ivar
// would make the above line easier to re-use.
CGRect endFrame = aV.frame;
aV.frame = CGRectMake(startPoint.x + mapView.annotationVisibleRect.origin.x,
                      startPoint.y + mapView.annotationVisibleRect.origin.y,
                      aV.frame.size.width,
                      aV.frame.size.height);

Related

ios MapKit, following a path

I have a path rendering on my map. I have a series of waypoints and I'm simulating movement through them.
When changing the display region, it appears that my coordinate, when converted to a CGPoint, is floored (floorf) by Apple's implementation. This causes a very jittery appearance instead of a smooth one.
Is there any way to get around this?
[m_mapView setRegion: MKCoordinateRegionMake(coord, MKCoordinateSpanMake(0, 0))];
The map then tries to center on this given point. However, the point may not be pixel-aligned, as can be seen with the following call:
CGPoint point = [m_mapView convertCoordinate:coord toPointToView:m_mapView];
Thus the map view floors the center point's result to pixel-align the underlying map.
I make the view larger than the screen to account for the offset and to avoid clipping.
I simulate a point moving along the route and place it using an annotation and then center on that point at 30 frames per second.
Assume coord is the position of the moving point.
[m_mapView setCenterCoordinate: coord];
CGPoint p = [m_mapView convertCoordinate:m_mapView.centerCoordinate toPointToView:m_mapView];
CGPoint p1 = [m_mapView convertCoordinate:coord toPointToView:m_mapView];
CGPoint offset = CGPointMake(p.x - p1.x, p.y - p1.y);
CGRect frame = CGRectInset(CGRectMake(0.0f, 0.0f, 1024.0f, 1024.0f), -50.0f, -50.0f);
frame.origin.x += offset.x;
frame.origin.y += offset.y;
[m_mapView setFrame: frame];

Setting boundary conditions for object to traverse a path through dragging the object

I am developing an application in which there is one draggable object. The user can drag this object along a specified path only, not outside it.
To achieve this I had to set boundary conditions so that if the touch point is beyond them, the object does not move.
My problem is that my object's size is, let's say, 30x30.
If I touch the object near its top (its "hair") and drag it down, I get one value; but if I touch the same object lower down (its "chin") and move it to the same final position, the touch points are different from the previous case.
So I am not able to understand how to set the boundary conditions: if I use the values from the first case, the drag in the second case will not move, which is wrong; and if I use the second case's values, the object in the first case will be able to move below the boundary.
I am not using cocos2D. It is a simple iPad application.
For more understanding I have added the image: if you touch the green circle on its left portion and move towards the right, the last value is around 760, but if you touch on the right portion and move towards the right, the last point comes out as around 780. The same happens if the object is dragged vertically.
My basic requirement is that the object should only move inside the path shown in the image, not outside it.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Retrieve the touch point
    CGPoint pt = [[touches anyObject] locationInView:gamePice];
    startLocation = pt;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint pt1 = [[touches anyObject] locationInView:gamePice];
    CGPoint pt = [[touches anyObject] locationInView:self.view];
    NSLog(@"moved=%f|%f", pt.x, pt.y);
    if (pt.x >= 306 && pt.y >= 148 && pt.x <= 720 && pt.y <= 175) { // these conditions how to set
        CGRect frame = [gamePice frame];
        frame.origin.x += pt1.x - startLocation.x;
        frame.origin.y += pt1.y - startLocation.y;
        [gamePice setFrame:frame];
    }
}
Please suggest how to implement this feature or what approach should I follow.
Make an outer CGRect, let's say CGRectMake(306, 148, 400, 100); that's your path rect. You may need multiple rects for that, but for now start with one.
Make a new frame for gamePice according to the touch location, let's say like this:
CGRect frame = [gamePice frame];
frame.origin.x += pt1.x - startLocation.x;
frame.origin.y += pt1.y - startLocation.y;
then your condition will be:
if (CGRectContainsRect(outerRect, frame))
{
    [gamePice setFrame:frame];
}
// else don't move...

Co-ordinates of the four points of a uiview which has been rotated

Is it possible to get this? If so, can anyone please tell me how?
Get the four points of the frame of your view (view.frame)
Retrieve the CGAffineTransform applied to your view (view.transform)
Then apply this same affine transform to the four points using CGPointApplyAffineTransform (and sibling functions from the CGAffineTransform Reference)
CGPoint topLeft = view.bounds.origin;
topLeft = [[view superview] convertPoint:topLeft fromView:view];
CGPoint topRight = CGPointMake(view.bounds.origin.x + view.bounds.size.width, view.bounds.origin.y);
topRight = [[view superview] convertPoint:topRight fromView:view];
// ... likewise for the other points
The first point is in the view's own coordinate space, which is always "upright". The next statement then finds the corresponding point in the parent view's coordinate space. Note that for an un-transformed view, this would be equal to view.frame.origin. The above calculations give the equivalent of the corners of view.frame for a transformed view.

How to get UIView's Position in DrawRect method

I have 4 views and I am drawing circles inside these views. The user is able to move these views. How can I get the position of each view?
If your views are instance variables in your class you can use view.frame or view.center but your question is quite low on details so I'm just shooting in the dark.
CGPoint origin = view.frame.origin;
CGPoint center = view.center;
CGRect frame = view.frame;
All these things can be used to get what you want.

(iphone) how to set view.center when detaching a view from scroll view and adding it to another view?

I'd like to move a view from a scroll view to a UIView.
I'm having trouble changing its center (or frame) so that it remains in the same position on screen (but in a different view, possibly the superview of the scroll view).
How should I convert the view's center/frame?
Thank you.
EDIT:
CGPoint oldCenter = dragView.center;
CGPoint newCenter = [dragView convertPoint: oldCenter toView: self.navigationView.contentView];
dragView.center = newCenter;
[self.navigationView.contentView addSubview: dragView];
I can also use the (NSSet *)touches since I'm in touchesBegan:.
I was having a hard time making it work; the doc wasn't so clear to me.
You can use convertPoint:toView: method of UIView. It is used to convert a point from one view's coordinate system to another. See Converting Between View Coordinate Systems section of UIView class reference. There are more methods available.
-edit-
You are using the wrong point when calling the convertPoint: method. The given point should be in dragView's coordinate system, whereas dragView.center is in its superview's coordinate system.
Use the following point and it should give you the center of dragView in its own coordinate system.
CGPoint p;
p = CGPointMake(dragView.bounds.size.width * 0.5, dragView.bounds.size.height * 0.5);