I'm trying to implement a small radar that plots targets based on latitude and longitude coordinates, similar to the radar in the Layar AR iPhone app. I have the compass and locationManager working to get the lat/lon, heading, and distance between two points. However, I'm having trouble plotting the points onto the x-y plane. Could you point me in the right direction (so to speak)?
This is the method that I am using to plot but the results are not correct:
- (void)addTargetIndicatorWithHeading:(float)heading andDistance:(float)distance {
    // draw target indicators
    // need to convert heading (radians) and distance to cartesian coordinates
    float radius = 50;
    float x0 = 0.0;
    float y0 = 0.0;
    // convert heading from radians to degrees
    float angle = heading * (180 / M_PI);
    // x-y coordinates
    float x1 = (x0 + radius * sin(angle));
    float y1 = (y0 + radius * cos(angle));
    TargetIndicator *ti = [[TargetIndicator alloc] initWithFrame:CGRectMake(x1, y1, 5, 5)];
    [self addSubview:ti];
    [ti release];
}
I guess the problem lies in the current view's origin not being added to your coordinates.
Just modify your x1 and y1 by adding the origin.x and origin.y of the view to which you add ti as a subview.
I figured out what was wrong, though at first I didn't understand the reasoning behind it. First, I should not have been converting the radians to degrees: sin() and cos() expect radians. That gave me the correct positioning, but rotated 180 degrees. To fix it, I subtract the heading from PI, which flips the sign of the cosine term and compensates for UIKit's y-axis pointing down.
Here is the solution:
- (void)addTargetIndicatorWithHeading:(float)heading andDistance:(float)distance {
    // draw target indicators
    // convert heading (radians) and distance to cartesian coordinates
    float radius = 50;
    // origin offset
    float x0 = 50.0;
    float y0 = 50.0;
    // keep the heading in radians; subtracting it from M_PI corrects the 180-degree rotation
    float angle = M_PI - heading;
    float x1 = (x0 + radius * sin(angle));
    float y1 = (y0 + radius * cos(angle));
    TargetIndicator *ti = [[TargetIndicator alloc] initWithFrame:CGRectMake(x1, y1, 5, 5)];
    [self addSubview:ti];
    [ti release];
}
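Note that the method still ignores its distance parameter. If you want nearer targets to plot closer to the radar's center, one hedged variation would scale the radius by distance (maxDistance is an assumed maximum radar range, in the same units as distance):

- (void)addTargetIndicatorWithHeading:(float)heading andDistance:(float)distance maxDistance:(float)maxDistance {
    // scale the plot radius by how far away the target is, capped at the radar's edge
    float radius = 50.0f * MIN(distance / maxDistance, 1.0f);
    float angle = M_PI - heading; // same flip as above
    float x1 = 50.0f + radius * sinf(angle);
    float y1 = 50.0f + radius * cosf(angle);
    TargetIndicator *ti = [[TargetIndicator alloc] initWithFrame:CGRectMake(x1, y1, 5, 5)];
    [self addSubview:ti];
    [ti release];
}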
I am showing an image in a UIImageView and I'd like to convert geographic coordinates to x/y values so I can show cities on this image.
This is what I tried, based on my research:
CGFloat height = mapView.frame.size.height;
CGFloat width = mapView.frame.size.width;
int x = (int) ((width/360.0) * (180 + 8.242493)); // Mainz lon
int y = (int) ((height/180.0) * (90 - 49.993615)); // Mainz lat
NSLog(#"x: %i y: %i", x, y);
PinView *pinView = [[PinView alloc]initPinViewWithPoint:x andY:y];
[self.view addSubview:pinView];
which gives me x = 167 and y = 104, but this example should produce x = 73 and y = 294.
mapView is my UIImageView, just for clarification.
So my second try was to use the MKMapKit:
CLLocationCoordinate2D coord = CLLocationCoordinate2DMake(49.993615, 8.242493);
MKMapPoint point = MKMapPointForCoordinate(coord);
NSLog(#"x is %f and y is %f",point.x,point.y);
But this gives me some really strange values:
x = 140363776.241755 and y = 91045888.536491.
So do you have an idea what I have to do to get this working?
Thanks so much!
To make this work you need to know 4 pieces of data:
Latitude and longitude of the top left corner of the image.
Latitude and longitude of the bottom right corner of the image.
Width and height of the image (in points).
Latitude and longitude of the data point.
With that info you can do the following:
// These should roughly box Germany - use the actual values appropriate to your image.
// Note: "min" here refers to the top left corner, so minLat > maxLat because
// latitude decreases from the top of the image to the bottom.
double minLat = 54.8;
double minLong = 5.5;
double maxLat = 47.2;
double maxLong = 15.1;
// Map image size (in points)
CGSize mapSize = mapView.frame.size;
// Determine the map scale (points per degree)
double xScale = mapSize.width / (maxLong - minLong);
double yScale = mapSize.height / (maxLat - minLat);
// Latitude and longitude of city
double spotLat = 49.993615;
double spotLong = 8.242493;
// position of map image for point
CGFloat x = (spotLong - minLong) * xScale;
CGFloat y = (spotLat - minLat) * yScale;
If x or y are negative or greater than the image's size, then the point is off of the map.
This simple solution assumes the map image uses an equirectangular (plate carrée) projection, where all lines of latitude and longitude are straight and evenly spaced. (On a true Mercator map the latitude lines are spaced nonlinearly, so the linear latitude math above would drift.)
Edit:
To convert an image point back to a coordinate, just reverse the calculation:
double pointLong = pointX / xScale + minLong;
double pointLat = pointY / yScale + minLat;
where pointX and pointY represent a point on the image in screen points. (0, 0) is the top left corner of the image.
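As an illustrative round trip, assuming a 640x480 image of the same bounding box (the image size is an assumption; the corner values are the ones from the snippet above):

double minLat = 54.8, minLong = 5.5, maxLat = 47.2, maxLong = 15.1;
CGSize mapSize = CGSizeMake(640, 480);                // assumed image size
double xScale = mapSize.width  / (maxLong - minLong); // ~66.7 points per degree
double yScale = mapSize.height / (maxLat - minLat);   // negative: latitude shrinks downward
// forward: coordinate -> image point (Mainz)
CGFloat x = (8.242493 - minLong) * xScale;            // ~183
CGFloat y = (49.993615 - minLat) * yScale;            // ~304
// backward: image point -> coordinate, recovering the inputs
double pointLong = x / xScale + minLong;              // 8.242493
double pointLat  = y / yScale + minLat;               // 49.993615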
I'm trying to calculate the angle of the click I am making relative to the middle of the screen, but maybe I am confused about how atanf is supposed to work.
CGPoint pt = [self convertTouchToNodeSpace:[touches anyObject]];
float adj = pt.x - 512;
float opposite = pt.y - 384;
float combined = opposite / adj;
float tan = atanf(combined);
but when I try to NSLog tan, I just get some value like 0.1253649.
Thoughts?
The right way to convert a vector to an angle is the atan2 function, which uses the signs of both arguments to return the correct angle in all four quadrants (plain atanf can't tell a vector from its opposite, and it returns radians, not degrees):
float angle = atan2f(pt.y - 384, pt.x - 512) * 180 / M_PI;
PS: Are you using the cocos2d engine? It has a ccpToAngle(...) function.
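For completeness, here's how that drops into the original touch-handling code (the 512/384 screen center and the cocos2d conversion are carried over from the question):

CGPoint pt = [self convertTouchToNodeSpace:[touches anyObject]];
// atan2f uses the signs of both arguments, so it covers all four
// quadrants and copes with a zero x-difference
float radians = atan2f(pt.y - 384.0f, pt.x - 512.0f);
float degrees = radians * 180.0f / M_PI; // 0 deg along +x, counter-clockwise positive
NSLog(@"angle: %f degrees", degrees);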
I have a view with a point at its center.
I have an angle in degrees (or radians, that's not the problem).
I have a circle whose center is the center of the view and whose radius is R.
I'd like to draw:
something (let's say an image) at the point on the circle that sits at the given angle from the vertical position.
an arc from the vertical position above the center, where it intersects the circle, to that point.
How may I do that?
I think you could calculate the image position with:
CGPoint center = self.view.center;
float x = radius * cos(angle);
float y = radius * sin(angle);
CGPoint newPoint = CGPointMake(center.x + x, center.y + y);
Let me know if it worked.
As for drawing an arc, you have two points: newPoint calculated above (on the circle, depending on the angle), and the point above the center where the vertical intersects the circle, which is calculated easily:
CGPoint pointAboveCenter = CGPointMake(center.x, center.y - radius); // minus, because UIKit's y-axis points down
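The answer stops short of drawing the arc itself. A minimal sketch with UIBezierPath, assuming it runs inside a drawRect: override and that angle is measured the same way as in the point calculation above:

// arc from straight up (-M_PI_2 in UIKit's clockwise-from-+x convention)
// around to the target angle
UIBezierPath *arc = [UIBezierPath bezierPathWithArcCenter:center
                                                   radius:radius
                                               startAngle:-M_PI_2
                                                 endAngle:angle
                                                clockwise:YES];
[[UIColor blackColor] setStroke];
[arc stroke];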
I am making an iPhone app where a ball rolls around the screen based on how the user tilts the device. If the device lies flat on the table, the ball should not move; if the device is tilted to stand completely upright, I want the ball to roll straight down at maximum speed. The speed depends on how far from the flat position the device is tilted, and it should also work when the user tilts left, right, up, or any combination of the four. I am using the accelerometer right now; the ball moves and it works okay, but I am not very familiar with physics. If someone has suggestions on how to get this working smoothly, please let me know.
Thanks!
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    float xx = -[acceleration x];
    float yy = [acceleration y];
    float z = -[acceleration z];
    z = 1 - z;
    NSString *zaxis = [NSString stringWithFormat:@"%f", z];
    lblz.text = zaxis;
    lbly.text = [NSString stringWithFormat:@"%f", yy];
    lblx.text = [NSString stringWithFormat:@"%f", xx];
    CGFloat newx;
    CGFloat newy;
    if (yy > 0) {
        newy = ball.center.y - ((1 - yy) * z);
    } else {
        newy = ball.center.y + ((1 - yy) * z);
    }
    if (xx > 0) {
        newx = ball.center.x - ((1 - xx) * z);
    } else {
        newx = ball.center.x + ((1 - xx) * z);
    }
    CGPoint newPoint = CGPointMake(newx, newy);
    ball.center = newPoint;
}
If you want to make it look more realistic and leverage existing work, look at some of the existing physics engines and 2D frameworks - Box2D and Cocos2d, but there are many others.
I think the key thing you are missing here is the difference between acceleration and velocity. You want the amount of tilt to act as an acceleration: each frame, the ball's velocity should change by the acceleration, then the ball's position should change by its velocity.
So just in X it should be something like:
float accelX = acceleration.x;
mVel.x += accelX; // mVel is a member variable you have to store
ball.center = CGPointMake(ball.center.x + mVel.x, ball.center.y); // center is a property, so assign a whole CGPoint
--- More complex version
Now the more I think about it, it might not be the amount of tilt that you want as the acceleration. You might want the amount of tilt to be the target velocity, while still using an acceleration to get there.
mTargetVel.x = acceleration.x;
// Now apply an acceleration to the velocity to move towards the target velocity
if (mVel.x < mTargetVel.x) {
    mVel.x += ACCEL_X; // ACCEL_X is just a constant value that works well for you
} else if (mVel.x > mTargetVel.x) {
    mVel.x -= ACCEL_X;
}
// Now update the position based on the new velocity
ball.center = CGPointMake(ball.center.x + mVel.x, ball.center.y);
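Putting both axes together, a minimal sketch of the simple "tilt as acceleration" variant inside the delegate callback (mVel is an assumed CGPoint ivar, and 0.5f is an arbitrary tuning constant):

- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // integrate tilt into velocity, then velocity into position
    mVel.x += acceleration.x * 0.5f;  // tuning constant, adjust to taste
    mVel.y -= acceleration.y * 0.5f;  // minus: UIKit's y-axis points down
    ball.center = CGPointMake(ball.center.x + mVel.x,
                              ball.center.y + mVel.y);
}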
How does one get the 4 corner coordinates of a UIImageView?
I know the CGRect can be obtained, along with origin.x and origin.y, but how can all 4 corners be found?
EDIT: I am rotating the UIImageViews, that's why I asked :P
You can add the width and height of the rectangle to its origin to get the coordinates of the other 3 corners.
CGRect rect = view.bounds;
CGPoint topLeft = rect.origin;
CGPoint topRight = CGPointMake(rect.origin.x + rect.size.width, rect.origin.y);
CGPoint bottomLeft = CGPointMake(rect.origin.x, rect.origin.y + rect.size.height);
CGPoint bottomRight = CGPointMake(rect.origin.x + rect.size.width,
rect.origin.y + rect.size.height);
Then you could use CGPointApplyAffineTransform to get the transformed coordinates of them under your specified transform.
CGPoint center = view.center;
CGAffineTransform transf = CGAffineTransformMakeTranslation(-rect.size.width/2,
                                                            -rect.size.height/2);
transf = CGAffineTransformConcat(transf, view.transform);
// concat rather than CGAffineTransformTranslate, so the move back to the
// center is applied last (CGAffineTransformTranslate would prepend it)
transf = CGAffineTransformConcat(transf,
                                 CGAffineTransformMakeTranslation(center.x, center.y));
topLeft = CGPointApplyAffineTransform(topLeft, transf);
//...
(note: not tested.)
This is my solution:
self is a subclass of UIImageView, and self.transform is the transform I apply to it:
CGAffineTransform transform = CGAffineTransformMakeTranslation(-center.x, -center.y);
transform = CGAffineTransformConcat(transform, self.transform);
CGAffineTransform transform1 = CGAffineTransformMakeTranslation(center.x, center.y);
transform = CGAffineTransformConcat(transform, transform1);
// start from the untransformed corners, in the superview's coordinates
CGFloat w = self.bounds.size.width, h = self.bounds.size.height;
CGPoint leftTopPoint = CGPointApplyAffineTransform(CGPointMake(center.x - w/2, center.y - h/2), transform);
CGPoint rightTopPoint = CGPointApplyAffineTransform(CGPointMake(center.x + w/2, center.y - h/2), transform);
CGPoint rightBottomPoint = CGPointApplyAffineTransform(CGPointMake(center.x + w/2, center.y + h/2), transform);
CGPoint leftBottomPoint = CGPointApplyAffineTransform(CGPointMake(center.x - w/2, center.y + h/2), transform);
You can get the size.width and size.height; adding those to the x and y will give you the other coordinates (note this only holds while the view isn't rotated).
Whilst these are (of course) relative to the superview, you can use the frame property to obtain a CGRect containing the origin and size of the UIImageView. You can then simply add the relevant size to the relevant origin point to obtain the full set of coordinates.
See the frame section in the UIView class reference for more information.
Construct a rotation matrix (http://en.wikipedia.org/wiki/Rotation_matrix). Calculate the initial positions of the corners relative to the center of rotation, store those positions in an array, and keep them around. You get the new positions by plugging the angle into a 2x2 rotation matrix and multiplying it with the initial positions.
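For illustration, a hedged sketch of that approach (rotatedCorner is a hypothetical helper; the corner is given as an offset from the rotation center):

// Rotate a corner offset (relative to the rotation center) by angle radians,
// then translate back into the superview's coordinate space.
static CGPoint rotatedCorner(CGPoint offset, CGPoint center, CGFloat angle) {
    CGFloat c = cosf(angle), s = sinf(angle);
    // 2x2 rotation matrix [c -s; s c] applied to the offset vector
    return CGPointMake(center.x + offset.x * c - offset.y * s,
                       center.y + offset.x * s + offset.y * c);
}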
Well, given you know the angle of rotation, this is the maths to get the y coordinate of the top right corner:
sin(angle of rotation) = yDiff / width
Therefore, if you're rotating the rectangle by 10 deg and it has a width of 20pt:
sin 10° = yDiff / 20
which means you can do this:
yDiff = sin 10° * 20
This gives you the difference in y between the y coordinate of the origin and the y coordinate of the top right corner. Add this value to the current y origin of your rectangle to get the actual y coordinate of the top right corner. The next step is to use Pythagoras on the width and the yDiff to get the xDiff, and do the same (add it to the x coordinate) to get the x coordinate of the top right corner. I hope this makes sense.
Now you just need to do it again for each other corner - imagine, if you will, that the rectangle has rotated through 90 deg: you can reapply the same logic, except x is y and vice versa, etc. :)
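A worked sketch of that top-right-corner calculation (the 10-degree angle and 20pt width are the example values above; rect is the unrotated rectangle):

double angle = 10.0 * M_PI / 180.0;                  // rotation in radians
double width = 20.0;
double yDiff = sin(angle) * width;                   // ~3.47: vertical offset of the corner
double xDiff = sqrt(width * width - yDiff * yDiff);  // Pythagoras; same as cos(angle) * width
CGPoint topRight = CGPointMake(rect.origin.x + xDiff,
                               rect.origin.y + yDiff);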