Position an image onscreen according to the touch's location, limiting the image's location to a circle - iPhone

I have a problem positioning an image according to the touch's location, limited to a circle.
It works for the most part, but if the angle (from the touch's location to the desired location) is less than 0, it positions the image on the wrong side of the circle.
Perhaps it's some maths that I've done wrong.
Anyway, here's the code:
float newHeight, newWidth, centerPointX, centerPointY;
newHeight = -(invertedY.y - (view.frame.origin.y+view.frame.size.height/2));
newWidth = -(invertedY.x - (view.frame.origin.x+view.frame.size.width/2));
float tangent = newHeight/newWidth;
float calculatedAngle = atanf(tangent);
float s, c, d, fX, fY;
d = view.frame.size.width/2+30;
if (calculatedAngle < 0) {
    s = sinf(calculatedAngle) * d;
    c = cosf(calculatedAngle) * d;
} else {
    s = -sinf(calculatedAngle) * d;
    c = -cosf(calculatedAngle) * d;
}
fX = view.center.x + c;
fY = view.center.y + s;
[delegate setPoint:CGPointMake(fX, fY)];
NSLog(@"angle = %.2f", calculatedAngle);
Any help appreciated.

I think the best way to limit the location to a circle is to calculate the vector from the center to the touch location. Compute the vector's length, then divide the vector by that length so it is normalized. Then multiply the normalized vector by the radius of the circle, and finally add this vector to the center to compute the new location.
CGPoint touch, center;
CGPoint vector = CGPointMake(touch.x-center.x, touch.y-center.y);
float length = sqrtf(vector.x*vector.x + vector.y*vector.y);
// Normalize and multiply by radius (r)
vector.x = r * vector.x / length;
vector.y = r * vector.y / length;
[delegate setPoint:CGPointMake(center.x + vector.x, center.y + vector.y)];
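For anyone doing this in Swift, here is a minimal sketch of the same idea (the function name clampToCircle and the choice to leave points that are already inside the circle untouched are my own assumptions, not part of this answer):
import CoreGraphics

/// Clamps `touch` so that it never lies farther than `radius` from `center`.
func clampToCircle(_ touch: CGPoint, center: CGPoint, radius: CGFloat) -> CGPoint {
    let dx = touch.x - center.x
    let dy = touch.y - center.y
    let length = (dx * dx + dy * dy).squareRoot()
    // Points at the center or already inside the circle are left untouched.
    guard length > radius else { return touch }
    // Normalize the vector and scale it back out to the radius.
    return CGPoint(x: center.x + radius * dx / length,
                   y: center.y + radius * dy / length)
}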

Related

Where a vector would intersect the screen if extended towards its direction (Swift)

I'm trying to write a function in Swift which returns the CGPoint where the extension of a vector (which lies within the screen) will intersect the screen's edge. Let's assume that the screen is 800 x 600. It's like the scheme:
The function should have the following parameters:
func calcPoint(start: CGPoint, end: CGPoint) -> CGPoint
start: CGPoint(x: x1, y: y1) - this is the beginning of the vector.
end: CGPoint(x: x2, y: y2) - this is the end point of the vector.
The return point is the one at which the vector intersects the screen (CGPoint(x: x3, y: y3), as shown in the scheme).
The values for the vector's start and end are always points within the screen (the rectangle 0, 0, 800, 600).
EDIT (for Alexander):
Is there a formula which, in the given situation, makes it easy to write the function, without the obvious approach of if ... else ... branches and triangle vertex ratios?
To compute point E you can look at the triangles given by your setup. You have the triangles ABC and DBE. Note that they are similar, so we can set up the following relation using the intercept theorem: AB : AC = DB : DE (where AB stands for the line segment between A and B). In the given setup you know all points except E.
Using the start and end points from the given setup:
In case start and end have the same x- or y-coordinate, the intersection simply lies on the top/bottom or left/right border with that same coordinate.
Using absolute values, it should work for all four corners of your rectangle. Then, of course, you have to consider E being outside of your rectangle; again the same relation can be used: AB : AC = D'B : D'E'. A sketch of the idea follows.
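Since the diagram and the formula image from this answer are not reproduced here, the following is only a rough sketch of the intercept-theorem idea, assuming we extend the segment from start towards end until it reaches the right border at x = width (the point labels do not necessarily match the missing diagram, and the other three borders are handled the same way):
import CoreGraphics

/// Similar-triangles sketch: extend start -> end until x reaches `width`.
/// Only meaningful when end.x != start.x; full edge handling is covered in the answers below.
func extendToRightBorder(start: CGPoint, end: CGPoint, width: CGFloat) -> CGPoint {
    // Ratio of the long horizontal run to the short one (intercept theorem).
    let t = (width - start.x) / (end.x - start.x)
    // The vertical offset scales by the same ratio.
    return CGPoint(x: width, y: start.y + t * (end.y - start.y))
}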
A pure Swift solution for everyone interested (thanks to Ivo Ivanoff):
// Example for iOS
/// The height of the screen
let screenHeight = UIScreen.main.bounds.height
/// The width of the screen
let screenWidth = UIScreen.main.bounds.width
func calculateExitPoint(from anchor : CGPoint, to point: CGPoint) -> CGPoint {
    var exitPoint : CGPoint = CGPoint()
    let directionV: CGFloat = anchor.y < point.y ? 1 : -1
    let directionH: CGFloat = anchor.x < point.x ? 1 : -1
    let a = directionV > 0 ? screenHeight - anchor.y : anchor.y
    let a1 = directionV > 0 ? point.y - anchor.y : anchor.y - point.y
    let b1 = directionH > 0 ? point.x - anchor.x : anchor.x - point.x
    let b = a / (a1 / b1)
    let tgAlpha = b / a
    let b2 = directionH > 0 ? screenWidth - point.x : point.x
    let a2 = b2 / tgAlpha
    exitPoint.x = anchor.x + b * directionH
    exitPoint.y = point.y + a2 * directionV
    if (exitPoint.x > screenWidth) {
        exitPoint.x = screenWidth
    } else if (exitPoint.x < 0) {
        exitPoint.x = 0
    } else {
        exitPoint.y = directionV > 0 ? screenHeight : 0
    }
    return exitPoint
}
Any kind of optimization is welcome ;-)
There is no single formula, because the intersection depends on the starting point's position, the line's slope and the rectangle's size, and it may occur at any of the rectangle's edges.
Here is an approach based on the parametric representation of the line. It works for any slope (including horizontal and vertical), finds which border is intersected first, and calculates the intersection point.
dx = end.x - start.x
dy = end.y - start.y
//parametric equations for reference:
//x = start.x + dx * t
//y = start.y + dy * t
//prerequisites: potential border positions
if dx > 0 then
    bx = width
else
    bx = 0
if dy > 0 then
    by = height
else
    by = 0
//first check for horizontal/vertical lines
if dx = 0 then
    return ix = start.x, iy = by
if dy = 0 then
    return iy = start.y, ix = bx
//in general case find parameters of intersection with horizontal and vertical edge
tx = (bx - start.x) / dx
ty = (by - start.y) / dy
//and get intersection for smaller parameter value
if tx <= ty then
    ix = bx
    iy = start.y + tx * dy
else
    iy = by
    ix = start.x + ty * dx
return ix, iy
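A Swift translation of the pseudocode, as a rough sketch (the function name exitPoint and returning the start point when the direction is zero are my own additions, not part of the answer above):
import CoreGraphics

/// Finds where the ray from `start` through `end` leaves the rectangle (0, 0, width, height).
func exitPoint(start: CGPoint, end: CGPoint, width: CGFloat, height: CGFloat) -> CGPoint {
    let dx = end.x - start.x
    let dy = end.y - start.y
    // Potential border positions, depending on the ray's direction.
    let bx: CGFloat = dx > 0 ? width : 0
    let by: CGFloat = dy > 0 ? height : 0
    // Degenerate and axis-aligned cases.
    if dx == 0 && dy == 0 { return start }           // no direction at all (my assumption)
    if dx == 0 { return CGPoint(x: start.x, y: by) } // vertical line
    if dy == 0 { return CGPoint(x: bx, y: start.y) } // horizontal line
    // Parameters of the intersections with the vertical and horizontal borders.
    let tx = (bx - start.x) / dx
    let ty = (by - start.y) / dy
    // The smaller parameter belongs to the border that is hit first.
    if tx <= ty {
        return CGPoint(x: bx, y: start.y + tx * dy)
    } else {
        return CGPoint(x: start.x + ty * dx, y: by)
    }
}
For example, exitPoint(start: CGPoint(x: 100, y: 100), end: CGPoint(x: 200, y: 150), width: 800, height: 600) extends the vector to (800, 450) on the right edge.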

I have a line from the center point of a circle to another point. I want to find the point where the line intersects the circumference of the circle

I have tried several different solutions but no luck so far.
- (CGPoint)contractLineTemp:(CGPoint)point :(CGPoint)circle :(float)circleRadius {
    CGFloat x, y;
    x = point.x - circle.x;
    y = point.y - circle.y;
    CGFloat theta = atan2(x, y);
    CGPoint newPoint;
    newPoint.x = circle.x + circleRadius * sin(theta);
    newPoint.y = circle.y + circleRadius * cos(theta);
    return newPoint;
}
- (CGPoint)contractLineTemp:(CGPoint)startPoint :(CGPoint)endPoint :(float)scaleBy {
    float dx = endPoint.x - startPoint.x;
    float dy = endPoint.y - startPoint.y;
    float scale = scaleBy * Q_rsqrt(dx * dx + dy * dy);
    return CGPointMake(endPoint.x - dx * scale, endPoint.y - dy * scale);
}
Both of these solutions kind of work. If I draw the line to the center of the circle you can see that it intersects the circle exactly where it should.
http://www.freeimagehosting.net/le5pi
If I use either of the solutions above and draw to the circumference of the circle, then depending on the angle the line no longer goes towards the center of the circle. In the second image the line should be at the middle of the right edge of the circle, going straight right.
http://www.freeimagehosting.net/53ovs
http://www.freeimagehosting.net/sb3b2
Sorry for the links. I am too new to post images directly.
Thanks for your help.
It's easier to treat this as a vector problem. Your second approach is close, but you don't correctly scale the vector between the two points. It's easier to work with a normalized vector in this case, although you have to assume that the distance between the two points on the line is non-zero.
Given:
double x0 = CIRC_X0; /* x-coord of center of circle */
double y0 = CIRC_Y0; /* y-coord of center of circle */
double x1 = LINE_X1; /* x-coord of other point on the line */
double y1 = LINE_Y1; /* y-coord of other point on the line */
Then the vector between the two points is (vx,vy):
double vx = x1 - x0;
double vy = y1 - y0;
It's easier to work with a unit vector, which we can get by normalizing (vx,vy):
double vmag = sqrt(vx*vx + vy*vy);
vx /= vmag; /* Assumption is vmag > 0 */
vy /= vmag;
Now, any point along the line can be described as:
x0 + dist * vx
y0 + dist * vy
where dist is the distance from the center. The intersection of the circle and the line must be a distance of CIRC_RADIUS from the center, so:
double x_intersect = x0 + CIRC_RADIUS * vx;
double y_intersect = y0 + CIRC_RADIUS * vy;
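The same normalized-vector idea as a small Swift sketch, in case that is handier (the function name pointOnCircumference and the nil return for coincident points are my own assumptions):
import CoreGraphics

/// Returns the point where the line from `center` towards `point` crosses a circle of `radius`.
func pointOnCircumference(center: CGPoint, towards point: CGPoint, radius: CGFloat) -> CGPoint? {
    let vx = point.x - center.x
    let vy = point.y - center.y
    let length = (vx * vx + vy * vy).squareRoot()
    // With both points coincident there is no direction to follow.
    guard length > 0 else { return nil }
    return CGPoint(x: center.x + radius * vx / length,
                   y: center.y + radius * vy / length)
}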
I think there may be a convention conflict regarding what theta, x and y are. The atan2 function yields values in the range -pi..pi, using the convention that theta is the angle growing from the X axis towards Y. However, you are treating theta as the angle from Y to X.
Try changing the code:
CGFloat theta = atan2(y, x);
CGPoint newPoint;
newPoint.x = circle.x + circleRadius * cos(theta);
newPoint.y = circle.y + circleRadius * sin(theta);
Although your formulae are consistent within a coordinate system, they may conflict with the screen/display device's coordinate system.

How to get angle between two POI?

How do I calculate the angle in degrees between the coordinates of two POIs (points of interest) on an iPhone map application?
I'm guessing you are trying to calculate the angle in degrees between the coordinates of two points of interest (POIs).
Calculating the arc of a great circle:
+(float) greatCircleFrom:(CLLocation*)first
                      to:(CLLocation*)second {
    int radius = 6371; // 6371 km is the radius of the earth
    // The haversine formula works in radians, so convert the coordinates first
    float lat1 = first.coordinate.latitude * M_PI / 180;
    float lat2 = second.coordinate.latitude * M_PI / 180;
    float dLat = lat2 - lat1;
    float dLon = (second.coordinate.longitude - first.coordinate.longitude) * M_PI / 180;
    float a = pow(sin(dLat/2), 2) + cos(lat1) * cos(lat2) * pow(sin(dLon/2), 2);
    float c = 2 * atan2(sqrt(a), sqrt(1-a));
    float d = radius * c; // distance in km
    return d;
}
Another option is to pretend you are working in Cartesian coordinates (faster, but not without error over long distances):
+(float)angleFromCoordinate:(CLLocationCoordinate2D)first
               toCoordinate:(CLLocationCoordinate2D)second {
    float deltaLongitude = second.longitude - first.longitude;
    float deltaLatitude = second.latitude - first.latitude;
    float angle = (M_PI * .5f) - atan(deltaLatitude / deltaLongitude);
    if (deltaLongitude > 0)      return angle;
    else if (deltaLongitude < 0) return angle + M_PI;
    else if (deltaLatitude < 0)  return M_PI;
    return 0.0f;
}
If you want the result in degrees instead of radians, you have to apply the following conversion:
#define RADIANS_TO_DEGREES(radians) ((radians) * 180.0 / M_PI)
You are calculating the 'bearing' from one point to another here. There is a whole bunch of formulae for that, and for lots of other geographic quantities like distance and cross-track error, on this web page:
http://www.movable-type.co.uk/scripts/latlong.html
The formulae are given in several formats, so you can easily convert them to whatever language you need for your iPhone. There are also JavaScript calculators so you can check that your code gets the same answers as theirs.
If the other solutions don't work for you, try this:
- (int)getInitialBearingFrom:(CLLocation *)first
                          to:(CLLocation *)second
{
    float lat1 = [self degreesToRad:first.coordinate.latitude];
    float lat2 = [self degreesToRad:second.coordinate.latitude];
    float lon1 = [self degreesToRad:first.coordinate.longitude];
    float lon2 = [self degreesToRad:second.coordinate.longitude];
    float dLon = lon2 - lon1;
    float y = sin(dLon) * cos(lat2);
    float x1 = cos(lat1) * sin(lat2);
    float x2 = sin(lat1) * cos(lat2) * cos(dLon);
    float x = x1 - x2;
    float bearingRadRaw = atan2f(y, x);
    float bearingDegRaw = bearingRadRaw * 180 / M_PI;
    int bearing = ((int)bearingDegRaw + 360) % 360; // map from +-180 deg to 0..360 deg
    return bearing;
}
For final bearing, simply take the initial bearing from the end point to the start point and reverse it (using θ = (θ+180) % 360).
You need these 2 helpers:
-(float)radToDegrees:(float)radians
{
    return radians * 180 / M_PI;
}
-(float)degreesToRad:(float)degrees
{
    return degrees * M_PI / 180;
}
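For reference, here is the same initial-bearing formula as a Swift sketch (the function name initialBearing is my own; the math is the standard atan2-based bearing formula that the Objective-C code above implements):
import Foundation
import CoreLocation

/// Initial bearing (forward azimuth) from `first` to `second`, in degrees in 0..<360.
func initialBearing(from first: CLLocationCoordinate2D, to second: CLLocationCoordinate2D) -> Double {
    let lat1 = first.latitude * .pi / 180
    let lat2 = second.latitude * .pi / 180
    let dLon = (second.longitude - first.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let bearingDegrees = atan2(y, x) * 180 / .pi
    // atan2 yields -180..180; shift it into the 0..<360 range.
    return (bearingDegrees + 360).truncatingRemainder(dividingBy: 360)
}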

Box2d Calculating Trajectory

I'm trying to make physics bodies generated at a random position with a random velocity hit a target. I gleaned this code from the web (it was originally written for Chipmunk) and slightly modified it to run in Box2D:
+ (CGPoint) calculateShotForTarget:(CGPoint)target from:(CGPoint)launchPos with:(float)velocity
{
    float xp = target.x - launchPos.x;
    float y = target.y - launchPos.y;
    float g = 20;
    float v = velocity;
    float angle1, angle2;
    float tmp = pow(v, 4) - g * (g * pow(xp, 2) + 2 * y * pow(v, 2));
    if (tmp < 0) {
        NSLog(@"No Firing Solution");
    } else {
        angle1 = atan2(pow(v, 2) + sqrt(tmp), g * xp);
        angle2 = atan2(pow(v, 2) - sqrt(tmp), g * xp);
    }
    CGPoint direction = CGPointMake(cosf(angle1), sinf(angle1));
    CGPoint force = CGPointMake(direction.x * v, direction.y * v);
    NSLog(@"force = %@", NSStringFromCGPoint(force));
    NSLog(@"direction = %@", NSStringFromCGPoint(direction));
    return force;
}
The problem is I don't know how to apply this to my program. I have a gravity of -20 for y, but putting 20 for g and a lower velocity like 10 for v gets me nothing but "No Firing Solution".
What am I doing wrong?
A lower velocity of 10 is never going to work: the projectile doesn't have enough power to travel the distance.
The error in the calculation is that everything is in meters except the distance calculations, which are in pixels!
Changing the code to this fixed the crazy velocities I was getting:
+ (CGPoint) calculateShotForTarget:(CGPoint)target from:(CGPoint)launchPos with:(float)velocity
{
    float xp = (target.x - launchPos.x) / PTM_RATIO;
    float y = (target.y - launchPos.y) / PTM_RATIO;
    float g = 20;
    float v = velocity;
    float angle1, angle2;
    float tmp = pow(v, 4) - g * (g * pow(xp, 2) + 2 * y * pow(v, 2));
    if (tmp < 0) {
        NSLog(@"No Firing Solution");
        return CGPointZero; // bail out, otherwise angle1 would be used uninitialized below
    } else {
        angle1 = atan2(pow(v, 2) + sqrt(tmp), g * xp);
        angle2 = atan2(pow(v, 2) - sqrt(tmp), g * xp);
    }
    CGPoint direction = CGPointMake(cosf(angle1), sinf(angle1));
    CGPoint force = CGPointMake(direction.x * v, direction.y * v);
    NSLog(@"force = %@", NSStringFromCGPoint(force));
    NSLog(@"direction = %@", NSStringFromCGPoint(direction));
    return force;
}
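For context, tmp in the code above is the discriminant of the standard projectile launch-angle equation (standard ballistics, not something specific to this answer). In LaTeX form:
\tan\theta = \frac{v^2 \pm \sqrt{v^4 - g\,(g\,x^2 + 2\,y\,v^2)}}{g\,x}
A real launch angle only exists when v^4 >= g (g x^2 + 2 y v^2). With x and y measured in pixels instead of meters, the g x^2 term becomes huge and the square root goes negative for any moderate v, which is exactly why the unscaled code kept printing "No Firing Solution".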

2d collision between line and a point

I'm trying to understand collision detection in a 2D world. I recently found this tutorial: http://www.gotoandplay.it/_articles/2003/12/bezierCollision.php. I have a question which has puzzled me a lot: in the Flash demo, the ball drops through without responding if I try to swap the starting and end points.
Can someone explain to me how the simulation works?
I have modified the sample code. It works perfectly until the start and end points are swapped. Here is the same code in Objective-C:
Thanks in advance.
-(void)render:(ccTime)dt {
    if (renderer)
    {
        CGPoint b = ball.position;
        float bvx = ball.vx;
        float bvy = ball.vy;
        bvx += .02;
        bvy -= .2;
        b.x += bvx;
        b.y += bvy;
        float br = ball.contentSize.width/2;
        for ( int p = 0 ; p < [map count] ; p++ ) {
            line *l = [map objectAtIndex:p];
            CGPoint p0 = l.end;
            CGPoint p1 = l.start;
            float p0x = p0.x, p0y = p0.y, p1x = p1.x, p1y = p1.y;
            // get Angle //
            float dx = p0x - p1x;
            float dy = p0y - p1y;
            float angle = atan2( dy , dx );
            float _sin = sin ( angle );
            float _cos = cos ( angle );
            // rotate p1 ( need only 'x' ) //
            float p1rx = dy * _sin + dx * _cos + p0x;
            // rotate ball //
            float px = p0x - b.x;
            float py = p0y - b.y;
            float brx = py * _sin + px * _cos + p0x;
            float bry = py * _cos - px * _sin + p0y;
            float cp = ( b.x - p0x ) * ( p1y - p0y ) - ( b.y - p0y ) * ( p1x - p0x );
            if ( bry > p0y - br && brx > p0x && brx < p1rx && cp > 0 ) {
                // calc new Vector //
                float vx = bvy * _sin + bvx * _cos;
                float vy = bvy * _cos - bvx * _sin;
                vy *= -.8;
                vx *= .98;
                float __sin = sin ( -angle );
                float __cos = cos ( -angle );
                bvx = vy * __sin + vx * __cos;
                bvy = vy * __cos - vx * __sin;
                // calc new Position //
                bry = p0y - br;
                dx = p0x - brx;
                dy = p0y - bry;
                b.x = dy * __sin + dx * __cos + p0x;
                b.y = dy * __cos - dx * __sin + p0y;
            }
        }
        ball.position = b;
        ball.vx = bvx;
        ball.vy = bvy;
        if ( b.y < 42 )
        {
            ball.position = ccp(50, size.height - 42);
            ball.vx = .0f;
            ball.vy = .0f;
        }
    }
}
The order of the points defines an orientation on the curve. If the start point is on the left and the end point on the right, then the curve is oriented so that "up" points above the curve. However, if you swap the start/end points the curve is oppositely oriented, so now "up" actually points below the curve.
When your code detects a collision and then corrects the velocity it is using the curve's orientation. That is why when the ball drops on the curve with the start/end points swapped it appears to jump through the curve.
To correct this your collision resolution code should check which side of the curve the ball is on (with respect to the curve's orientation), and adjust accordingly.
If you swap l.end and l.start, the check effectively applies to the line outside the segment (l.start, l.end), because all of the values here are signed.
The algorithm rotates the plane so that the line becomes horizontal while one of the segment's endpoints stays fixed. After that it is easy to tell whether the ball touches the line. If it does, its velocity changes: in the rotated plane the y-component is simply reversed (and damped), and then everything is rotated back so that the line is no longer horizontal.
In fact this is not a very good implementation. All of this can be done without sin and cos, using just vectors.
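As a rough illustration of the vector-only idea (my own sketch, not the tutorial's code): reflect the velocity about the line's normal instead of rotating the whole plane. The helper name reflectVelocity and the default bounce/friction factors are assumptions that mirror the -.8 and .98 used above.
import CoreGraphics

/// Reflects `velocity` off a surface whose direction is `lineStart` -> `lineEnd`,
/// damping the normal component (bounce) and the tangential component (friction).
func reflectVelocity(_ velocity: CGVector, lineStart: CGPoint, lineEnd: CGPoint,
                     bounce: CGFloat = 0.8, friction: CGFloat = 0.98) -> CGVector {
    // Unit tangent along the line.
    let dx = lineEnd.x - lineStart.x
    let dy = lineEnd.y - lineStart.y
    let len = (dx * dx + dy * dy).squareRoot()
    guard len > 0 else { return velocity }
    let tx = dx / len, ty = dy / len
    // Unit normal (perpendicular to the line).
    let nx = -ty, ny = tx
    // Decompose the velocity into tangential and normal components (dot products).
    let vt = velocity.dx * tx + velocity.dy * ty
    let vn = velocity.dx * nx + velocity.dy * ny
    // Reverse and damp the normal part, keep a slightly damped tangential part.
    let newVt = vt * friction
    let newVn = -vn * bounce
    return CGVector(dx: newVt * tx + newVn * nx, dy: newVt * ty + newVn * ny)
}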