CGRectContainsPoint & touches queries - iPhone

I'll get straight to the point. I have static coordinates stored in an array and I want to compare those coordinates with the user's touch.
// touch handling
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchPoint = [touch locationInView:touch.view];
// comparing touches
if (CGRectContainsPoint(CGRectMake(x1, y1, w, h), touchPoint)) {
    // do something
    // this is where I got stuck, because I have 2 more sets of x & y (x2/y2 & x3/y3)
}
I'm stuck here because I don't know how to structure my code. I want to compare 3 saved locations against the user's touches, so that when they hit the right spot, points are added to the score, and when they hit the wrong spot, a life is deducted. Thanks.

If you had points stored like this...
CGPoint p1 = CGPointMake(100,100);
CGPoint p2 = CGPointMake(200,200);
try something like this:
// Get the location of the user's touch
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchPoint = [touch locationInView:touch.view];
float maxDistance = 10;

// Is it in the right place?
if (distanceBetween(touchPoint, p1) < maxDistance) {
    NSLog(@"touched point 1");
} else if (distanceBetween(touchPoint, p2) < maxDistance) {
    NSLog(@"touched point 2");
}
where distanceBetween is a function that looks something like this (just some maths):
// Distance between two CGPoints
float distanceBetween(CGPoint p1, CGPoint p2) {
float dx = p1.x-p2.x;
float dy = p1.y-p2.y;
return sqrt( dx*dx + dy*dy);
}
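To handle your three saved spots with a score and lives, a minimal sketch could look like the following (the savedPoints array and the score/lives variables are hypothetical names, not from your code):

// Loop over the three saved points; score a hit, otherwise deduct a life.
CGPoint savedPoints[3] = { CGPointMake(x1, y1), CGPointMake(x2, y2), CGPointMake(x3, y3) };
float maxDistance = 10;
BOOL hitSomething = NO;
for (int i = 0; i < 3; i++) {
    if (distanceBetween(touchPoint, savedPoints[i]) < maxDistance) {
        score += 1;        // hit the right spot
        hitSomething = YES;
        break;
    }
}
if (!hitSomething) {
    lives -= 1;            // hit the wrong spot
}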
Hope that helps,
Sam

Related

Not letting a UIButton be dragged outside a circle

I am new to iOS.
I have a view in which I have drawn a circle using Quartz Core.
I put one UIButton inside that circle and gave it drag-and-drop functionality.
Now I want a constraint so that the button can't be dragged outside the circle area.
The Touch Drag Outside event handler of the button is:
- (void)draggedOut:(UIControl *)c withEvent:(UIEvent *)ev
{
    if ([viewCanvas pointInside:[[[ev allTouches] anyObject] locationInView:viewCanvas] withEvent:ev])
        c.center = [[[ev allTouches] anyObject] locationInView:viewCanvas];
}
At this point the button can't be dragged outside the rectangular view area.
Thanks for the help.
Try this:
if ([viewCanvas pointInside:[[[ev allTouches] anyObject] locationInView:viewCanvas] withEvent:ev])
{
    UITouch *touch = [[ev touchesForView:c] anyObject];
    CGPoint location = [touch locationInView:c];
    if ((location.x < (viewCanvas.frame.origin.x + viewCanvas.frame.size.width)) &&
        (location.y < (viewCanvas.frame.origin.y + viewCanvas.frame.size.height)))
    {
        c.center = [[[ev allTouches] anyObject] locationInView:viewCanvas];
    }
}
I worked out a solution using this equation:
(x - center_x)^2 + (y - center_y)^2 < radius^2
You have the circle's center (x, y) and its radius. Take the center of the button and check whether it is still inside: compare against x + radius - width/2 when it moves in the x direction and y + radius - height/2 when it moves in the y direction. Check all 4 directions and apply the proper +/- sign.
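A minimal sketch of that check inside the drag handler (my own illustration; circleCenter and circleRadius are hypothetical values you would replace with your own):

// Only follow the finger while the button's center stays inside the circle:
// (x - center_x)^2 + (y - center_y)^2 < radius^2
- (void)draggedOut:(UIControl *)c withEvent:(UIEvent *)ev
{
    CGPoint p = [[[ev allTouches] anyObject] locationInView:viewCanvas];
    CGPoint circleCenter = CGPointMake(160.0, 200.0);            // hypothetical center
    CGFloat circleRadius = 100.0 - c.bounds.size.width / 2.0;    // shrink so the whole button stays inside

    CGFloat dx = p.x - circleCenter.x;
    CGFloat dy = p.y - circleCenter.y;
    if (dx * dx + dy * dy < circleRadius * circleRadius) {
        c.center = p;  // inside the circle: follow the finger
    }
    // otherwise keep the button where it is (or clamp it to the circle's edge)
}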

iPhone: how to quantize the position of a touched object

Hi all you smart people out there!
I want to create a touch interface for an iOS app that allows the user to drag an object around the screen.
However, this object should be restricted to move along the perimeter of a circle, so that if the user tries to drag the object off that path, it sticks to the nearest point of the circle.
I have done some iPhone programming, but my math is poor. Please help!
All you have to do is set the frame of the view to follow the equation of a circle (of the form: (x-a)^2 + (y-b)^2 = r^2). Once you detect the touch point, you can restrict the view's frame according to the x or the y coordinate of the touch point (both ways are the same).
#define circleRadius    60.0   // r in the above eqtn
#define circlesCenter_X 160.0  // a in the above eqtn
#define circlesCenter_Y 200.0  // b in the above eqtn
#define circleCenter_y(x) sqrtf(circleRadius*circleRadius - ((x)-circlesCenter_X)*((x)-circlesCenter_X))

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event touchesForView:firstPieceView] anyObject];
    CGPoint previousLocation = [touch previousLocationInView:self.view];
    CGPoint location = [touch locationInView:self.view];
    CGFloat delta_x = previousLocation.x - location.x; // constrained by x in this eg.
    CGFloat newX = firstPieceView.center.x - delta_x;

    // limit the view's x-coordinate to the range where the circle equation has solutions
    if (newX < circlesCenter_X - circleRadius)
        newX = circlesCenter_X - circleRadius;
    if (newX > circlesCenter_X + circleRadius)
        newX = circlesCenter_X + circleRadius;

    firstPieceView.center = CGPointMake(newX,
        circleCenter_y(newX) * (location.y >= circlesCenter_Y ? 1 : -1) + circlesCenter_Y);
}
EDIT - Better solution:
#define circleRadius    60.0   // r in the above eqtn
#define circlesCenter_X 160.0  // a in the above eqtn
#define circlesCenter_Y 200.0  // b in the above eqtn
#define slope(x,y) (((y)-circlesCenter_Y)/((x)-circlesCenter_X))
#define pointOnCircle_X(m) (circleRadius/(sqrtf((m)*(m) + 1)))

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event touchesForView:self.view] anyObject];
    CGPoint location = [touch locationInView:self.view];
    CGFloat slope;
    CGPoint pointOnCircle;

    if (location.x == circlesCenter_X) { // case for infinite slope
        pointOnCircle.x = circlesCenter_X;
        if (location.y < circlesCenter_Y) {
            pointOnCircle.y = circlesCenter_Y - circleRadius;
        } else {
            pointOnCircle.y = circlesCenter_Y + circleRadius;
        }
    } else {
        slope = slope(location.x, location.y);
        if (location.x < circlesCenter_X) {
            pointOnCircle.x = circlesCenter_X - pointOnCircle_X(slope);
        } else {
            pointOnCircle.x = circlesCenter_X + pointOnCircle_X(slope);
        }
        pointOnCircle.y = slope * (pointOnCircle.x - circlesCenter_X) + circlesCenter_Y;
    }
    firstPieceView.center = pointOnCircle;
}
This can be applied similarly for Android, Blackberry, etc too!
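A simpler alternative sketch of the same idea (not from the original answer): project the touch onto the circle with atan2f, which avoids the slope special cases:

// Snap the view to the nearest point on the circle's perimeter.
CGPoint location = [touch locationInView:self.view];
CGFloat angle = atan2f(location.y - circlesCenter_Y, location.x - circlesCenter_X);
firstPieceView.center = CGPointMake(circlesCenter_X + circleRadius * cosf(angle),
                                    circlesCenter_Y + circleRadius * sinf(angle));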

UIImageView how to track clockwise or counterclockwise user motion

I have successfully implemented rotating a UIImageView using the code from this post:
iPhone - Wheel tracking finger jumps to same starting position?
My question to fellow developers: the user can rotate the image both clockwise and counterclockwise with a touch. Is there a way I can detect in which direction the user is moving the view?
It's important to me because I am making a clock app. I let users move the minute hand from 0 minutes all the way to 60 minutes; when it gets there I move the hour hand forward one hour. But the user can also spin the minute hand counterclockwise, in which case I need to move the hour hand back one hour. Any ideas?
You can set a member variable named "lastPoint" to record the point from the last move;
then you can calculate the direction the next time around.
CGPoint lastPoint;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    lastPoint = [touch locationInView:self.view];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    // you can save the point, then use it next time
    int tInput = [allTouches count] - 1;
    UITouch *touch = [[allTouches allObjects] objectAtIndex:tInput];
    CGPoint location = [touch locationInView:self.view];

    float theAngle     = atan2(location.y - imageX.center.y,  location.x - imageX.center.x);
    float theLastAngle = atan2(lastPoint.y - imageX.center.y, lastPoint.x - imageX.center.x);
    lastPoint = location;

    // compare theAngle & theLastAngle, so you can tell whether it is clockwise or counterclockwise
}
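Where that last comment is, one way to do the comparison (a sketch of my own, not part of the original answer) is to take the signed angle difference and wrap it into (-π, π], so crossing the atan2 discontinuity doesn't flip the result:

// Signed angular change since the last touch point, wrapped into (-pi, pi].
float delta = theAngle - theLastAngle;
if (delta > M_PI)  delta -= 2.0 * M_PI;
if (delta < -M_PI) delta += 2.0 * M_PI;

// In UIKit coordinates (y grows downward), a positive delta means the finger
// is moving clockwise on screen, a negative delta counterclockwise.
if (delta > 0) {
    NSLog(@"rotating clockwise");
} else if (delta < 0) {
    NSLog(@"rotating counterclockwise");
}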

ApplyLinearImpulse() working in one direction only

I am trying to use both b2PrismaticJoint and b2MouseJoint. I need to move my projectile along the x-axis to position it for the target, and I want to swipe vertically (without moving the projectile) to throw it in that direction. I am using ApplyLinearImpulse(), but no matter in which direction I swipe, the impulse is always toward the top-right. The code is:
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *myTouch = [touches anyObject];
    CGPoint location = [myTouch locationInView:[myTouch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    b2Vec2 locationWorld = b2Vec2(location.x/PTM_RATIO, location.y/PTM_RATIO);

    if (_mouseJoint) {
        _world->DestroyJoint(_mouseJoint);
        _mouseJoint = NULL;
    }
    if (_primJoint) {
        _world->DestroyJoint(_primJoint);
        _primJoint = NULL;
    }
    if (hit) {
        _strikerBody->ApplyLinearImpulse(locationWorld, _strikerBody->GetPosition());
    }
}
It looks like your locationWorld vector points in the same direction every time: it is the raw touch position measured from the world origin, and since screen coordinates are always positive, it always points up and to the right. I think you want something like this:
b2Vec2 impulseDirection = locationWorld - _strikerBody->GetPosition();
impulseDirection.Normalize();
const double Force = 10 * _strikerBody->GetMass(); //or anything you want
_strikerBody->ApplyLinearImpulse( Force*impulseDirection, _strikerBody->GetPosition() );
Now the impulse will be applied to the center of _strikerBody, in the direction of the touch (relative to _strikerBody).

Multiple objects in touchesBegan?

I am trying to handle multiple objects (2 UIImageViews) in a touchesBegan method.
I am using the code below, but it isn't working. There are no errors, but the location is just messed up. What should I do instead?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if (image1.image == [UIImage imageNamed:@"ball.png"]) {
        CGPoint location = [touch locationInView:touch.view];
        image1.center = location;
    }
    if (image1.image == [UIImage imageNamed:@"ball2.png"]) {
        CGPoint location = [touch locationInView:touch.view];
        image2.center = location;
    }
}
If you want to identify both image views in touchesBegan, try this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event // here the touch is handled
{
    // get the touch location
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];

    if (CGRectContainsPoint(image1.frame, touchLocation))
    {
        NSLog(@"image1 touched");
        // your logic
    }
    if (CGRectContainsPoint(image2.frame, touchLocation))
    {
        NSLog(@"image2 touched");
        // your logic
    }
}
Hope this helps.
I guess image1.image in your second if condition needs to be image2.image if you want image1 and image2 to share the touched center. But if you need to move the images, you have to do the following:
Check whether the touch point belongs to both objects.
If yes, move both images relatively (i.e., add the moved amount to each image's previous center), rather than making the touch point the image's center.
Example: image1's center is at (x1, y1) and image2's center is at (x2, y2), and the touch point (x3, y3) belongs to both image1 and image2. If the drag moves to (x4, y4), the drag amount is x4 - x3 and y4 - y3 along the x and y directions respectively. Add this drag amount to both images' centers so that each image appears at its new location.
Pseudo code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    float touchXBeginPoint = location.x;
    float touchYBeginPoint = location.y;
    // Now check whether (touchXBeginPoint, touchYBeginPoint) lies in image1 and/or image2,
    // and calculate the offset distance.
    if ( /* both are true */ )
    {
        // Add the offset amount to the image1 and image2 centers.
    }
    else if ( /* image1 touch is true */ )
    {
        // Add the offset amount to the image1 center.
    }
    else if ( /* image2 touch is true */ )
    {
        // Add the offset amount to the image2 center.
    }
}
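A more concrete sketch of that offset approach (my own illustration; the dragOffset and dragging ivars are hypothetical names):

// Hypothetical ivars: CGPoint dragOffset1, dragOffset2; BOOL dragging1, dragging2;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    dragging1 = CGRectContainsPoint(image1.frame, location);
    dragging2 = CGRectContainsPoint(image2.frame, location);
    // Remember how far the touch is from each center, so the images don't jump.
    dragOffset1 = CGPointMake(image1.center.x - location.x, image1.center.y - location.y);
    dragOffset2 = CGPointMake(image2.center.x - location.x, image2.center.y - location.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    if (dragging1)
        image1.center = CGPointMake(location.x + dragOffset1.x, location.y + dragOffset1.y);
    if (dragging2)
        image2.center = CGPointMake(location.x + dragOffset2.x, location.y + dragOffset2.y);
}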
Download the source code of iPhone Game Dev chapter 3 and see how images are being moved.