touchesBegan updating each frame - Swift

I am making a game where there is a button: while the button is pressed (touched), the player moves forward, and when the button is released, the player stops moving.
I can do this with the touchesBegan and touchesEnded functions, but that is not what I want, since I am going to add more buttons and enable multitouch. There must be an easier way.
Does anyone know how?

How to retrieve the touch location in Swift:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let location = touch.location(in: self)
    // `location` is the touch point in this view's coordinate space
}
You can do the same in all the other touch methods (touchesMoved, touchesEnded, touchesCancelled).

Here is the answer for Objective-C. Being familiar with frames, locations on screen, layer properties, etc. is highly recommended, since this solution works by checking whether the user's touch coincides with the location of an element on the screen (which matters once you deal with portrait vs. landscape orientation, animations, and so on).
Hope it helps.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    if (CGRectContainsPoint(aViewForExample.frame, touchLocation)) {
        // the user has touched within this element, so what's next?
    }
}
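Building on the frame check above, here is a rough Objective-C sketch of the original "move while pressed" idea with multitouch. It is only a sketch under assumptions: like the Swift answer, it lives in a UIView subclass with multipleTouchEnabled set to YES, and `forwardButton` plus the BOOL property `movingForward` are hypothetical names (your game loop, e.g. a CADisplayLink callback, would read the flag each frame).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Check every new finger against the button's frame
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:self];
        if (CGRectContainsPoint(self.forwardButton.frame, location)) {
            self.movingForward = YES; // game loop keeps moving the player while this is YES
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:self];
        if (CGRectContainsPoint(self.forwardButton.frame, location)) {
            self.movingForward = NO; // finger lifted off the button
        }
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event]; // treat cancellation like a lift
}
A more robust version would remember which UITouch started on which button, since a finger can slide off a button before lifting.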

Related

How to make 'touchesBegan' method work for a specific view?

On my Add Contact page I have a view, a scroll view on top of it, and another view on top of that; in that last view I have the text boxes, etc. I have implemented the touchesBegan method, but it is called only for the view at the bottom. How do I point that method at another view, i.e. the view at the top?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.AddView endEditing:YES];
}
One way you can do this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([touch view] == image1)
    {
        // Action
    }
}
Please note: since you are using a UIScrollView, you might not be able to get the touches methods for views inside it. In that case you may have to use a UIGestureRecognizer instead (see the Objective-C sketch after the Swift version below).
Here is the answer in Swift:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    if touch.view == myView {
        // Code
    }
}
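As a rough sketch of the gesture-recognizer route mentioned above (useful when a UIScrollView swallows the touches methods), you can attach a UITapGestureRecognizer to the view you care about. `image1` is the same view as in the code above; `handleTap:` is a hypothetical action method, and the setup is assumed to happen in the view controller's viewDidLoad.
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTap:)];
    image1.userInteractionEnabled = YES; // UIImageViews ignore touches by default
    [image1 addGestureRecognizer:tap];
}

- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint location = [recognizer locationInView:recognizer.view];
    // react to the tap on image1 here, e.g. end editing
}
Gesture recognizers also work inside scroll views, which is why they are usually the easier option here.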
First, check that the userInteractionEnabled property of all the controls involved is set to YES.
Then check the frame of your bottom view and make sure it is not lying over those views.
After that you can check the condition below and do something in the touch event of the particular control:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:viewBoard]; // point in viewBoard's coordinates (not used below)
    if ([touch.view isKindOfClass:[UIImageView class]])
    {
        UIImageView *tempImage = (UIImageView *)touch.view;
        if (tempImage.tag == yourImageTag)
        {
            // write your code here
        }
    }
}
Try this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch1 = [touches anyObject];
    // Use the coordinate space of YourView's superview, since a view's frame is expressed in it
    CGPoint touchLocation = [touch1 locationInView:self.view];
    if (CGRectContainsPoint(YourView.frame, touchLocation))
    {
        // Do stuff.
    }
}
Try:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event touchesForView:vTouch] anyObject];
    if (!touch)
        return;
    CGPoint pointNow = [touch locationInView:otherView];
    // code here, using pointNow
}
If your view is a parent view, then you can use this code:
if let touch = touches.first {
    let position = touch.location(in: yourView)
    if yourView.bounds.contains(position) {
        // OK.
        // (yourView.frame.contains(...) would also work, but only with a point
        // expressed in the superview's coordinate space.)
    }
}
Referring to the Apple documentation on UIResponder: all UIView objects (which includes UIWindow), the UIApplication object, and UIViewController objects are instances of UIResponder. To handle a specific type of event, a responder must override the corresponding methods.
In our case touches are our type of event, so our responder should implement the following methods:
touchesBegan(_:with:), touchesMoved(_:with:), touchesEnded(_:with:), and touchesCancelled(_:with:)
Since we only wish to know when a user has touched a particular view, we only need to implement touchesBegan(_:with:). Because we are not overriding the other methods, we must call super.touchesBegan(touches, with: event); if we were overriding ALL of the other methods, we wouldn't be required to call super.
Looking at touchesBegan(_:with:), the parameter touches is a set of UITouch instances. Each instance represents a touch in the starting phase of the event, which is represented by the parameter event.
For touches in a view, this set contains only one touch by default, so touches.first is the only UITouch instance in the set. We then access its view property, which represents the view or window in which the touch occurred. Lastly we compare the view that has been touched with your desired view.
It should be noted that if you wish to receive multiple touches, you must set the view's isMultipleTouchEnabled property to true. The set of touches will then have more than one UITouch instance and you will have to handle that accordingly.
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    if let touch = touches.first, touch.view == myView {
        // Do something
    }
}

Determine how far a drag point is

I have a UIImageView and I am trying to determine, when a drag is performed, how far that drag is from the origin. I currently have this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        // user tapped inside the image
    } else {
        // user didn't tap inside the image
    }
}
The image itself does not move, but a person can take their finger, click on the image and then drag their finger. I am just trying to determine that drag distance.
If you want to calculate the distance, you need to remember the point (store it somewhere) in touchesBegan if the user tapped on your image.
Then in touchesMoved or touchesEnded you will be able to get the current point and calculate the distance to your original point.
If you need the distance from the UIImageView's origin instead, you can call [touch locationInView:myImage].
I also suggest you use the UIGestureRecognizer classes instead of handling touches yourself; they are simpler to implement.
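A minimal sketch of that store-then-compare idea, under some assumptions: `_dragStart` is a hypothetical CGPoint ivar, and `myimage` (the image view from the question) is assumed to be a direct subview of self.view.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(myimage.frame, location)) {
        _dragStart = location; // remember where the drag started
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    // Straight-line distance from the starting point, in points
    CGFloat distance = hypot(location.x - _dragStart.x, location.y - _dragStart.y);
    NSLog(@"dragged %f points from the start point", distance);
}
The same calculation in touchesEnded gives the total drag distance. A real implementation would also track whether the drag actually started on the image, and a UIPanGestureRecognizer's translationInView: would hand you the offset directly.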

iPhone SDK - Where did I finish touching the screen?

Is there any way for the touchesEnded method to tell me the position of where I lifted my finger?
Also, can it tell me which finger was lifted if I had 2 (or more) on the device? (e.g. if I put finger 1 down, then finger 2 down, then lifted finger 2 but held down finger 1, could it tell me that I have ended finger 2?)
Thanks
Jon
Look at the touch methods that are inherited by UIViews; specifically, touchesEnded:withEvent: will tell you when and where the touches ended (i.e. where you lifted your finger).
Just override these methods in your view controller like this and you can find out where the touch ended:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Get a touch
    UITouch *touch = [touches anyObject];
    // Where did it end?
    CGPoint endedAt = [touch locationInView:[self view]];
    ...
The touch object will be the same (isEqual:) as the touch that was sent in the touchesBegan:withEvent: method. This should let you track multiple touches: store the touches you are interested in in your touchesBegan and compare them in your touchesEnded.
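A rough sketch of that tracking idea, assuming a hypothetical NSMutableSet ivar `_activeTouches` created in init; touches are removed as soon as they end or are cancelled, so nothing outlives the touch sequence.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [_activeTouches unionSet:touches]; // remember every finger that goes down
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        if ([_activeTouches containsObject:touch]) {
            // Same UITouch instance we saw in touchesBegan, so this identifies
            // exactly which finger was lifted, and where.
            CGPoint endedAt = [touch locationInView:[self view]];
            NSLog(@"a tracked finger lifted at %@", NSStringFromCGPoint(endedAt));
            [_activeTouches removeObject:touch];
        }
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
So in the example from the question, finger 1 and finger 2 are two distinct UITouch objects; when finger 2 lifts, only that object shows up in touchesEnded.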

touch event on a quartz circle drawing?

I have drawn a circle in the iPhone simulator using Quartz 2D with fill. Is there a way to detect a touch event on that circle?
Thanks
Harikant Jammi
If you have a CGPath for that circle, you can obtain a CGPoint for where the user's finger fell inside touchesBegan and check whether it falls within that CGPath using the following code.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    // Depending on your code, you may need to check a different view than self.view.
    // You should probably check the docs for the arguments of this function,
    // it's been a while since I last used it.
    if (CGPathContainsPoint(yourCircle, NULL, location, NO)) {
        // Do something swanky
    } else {
        // Don't do teh swank
    }
}
I haven't tested it, but if the surrounding space around the circle is set to an alpha of 0, then maybe only the circle would accept touches. You'd probably have to turn off isOpaque for the view, and have no background color.
If that doesn't work, then you'll have to process the tap location and write code to see if it is within the circle area or not.
Just put a custom (UIButtonTypeCustom) invisible button over that circle: give it no title or image and a clear background color rather than 0.0 alpha (views with an alpha near zero don't receive touches), and the button will always be able to detect the touch.
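A quick sketch of that button approach, with assumed names: `circleFrame` is a hypothetical CGRect covering the drawn circle and `circleTapped:` a hypothetical action method on the view controller.
- (void)viewDidLoad {
    [super viewDidLoad];
    // Custom type, no title or image, clear background: invisible but still hit-testable
    UIButton *circleButton = [UIButton buttonWithType:UIButtonTypeCustom];
    circleButton.frame = circleFrame;
    circleButton.backgroundColor = [UIColor clearColor];
    [circleButton addTarget:self
                     action:@selector(circleTapped:)
           forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:circleButton];
}
Keep in mind that the button's hit area is its rectangular frame, so taps just outside the circle but inside that rectangle also register; the CGPathContainsPoint answer above is more precise.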

How can I detect touch in cocos2d?

I am developing a 2D game for iPhone using cocos2d.
I use many small sprites (images) in my game. I want to touch two sprites (images) of a similar type, and then both sprites should be hidden.
How can I detect a touch on a specific sprite (image)?
A better way to do this is to use the bounding box on the sprite itself (which is a CGRect). In this sample code I put all my sprites in an NSMutableArray and simply check whether the touch falls inside a sprite's bounding box. Make sure you turn on touch detection in init. Notice that I also accept or reject the touch for the layer by returning YES (if I use the touch) or NO (if I don't).
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [self convertTouchToNodeSpace:touch];
    for (CCSprite *station in _objectList)
    {
        if (CGRectContainsPoint(station.boundingBox, location))
        {
            DLog(@"Found sprite");
            return YES;
        }
    }
    return NO;
}
Following Jonas's instructions, and adding onto them a bit more ...
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:touch.location];
    CGRect particularSpriteRect = CGMakeRect(particularSprite.position.x, particularSprite.position.y, particularSprite.contentSize.width, particularSprite.contentSize.height);
    if (CGRectContainsPoint(particularSpriteRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
}
You may need to adjust the x/y a little to account for the 'centered positioning' in Cocos
In your layer that contains your sprite, you need to say:
self.isTouchEnabled = YES;
then you can use the same events that you would use in a UIView, but they're named a little differently:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // in your touchesEnded event, you would want to see if you touched
    // down and then up inside the same place, and do your logic there.
}
@david, your code has some typos for cocos 0.7.3 and 2.2.1, specifically CGRectMake instead of CGMakeRect, and [touch location] is now [touch locationInView:touch.view].
Here's what I did:
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:touch.view]];
    CGRect myRect = CGRectMake(sprite.position.x, sprite.position.y, sprite.contentSize.width, sprite.contentSize.height);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
    return NO; // not handled
}
@Genericrich: CGRectContainsPoint works in CocosLand because of the call two lines above it:
[[Director sharedDirector] convertCoordinate:]
The Cocos2D objects use the OpenGL coordinate system, where (0,0) is the lower left, while UIKit coordinates (like where the touch happened) have (0,0) at the upper left. convertCoordinate: makes the flip from UIKit to OpenGL for you.
Here's how it worked for me...
Where spriteSize is obviously the sprite's size... :P
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:touch.view]];
    // Offset by half the sprite size because a sprite's position is its center by default
    CGRect myRect = CGRectMake(sprite.position.x - spriteSize / 2, sprite.position.y - spriteSize / 2, spriteSize, spriteSize);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
    return NO; // not handled
}
This is a good tutorial explaining the basic touch system:
http://ganbarugames.com/2010/12/detecting-touch-events-in-cocos2d-iphone/
First, write:
self.isTouchEnabled = YES;
Then you need to implement the functions ccTouchesEnded, ccTouchesBegan, etc.
From what I understood, you want to be able to 'match' two sprites that can be at different coordinates on the screen.
One method for doing this (I'm sure there are many others): keep two global variables.
Every time a touch hits a sprite, use the CGRectContainsPoint function that is mentioned several times above to find which sprite has been touched, then save that sprite's 'tag' in one of the global variables.
Do the same for the second touch, and then compare the two global variables.
You should be able to figure out the rest, but comment if you have problems; a rough sketch of the idea follows.
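Here is that sketch in cocos2d-style Objective-C, under stated assumptions: `_spriteList` is a hypothetical NSMutableArray of CCSprites, `_firstPick` a hypothetical CCSprite ivar (nil while nothing is selected), and each sprite's tag is assumed to encode its type for matching.
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [self convertTouchToNodeSpace:touch];
    for (CCSprite *sprite in _spriteList) {
        if (CGRectContainsPoint(sprite.boundingBox, location)) {
            if (_firstPick == nil) {
                _firstPick = sprite;                 // first selection: remember it
            } else if (_firstPick != sprite && _firstPick.tag == sprite.tag) {
                _firstPick.visible = NO;             // matched a similar sprite: hide both
                sprite.visible = NO;
                _firstPick = nil;
            } else {
                _firstPick = nil;                    // no match: reset the selection
            }
            break;
        }
    }
}
Whether you store the sprite itself or just its tag in the 'global variable' is a design choice; storing the sprite makes it easy to hide both once they match.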