I have 2 CCLayers stacked on top of each other; both are touch enabled. I want the top CCLayer to respond to and consume touches and the bottom layer to not react to the touches that the top layer consumes.
The top layer has a ccTouchBegan: method that looks like this:
- (BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
NSLog(@"touched!");
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
//DO STUFF WITH TOUCH
return YES;
}
The bottom layer has a ccTouchesEnded: method that looks like this:
- (void) ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
//choose one of the touches to work with
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
//DO STUFF WITH THE TOUCH
}
The result is that the top layer does not consume the touches or even respond to them at all. Only the bottom layer responds to the touches.
In your first layer you are using a method from CCTargetedTouchDelegate, but CCLayer registers itself as a CCStandardTouchDelegate by default. That's why your second layer responds to the touches.
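For example, a minimal sketch of what the top layer could override so that the touches it claims never reach the layer below (the priority value of 0 is an assumption; lower values are served first):
//In the top layer: register as a targeted delegate that swallows touches,
//so any touch claimed by returning YES from ccTouchBegan: is not passed on
//to the standard touch handlers of the layer underneath.
- (void) registerWithTouchDispatcher
{
[[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}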
I have written a few games using cocos2d, but they were all controlled by the accelerometer and used only simple touch events. All I need to do is register when the screen is touched, anywhere; I don't need any information about the position or velocity. The character currently moves across the screen, and the user should be able to touch to make the character move up the screen. The current code does not work as intended: the character is not affected by the touch, it just continues to move down the screen. Please advise. Below is the code I am trying to use now.
In the game update method:
if (IsTouched == TRUE) {
SealPositionBasedOnTouchInt = SealPositionBasedOnTouchInt - (100*dt);
}
else {
SealPositionBasedOnTouchInt = SealPositionBasedOnTouchInt + (100*dt);
}
SealSwimming.position = ccp(SealPositionBasedOnTouchInt, SealSwimming.position.y);
The touch events:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
for( UITouch *touch in touches )
{
CGPoint location = [touch locationInView: [touch view]];
location = [[CCDirector sharedDirector] convertToGL: location];
IsTouched = TRUE;
}
}
You'll notice I do get the touch location; it is currently not used for anything, but it was in the sample.
Make sure the layer has touch events enabled by setting isTouchEnabled to YES.
Have you overridden:
-(void) registerWithTouchDispatcher {
[[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
in your layer/node that is receiving the touch?
You might also try calling addTargetedDelegate:self directly from your code without overriding registerWithTouchDispatcher, but I have never tried it.
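If all you need is to know whether a finger is currently down anywhere on the screen, a minimal sketch using the targeted delegate might set and clear the flag like this (IsTouched is the flag from the question; the rest is an assumption about the intended behaviour):
- (void) registerWithTouchDispatcher
{
[[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
- (BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
IsTouched = TRUE; //finger went down somewhere on the screen
return YES; //claim the touch so ccTouchEnded: is delivered to us as well
}
- (void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
IsTouched = FALSE; //finger lifted, the update method resumes normal movement
}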
Sergio has the answer to your problem with touches not registering; I had a similar problem recently. If it doesn't matter where the user touches the screen, then swallowsTouches:YES is fine, but if you have a couple of layers that all need to register touches you might need to set it to NO.
I haven't tested this.
in .h
CCSprite *sprite;
in .m
-(id)init
{
if ((self = [super init]))
{
CGSize s = [[CCDirector sharedDirector] winSize];
sprite = [CCSprite spriteWithFile:@"imagename.png"];
sprite.position = ccp(s.width/2, s.height/2);
[self addChild:sprite];
}
return self;
}
- (void) ccTouchesBegan: (NSSet *)touches withEvent: (UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint touchLocation = [touch locationInView: [touch view]];
touchLocation = [[CCDirector sharedDirector] convertToGL: touchLocation];
[sprite setPosition:touchLocation];
//OR YOU CAN RUN AN ANIMATION TO MAKE IT LOOK LIKE IT'S WALKING
//[sprite runAction:[CCMoveTo actionWithDuration:5 position:touchLocation]];
}
I have several UIImages that I can move around on the iPhone screen using multi-touch. The thing is, I want to separate them into two "teams": a "team" of UIImages that I move only inside an area of my choice, and a "team" that I can move all over the screen.
My question is how to use the touch methods (touchesBegan, touchesEnded, touchesMoved) for both of the UIImage "teams" and their CGPoints without the CGPoints from the two "teams" crossing each other and giving the wrong UIImages the wrong positions on the screen. The 1st "team" uses the touchesBegan, touchesMoved and touchesEnded methods; the 2nd "team" only uses the touchesEnded method.
Here's my code. I hope that the two "teams" don't cross each other in the touchesEnded method.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
//1st team
for(UITouch *touch in touches){
// Send to the dispatch method, which will make sure the appropriate subview is acted upon
[self getRubyAtPoint:[touch locationInView:self.view] forEvent:nil];
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
NSLog(@"touchesEnded");
//1st team
// Enumerates through all touch object
for (UITouch *touch in touches) {
// Sends to the dispatch method, which will make sure the appropriate subview is acted upon
[self dispatchTouchEndEvent:[touch view] toPosition:[touch locationInView:self.view]];
}
//2nd team
UITouch *touch = [touches anyObject];
CGPoint currentTouch = [touch locationInView:self.view];
[self getPieceAtPoint:currentTouch];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
//1st team
NSLog(@"touchesMoved");
// Enumerates through all touch objects
for (UITouch *touch in touches) {
// Send to the dispatch method, which will make sure the appropriate subview is acted upon
[self dispatchTouchEvent:[touch view] toPosition:[touch locationInView:self.view]];
}
}
If I understand correctly, you are asking how to keep methods such as getPieceAtPoint from acting on the wrong team.
I think I'd just add a unique tag range to each team and check that before acting on it.
Something like:
UITouch *touch = [touches anyObject];
if ([touch view].tag>=TEAM_TWO_BASE_TAG)
{
CGPoint currentTouch = [touch locationInView:self.view];
[self getPieceAtPoint:currentTouch];
}
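A minimal sketch of how those tags could be assigned when the image views are created (TEAM_TWO_BASE_TAG is the constant used above; the array names are assumptions):
#define TEAM_TWO_BASE_TAG 100
// Team one gets tags below the base value, team two gets tags at or above it.
for (NSUInteger i = 0; i < [teamOneViews count]; i++) {
UIImageView *view = [teamOneViews objectAtIndex:i];
view.tag = i; // 0 ... 99
}
for (NSUInteger i = 0; i < [teamTwoViews count]; i++) {
UIImageView *view = [teamTwoViews objectAtIndex:i];
view.tag = TEAM_TWO_BASE_TAG + i; // 100, 101, ...
}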
I am making an application in which I want to move a sprite within a specified region of the screen. With cocos2d I am not able to move the sprite; I only know the method -(void)ccTouchMoved:(UITouch *)touches withEvent:(UIEvent *)event, but I don't know how to move a sprite.
Can anyone help me?
Thanks in advance.
Try something like the following.
-(BOOL)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch * touch = [touches anyObject];
CGPoint location = [[Director sharedDirector] convertCoordinate:[touch locationInView:touch.view]];
[yourSprite setPosition:ccp(location.x , location.y )];
return kEventHandled;
}
Edit: If you simply want to move the sprite without a touch event simply call
[yourSprite setPosition:ccp(someX, someY)];
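If the sprite is only allowed to move inside a specified region (as the question asks), one option is to clamp the converted location before setting the position. A sketch, assuming allowedRegion is a rectangle you define yourself in GL coordinates:
CGRect allowedRegion = CGRectMake(50, 50, 220, 380); //example bounds, pick your own
CGFloat clampedX = MIN(MAX(location.x, CGRectGetMinX(allowedRegion)), CGRectGetMaxX(allowedRegion));
CGFloat clampedY = MIN(MAX(location.y, CGRectGetMinY(allowedRegion)), CGRectGetMaxY(allowedRegion));
[yourSprite setPosition:ccp(clampedX, clampedY)];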
I am trying to build an iPhone app using cocos2d, and I would like to move an image from one fixed position to another fixed position using touch, at whatever speed I choose (fast or slow). I have some code, but it does not work properly.
Any help with a solution would be much appreciated.
The question is a little fuzzy, but if you want to set the position of a CocosNode you do:
[myNode setPosition:cpv(x,y)];
If you want the node to be offset from a touch location, you can do this by implementing ccTouchesBegan:withEvent:
-(BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView: [touch view]];
CGPoint convertedLocation = [[Director sharedDirector] convertCoordinate:location];
[myNode setPosition: cpv(convertedLocation.x - 100, convertedLocation.y - 100)];
return kEventHandled;
}
That will offset the CocosNode by -100,-100 to where the touch occurred.
The ccTouchesBegan:withEvent: should be implemented in your Layer, and isTouchEnabled should be set to YES to enable touches.
I am developing a 2D game for iPhone using cocos2d.
I use many small sprites (images) in my game. I want to touch two sprites (images) of the same type and have both of them hidden.
How can I detect a touch on a specific sprite (image)?
A better way to do this is to use the bounding box on the sprite itself (which is a CGRect). In this sample code, I put all my sprites in an NSMutableArray and simply check whether the touch is inside a sprite's bounding box. Make sure you turn on touch detection in init. Notice that I also accept/reject touches on the layer by returning YES (if I use the touch) or NO (if I don't).
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
CGPoint location = [self convertTouchToNodeSpace: touch];
for (CCSprite *station in _objectList)
{
if (CGRectContainsPoint(station.boundingBox, location))
{
DLog(@"Found sprite");
return YES;
}
}
return NO;
}
Following Jonas's instructions, and adding onto it a bit more ...
- (void)ccTouchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
UITouch* touch = [touches anyObject];
CGPoint location = [[Director sharedDirector] convertCoordinate:touch.location];
CGRect particularSpriteRect = CGMakeRect(particularSprite.position.x, particularSprite.position.y, particularSprite.contentSize.width, particularSprite.contentSize.height);
if(CGRectContainsPoint(particularSpriteRect, location)) {
// particularSprite touched
return kEventHandled;
}
}
You may need to adjust the x/y a little to account for the 'centered positioning' in Cocos
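For example, the adjusted rect might be built around the sprite's centre like this (a sketch; it is the same idea as the spriteSize version further down):
CGRect particularSpriteRect = CGRectMake(particularSprite.position.x - particularSprite.contentSize.width/2,
particularSprite.position.y - particularSprite.contentSize.height/2,
particularSprite.contentSize.width,
particularSprite.contentSize.height);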
In your layer that contains your sprite, you need to say:
self.isTouchEnabled = YES;
then you can use the same events that you would use in a UIView, but they're named a little differently:
- (void)ccTouchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
UITouch* touch = [touches anyObject];
//in your touchesEnded event, you would want to see if you touched
//down and then up inside the same place, and do your logic there.
}
@david, your code has some typos for cocos 0.7.3 and 2.2.1, specifically CGRectMake instead of CGMakeRect, and [touch location] is now [touch locationInView:touch.view].
Here's what I did:
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch * touch = [touches anyObject];
CGPoint location = [[Director sharedDirector] convertCoordinate: [touch locationInView:touch.view]];
CGRect myRect = CGRectMake(sprite.position.x, sprite.position.y, sprite.contentSize.width, sprite.contentSize.height);
if(CGRectContainsPoint(myRect, location)) {
// particularSprite touched
return kEventHandled;
}
return NO; //touch did not hit the sprite
}
@Genericrich: CGRectContainsPoint works in Cocos land because of the call two lines above:
[[Director sharedDirector] convertCoordinate:]
The Cocos2D objects use the OpenGL coordinate system, where 0,0 is the lower left, while UIKit coordinates (like where the touch happened) have 0,0 at the upper left. convertCoordinate: makes the flip from UIKit to OpenGL for you.
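In newer cocos2d versions the same conversion is done with convertToGL:. Assuming a default, unrotated portrait setup, the flip amounts to subtracting the y coordinate from the window height, roughly:
CGSize winSize = [[CCDirector sharedDirector] winSize];
CGPoint uikitPoint = [touch locationInView:[touch view]];
//convertToGL: effectively moves the origin from the top-left (UIKit) to the bottom-left (OpenGL)
CGPoint glPoint = ccp(uikitPoint.x, winSize.height - uikitPoint.y);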
Here's how it worked for me...
Where spriteSize is obviously the sprite's size... :P
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch * touch = [touches anyObject];
CGPoint location = [[Director sharedDirector] convertCoordinate: [touch locationInView:touch.view]];
CGRect myRect = CGRectMake(sprite.position.x-spriteSize/2, sprite.position.y-spriteSize/2, spriteSize, spriteSize);
if(CGRectContainsPoint(myRect, location)) {
// particularSprite touched
return kEventHandled;
}
return NO; //touch did not hit the sprite
}
This is a good tutorial explaining the basic touch system:
http://ganbarugames.com/2010/12/detecting-touch-events-in-cocos2d-iphone/
First, write
self.isTouchEnabled = YES;
Then you need to implement the functions ccTouchesEnded, ccTouchesBegan, etc.
From what I understood, you want to be able to 'match' two sprites that can be at different coordinates on the screen.
One method for doing this (I'm sure there are many others):
Consider having two global variables.
Every time a touch hits a sprite, use the CGRectContainsPoint function mentioned several times above to find which sprite has been touched; then save the 'tag' of that sprite in one of the global variables.
Do the same for the second touch, and then compare the two global variables.
You should be able to figure out the rest, but comment if you have problems.
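A minimal sketch of that idea (the _spriteList array, the firstPick variable and the use of tag to decide whether two sprites 'match' are all assumptions; adapt them to your own data):
CCSprite *firstPick = nil; //global or ivar: remembers the first sprite that was touched

- (void) ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:[touch view]]];
for (CCSprite *sprite in _spriteList) {
if (!sprite.visible || !CGRectContainsPoint(sprite.boundingBox, location))
continue;
if (firstPick == nil) {
firstPick = sprite; //first of the pair
} else if (firstPick != sprite && firstPick.tag == sprite.tag) {
firstPick.visible = NO; //a matching pair: hide both
sprite.visible = NO;
firstPick = nil;
} else {
firstPick = sprite; //no match: start over with this sprite
}
break;
}
}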