I am developing a 2D game for iPhone using cocos2d.
I use many small sprites (images) in my game. I want to touch two sprites of the same type and then hide both of them.
How can I detect a touch on a specific sprite (image)?
A better way to do this is to use the bounding box on the sprite itself (which is a CGRect). In this sample code, I put all my sprites in an NSMutableArray and I simply check whether the touch falls inside each sprite's bounding box. Make sure you turn on touch detection in the init (see the sketch after the code). Notice that I also accept/reject the touch for the layer by returning YES (if I use the touch) or NO (if I don't).
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [self convertTouchToNodeSpace: touch];
    for (CCSprite *station in _objectList)
    {
        if (CGRectContainsPoint(station.boundingBox, location))
        {
            DLog(@"Found sprite");
            return YES;
        }
    }
    return NO;
}
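The "turn on touch detection in the init" part isn't shown above; a minimal sketch of what it might look like (assuming a cocos2d version with CCTouchDispatcher, and that the layer wants targeted touches) is:

- (id)init
{
    if ((self = [super init])) {
        // enable touch handling for this layer (assumed setup, not from the original answer)
        self.isTouchEnabled = YES;
    }
    return self;
}

- (void)registerWithTouchDispatcher
{
    // register as a targeted delegate so ccTouchBegan: above is called once per touch
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}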
Following Jonas's instructions, and adding onto it a bit more ...
- (BOOL)ccTouchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
    UITouch* touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate: touch.location];
    CGRect particularSpriteRect = CGMakeRect(particularSprite.position.x, particularSprite.position.y, particularSprite.contentSize.width, particularSprite.contentSize.height);
    if (CGRectContainsPoint(particularSpriteRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
}
You may need to adjust the x/y a little to account for the 'centered positioning' in Cocos
In your layer that contains your sprite, you need to say:
self.isTouchEnabled = YES;
then you can use the same events that you would use in a UIView, but they're named a little differently:
- (void)ccTouchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
    UITouch* touch = [touches anyObject];
    // in your touchesEnded event, you would want to see if you touched
    // down and then up inside the same place, and do your logic there
}
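A rough, untested sketch of that down-then-up check (the _touchDownPoint ivar and the sprite variable are my assumptions, and I'm using the newer CCDirector convertToGL: call rather than the older convertCoordinate:):

- (void)ccTouchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
    UITouch* touch = [touches anyObject];
    // remember where the touch went down (assumed CGPoint ivar)
    _touchDownPoint = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
}

- (void)ccTouchesEnded:(NSSet*)touches withEvent:(UIEvent*)event
{
    UITouch* touch = [touches anyObject];
    CGPoint upPoint = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    // only count it as a tap if the touch went down and came back up on the same sprite
    if (CGRectContainsPoint(sprite.boundingBox, _touchDownPoint) &&
        CGRectContainsPoint(sprite.boundingBox, upPoint)) {
        // do your logic here
    }
}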
@david, your code has some typos for cocos2d 0.7.3 and 2.2.1; specifically, it should be CGRectMake instead of CGMakeRect, and [touch location] is now [touch locationInView:touch.view].
Here's what I did:
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate: [touch locationInView:touch.view]];
    CGRect myRect = CGRectMake(sprite.position.x, sprite.position.y, sprite.contentSize.width, sprite.contentSize.height);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
    return kEventIgnored;
}
@Genericrich: CGRectContainsPoint works in CocosLand because of the call two lines above:
[[Director sharedDirector] convertCoordinate:]
The cocos2d objects use the OpenGL coordinate system, where (0,0) is the lower left, while UIKit coordinates (like where the touch happened) have (0,0) at the upper left. convertCoordinate: does the flip from UIKit to OpenGL for you.
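Written out by hand, that flip is roughly the following (a sketch only; it assumes a portrait window, and newer cocos2d versions expose this as [[CCDirector sharedDirector] convertToGL:] instead of convertCoordinate:):

// flip a UIKit point (origin top-left) into a GL point (origin bottom-left)
CGPoint uiPoint = [touch locationInView:touch.view];
CGSize winSize = [[CCDirector sharedDirector] winSize];
CGPoint glPoint = ccp(uiPoint.x, winSize.height - uiPoint.y);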
Here's how it worked for me...
Where spriteSize is obviously the sprite's size... :P
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector] convertCoordinate: [touch locationInView:touch.view]];
    // offset by half the sprite size so the rect is centered on the sprite's position
    CGRect myRect = CGRectMake(sprite.position.x - spriteSize/2, sprite.position.y - spriteSize/2, spriteSize, spriteSize);
    if (CGRectContainsPoint(myRect, location)) {
        // particularSprite touched
        return kEventHandled;
    }
    return kEventIgnored;
}
This is a good tutorial explaining the basic touch system:
http://ganbarugames.com/2010/12/detecting-touch-events-in-cocos2d-iphone/
First, write
self.isTouchEnabled = YES;
Then you need to implement the functions ccTouchesBegan, ccTouchesEnded, etc.
From what I understood, you want to be able to 'match' two sprites that can be at different coordinates on the screen.
One method for doing this (I'm sure there are many others): keep two global variables.
Every time a touch hits a sprite, use the CGRectContainsPoint check that is mentioned several times above to find which sprite was touched, then save that sprite's 'tag' in one of the global variables.
Do the same for the second touch, and then compare the two variables. A rough sketch of the idea follows.
You should be able to figure out the rest, but comment if you have problems.
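A rough sketch of that matching idea (everything here is assumed rather than from the original post: _firstSprite is an ivar holding the first touched sprite, _spriteList is the array of candidate sprites, and each sprite's tag encodes its 'type'):

- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];

    for (CCSprite *sprite in _spriteList) {
        if (!CGRectContainsPoint(sprite.boundingBox, location))
            continue;

        if (_firstSprite == nil) {
            _firstSprite = sprite;                 // remember the first pick
        } else if (sprite != _firstSprite && sprite.tag == _firstSprite.tag) {
            _firstSprite.visible = NO;             // same type: hide both
            sprite.visible = NO;
            _firstSprite = nil;
        } else {
            _firstSprite = nil;                    // no match: start over
        }
        break;
    }
}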
Related
I am trying to develop a game using OpenGL where I have used the GLTextureLoader class to load images. These sprites move from left to right with a calculated velocity, and I need to detect touches on these images.
Since your purpose is very simple, all you have to do is draw each object twice: the visible one, and a second copy drawn with just a flat identifying color into an invisible (offscreen) buffer. Then you check the location where the user pressed in the invisible buffer, see what color it is, and there you have your object.
http://www.lighthouse3d.com/opengl/picking/index.php3?color1
That is the basic theory.
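The read-back step of that color-picking approach might look roughly like this (a sketch only: it assumes each object was already rendered with a unique flat color into the currently bound offscreen buffer, and that touchPoint and viewHeightInPixels are supplied by you):

// read the single pixel under the touch and use its color as the object id
GLubyte pixel[4];
GLint x = (GLint)touchPoint.x;
GLint y = (GLint)(viewHeightInPixels - touchPoint.y);   // flip UIKit y to GL y
glReadPixels(x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);
// pixel[0], pixel[1], pixel[2] now identify the touched object, if any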
OpenGL is a rendering API. It only draws stuff.
Techniques like lighthouse3d do work, but glReadPixels is slow.
You should check this on the CPU; that is, for each drawn sprite, test if the touch position is inside.
I found out how to do it as per my requirement. As I said, I am not an expert in OpenGL, but I managed to work it out.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[touches allObjects] objectAtIndex:0];
    CGPoint touchLocation = [touch locationInView:self.view];
    // flip the y-coordinate from UIKit (origin top-left) to GL (origin bottom-left)
    touchLocation = CGPointMake(touchLocation.x, 320 - touchLocation.y);
    //self.player.moveVelocity = GLKVector2Make(50, 50);
    //NSLog(@"Location in view %@", NSStringFromCGPoint(touchLocation));
    //NSLog(@"bounding box is %@", NSStringFromCGRect([self.player boundingBox]));
    GameSprite *temp = nil;
    for (GameSprite *tempSprite in self.children) {
        if (CGRectContainsPoint([tempSprite boundingBox], touchLocation)) {
            NSLog(@"touched the player");
            temp = tempSprite;
        }
    }
    if (temp) {
        [self.children removeObject:temp];
    }
}
- (CGRect)boundingBox {
    CGRect rect = CGRectMake(self.position.x, self.position.y, self.contentSize.width, self.contentSize.height);
    return rect;
}
I am rather new to using cocos2d for iPhone and I am having an issue with touch locations. At the moment I am simply trying to touch and move a sprite on the screen. This works fine when the layer is unmoved, as well as when I translate the layer (changing self.position in the X direction, in my case). However, when I scale my layer (example: self.scale = 0.5) the touch no longer moves the sprite. I have done a lot of forum and Google searching and I think my issue has to do with my coordinate transforms (node space/world space etc.), but I am not 100% sure. I did notice that when I scale, if I click the location where the sprite would be without the scale, then I can move the sprite. This leads me to believe that my transforms are not taking the scale into account.
Here is the coordinate transform code I am currently using to get touch locations:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *myTouch = [touches anyObject];
CGPoint location = [myTouch locationInView:[myTouch view]];
location = [self convertToNodeSpace:location];
location = [[CCDirector sharedDirector] convertToGL:location];
}
Here is the code that is checking if the location (same location variable as above) is touching a sprite, although I feel much more confident that this code is correct, who knows!
for (CCSprite *sprite in movableSprites) {
    if (CGRectContainsPoint(sprite.boundingBox, touchLocation)) {
        NSLog(@"Woohoo, you touched a sprite!");
        break;
    }
}
Let me know if you need anymore information and thanks for reading!
I think you should scale the bounding box by the sprite's scale, along these lines:
for (CCSprite *sprite in movableSprites) {
    CGRect box = sprite.boundingBox;
    // grow/shrink the box by the sprite's own scale factor
    CGRect scaledBox = CGRectMake(box.origin.x, box.origin.y,
                                  box.size.width * sprite.scale,
                                  box.size.height * sprite.scale);
    if (CGRectContainsPoint(scaledBox, touchLocation)) {
        //touch sprite action
    }
}
About converting the point: if I need an absolute screen point I always use convertToWorldSpace:CGPointZero.
I'm not really sure why you need this on your touch location; I would usually do this on sprites when I need to disregard their position in a parent node.
Other than that, if your game is not a real multi-touch game, you are better off using ccTouchBegan rather than ccTouchesBegan.
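A minimal sketch of that convertToWorldSpace:CGPointZero tip (sprite is an assumed CCSprite): converting the sprite's local origin to world space gives its absolute on-screen origin regardless of where its parent node is positioned.

CGPoint absoluteOrigin = [sprite convertToWorldSpace:CGPointZero];
NSLog(@"bottom-left corner of the sprite in world space: %@", NSStringFromCGPoint(absoluteOrigin));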
Use this function to get the sprite's rect:
-(CGRect)getSpriteRect:(CCNode *)inSprite
{
    // account for the node's anchor point so the rect starts at its bottom-left corner
    CGRect sprRect = CGRectMake(
        inSprite.position.x - inSprite.contentSize.width * inSprite.anchorPoint.x,
        inSprite.position.y - inSprite.contentSize.height * inSprite.anchorPoint.y,
        inSprite.contentSize.width,
        inSprite.contentSize.height
    );
    return sprRect;
}
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *myTouch = [touches anyObject];
    CGPoint location = [myTouch locationInView:[myTouch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    for (CCSprite *sprite in movableSprites)
    {
        CGRect rect = [self getSpriteRect:sprite];
        if (CGRectContainsPoint(rect, location))
        {
            NSLog(@"Woohoo, you touched a sprite!");
            break;
        }
    }
}
I have written a few games using cocos2d, but they were all controlled by the accelerometer and used only simple touch events. All I need to do is register when the screen is touched, anywhere; I don't need any information about the position or velocity. The character currently moves across the screen, and the user should be able to touch to make the character move up the screen. The current code does not work as intended: the character is not affected by the touch, it just continues to move down the screen. Please advise. Below is the code I am trying to use now.
In the game update method:
if (IsTouched == TRUE) {
    SealPositionBasedOnTouchInt = SealPositionBasedOnTouchInt - (100*dt);
}
else {
    SealPositionBasedOnTouchInt = SealPositionBasedOnTouchInt + (100*dt);
}
SealSwimming.position = ccp(SealPositionBasedOnTouchInt, SealSwimming.position.y);
The touch events:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint location = [touch locationInView: [touch view]];
        location = [[CCDirector sharedDirector] convertToGL: location];
        IsTouched = TRUE;
    }
}
You'll notice I do get the touch location; it is currently not used for anything, but it was in the sample.
Make sure the layer has touch events enabled with isTouchEnabled.
Have you overridden:
-(void) registerWithTouchDispatcher {
[[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
in your layer/node that is receiving the touch?
You might also try calling addTargetedDelegate:self directly from your code without overriding registerWithTouchDispatcher, but I have never tried it.
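That alternative would look roughly like the following (untested sketch; the call would go somewhere like the layer's init instead of the registerWithTouchDispatcher override):

// register the layer directly with the shared touch dispatcher
[[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];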
Sergio has the answer to your problem with touches not registering; I had a similar problem recently. If it doesn't matter where the user touches the screen, then swallowsTouches:YES is fine, but if you have a couple of layers that need to register touches you might need to set it to NO.
I haven't tested this.
in .h
CCSprite *sprite;
in .m
-(id)init
{
    if ((self = [super init]))
    {
        CGSize s = [[CCDirector sharedDirector] winSize];
        sprite = [CCSprite spriteWithFile:@"imagename.png"];
        sprite.position = ccp(s.width/2, s.height/2);
        [self addChild:sprite];
    }
    return self;
}
- (void) ccTouchesBegan: (NSSet *)touches withEvent: (UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint touchLocation = [touch locationInView: [touch view]];
touchLocation = [[CCDirector sharedDirector] convertToGL: touchLocation];
[sprite setPosition:touchLocation];
//OR YOU CAN RUN AN ANIMATION TO MAKE IT LOOK LIKE IT'S WALKING
//[sprite runAction:[CCMoveTo actionWithDuration:5 position:touchLocation]];
}
I am trying to build an iPhone app using cocos2d, and I would like to move an image from one fixed position to another by touch, at a speed of my choosing (fast or slow). I have some code, but it does not work properly.
So, friends, any solution would be very helpful to me.
The question is a little fuzzy, but if you want to set the position of a CocosNode you do:
[myNode setPosition:cpv(x,y)];
If you want the node to be offset from a touch location, you can do this by implementing ccTouchesBegan:withEvent
-(BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView: [touch view]];
CGPoint convertedLocation = [[Director sharedDirector] convertCoordinate:location];
[myNode setPosition: cpv(convertedLocation.x - 100, convertedLocation.y - 100)];
return kEventHandled;
}
That will offset the CocosNode by -100,-100 to where the touch occurred.
The ccTouchesBegan:withEvent: should be implemented in your Layer, and isTouchEnabled should be set to YES to enable touches.
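Since the original question also asks about moving the image to the new position quickly or slowly, a hedged sketch using CCMoveTo (myNode and the target point are assumed; the duration controls how fast the move is, as in the commented-out animation example earlier on this page):

// animate the node to a fixed position; a smaller duration moves it faster
CGPoint targetPosition = ccp(200, 150);
[myNode runAction:[CCMoveTo actionWithDuration:2.0f position:targetPosition]];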
I just don't get it.
I use cocos2d for development of a small game on the iPhone/iPod. The framework is just great, but I fail at touch detection. I read that you just need to override the proper functions (e.g. "touchesBegan") in the implementation of a class which subclasses CocosNode, but it doesn't work. What could I be doing wrong?
The function:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"tickle, hihi!"); }
Did I get it totally wrong?
Layer is the only cocos2d class which gets touches.
The trick is that ALL instances of Layer get passed the touch events, one after the other, so your code has to handle this.
I did it like this:
-(BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView: [touch view]];
    CGPoint cLoc = [[Director sharedDirector] convertCoordinate: location];
    float labelX = self.position.x - HALF_WIDTH;
    float labelY = self.position.y - HALF_WIDTH;
    float labelXWidth = labelX + WIDTH;
    float labelYHeight = labelY + WIDTH;
    if (labelX < cLoc.x &&
        labelY < cLoc.y &&
        labelXWidth > cLoc.x &&
        labelYHeight > cLoc.y) {
        NSLog(@"WE ARE TOUCHED AND I AM A %@", self.labelString);
        return kEventHandled;
    } else {
        return kEventIgnored;
    }
}
Note that the cocos2d library has a "ccTouchesEnded" implementation, rather than the Apple standard. It allows you to return a BOOL indicating whether or not you handled the event.
Good luck!
Have you added this to your layer's init method?
// isTouchEnabled is an property of Layer (the super class).
// When it is YES, then the touches will be enabled
self.isTouchEnabled = YES;
// isAccelerometerEnabled is property of Layer (the super class).
// When it is YES, then the accelerometer will be enabled
self.isAccelerometerEnabled = YES;
In order to detect touches, you need to subclass from UIResponder (which UIView does as well) . I am not familiar with cocos2D, but a quick look at the documentation reveals that CocosNode does not derive from UIResponder.
Upon further investigation, it looks like the cocos2d folks created a Layer class that derives from CocosNode, and that class implements the touch event handlers, but they are prefixed with cc.
See http://code.google.com/p/cocos2d-iphone/source/browse/trunk/cocos2d/Layer.h
Also see the menu.m code and the blog post below for more info on this:
http://blog.sapusmedia.com/2008/12/cocos2d-propagating-touch-events.html
maw, the CGPoint struct members x and y are floats; use "%f" to format floats for printf/NSLog.
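For instance (location being any CGPoint):

NSLog(@"touch at x = %f, y = %f", location.x, location.y);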
If you use the 0.9 beta of cocos2d, it has really simple touch detection for CocosNodes. The real beauty of this new detection is that it handles multiple-touch tracking really well.
An example of this can be found here:
http://code.google.com/p/cocos2d-iphone/source/browse/#svn/trunk/tests/TouchesTest
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Add a new body/atlas sprite at the touched location
    CGPoint tapPosition;
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView: [touch view]];
        // get the tapped position in this node's coordinate space
        tapPosition = [self convertToNodeSpace:[[CCDirector sharedDirector] convertToGL:location]];
    }
}
I think this can help you:
- Make your scene conform to the CCTargetedTouchDelegate protocol.
- Add this line to the init of your scene:
[[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:NO];
- Implement these functions:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
return YES;
}
-(void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
// here the touch has ended
}