Reinitialize CCMotionStreak for each new touch - iPhone

I am trying to trace the movement of the user's finger on the screen for my iPhone / cocos2d game.
So far I can do this using a CCMotionStreak declared in the interface of my GameLayer and initialized in my init method. To draw the user's touch, I put the following code in touchesMoved:
UITouch *touch = [touches anyObject];
[streak setPosition:[self convertTouchToNodeSpace:touch]];
This works until I lift my finger up and make a new touch motion across the screen. Instead of drawing a new streak, my game connects the end of the old streak to the beginning of my new swipe, and continues the same streak. This is not what I want.
Is there a way to reset my ccMotionStreak? If not, the obvious solution seems to be to create a new streak on each new touch (and remove the old one), but I can't get this to work. When I move the initialization code for my streak out of the init method and into touchesBegan, the streak no longer shows up at all.
I am guessing this should be basic to achieve, but I just can't figure out the syntax. I am still learning ObjC / cocos2d. Can someone help?
Here is how I initialize my streak in my init method:
streak = [CCMotionStreak streakWithFade:3.0 minSeg:1 image:@"streak.png" width:4 length:8 color:ccc4(128,128,128,255)];
[self addChild:streak];

Did you remove/release the old streak on ccTouchesEnded and ccTouchesCancelled?
// in ccTouchesBegan
UITouch *touch = [touches anyObject];
CGPoint location = [self convertTouchToNodeSpace:touch];
streak = [CCMotionStreak streakWithFade:3.0 minSeg:1 image:@"streak.png" width:4 length:8 color:ccc4(128,128,128,255)];
[streak setPosition:location];
[self addChild:streak];
// in ccTouchesEnded and ccTouchesCancelled
if (streak) {
    [streak removeFromParentAndCleanup:YES];
    streak = nil;
}

Related

how to handle touch event like UITableView or ScollView

I made a custom control that inherits from UIView and adds a lot of UIButtons to the UIView.
When a user touches and moves, I do some animation: I move the buttons in the function touchesMoved:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
but the buttonClick event seems to have a higher priority.
I want it to behave like UITableView, where scrolling has a higher priority than a button click.
You need to look into UIPanGestureRecognizer.
It gives you the ability to cancel events sent to other handlers.
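For example, a minimal sketch of wiring one up (assuming it is added in the custom view's initializer, with the handler: method shown further below as its action):
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handler:)];
// Once the pan is recognized, touches already delivered to the buttons are cancelled,
// so the drag wins over the button taps (this property defaults to YES).
pan.cancelsTouchesInView = YES;
[self addGestureRecognizer:pan];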
Updated with additional information about how to save previous points.
In the action callback, you get notified of the initial touch location when recognizer.state == UIGestureRecognizerStateBegan. You can save this point in an instance variable. You also get callbacks at various intervals when recognizer.state == UIGestureRecognizerStateChanged; you can save this information as well. Then, when you get the callback with recognizer.state == UIGestureRecognizerStateEnded, you reset your instance variables.
- (void)handler:(UIPanGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:self];
    switch (recognizer.state)
    {
        case UIGestureRecognizerStateBegan:
            self.initialLocation = location;
            self.lastLocation = location;
            break;
        case UIGestureRecognizerStateChanged:
            // Whatever work you need to do.
            // location is the current point.
            // self.lastLocation is the location from the previous call.
            // self.initialLocation is the location when the touch began.
            // NOTE: The last thing to do is set lastLocation for the next time we're called.
            self.lastLocation = location;
            break;
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
            // The gesture is over; reset the saved locations.
            self.initialLocation = CGPointZero;
            self.lastLocation = CGPointZero;
            break;
        default:
            break;
    }
}
Hope that helps.

How to pass touch event to another object?

I referenced the apps called "Comic Strip" and "Balloon Stickies Free"
When I add a speech balloon and touch the s.b or the s.b's tail, it works. But when I touch around the tail, or between the s.b and its tail, it doesn't work, and the photo below the s.b gets the touch instead.
So I tried to use hitTest:withEvent:.
It works the first time I touch the rectangle or tailRect. But when I touch some other place in the object and then touch the rectangle or tailRect again, it doesn't work.
So how should I modify this code? I don't know... please help.
- (id)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
UIView *hitView = [super hitTest:point withEvent:event];
if(CGRectContainsPoint(rectangle, currentPt)==YES || CGRectContainsPoint(tailRect, currentPt)==YES)
return hitView;
else
return nil;
}
Try overriding - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event instead.
Or take a look at Ole Begemann's OBShapedButton. Code can easily be modified
to work with UIView instead of UIButton.
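A minimal sketch of the pointInside: approach (assuming rectangle and tailRect are the same CGRect ivars used in the hitTest: code above):
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Claim the touch only when it lands on the balloon body or its tail;
    // everything else falls through to the views underneath (e.g. the photo).
    return CGRectContainsPoint(rectangle, point) || CGRectContainsPoint(tailRect, point);
}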
See this post on horizontal scrolling. Using this code you can get all touch events in a single UIWindow subclass. You have to write some code to pass control on appropriately.

Cocos2D crashes with a Zombie when I try to stop it in the middle of an animation

I'm trying to get Cocos2D on the iPhone to "clean itself up" before I switch back to a UIView-based view in my iPhone app, but it's overreleasing (or, I am overreleasing) something and crashing, and I can't make heads or tails of it.
This is somewhat long, but I've tried to take care to organize it.
My node hierarchy looks like this, and the parentheses indicate "what's in" each node:
CCScene (Menu)
CCLayer (Character)
CCLayer (Whole Animation)
CCSprite, CCSpriteBatchNode (Parts of Animation, there can be many of each type)
So, as the "Character" runs different animations, I remove the "animation" CCLayer from the "character" CCLayer, create a new "animation" CCLayer, and add it as a child. So far, that's caused no problems.
Finally, there's a button on CCScene that "ends" the Cocos part of the app. I want to return to UIKit-land when I tap the "End" button.
However, before I return back to UIView-land, I want to run one final animation on the Character, and when THAT is finished, terminate. To do that, I "register" a handler on the character CCLayer like this: I call the "final" animation, and then call back to the CCScene when the final animation is done (using KVO):
- (void) doEndWithHandler:(id<LWECharacterDelegate>)handler
{
// Handler is the CCScene, "self.parent" could work but I want it loosely coupled
self.endOfSessionHandler = handler;
// Tell character to start final animation -- this creates a new CCLayer,
// starts the animation, and assigns that CCLayer into self.animatedSequence
[self changeCharacterActionTo:END_ANIMATION key:nil];
// The "animation" CCLayer has a property called "moving" -- observe it
[self.animatedSequence addObserver:self forKeyPath:@"moving" options:NSKeyValueObservingOptionNew context:NULL];
}
And then my observing code, which is called back to when "moving" becomes NO (=animation finishes):
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if ([keyPath isEqualToString:@"moving"])
{
BOOL movingStatus = [[change objectForKey:NSKeyValueChangeNewKey] boolValue];
if (movingStatus == NO)
{
// Stop observing
[object removeObserver:self forKeyPath:#"moving"];
// Stop all animations on this level
[self.animatedSequence stopAnimation];
[self removeChild:self.animatedSequence cleanup:YES];
[self removeFromParentAndCleanup:YES];
// Handler callback
if (self.endOfSessionHandler && [self.endOfSessionHandler respondsToSelector:@selector(characterDidFinishSession)])
{
[self.endOfSessionHandler characterDidFinishSession];
}
}
} // if key = moving
}
Now, you may say,
Didn't you know that you can just add a CCCallFunc callback as an action at the end of your animation, so you know when you're done moving?
Yes, I do know that - but the point is that the CCScene "knows" when the end button is pressed -- not any one specific animation. The CCAction is already in motion when the end button is pressed, so I want to tell all sprites to STOP animating and be destroyed.
My animation CCLayer has some special code to tell me when the sprite(s) has stopped moving. That code is working well-- I use a CCCallFunc callback on the end of every animation to tell my "animation" CCLayer class that it's done.
Where it seems to become a problem, though, is that I get the KVO notification that "moving" has changed BEFORE the Cocos2D action call stack has unwound. I'm pretty sure my problem is somewhere in there, because as soon as the KVO notification comes through, I try to stop everything (see the code above). Yet not everything stops, because the Cocos2D framework crashes (overrelease) as it tries to "wrap up" the rest of the action step.
Is it not possible to stop an animation from within a CCCallFunc callback that is an action animating the same sprite?
For you true Cocos-heads out there, the exact line that is crashing is:
if( currentTarget->currentActionSalvaged ) {
// The currentAction told the node to remove it. To prevent the action from
// accidentally deallocating itself before finishing its step, we retained
// it. Now that step is done, it's safe to release it.
[currentTarget->currentAction release];
.. which is on line 327 of CCActionManager.m.
All right, I solved this one on my own. The key lesson here was this:
If you are "cleaning up" a Cocos session, you should be fine to do it FROM a Cocos callback (CCCallFunc), but do NOT call [[CCDirector sharedDirector] end] until after your Cocos callback's call stack has unwound.
That is all.
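For illustration, a minimal sketch of one way to defer the teardown (the method names here are illustrative, not from the code above):
- (void)characterDidFinishSession
{
    // Let the current action / CCCallFunc step finish unwinding first,
    // then tear cocos2d down on the next pass through the run loop.
    [self performSelector:@selector(shutdownCocos) withObject:nil afterDelay:0];
}
- (void)shutdownCocos
{
    [[CCDirector sharedDirector] end];
    // ...switch back to the UIKit-based view here...
}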

Xcode - touchesBegan - Recent Touches/New Touches

I have been using touches began to track as many as 8 touches, and each triggers an event. These touches can occur at the same time, or staggered.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
NSLog(#"Touch Began");
NSSet *allTouches = [event allTouches];
for (int i=0; i<allTouches.count; i++) {
UITouch *touch = [[allTouches allObjects] objectAtIndex:i];
if (/*touch inside button in question*/) {
//Trigger the event.
}
}
}
That code works for multitouch, and it has no problems, EXCEPT: (See if you can guess)
Due to the way allTouches works, it literally gets all of the touches. Because of this, it loops through every touch that is currently active whenever the user starts another touch, and so it triggers the event of a button that is already being pressed a second time.
Ex: Johnny is pressing button 1. Event 1 occurs. Johnny leaves his finger on button 1, and presses button 2. Event 2 occurs, BUT button 1 is still a part of allTouches, and so, event 1 is triggered again.
So here's the question: How do I get the new touch?
The same UITouch object is reused for a given finger for the whole touch sequence, so it will show up again in [event allTouches] on subsequent calls to touchesBegan. Just save each UITouch *touch that you have already handled as begun (and not yet ended), and the next time you iterate in touchesBegan, skip the ones you've saved/marked.
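A minimal sketch of that bookkeeping applied to the code above (activeTouches is an assumed NSMutableSet ivar, created in init):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touch Began");
    for (UITouch *touch in [event allTouches]) {
        if ([activeTouches containsObject:touch]) {
            continue;                          // this finger already triggered its event
        }
        [activeTouches addObject:touch];       // mark it as begun/handled
        // Check whether this touch is inside the button in question,
        // and trigger that button's event exactly once.
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [activeTouches removeObject:touch];    // the finger lifted; stop tracking it
    }
}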

iPhone: Tracking/Identifying individual touches

I have a quick question regarding tracking touches on the iPhone, and I can't seem to come to a conclusion on this, so any suggestions / ideas are greatly appreciated:
I want to be able to track and identify touches on the iPhone, i.e. basically every touch has a starting position and a current/moved position. Touches are stored in a std::vector and shall be removed from the container once they end. Their positions shall be updated when they move, but I still want to keep track of where they initially started (gesture recognition).
I am getting the touches from [event allTouches]. The thing is, the NSSet is unsorted, and I can't seem to match the touches already stored in the std::vector to the touches in the NSSet (so that I know which ones ended and shall be removed, which ones have moved, etc.).
Here is my code, which works perfectly with only one finger on the touch screen, of course, but with more than one, I do get unpredictable results...
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{
[self handleTouches:[event allTouches]];
}
- (void) touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event
{
[self handleTouches:[event allTouches]];
}
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event
{
[self handleTouches:[event allTouches]];
}
- (void) touchesCancelled:(NSSet*)touches withEvent:(UIEvent*)event
{
[self handleTouches:[event allTouches]];
}
- (void) handleTouches:(NSSet*)allTouches
{
for(int i = 0; i < (int)[allTouches count]; ++i)
{
UITouch* touch = [[allTouches allObjects] objectAtIndex:i];
NSTimeInterval timestamp = [touch timestamp];
CGPoint currentLocation = [touch locationInView:self];
CGPoint previousLocation = [touch previousLocationInView:self];
if([touch phase] == UITouchPhaseBegan)
{
Finger finger;
finger.start.x = currentLocation.x;
finger.start.y = currentLocation.y;
finger.end = finger.start;
finger.hasMoved = false;
finger.hasEnded = false;
touchScreen->AddFinger(finger);
}
else if([touch phase] == UITouchPhaseEnded || [touch phase] == UITouchPhaseCancelled)
{
Finger& finger = touchScreen->GetFingerHandle(i);
finger.hasEnded = true;
}
else if([touch phase] == UITouchPhaseMoved)
{
Finger& finger = touchScreen->GetFingerHandle(i);
finger.end.x = currentLocation.x;
finger.end.y = currentLocation.y;
finger.hasMoved = true;
}
}
touchScreen->RemoveEnded();
}
Thanks!
It appears the "proper" way to track multiple touches is by the pointer value of the UITouch object.
You can find more details in the "Handling a Complex Multi-Touch Sequence" section of this
Apple Developer Documentation
To fix your problem, scrap your handleTouches method. The first thing you do in handleTouches is switch on the touch phase, but that information is already given to you: if you receive the touch in touchesBegan, you know the touch is in UITouchPhaseBegan. By funneling touches from the four touch methods into one method, you are defeating the purpose of having four delegate methods.
In each of those methods, Apple gives you an opportunity to deal with a different phase of the current touch.
The second thing is that you don't need to search the event for the current touch; it is given to you as a parameter: touches.
An event is composed of sets of touches. For convenience, you are given the current touches even though they can also be found within the event.
So, in touchesBegan, you start tracking a touch.
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event{
    NSString *startPoint = NSStringFromCGPoint([[touches anyObject] locationInView:self]);
    NSDictionary *touchData = [NSDictionary dictionaryWithObjectsAndKeys:startPoint, @"location", touches, @"touch", nil];
    [startingLocations addObject:touchData];
}
I'm using an array of dictionaries to hold my touch data.
Try to separate your code and move it into the appropriate touch methods. For direction, Apple has a couple of sample projects that focus on touches and show you how to set up those methods.
Remember, these methods will get called automatically for each touch during each phase; you don't need to cycle through the event to find out what happened.
The pointer to each set of touches remains constant, just the data changes.
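A minimal sketch of keying your bookkeeping off the touch object itself (fingerForTouch is an assumed NSMutableDictionary ivar; valueWithNonretainedObject: wraps the pointer without retaining the UITouch):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        // Remember where this finger started.
        [fingerForTouch setObject:[NSValue valueWithCGPoint:[touch locationInView:self]] forKey:key];
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        // The same key retrieves the entry stored in touchesBegan for this finger.
        CGPoint start = [[fingerForTouch objectForKey:key] CGPointValue];
        NSLog(@"finger that started at %@ is now at %@",
              NSStringFromCGPoint(start),
              NSStringFromCGPoint([touch locationInView:self]));
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        [fingerForTouch removeObjectForKey:[NSValue valueWithNonretainedObject:touch]];
    }
}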
Also, I would read the iPhone OS programming guide's section on event handling, which goes into greater depth on what I said above, with several diagrams explaining the relationship of touches to events over time.
An excerpt:
In iPhone OS, a UITouch object represents a touch, and a UIEvent object represents an event. An event object contains all touch objects for the current multi-touch sequence and can provide touch objects specific to a view or window (see Figure 3-2). A touch object is persistent for a given finger during a sequence, and UIKit mutates it as it tracks the finger throughout it. The touch attributes that change are the phase of the touch, its location in a view, its previous location, and its timestamp. Event-handling code evaluates these attributes to determine how to respond to the event.
You should be able to properly collate your touches by storing the previous location of all touches and then comparing these previous locations when new touches are detected.
In your -handleTouches method, you could put something like this in your for loop:
// ..existing code..
CGPoint previousLocation = [touch previousLocationInView:self];
// Loop through previous touches
for (int j = 0; j < [previousTouchLocationArray count]; j++) {
if (previousLocation == [previousTouchLocationArray objectAtIndex:j]) {
// Current touch matches - retrieve from finger handle j and update position
}
}
// If touch was not found, create a new Finger and associated entry
Obviously you'll need to do some work to integrate this into your code, but I'm pretty sure you can use this idea to correctly identify touches as they move around the screen. Also I just realized CGPoint won't fit nicely into an NSArray - you'll need to wrap these in NSValue objects (or use a different type of array).