cocos2d z order on iPhone

I am working on a game with many sprites in a layer, and once you touch a sprite, it does something.
My problem is the z order: most of my sprites overlap, and when you touch the overlapping sprites, the one behind (I think the one with the lowest z order) reacts instead of the one in front.
I need to understand more about the z order in cocos2d.
How do I change the z order at runtime?
Thanks!

Z order and the touch handling system are separate. Z order is only for the visual layout of your layers. The touch handling system, however, relies on the priority assigned when you register with the touch dispatcher. If you register two layers with the touch dispatcher that have the same priority, then the second layer will get the touches first, regardless of the Z ordering of the layers.
Here's the part that really confused me when I had the same issue. Whereas Z order puts higher numbers on top visually, it's exactly the opposite with touches: LOWER priority numbers actually get the touches first. To keep my own sanity, I refactored my code so that, whenever possible, I added layers in the same order as the Z index anyway, so the touches of the top layers would behave intuitively.
When this isn't possible, I use the touch priority system, and I define constants so that I don't get confused later. To register for touches using the priority system, use the following:
- (void)registerWithTouchDispatcher {
    [[TouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                   priority:DEFAULT_TOUCH_PRIORITY
                                            swallowsTouches:YES];
}
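For illustration, the constants might be defined something like this (the names and values are my own, not anything cocos2d ships with):

    // Example priority constants -- remember that LOWER values receive touches FIRST.
    #define DEFAULT_TOUCH_PRIORITY   0
    #define HUD_TOUCH_PRIORITY      -1   // e.g. a HUD layer that should grab touches before the game layer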

Are you using the Touch Dispatcher added in 0.8?
A singleton that dispatches touches like in v0.7.x (Standard) or one touch at a time (Targeted). The benefit of using Targeted touches is that you only receive the events (begin, move, end, cancel) of the touch that you claim. Using Targeted touches simplifies the touch handling code in both multi-touch and single-touch games. Unless you need complex touch handling code, the use of Targeted touches is recommended.
Another benefit of the new TouchDispatcher is that it has a priority-queue, so you can alter the position of the touch consumers.
You can use the priority queue to accomplish what you want. You can set the priority on each sprite to define which sprites should respond to a touch first, and whether they should pass the touch on or swallow it, so that nothing else gets the touch event after it has been handled.
The Touches Test example in the cocos2d project is probably the best place to look: http://code.google.com/p/cocos2d-iphone/source/browse/trunk/tests/TouchesTest (especially the Paddle.m class)
In the onEnter method in your sprite class you can set the priority of the touch dispatcher:
- (void)onEnter
{
    [[TouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
    [super onEnter];
}
Delegates with a lower priority value will respond before those with a higher value.

Think of it as layers of paper: 1 will be on top of 0, and 0 is somewhere above -5.
16 is above 14. That's all it is; higher numbers are drawn on top of lower numbers, whether you choose to use negative numbers or positive numbers.
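To address the runtime part of the original question: in cocos2d-iphone a child's z order is managed by its parent, so a minimal sketch (assuming sprite has already been added to self, and the z values are just examples) would be:

    // Add the sprite with an initial z order...
    [self addChild:sprite z:1];

    // ...then later, at runtime, move it above (or below) its siblings.
    // reorderChild:z: is called on the parent node; higher z draws on top.
    [self reorderChild:sprite z:10];   // 10 is just an example value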

Related

Unity3d - check for collision on non-moving objects

I have a sphere built from multiple objects. What I want to do is: when I touch/click an object, that object should find all adjacent objects. But because none of them are moving, no collision detection can be used.
I can't find a way to detect these adjacent objects even though the colliders do intersect each other, as I can see in the scene. I have tried all the possibilities, but none of them work, because no objects are moving.
Is there a way to do manual collision detection, or is there some way to let Unity3d do the collision detection automatically?
You could keep a list of all those objects; then when your event happens you can send messages to all of them to do what you want them to do.
Let's assume you want your sphere to break into little pieces. You send a force message to the sphere. Then you use Newton's laws of motion to find out how much velocity each piece gets. Remember velocity is a vector, so it has a direction.
This is how I would do it and still keep the right amount of control over what happens in my game/simulation. Remember F = ma.
You could use RaycastHit (http://docs.unity3d.com/Documentation/ScriptReference/RaycastHit.html) for your collision detection; this also works on non-moving objects, but it is more expensive.
You can add a rigidbody to every object; when you touch one of them, apply a force to it, so it starts to move and triggers the collision events of the adjacent objects.
Since you do not want the touched object to actually move, you can cancel the movement in the OnCollision or OnTrigger event handler.
I managed to work around this by checking the distance between the selected object and all other objects that are part of the sphere. If the distance is below a certain value, it is an adjacent object.
Although this is certainly not foolproof, it has worked without problems so far.
I am sorry I was not clear enough. Thanks for all the advice.

Libgdx: apply force to actor towards finger touch/drag direction

Consider that I am pulling a bus by a finger touch-and-drag movement. The road is not straight, so if I move my finger along the road, the bus should follow the finger, with whatever rotations are needed when there is a turn.
First I find the distance between the actor and the touch point, and if it is small enough
I simply set the position of the bus (Actor) to the touch point's location.
Now it feels like I am dragging the bus.
Now I have to handle the rotations.
Can I apply a force to the actor towards the touch point?
Can I handle rotation of the actor towards the touch point?
The simple logic in my mind is that dragging the finger means drawing a line; now I have to match the center line of the actor to the dragged line.
Please give me a hint regarding handling the rotation of the bus.
Thanks,
It sounds to me like you just want the bus to follow a path like a normal bus would follow a road.
You can rotate a body by applying torque to it. That means you will use applyForce(...) and not use the center of mass as the point to apply the force to.
But you don't want to apply a force and make it move towards a certain target point like that, because that would just look weird, and you would need some special handling for realistic car physics (for a top-down view, see http://www.iforce2d.net/b2dtut/top-down-car).
Better to just calculate a path yourself and calculate the angle between different points on that path. Then use body.setTransform(...) to set the position and rotation of the bus manually. That's how you would do it anyway if you didn't have a physics engine.

How to detect a circle motion with UIGestureRecognizer

I want to be able to detect someone's finger drawing a circular motion on the screen - as if they were drawing an 'O'. Is this possible with UIGestureRecognizer?
I think the answer to this depends on your definition of circular motion and how you intend to use it. For example, do you want to know how many degrees along a circle the user's finger has travelled? Or, do you only care about a circle being completed? What is the degree of accuracy you require? Do you want to allow for the motion to be interrupted, or does this have to be more of a touch-down > draw-circle > touch-up (in other words, a single motion)?
One approach would be to define a bunch of rectangular zones along the circumference and detect if the user is touching these in sequence. This can provide you with direction and a coarse indication of angle.
Another approach is to store the points between touch down and touch up and do some filtering and curve fitting to figure out what shape is approximated by the points. First low-pass-filter using a basic FIR filter and then look at the dx and dy from point to point. A circle (as a series of arcs) will have to fall within a certain range of slope changes from point to point, otherwise you have some other shape.
Yet another approach is to use a Neural Network to take the points and tell you what the shape looks like.
I think this may be what you need
How to detect circular gesture via Gesture Recognizer?
Instead of using a gesture recognizer, this project reacts to circular motions by tracking the angle of UITouch events.
My answer to my question:
I used this: http://iphonedevelopment.blogspot.com/2009/04/detecting-circle-gesture.html
.. but turned the CircleView into a custom UIGestureRecognizer. Everything lovely.
No, it doesn't natively recognize a circular motion.
You have to implement your own method to do that.
Here's how I needed to do it using the touches callbacks in my view controller, but this could be made into a gesture too. Note: I was trying to detect multiple circle motions (2 or more clockwise or counterclockwise circles made during a touch event). A rough sketch of these steps follows the list below.
Store touchesMoved CGPoints in an array.
Create a min/max rect of all the points in your history array.
Divide this min/max rect into 4 smaller rects.
Assign each history point a quadrant using CGRectContainsPoint() for each of the 4 quadrants.
A clockwise motion will have quadrants ascending. A counter-clockwise motion will have quadrants descending.
Check the ratio of width/height if you want to detect circles vs. ovals.
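Here is that sketch in Objective-C. The function names, thresholds, and quadrant numbering are my own, it only reports a single clockwise circle, and a midpoint comparison stands in for the four CGRectContainsPoint() checks:

    #import <UIKit/UIKit.h>

    // 0 = top-left, 1 = top-right, 2 = bottom-right, 3 = bottom-left (screen coords, y down).
    static NSInteger QuadrantForPoint(CGPoint p, CGRect bounds) {
        BOOL right  = p.x >= CGRectGetMidX(bounds);
        BOOL bottom = p.y >= CGRectGetMidY(bounds);
        if (!bottom) return right ? 1 : 0;
        return right ? 2 : 3;
    }

    // history is an NSArray of NSValue-wrapped CGPoints collected in touchesMoved.
    static BOOL LooksLikeClockwiseCircle(NSArray *history) {
        if (history.count < 8) return NO;

        // 1. Min/max rect of all points in the history.
        CGFloat minX = CGFLOAT_MAX, minY = CGFLOAT_MAX, maxX = -CGFLOAT_MAX, maxY = -CGFLOAT_MAX;
        for (NSValue *v in history) {
            CGPoint p = [v CGPointValue];
            minX = MIN(minX, p.x); maxX = MAX(maxX, p.x);
            minY = MIN(minY, p.y); maxY = MAX(maxY, p.y);
        }
        CGRect bounds = CGRectMake(minX, minY, maxX - minX, maxY - minY);
        if (bounds.size.width < 1.0f || bounds.size.height < 1.0f) return NO;

        // 2. Width/height ratio check to tell circles from long ovals (threshold is arbitrary).
        CGFloat ratio = bounds.size.width / bounds.size.height;
        if (ratio < 0.5f || ratio > 2.0f) return NO;

        // 3. Walk the points; clockwise means the quadrant index keeps ascending (0->1->2->3->0...).
        NSInteger ascending = 0, descending = 0;
        NSInteger prev = QuadrantForPoint([[history objectAtIndex:0] CGPointValue], bounds);
        for (NSUInteger i = 1; i < history.count; i++) {
            NSInteger q = QuadrantForPoint([[history objectAtIndex:i] CGPointValue], bounds);
            if (q == (prev + 1) % 4) ascending++;          // one step clockwise
            else if (q == (prev + 3) % 4) descending++;    // one step counter-clockwise
            prev = q;
        }
        return ascending >= 3 && ascending > descending;
    }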

iPhone: how do I know when the user is touching the screen but not moving?

I want to know when a user is touching the screen but not moving. Doing it in a pseudo fashion is simple enough - I know how many touches I have just by using touchesBegan and touchesEnded, but the problem is that only touchesMoved sends events. No events are sent if you aren't moving. This is being used to have a nice sliding scroll - you can fling the scrolling and it will continue to scroll even after you've released, but it will immediately stop if your finger is down but not moving. I can't just set a fingerDown boolean in touchesMoved and then set it false in my loop (where the scrolling and sliding is happening), because they're not synchronized.
Basically, I want to simulate having a touchesNotMoved event - whenever you are moving, a certain bool is true, when you're not moving, it's false.
Also please don't ask me why I'm not just using Apple's scrolling - there's a good reason that has nothing to do with this question. :-)
This isn't the answer to knowing when touches aren't active, but it is a reasonable solution to what I wanted to do, anyway.
So, if anyone else is trying to simulate Apple's scroll views (pixel-perfect scrolling plus a nice fling and slide), then you will want the following:
An integer keeping track of the number of touches.
A loop or timer that is separate from the touch events.
A vector (or two floats) storing a scroll "decay" amount.
A constant float decay value, like 0.95.
Then with those you can do the following:
- In touchesBegan, increment touchCount by [touches count].
- In touchesEnded, decrement touchCount by [touches count].
- In touchesMoved, create a vector that represents currentTouchPos - previousTouchPos.
- Scroll your view by that vector.
- Set your scrollDecay vector equal to that vector.
- In your main timer or loop, scroll your view by the scrollDecay vector only if touchCount <= 0. In addition, multiply the scrollDecay vector by the decay value. Once it reaches a very low value (say, 0.1), set it to 0.
And you're done. It works quite well, I can't see any discernible differences between this and Apple's scrolling.
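For what it's worth, here is a minimal sketch of that scheme inside a UIView subclass. The ivar names, the scrollByVector: helper, and the way the timer is driven are assumptions for illustration only:

    static const CGFloat kDecay = 0.95f;   // example decay constant

    // Assumed ivars on the view: NSInteger touchCount; CGPoint scrollDecay; CGPoint previousTouchPos;

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        touchCount += touches.count;
        previousTouchPos = [[touches anyObject] locationInView:self];
        scrollDecay = CGPointZero;                     // putting a finger down stops any sliding
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint current = [[touches anyObject] locationInView:self];
        CGPoint delta = CGPointMake(current.x - previousTouchPos.x,
                                    current.y - previousTouchPos.y);
        [self scrollByVector:delta];                   // hypothetical scroll helper
        scrollDecay = delta;                           // remember the last movement for the fling
        previousTouchPos = current;
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        touchCount -= touches.count;
    }

    // Called every frame by a CADisplayLink or NSTimer.
    - (void)tick {
        if (touchCount <= 0) {
            [self scrollByVector:scrollDecay];
            scrollDecay = CGPointMake(scrollDecay.x * kDecay, scrollDecay.y * kDecay);
            if (fabs(scrollDecay.x) < 0.1 && fabs(scrollDecay.y) < 0.1) {
                scrollDecay = CGPointZero;
            }
        }
    }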

Simple iPhone drawing app with Quartz 2D

I am making a simple iPhone drawing program as a personal side-project.
I capture touch events in a subclassed UIView and render the actual drawing to a separate CGLayer. After each render, I call [self setNeedsDisplay], and in the drawRect: method I draw the CGLayer into the screen context.
This all works great and performs decently for drawing rectangles. However, I just want a simple "freehand" mode like a lot of other iPhone applications have.
The way I thought to do this was to create a CGMutablePath, and simply:
CGMutablePathRef path;

- (void)touchBegan {
    path = CGPathCreateMutable();
}

- (void)touchMoved {
    CGPathMoveToPoint(path, NULL, x, y);
    CGPathAddLineToPoint(path, NULL, x, y);
}

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextBeginPath(context);
    CGContextAddPath(context, path);
    CGContextStrokePath(context);
}
However, after drawing for more than 1 second, performance degrades miserably.
I would just draw each line into the off-screen CGLayer, if it were not for variable opacity! The less-than-100% opacity causes dots to be left on the screen connecting the lines. I have looked at CGContextSetBlendMode(), but alas I cannot find an answer.
Can anyone point me in the right direction? Other iPhone apps are able to do this with very good efficiency.
The problem is that with CGContextStrokePath() the current mutable path gets closed and drawn, and a new path is created when you move your finger. So you probably end up with a lot of paths for one touch "session"; at least that's what your pseudocode seems to do.
You can try to begin a new mutable path when touches begin, use CGPathAddLineToPoint() when the touches move, and end the path when the touches end (much like your pseudocode shows). But in the draw method, draw a copy of the current mutable path; the actual mutable path is still being extended until the touches end, so you only get one path for the whole touch session. After the touches end you can add the path permanently - for example, you can put all paths into an array and iterate over them in the draw method.
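A sketch of that approach, assuming a UIView subclass with an in-progress CGMutablePathRef plus an NSMutableArray of finished strokes (UIBezierPath is used here only as a convenient Cocoa wrapper for storing finished CGPaths; all names are illustrative):

    // Assumed ivars: CGMutablePathRef currentPath; NSMutableArray *finishedPaths; (created in init)

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        currentPath = CGPathCreateMutable();
        CGPoint p = [[touches anyObject] locationInView:self];
        CGPathMoveToPoint(currentPath, NULL, p.x, p.y);
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        CGPoint p = [[touches anyObject] locationInView:self];
        CGPathAddLineToPoint(currentPath, NULL, p.x, p.y);   // keep extending ONE path
        [self setNeedsDisplay];
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        // Commit the finished stroke and let go of the mutable path.
        [finishedPaths addObject:[UIBezierPath bezierPathWithCGPath:currentPath]];
        CGPathRelease(currentPath);
        currentPath = NULL;
        [self setNeedsDisplay];
    }

    - (void)drawRect:(CGRect)rect {
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        for (UIBezierPath *stroke in finishedPaths) {
            CGContextAddPath(ctx, stroke.CGPath);
            CGContextStrokePath(ctx);
        }
        if (currentPath) {
            // Stroke a copy of the still-growing path for the in-progress stroke.
            CGPathRef copy = CGPathCreateCopy(currentPath);
            CGContextAddPath(ctx, copy);
            CGContextStrokePath(ctx);
            CGPathRelease(copy);
        }
    }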
What SanHolo said - plus you may want to throttle the adding of points, so it adds a new point no more often than every 10 ms, say (you'd need to play with the interval). You can do that with a simple timer.
Also, how are you instructing the view that it needs to redraw itself? You might want to throttle that too - and it could be on a longer interval than the point capturing (e.g. capture points no more than every 10 ms, and redraw no more often than every 200 ms - again you'd need to play with the numbers).
In both cases you'd need to make sure that, if nothing happens for longer than the interval, the last point is still captured or the redraw is still requested. That's where the timer comes in.
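A small sketch of that throttling idea; the intervals, ivars, and the capturePoint: helper are assumptions to illustrate the timing logic, not anything from UIKit:

    static const NSTimeInterval kPointInterval  = 0.01;   // capture at most every 10 ms (tune this)
    static const NSTimeInterval kRedrawInterval = 0.2;    // redraw at most every 200 ms (tune this)

    // Assumed ivars: NSTimeInterval lastPointTime, lastRedrawTime; CGPoint pendingPoint; BOOL hasPendingPoint;

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
        pendingPoint = [[touches anyObject] locationInView:self];
        hasPendingPoint = YES;

        if (now - lastPointTime >= kPointInterval) {
            [self capturePoint:pendingPoint];          // hypothetical: appends the point to the current path
            hasPendingPoint = NO;
            lastPointTime = now;
        }
        if (now - lastRedrawTime >= kRedrawInterval) {
            [self setNeedsDisplay];
            lastRedrawTime = now;
        }
    }

    // A repeating NSTimer, e.g.
    //   [NSTimer scheduledTimerWithTimeInterval:kRedrawInterval target:self
    //            selector:@selector(flushTimerFired:) userInfo:nil repeats:YES];
    // makes sure the last point and redraw still happen when the finger pauses longer than the intervals.
    - (void)flushTimerFired:(NSTimer *)timer {
        if (hasPendingPoint) {
            [self capturePoint:pendingPoint];
            hasPendingPoint = NO;
            [self setNeedsDisplay];
        }
    }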