How to detect a touch on a sprite positioned beyond the device's screen resolution - iPhone

I'm looking for help. I'm making a puzzle game that has a thin scrollable layer at the bottom of my main layer containing some puzzle shapes. I can scroll the layer and see every shape, but the shapes were positioned manually in code on the scrollable layer, and the problem is that touch detection fails for any sprite whose position is greater than 1024 (on the first iPad). It doesn't work because a touch can only have a position inside 1024x768, while the position of a shape might be, for example, 1500x100. To make it clearer: the shapes are sprites, and I try to detect them using the CGRectContainsPoint method. Is there another way to do this, or do you have any ideas? Thanks in advance :]

What you could do is subclass your sprite, create a delegate for it, and assign your main view/class as the delegate.
Implement the appropriate touch method, and send a message identifying which sprite was selected to your delegate (the main view or desired controller class).
With this, every sprite has the same delegate and sends a message to your controlling class saying which sprite was selected, and you can continue with the desired functionality. No need for the CGRectContainsPoint method.
This is cleaner and more efficient.
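As a bare-bones sketch of that delegate pattern (all of the names below are hypothetical, and how the sprite actually receives the touch depends on your cocos2d setup):

    #import "cocos2d.h"

    @class PuzzleShape;

    @protocol PuzzleShapeDelegate <NSObject>
    - (void)puzzleShapeSelected:(PuzzleShape *)shape;
    @end

    @interface PuzzleShape : CCSprite
    @property (nonatomic, assign) id<PuzzleShapeDelegate> delegate;
    @end

    @implementation PuzzleShape
    // Call this from whatever touch handling you hook up for the sprite;
    // it simply tells the controlling class which shape was selected.
    - (void)notifySelected
    {
        [self.delegate puzzleShapeSelected:self];
    }
    @end

The controlling class then adopts PuzzleShapeDelegate and implements puzzleShapeSelected: to continue with the game logic.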
Hope this helps!

Related

Adding two CCLayers in a single CCScene side by side

I am trying to add two CCLayers side by side (not one over the other) on a single scene, with the second layer having a table view on it. I have added the table view as a subview of the [[CCDirector sharedDirector] view]. The size of the second layer is that of the screen, and the size of the first layer is somewhat less than the screen size. When this screen appears, the second layer is shown first. After a particular button on the second layer is tapped, the scene animates to the right along with the table on the second layer, showing the full first layer and part of the second layer.
Can anyone help me with it? I appreciate all your help in advance.
This has little or nothing to do with layers. A layer and its children are always drawn either on top of or behind another layer. And UIViews have no understanding of cocos2d layers or nodes at all.
What you want to do is to design the screen, then move the entire screen left or right. While moving, you need to update the table view's position accordingly every frame because it won't move along with cocos2d nodes. The CCUIViewWrapper may help but it's a little overkill if you merely want to update the view's position.
Whether you design your screen on the scene, a single or multiple layers makes no difference. In fact it's easier to use a single scene or layer because then you can animate both sides of the screen at once and in perfect synchronicity.
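A minimal sketch of that per-frame sync, assuming the table view was added to the director's view and that anchorNode and tableView are placeholder properties on the layer:

    // In the layer's init, call [self scheduleUpdate] so this runs every frame.
    - (void)update:(ccTime)delta
    {
        // Convert the anchor node's position (GL coordinates) to UIKit
        // coordinates and keep the UITableView pinned to it while the
        // scene slides left or right.
        CGPoint glPoint = [self convertToWorldSpace:self.anchorNode.position];
        CGPoint uiPoint = [[CCDirector sharedDirector] convertToUI:glPoint];
        self.tableView.center = uiPoint;
    }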

How to drag a view with two fingers

Hi all:
I want to write a gesture recognizer on the iPhone so that I can use two fingers to drag a view, just like using two fingers on a MacBook Pro's trackpad.
If the view's size is larger than the device's window size, I want to use two fingers to drag the view. Is there any good way to do this?
(Thanks enamrik and Naveen Thunga. I just achieved my goal, but there is a new question: how can I know when the UILongPressGestureRecognizer is over? I want to call a method and set some values when my finger leaves the screen of my iPhone.)
The UILongPressGestureRecognizer class has a numberOfTouchesRequired property you might find useful.
Use UIGestureRecognizer's properties and enable the view's multiple-touch support (multipleTouchEnabled). The UIGestureRecognizer sample code may also help you.
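For the two-finger drag itself, a UIPanGestureRecognizer with minimumNumberOfTouches set to 2 is usually enough, and checking for UIGestureRecognizerStateEnded in the handler covers the "when does the gesture finish" question. A rough sketch, assuming draggableView is a placeholder for the view you want to move:

    - (void)viewDidLoad
    {
        [super viewDidLoad];
        UIPanGestureRecognizer *pan =
            [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handleTwoFingerPan:)];
        pan.minimumNumberOfTouches = 2;   // require two fingers
        pan.maximumNumberOfTouches = 2;
        self.draggableView.multipleTouchEnabled = YES;
        [self.draggableView addGestureRecognizer:pan];
    }

    - (void)handleTwoFingerPan:(UIPanGestureRecognizer *)pan
    {
        // Move the view by the gesture's translation, then reset it so the
        // next callback reports only the incremental movement.
        CGPoint t = [pan translationInView:self.view];
        pan.view.center = CGPointMake(pan.view.center.x + t.x,
                                      pan.view.center.y + t.y);
        [pan setTranslation:CGPointZero inView:self.view];

        if (pan.state == UIGestureRecognizerStateEnded ||
            pan.state == UIGestureRecognizerStateCancelled) {
            // The fingers have left the screen: call your method and
            // set your values here.
        }
    }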

iPhone: Rotating a drawing without rotating the view within which it is drawn

Anyone who has gone through the Stanford CS193P class on iTunes will recognize this from Assignment 3 (HelloPoly). For those not familiar:
I have a custom view called polyView, an instance of PolygonView, a subclass of UIView. On this view, I use CGContextDrawLinearGradient to paint a color gradient over the entire rectangular view. Then I use CGContextDrawPath to stroke and fill a polygon within the bounds of polyView. And I have a UILabel called nameLabel in the center of the view (and polygon) that displays the name of the polygon (triangle, quadrilateral, pentagon, etc.). All of this works fine, and the code to do all this is in the -(void)drawRect method of the PolygonView class.
Where I ran into trouble is with an additional requirement to rotate the polygon within the view in response to user gestures. I used CGAffineTransformRotate() in response to touchesBegan() and touchesMoved() events within the PolygonView class, and this basically works, too. But I can only rotate the entire polyView, not just the polygon drawn on it. I'm sure I could go back and recalculate the path of the polygon and redraw/fill the path in response to each touchesMoved() event, but that would be expensive and can't be the best method. How can I use CGAffineTransformRotate to rotate just the polygon, without rotating the gradient-filled view or the label in the center?
Or is there some way to create the polygon on a layer that I can place over the background polyView at the desired rotation angle?
Thanks for any help you can give a beginner here!
Duane
You can do a CGContextSaveGState(...) just before applying the rotation transformation, then draw the polygon and restore the drawing state with CGContextRestoreGState(...) afterwards, so as not to affect any other drawing in the view later.
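A sketch of how that could look inside PolygonView's drawRect:, assuming rotationAngle is an instance variable you update from the touch handlers:

    - (void)drawRect:(CGRect)rect
    {
        CGContextRef context = UIGraphicsGetCurrentContext();

        // ... draw the gradient background here, unrotated ...

        CGContextSaveGState(context);

        // Rotate the coordinate system about the centre of the view,
        // so only the polygon path drawn below is affected.
        CGPoint center = CGPointMake(CGRectGetMidX(self.bounds),
                                     CGRectGetMidY(self.bounds));
        CGContextTranslateCTM(context, center.x, center.y);
        CGContextRotateCTM(context, rotationAngle);
        CGContextTranslateCTM(context, -center.x, -center.y);

        // ... build the polygon path, then stroke/fill it ...

        CGContextRestoreGState(context);   // subsequent drawing is unrotated
    }

The label in the center stays upright because it is a subview drawn outside drawRect:, untouched by the context transform.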

Touch multiple UIViews

There is a series of UIViews arranged very close together.
(image: http://www.mobilepanda.com/questiontouch.png)
I'd like my app to detect which UIView was touched when my finger touches some of them.
It could be one, two, or three of them (because the visible part of each UIView is very thin).
I'd like to get the middle x value of the touch, then expand the UIView where that x value falls, along with the UIViews near it.
(image: http://www.mobilepanda.com/questiontouch1.png)
My approach is to put a transparent UIView over all these UIViews to detect the touch event.
I am not sure whether this is OK, or whether there is a better solution (for example, giving each UIView the capability to detect the touch, then combining the results and deciding which UIView was touched).
Any comments are welcome.
Thanks
interdev
You don't need to do all that. The OS will decide what the center point of the finger touch is and send an event with the touch's x,y coordinates to the correct view. If you make them UIButtons (a subclass of UIView) instead of plain UIViews, the OS will do all the work for you. All you have to do is attach callbacks on each button to the methods you want called for the various events (such as touch up inside, touch down, etc.).
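A rough sketch of that button approach (numberOfStrips and stripWidth are placeholders):

    - (void)addStrips
    {
        for (NSInteger i = 0; i < numberOfStrips; i++) {
            UIButton *strip = [UIButton buttonWithType:UIButtonTypeCustom];
            strip.frame = CGRectMake(i * stripWidth, 0,
                                     stripWidth, self.view.bounds.size.height);
            strip.tag = i;   // so the handler knows which strip was hit
            [strip addTarget:self
                      action:@selector(stripTouched:)
            forControlEvents:UIControlEventTouchDown];
            [self.view addSubview:strip];
        }
    }

    - (void)stripTouched:(UIButton *)sender
    {
        // sender.tag identifies the touched strip; expand it and its
        // neighbours here.
    }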

How can I rotate an arrow image by touching that image?

I am working on a project where I need to rotate an image by touching it.
It should rotate faster or slower depending on how the user touches it.
Can you show me some tutorials, or explain how this can be done?
Place your image within a UIImageView, then either subclass that view and replace touchesBegan:withEvent: or set a delegate for it and implement the same method as a delegate method. This will give you the ability to respond to touch events (the beginning of a touch, in this case, although you can do the same for ending a touch or moving of the finger).
Within this touch handling method, you can implement something similar to what I describe here in order to perform a Core-Animation-enabled rotation of your UIImageView at a given speed. To alter the speed, change the duration property on the animation I provide. As I suggest there, you may want to look into a CAKeyframeAnimation to do a smoother animation with acceleration and deceleration at the beginning and end.
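As an illustration (not the exact code from the linked answer), a basic Core Animation rotation of the image view could look like this, with the duration controlling the speed; imageView is assumed to be your UIImageView:

    #import <QuartzCore/QuartzCore.h>

    // Spin the image view one full turn; a shorter duration means a faster spin.
    CABasicAnimation *spin =
        [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
    spin.fromValue = [NSNumber numberWithFloat:0.0f];
    spin.toValue   = [NSNumber numberWithFloat:2.0f * M_PI];
    spin.duration  = 1.0;   // adjust to control the speed
    [imageView.layer addAnimation:spin forKey:@"spin"];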
An easier way is to set up an NSTimer and rotate the transform every time it fires.
I've some sample code here that coincidentally does something similar:
http://github.com/kailoa/touchsamplecode/tree/master
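A sketch of the NSTimer approach mentioned above, assuming spinTimer and imageView are properties of your view controller:

    - (void)startSpinning
    {
        // Fire roughly 30 times a second; each tick nudges the rotation.
        self.spinTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0
                                                          target:self
                                                        selector:@selector(rotateStep:)
                                                        userInfo:nil
                                                         repeats:YES];
    }

    - (void)rotateStep:(NSTimer *)timer
    {
        // A bigger angle per tick spins faster; invalidate the timer to stop.
        self.imageView.transform =
            CGAffineTransformRotate(self.imageView.transform, 0.05);
    }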
Using cocos2d, you can't have "touch enabled" sprites; isTouchEnabled exists at the layer level. You'll have to receive the touch at the layer level, then check the location of the touch against the location of the touchable sprite. The CGRect functions include a "rect contains point" check (CGRectContainsPoint) to which you can pass the touch location along with the sprite's rect to see if it was "touched", at which point you could say [sprite runAction:[Rotate ...]].
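A sketch of that layer-level check, assuming the layer has self.isTouchEnabled = YES and that its children are the touchable sprites; because convertTouchToNodeSpace: takes the layer's own position into account, the same check also works for a scrolled layer, as in the first question above:

    - (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        // Convert into this layer's coordinate space, so an offset/scrolled
        // layer still lines up with its children's positions.
        CGPoint location = [self convertTouchToNodeSpace:touch];

        for (CCSprite *sprite in self.children) {
            if (CGRectContainsPoint([sprite boundingBox], location)) {
                [sprite runAction:[CCRotateBy actionWithDuration:0.5f angle:90.0f]];
                break;
            }
        }
    }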