How do I move a view with my finger on the iPhone (Objective-C)?

I have a view whose size is 1280 x 345, and I'm moving it left and right on my screen.
Now, my main question is: how do I make it move with my finger (not with swipeLeft / swipeRight)? I want it to track my finger, like the home screen of iOS.
Right now I'm using this code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:infoView];
    //[self.view setTransform:CGAffineTransformTranslate(CGAffineTransformIdentity, (location.x + self.view.frame.origin.x - dx), 0.0)];
    NSLog(@"touch: %f", location.x);
    infoView.frame = CGRectMake(infoView.frame.origin.x + location.x, 0, 1280, 345);
}
That should be the right approach, but I can't figure it out.
I have also tried to find an answer on Google and here, but, as you know, I didn't find anything useful.
I've also made this so you can understand it better.

Don't write code to do this yourself. Use UIScrollView. This is what it is designed for.

If you want to have the same effect as the iPhone/iPad home screen you should use a UIScrollView and enable paging.
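A minimal sketch of that setup (the 1280 x 345 content size comes from the question; the 320-pt page width and the `infoView` name are assumptions):

```objc
// Sketch: a paging scroll view, like the iOS home screen.
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, 320, 345)];
scrollView.contentSize = CGSizeMake(1280, 345);   // four 320-pt "pages"
scrollView.pagingEnabled = YES;                   // snap to page boundaries
scrollView.showsHorizontalScrollIndicator = NO;
[scrollView addSubview:infoView];                 // the 1280 x 345 view being dragged
[self.view addSubview:scrollView];
```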

If you really want to handle it yourself for some reason, avoiding UIScrollView, there are UIGestureRecognizers, specifically UIPanGestureRecognizer, which handles most of the work of tracking touch events in one place.
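If you go that route, a sketch might look like this (`infoView` is assumed from the question; `handlePan:` is a name chosen here for illustration):

```objc
// Somewhere in setup, e.g. viewDidLoad:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
[infoView addGestureRecognizer:pan];

// Move the view horizontally by the pan's translation.
- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    CGPoint center = recognizer.view.center;
    center.x += translation.x;
    recognizer.view.center = center;
    // Reset so each callback delivers an incremental delta.
    [recognizer setTranslation:CGPointZero inView:self.view];
}
```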

I used UIPageControl and UIScrollView to make it. Very easy and smart objects!

Related

How to make certain parts of an image clickable in iOS?

In one of my apps I am using an image for the whole screen, which can be zoomed to some extent. The image has eight different shapes (people, shapes, etc.). What I am trying to do is make each shape of the image clickable; touching each part takes you to a different screen. I have no idea how to achieve this. I googled it but found no solution.
1.) Is this possible using coordinates? (Will the normal image and the zoomed image differ in coordinates? How would I achieve this with coordinates?)
2.) If not, what would be the best approach to achieve my goal?
Any ideas/samples are much appreciated.
I would add a UITapGestureRecognizer to the image view holding your image, and use the locationOfTouch:inView: method to determine the coordinates of your touch.
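A sketch of that idea (`imageView`, `imageTapped:`, and the region rectangle are illustrative assumptions, not from the original answer):

```objc
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(imageTapped:)];
imageView.userInteractionEnabled = YES;  // UIImageView ignores touches by default
[imageView addGestureRecognizer:tap];

- (void)imageTapped:(UITapGestureRecognizer *)recognizer {
    CGPoint p = [recognizer locationInView:recognizer.view];
    CGRect personRect = CGRectMake(20, 40, 100, 150);  // hypothetical shape region
    if (CGRectContainsPoint(personRect, p)) {
        // navigate to the screen for that shape
    }
}
```

Note that the hit rectangles are in the image view's coordinate space, so they stay valid as long as you test against the view rather than the zoomed image pixels.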
Correct me if I don't understand your question, but this should be very simple: just have a couple of buttons with clear backgrounds, all placed on top of the image.
Check UIResponder and the touches methods there. You'll probably want to hook into something like -touchesEnded:withEvent: to detect when a finger lifts off the screen.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:imageView];  // returns a CGPoint, not a CGRect
        // Check that the point falls inside the region of interest
        // and, if it does, react to it.
    }
}
Also, a link to UITouch.

Touch Controls working on Simulator, not on Device

See topic. It's just one application -- two of them work fine, but the third (and largest, go figure) doesn't respond to touch events. I tried changing a UIImageView's location in touchesBegan, and that doesn't show up (but it does in the Simulator!)
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    gestureStartPoint = [touch locationInView:self.view];
    mothership.center = CGPointMake(80, 80);
    //etc...
I've tried both debug and release modes. Any idea on what would cause this? The rest of the game runs fine (enemy ships appear and shoot at you, so I know the rest of the code is working). Any advice is much appreciated. Thank you!
Don't respond to touches in the individual views. This can be problematic. Instead use the backing view to handle all touches. A good sample of this technique is in the Apple Sample Code: Touches
(requires dev login)
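The idea, as a sketch (`mothership` is the view from the question; the hit test against its frame is an assumption about what "handling in the backing view" might look like):

```objc
// In the backing view's controller: route all touches from one place.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    // Decide which subview was hit, rather than letting each subview
    // handle its own touches.
    if (CGRectContainsPoint(mothership.frame, point)) {
        mothership.center = CGPointMake(80, 80);
    }
}
```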
Weirdly enough, updating my iPod to 3.0 did the trick!

How can an underlying view know if its rectangle got touched, no matter if directly or indirectly?

I have a UIView which is transparent and covers almost the whole screen; I left 50 pixels free at the top. It is a child of the view controller's view.
Underneath that UIView there's MyView, a subclass of UIView, which matches the screen size. Inside this MyView class, I listen for a touch very simply with this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self) {
        NSLog(@"MyView touched");
    }
}
Now the funny thing is, of course, that if the user touches the transparent UIView covering MyView, I don't get "MyView touched" in the console. But when the user touches the small uncovered area of MyView at the top of the screen, the touch arrives there.
That's logical to me, because I check for [touch view] == self. But what if I wanted to know that the rectangular area of MyView got touched, no matter whether directly or indirectly?
Is there a way to catch any touch that appears on the screen/window and then just check whether it falls inside the rectangular area of the view?
You should study the iPhone Application Programming Guide's section on Touch Events for the background you're looking for. The concept you want to master is the Responder Chain, so also look through the reference on UIResponder to understand what it's doing. You can definitely do everything you're talking about, and the full discussion is in the link above.
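One common technique from that discussion (a sketch, not part of the original answer): override hitTest:withEvent: on the transparent covering view so that touches it doesn't need fall through to MyView underneath:

```objc
// In the transparent covering view:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hit = [super hitTest:point withEvent:event];
    // Returning nil makes the system keep looking at views behind this one,
    // so MyView receives the touch even though it is covered.
    return (hit == self) ? nil : hit;
}
```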

Detecting image touch (cocos2d)?

In cocos2d how would you detect a touch on an image? I'm having a lot of trouble with this so thanks in advance!
You implement the ccTouchesBegan/Ended/Moved methods within your Layer class, and then check the touch location against the container of the nodes you wish to detect touches for.
For example:
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [[Director sharedDirector]
        convertCoordinate:[touch locationInView:[touch view]]];
    CGRect mySurface = CGRectMake(100, 100, 50, 50);
    if (CGRectContainsPoint(mySurface, location)) {
        // do something
        return kEventHandled;
    }
    return kEventIgnored;
}
Now, this all changes in Cocos2D 0.8 (which is in active beta now) by using 'Touch Delegates' and examples can be seen in the 'Touches Test' (which appears to be a pong game from the source I just looked over).
I'm not sure why Corey said to use UIKit controls to detect touches, since Cocos2D has its own way of handling them.
Only layers can receive touches -- it is not advised that you use a Layer for each touchable 'game object' (i.e., players and objects).
You need to overlay invisible touch surfaces on top of the game using standard UIKit classes.
You then detect and interpret touches through those objects and pass the controls to your game.
If you have a more specific problem, you can provide more info or ask another question.
This post will give you the answer:
Problem with cocos2D for iPhone and touch detection

How to center align a sprite?

In cocos2d, does anyone know how to center-align a sprite? Right now I have a sprite that moves to where you touch on the screen. The problem is that the sprite is aligned to its lower-left corner, which means that if you touch just a little above the bottom, the sprite will move up instead of down. Thanks in advance!
Here is my code...
- (BOOL)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:[touch view]];
    [mainSprite do:[MoveTo actionWithDuration:0.5 position:ccp(point.x, 480 - point.y)]];
    return YES;
}
The default transformation anchor for sprites in Cocos2D is the center of the sprite, so it should be moving such that the center of the sprite ends up in the touched location as you have it now. Have you changed the sprite's transform anchor?
The only other thing I can think of is that if your mainSprite is a child of another CocosNode then you may need to convert the touch coordinates to node space using this method:
- (CGPoint)convertToNodeSpace:(CGPoint)worldPoint;
...on the parent node. However, I doubt that is the problem. Sorry if this is unhelpful.
EDIT: OP, if you read this, what version of Cocos2D are you using? I believe 0.8 (currently in the svn trunk) changes the way that anchoring works; for future reference it may be useful to others to know what you're working with.
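For reference, using convertToNodeSpace: might look like this (a sketch against the 0.7-era API shown elsewhere in this thread; parentNode is a placeholder name):

```objc
// Convert the touch to world coordinates, then into the parent node's space.
CGPoint worldPoint = [[Director sharedDirector]
    convertCoordinate:[touch locationInView:[touch view]]];
CGPoint local = [parentNode convertToNodeSpace:worldPoint];
// Compare 'local' against the child sprite's position.
```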
I got it working! For anyone who wants to know, here is the code...
[mainSprite setTransformAnchor:ccp(24.0, 64.5)];
(24 is half of the sprite's width; 64.5 is half of the sprite's height.)