Processing a swipe by the user - iPhone

Simple question here: how can I detect when a user swipes their finger across the screen of the iPhone?

You need to implement a gesture recognizer in your application.
In your interface:
#import <UIKit/UIKit.h>

#define kMinimumGestureLength 30
#define kMaximumVariance 5

@interface YourViewController : UIViewController {
    CGPoint gestureStartPoint;
}
@end
kMinimumGestureLength is the minimum distance, in points, that the finger has to travel before the movement counts as a swipe. kMaximumVariance is the maximum distance the finger may drift on the perpendicular axis (for example, on the y-axis for a horizontal swipe) while still counting as a swipe.
Now open your interface .xib file and select your view in IB, and make sure Multiple Touch is enabled in View Attributes.
In your implementation (.m) file, implement these methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    gestureStartPoint = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:self.view];
    CGFloat deltaX = fabsf(gestureStartPoint.x - currentPosition.x);
    CGFloat deltaY = fabsf(gestureStartPoint.y - currentPosition.y);

    if (deltaX >= kMinimumGestureLength && deltaY <= kMaximumVariance) {
        // horizontal swipe detected - do something
    }
    else if (deltaY >= kMinimumGestureLength && deltaX <= kMaximumVariance) {
        // vertical swipe detected - do something
    }
}
This is one way to implement a swipe recognizer. Also, you really should check out the Docs on this topic:
UISwipeGestureRecognizer

UIGestureRecognizer is what you want, especially its UISwipeGestureRecognizer subclass.
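For instance, a minimal sketch with a single recognizer (handleSwipe: is an illustrative selector name, not from the answers above):

- (void)viewDidLoad {
    [super viewDidLoad];
    UISwipeGestureRecognizer *swipe =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipe.direction = UISwipeGestureRecognizerDirectionLeft; // one direction per recognizer
    [self.view addGestureRecognizer:swipe];
}

// Called once the swipe is recognized.
- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    NSLog(@"Left swipe detected");
}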

Ah, I was able to answer my own question: http://developer.apple.com/library/ios/#samplecode/SimpleGestureRecognizers/Introduction/Intro.html#//apple_ref/doc/uid/DTS40009460
Thanks for the help everyone!

Related

Custom UISlider: avoid updating when dragging outside

I'm quite new to iPhone development and I'm building my first app :)
In one of my view controllers I built a custom slider that should behave like the native "slide to unlock" slider.
My doubt right now is how to implement the "drag outside" behaviour. As said, I would like it to behave exactly like the native slider: when the finger dragging the slider moves outside of it, the slider should go back to zero.
My doubt is not about the animation part (I'm already using an animation block successfully), but about the control events part. Which control event should I use?
I'm using:
[customSlider addTarget:self action:@selector(sliderMoved:) forControlEvents:UIControlEventValueChanged];
to handle the sliding part (finger sliding the cursor), and
[customSlider addTarget:self action:@selector(sliderAction:) forControlEvents:UIControlEventTouchUpInside];
to handle the release part, but the problem is that if I release the finger outside, the sliderAction function isn't called.
EDIT:
I've tried to implement the solution @Bruno Domingues gave me, but I'm realizing that the issue is that by default UISlider keeps getting updated even if the finger is dragged outside of it (try, for example, the Brightness slider in Settings and you'll see that it keeps updating even if you drag outside of it). So my question could be redefined: how can I avoid this default behaviour and have the slider update only while the finger is moving on it?
Simply intercept the touches methods in your custom subclass and only forward the touches you want acted on to the superclass, like so:
in .h:
@interface CustomSlider : UISlider
@end
in .m:
#import "CustomSlider.h"
#implementation CustomSlider
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
CGPoint touchLocation = [[touches anyObject] locationInView:self];
if (touchLocation.x < 0 || touchLocation.y<0)return;
if (touchLocation.x > self.bounds.size.width || touchLocation.y > self.bounds.size.height)return;
[super touchesBegan:touches withEvent:event];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
CGPoint touchLocation = [[touches anyObject] locationInView:self];
if (touchLocation.x < 0 || touchLocation.y<0)return;
if (touchLocation.x > self.bounds.size.width || touchLocation.y > self.bounds.size.height)return;
[super touchesMoved:touches withEvent:event];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
CGPoint touchLocation = [[touches anyObject] locationInView:self];
if (touchLocation.x < 0 || touchLocation.y<0)return;
if (touchLocation.x > self.bounds.size.width || touchLocation.y > self.bounds.size.height)return;
[super touchesEnded:touches withEvent:event];
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event{
CGPoint touchLocation = [[touches anyObject] locationInView:self];
if (touchLocation.x < 0 || touchLocation.y<0)return;
if (touchLocation.x > self.bounds.size.width || touchLocation.y > self.bounds.size.height)return;
[super touchesCancelled:touches withEvent:event];
}
#end
Please note that this implementation will start updating the control again if your finger moves back into the control. To eliminate this, simply set a flag when a touch is received outside of the view and then check that flag in the subsequent touches methods.
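A rough sketch of that flag approach, assuming a BOOL ivar named touchedOutside is added to CustomSlider (the name is illustrative):

// Assumes an ivar in CustomSlider: BOOL touchedOutside;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    touchedOutside = NO; // new gesture, reset the flag
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint touchLocation = [[touches anyObject] locationInView:self];
    if (!CGRectContainsPoint(self.bounds, touchLocation)) {
        touchedOutside = YES; // the finger has left the control
    }
    if (touchedOutside) return; // stop forwarding once it has left
    [super touchesMoved:touches withEvent:event];
}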
You need to add a target for the drag-outside control event:
[slider addTarget:self action:@selector(eventDragOutside:) forControlEvents:UIControlEventTouchDragOutside];
And add a function
- (IBAction)eventDragOutside:(UISlider *)sender {
    sender.value = 0.0f;
}

Multiple Swipe views

So I'm trying to create an app that changes a label based on the direction you swipe. However, instead of being able to swipe anywhere on the screen, I'd like swipes to be recognized only on a specific view. I put a new view in the .xib file, used that view for the location of the touches, and named it leftView, but the whole screen still responds. I'm using the touchesBegan and touchesMoved methods so I can also handle diagonals.
Here's a bit of the code I'm using:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    gestureStartPoint = [touch locationInView:leftView];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:leftView];
    CGFloat deltaX = fabsf(gestureStartPoint.x - currentPosition.x);
    CGFloat deltaY = fabsf(gestureStartPoint.y - currentPosition.y);

    if ((deltaX >= kMinimumGestureLength && deltaY <= kMaximumVariance) && currentPosition.x < gestureStartPoint.x) {
        label.text = @"Left Horizontal swipe detected";
    }
    else if ((deltaX >= kMinimumGestureLength && deltaY <= kMaximumVariance) && currentPosition.x > gestureStartPoint.x) {
        label.text = @"Right Horizontal swipe detected";
    }
    else if ((deltaY >= kMinimumGestureLength && deltaX <= kMaximumVariance) && currentPosition.y < gestureStartPoint.y) {
        label.text = @"Up Vertical swipe detected";
    }
    else if ((deltaY >= kMinimumGestureLength && deltaX <= kMaximumVariance) && currentPosition.y > gestureStartPoint.y) {
        label.text = @"Down Vertical swipe detected";
    }
}
You can achieve what you're after by adding two UIGestureRecognizers (one for right swipe and another for left swipe) to the UIView that you want to receive the swipes.
Apple's SimpleGestureRecognizer sample project gives a great illustration of how to set this up.
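A minimal sketch of that setup, assuming the leftView and label outlets from the question (handleSwipe: is an illustrative selector name):

- (void)viewDidLoad {
    [super viewDidLoad];

    UISwipeGestureRecognizer *swipeLeft =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
    [leftView addGestureRecognizer:swipeLeft];

    UISwipeGestureRecognizer *swipeRight =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
    [leftView addGestureRecognizer:swipeRight];
}

// Only swipes that happen inside leftView reach this handler.
- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    if (recognizer.direction == UISwipeGestureRecognizerDirectionLeft) {
        label.text = @"Left Horizontal swipe detected";
    } else {
        label.text = @"Right Horizontal swipe detected";
    }
}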
Check in the Identity Inspector whether the correct class is set for the UIView object in Interface Builder. I guess you have implemented the touchesBegan method in a subclass of UIView.

Drag a UIView like the apps on the iPhone home screen

I have a view controller, and inside it I plan to place some controls like buttons, text boxes, etc. I can drag my view along the x-axis with the following code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        displaceX = location.x - ViewMain.center.x;
        displaceY = ViewMain.center.y;
        startPosX = location.x - displaceX;
    }
    CurrentTime = [[NSDate date] timeIntervalSince1970];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        location.x = location.x - displaceX;
        location.y = displaceY;
        ViewMain.center = location;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    double time = [[NSDate date] timeIntervalSince1970] - CurrentTime;
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == ViewMain)
    {
        CGPoint location = [touch locationInView:self.view];
        location.x = location.x - displaceX;
        location.y = displaceY;
        ViewMain.center = location;
        double speed = (ViewMain.center.x - startPosX) / (time * 2);
        NSLog(@"speed: %f", speed);
    }
}
Note that I have to add the global variables:
float displaceX = 0;
float displaceY = 0;
float startPosX = 0;
float startPosY = 0;
double CurrentTime;
The reason I created those variables is so that when I start dragging the view, it moves from the point where I touch it instead of from its middle.
Anyway, if I touch a button or an image, the view will not drag, even though the images have a transparent background. I want to still be able to drag the view regardless of whether there is an image on top of it. I was thinking that maybe I need to place a large transparent view on top of everything, but I need to have buttons, images, etc. I want to be able to drag a view just like you can with the iPhone home screen:
Note that there I am able to drag the view regardless of whether I first touch an app icon, an image, or text. How could I do that?
I think your problem is that if you touch a UIButton or a UIImageView with interaction enabled, it doesn't pass the touch along.
For the images, uncheck the User Interaction Enabled property in IB.
For the buttons that cause touchesBegan:withEvent:, etc. not to get called, look at the following link: Is there a way to pass touches through on the iPhone?
You may want to consider a different approach to this problem. Rather than trying to manually manage the content scrolling yourself you would probably be better off using a UIScrollView with the pagingEnabled property set to YES. This is the method Apple recommends (and it's probably the method used by Springboard.app in your last screenshot). If you are a member of the iOS developer program check out the WWDC 2010 session on UIScrollView for an example of this. I think they may have also posted sample code on developer.apple.com.
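A rough sketch of that approach (the frame, page count, and page contents are illustrative):

// In viewDidLoad of the containing view controller.
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
scrollView.pagingEnabled = YES;                 // snap to whole pages, like Springboard
scrollView.showsHorizontalScrollIndicator = NO;
NSUInteger pageCount = 3;
scrollView.contentSize = CGSizeMake(self.view.bounds.size.width * pageCount,
                                    self.view.bounds.size.height);

for (NSUInteger i = 0; i < pageCount; i++) {
    CGRect pageFrame = CGRectOffset(self.view.bounds,
                                    i * self.view.bounds.size.width, 0);
    UIView *page = [[UIView alloc] initWithFrame:pageFrame];
    // ...add buttons, images, etc. to each page here...
    [scrollView addSubview:page];
}
[self.view addSubview:scrollView];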

Handling touches on iPhone

I'm having a little problem handling touches in my app.
I set my touchesBegan like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *currentTouch = [[event allTouches] anyObject];
    touchPoint = [currentTouch locationInView:self.view];
    if (CGRectContainsPoint(image1.frame, touchPoint)) {
        image1IsTouched = YES;
    }
}
Then I set my touchesMoved like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *currentTouch = [[event allTouches] anyObject];
    currentPoint = [currentTouch locationInView:currentTouch.view];
    if (image1IsTouched == YES) {
        image1.center = CGPointMake(currentPoint.x, currentPoint.y);
        .....
    }
}
Now I tried my app on an actual device, and that's where I noticed my problem. While I'm touching image1 with one finger, the app is doing fine and checks for collisions every time I drag my finger. The problem occurs when I touch the screen with another finger while touching/dragging the image: the image I'm currently touching jumps to the other finger. I've tried [myView setMultipleTouchEnabled:NO]; and using an NSArray of touches and comparing [touches count] with the touch, but it's not working. Can someone show me how to make a UIImageView act on a single touch only? Thanks.
First, you should use UITouch *currentTouch = [touches anyObject]; to get the current touch.
Second, you should check that there is only one finger on the screen, and ignore touch input if there's more than one, unless you want to support multitouch.
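Putting the two suggestions together, the move handler could look roughly like this (image1, image1IsTouched, and currentPoint are from the question; counting the event's allTouches set is one way to check how many fingers are on the screen):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Ignore the gesture while more than one finger is on the screen.
    if ([[event allTouches] count] > 1) {
        return;
    }
    UITouch *currentTouch = [touches anyObject];
    currentPoint = [currentTouch locationInView:currentTouch.view];
    if (image1IsTouched) {
        image1.center = currentPoint;
    }
}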

Problem with cocos2D for iPhone and touch detection

I just don't get it.
I use cocos2d for development of a small game on the iPhone/iPod. The framework is just great, but I fail at touch detection. I read that you just need to override the proper methods (e.g. "touchesBegan") in the implementation of a class that subclasses CocosNode. But it doesn't work. What could I be doing wrong?
the function:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"tickle, hihi!");
}
Did I get it totally wrong?
Layer is the only cocos2d class which gets touches.
The trick is that ALL instances of Layer get passed the touch events, one after the other, so your code has to handle this.
I did it like this:
- (BOOL)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGPoint cLoc = [[Director sharedDirector] convertCoordinate:location];

    float labelX = self.position.x - HALF_WIDTH;
    float labelY = self.position.y - HALF_WIDTH;
    float labelXWidth = labelX + WIDTH;
    float labelYHeight = labelY + WIDTH;

    if (labelX < cLoc.x &&
        labelY < cLoc.y &&
        labelXWidth > cLoc.x &&
        labelYHeight > cLoc.y) {
        NSLog(@"WE ARE TOUCHED AND I AM A %@", self.labelString);
        return kEventHandled;
    } else {
        return kEventIgnored;
    }
}
Note that the cocos2d library has a "ccTouchesEnded" implementation, rather than the Apple standard. It allows you to return a BOOL indicating whether or not you handled the event.
Good luck!
Have you added this to your layer's init method?
// isTouchEnabled is an property of Layer (the super class).
// When it is YES, then the touches will be enabled
self.isTouchEnabled = YES;
// isAccelerometerEnabled is property of Layer (the super class).
// When it is YES, then the accelerometer will be enabled
self.isAccelerometerEnabled = YES;
In order to detect touches, you need to subclass from UIResponder (which UIView does as well) . I am not familiar with cocos2D, but a quick look at the documentation reveals that CocosNode does not derive from UIResponder.
Upon further investigation, it looks like Cocos folks created a Layer class that derives from CocosNode. And that class implements the touch event handlers. But those are prefixed by cc.
See http://code.google.com/p/cocos2d-iphone/source/browse/trunk/cocos2d/Layer.h
Also see menu.m code and the below blog post article for more info on this:
http://blog.sapusmedia.com/2008/12/cocos2d-propagating-touch-events.html
maw, the CGPoint struct members x and y are floats. Use @"%f" to format floats for printf/NSLog.
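For example, using the cLoc variable from the answer above:
NSLog(@"touch at x = %f, y = %f", cLoc.x, cLoc.y);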
If you use the 0.9 beta of cocos2d, it has really simple touch detection for CocosNodes. The real beauty of this new detection is that it handles multiple touch tracking really well.
An example of this can be found here
http://code.google.com/p/cocos2d-iphone/source/browse/#svn/trunk/tests/TouchesTest
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Add a new body/atlas sprite at the touched location
    CGPoint tapPosition;
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:[touch view]];
        // get the tapped position in node space
        tapPosition = [self convertToNodeSpace:[[CCDirector sharedDirector] convertToGL:location]];
    }
}
I think this can help you:
- Make your scene conform to the CCTargetedTouchDelegate protocol.
- Add this line to the init of your scene:
[[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:NO];
- Implement these functions:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    return YES;
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    // here the touch has ended
}