I have a grid of images/buttons, and I want the user to be able to drag their finger over the images/buttons such that when their finger touches each image/button an event is called.
How do I do this?
The best example I can think of now is the Contacts app, how you drag your finger down the list of letters (on the right) and as you touch each letter it jumps to that part of the contacts list.
You're going to want to implement -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event in your UIViewController. This method is called any time a touch moves on the screen. You can then either use the coordinates ([myUITouch locationInView:self.view]) or use [self.view.layer hitTest:[myUITouch locationInView:self.view]] to find the image/button the touch is over and work from that.
For example, if I have a row of ten images, each 32 pixels by 32 pixels, and I want to record when each of them is touched, I could do something like the following:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGSize buttonSize = myButton.bounds.size;
    CGPoint touchPoint = [[touches anyObject] locationInView:self.view];
    if (touchPoint.y < buttonSize.height) {
        // Integer division by the button width gives the index of the image under the finger
        NSInteger buttonNumber = touchPoint.x / buttonSize.width;
        NSLog(@"Button number %ld pushed", (long)buttonNumber);
    }
}
Note that this assumes multitouch is disabled; otherwise it will pick an arbitrary touch to record.
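If you'd rather use the hitTest approach mentioned above, a minimal sketch (assuming the images/buttons are subviews of self.view with tags assigned, and that they don't swallow the touches themselves) could look like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint touchPoint = [[touches anyObject] locationInView:self.view];
    // hitTest:withEvent: walks the subview hierarchy and returns the deepest
    // subview containing the point (or self.view itself if none does)
    UIView *hitView = [self.view hitTest:touchPoint withEvent:event];
    if (hitView != self.view) {
        NSLog(@"Finger is over the view with tag %ld", (long)hitView.tag);
    }
}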
In my app, I allow the user to annotate a photo by adding arrows (custom ArrowView). There can be many arrows added, with various zoom & rotation.
I am trying to implement selecting an arrow by touch. Currently, I am iterating over the arrows and using
CGRectContainsPoint(arrowView.frame, touchPoint)
to decide which arrow to select based on a touch gesture.
But, this does not work well when some of the arrows are big & rotated to 45 degrees (since the frame becomes big).
Question:
I would like to use bounds of the arrow translated to parent co-ordinates instead of frame. How can I get this when scaling & rotation is applied?
Alternatively, is there a better method to solve this selection problem?
This code finds the arrow under touchPoint:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint touchPoint = [touch locationInView:self.view];
UIView *arrow = [self.view hitTest:touchPoint withEvent:event];
}
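hitTest:withEvent: converts the point into each subview's own coordinate system, so it already accounts for any scaling and rotation applied to the arrows. If you prefer to keep iterating yourself, here is a rough sketch of the same idea (assuming it sits inside the touchesBegan: above, and assuming an array property self.arrowViews that is not in the original post):
for (ArrowView *arrowView in self.arrowViews) {
    // convertPoint:fromView: applies the inverse of the arrow's transform, so the
    // test is done against the arrow's real bounds, not its enlarged frame
    CGPoint localPoint = [arrowView convertPoint:touchPoint fromView:self.view];
    if ([arrowView pointInside:localPoint withEvent:event]) {
        // arrowView is the one under the touch
        break;
    }
}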
I'm having a problem with a jumping movement when using CGAffineTransformMakeRotation. It seems to work fine until I get to the top of quadrants 1 and 2. Below is the code I am using and a YouTube link that shows what happens. Any insight would be really appreciated.
My goal is to rotate a UIWebView with an svg inside. Since I can't easily detect touch on the UIWebView alone, I'm putting a blank UIImageView over it. This allows me to detect the touch and prevent the copy dialog from popping up.
http://www.youtube.com/watch?v=x_OmS0MPdEE&feature=youtu.be
- (void)touchesMoved: (NSSet *)touches withEvent:(UIEvent *)event {
[super touchesBegan:touches withEvent:event];
NSArray *allTouches = [touches allObjects];
CGPoint location = [[allTouches objectAtIndex:0] locationInView:self.view];
if(selected == 1) {
CGFloat rads = atan2f(location.y - (grid1.frame.size.height/2),location.x - (grid1.frame.size.width/2));
grid1.transform = grid1Btn.transform = CGAffineTransformMakeRotation(rads);
}
}
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[super touchesBegan:touches withEvent:event];
NSArray *allTouches = [touches allObjects];
CGPoint location = [[allTouches objectAtIndex:0] locationInView:self.view];
if(CGRectContainsPoint(grid1Btn.frame, location))
{
selected = 1;
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
selected = -1;
}
You use frame.size.* in your computations. Note that those values (unlike bounds.size.*) are affected by the transform. Use grid1.center instead; it is also more logical, since you rotate around the center.
The next thing you need to fix is that it should not jump to a new position when the touch starts :)
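A minimal sketch of that second point, assuming grid1 and grid1Btn are direct subviews of self.view and that you add a CGFloat angleOffset instance variable (neither assumption is in the original code):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(grid1Btn.frame, location)) {
        selected = 1;
        // Remember the difference between the view's current rotation and the
        // initial touch angle so the view does not snap when dragging starts
        CGFloat touchAngle = atan2f(location.y - grid1.center.y,
                                    location.x - grid1.center.x);
        CGFloat currentRotation = atan2f(grid1.transform.b, grid1.transform.a);
        angleOffset = currentRotation - touchAngle;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (selected != 1) return;
    CGPoint location = [[touches anyObject] locationInView:self.view];
    CGFloat touchAngle = atan2f(location.y - grid1.center.y,
                                location.x - grid1.center.x);
    // Preserve the offset captured in touchesBegan: while the finger drags
    grid1.transform = grid1Btn.transform =
        CGAffineTransformMakeRotation(touchAngle + angleOffset);
}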
The jerkiness is caused by the inherent inaccuracy of touch event locations.
Touch events that are located sufficiently close to the center of the frame have a high likelihood of straying to the diagonally opposite quadrant of the frame whilst a circle is being traced about the center of the frame.
This will result in a sudden jump of 90 degrees as observed in your video.
One way to avoid the problem is to introduce a dead zone about the center of the frame. Any touch that is located within a given radius of the center of the frame would not trigger the rotation to be recalculated.
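For example, a rough sketch of such a dead zone inside touchesMoved: (the 30-point radius is an arbitrary value to tune):
static const CGFloat kDeadZoneRadius = 30.0;  // assumed value, tune as needed

CGFloat dx = location.x - grid1.center.x;
CGFloat dy = location.y - grid1.center.y;
// Only recalculate the rotation when the finger is outside the dead zone
if (sqrtf(dx * dx + dy * dy) > kDeadZoneRadius) {
    CGFloat rads = atan2f(dy, dx);
    grid1.transform = grid1Btn.transform = CGAffineTransformMakeRotation(rads);
}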
Hope this helps.
I am trying to make a simple landscape "split screen" app where two players can play at once (each player gets one half of the screen), but I am having trouble tracking both touches at the same time. This is the code I am trying to use:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
CGPoint point = [touch locationInView:self.view];
if (point.x < 240) {
[player1 updatePoint:point];
} else {
[player2 updatePoint:point];
}
}
}
But I am obviously doing something wrong. Although this code runs, it will only track one finger and move the player on the side of the screen that finger is on. What is my code lacking? Is this task more difficult to pull off than I think it is?
Did you set UIView's multipleTouchEnabled to YES?
multipleTouchEnabled should be set to YES for the UIView.
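For example, somewhere like viewDidLoad (or via the "Multiple Touch" checkbox in Interface Builder):
// Without this, the view delivers only one touch at a time to touchesMoved:
self.view.multipleTouchEnabled = YES;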
I have a view with 10 character labels (big letters). Each character (label) has a different tag. The functionality I'm developing is "Trace".
When the user moves their finger over the view, I want to detect which character is touched.
I think I have to implement
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
But I don't know how to detect the touch on a label and identify the character.
Can someone help me?
If you want to know the view that the touch began on (which view the user put their finger down on), you can read any of the touch items returned in the NSSet like so:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
NSInteger viewTag = [[[touches anyObject] view] tag];
//Check your view using the tags you've set...
}
However, even as the finger moves across the screen, this view property will ONLY return the view initially touched, not the view currently under the finger. To find the view currently under the finger, you will need to track the coordinates of the current touch and determine which view they fall into, perhaps like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self.view];
//If all your views are collected as an array
for(UIView *view in views) {
if (CGRectContainsPoint([view frame], point))
{
//Bingo
}
}
}
The key to success here is making sure you are getting your touches in coordinates relative to the proper views, so call locationInView: and CGRectContainsPoint() with values that match your application's view hierarchy and where you're placing this code (i.e. the view, the view controller, etc.).
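For instance, if the labels are not direct subviews of self.view, one option (inside the same touchesMoved: as above, and assuming an array property self.letterLabels holding the 10 labels, which is not in the original question) is to ask for the touch location in each label's superview before testing its frame:
for (UILabel *label in self.letterLabels) {
    // frame is expressed in the superview's coordinate system, so get the
    // touch location in that same system before testing containment
    CGPoint pointInSuperview = [touch locationInView:label.superview];
    if (CGRectContainsPoint(label.frame, pointInSuperview)) {
        NSLog(@"Finger is over the label with tag %ld", (long)label.tag);
    }
}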
To detect simple touches you can use UIGestureRecognizer. Read the documentation for more on those. For more complex operations you do need to implement:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
To identify which item was touched you can give your items tag values:
myView.tag = 4;
Then just check the tag value of the view reporting the touch and you know which it is.
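As a rough sketch of the gesture recognizer route for simple taps (assuming an array property self.letterLabels, which is not in the original question; note that UILabel has user interaction disabled by default):
// e.g. in viewDidLoad
for (UILabel *label in self.letterLabels) {
    label.userInteractionEnabled = YES;  // UILabel disables this by default
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(labelTapped:)];
    [label addGestureRecognizer:tap];
}

- (void)labelTapped:(UITapGestureRecognizer *)recognizer {
    // recognizer.view is the label that was tapped; its tag identifies the character
    NSLog(@"Tapped the label with tag %ld", (long)recognizer.view.tag);
}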
I have some images from which the user should choose one. I don't want to just offer a flat scrolling area with a boring grid. Instead, I'd like to show a wheel that contains those images. At the top would be a marker indicating the selection. Something similar to the Pickers.
The problem is not the rotation stuff; I'd use some geometric functions for that. But I have no idea how to actually get the scrolling gestures on that wheel. Where must I start?
BTW: By circular I don't mean something like the Pickers. I mean a real wheel that has a center axis and can be rolled, like a very old telephone or a bike wheel, or a Picker turned by 90° so that its axis faces you (the Z coordinate).
If you're talking about capturing gestures, then here is the example they give in the docs.
Though I could have sworn I heard Alan Cannistraro say in one of the first CS193P lectures that you don't have to do this, that you can just trap the swipe event, but I can't find that.
Could someone who actually knows what they are doing please correct me and I'll remove this post, but for now I know this will work:
#define HORIZ_SWIPE_DRAG_MIN 12
#define VERT_SWIPE_DRAG_MAX 4
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
startTouchPosition = [touch locationInView:self];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint currentTouchPosition = [touch locationInView:self];
// If the swipe tracks correctly.
if (fabsf(startTouchPosition.x - currentTouchPosition.x) >= HORIZ_SWIPE_DRAG_MIN &&
fabsf(startTouchPosition.y - currentTouchPosition.y) <= VERT_SWIPE_DRAG_MAX)
{
// It appears to be a swipe.
if (startTouchPosition.x < currentTouchPosition.x)
[self myProcessRightSwipe:touches withEvent:event];
else
[self myProcessLeftSwipe:touches withEvent:event];
}
else
{
// Process a non-swipe event.
}
}
Just how similar to a picker view are you thinking? You can load up a picker view with your own custom subviews, which could be image views. That'd get you an actual picker view with your images, which might or might not be what you're actually aiming for.
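If the picker view route fits, a hedged sketch of the relevant delegate method (assuming an NSArray property self.images of UIImage objects, plus the usual dataSource methods for the row and component counts, none of which are in the original post):
- (UIView *)pickerView:(UIPickerView *)pickerView
            viewForRow:(NSInteger)row
          forComponent:(NSInteger)component
           reusingView:(UIView *)view {
    UIImageView *imageView = (UIImageView *)view;
    if (imageView == nil) {
        // Create a new image view only when the picker has nothing to reuse
        imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
        imageView.contentMode = UIViewContentModeScaleAspectFit;
    }
    imageView.image = self.images[row];  // assumed array of UIImage objects
    return imageView;
}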