My setup is as follows:
- UIView
  - CALayer (container)
    - CAShapeLayer
    - CAShapeLayer
    - ...
I want to detect taps on each CAShapeLayer so I can change its color. I have put a UITapGestureRecognizer on my UIView and have the following code:
CGPoint point = [self tapWithPoint:[recognizer locationInView:pieView]];
PieSliceLayer *layerThatWasTapped = (PieSliceLayer *)[_containerLayer hitTest:point];
[(PieSliceLayer *)[layerThatWasTapped modelLayer] setFillColor:UIColor.redColor];
But it only ever changes one CAShapeLayer, always the first one that was added.
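A likely cause (my guess, not confirmed here): hitTest: tests each sublayer's rectangular bounds, and pie-slice layers often all share the full pie's bounds, so the same sublayer always wins. A sketch of a path-based test instead (untested; it assumes the slices are plain CAShapeLayers with their paths set):

CGPoint point = [recognizer locationInView:pieView];
for (CALayer *sublayer in _containerLayer.sublayers) {
    if (![sublayer isKindOfClass:[CAShapeLayer class]]) continue;
    CAShapeLayer *shape = (CAShapeLayer *)sublayer;
    // Convert into the slice's own coordinate space, then test its path
    // instead of its (overlapping) rectangular bounds.
    CGPoint p = [shape convertPoint:point fromLayer:pieView.layer];
    if (CGPathContainsPoint(shape.path, NULL, p, NO)) {
        shape.fillColor = [UIColor redColor].CGColor;
        break;
    }
}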
Related
I am using a UIPanGestureRecognizer to move a UIImageView, and this image view is a subview of another view. When I move the image view, it moves outside its superview. I want to constrain the movement so that the image view can only move inside its superview.
I guess that in the selector called by the pan gesture you are setting the view's center or origin to the location of the pan.
This is how your code should look in order to solve the problem:
- (void)thePanSelector:(id)sender {
    UIPanGestureRecognizer *recognizer = (UIPanGestureRecognizer *)sender;
    CGPoint p = [recognizer locationInView:theSuperView];
    // Clamp the point to the superview's size
    float boundedX = MAX(0, MIN(theSuperView.frame.size.width, p.x));
    float boundedY = MAX(0, MIN(theSuperView.frame.size.height, p.y));
    CGPoint boundedPoint = CGPointMake(boundedX, boundedY);
    view.center = boundedPoint;
}
This code is untested, so you may need to play a little with the values of boundedX and boundedY.
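For completeness, here is a hedged variant (untested; names are hypothetical) that also accounts for the image view's own size, so its edges never leave the superview:

- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    UIView *moved = recognizer.view;
    UIView *container = moved.superview;
    CGPoint p = [recognizer locationInView:container];

    // Keep the center at least half the view's width/height away
    // from each edge, so the whole view stays inside the container.
    CGFloat halfW = moved.bounds.size.width / 2.0;
    CGFloat halfH = moved.bounds.size.height / 2.0;
    CGFloat x = MAX(halfW, MIN(container.bounds.size.width - halfW, p.x));
    CGFloat y = MAX(halfH, MIN(container.bounds.size.height - halfH, p.y));

    moved.center = CGPointMake(x, y);
}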
I'm trying to detect a touch on a subview of a view that is being animated.
Here's my detection code:
// Simple but not very elegant way of getting the correct frame
CGRect keepFrame = self.fishContainer.layer.frame;
self.fishContainer.layer.frame = [[self.fishContainer.layer presentationLayer] frame];

// Get the touch location, taking the presentation layer into account (see above)
CGPoint p = [sender locationInView:self.fishContainer];
CALayer *layer = [self.fishContainer.layer presentationLayer];

// Apply the relevant transform
p = CGPointApplyAffineTransform(p, layer.affineTransform);
EBLog(@"checking point %@", NSStringFromCGPoint(p));

UIView *vToRemove = nil;
// Find the topmost view containing the touched point
for (UIView *v in self.parasites) {
    EBLog(@"-BOUND %@", NSStringFromCGRect(v.frame));
    if (CGRectContainsPoint(v.frame, p)) {
        vToRemove = v;
    }
}

// OK, we have a view. Let's remove it.
if (vToRemove) {
    EBLog(@"found one");
    [vToRemove removeFromSuperview];
    [self.parasites removeObject:vToRemove];
    if ([self.parasites count] == 0) {
        [self showWinnerScreen];
        [self stopGame];
    }
}

// Restore the view's frame
self.fishContainer.layer.frame = keepFrame;
Everything works correctly as long as I don't animate parasiteArea's parent view.
When I animate parasiteArea's parent view (a CAAnimation that moves, scales, and rotates the view), the touch lands outside the bounds of the expected subview.
UPDATE
I managed to get detection working in most cases (see the code above) by using the presentationLayer property and CGPointApplyAffineTransform. There are, however, still some cases where it doesn't work.
I guess I need to translate the touch point into the coordinate space of the CAAnimation, or something like that. Any suggestions?
I ended up using UIView's animateWithDuration: instead of CAAnimation. For my purposes, the more limited animation possibilities were enough.
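For reference, here is a sketch of a presentation-layer hit test that usually covers the remaining cases (untested; it assumes the parasite views' layers live inside self.fishContainer's layer tree):

- (void)handleTap:(UITapGestureRecognizer *)sender {
    // hitTest: expects a point in the receiver's superlayer space.
    CGPoint p = [sender locationInView:self.fishContainer.superview];

    // The presentation tree reflects the in-flight CAAnimation.
    CALayer *hit = [[self.fishContainer.layer presentationLayer] hitTest:p];

    // Map the presentation layer back to the model layer our views own.
    CALayer *model = [hit modelLayer];
    UIView *hitView = nil;
    for (UIView *v in self.parasites) {
        if (v.layer == model) {
            hitView = v;
            break;
        }
    }
    if (hitView) {
        [hitView removeFromSuperview];
        [self.parasites removeObject:hitView];
    }
}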
Hi, I have a CALayer whose contents is an image, and I need to add a tap gesture to it. How is this possible? Please give an example of adding a tap gesture to a CALayer.
This is my code:
CALayer *imageLayer = [CALayer layer];
imageLayer.frame = frame;
imageLayer.cornerRadius = 10.0;
imageLayer.borderWidth = 2.0;
imageLayer.borderColor = [UIColor colorWithRed:0.957 green:0.957 blue:0.957 alpha:1].CGColor;

CALayer *imagecontent = [CALayer layer];
imagecontent.frame = CGRectMake(16, 26, 153, 153);
UIImage *image = [self.pageImages objectAtIndex:page];
imagecontent.contents = (id)image.CGImage;
imagecontent.masksToBounds = YES;

[imageLayer addSublayer:imagecontent];
[self.scrollView.layer addSublayer:imageLayer];
Here I want to add a tap gesture for imagecontent. Please help me.
You cannot add gesture recognizers to layers, so you need to add a gesture recognizer to its containing view, and then perform an additional test to see if the tap happens to overlap the layer in question.
Add a tap gesture recognizer to the parent view of the layer (let's assume it's called parentView), then add this code to the selector of your gesture recognizer:
- (void)handleTap:(UITapGestureRecognizer *)sender {
    CGRect layerFrame = CGRectMake(16, 26, 153, 153);
    CGPoint tapPoint = [sender locationInView:parentView];
    if (CGRectContainsPoint(layerFrame, tapPoint)) {
        // The tap happened inside the rectangle of your layer
        // ...
    }
}
Alternatively, you can call hitTest: on a CALayer to check whether you are tapping on that layer; see the Apple documentation for hitTest:.
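A rough sketch of that hitTest: approach for this case (untested; self.imageContentLayer is a hypothetical property holding the imagecontent layer):

- (void)handleLayerTap:(UITapGestureRecognizer *)sender {
    // hitTest: expects the point in the receiver's superlayer space,
    // which for a view-backed layer means the superview's coordinates.
    CGPoint p = [sender locationInView:self.scrollView.superview];
    CALayer *hit = [self.scrollView.layer hitTest:p];

    if (hit == self.imageContentLayer) {
        // The tap landed on the image layer.
    }
}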
Is there a way to make a Round Rect button take exactly the same size as an image? Are there any round buttons? I have a project with many image buttons, and their tap areas get mixed together: the images are mostly circular but the buttons are rectangular, so when I place them close to each other the rectangles overlap.
When the iPhone detects a touch on the screen, it finds the touched view using “hit testing”. By default, hit testing assumes that each view is a rectangle.
If you want hit testing to treat your view as a different shape, you need to create a subclass (of UIButton in your case) and override the pointInside:withEvent: method to test the shape you want to use.
For example:
@implementation MyOvalButton

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    UIBezierPath *path = [UIBezierPath bezierPathWithOvalInRect:self.bounds];
    return [path containsPoint:point];
}

@end
I haven't tested that code.
Swift version:
class MyOvalButton: UIButton {
    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        return UIBezierPath(ovalIn: bounds).contains(point)
    }
}
Don't forget to set your button's custom class to MyOvalButton in your storyboard or xib, if that's where you create the button.
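If you create the button in code instead, here is a minimal sketch (hypothetical image name):

MyOvalButton *button = [[MyOvalButton alloc] initWithFrame:CGRectMake(20, 20, 120, 120)];
[button setImage:[UIImage imageNamed:@"circleImage"] forState:UIControlStateNormal];
[self.view addSubview:button];
// Taps near the corners, outside the oval, now fall through to
// whatever is underneath instead of triggering this button.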
As a demo, I connected the touch-down and touch-up events of the button to turn the background gray when the button is touched.
Answering the question in your title (hoping I understood what you really want):
[button setFrame:CGRectMake(button.frame.origin.x, button.frame.origin.y, image.size.width, image.size.height)];
That makes the button's frame the same size as your image.
I have a UIViewController that is detecting touch events with touchesBegan. There are moving UIImageView objects that float around the screen, and I need to see if the touch event landed on one of them. Here is what I am trying:
UITouch *touch = [touches anyObject];
if ([arrayOfUIImageViewsOnScreen containsObject:[touch view]]) {
    NSLog(@"UIImageView Touched!");
}
But this never happens. Also if I were to do something like this:
int h = [touch view].bounds.size.height;
NSLog(@"%d", h);
it outputs the height of the entire UIViewController's view (the whole screen) every time, even if I touch one of the UIImageViews, so clearly [touch view] is not giving me the UIImageView. How do I detect when only a UIImageView is pressed? Please do not suggest using UIButtons.
Thank you!
If you only want to detect when a UIImageView is pressed, check the class:
if ([touch.view isKindOfClass:[UIImageView class]]) {
    // do whatever
} else {
    // isn't a UIImageView, so do whatever else
}
Edit:
You haven't forgotten to set userInteractionEnabled to YES on the UIImageView, have you? It defaults to NO for UIImageView, so touches will never reach it otherwise.
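A one-line sketch (assuming imageView is one of your floating image views):

imageView.userInteractionEnabled = YES; // UIImageView defaults to NO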
I know you said please do not suggest using UIButtons, but buttons sound like the best/easiest way to me.
You could try sending the hitTest: message to the main view's CALayer with the touch point; it'll return the layer furthest down the layer hierarchy that you touched.
You could then test whether the layer you touched belongs to one of the UIImageViews, and proceed from there.
This code uses a point generated from a UIGestureRecognizer:

CGPoint thePoint = [r locationInView:self.view];
// hitTest: expects the point in the receiver's superlayer coordinate space
thePoint = [self.view.layer convertPoint:thePoint toLayer:self.view.layer.superlayer];
selectedLayer = [self.view.layer hitTest:thePoint];
If you want to test whether a touch landed inside a rectangle, use CGRectContainsPoint:
1. Capture the touch event and get the point where you touched.
2. Make a CGRect that bounds the object you want to test.
3. Call CGRectContainsPoint(CGRect, CGPoint) and check the boolean return value.
Here is the CGGeometry reference, which documents CGRect and CGRectContainsPoint:
http://developer.apple.com/library/ios/#DOCUMENTATION/GraphicsImaging/Reference/CGGeometry/Reference/reference.html
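A minimal sketch of those three steps inside touchesBegan: (untested; targetView stands in for whatever view you want to test):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // 1. Get the touch point in the view controller's view.
    CGPoint point = [touch locationInView:self.view];
    // 2. The rect bounding the object, in the same coordinate space.
    CGRect targetRect = targetView.frame;
    // 3. Test containment.
    if (CGRectContainsPoint(targetRect, point)) {
        NSLog(@"touched the target view");
    }
}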
I forgot about this question. The problem was that I did not wait until viewDidLoad to set userInteractionEnabled on my UIImageView.