Bring UIImageView on top of the other UIImageViews in my main view - iphone

I have a lot of UIImageViews in my main view; some of these have images, others are blank. I have set up code that checks which UIImageView currently contains an image. At the same time, this code takes care of allowing a UIImageView with an image to be moved around.
Now what happens is this: when I move a selected UIImageView around, the image will (for some reason, and quite randomly) not stay on top of the other UIImageViews on the screen. The reason I say this is random is that it will stay on top of some of the other views, but not on top of others.
This behavior is unexpected, and several problems arise:
Visually it looks bad; there is no reason why a touched UIImageView should slip under another.
The way my code works is to allow UIImageViews to be moved only if they contain an image. So if the UIImageView goes under another that does not contain an image, I cannot touch and move it again. It looks like it is stuck in place.
Please note that I have not been arranging subviews at all in this code, so why this behavior occurs is beyond me.
So what my question boils down to is: is there any way I can tell the code to:
Get the object that was touched.
If it is a UIImageView with an image, then allow me to move the UIImageView.
Allow this UIImageView to supersede (be on top of) all other UIImageViews.
Code reference:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch;
    CGPoint touchLocation;

    for (UIImageView *fruit in fruitArray)
    {
        // Check that the UIImageView contains an image.
        if ([fruit image] != nil)
        {
            // Get the touch event.
            touch = [touches anyObject];
            // Get the location of the touch within the view.
            touchLocation = [touch locationInView:[self view]];
            // Move the touched image view's center to the touch location.
            if ([touch view] == fruit)
            {
                fruit.center = touchLocation;
            }
        }
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Allow the selected view (in our case a UIImageView) to be dragged.
    [self touchesBegan:touches withEvent:event];
}
Thank you for your help! Let me know if you need a better explanation.

[self.view bringSubviewToFront:imageView];
This was what you were looking for...
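As an illustration only, here is one way that call could be folded into the question's touchesBegan: (a minimal sketch, assuming fruitArray holds image views that are direct subviews of self.view):

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:[self view]];

    for (UIImageView *fruit in fruitArray)
    {
        // Only image views that actually hold an image are draggable.
        if ([fruit image] != nil && [touch view] == fruit)
        {
            // Raise the touched image view above its siblings before moving it.
            [self.view bringSubviewToFront:fruit];
            fruit.center = touchLocation;
        }
    }
}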

Disable touch events on certain areas of iPhone screen

I want to disable touches on all areas of the screen apart from a specific few points (e.g. buttons). I.e. I don't want 'touchesBegan' to trigger at all when I tap anything other than a button. Calling
self.view.userInteractionEnabled = NO;
has the desired effect for not registering touches, but then of course I can't tap any buttons. I basically want the button to still work, even if there are 5 points touching the screen, i.e. all touch inputs have been used up, and the button represents the 6th.
Is this possible?
I've tried inserting a view with userInteraction disabled below my buttons, but it still registers touches when the user taps the screen. It seems the only way to disable touch registering is to do so on the entire screen (on the parent UIView).
UPDATE:
I've tried using gesture recognizers to handle all touch events, and ignore those that don't qualify. Here is my code:
@interface ViewController : UIViewController <UIGestureRecognizerDelegate>
...
- (void)viewDidLoad
{
    [super viewDidLoad];
    UIGestureRecognizer *allRecognizer = [[UIGestureRecognizer alloc] initWithTarget:self action:nil];
    allRecognizer.delegate = self;
    [self.view addGestureRecognizer:allRecognizer];
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    CGPoint coords = [touch locationInView:self.view];
    NSLog(@"Coords: %g, %g", coords.x, coords.y);
    if (coords.y < 200) {
        [gestureRecognizer ignoreTouch:touch forEvent:nil];
        return YES;
    }
    return NO;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"%i touch(es)", [touches count]);
}
However the screen still 'reads' the touches, so if I place 5 fingers down, the 6th one won't trigger a button press...
You need to set up an invisible UIButton and place it between the view that should not register touches and the UIButtons that should still be active.
Now you need to set the invisible button's 'userInteractionEnabled':
// userInteractionEnabled == NO  => self.view registers touches
// userInteractionEnabled == YES => self.view doesn't register touches
[_invisibleButton setUserInteractionEnabled:NO];
What really matters in this solution is that both the invisible and the visible buttons are direct subviews of the VC's view.
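For illustration, a rough sketch of that setup (the _invisibleButton and _visibleButton names are placeholders; both are added directly to the VC's view):

- (void)viewDidLoad {
    [super viewDidLoad];

    // Invisible button covering the area that should not react to touches.
    _invisibleButton = [UIButton buttonWithType:UIButtonTypeCustom];
    _invisibleButton.frame = self.view.bounds;
    _invisibleButton.backgroundColor = [UIColor clearColor];
    [self.view addSubview:_invisibleButton];

    // The visible, working button sits above the invisible one.
    [self.view addSubview:_visibleButton];

    // userInteractionEnabled == NO  => self.view registers touches
    // userInteractionEnabled == YES => self.view doesn't register touches
    [_invisibleButton setUserInteractionEnabled:NO];
}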
You can download an example project from my dropbox:
https://dl.dropboxusercontent.com/u/99449487/DontTapThat.zip
However, this example just prevents the handling of certain touches. Completely ignoring input isn't technically possible: third-party apps are not responsible for detecting input, they are only responsible for handling it. The detection of touch input is done by iOS.
The only way to build a case like the one you describe in the comments is to hope that iOS won't interpret the input from your case as a "finger", because it will most likely cover an area much bigger than a finger.
So in conclusion, the best way would be to change the material of the case you're about to build, or at least give it a non-conductive coating. From a third-party developer's point of view, there is no way to achieve your goal in software if five fingers are needed, as described in the comments.
There are a couple of methods in UIView that you can override:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event; // recursively calls -pointInside:withEvent:. point is in the receiver's coordinate system
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event; // default returns YES if point is in bounds
Overriding these should prevent touchesBegan: and the other touch methods from being called.
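As a hedged sketch of that idea (assuming the buttons you want to keep active are direct subviews of the view doing the filtering; the class name is illustrative):

#import <UIKit/UIKit.h>

@interface TouchFilterView : UIView
@end

@implementation TouchFilterView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Only claim touches that land on one of the button subviews;
    // everything else is rejected, so touchesBegan: never fires for it.
    for (UIView *subview in self.subviews) {
        if ([subview isKindOfClass:[UIButton class]] &&
            CGRectContainsPoint(subview.frame, point)) {
            return YES;
        }
    }
    return NO;
}

@end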
Regards.
I have a nice solution for this. Whatever the area where you want to block interaction, place a transparent button on top of that area.
touchesBegan is a default method, so it will be called every time a touch happens on the view; there is no way around that. But you can still set
self.buttonPlayMusic.userInteractionEnabled = NO;
for the objects that don't need touches; maybe this could help you get your desired output.
Have you tried using a UIScrollView as the background, i.e. the area where you do not want touch events to be fired?
UIScrollView does not call the touch methods.
You can add a UIImageView over the area where you want to disable touch events. Add the UIImageView object on top of self.view's subviews.
Example
// area -- the area where you want to disable touches on self.view
UIImageView *imageView = [[UIImageView alloc] initWithFrame:area];
// UIImageView's userInteractionEnabled defaults to NO; enable it so the
// overlay actually intercepts touches in this area.
imageView.userInteractionEnabled = YES;
[self.view addSubview:imageView];
Your touch handler will always be called this way, but if you are doing some task on touch, then you can handle it like this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    UIButton *touchObject = (UIButton *)[touch view];
    if ([touchObject isKindOfClass:[UIButton class]])
    {
        // Do whatever you want on button touch.
    }
    else {
        return;
    }
}

touchesEnded:withEvent: not called when a drag event moved out of the screen top/bottom

My problem is quite strange but simple.
I subclassed UIScrollView with my custom MyScrollView, where I disabled scrolling:
self.scrollEnabled = NO;
which means that, apart from
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
all the other UIScrollViewDelegate methods won't be called.
In MyScrollView I scroll the content myself by tracking the user's touch movement on the screen, that is to say no flicking and no bouncing; my implementation is in the touchesMoved:withEvent: method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // NSLog(@"touch.phase: %d", touch.phase);
    CGPoint currentPosition = [touch locationInView:self];
    CGPoint lastTouchPosition = [touch previousLocationInView:self];
    CGFloat deltaY = lastTouchPosition.y - currentPosition.y;
    CGFloat contentYOffset = self.contentOffset.y + deltaY;
    [self setContentOffset:CGPointMake(0, contentYOffset) animated:NO];
}
After the user's drag movement has finished, I run my own method based on the content offset of MyScrollView in touchesEnded:withEvent::
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // some implementation based on the current content offset
}
When the user moves a finger on the screen and lifts it while still inside the screen, touchesEnded:withEvent: gets called and my own method runs correctly.
BUT when the user drags a finger from inside the screen out past the top or bottom edge and then lifts it, touchesEnded:withEvent: is never called. It seems iOS doesn't treat the move out of bounds (top or bottom) as a touches-end event, and iOS has no idea what is going on once the touch is outside the screen bounds.
Someone may suggest checking the current position in touchesMoved:withEvent: to see whether it is still in bounds. This MAY work when the movement is very slow, but when you move very fast the system cannot detect every point of the path; the movement seems to be sampled at a certain time interval.
Can anyone help me figure out how to detect whether the user's finger has moved out of bounds?
I think the touchesCancelled:withEvent: method will be called!
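If so, a minimal, untested sketch would simply route cancellations to the existing end handling:

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Treat a cancelled touch (e.g. the finger sliding off the screen edge)
    // the same as a finished one, so the offset-based handling still runs.
    [self touchesEnded:touches withEvent:event];
}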
I have resolved this problem.
Because UIScrollView does so much work itself, we cannot handle some events ourselves; fortunately, a plain UIView does detect a touch moving out of bounds and will invoke touchesEnded:withEvent:.
Since replacing MyScrollView's superclass with UIView would have been too much work, I figured out a simpler way to solve it:
I added a TouchActionDetectView subclassed from UIView, whose job is to detect all the user's touch events and forward them to MyScrollView. Of course I had to clear the background color of TouchActionDetectView so it doesn't cover other view content.
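A minimal sketch of that overlay, assuming it keeps a reference to the scroll view it forwards to (the targetScrollView property is illustrative):

#import <UIKit/UIKit.h>

@interface TouchActionDetectView : UIView
// The scroll view that should receive the forwarded touch events.
@property (nonatomic, weak) UIScrollView *targetScrollView;
@end

@implementation TouchActionDetectView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.targetScrollView touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.targetScrollView touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // A plain UIView still gets this call when the finger slides past the
    // top/bottom edge, unlike the scroll view in the question.
    [self.targetScrollView touchesEnded:touches withEvent:event];
}

@end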

Drag image from one view to another view In iphone/ipad

I am facing the problem below.
Please have a look at the image below; what I want is this:
I want to drag default images from view 1 to view 2, and the images must always remain in view 1, so it is not a simple touch-move drag of the images themselves.
I tried a lot of things and I did manage to drag an image from one view to another, but in view 2 I am not able to get touch points, so the image is just added there as a frame.
I cannot touch that image across view 2 or move it within view 2.
I want to add other functionality like zooming later, but first I want to get touch points in view 2.
I am attaching an image describing this problem.
The edited question is:
I have made this simple demo in which I transfer one view to another view, and after getting view 2's points it stays within its boundary limits.
But how can I keep the default views where they are?
I will modify this code and add images to the view; it just shows my thinking here, so please guide me.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == view3) {
        CGPoint location = [touch locationInView:self.view];
        view3.center = location;
        return;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    int tmpValue = 0;
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDuration:0.5];
    {
        onceDone = TRUE;
        if (onceDone == TRUE)
        {
            UITouch *touch = [touches anyObject];
            CGPoint pt = [touch locationInView:self.view];
            NSLog(@"X = %f y = %f", pt.x, pt.y);
            view3.center = pt;
            if (view3.frame.origin.x < view2.frame.origin.x)
                [view3 setFrame:CGRectMake(159, view3.frame.origin.y, view3.frame.size.width, view3.frame.size.height)];
        }
    }
    [UIView commitAnimations];
}
Please help me.
It is urgent.
Thanks in advance
If you are using pan gestures or the touchesMoved method to drag and drop the images, make the superview of view1 and view2 the same UIView and calculate your gesture location or touched point with respect to that common superview. That means view1 and view2 will be on the same view, and each subview (image) is also added to that main view only. So you can move images on the main view wherever you want and perform any operation on them.
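A brief sketch of that conversion, assuming containerView is the shared superview, the dragged image view has been added directly to it, and the property names are illustrative:

- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    // containerView is the common superview of view1 and view2, so a point
    // computed here is meaningful on both sides.
    CGPoint location = [pan locationInView:self.containerView];
    pan.view.center = location;

    if (pan.state == UIGestureRecognizerStateEnded) {
        // Decide where the image was dropped using the shared coordinates.
        if (CGRectContainsPoint(self.view2.frame, location)) {
            NSLog(@"Dropped inside view2");
        }
    }
}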
We have been working on a similar issue, and we came up with this solution:
We add a UIPanGestureRecognizer to each button.
When we detect that the pan gesture has begun, we create the object we want to drag (in this case, it could be a copy of your image). You have to give it a frame equal to the button's, or if you create the object in a different view, you have to convert its coordinates so the object appears in the same position as the button, but inside its own view.
After that we move this newly created object, instead of moving the button.
And that's it. The trick here is that the UIPanGestureRecognizer keeps working until you lift your finger from the screen, and it fires while you move your finger anywhere on the screen, not only inside the button!
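For what it's worth, a rough sketch of that approach (sourceImageView and draggedCopy are illustrative names, and the copy is added to the controller's main view):

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleDrag:)];
    [self.sourceImageView addGestureRecognizer:pan];
    self.sourceImageView.userInteractionEnabled = YES;
}

- (void)handleDrag:(UIPanGestureRecognizer *)pan
{
    if (pan.state == UIGestureRecognizerStateBegan) {
        // Create a copy of the image so the original stays in view 1.
        CGRect startFrame = [self.view convertRect:pan.view.frame
                                          fromView:pan.view.superview];
        self.draggedCopy = [[UIImageView alloc] initWithFrame:startFrame];
        self.draggedCopy.image = self.sourceImageView.image;
        [self.view addSubview:self.draggedCopy];
    }
    else if (pan.state == UIGestureRecognizerStateChanged) {
        // Move the copy, not the original image view.
        self.draggedCopy.center = [pan locationInView:self.view];
    }
}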
I hope this works for you.

Passing touch events on to subviews

I have a view within a UIScrollView that loads an additional subview when the user presses a certain area. When this additional subview is visible, I want all touch events to be handled by it, and not by the scroll view.
It seems like the first couple events are being handled by the subview, but then touchesCancelled is called and the scrollview takes over the touch detection.
How can I make sure that the subview gets all the events as long as the movement activity is being performed on this view?
This is my implementation of touchesMoved, which I thought would do the job...
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[touches allObjects] objectAtIndex:0];
    CGPoint touchPt = [touch locationInView:self];
    UIView *hitView = [self hitTest:touchPt withEvent:event];
    UIView *mySubView = subviewCtrl.view;
    if (hitView == mySubView) {
        [subviewCtrl.view touchesMoved:touches withEvent:event];
    }
    else {
        NSLog(@"Outside of view...");
    }
}
The responder chain hierarchy "normally" goes from subview to superview, so you shouldn't need to do the hitTest in your superview. The problem that you are having is not that you need the superview to invoke touchesMoved on the subview, but rather that UIScrollView subverts the normal responder chain hierarchy by intercepting touch events in order to deliver a smooth scrolling experience to the user. If you don't want this behaviour, then you can disable this behaviour in the scrollView by sending it the following message:
[scrollView setDelaysContentTouches:NO];
Note that this will make sure that your subview has first crack at handling the events in question (provided that it is in fact the first responder). This can negatively impact the scrolling and zooming performance of the scrollView, however, which is probably why Apple sets it to YES by default.
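For example, a minimal sketch of where that could live, assuming the controller holds a scrollView outlet:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Hand touches to the subview right away instead of letting the scroll
    // view hold them back while it decides whether the gesture is a scroll.
    self.scrollView.delaysContentTouches = NO;
}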

Why doesn't the touch event arrive at my UIImageView?

I have some nested views like this:
First, there is a big UIView. Inside this, there is another UIView. And inside that, there is a MyNiceButtons class that inherits from UIView. So I have:
UIView > UIView > MyNiceButtons (= a UIView).
In detail, MyNiceButtons creates a UIImageView and adds it as a child view of itself, representing a graphical button. So inside the MyNiceButtons class implementation, I have this code to handle a touch event on the fake button image:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject]; // I just need single touches.

    // Just to see which view is affected: nudge it and color it yellow.
    UIView *tview = [touch view];
    CGRect rect = tview.frame;
    rect.origin.x = rect.origin.x + 20.0f;
    tview.frame = rect;
    [tview setBackgroundColor:[UIColor yellowColor]];

    if ([touch view] == self.fakeButtonImageView) {
        NSLog(@"touch on fake button detected");
    }
}
All views in the hierarchy have userInteractionEnabled=YES. The touch arrives at the first view in the hierarchy (the parent of the parent).
UIView *tview = [touch view];
This returns just the parent of the parent, but not the UIImageView that the user actually tapped on.
What's the trick to recognize if the touch was on the button UIImageView?
I would first suggest that you take a good look at what you can do with a custom-type UIButton (you can set images for various states). 95% of the time, this should address your needs.
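For that first suggestion, a small sketch of a custom-type UIButton with per-state images (the image names, frame, and action selector are placeholders):

UIButton *niceButton = [UIButton buttonWithType:UIButtonTypeCustom];
niceButton.frame = CGRectMake(20.0f, 20.0f, 100.0f, 44.0f);

// Different images for the normal and pressed states.
[niceButton setImage:[UIImage imageNamed:@"button_normal.png"] forState:UIControlStateNormal];
[niceButton setImage:[UIImage imageNamed:@"button_pressed.png"] forState:UIControlStateHighlighted];

// The button delivers its own touch handling; no touchesEnded: needed.
[niceButton addTarget:self action:@selector(buttonTapped:) forControlEvents:UIControlEventTouchUpInside];
[self addSubview:niceButton];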
If you have some other need, I can help you debug w/ hitTest, etc.