How to make certain parts of an image clickable in iOS? - iphone

In one of my apps I am using an image for the whole screen, which can be zoomed to some extent. That image has eight different shapes (people, shapes, etc.). What I am trying to do is make each shape of the image clickable, so that touching each part takes the user to a different screen. I have no idea how to achieve this; I googled it but found no solution.
1.) Is this possible using coordinates? (Will the normal image and the zoomed image differ in coordinates?) How can I achieve this using coordinates?
2.) If not, what would be the best approach to achieve my goal?
Any ideas/samples are much appreciated.

I would add a UITapGestureRecognizer to the image view holding your image, and use the locationOfTouch:inView: method to determine the coordinates of your touch.
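A minimal sketch of that idea, assuming the image view is exposed as self.imageView; handleTap: and the shape rect are made-up placeholders:

// Setup, e.g. in viewDidLoad:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleTap:)];
self.imageView.userInteractionEnabled = YES; // UIImageView defaults to NO
[self.imageView addGestureRecognizer:tap];

// Handler in the same view controller:
- (void)handleTap:(UITapGestureRecognizer *)gesture {
    // The point is in the image view's own coordinate space, so rects
    // defined there keep working even when a scroll view zooms the view.
    CGPoint p = [gesture locationInView:self.imageView];
    if (CGRectContainsPoint(CGRectMake(20, 40, 100, 100), p)) {
        // Hypothetical rect for shape 1: present that shape's screen here.
    }
}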

Correct me if I misunderstand your question, but that should be very simple: just have a couple of buttons with clear backgrounds, all placed on top of the image.

Check UIResponder and the touch methods in there. You'll probably want to hook into something like -touchesEnded:withEvent: to detect when a finger lifts off the screen.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:imageView];
        // Check whether the point falls inside one of your shape rects
        // and, if it does, react to it (e.g. present the matching screen).
    }
}
Also, see the UITouch class reference.

Related

How to move a view with my finger in iPhone Objective-C

I have a view whose size is 1280 x 345, and I'm moving it left and right on the screen.
My main question is: how do I make it move with my finger (not with swipeLeft/swipeRight gestures)? I want it to follow my finger like the home screen of the iPhone's iOS.
Right now I'm using this code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:infoView];
    //[self.view setTransform:CGAffineTransformTranslate(CGAffineTransformIdentity, (location.x + self.view.frame.origin.x - dx), 0.0)];
    NSLog(@"touch: %f", location.x);
    infoView.frame = CGRectMake(infoView.frame.origin.x + location.x, 0, 1280, 345);
}
That should be roughly the right approach, but I can't figure it out.
I have also tried to find an answer on Google and here, but, as you know, I didn't find anything useful.
I've also made this so you can understand it better.
Don't write code to do this yourself. Use UIScrollView. This is what it is designed for.
If you want to have the same effect as the iPhone/iPad home screen you should use a UIScrollView and enable paging.
If you really want to handle it yourself for some reason, avoiding UIScrollView, there are UIGestureRecognizers, specifically UIPanGestureRecognizer, which handle most of the work of tracking multiple touch events in one place.
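For the scroll view route, a minimal sketch of the paging setup, assuming infoView is the 1280 x 345 view from the question:

// In viewDidLoad:
UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:
    CGRectMake(0, 0, self.view.bounds.size.width, 345)];
scrollView.contentSize = infoView.bounds.size;  // 1280 x 345
scrollView.pagingEnabled = YES;                 // snap page by page, like the home screen
scrollView.showsHorizontalScrollIndicator = NO;
[scrollView addSubview:infoView];
[self.view addSubview:scrollView];

With pagingEnabled set, the scroll view snaps to multiples of its frame width, which is exactly the home-screen feel.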
I used UIPageControl and UIScrollView to make it. Very easy and smart objects!

How to detect the boundary of a UIImage

In my application I want to detect boundaries within a UIImage. I have a flower image that has many parts (leaf, stalk, bulb, etc.) as a single image. If I touch a particular leaf, it should find the boundary of that particular leaf and return its value.
I have no idea how to do this. Please, can anyone help me out?
You can use "Flood Fill Algorithm" for that.
Only getting the touch points is not sufficient for this answer.
You need to break all in pixels and then you will use that further.
You could implement some edge detection algorithm, but unless you have some experience in image processing I think that would be a major headache.
If you have the image in advance (do you?), you could separate the image into a composition of masks and detect the touches in the corresponding mask. The idea is that for each part of the image you have a corresponding bitmap mask. When you receive a touch, you check all the masks to see which one has an opaque pixel at the location of the touch. It is a fairly low-level Quartz technique, but I think it is much more approachable than edge detection. Check out the relevant Quartz documentation.
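A rough sketch of that mask check, assuming each part of the flower has its own UIImage mask; the method name is made up for illustration:

// Returns YES if `mask` has a nonzero alpha pixel at `point`.
- (BOOL)maskImage:(UIImage *)mask coversPoint:(CGPoint)point {
    unsigned char alpha = 0;
    // A 1x1 alpha-only bitmap: drawing the image offset by -point makes
    // the pixel under `point` land on this single byte.
    CGContextRef context = CGBitmapContextCreate(&alpha, 1, 1, 8, 1, NULL,
                                                 (CGBitmapInfo)kCGImageAlphaOnly);
    UIGraphicsPushContext(context);
    [mask drawAtPoint:CGPointMake(-point.x, -point.y)];
    UIGraphicsPopContext();
    CGContextRelease(context);
    return alpha > 0; // opaque here means this mask covers the touch
}

In your touch handler you would loop over the masks, call this for each one, and react to whichever mask reports a hit.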
I think you can do this by subclassing UIImageView and detecting the touch event and its position.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // position holds the touch location in the image view's own coordinates
    CGPoint position = [touch locationInView:self];
}
I haven't tried this, but it seems like it will solve your problem.

How to get touch coordinates in an image, then draw a markup icon and show its content in a popover view

I would like to build an iPhone/iPad application to show large images (in a scroll view or something else that supports dragging and zooming) that allows the user to:
Touch somewhere in the image to add a markup and leave a comment
Tap on that markup icon/button to view the comment in a popover view
Edit the comment or remove the markup
So I want to ask:
How can I get the touch coordinates in the image (not the screen)?
How can I draw a markup icon/button at the touch point in the image so that it follows the image even when dragging and zooming, given that the image is really large, maybe up to 8000 x 6000 pixels?
How can I display the comment/note when the user touches the markup icon/button, in a view like a popover on the iPad?
How can I save and load this information?
It is quite similar to the tagging functionality of the Facebook app on the iPhone.
Any help is appreciated, thanks in advance!
1. Subclass UIImageView and override the touch methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
2. Add a UIButton to the UIImageView with [yourImageView addSubview:yourButton]; then set the center of your button to the touch coordinates (see the sketch below).
3. Just present a popover when the user taps a button. (You can set the tag property on the buttons to know which one was tapped.)
4. Save the data to a plist in the Documents directory if it is not too complex, or use Core Data.
Good luck. Just post comments if you need more help.
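A minimal sketch of steps 1 and 2 combined, inside the UIImageView subclass; markerTapped: is an assumed selector you would implement to show the popover:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    UIButton *marker = [UIButton buttonWithType:UIButtonTypeInfoDark];
    marker.center = point;              // pin the marker to the touch location
    marker.tag = [self.subviews count]; // crude hypothetical way to identify markers
    [marker addTarget:self action:@selector(markerTapped:)
     forControlEvents:UIControlEventTouchUpInside];
    [self addSubview:marker];           // as a subview it moves and zooms with the image
}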
Edit:
You need to set userInteractionEnabled to YES on the UIImageView.

userInteractionEnabled - A Boolean value that determines whether user events are ignored and removed from the event queue.

@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled

Discussion: This property is inherited from the UIView parent class. This class changes the default value of this property to NO.

Availability: Available in iOS 2.0 and later. Declared in UIImageView.h
From the UIImageView Class Reference

How to code zooming and panning for a UIImageView?

I have a UIImageView object, attached to a controller. It displays fine. What is an easy way to get zooming and panning with the least amount of code? Perhaps some library out there does this? Hard to believe the SDK does not provide anything.
Add your UIImageView as a subview of a UIScrollView and make sure you change the minimumZoomScale and/or maximumZoomScale (both default to 1.0, which disables zooming).
Also take a look at the documentation for UIScrollView; there might be other settings you want to tweak.
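A minimal sketch of that setup, assuming the controller owns self.imageView and adopts <UIScrollViewDelegate>:

- (void)viewDidLoad {
    [super viewDidLoad];
    UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
    scrollView.delegate = self;
    scrollView.minimumZoomScale = 0.5;
    scrollView.maximumZoomScale = 4.0;
    scrollView.contentSize = self.imageView.bounds.size;
    [scrollView addSubview:self.imageView];
    [self.view addSubview:scrollView];
}

// Without this delegate method the scroll view will not zoom at all.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return self.imageView;
}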
UIScrollView is the SDK class you're looking for
The Three20 project has a photo viewer you may want to look into, that I believe supports zooming and panning in a larger image.
Try using:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}
Here is some code
http://www.redcodelabs.com/2009/09/objective-c-zoom-image/

How can an underlying view know if its rectangle got touched, no matter if directly or indirectly?

I have a UIView which is transparent and covers almost the whole screen; I left 50 pixels free at the top. It is a child of the view controller's view.
Underneath that UIView there's MyView, a UIView subclass which matches the screen size. Inside this MyView class, I check for a touch on it very simply with this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self) {
        NSLog(@"MyView touched");
    }
}
Now the funny thing is, of course, that if the user touches the transparent UIView which covers MyView, I don't get "MyView touched" in the console. But when the user touches the small uncovered area of MyView at the top of the screen, the touch arrives there.
That's logical to me, because I check for [touch view] == self. But what if I wanted to know that the rectangular area of MyView got touched, no matter whether directly or indirectly?
Is there a way to catch any touch that occurs on the screen/window and then just check whether it falls inside the rectangular area of the view?
You should study the iPhone Application Programming Guide's section on Touch Events for the background you're looking for. The concept you want to master is the Responder Chain, so also look through the reference on UIResponder to understand what it's doing. You can definitely do everything you're talking about, and the full discussion is in the link above.
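As a concrete illustration of the rectangle check, one approach is to do it from the transparent overlay's own touch handler and convert the point into MyView's coordinate space. A sketch, assuming the overlay keeps a myView reference to the covered view:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // locationInView: converts the touch into the given view's coordinates.
    CGPoint p = [[touches anyObject] locationInView:self.myView];
    if (CGRectContainsPoint(self.myView.bounds, p)) {
        NSLog(@"MyView's rectangle was touched, directly or through the overlay");
    }
}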