Resizing multiple image views with orientation - iPhone

I'm in the process of creating a universal iOS app that, amongst other functions, allows the user to spawn UIImageViews on touch.
The issue that I'm having is that when I rotate the device, the views that are created do not resize correctly.
I have placed the code in my main view controller's touch-handling method, as shown:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // first part - creating the frame
    view = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"MyImage.png"]];
    view.frame = CGRectMake(50, 50, 100, 100);
    [self.view addSubview:view];
    // second part - triggering image placement on touch
    UITouch *touch = [[event allTouches] anyObject];
    view.center = [touch locationInView:self.view];
}
Does anybody know of any way in which I can implement an autoresizing method for the views that are dynamically created?
The effect that I'm looking for: instead of the images staying at their old portrait coordinates, they should resize on an orientation change into landscape, so that they all remain visible at positions relative to their portrait ones.
If I were using interface builder then it would be easy enough, but I can't find an obvious solution when doing this programmatically.
It's not as simple as returning YES for the shouldAutorotateToInterfaceOrientation, either.
I've checked a lot of the question/answer resources for iOS development, and not many of the solutions seem to focus on dynamically created UIImageViews and frames such as in my example.
I would be extremely grateful if anybody could take the time to provide me a solution, or even just point me in the right direction.
Thank you in advance,
Rory

Set the autoresizing mask accordingly. Every UIView has an autoresizingMask property of type UIViewAutoresizing:
view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
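As a minimal sketch of how that might look applied to the touchesBegan: code from the question (same ivar names as there; the flexible margins are added so the position, not just the size, tracks the rotation):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    view = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"MyImage.png"]];
    view.frame = CGRectMake(50, 50, 100, 100);

    // Let the view track changes to the superview's bounds (e.g. rotation).
    // Flexible width/height scale the view; flexible margins keep its
    // position proportional to where it sat in portrait.
    view.autoresizingMask = UIViewAutoresizingFlexibleWidth
                          | UIViewAutoresizingFlexibleHeight
                          | UIViewAutoresizingFlexibleLeftMargin
                          | UIViewAutoresizingFlexibleRightMargin
                          | UIViewAutoresizingFlexibleTopMargin
                          | UIViewAutoresizingFlexibleBottomMargin;

    [self.view addSubview:view];

    UITouch *touch = [[event allTouches] anyObject];
    view.center = [touch locationInView:self.view];
}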

Related

How to make certain part of image clickable in ios?

In one of my apps I am using an image that fills the whole screen and can be zoomed to some extent. That image contains eight different shapes (a person, other shapes, etc.). What I am trying to do is make each shape of the image clickable, so that touching each part takes the user to a different screen. I don't have any idea how to achieve this, and Googling turned up no solution.
1.) Is this possible using coordinates? (Will the normal image and the zoomed image differ in coordinates?) How would I achieve this with coordinates?
2.) If not, what would be the best approach to achieve my goal?
Any ideas/samples are much appreciated.
I would add a UITapGestureRecognizer to the image view holding your image, and use its locationOfTouch:inView: method to determine the coordinates of your touch.
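A sketch of that approach (the shape rectangle and the screen-switching code are placeholders; locationInView: is the single-touch convenience for locationOfTouch:inView:):

// Setup, e.g. in viewDidLoad (image views need user interaction enabled):
imageView.userInteractionEnabled = YES;
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(imageTapped:)];
[imageView addGestureRecognizer:tap];
[tap release];

- (void)imageTapped:(UITapGestureRecognizer *)gesture
{
    // Asking for the location in the image view's own coordinate space
    // keeps the shape rectangles valid even while a scroll view zooms it.
    CGPoint point = [gesture locationInView:gesture.view];

    // Placeholder rect; define one per tappable shape.
    CGRect personRect = CGRectMake(20, 40, 100, 150);
    if (CGRectContainsPoint(personRect, point)) {
        // present the screen for this shape
    }
}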
Correct me if I misunderstand your question, but this should be very simple: just have a couple of buttons with clear backgrounds, all placed on top of the image.
Check UIResponder and the touches methods there. You'll probably want to hook into something like -touchesEnded:withEvent: to detect when a finger lifts off the screen.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:imageView];
        // Check whether the point falls inside one of the shapes' rects
        // If it does, react to it
    }
}
Also, see the class reference for UITouch.

How to move a view with my finger in iPhone Objective-C

I have a view whose size is 1280 × 345, and I'm moving it left and right on my screen.
Now, my main question is how do I make it move with my finger (not with swipe left / swipe right gestures)? I want it to move with my finger like the home screen of iOS on the iPhone.
Currently I'm using this code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:infoView];
    //[self.view setTransform:CGAffineTransformTranslate(CGAffineTransformIdentity, (location.x + self.view.frame.origin.x - dx), 0.0)];
    NSLog(@"touch: %f", location.x);
    infoView.frame = CGRectMake(infoView.frame.origin.x + location.x, 0, 1280, 345);
}
That seems like it should be the right way, but I can't figure it out.
I have also tried to find an answer on Google and here, but, as you know... I didn't find anything useful.
I've also made this so you can understand it better.
Don't write code to do this yourself. Use UIScrollView. This is what it is designed for.
If you want to have the same effect as the iPhone/iPad home screen you should use a UIScrollView and enable paging.
If you really want to handle it yourself for some reason, avoiding UIScrollView, there are UIGestureRecognizers, specifically UIPanGestureRecognizer, which handles most of the work of tracking multiple touch events in one place; a sketch follows.
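A rough sketch of the pan-gesture route (assuming an infoView ivar as in the question):

// Attach once, e.g. in viewDidLoad:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
[infoView addGestureRecognizer:pan];
[pan release];

- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    // The translation is cumulative since the gesture began, so apply it
    // and reset it to zero each time to get incremental movement.
    CGPoint translation = [pan translationInView:self.view];
    CGPoint center = infoView.center;
    center.x += translation.x; // horizontal movement only
    infoView.center = center;
    [pan setTranslation:CGPointZero inView:self.view];
}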
I used UIPageControl and UIScrollView to make it. Very easy and smart objects!
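For reference, a minimal sketch of that paging setup (the scrollView/pageControl names and their wiring are assumptions; the sizes are from the question):

// e.g. in viewDidLoad:
scrollView.pagingEnabled = YES;                  // snap a page at a time, like the home screen
scrollView.contentSize = CGSizeMake(1280, 345);  // full width of the movable view
scrollView.showsHorizontalScrollIndicator = NO;
scrollView.delegate = self;

// Keep the page control in sync once the scroll settles:
- (void)scrollViewDidEndDecelerating:(UIScrollView *)sv {
    pageControl.currentPage = sv.contentOffset.x / sv.bounds.size.width;
}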

iOS: Can I override pinch in/out behavior of UIScrollView?

I'm drawing a graph on a UIView, which is contained by a UIScrollView so that the user can scroll horizontally to look around the entire graph.
Now I want to zoom the graph when a user pinches with two fingers, but instead of zooming the view at the same rate in the X and Y directions, I want to zoom only in the X direction, changing the X scale without changing the Y scale.
I think I have to catch the pinch in/out gesture and redraw the graph, overriding the default zooming behavior.
But is there a way to do this?
I've been having a very difficult time catching the pinch gesture on the UIScrollView, as it cancels the touches when it starts to scroll. I want the zooming to work even after the UIScrollView cancels the touches. :(
Thanks,
Kura
Although you cannot delete the existing pinch gesture recognizer, you can disable it and then add your own:
// Disable the existing recognizer
for (UIGestureRecognizer *recognizer in [_scrollView gestureRecognizers]) {
    if ([recognizer isKindOfClass:[UIPinchGestureRecognizer class]]) {
        [recognizer setEnabled:NO];
    }
}
// Add our own
UIPinchGestureRecognizer *pinchRecognizer =
    [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(pinch:)];
[_scrollView addGestureRecognizer:pinchRecognizer];
[pinchRecognizer release];
Then in
- (void) pinch:(UIPinchGestureRecognizer*)recognizer { .. }
use
[recognizer locationOfTouch:0 inView:..]
[recognizer locationOfTouch:1 inView:..]
to figure out if the user is pinching horizontally or vertically.
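A sketch of what that handler might look like (the graph-rescaling message at the end is a hypothetical method on this controller):

- (void)pinch:(UIPinchGestureRecognizer *)recognizer
{
    if ([recognizer numberOfTouches] < 2) return;

    CGPoint a = [recognizer locationOfTouch:0 inView:_scrollView];
    CGPoint b = [recognizer locationOfTouch:1 inView:_scrollView];

    // Fingers separated mostly along X: treat it as a horizontal pinch.
    BOOL horizontal = fabs(a.x - b.x) > fabs(a.y - b.y);

    if (horizontal && recognizer.state == UIGestureRecognizerStateChanged) {
        // Hypothetical graph method: stretch only the X axis by the
        // incremental scale, then reset so the next callback is incremental.
        [self setGraphXScale:[self graphXScale] * recognizer.scale];
        recognizer.scale = 1.0;
    }
}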
You should instead access the gestureRecognizers property (defined on UIView); there are several of them being used by the scroll view. Figure out which one is the pinch recognizer and call removeGestureRecognizer: on the scroll view, then create your own to do the work and add it with addGestureRecognizer:.
These are all public API, but which recognizers are installed, and in what order, is not (currently), so program defensively when accessing them. (This is a perfectly valid way to manipulate UIKit views, and Apple won't/shouldn't have issues with it, though they will not guarantee it works in any future release.)
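A minimal sketch of that remove-and-replace variant (defensive, per the caveat above; pinch: is the same handler as in the previous answer):

// Copy the array so we don't mutate it while enumerating.
NSArray *recognizers = [[[_scrollView gestureRecognizers] copy] autorelease];
for (UIGestureRecognizer *recognizer in recognizers) {
    // Don't assume position or count; test the class of each one.
    if ([recognizer isKindOfClass:[UIPinchGestureRecognizer class]]) {
        [_scrollView removeGestureRecognizer:recognizer];
    }
}
UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc]
    initWithTarget:self action:@selector(pinch:)];
[_scrollView addGestureRecognizer:pinch];
[pinch release];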
You should be able to subclass UIScrollView and override the touchesBegan: method. Don't call [super touchesBegan:], but instead adjust the zoom as you like:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Anything you want. Probably you would want to store all the touches
    // or their values, so that you can compare them to the touches
    // in the touchesEnded: method,
    // thus letting you know what the pinch amount was
}
If you like, you can judge whether it's a pinch or not; if it's not, call the super method, and only handle custom pinches yourself. A sketch of such a subclass follows.
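A sketch of what that subclass might look like, following this answer's approach (whether the scroll view actually delivers these touches can vary by iOS version, so treat it as a starting point, not a definitive implementation):

@interface PinchableScrollView : UIScrollView {
    NSMutableSet *_activeTouches; // touches captured for a custom pinch
}
@end

@implementation PinchableScrollView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([[event allTouches] count] >= 2) {
        // Two or more fingers: keep the touches for ourselves and record
        // them so touchesMoved/Ended can compute the pinch amount.
        if (!_activeTouches) _activeTouches = [[NSMutableSet alloc] init];
        [_activeTouches unionSet:touches];
    } else {
        [super touchesBegan:touches withEvent:event]; // normal scrolling
    }
}

- (void)dealloc {
    [_activeTouches release];
    [super dealloc];
}

@end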
Edsko's & bshirley's answers are good, but they don't say where to place the code.
First, I placed it in the viewDidLoad method, but no pinch gesture recognizer was found in the scroll view (maybe because my scroll view is an IBOutlet).
Then I tried viewWillAppear and viewDidAppear, and the UIPinchGestureRecognizer was there.

How to detect touch on UIImageView inside UIScrollView?

I have a UIScrollView in my app and I populate it with LOTS of UIImageViews, approximately 900. They are all very small and consist of only two different images repeated over and over.
Now I am having a lot of trouble detecting a touch on one of these UIImageViews.
I have assigned each one a unique tag so as to be able to distinguish between them, but I am really struggling to detect the touch.
The goal is just to be able to change the image of the touched UIImageView.
Due to the large number of views involved, a simple loop checking the touch coordinates against each UIImageView's frame just hangs my app.
Does anyone have any suggestions?
Thanks in Advance.
Ben
In my case the easiest way to do it was adding a gesture recognizer:
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(singleTapGestureCaptured:)];
// The default value of cancelsTouchesInView is YES, which would prevent
// buttons in the scroll view from being tapped
singleTap.cancelsTouchesInView = NO;
[myScrollView addGestureRecognizer:singleTap];
[singleTap release];
Then use this method to capture the touch:
- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
    CGPoint touchPoint = [gesture locationInView:myScrollView];
}
UIImageView has touch processing turned off by default. To change that, set userInteractionEnabled to YES (see the docs here http://developer.apple.com/library/ios/#DOCUMENTATION/UIKit/Reference/UIImageView_Class/Reference/Reference.html).
Also, if you want to know which view in a view hierarchy was hit, you can use hitTest:withEvent: (see the docs in UIView), which will be much faster than looping through the hierarchy yourself.
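Combining the two points, a sketch of resolving and swapping the tapped image from the gesture callback above (the image names and the swap logic are placeholders):

- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
    CGPoint touchPoint = [gesture locationInView:myScrollView];

    // hitTest: walks the hierarchy for us instead of looping over ~900 frames.
    // It only finds the image views if their userInteractionEnabled is YES.
    UIView *hit = [myScrollView hitTest:touchPoint withEvent:nil];
    if ([hit isKindOfClass:[UIImageView class]]) {
        UIImageView *tapped = (UIImageView *)hit;
        // Placeholder swap: toggle between the two images, e.g. by tag.
        tapped.image = [UIImage imageNamed:@"imageA.png"];
    }
}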
On a wider note, I don't think having 900 UIImageViews on screen all at once is going to be a good long-term strategy. Have you investigated drawing the screen's content via Core Graphics?
The best way to solve this issue is to subclass UIScrollView and add the touchesBegan: etc. methods to it.
@interface MyScroll : UIScrollView
{
}
@end
You could easily implement this method on the UIImageView object:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if ([self pointInside:point withEvent:event]) {
        // The touch landed inside this image view
    }
    return [super hitTest:point withEvent:event];
}
Use the return value to tell the application which view should be the next touch responder (such as the UIImageView or the scroll view).

How can an underlying view know if its rectangle got touched, whether directly or indirectly?

I have a UIView which is transparent and covers almost the whole screen. I left 50 pixels free at the top. It is a child of the view controller's view.
Underneath the UIView there's MyView, a UIView subclass that matches the screen size. Inside this MyView class, I check for a touch on it very simply with this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == self) {
        NSLog(@"MyView touched");
    }
}
Now the funny thing is, of course, that if the user touches the transparent UIView covering MyView, I don't get "MyView touched" in the console. But when the user touches the little uncovered area of MyView at the top of the screen, the touch arrives there.
That's logical to me, because I ask for [touch view] == self. But what if I wanted to know that the rectangular area of that MyView got touched (whether indirectly or directly)?
Is there a way to catch any touch that appears on the screen/window and then just check if it matches the rectangular area of the view?
You should study the iPhone Application Programming Guide's section on Touch Events for the background you're looking for. The concept you want to master is the Responder Chain, so also look through the reference on UIResponder to understand what it's doing. You can definitely do everything you're talking about, and the full discussion is in the link above.
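One technique from that material that matches your last question is intercepting events at the window level, e.g. by subclassing UIWindow and overriding sendEvent:. A sketch (wiring watchedView up to the MyView instance is left out):

@interface TouchWatchingWindow : UIWindow
@property (nonatomic, assign) UIView *watchedView; // the MyView instance
@end

@implementation TouchWatchingWindow
@synthesize watchedView;

- (void)sendEvent:(UIEvent *)event
{
    // Deliver the event normally first.
    [super sendEvent:event];

    // Then test every new touch against the watched view's rectangle,
    // regardless of which view actually received the touch.
    for (UITouch *touch in [event allTouches]) {
        if (touch.phase == UITouchPhaseBegan) {
            CGPoint point = [touch locationInView:watchedView];
            if ([watchedView pointInside:point withEvent:event]) {
                NSLog(@"MyView's rectangle was touched");
            }
        }
    }
}

@end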