This is a general question about a specific problem.
I am using a UIScrollView in an app that displays photographs. On iOS < 4.0, zooming works great. The same app running on iOS 4.0.x has problems zooming: specifically, if the image does not fill the view (and black bands appear at the top/bottom), the first zoom is jerky and garbage data is drawn at the bottom of the screen.
The source code is far too complex and spread out to share adequately here. Can anyone suggest areas to look at that might cause this strange behavior?
Thanks!
Mark
Edit: here's the code from the double-tap handler (borrowed from the TapDetectingImageView sample code):
- (void)tapDetectingImageView:(TapDetectingImageView *)view gotDoubleTapAtPoint:(CGPoint)tapPoint {
    // Double tap zooms in by one step.
    float newScale = [self zoomScale] * ZOOM_STEP;
    CGRect zoomRect = [self zoomRectForScale:newScale withCenter:tapPoint];
    [self zoomToRect:zoomRect animated:YES];
}
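For context, the zoomRectForScale:withCenter: helper from that sample computes the rect (in content coordinates) that zoomToRect: should display; from memory it looks roughly like this (a sketch, not the exact shipped source):
- (CGRect)zoomRectForScale:(float)scale withCenter:(CGPoint)center {
    CGRect zoomRect;
    // At the target zoom scale, the visible rect shrinks by 1/scale.
    zoomRect.size.height = [self frame].size.height / scale;
    zoomRect.size.width  = [self frame].size.width  / scale;
    // Center the rect on the tapped point.
    zoomRect.origin.x = center.x - (zoomRect.size.width  / 2.0);
    zoomRect.origin.y = center.y - (zoomRect.size.height / 2.0);
    return zoomRect;
}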
I can force the weirdness if I change the animated parameter in the call to zoomToRect:. When animated is NO, my image becomes two images superimposed one on top of the other: the bottom image is at the original zoom level, the top image at the new zoom level. If I swipe the screen to pan, the image refreshes. It's almost as if a call to layoutSubviews or drawRect: is not getting made.
This may or may not be related, but the way that UIScrollView dealt with scale factors changed in an undocumented way in iPhone OS 3.2+.
Previously, if you used -scrollViewDidEndZooming:withView:atScale: to re-render your content by applying the identity transform to the view and then redrawing your image sharply at the new scale factor, UIScrollView would ignore this and keep handing you absolute scale factors based on the initial size of the view.
On iPhone OS 3.2+, UIScrollView now gives you a relative scale factor based on the last time you reset the transform of the content view to be the identity transform. This can lead to significant scaling differences between the various OS versions.
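If this is what's biting you, one workaround is to track the absolute scale yourself. A minimal sketch, assuming hypothetical currentScale (initialized to 1.0) and originalSize ivars, and a content view that can redraw itself sharply at a given size:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale {
    // On 3.2+ the scale handed to us is relative to the last time we
    // reset the transform, so accumulate it into an absolute factor.
    currentScale *= scale;
    view.transform = CGAffineTransformIdentity;
    // Resize the view to the absolute scale and redraw it sharply.
    view.frame = CGRectMake(0, 0, originalSize.width * currentScale,
                            originalSize.height * currentScale);
    scrollView.contentSize = view.frame.size;
    [view setNeedsDisplay];
}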
Related
I am creating a graphing calculator application in OpenGL on iOS, and I would like to implement pinch zooming that works like the Google Maps application or UIScrollView zooming. I'm pretty sure I can't use a UIScrollView because the content of the graph is generated dynamically.
Implementing zooming where the center of the screen is assumed to be the center of the zoom is easy, but in the other cases it is not obvious to me how it's implemented. Can anyone point me in the right direction?
The view that does your custom drawing should be capable of drawing at a given scale. When your user zooms in or out of the scroll view, send the view the appropriate zoom factor and let it redraw itself. The changing bounds are not a problem: the scroll view adjusts its bounds as you pan inside it, so its subviews shouldn't need to concern themselves with the scroll view's bounds.
Of course, you'll probably have to consider performance. For example, it may be too slow to redraw the graph at every frame as a user zooms. Instead, you may only want to redraw when the user stops zooming, which is what Maps.app does.
There are many other considerations for this problem, but that would be where I would start.
Edit: Ah, I overlooked the OpenGL aspect. Still, with a GL layer-backed view, you should be able to make the appropriate translations based on both the current zoom factor and the scroll view's bounds.
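For the arbitrary-center case, the usual trick is to keep the point under the fingers fixed while the scale changes. A rough sketch, assuming hypothetical scale and origin ivars that map graph coordinates to screen coordinates, plus a redrawGraph method of your own:
- (void)handlePinch:(UIPinchGestureRecognizer *)pinch {
    CGPoint anchor = [pinch locationInView:self.view];
    CGFloat newScale = scale * pinch.scale;
    // Shift the origin so the graph point under the fingers stays put:
    // screen = origin + graphPoint * scale, solved for the new origin.
    origin.x = anchor.x - (anchor.x - origin.x) * (newScale / scale);
    origin.y = anchor.y - (anchor.y - origin.y) * (newScale / scale);
    scale = newScale;
    pinch.scale = 1.0;   // reset so each callback reports an increment
    [self redrawGraph];  // re-render the GL scene with the new mapping
}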
I'm new to iPhone development and Obj-C, and I have several problems with a ScrollView/ImageView while getting close to a deadline. I used IB to create the interface, so I access most parameters via the builder.
1) I was using touch events (began/moved/ended) on the imageView to switch images. When I put the ImageView into the ScrollView, the old gestures stopped working and I can only zoom. Whether the scrollView or both have focus, I can't use my gestures even when zoomed out. How can I use both?
2) How do I zoom only the image part of the view? Unfortunately I also see the background area around it :/ What's worse, after rotating, the view keeps its old dimensions and I get even more black areas around the image. For some reason the image sits in the top-left corner.
The code snippets I found don't really help me much in this case. I have various images of different sizes in the imageView to switch and zoom in/out.
Edit: OK, let me put it a little differently: how do I override the scrollView's touch mode so that when the image is zoomed out (to screen size), the "normal" gestures work? Currently I get either scroll view scrolling or my gestures; I can't use both. Anyone?
Solved it by dynamically resizing the imageView to the image size and switching interaction like this:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale {
    // When fully zoomed out, hand touches to the image view so its
    // swipe gestures work; the scroll view takes over when zoomed in.
    if (scrollView.zoomScale == 1.0) {
        scrollView.userInteractionEnabled = NO;
        imageView.userInteractionEnabled = YES;
    }
}
Not the best solution, but it works well enough.
In my iPhone OS application I want (need) to watch for changes in the device orientation in order to rearrange certain portions of the screen. The approach I used was to call CGRect frame = [UIScreen mainScreen].applicationFrame to get the screen size, and from there calculate the size and/or positioning of other controls (I also tried self.view.frame).
All testing was done so far in Portrait mode, so I could focus on programming the main features and later just make some adjustments for Landscape. And here the problem enters: in -(void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation I added some logging to check the sizes before proceeding, but apparently the values for width and height are "wrong" (I say "wrong" because at first glance the values do not make sense to me).
Here's the output of some logging:
Rotation: Landscape [w=300.000000, h=480.000000]
Rotation: Portrait [w=320.000000, h=460.000000]
The values for "w" and "h" in Landscape seem inverted to me - I was expecting that w=480 and h=300.
What am I doing wrong? The code I used to debug is below.
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation {
    CGRect frame = [UIScreen mainScreen].applicationFrame;
    CGSize size = frame.size;
    NSLog(@"%@", [NSString stringWithFormat:@"Rotation: %s [w=%f, h=%f]",
          UIInterfaceOrientationIsPortrait(self.interfaceOrientation) ? "Portrait" : "Landscape",
          size.width, size.height]);
}
The orientation of your device changed, not the physical characteristics of the screen. You basically tipped it on its side, but in reality it is 320 pixels wide (20 of which are not available to you at the moment since the status bar is showing) and 480 pixels tall. If your view is auto-rotating, then the width/height have been translated for you, but when you ask for the actual dimensions of the screen, you get back the actual dimensions of the screen.
This is also why, when working with translated views, it is important to do calculations based on the view's center and the view's bounds and never on the view's frame.
Use self.view.bounds instead.
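In other words, something like this should log the rotated dimensions (a sketch of the same logging using bounds):
- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation {
    // The view's bounds are in the view's own (rotated) coordinate
    // system, unlike the screen's applicationFrame.
    CGSize size = self.view.bounds.size;
    NSLog(@"Rotation: %@ [w=%f, h=%f]",
          UIInterfaceOrientationIsPortrait(self.interfaceOrientation) ? @"Portrait" : @"Landscape",
          size.width, size.height);
}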
I've been stumbling over the same problem, and with some diagnostic work I discovered that the view's bounds and frame do not accurately reflect the landscape orientation in viewDidLoad (when the device is held in landscape while the view controller is pushed onto the stack), but they do in viewWillAppear: and viewDidAppear:. I simply moved the code that needed the frame/bounds dimensions from viewDidLoad to viewWillAppear:, and it worked properly.
Sorry for the long-winded post.
I am trying to understand UIScrollView and running into a very simple problem:
- I am creating a scroll view.
- I am making this view 1.5 times larger than its normal size.
- Using UIScrollView, I expect to see some edge elements of the view fall out of bounds, but I should be able to pan the view, bringing the missing elements back into the visible area.
However, I am seeing that I can't pan/scroll the view any way I want; instead, the view always wants to scroll back as soon as I move my finger away from the screen (touch ended event).
I am not handling any touches myself; I just want to understand why the scaled view does not stay put where I scroll it.
CGRect viewFrame = self.view.frame;
viewFrame.size.width *= 1.5;
viewFrame.size.height *= 1.5;
CGSize mySize = viewFrame.size;
[(UIScrollView *)self.view setContentSize:mySize];
self.view.transform = CGAffineTransformMakeScale(1.5, 1.5);
What I am really trying to accomplish is something similar to Numbers on the iPad (the same code will work on iPhone):
- There is a view with lots of controls on it (an order entry form).
- The user can zoom into the entire form so all elements look bigger.
- The user can pan the form, bringing various elements into the visible area of the screen.
It seems that UIScrollView should be able to handle zoom and pan actions (for now I am using an affine transform to zoom in to the order entry form on the iPad).
Thanks
When you transform a view, you transform its internal coordinate system as well. This means that if you scale a view, the view still "thinks" it is the same size it was before the scale because its coordinate units scaled as well.
For example, if you have an image view that has a size of (50,50) and you transform it so that it covers (200,200) on the screen, when you ask the image view its size it will report that its size is still (50,50).
Scroll views are unusual types of views because they have to understand their absolute size relative to the physical device screen in order to work properly. When you transform their coordinate system, they lose that connection to the physical device screen and can no longer function properly. This is what you are seeing.
I haven't done this, but I'm pretty sure that to create the illusion of a zoom in a scroll view, you increase the scroll view's contentSize and then transform its subviews (or transform the subviews and then grow the contentSize to contain the new subview size). That is the only way to keep the scroll view in sync with the physical device screen.
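A minimal sketch of that idea, assuming a hypothetical contentView that holds all the form's controls inside the scroll view:
CGFloat zoom = 1.5;
// Scale the content view, not the scroll view itself.
contentView.transform = CGAffineTransformMakeScale(zoom, zoom);
// After the transform, contentView.frame reflects the scaled size;
// grow contentSize so the scroll view can pan over all of it.
scrollView.contentSize = contentView.frame.size;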
My application consists of a UIImageView inside a UIScrollView, and I display a big image inside of it. The scroll view allows the user to pinch to zoom in/out in the image, and that all seems to work just fine.
However, when my application is terminated and then re-launched, the UIScrollView displays the image at the original zoom level again (which is currently set to display the whole image, scaling it in "aspect fit" mode).
I would really like to be able to re-launch my app and have the UIScrollView reopen with the same parameters as it was set when the app terminated. So if my image is currently zoomed in to the max, and scrolled all the way to the bottom left of it, that should be the view when I open the app again.
How can I do this?
Check the .transform property of the view. If you are zoomed in, this should be modified. Save and restore it on the next launch.
I have found a way to programmatically zoom a UIScrollView. This may help you set the desired zoom level upon startup.
The sample code, together with ZoomScrollView class that encapsulates the required zooming magic and a detailed discussion of how (and why) UIScrollView zooming works is available at github.com/andreyvit/ScrollingMadness/.
Someone asked this a while ago, but I don't think you'll like the answer. Hopefully you'll get a better one.
I do it like this:
When you quit, save the zoom level (myScrollview.zoomScale) and the content view's frame origin.
When you reopen, you can set the zoomScale. You also have to set the content size to match the new zoom level, because otherwise the boundaries are not correct:
[myScrollview setZoomScale:savedZoomScale];
CGSize newsize = CGSizeMake(origsize.width * savedZoomScale,
                            origsize.height * savedZoomScale);
myScrollview.contentSize = newsize;
To set the offset:
[myScrollView setContentOffset:savedFrameOrigin animated:YES];
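To persist those values across launches, NSUserDefaults is the simplest route; a sketch (the key names are my own):
// On quit (e.g. in applicationWillTerminate:):
NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
[defaults setFloat:myScrollview.zoomScale forKey:@"savedZoomScale"];
[defaults setFloat:myScrollview.contentOffset.x forKey:@"savedOffsetX"];
[defaults setFloat:myScrollview.contentOffset.y forKey:@"savedOffsetY"];

// On the next launch, read them back before restoring the zoom and offset:
float savedZoomScale = [defaults floatForKey:@"savedZoomScale"];
CGPoint savedFrameOrigin = CGPointMake([defaults floatForKey:@"savedOffsetX"],
                                       [defaults floatForKey:@"savedOffsetY"]);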