Scaling subviews nested deep within UIScrollView - iPhone

This is causing me a headache of untold proportions.
I am trying to create a UML Drawing application for the iPad.
I currently have functionality to add classes to a 'canvas'. The canvas is simply a UIView, which is located within a UIScrollView.
Each class element is a UIView containing the class name, attributes and methods (UITextFields). These are all subviews of the canvas.
The problem I'm having is that when I zoom in on the classes using the pinch/zoom functionality, I just get blurry UITextFields. To my understanding, this is because the UIScrollView just applies a transform, so I need to handle the scaling myself.
The only way I can think of doing this, however, is to redraw the classes and make them larger, which could potentially distort their placement relative to each other. For example, if I had two classes side by side and the left one doubled in size, it might overlap the right one.
I've searched for hours for a solution and I'm getting nowhere. Does anyone know of a code sample or advice that illustrates a situation resembling mine? Or, failing that, an example that shows how to scale a UITextField (or UILabel) in the manner I've described?

This may not be the most obvious answer given how my question was phrased, but I solved the problem like so:
The first step is to implement:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale;
The key point I seemed to be missing (somehow) was that when the UIScrollView is zoomed, it then resets its scale factor to 1. So if you zoomed in to 125% and then zoomed out by 25%, you'd actually be looking at your original view at roughly 93%, not 100%.
This obviously throws off your drawing if, like me, you were trying to scale the subviews relative to their original size.
The best approach, then, is to keep track of the cumulative scale (I used a variable called scaleTracker) and use that value to scale your views.
More specifically for my problem, the following code was used:
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale
{
    // Accumulate the zoom factor, since the scroll view reports each zoom relative to the last one
    scaleTracker *= scale;

    // Undo the transform the scroll view applied, then resize the canvas manually.
    // canvas.original is a custom property holding the canvas's original, unscaled frame.
    [canvas setTransform:CGAffineTransformIdentity];
    canvas.frame = CGRectMake((int)canvas.original.origin.x,
                              (int)canvas.original.origin.y,
                              (int)(canvas.original.size.width * scaleTracker),
                              (int)(canvas.original.size.height * scaleTracker));
}
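For the placement problem described in the question, the same scaleTracker can be applied to each class view against a stored original frame. This is only a sketch of the idea; ClassView and its originalFrame property are hypothetical names, not part of the code above:
// Hypothetical: resize every class view relative to its own unscaled frame
// so the classes keep their positions relative to each other.
for (ClassView *classView in canvas.subviews) {
    CGRect orig = classView.originalFrame;   // assumed custom property
    classView.frame = CGRectMake(orig.origin.x * scaleTracker,
                                 orig.origin.y * scaleTracker,
                                 orig.size.width * scaleTracker,
                                 orig.size.height * scaleTracker);
}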
Good luck to anyone else who has issues with UIScrollView. For now, I hope this offers some help to anyone who is interested.

Related

CATiledLayer - how to use when async downloading images

In my application I have a large scrollview with lots of subviews on it; the subviews (image views) get added dynamically as the user scrolls. When the user scrolls, I fetch more images asynchronously from the server based on pageSize etc., and each image gets placed into its corresponding image view.
I have gone through Apple's WWDC Session 104 as well, which appears to be a good fit for offline images.
I also resize the images on my scrollview proportionally, which I believe is fine. The problem is that when the number of images on the scrollview increases, the application runs out of memory. It must be because I am using the images directly in image views instead of using CATiledLayer. So I am looking for help in displaying asynchronously downloaded images on a scrollview using CATiledLayer.
Many Thanks,
Reno Jones
I have completely edited the answer based on the comments.
You should set a controller as the delegate of the scrollview. Then, in the scrollViewDidScroll: method (or in scrollViewDidEndDecelerating: if you want to check less often), you can check whether some of your images are outside the visible part of the content view and release them (i.e. set the pointer to nil) to save some memory.
For your convenience, here's an example of how to get the visible part of your content:
CGRect visibleRect;
visibleRect.origin = scrollView.contentOffset;
visibleRect.size = scrollView.bounds.size;
Everything completely outside this rect should be released (of course you then have to reload the images if the user scrolls back; constantly loading and unloading can be tricky).
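Putting the two pieces together, a sketch of the release pass might look like this; it assumes the controller keeps its image views in a hypothetical imageViews array, which is not part of the question:
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    CGRect visibleRect;
    visibleRect.origin = scrollView.contentOffset;
    visibleRect.size = scrollView.bounds.size;

    for (UIImageView *imageView in self.imageViews) {   // assumed array of tile views
        if (!CGRectIntersectsRect(imageView.frame, visibleRect)) {
            imageView.image = nil;   // drop the bitmap, keep the cheap empty view
        }
        // else: re-request the image for this tile if it was released earlier
    }
}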
In simple words, this is all I did: I was already downloading images asynchronously and caching them on the device. I then added a UIImage property (imgObject) to my tile class, which holds the downloaded image, and implemented the following, which did the trick for me. And believe it or not, image-view loading vs CATiledLayer: CATiledLayer is 100x faster and more efficient.
+ (Class)layerClass
{
    // Back this view with a CATiledLayer instead of a plain CALayer
    return [CATiledLayer class];
}

- (void)drawRect:(CGRect)rect
{
    // Draw the cached image for this tile; imgObject is the UIImage
    // property that the async download fills in
    [imgObject drawAtPoint:CGPointMake(0, 0)];
}
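One detail the code above glosses over: when the asynchronous download finishes, the tile view has to be told to redraw, and the backing CATiledLayer can be tuned if needed. Here is a small sketch of how that might look in the same tile class; the method name and tile-size values are my assumptions:
// Hypothetical method on the tile class: store the freshly downloaded image
// and trigger a redraw on the main thread so drawRect: runs with it.
- (void)updateWithDownloadedImage:(UIImage *)newImage
{
    imgObject = newImage;
    dispatch_async(dispatch_get_main_queue(), ^{
        [self setNeedsDisplay];
    });
}

// Optional tuning of the backing layer, e.g. in initWithFrame:
// (the values here are illustrative only).
CATiledLayer *tiledLayer = (CATiledLayer *)self.layer;
tiledLayer.tileSize = CGSizeMake(256.0, 256.0);
tiledLayer.levelsOfDetail = 1;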
Hope it helps someone else in the future. Let me know if I can be of more help.

How do I “redraw layers” in Core Graphics?

I remember that when I was reading the Apple documentation, it mentioned that when you call a method such as addSubview, you are adding a “layer of paint,” so to speak, and every time it is called another layer is overlaid.
This should be an easy question to answer, but I had a hard time thinking of keywords to google for, so please excuse the asking of such a simple question.
How do I clear the “layers” of a custom UIView?
My situation, as it may be relevant: I have these “user cards” that are displayed on the screen. They are initialized with some user images. The cards stay the same, but I call a method in my custom UIView (the card UIView) to redraw the images when I want to display a different user. The problem is that some elements of this custom UIView are transparent, and redrawing these images each time builds on that transparency (an obvious problem).
In Core Graphics, what you draw is what gets shown. The painter’s analogy only refers to a single frame. So if you’re using drawRect, you just don’t cache the previous drawing.
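To make that concrete, here's a minimal sketch (mine, not part of the original answer) of a card view whose drawRect: just draws the current state; the userImage property and showUser: method are hypothetical names. Calling setNeedsDisplay discards the previous frame, so transparency never accumulates:
// Hypothetical card view: each drawRect: pass starts from a clean slate,
// so we only ever draw the current user's image.
- (void)drawRect:(CGRect)rect
{
    [self.userImage drawInRect:self.bounds];
}

- (void)showUser:(UIImage *)image
{
    self.userImage = image;     // assumed UIImage property on the card view
    [self setNeedsDisplay];     // triggers a fresh drawRect: pass
}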
But I suspect you’re talking about some UIKit stuff where you’ve added subviews or sublayers. This will remove those leftover views if you just want to clear everything:
for (UIView *view in customView.subviews) {
    [view removeFromSuperview];
}
for (CALayer *layer in customView.layer.sublayers) {
    [layer removeFromSuperlayer];
}
Ryan's code for clearing UIViews is more or less correct, but if you came here from Google looking for how to clear CALayers from a view: I was getting a crash when I fast-enumerated customView.layer.sublayers like Ryan has in his example.
When I switched to a regular indexed loop (walking the array backwards so removals don't shift the remaining indices), it worked for me. Here's a code snippet:
// Walk the sublayers backwards so removing one doesn't shift the
// indices of the layers we haven't visited yet
for (NSInteger i = (NSInteger)yourView.layer.sublayers.count - 1; i >= 0; i--) {
    CALayer *layer = [yourView.layer.sublayers objectAtIndex:i];
    [layer removeFromSuperlayer];
}

CATransform3D with a modified .m34 breaks view hierarchy/ordering in iOS 6, but not the view it was applied to

Foreword: this isn't me losing a view off screen because I did the transform wrong; it's weirder.
The problem is, if I use an .m34 transform to achieve the perspective I need, the view hierarchy breaks; remove the transform and everything draws correctly.
Here's an example.
I have a background image (subviewOne), a menu (subviewTwo), and an object on top of all of that to which I apply the CATransform3D (subviewThree).
Simple code:
CALayer *layer = subviewThree.layer;
CATransform3D perspectiveTransform = CATransform3DIdentity;
perspectiveTransform.m34 = -1.0 / 500;
layer.transform = perspectiveTransform;
Prior to applying this code, the view hierarchy was, and still is on iOS 5:
(bottom to top)
subviewOne->subviewTwo->subviewThree
After applying it, I end up with:
(bottom to top still)
subviewTwo->subviewOne->subviewThree
Now, subviewThree still has the perspective transform applied to it and is in the correct spot, on top of everything else, same as on iOS 5. However, the menu (subviewTwo) is now hidden by the background image (subviewOne), and nothing I do will get it drawn on top of subviewOne. No amount of insertSubview:atIndex:, bringSubviewToFront:, sendSubviewToBack:, etc. will make the view draw correctly.
This is incredibly peculiar, particularly because the views that are drawn out of order are NOT having any kind of CATransform3D applied to them.
I have verified this independently in two different apps and on multiple devices. iOS 5 draws everything correctly, and if I remove those four lines everything draws correctly, but nothing I've tried on iOS 6 stops the .m34 from breaking the view ordering. It's not always as simplistic as the example I've provided, but this is the most demonstrable case I have witnessed.
Has anyone else experienced this, solved this?
Edit: More info for comment.
Yeah, typo with the extra *.
Figure there's an image view, a QuadCurve menu, and a text view.
I was calling the method with the .m34 in the viewDidLoad, but swapped it to the viewDidAppear real quick to check for you.
Doesn't matter. Don't get me wrong, the subviews are listed in the correct order when you call
NSLog(@"%@", [self.view.subviews description]);
They just aren't drawn on screen correctly.
In desperation, I wrote some crazy weird code, and I discovered the following.
I can call the method that draws the menu on a 10 second delay,
[self performSelector:@selector(createQuadCurveMenu) withObject:nil afterDelay:10];
which ends in
[self.view addSubview:menu]
As well as a totally superfluous
[self.view bringSubviewToFront:menu]
and it's still drawn behind an imageView that is set as the lowest subview in the .xib.
I have verified this two ways. I can go into the .xib and set the imageView to hidden, and running again I can see the menu now that the imageView isn't covering it. I can also just comment out the code that applies the .m34 transform to the textView, and the menu then correctly appears on top of the imageView again. Again, none of this happens on iOS 5 or iOS 4.
At this point, I'm starting to think that it's a bug inside iOS6 itself, and have been waiting for the NDA to expire so I can ask here if anyone else has experienced it.
Pretty sure this is an iOS 6 bug: I've blogged about it here: iOS 6 Rendering Bug: 3D-Rotation Causes Layers to Render Without Respect for View Hierarchy.
Good news: You can work around the bug by setting zPositions on the affected layers: set the zPositions in increasing order of the view hierarchy. So if I've understood correctly, you want:
subviewOne.layer.zPosition = 0;
subviewTwo.layer.zPosition = 1000;
subviewThree.layer.zPosition = 2000;
Check out the blog for more info, including a link to the Open Radar version of the bug I've logged with Apple.
Also, this might be considered a duplicate of this post: iOS 6 view hierarchy nightmare. It has the same bug source and solution, although the symptoms you both describe are different.

How to zoom two Images placed side by side, together?

I have two scroll views placed side by side and they can be zoomed individually. I've done this by putting each view inside a scroll view and setting the zoom scale for the scroll views. So far, it works fine! Now there's a new requirement to zoom the two images together, so that if I zoom one image the other is zoomed automatically with the same zoom scale. I was given the Roambi app as a reference, in which two scroll views can be scrolled together by scrolling either one of them, for convenience during comparison. Basically, what I'm doing is also a comparison between the two views. I've gone through the scroll view delegate methods but was unable to achieve the required result. How do I do this?
I've never done this, but off the top of my head I would say you first need to get the zooming the same in both of them (as per above), then use the delegate methods to make sure both of your scroll views have the same contentOffset value, i.e. when one changes via manual or programmatic scrolling, you (using the delegate callbacks) set the other one to the same contentOffset value.
EDIT: As per request, adding a bit of (UNTESTED) code:
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Mirror the offset of whichever scroll view the user moved onto the other one
    if (scrollView == self.myFirstScroller) {
        self.mySecondScroller.contentOffset = self.myFirstScroller.contentOffset;
    }
    else {
        self.myFirstScroller.contentOffset = self.mySecondScroller.contentOffset;
    }
}
and zooming done similarly to above.
But if you're looking for some copy-paste solution you can just drop into your project, I'm afraid you'll have to teach yourself a bit more about scroll views. You should read the Apple Programming Guide, because scrollviews can be a bit tricky, and you often have to use quite a few of the delegate methods to get things working correctly.
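One wrinkle neither answer spells out: setting contentOffset (or zoomScale) programmatically fires the same delegate callbacks on the other scroll view, so it can be worth guarding against the two views re-triggering each other. A rough sketch, using a hypothetical isSyncing flag that is not part of the code above:
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Hypothetical guard so the two scroll views don't keep re-triggering each other
    if (self.isSyncing) {
        return;   // this callback came from our own programmatic update
    }
    self.isSyncing = YES;
    UIScrollView *other = (scrollView == self.myFirstScroller) ? self.mySecondScroller
                                                               : self.myFirstScroller;
    other.contentOffset = scrollView.contentOffset;
    self.isSyncing = NO;
}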
I had implemented something similar a while ago (I did it for buttons). This is how I did it:
Take two UIScrollViews and reference them (I've used firstScrollView and secondScrollView)
Take two UIButtons and reference them (I've used firstImgBtn and secondImgBtn). Set yourself as the delegate of both scroll views and use the following delegate methods:
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    // Return the respective button in the scroll view to be zoomed
    if (scrollView == firstScrollView) {
        return firstImgBtn;
    }
    else {
        return secondImgBtn;
    }
}
- (void)scrollViewDidZoom:(UIScrollView *)scrollView
{
    // Zoom the other scroll view when one has zoomed
    if (zoomTogether) {   // a BOOL deciding whether to zoom the two together or not
        if (scrollView == firstScrollView) {
            secondScrollView.zoomScale = firstScrollView.zoomScale;
        }
        else {
            firstScrollView.zoomScale = secondScrollView.zoomScale;
        }
    }
}
This can be applied to any subclass of UIView; in your case it will be UIImageViews.
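For completeness, here is a minimal sketch of the setup the answer implies; the delegate assignment and the zoom-scale values are my assumptions, not part of the original answer:
// Assumed configuration for the two scroll views referenced above;
// zooming only happens when maximumZoomScale is greater than minimumZoomScale.
for (UIScrollView *scroller in @[firstScrollView, secondScrollView]) {
    scroller.delegate = self;          // so the delegate methods above get called
    scroller.minimumZoomScale = 1.0;
    scroller.maximumZoomScale = 4.0;   // illustrative value
}
[firstScrollView addSubview:firstImgBtn];
[secondScrollView addSubview:secondImgBtn];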

CAShapeLayer -renderInContext: Doesn't Work?

I am able to create a UIImage from a Core Animation layer using the following code:
- (UIImage *)contentsImage
{
    // Render this layer (and its sublayers) into an image context
    UIGraphicsBeginImageContext([self bounds].size);
    [self renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
This code is in my CALayer-derived class. The issue I am running into is that two CAShapeLayers that are child layers of my layer do not get rendered into the resulting image. If I add standard CALayers as children, they get rendered fine. The Apple docs say:
Renders the receiver and its sublayers
into the specified context.
It also says that it's been available since iPhone OS 2.0. Wondering if there is something I'm missing or if I should file a radar.
Any ideas what might keep the child CAShapeLayers from getting drawn to the image?
Thanks.
The CALayer machinery calls renderInContext: to create its bitmapped contents property, but in a CAShapeLayer the path property is not actually rendered to its contents, as noted in the header:
The shape as a whole is composited
between the layer's contents and its
first sublayer.
It stands to reason that renderInContext: won't actually render the CAShapeLayer path onto your context. I haven't actually tried this out for myself, however.
Don't know if it's relevant to you, but there is a note in the CALayer documentation for renderInContext: that says:
**Important**: The Mac OS X v10.5 implementation of this method does not
support the entire Core Animation composition model. QCCompositionLayer,
CAOpenGLLayer, and QTMovieLayer layers are not rendered. Additionally,
layers that use 3D transforms are not rendered, nor are layers that specify
backgroundFilters, filters, compositingFilter, or a mask values.
Future versions of Mac OS X may add support for rendering these layers
and properties.
Anyway, I ran into a similar problem when using the UIView drawRect: method in conjunction with drawing in an image context. The overall UIView that contained subviews would not draw its subviews if I called drawRect: (which actually makes sense now, since the documentation says that if you call drawRect: you are responsible for filling that entire area, regardless of super- and subview implementations). I solved my problem by just calling drawRect: on all my subviews, passing them their own frames.
So I would suggest maybe switching away from renderInContext: and using CALayer's drawInContext: instead. You'll need to override the method since it doesn't do anything by default. Your subclasses will also need to translate the context to their appropriate frames. Also, to be safe, you might want to check that none of the code you add affects the normal rendering of these layers.
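As a rough illustration of that suggestion (a sketch under my own assumptions, not the original poster's code), a CALayer subclass could draw its path itself in drawInContext:, so that the contentsImage method above picks it up:
// Hypothetical layer subclass that draws its own path instead of relying on
// CAShapeLayer; the path and fillColor properties are illustrative, and
// CGPath retain/release is omitted for brevity.
@interface MyShapeLayer : CALayer
@property (nonatomic) CGPathRef path;
@property (nonatomic) CGColorRef fillColor;
@end

@implementation MyShapeLayer

- (void)drawInContext:(CGContextRef)ctx
{
    if (self.path == NULL) {
        return;
    }
    CGContextAddPath(ctx, self.path);
    CGContextSetFillColorWithColor(ctx, self.fillColor);
    CGContextFillPath(ctx);
}

@end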
I filed a radar on this. I can't see any reason in the docs that it shouldn't work. I will respond back here if/when Apple replies to the radar.