I came across a weird scenario in my code the other day:
I have a UIImageView that loads an image with setImage:. The UIImageView is initialized beforehand with a frame that exactly matches the size of the image (41x41 px). I also set the content mode to UIViewContentModeCenter, which should ensure my image is never scaled.
Now when I look at my image view I see that the image is a bit cropped horizontally and it appears a little blurred (this is a telltale sign that some rescaling is happening in the background). If instead I initialize my image view with 1 extra pixel in width, all works perfectly (however my image view is now one pixel wider than my image).
Also, if I initialize my UIImageView with initWithImage:, that seems to work fine too. I checked the view's frame after initialization and found it to be the same size: 41x41.
So the bottom line is that I have a few workarounds for this issue, but I'm trying to understand what's happening here. The only explanations I can think of are:
There is a bug in the framework
The rendering of images doesn't work well for images of a particular size. I know that, for example, textures are always powers of 2, though I doubt this has much significance for UIImageView.
For the record I'm compiling for OS 3.0 and the issue happens on both simulator and device.
Is the UIImageView contained inside any other views? One thing I've run into before is blurred controls caused by fractional pixel offsets: if you calculate a frame using division and end up with a non-integer value, UIKit can add some strange blurring at certain offsets and sizes.
So the image view may have an integer size, but its absolute frame, once its parent(s) are taken into account, may not.
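One way to guard against this is to snap any computed frame to whole-point values before assigning it. A minimal sketch, assuming a parentView and an imageView (the names are illustrative):

CGRect computedFrame = CGRectMake(parentView.bounds.size.width / 2.0 - image.size.width / 2.0,
                                  parentView.bounds.size.height / 2.0 - image.size.height / 2.0,
                                  image.size.width, image.size.height);
// CGRectIntegral floors the origin and ceils the size, so the view lands on
// whole-pixel boundaries (it may grow the rect by up to a point).
imageView.frame = CGRectIntegral(computedFrame);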
Related
I have a UIImageView which may contain images with different aspect ratios.
I have studied the properties of UIImageView and it seems that the Aspect Fill property set for the Mode of the View should be what I need.
It works as intended when the image is wider than it is tall: the left and right parts of the image are cut off outside the UIImageView.
However, if the image is taller than it is wide, the top and bottom parts are not cut off; the image extends beyond its view and overlaps the views below it.
These are the constraints I have for the UIImageView:
I have seen several tutorials proceeding like I do, but the result is different for me.
If the image is fitting correctly but is extending outside the bounds of the image view, try turning on "clip subviews".
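The same fix in code, as a minimal sketch assuming an outlet named imageView:

imageView.contentMode = UIViewContentModeScaleAspectFill;
imageView.clipsToBounds = YES; // equivalent to checking "Clip Subviews" in Interface Builder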
For this situation, two possible solutions may help:
Calculate the aspect ratio based on your required dimensions so that picture quality stays intact (an aspect-fit calculation gives you the required ratio; see the sketch after this list).
Use the Center or Aspect Fit content mode of the image view.
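A minimal sketch of such an aspect-fit calculation (the names are illustrative, and this is only one way to do it):

CGSize imageSize = image.size;
CGSize boundsSize = imageView.bounds.size;
// Scale by the smaller ratio so the whole image fits inside the bounds.
CGFloat scale = MIN(boundsSize.width / imageSize.width,
                    boundsSize.height / imageSize.height);
CGSize fittedSize = CGSizeMake(imageSize.width * scale, imageSize.height * scale);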
Thanks
I have a big UIImage (2000x2000). The image is drawn every time the app starts and copied to a CALayer.
Currently I put a UIScrollView on the main view and create a CALayer with the drawn image.
Scrolling at close zoom levels looks fine, but at minimum zoom, when the whole image is visible, scrolling slows down and stops responding quickly to touch movement.
So, the question: what can I do to improve scrolling performance?
The approach I would take is to use a lower resolution version of your image at lower zoom levels (lower = zoomed out).
First, see this post for resizing UIImages.
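In case that post is unavailable, here is a rough sketch of one common resizing approach (not necessarily the one described there); it assumes iOS 4+ for UIGraphicsBeginImageContextWithOptions:

- (UIImage *)resizedImage:(UIImage *)image toSize:(CGSize)newSize {
    // Render the image into a smaller bitmap context and return the result.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resized;
}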
Respond to the scrollViewDidEndZooming:withView:atScale: method in UIScrollViewDelegate, and switch the images when a certain zoom level is reached. This will take some trial and error to find the correct balance. You may even want to render your image at several different resolutions. Be sure to generate the different sized UIImages in advance so there is no delay while zooming.
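A hedged sketch of what that delegate method might look like; the 0.5 threshold and the image/property names are assumptions, not part of the original answer:

- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView withView:(UIView *)view atScale:(float)scale {
    // Swap to a pre-rendered low-resolution copy when zoomed far out,
    // and back to the full-resolution image otherwise.
    if (scale < 0.5) {
        self.imageView.image = self.lowResImage;
    } else {
        self.imageView.image = self.fullResImage;
    }
}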
I have noticed that when placing PNG images into a view using IB and/or animating those images to various positions around a view, the image can sometimes get a slight blur.
In most cases I can remedy the blur by adding .5 of a pixel to the image's position.
[lbLiteButton.layer setPosition:CGPointMake(140.5,159.5)];
Sometimes I have to adjust both x and y like above. Sometimes I only have to adjust x or y.
I remember reading somewhere that this has to do with the size of the image, how Core Animation works, and something to do with half pixels... but I can't find the article anywhere!
The problem with the ".5 pixel" solution is that it's different for every PNG image depending on its size, so you can't reuse a custom animation; you have to customise it for each different image.
Is there a way to ensure that no matter where I place or animate my image, I won't get any blurred positions?
Does anyone have any information on this?
Thank You!
The position property of a view's layer is based on its anchorPoint property. By default, that is (0.5, 0.5), meaning that the anchor point of the layer is at its center. If your view (and its layer) are an odd number of pixels wide or high, setting an integral value for the position will cause the view's origin to be non-integral, leading to the blurriness you see.
To work around this, you could figure out an integral version of your position by taking the desired center position of the view, subtracting half of the view's width, rounding that value, then adding half of the view's width, and repeating for the height. You could also set the anchorPoint for your view's layer to (0,0) and position the view based on its origin.
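A rough sketch of the first (rounding) approach; the view name and the example center point are purely illustrative:

CGPoint desiredCenter = CGPointMake(140.5, 159.5);
CGFloat halfWidth = view.bounds.size.width / 2.0;
CGFloat halfHeight = view.bounds.size.height / 2.0;
// Subtract half the size, round to get an integral origin, then add it back.
CGPoint alignedCenter = CGPointMake(roundf(desiredCenter.x - halfWidth) + halfWidth,
                                    roundf(desiredCenter.y - halfHeight) + halfHeight);
view.layer.position = alignedCenter;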
There's a chance that this might also be related to a misalignment of a superview. To diagnose this, you could use the Core Animation instrument in Instruments, and select the Color Misaligned Images option. It should color any views or layers that are non-pixel-aligned in your application.
I had a similar experience with blurred text in a label, and it was caused by the superview of my labels having a subpixel offset. So even though the location within that view was integral, when adjusted to its parent's coordinates it had a half pixel or so offset, causing the blur.
If you're getting this only sometimes, though, that might not be the case. Is your superview moving around, or positioned strangely? I'd say the best thing to do is to figure out the precise circumstances under which this is happening.
IB has a bug where sometimes (not often) just moving elements around will render them fuzzy - you see it most often with UILabels and UIImageViews (although that's probably just what is most apparent). I'm sure it has to do with the points mentioned above in some way, but the fix is often to set the location (x,y) coordinates for the element to 0,0, and then back to the original values. This usually resolves the issue (again, this is in IB).
The image on the right is the one that I produced in Photoshop. I then stripped all the text and put it in an image view; as soon as I did that there was a change in colour and the vertical line lost its sharpness. Has anyone else run into a similar problem? What do I do?
(screenshot: http://grab.by/1DuZ)
Are the dimensions correct? Is the position of the image an integer? If not, antialiasing will slightly blur your image.
One thing to be careful of is that if your image is an odd number of pixels in either dimension then centering it onscreen will cause it to be misaligned. Imagine if you had a 1x1 image (just one pixel) and tried to center it perfectly onscreen. It can't be done because the screen is an even number of pixels wide and high. This is why it's best to always use even dimensioned images whenever possible.
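To make the arithmetic concrete, a small illustrative example (the 41 pt width and the 320 pt screen width are just for the sake of the numbers):

// Centering a 41 pt wide image in a 320 pt wide screen:
//   originX = (320 - 41) / 2 = 139.5  ->  the image straddles pixel boundaries.
// Rounding the origin (or using an even-sized image) keeps it aligned:
CGRect frame = imageView.frame;
frame.origin.x = roundf(frame.origin.x);
frame.origin.y = roundf(frame.origin.y);
imageView.frame = frame;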
I am developing a game in which I want to magnify the part of an image where a magnifier image is placed.
For that I am using masking. After masking I zoom the image, but it looks blurry. I want the image to be clearer, as if we were looking through a rifle scope. If anyone has a solution, kindly reply.
Are you sure that the problem is the masking?
Perhaps your resources are too low resolution? High-resolution images scaled down always look better than low-resolution images scaled up.
Maybe you need to look at the problem backwards... so that your image, when viewed through the rifle magnifier [scope?], is shown at 1:1 resolution, and when not viewed through the scope it is zoomed out (1:2 resolution?). That way your "normal" mode is the zoomed-out mode and the "magnified view" is actually just the image at 1:1.
If you have a UIImage whose size is 293x184 but you create a UIImageView with an initial size of 40x30, the iPhone SCALES the UIImage to fit according to its contentMode property. The default contentMode is UIViewContentModeScaleToFill, which scales your image.
So even though you started with a large image, it is now only 40x30 and rendered at 40x30. When you zoom, it is STILL 40x30 but rendered at some larger size, which is what causes the blur.
One solution would be to replace the image after the zoom; then you would have a completely new UIImage at full resolution.
// Assuming imageView is the UIImageView being zoomed (self.view itself would not respond to setImage:):
[self.imageView setFrame:reallyBigFrame];
[self.imageView setImage:newUIImage];
Another would be to initially place the UIImage in a full-size 293x184 UIImageView, then use an affine transform to scale it down:
view.transform = CGAffineTransformScale(view.transform, 0.25, 0.25);
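In context, that might look roughly like this; the imageView name and the moment you reset the transform are assumptions:

// While zoomed out, show the full-resolution image view scaled down.
imageView.transform = CGAffineTransformMakeScale(0.25, 0.25);

// When the user zooms in, remove the transform so the image is rendered
// at its full native resolution again.
imageView.transform = CGAffineTransformIdentity;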