I am trying to stretch a UIImage with the following code:
UIImage *stretchyImage = [[UIImage imageNamed:@"Tag@2x.png"] stretchableImageWithLeftCapWidth:10.0 topCapHeight:0.0];
UIImageView *newTag = [[UIImageView alloc] initWithImage:stretchyImage];
The image before stretching looks like this:
And after, looks like this:
Why hasn't the stretching worked properly? The corners have all gone pixelated and look stretched, when in fact only the middle should be stretched. FYI: I am running this app on iOS 6.
Your implementation doesn't work because of the values you pass to the stretchableImageWithLeftCapWidth:topCapHeight: method.
First of all, stretchableImageWithLeftCapWidth:topCapHeight: is deprecated (since iOS 5). The replacement API is resizableImageWithCapInsets:.
The image has non-stretchable parts on the top, bottom, and right sides. What you told the API was: "keep 10 points on the left as a fixed cap and stretch the rest to whatever size I give you".
Since you have a non-repeatable custom shape on the right side, in both height and width, that piece should be kept as a whole.
So the top cap should be the full height of the image (to preserve the shape of the thing on the right side), the left cap should be ~20 points (for the rounded rectangle corners), the bottom cap can be 0 since the top cap already covers the whole image height, and the right cap should be the width of the custom orange shape on the right side (which I take as ~40 points).
You can play with the cap values and achieve a better result.
UIImage *image = [UIImage imageNamed:@"Tag"];
UIImage *resizableImage = [image resizableImageWithCapInsets:UIEdgeInsetsMake(image.size.height, 20, 0, 40)];
Should do the trick.
Also, -imageNamed: works fine when you drop the file extension and the @2x suffix.
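For completeness, a minimal sketch of putting the resizable image to use; the frame width here is just a placeholder, and the key point is that the image view needs to be wider than the source image before any stretching becomes visible:
UIImage *image = [UIImage imageNamed:@"Tag"];
UIImage *resizableImage = [image resizableImageWithCapInsets:UIEdgeInsetsMake(image.size.height, 20, 0, 40)];
UIImageView *newTag = [[UIImageView alloc] initWithImage:resizableImage];
// Only the middle section stretches horizontally; the caps keep their original pixels.
newTag.frame = CGRectMake(0, 0, 200, image.size.height);
[self.view addSubview:newTag];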
The short version: How do I know what region of a UIImageView contains the image, and not aspect ratio padding?
The longer version:
I have a UIImageView of fixed size as pictured:
I am loading photos into this UIViewController, and I want to retain the original photo's aspect ratio, so I set the contentMode to Aspect Fit. This ensures that the entire photo is displayed within the UIImageView, but with the side effect of adding some padding (shown in red):
No problem so far.... But now I am doing face detection on the original image. The face detection code returns a list of CGRects which I then render on top of the UIImageView (I have a subclassed UIView and laid out an instance in IB with the same size and offset as the UIImageView).
This approach works great when the photo is not padded out to fit into the UIImageView. However, if there is padding, it introduces some skew, as seen here in green:
I need to take the image padding into account when rendering the boxes, but I do not see a way to retrieve it.
Since I know the original image size and the UIImageView size, I can do some algebra to calculate where the padding should be. However it seems like there is probably a way to retrieve this information, and I am overlooking it.
I do not use image views often, so this may not be the best solution. But since no one else has answered the question, I figured I'd throw out a simple mathematical solution that should solve your problem:
UIImage *selectedImage; // the image you want to display
UIImageView *imageView; // the image view holding selectedImage
CGFloat viewHeight = imageView.frame.size.height;
CGFloat viewWidth = imageView.frame.size.width;
// With Aspect Fit the image is scaled by the smaller of the two ratios,
// so the height it occupies on screen can differ from its native height.
CGFloat scale = MIN(viewWidth / selectedImage.size.width, viewHeight / selectedImage.size.height);
CGFloat displayedHeight = selectedImage.size.height * scale;
CGFloat yStartingLocationForGreenSquare; // set it to whatever the current location is
// add the top padding introduced by Aspect Fit to whatever you had it set to
yStartingLocationForGreenSquare += (viewHeight - displayedHeight) / 2;
So although there may be other solutions, this is a pretty simple math formula to accomplish what you need. Hope it helps.
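As a side note that isn't part of the original answer: if you'd rather not do the math yourself, AVFoundation has a helper that computes the aspect-fit rectangle for you, and the padding falls out of that rect. A rough sketch, reusing the selectedImage and imageView names from above and assuming AVFoundation is linked:
#import <AVFoundation/AVFoundation.h>
// Rect the image actually occupies inside the image view under Aspect Fit.
CGRect displayedRect = AVMakeRectWithAspectRatioInsideRect(selectedImage.size, imageView.bounds);
CGFloat topPadding = CGRectGetMinY(displayedRect);
CGFloat leftPadding = CGRectGetMinX(displayedRect);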
I have a UIImageView with a frame of (0, 0, 568, 300) containing a UIImage with a native size of 512x384 pixels. The contentMode is set to Aspect Fit.
If the user double-taps on the view, I change the size of the UIImageView with the following code:
self.imageViewerViewController.imageView.frame = CGRectMake(0, -63, 568, 426);
The result is that the right edge of the image is distorted; it does not scale properly to the new size.
Attached is an image with a black and white matrix, the distortion is on the right.
It seems that the right column of pixels is repeated to the right edge of the view.
Can anyone help?
I have changed the creation of the affected UIImageView from a xib file to creating it in code, using -initWithFrame: with the maximum size of the scaled image for the frame. Now the image is scaled properly.
Obviously the UIImageView in the xib file had been initialized with a 512x384 pixel frame.
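For reference, a minimal sketch of what that looks like in code; the frame values are the ones from the question, and the image name is a placeholder:
// Create the image view in code with the largest frame the scaled image will ever occupy,
// instead of letting the xib initialize it at the image's native 512x384 size.
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, -63, 568, 426)];
imageView.contentMode = UIViewContentModeScaleAspectFit;
imageView.image = [UIImage imageNamed:@"checkerboard"]; // placeholder image name
[self.view addSubview:imageView];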
Ever since I updated my Xcode, the MinimumTrackImage on my UISlider has been stretching, when before it was clipping like I wanted it to. The MaximumTrackImage's behavior didn't change.
How can I get the MinimumTrackImage to not stretch? Note that I use RubyMotion, but a solution in Objective-C is also acceptable.
I'm guessing here (you are allowed to guess on Stack Overflow as long as you're honest about it)... There is a new iOS 6 feature for images, and perhaps it is getting in your way here. You can set the resizingMode and capInsets for an image. Try this:
// Objective-C
UIImage *newImage = [oldImage resizableImageWithCapInsets:UIEdgeInsetsZero resizingMode:UIImageResizingModeTile];
# RubyMotion
newImage = oldImage.resizableImageWithCapInsets(UIEdgeInsetsZero, resizingMode:UIImageResizingModeTile)
If you need to adjust the insets as well, replace UIEdgeInsetsZero with UIEdgeInsetsMake(top, left, bottom, right) where top, left, bottom, and right are floats. In RubyMotion, I believe you can just use [top, left, bottom, right].
Info came from here:
http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIImage_Class/Reference/Reference.html
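As a follow-up the answer doesn't spell out, the adjusted image would then be assigned back to the slider's minimum track, roughly like this (the slider variable and asset name are assumptions):
UIImage *trackImage = [UIImage imageNamed:@"minTrack"]; // placeholder asset name
UIImage *tiledImage = [trackImage resizableImageWithCapInsets:UIEdgeInsetsZero resizingMode:UIImageResizingModeTile];
// Tile instead of stretch on the minimum track.
[slider setMinimumTrackImage:tiledImage forState:UIControlStateNormal];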
I am drawing a UILabel to a UIImage.
The UILabel has a width of 193.5 while the resulting UIImage from the code below is 194 wide. Why is this?
UIGraphicsBeginImageContextWithOptions(label.bounds.size, YES, [[UIScreen mainScreen] scale]);
[label.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
How'd you manage to get half a pixel in the width of a UILabel? It probably saw that and rounded it up automatically to 194.
Consider this: there is really no way to render half a pixel. I guess you could technically blend one pixel with another, but you still need a full pixel there, so it has to round up. I'd consider it a mistake to have a non-integer value for any UI element coordinate; you'll end up with a blurry-looking UI. So try to figure out how to fix the half-pixel issue.
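One way to deal with it, offered here only as a sketch and not as part of the original answer: snap the label's frame to whole points before rendering, for example with CGRectIntegral, and then take the snapshot as in the question:
// Round the frame outward to integral values so the snapshot size is a whole number.
label.frame = CGRectIntegral(label.frame);
UIGraphicsBeginImageContextWithOptions(label.bounds.size, YES, [[UIScreen mainScreen] scale]);
[label.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();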
How would I go about creating a UITextField like the one in this image?
It appears to be slightly larger, specifically in height.
This can be done much better and more simply with a stretchable image:
UIImage *fieldBGImage = [[UIImage imageNamed:@"input.png"] stretchableImageWithLeftCapWidth:20 topCapHeight:20];
[myUITextField setBackground:fieldBGImage];
Think of the background of the text field as split into three sections: a middle section that can be stretched, and caps on the ends. Create an image that only needs to be long enough to contain one pixel of this repeating section (a very short text field), and then create a stretchable image from it using stretchableImageWithLeftCapWidth:topCapHeight:. Pass the width of the left end cap into leftCapWidth. You can make it stretch vertically as well, but if your background image is the same height as your text box it won't have an effect.
If you're familiar with 9-slice scaling in Flash or Illustrator it's exactly the same concept, except the middle sections are only one pixel wide/tall.
The advantage of this is that you don't have to worry about multiple layered objects scaling together, and you can resize your text fields at any time and the background will stay intact. It works on other elements too!
You probably want to use the background and borderStyle properties of UITextField. One approach would be to set borderStyle to UITextBorderStyleNone and create a custom background image to be stretched and used as the background property.
I suggest taking a look at those properties in the UITextField class reference.
You can do this by:
yourTextField.borderStyle = UITextBorderStyleNone;
yourTextField.background = [UIImage imageNamed:@"email-input.png"];
And if you want to give your text some margin inside the text field, you can do it like this:
// Setting Text Field Margins to display the user entered text Properly
UIView *paddingView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 5, 20)];
yourTextField.leftView = paddingView;
yourTextField.leftViewMode = UITextFieldViewModeAlways;
You can also do this through Interface Builder.
Just use the following line and it should work:
textfield_name.background = [UIImage imageNamed:@"yourImage.png"];
Here, "yourImage" is the background image you want to set.
However, this will only work if the text field's border style isn't rounded rect, so either change the border style in Interface Builder or use:
textfield_name.borderStyle = UITextBorderStyleNone; // or UITextBorderStyleBezel
and you're good to go.
Take a UIImageView and set its image property to the image you want as the UITextField background. Then, on top of the UIImageView, place a UITextField with border style set to None. You can do this directly in Interface Builder.
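If you would rather do it in code than in Interface Builder, a rough sketch (the frames and image name are assumptions):
// Place an image view behind a borderless text field to act as its background.
UIImageView *backgroundView = [[UIImageView alloc] initWithFrame:CGRectMake(20, 100, 280, 40)];
backgroundView.image = [UIImage imageNamed:@"textFieldBackground"]; // placeholder asset name
[self.view addSubview:backgroundView];
UITextField *textField = [[UITextField alloc] initWithFrame:backgroundView.frame];
textField.borderStyle = UITextBorderStyleNone;
[self.view addSubview:textField];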