I have image files with dimensions of 1900x1200. In my code I try to load one like this:
UIImage *image = [UIImage imageNamed:imageName];
When I run this code in the iPhone Simulator (Retina Display), my images look out of proportion. I tried printing
image.size.width and image.size.height
and the values I get are 950x600.
What am I doing wrong? Please help me out.
The answer is actually simple: the UIImageView (and the underlying UIImage) is using a scale factor of 2.0. That is, your 1900x1200-pixel image corresponds to a 950x600-point image with scale factor 2 on a Retina display. You can double-check the UIImage's scale property.
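For what it's worth, a quick way to see both values together (just a sketch, using the same imageName variable from the question):
UIImage *image = [UIImage imageNamed:imageName];
NSLog(@"size in points: %@, scale: %.1f", NSStringFromCGSize(image.size), image.scale);
// pixel dimensions = points * scale, i.e. 950 * 2 = 1900 and 600 * 2 = 1200
NSLog(@"size in pixels: %.0f x %.0f", image.size.width * image.scale, image.size.height * image.scale);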
I'm trying to save a screenshot of my iPhone screen, but the result is coming out at 1x scale.
For example, my screenshot comes out at 320x480 pixels, and when it is displayed on a Retina display it looks fuzzy. But if I take a screenshot with the home and power buttons, the resulting image is 640x960 and looks perfect on a Retina display. How can I take screenshots that take the screen's scale factor into account?
Thank you!
Try this:
// Use a scale-aware context on devices that support it
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
else
    UIGraphicsBeginImageContext(self.window.bounds.size);
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImagePNGRepresentation(image);
[data writeToFile:@"foo.png" atomically:YES];
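Note that on a real device a bare file name like @"foo.png" won't point anywhere writable; you would normally build a path into the Documents directory first (a sketch, the file name is just an example):
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"foo.png"];
[data writeToFile:path atomically:YES];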
I have an issue with masking images. I'm making a puzzle game and have to create custom-shaped images. I found and tried two ways of custom cropping:
1. Using the CALayer.mask property.
2. Using a CGImage mask (CGImageMaskCreate()).
In the first option I create my custom path, assign it to the CAShapeLayer.path property, then assign the CAShapeLayer to the CALayer.mask property. In the end I have a custom-cropped image.
In the second option I first use CGImageMaskCreate() (using previously created black mask images of the puzzle pieces), then CGContextClipToMask().
With either option I have performance problems (mostly when I crop the image into 16 puzzle pieces and drag them over the screen).
Are there any other approaches to cropping an image in a custom way?
(I don't know how to solve the performance problem.)
Thanks in advance.
There are lots of UIImage categories out there you can use for this. Give me a moment and I'll post some links here:
Cropping an UIImage (not really a category though, but it'll fit)
UIImage: Resize, then Crop
https://sites.google.com/a/injoit.com/knowledge-base/for-developers/graphics/uiimage-routines-scaling-cropping-rotating-etc
http://www.hive05.com/2008/11/crop-an-image-using-the-iphone-sdk/
http://maybelost.com/2010/11/cropping-a-uiimage-on-iphone/
Try this:
-(UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return cropped;
}
...
UIImage *temp_image = [self imageByCropping:original_image toRect:clipping_rectangle];
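One thing to watch out for: CGImageCreateWithImageInRect works in the image's pixel coordinates, so for a Retina (scale 2.0) image you may need to scale the crop rect first. A sketch, reusing original_image and clipping_rectangle from above:
CGFloat scale = original_image.scale;
CGRect pixelRect = CGRectMake(clipping_rectangle.origin.x * scale,
                              clipping_rectangle.origin.y * scale,
                              clipping_rectangle.size.width * scale,
                              clipping_rectangle.size.height * scale);
UIImage *retina_cropped_image = [self imageByCropping:original_image toRect:pixelRect];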
Maybe you should consider drawing the image into a new image with an alpha background and then drawing that over the current background. I mean: all pixels inside the jigsaw piece keep their normal colour, and all pixels outside the jigsaw piece become transparent. Then blend it onto, or draw it over, the new background.
Just my 2 cents. :)
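A minimal sketch of that idea, assuming you already have the piece's outline as a UIBezierPath (piecePath here) and the source image (sourceImage); the piece is rendered once into a transparent image, so no masking work has to happen while the piece is being dragged:
UIGraphicsBeginImageContextWithOptions(piecePath.bounds.size, NO, sourceImage.scale);
CGContextRef ctx = UIGraphicsGetCurrentContext();
// Shift the context so the piece's bounds start at the origin
CGContextTranslateCTM(ctx, -piecePath.bounds.origin.x, -piecePath.bounds.origin.y);
[piecePath addClip];
// Draw the full image; only the clipped region ends up in the result
[sourceImage drawAtPoint:CGPointZero];
UIImage *pieceImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();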
I am loading images of size 1800x1300 into UIImages (getting the images from a server), then I am adding those images to a UIScrollView. While scrolling through my images, the app crashes. Is it an image size issue? Thanks in advance.
UIImage *image = [UIImage imageNamed:@"image.png"];
NSData *imgData = UIImageJPEGRepresentation([UIImage imageWithCGImage:[image CGImage]], 0.5);
UIImage *compressedImage = [UIImage imageWithData:imgData];
You should definitely scale your images down, significantly, especially for the iPhone screen size.
You should also compress your images as much as you can without affecting the quality. Your image, I am assuming, is somewhere around 1.5–2 MB, so it will use up an extreme amount of memory and cause your app to crash every time.
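A rough sketch of scaling the downloaded image down before adding it to the scroll view (the target size here is just an example, roughly preserving the 1800x1300 aspect ratio; pick whatever fits your layout):
CGSize targetSize = CGSizeMake(320, 231);
UIGraphicsBeginImageContextWithOptions(targetSize, NO, [UIScreen mainScreen].scale);
[image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();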
Currently I am trying to show a simple table in my iPhone application, where I use UITableViewCells with the style UITableViewCellStyleValue1 (image on the left, detail label right-aligned). The cells all have the default height (50.0f). Before I add an image to a cell, I resize the image to 40x40 so that it does not fill the total height of the cell (I think that looks ugly).
I do this with this code:
cell.imageView.image = [UIImage imageNamed:@"icon.png"];
cell.imageView.image = [RootViewController imageWithImage:cell.imageView.image scaledToSize:CGSizeMake(40, 40)];
This is all very nice and works flawlessly. But I also want to accomplish this on the iPhone 4 (with the higher-resolution screen). The problem is that everything scales without problems on the iPhone 4, but the images appear very pixelated.
The reason for this is of course that everything on the screen is blown up to scale to the new resolution, including the images, so the images should probably be something like 80x80. But when I resize them to 80x80 (the originals are 120x120) they appear way too big, because of that same scaling.
Is there a way to make my images not fill the complete height of a table cell, but still have them at the higher resolution on the iPhone 4? Should I create a completely new view for this?
Oops, after the first reply I realised that my own function was missing:
+ (UIImage*)imageWithImage:(UIImage*)image scaledToSize:(CGSize)newSize
{
UIGraphicsBeginImageContextWithOptions(newSize, NO, [[UIScreen mainScreen] scale]);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
As you can see, after the first reply, I tried to get this to work with the method UIGraphicsBeginImageContextWithOptions but somehow this results in an empty image.
I assume you wrote "imageWithImage:scaledToSize:", right?
I further assume you use "UIGraphicsBeginImageContext(yourSize)" within this call. Replace that with "UIGraphicsBeginImageContextWithOptions(yourSize, NO, 2.0)" if your platform is the iPhone 4.
The "2.0" defines the scale factor for points (you define the size in points, not in pixels). On a pre-Retina display a point is 1x1 pixel; on a Retina display a point is 2x2 pixels.
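In other words (a sketch; passing 0.0 as the scale lets iOS pick the main screen's scale automatically, so you don't have to hard-code 2.0):
// Before:
UIGraphicsBeginImageContext(newSize);
// After (0.0 means "use the device's screen scale"):
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);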
Edit:
Make sure you have a high-res version of "icon.png" in your resources called "icon@2x.png". It is loaded automatically when the device has a Retina display.
How can I save a stretched image?
I have a small (50x50) image that I have stretched while displaying it in a UIImageView (by choosing scale-to-fit mode). Now I need to fetch and save that stretched image.
I have tried using the UIImageView's image property, but it gives me the original image, not the stretched image.
So, do you have any idea how to solve this problem? If so, please help me, guide me, or at least give me a hint.
Hoping for your reply.
Thanks
You have to create an image by copying the bitmap of the view's CGContext (Core Graphics context). This has been asked on Stack Overflow before, but I can't find the answer right now.
You need to render your UIImageView into a graphics context you create:
UIGraphicsBeginImageContext(yourUIImageView.frame.size);
[yourUIImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//get the PNG representation of the UIImage (but you could do the same with JPEG)
NSData *myData = UIImagePNGRepresentation(screenshot);
//then save it to a file
[myData writeToFile:thePathToWriteYourFileAsNSString atomically:YES];
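If you want the saved image at Retina resolution, the same trick from the earlier answers applies here as well; a sketch of the only line that changes:
// 0.0 means "use the screen's scale factor"
UIGraphicsBeginImageContextWithOptions(yourUIImageView.frame.size, NO, 0.0);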