Determining Dimensions of Images in iPhone SDK - iphone

Simply, is it possible to somehow capture the dimensions (specifically, width and height) of an image using either the iPhone SDK or an iPhone SDK-compatible library/framework? Perhaps pull the dimensions from the EXIF metadata?

From the Apple API for UIImage, create a UIImage from your image data or file, then use the size property to get a CGSize structure representing the image's dimensions.

If your UIImage is named image, you can get its size and then read the width and height from it like this:
CGSize mySize = [image size];
CGFloat imageWidth = mySize.width;
CGFloat imageHeight = mySize.height;
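
If you'd rather pull the dimensions straight from the file's metadata without decoding the whole bitmap (closer to the EXIF approach the question mentions), a minimal sketch using the ImageIO framework could look like this. The file URL here is just a placeholder for your own image path:

#import <ImageIO/ImageIO.h>

// Placeholder URL; point this at your own image file.
NSURL *photoURL = [NSURL fileURLWithPath:@"/path/to/photo.jpg"];
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)photoURL, NULL);
if (source) {
    // Copies the image properties (including pixel dimensions) without decoding the image data.
    CFDictionaryRef props = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    if (props) {
        NSNumber *pixelWidth = (__bridge NSNumber *)CFDictionaryGetValue(props, kCGImagePropertyPixelWidth);
        NSNumber *pixelHeight = (__bridge NSNumber *)CFDictionaryGetValue(props, kCGImagePropertyPixelHeight);
        NSLog(@"%@ x %@ pixels", pixelWidth, pixelHeight);
        CFRelease(props);
    }
    CFRelease(source);
}

Note that UIImage's size property is reported in points, while the ImageIO properties above are pixel dimensions, so the two can differ on retina devices.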

Related

How do I save a section on my screen to the users images (in swift)?

I want my user to be able to upload some images into a little square, and then I want all of them to be saved into one image on the user's iPhone.
I'm basically making an app that combines the user's pictures side by side (there are a ton of apps like that, but I want to learn how they work), and then saves the whole thing as an image on their phone.
Save all your images in an array (arrImages) and use the following method to merge them:
- (UIImage *)mergeImages:(NSArray *)arrImages {
    float width = 2024;  // width of the merged image
    float height = 2024; // height of the merged image
    CGSize mergedImageSize = CGSizeMake(width, height);
    float x = 0;
    float y = 0;
    UIGraphicsBeginImageContext(mergedImageSize);
    for (UIImage *img in arrImages) {
        // each image gets an equal-sized slot, drawn one after the other
        CGRect rect = CGRectMake(x, y, width / arrImages.count, height / arrImages.count);
        [img drawInRect:rect];
        x = x + (width / arrImages.count);
        y = y + (height / arrImages.count);
    }
    // returns an image based on the contents of the current bitmap-based graphics context
    UIImage *mergedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return mergedImage;
}
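
To then save the merged result to the user's photo library, as the question asks, a hedged sketch (assuming arrImages holds the user's pictures and this runs inside the same class) could be:

UIImage *combined = [self mergeImages:arrImages];
// Target/selector can be non-nil if you want a completion callback when the save finishes.
UIImageWriteToSavedPhotosAlbum(combined, nil, nil, NULL);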

is retina working in my developer iphone?

I'm testing my app on an iPhone 5.
I duplicated an image, naming the copies image1.png and image1@2x.png, and tried this in some part of my code:
UIImage *myImage = [UIImage imageNamed:@"image1.png"];
CGFloat imageWidth = myImage.size.width;
CGFloat imageHeight = myImage.size.height;
NSLog(#"image %f %f", imageWidth,imageHeight);
CGRect screenBound = [[UIScreen mainScreen] bounds];
CGSize screenSize = screenBound.size;
CGFloat screenWidth = screenSize.width;
CGFloat screenHeight = screenSize.height;
NSLog(#"screen %f %f", screenWidth, screenHeight);
and when running on the iPhone, I see this in the console:
2013-04-05 13:13:48.386 Vallabici[2413:907] image 320.000000 57.000000
2013-04-05 13:13:48.389 Vallabici[2413:907] screen 320.000000 568.000000
as if it were using a normal screen instead of retina.
How can that be?
The size of the image should not change on the retina device, just the scale.
To take the scale of the image add the following log:
NSLog(#"scale %f", myImage.scale);
bounds returns the size of the screen in points, not pixels
(see the documentation), and the same is true of size.
Screen geometry is always measured in points, and a point on a retina display covers the same physical area as a point on a non-retina display;
in code you'll always get a screen of 320x480 points on iPhones before the iPhone 5.
The difference is how images are rendered: on a retina display they are rendered at double resolution.
It's like a printer: you can print a small image onto an A4 sheet so it fills the page, and then print a bigger image the same way.
Images printed on paper always have the same measurements (in inches or centimetres), but the quality of the result changes.
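
If you want to confirm from code that the @2x variant is actually being picked up, one quick check (these are standard UIKit properties, nothing specific to this project) is to compare the image scale against the screen scale:

UIImage *myImage = [UIImage imageNamed:@"image1.png"];
// On a retina device with image1@2x.png present, both values should log as 2.0.
NSLog(@"image scale %f, screen scale %f", myImage.scale, [[UIScreen mainScreen] scale]);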

how to set UIImageView width with respect to the original image size ratio

In my iPhone app I need to set an image in the image view.
The image view height is always 100 pixels.
We need to set the width of the image view with respect to the original image's aspect ratio.
I.e., suppose the width x height of an image is 300x400 (original, 3:4);
our displayed image view size would be 75x100.
If the width x height of an image is 512x512 (original, 1:1),
our displayed image view size would be 100x100.
If the width x height of an image is 128x256 (original, 1:2),
our displayed image view size would be 50x100.
In all those cases the image view height should stay at 100 pixels, but the width needs to vary with respect to the image ratio.
And finally the image view should be centered in the view.
How do I achieve this?
There are a couple of ways you can achieve this. The simplest is to set the scaling mode on the image view to Aspect Fit in Interface Builder, or in code:
imageView.contentMode = UIViewContentModeScaleAspectFit;
The second way is to calculate the aspect ratio of the image and set the width. Something like:
UIImage *myImage = [UIImage imageNamed:#"MyImage"];
float aspectRatio = myImage.size.width / myImage.size.height;
float viewWidth = 100 * aspectRatio;
Well, it's up to your logic.
You can get UIImage height and width.
UIImage *image = [UIImage imageNamed:@"anyImage.png"];
float imageHeight = image.size.height;
float imageWidth = image.size.width;
After that you can calculate the image view width according to your requirement:
float value = imageHeight/100.0;
float iV_Width = imageWidth/value;
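
To also satisfy the centering requirement from the question, a rough sketch (assuming the superview is self.view and the calculated width is iV_Width, as above) could be:

UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
imageView.frame = CGRectMake(0, 0, iV_Width, 100.0);
// Centre the image view horizontally and vertically in its superview.
imageView.center = CGPointMake(CGRectGetMidX(self.view.bounds), CGRectGetMidY(self.view.bounds));
[self.view addSubview:imageView];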

Calculate the standard proportion of the image height with fixed image width in iPhone?

I take photos with the iPhone 4G camera and post those images to the server; after that I fetch them and display them in a table view cell. I want to keep the correct aspect ratio of the image when I change the image height.
In my calculation,
CGSize imageSize = image.size;
CGFloat imageWidth = imageSize.width;   // 620 (retina, from the iPhone 4G)
CGFloat imageHeight = imageSize.height; // 620 (retina, from the iPhone 4G)
CGFloat aspectRatio = imageSize.width / imageSize.height; // 620 / 620 = 1
CGFloat newImageWidth = 300.0f; // fixed width
// Calculate the new image height.
CGFloat newImageHeight = newImageWidth / aspectRatio; // 300 / 1 = 300
So the new image width / height is (300, 300).
Also, do I need to check whether the image is retina or not?
I don't know whether the calculation is correct, so please guide me.
I want to know the standard procedure to find the height of the image given a fixed image width (keeping the correct aspect ratio).
Please help me out.
Thanks!
Why don't you just put it into a UIImageView and set its contentMode to UIViewContentModeScaleAspectFit?
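
If you do want to compute the new height yourself rather than letting a UIImageView do it, the math from the question can be wrapped up like this (a sketch; image and imageView are placeholders for your own objects):

CGFloat fixedWidth = 300.0f;
CGFloat aspectRatio = image.size.width / image.size.height;
CGFloat newHeight = fixedWidth / aspectRatio; // e.g. 300 / 1 = 300 for a square image
imageView.contentMode = UIViewContentModeScaleAspectFit;
imageView.frame = CGRectMake(0, 0, fixedWidth, newHeight);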

UIViewContentModeScaleAspectFit iphone sdk gives poor quality image

Hopefully a quick one? I am creating a custom UITableViewCell and have added a UIImageView.
I have some PNG images which are around 200x200 in size. I want to create a thumbnail to put in the table view, but when I resize the image, it results in a poor quality image.
I use UIViewContentModeScaleAspectFit to resize it to a 50x50 frame.
Should I be doing a better draw/resize on each image before I put it in the table cell? There will be around 20-40 images in each table, so I don't want to overwork the device!
Thanks for any help.
Rescaling the images yourself with CoreGraphics will give you more control over the quality, but your best bet is to size the images appropriately for your table in the first place -- less work the software has to do and complete control over the image's appearance.
If you still want to resize them in Quartz, here's one way you might go about it:
UIImage *originalThumbnail = [UIImage imageWithContentsOfFile:<PATH_TO_IMAGE>];
CGSize originalSize = [originalThumbnail size];
CGSize cropSize = { 50, 50 };
// Crop a 50x50 region out of the centre of the original image.
CGRect cropRect = CGRectMake(fabs(cropSize.width - originalSize.width) / 2.0,
                             fabs(cropSize.height - originalSize.height) / 2.0,
                             cropSize.width, cropSize.height);
CGImageRef imageRef = CGImageCreateWithImageInRect([originalThumbnail CGImage], cropRect);
// here's your cropped UIImage
UIImage *croppedThumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
You'd want to do this once if possible, i.e. not every time you construct your UITableViewCell.
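
Note that the snippet above crops the centre of the image rather than scaling it down. If what you actually want is the whole image scaled into a 50x50 thumbnail, a minimal sketch (assuming a 50x50 target and a sourceImage you've already loaded) would be:

CGSize thumbSize = CGSizeMake(50, 50);
// Passing 0.0 for the scale makes the context match the device's screen scale (retina-aware).
UIGraphicsBeginImageContextWithOptions(thumbSize, NO, 0.0);
[sourceImage drawInRect:CGRectMake(0, 0, thumbSize.width, thumbSize.height)];
UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

As with the crop, do this once up front (or cache the result) rather than every time a cell is configured.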