Is Retina working on my developer iPhone?

I'm testing my app on an iPhone 5.
I duplicated an image, naming the copies image1.png and image1@2x.png, and tried it on the iPhone 5 in part of my code:
UIImage *myImage = [UIImage imageNamed:@"image1.png"];
CGFloat imageWidth = myImage.size.width;
CGFloat imageHeight = myImage.size.height;
NSLog(@"image %f %f", imageWidth, imageHeight);
CGRect screenBound = [[UIScreen mainScreen] bounds];
CGSize screenSize = screenBound.size;
CGFloat screenWidth = screenSize.width;
CGFloat screenHeight = screenSize.height;
NSLog(#"screen %f %f", screenWidth, screenHeight);
When running on the iPhone, I see this in the console:
2013-04-05 13:13:48.386 Vallabici[2413:907] image 320.000000 57.000000
2013-04-05 13:13:48.389 Vallabici[2413:907] screen 320.000000 568.000000
as if it were using a normal screen instead of Retina.
How can that be?

The size of the image should not change on a Retina device, just the scale.
To check the scale of the image, add the following log:
NSLog(@"scale %f", myImage.scale);

bounds returns the size of the screen in points, not pixels.
See the documentation.
The same is true of an image's size.

Screen bounds are always measured in points, and a Retina point is always equal to a non-Retina point:
in code you'll always get a screen size of 320x480 points (on iPhones older than the iPhone 5).
The difference is how images are rendered: on a Retina display they are rendered at double the resolution.
It's like printing: you can print a small image so that it fills an A4 page, and you can print a bigger image the same way.
Images printed on paper always have the same measurements (in inches or centimeters), but the quality of the result changes.
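A minimal sketch of that points-to-pixels relationship, assuming iOS 4 or later where UIScreen exposes a scale property:
CGRect bounds = [UIScreen mainScreen].bounds;   // points: 320 x 568 on an iPhone 5
CGFloat scale = [UIScreen mainScreen].scale;    // 2.0 on Retina devices
// Physical pixels are points multiplied by the scale: 640 x 1136 here.
NSLog(@"points %f x %f, pixels %f x %f",
      bounds.size.width, bounds.size.height,
      bounds.size.width * scale, bounds.size.height * scale);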

Related

Screen size in pixels (iPhone 6)

According to the page here: http://www.paintcodeapp.com/news/ultimate-guide-to-iphone-resolutions the iPhone 6 screen size in pixels should be 750x1334 (1334x750 in landscape); however, my application seems to think the screen dimensions are 667x375 in landscape, and it's not being rendered properly across the entire screen. Here's the code I'm using to get the window dimensions when the application launches, as well as in OpenGL.
applicationDidFinishLaunching:
CGRect rect = [[UIScreen mainScreen] bounds];
CGFloat screenW = rect.size.width;
CGFloat screenH = rect.size.height;
initGL:
CGRect rect = [glView bounds];
CGFloat screenW = rect.size.width;
CGFloat screenH = rect.size.height;
glOrthof(0, screenW, 0, screenH, -1, 1);
Both functions print out the same values: 667w/375h. Could this have to do with an issue I had with the splash image, where I basically had to rename it from Default-667h@2x.png to Default-375w-667h@2x.png to get it to load properly? I feel like there's something simple I'm missing here; any help is appreciated.
It is OK. What you see are logical points; these get translated into physical pixels. In the case of the iPhone 6 the scale is 2x.
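If the OpenGL code needs physical pixels, one option is to multiply the point bounds by the screen scale; a sketch, with nativeBounds assuming iOS 8 or later:
CGRect bounds = [UIScreen mainScreen].bounds;            // 375 x 667 points on iPhone 6
CGFloat scale = [UIScreen mainScreen].scale;             // 2.0 on iPhone 6
GLsizei pixelW = (GLsizei)(bounds.size.width * scale);   // 750
GLsizei pixelH = (GLsizei)(bounds.size.height * scale);  // 1334
// On iOS 8+, nativeBounds reports pixels directly (always portrait-oriented):
CGRect native = [UIScreen mainScreen].nativeBounds;      // 750 x 1334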

Resize photo to 5" height by 4.5" width in iOS 7+

I am developing an iPhone app in which the user can customize a photo by adding text to it. I now need to resize this photo to 5" high by 4.5" wide on iOS 7.
To combine the photo and the labels added to it, I take a screenshot of the view as below:
-(UIImage*)customizedImageMain
{
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
UIGraphicsBeginImageContextWithOptions(self.containerView.bounds.size, NO, [UIScreen mainScreen].scale);
else
UIGraphicsBeginImageContext(self.containerView.bounds.size);
[self.containerView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return finalImage;
}
I checked the image on an iPod touch with a 4" screen. The image size that I get from the above code is nearly 4.445" wide and 7.889" high.
Now how can I resize this image to exactly 5" high and 4.5" wide, irrespective of the device the app is running on?
Thanks in advance.
UPDATE: code as per Vin's solution
-(UIImage*)customizedImageFirst
{
// We want a 5" x 4.5" image, which is 1630 x 1467 pixels:
// 1630 pixels = 5 inch * (326 pixels / 1 inch)
// 1467 pixels = 4.5 inch * (326 pixels / 1 inch)
CGSize requiredImageSize = CGSizeMake(1467.0, 1630.0);
UIGraphicsBeginImageContext(requiredImageSize);
[self.containerView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *combinedImage = UIGraphicsGetImageFromCurrentImageContext();
[combinedImage drawInRect:CGRectMake(0,0,requiredImageSize.width,requiredImageSize.height)];
UIImage* finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return finalImage;
}
Using the above code I am getting this result.
Now if I use the code without taking the added UILabels into account, as below:
-(UIImage*)customizedImageSecond // If I use your code
{
// We want a 5" x 4.5" image, which is 1630 x 1467 pixels:
// 1630 pixels = 5 inch * (326 pixels / 1 inch)
// 1467 pixels = 4.5 inch * (326 pixels / 1 inch)
CGSize requiredImageSize = CGSizeMake(1467.0, 1630.0);
UIGraphicsBeginImageContext(requiredImageSize);
[self.myImageView.image drawInRect:CGRectMake(0,0,requiredImageSize.width,requiredImageSize.height)];
UIImage* finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return finalImage;
}
, then I am getting this image.
My UI hierarchy:
The main view of the UIViewController contains another UIView (containerView), and in that containerView I have added my UIImageView and UILabels. So, to get the customized image with the labels added to it, I take a screenshot of containerView.
This is the custom method that I generally use to scale an image to a desired size. The method takes the newSize (e.g. CGSize newSize = CGSizeMake(30.0, 30.0);) as one of its arguments. Here, 30.0 is a value in pixels.
+(UIImage*)imageWithImage:(UIImage*)image
scaledToSize:(CGSize)newSize
{
UIGraphicsBeginImageContext( newSize );
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
So, you can scale the image in terms of pixels using the above method. But you want to scale the image in terms of inches. A display device has a limited number of pixels it can display, and a limited space over which to display them.
Pixel density is a measurement of the resolution of a device and can be expressed in PPI (pixels per inch) or PPCM (pixels per centimeter).
In your case, you want a 5" x 4.5" image, which in pixels is 1630 x 1467 (1630 pixels = 5 inch * (326 pixels / 1 inch), 1467 pixels = 4.5 inch * (326 pixels / 1 inch)).
Note: the PPI for the Apple iPhone and for iPod touch models above the 4th generation is 326.
So, if you scale the image to 1467 x 1630 pixels you will get a 4.5" x 5" image. For that, pass CGSize newSize = CGSizeMake(1467.0, 1630.0); (width by height) to the method above.
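A hypothetical call site tying the two pieces together (MyClass stands in for whatever class hosts the scaling method; it is not named in the original post):
// 4.5 in * 326 ppi = 1467 px wide, 5 in * 326 ppi = 1630 px high.
CGSize targetSize = CGSizeMake(4.5 * 326.0, 5.0 * 326.0);
UIImage *printImage = [MyClass imageWithImage:[self customizedImageMain]
                                 scaledToSize:targetSize];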

Cutting out predefined piece of a photo from camera

For an application I am developing, I let the user specify dimensions of an object they want to capture on camera (for example 30" x 40"). The next thing I want to do, is show the camera feed with a cameraOverlayView on top of it, showing nothing but a stroked transparent square which has the right ratio to capture that object.
So I tried 2 things to get this to work:
Use a UIViewController which uses the AVCaptureVideoPreviewLayer to display a view with the live video feed. On top of that feed I display a transparent view, which draws a square with the right dimensions (using the ratio the user specified).
and
In another attempt I created a UIViewController, containing a button which pops up the UIImagePickerController. Using this controller, I also created a view which I attach to the picker using the cameraOverlayView property.
The main problem I am having with both of these methods is that the image that is actually captured is always larger than what I see on screen, and I am not entirely sure how to cut out THAT piece of the image after the picture has been taken.
So for example:
My UIImagePickerController is shown, and I put an overlay over it showing a square that is 300 x 400 px. The user uses this square to take a picture of their object, centering the object inside the square.
The picture is taken, but instead of a picture that is 320x480 (or 640x960), I get a result that is 3500x2400 (or something like that); it's a completely different ratio than the screen ratio, of course.
How do I then make sure I cut out the right part of the image?
The code that actually calculates the size of the square that should be shown (and should be used to determine what piece of the picture should be cut):
+ (CGRect) getFrameRect:(CGRect) rect forSize:(CGSize) frameSize {
if (CGSizeEqualToSize(frameSize, CGSizeZero))
return CGRectZero;
float maxWidth = rect.size.width - 20;
float maxHeight = rect.size.height - 20;
float ratioX = maxWidth / frameSize.width;
float ratioY = maxHeight / frameSize.height;
float ratio = MIN(ratioX, ratioY);
float newWidth = frameSize.width * ratio;
float newHeight = frameSize.height * ratio;
float x = (rect.size.width - newWidth) / 2;
float y = (rect.size.height - newHeight) / 2;
return CGRectMake(x, y, newWidth, newHeight);
}
This determines the largest rectangle with the ratio specified in the frameSize parameter that fits within the area supplied in the rect parameter (minus a 10-point margin on each side), and centers it.
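A quick worked example of that math (not from the original post): with a 320x480 rect and a 30x40 frameSize, maxWidth = 300 and maxHeight = 460, so ratioX = 10 and ratioY = 11.5; the smaller ratio, 10, wins, giving a 300x400 rectangle centered at (10, 40).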
Some solutions come to mind, but I am not sure they are doable:
Shrink the photo down to the screen width or height (whichever comes first), take the center of the picture, and assume that is the same center that was shown when taking the picture? I'm not sure this will work; I tried it briefly, but it failed.
Ok, I found the solution:
When you are taking a picture with your camera, the preview screen shows only a part of the photo you are taking.
When your iOS device is in portrait mode, the photo height is scaled down to the height of the screen, and only the middle 640px are shown.
The darker red part is what is shown on screen. So when you take a picture, you need to downsize your image to the max height of your screen to get the right width.
After that I cut out the middle 640x960 pixels to get the same image that was shown while taking the picture.
After that, coordinates in the cropped image line up with the coordinates of my rectangular overlay.
- (void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
[picker dismissModalViewControllerAnimated:YES];
UIImage* artImage = [info objectForKey:UIImagePickerControllerOriginalImage];
CGFloat imageHeightRatio = artImage.size.height / 960;
CGFloat imageWidth = artImage.size.width / imageHeightRatio;
CGSize newImageSize = CGSizeMake(imageWidth, 960);
artImage = [artImage imageByScalingProportionallyToSize:newImageSize];
CGRect cutOutRect = CGRectMake((artImage.size.width / 2) - (640 / 2), 0, 640, 960);
artImage = [self imageByCropping:artImage toRect:cutOutRect];
CGRect imageCutRect = [FrameSizeCalculations getFrameRect:CGRectMake(0,0, artImage.size.width, artImage.size.height) forSize:self.frameSize];
artImage = [self imageByCropping:artImage toRect:imageCutRect];
CGRect imageViewRect = CGRectInset(_containmentView.bounds, 10, 10);
NSLog(#"ContainmentView: %f x %f x %f x %f",
_containmentView.frame.origin.x,
_containmentView.frame.origin.y,
_containmentView.frame.size.width,
_containmentView.frame.size.height
);
NSLog(#"imageViewRect: %f x %f x %f x %f",
imageViewRect.origin.x,
imageViewRect.origin.y,
imageViewRect.size.width,
imageViewRect.size.height
);
_imageView.frame = [FrameSizeCalculations getFrameRect:imageViewRect forSize:self.frameSize];
NSLog(#"imageViewRect: %f x %f x %f x %f",
_imageView.frame.origin.x,
_imageView.frame.origin.y,
_imageView.frame.size.width,
_imageView.frame.size.height
);
_imageView.contentMode = UIViewContentModeScaleAspectFill;
_imageView.image = artImage;
}
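The helpers imageByScalingProportionallyToSize: and imageByCropping:toRect: are not shown in the original post. A minimal sketch of what the cropping helper might look like, assuming the rect is expressed in the image's pixel coordinates:
// Hypothetical helper: crop a UIImage to the given rect using Core Graphics.
- (UIImage *)imageByCropping:(UIImage *)image toRect:(CGRect)rect {
    CGImageRef croppedCGImage = CGImageCreateWithImageInRect(image.CGImage, rect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedCGImage];
    CGImageRelease(croppedCGImage);
    return cropped;
}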

Calculate the standard proportion of the image height with fixed image width in iPhone?

I take images with an iPhone 4, post them to the server, then fetch those images and display them in a table view cell. I want to display the image with the correct aspect ratio when changing the image height.
In my calculation,
CGSize imageSize = image.size;
CGFloat imageWidth = imageSize.width;    // 620 (Retina image from the iPhone 4)
CGFloat imageHeight = imageSize.height;  // 620 (Retina image from the iPhone 4)
CGFloat aspectRatio = imageSize.width / imageSize.height; // 620 / 620 = 1
CGFloat newImageWidth = 300.0f; // fixed width
// Calculate the new image height.
CGFloat newImageHeight = newImageWidth / aspectRatio; // 300 / 1 = 300
So the new image width/height is (300, 300).
Also, do I need to check whether the image is Retina or not?
I don't know whether this calculation is correct, so please guide me.
I want to know the standard procedure for finding the height of an image given a fixed width (i.e. preserving the correct aspect ratio).
Please help me out.
Thanks!
Why don't you just put it into a UIImageView and set its contentMode to UIViewContentModeScaleAspectFit?
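A minimal sketch of that suggestion (the 300x300 frame matches the fixed width from the question; assume this runs where the cell is being configured):
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
// Aspect-fit letterboxes the image inside the frame, preserving its ratio.
imageView.contentMode = UIViewContentModeScaleAspectFit;
imageView.image = image; // the image fetched from the server
[cell.contentView addSubview:imageView];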

Determining Dimensions of Images in iPhone SDK

Simply, is it possible to somehow capture the dimensions (specifically, width and height) of an image using either the iPhone SDK or an iPhone SDK-compatible library/framework? Perhaps pull the dimensions from the EXIF metadata?
From the Apple API for UIImage: create a UIImage from your image data or file, then use the size property to get a CGSize structure representing the image's dimensions.
If your UIImage name is image, you can get its size and then use the size to get the width and height like this:
CGSize mySize = [image size];
CGFloat imageWidth = mySize.width;
CGFloat imageHeight = mySize.height;
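Note that size is reported in points. If you need pixel dimensions (e.g. for an @2x Retina image), multiply by the image's scale; a small sketch:
CGFloat pixelWidth = mySize.width * [image scale];
CGFloat pixelHeight = mySize.height * [image scale];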