I'm using the UIImagePickerController class to capture an image from the camera.
Once the image is captured, I save the picture to disk. After that I use the imageWithContentsOfFile method to load the image as the background of the main screen.
The problem is that when I load the picture, two white bands appear at the top and bottom of the view.
My question is: how can I take a picture that is 320x480 px so that I can load it full screen?
Thanks.
--EDIT--
When the originalImage is captured, its size is:
INFO -> Captured Image Size W:480.000000 H:640.000000
How can I get the image directly at 320x480 px?
Thanks.
Presumably you get white bands because the content mode of the UIImageView you're displaying it in is set to UIViewContentModeScaleAspectFit, which resizes the image while maintaining its aspect ratio so that you can see the entire image. If you set the content mode to UIViewContentModeScaleAspectFill, it'll do what you want.
The reason for this discrepancy is that the aspect ratio of the screen (2:3) is not the same as the aspect ratio of the images that come from the camera (3:4).
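For illustration, a minimal sketch of that fix; backgroundImageView and imagePath are hypothetical names standing in for your image view outlet and the path you saved the photo to:

backgroundImageView.contentMode = UIViewContentModeScaleAspectFill;
backgroundImageView.clipsToBounds = YES; // crop the parts of the 3:4 photo that overflow the 2:3 screen
backgroundImageView.image = [UIImage imageWithContentsOfFile:imagePath];

Aspect-fill crops instead of letterboxing, so the white bands disappear at the cost of losing a little of the image at the edges.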
I have a bunch of round images. They are PNGs at 512x512. I took them from Flaticon.
To display the images in a tableView I use an imageView with a 45x45 px size and Content Mode set to Aspect Fit. The images are placed in the project's assets.xcassets folder with the Scale property set to Single Scale.
When the images are displayed by the tableView, a jaggy border is visible on every image:
And it's visible without zooming in:
Is it a problem with the source images? How can I make the image borders smooth?
The issue is with the image that you have added.
Please use a normal (square) image and apply a corner radius to the image view as below:
self.imageFlag.layer.cornerRadius = self.imageFlag.frame.size.height / 2
self.imageFlag.clipsToBounds = true // needed so the corner radius actually clips the image
This will resolve your issue.
When we take a photo on an iPhone, the image is shown fullscreen without any "grey bar" at the top, i.e. the image is shown in a 320x500 frame. I want to display that image in my app, but the app has a maximum frame size of 320x480. So when I try to show the image fullscreen in my app, it appears stretched. I tried all the contentMode options but it didn't work.
So, how do I scale the image, or set the frame size, so that the image is shown as-is but in a smaller frame, without any distortion, like in the iPhone's "Photos" app?
When you take a picture, you actually don't see the full-sized photo; you only see the part that fits your display (the photo's resolution is bigger than the iPhone's display resolution). So if you want to take the resulting image and show it fullscreen, you need to do some simple calculations to scale the image while keeping its proportions correct:
// We know the desired resolution: full screen, i.e. (320, 480) or (640, 960) in pixels.
CGSize screenResolution = [UIScreen mainScreen].bounds.size;
CGFloat actualImgWidth = actualImg.size.width;
CGFloat actualImgHeight = actualImg.size.height;
// Now determine the destination imageView frame with the maximum dimensions
// that fit the screen AND preserve the image's proportions.
CGFloat minScale = MIN(screenResolution.width / actualImgWidth, screenResolution.height / actualImgHeight);
// With minScale, one side fits the screen exactly and the other occupies slightly less space than the screen allows.
destImgView.bounds = CGRectMake(0, 0, minScale * actualImgWidth, minScale * actualImgHeight);
destImgView.image = actualImg;
I'm trying to resize my image (loaded from a URL) to full screen on my iPhone. However, 320x480, which is supposed to be full size, is not full screen.
- (IBAction)changeFullSize {
    urlImage.contentMode = UIViewContentModeScaleAspectFit;
    urlImage.frame = CGRectMake(0, 0, 320, 480);
}
I'm using a UIButton here, so when the user taps the opaque button, it changes to a full-sized image.
However, there is an empty space between the tab bar at the top and the start of the picture.
What is the correct size for full screen? Is there a way for it to automatically resize to full screen without me specifying the width and height?
Thanks.
You could try to get it from self.view.frame.size, but for the entire screen you can use [[UIScreen mainScreen] bounds]. Hope this helps.
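For example, a rough sketch of that idea using the urlImage view from the question (the exact frame math is my assumption, not part of the answer above):

CGRect screenBounds = [[UIScreen mainScreen] bounds];
urlImage.frame = CGRectMake(0, 0, screenBounds.size.width, screenBounds.size.height);
// or, to fill only the controller's view (e.g. below a tab bar):
// urlImage.frame = self.view.bounds;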
You can't just resize your image by changing its frame: since you are using aspect fit, the image will keep its original aspect ratio. Depending on your needs, a simple approach would be to change the content mode to UIViewContentModeScaleToFill.
If scaling to fill does not produce the results you want, you may want to check out Trevor's blog post on UIImage categories here.
Once the UIImage categories are added, you can just call:
[youruiimage resizedImage:CGSizeMake(320, 480) interpolationQuality:kCGInterpolationHigh]
Using this, you can keep either UIViewContentModeScaleToFill or UIViewContentModeScaleAspectFit and it should turn out nicely. =)
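For context, a short sketch of how that call might be used, assuming the category header from Trevor Harmon's post is named UIImage+Resize.h and reusing the names from above:

#import "UIImage+Resize.h"

UIImage *scaled = [youruiimage resizedImage:CGSizeMake(320, 480)
                       interpolationQuality:kCGInterpolationHigh];
urlImage.image = scaled;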
If you have a status bar, don't forget to subtract its height from the total height.
Also, aspect-fit keeps the image at its correct ratio but can leave empty space, so consider filling instead.
Aspect-fill keeps the image at its correct ratio and fills the whole screen with no empty space, but chances are some part of the image won't be shown.
Scale-to-fill keeps the image full screen; the ratio doesn't matter there, so the image may be distorted.
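A rough sketch of the status-bar adjustment combined with aspect-fill; imageView is a hypothetical outlet, and statusBarFrame is the pre-iOS 13 API:

CGRect screenBounds = [[UIScreen mainScreen] bounds];
CGFloat statusBarHeight = [UIApplication sharedApplication].statusBarFrame.size.height;
imageView.frame = CGRectMake(0, statusBarHeight,
                             screenBounds.size.width,
                             screenBounds.size.height - statusBarHeight);
imageView.contentMode = UIViewContentModeScaleAspectFill; // fills the frame, crops the overflow
imageView.clipsToBounds = YES;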
I want to resize an image to a particular width and height on the iPhone without losing image quality.
Resize the frame of the image view that contains the image. The underlying image won't change, but it will be displayed smaller. This way you can use the same image for older iPhone screens and Retina display screens.
If you want to resize the image (so that it takes up less storage space) you'll always lose quality.
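To illustrate the first suggestion, a minimal sketch (originalImage and thumbView are placeholder names):

UIImageView *thumbView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 75)];
thumbView.contentMode = UIViewContentModeScaleAspectFit; // keep proportions inside the smaller frame
thumbView.image = originalImage; // the same full-resolution UIImage, just drawn smaller
[self.view addSubview:thumbView];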
I am using the image picker controller to get an image from the user. After some operations on the image, I want the user to be able to save it at 1600x1200 px, 1024x1024 px or 640x480 px (something like the iFlashReady app).
The last option is the size of the image I get in the UIImagePickerControllerDelegate method (when using an image from the camera roll).
Is there any way we can save the image at these resolutions without pixelating the images?
I tried creating a bitmap context with the width and height I want (CGBitmapContextCreate) and drawing the image there. But the image gets pixelated at 1600x1200.
Thanks
This is non-trivial. Your image just doesn't have enough data. To enlarge it you'll need to resample the image and interpolate between pixels (like Photoshop does when you resize an image).
Most likely you'll want to use a 3rd party library such as:
http://code.google.com/p/simple-iphone-image-processing/
It performs this and many other image processing functions.
From faint memories of a computer vision class long ago, I think what you do is blur the image after upscaling.
Before drawing try adjusting your CGBitmapContext's antialiasing and/or interpolation quality:
CGContextSetShouldAntialias(context, true);
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
If I remember right, antialiasing is turned off on CGContext by default.
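Putting it together, a rough sketch of a scale-up with those settings, using UIGraphicsBeginImageContextWithOptions rather than CGBitmapContextCreate directly; sourceImage and targetSize are placeholders, and upscaling still can't add detail that isn't in the original:

UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetShouldAntialias(context, true);
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
[sourceImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();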