Incorrect image size in iPhone

I need to use an image's height and width. I have an 800x800 pixel image in my iOS simulator, but when I read its size using
image.size.width and image.size.height
it reports 315x120, which is incorrect. Why is that?
- (void)elcImagePickerController:(ELCImagePickerController *)picker didFinishPickingMediaWithInfo:(NSArray *)info
{
    NSLog(@"%@", info);
    int temp = [[DataManager sharedObj] imageCount];
    for (NSDictionary *dict in info)
    {
        [[DataManager sharedObj] setImageCount:temp + 1];
        NSMutableDictionary *dataDic = [[NSMutableDictionary alloc] init];
        UIImage *img = [dict objectForKey:UIImagePickerControllerOriginalImage];
        NSLog(@"%.2f", img.size.width);
        NSLog(@"%.2f", img.size.height);
    }
}

Try calling image.frame = CGRectMake(x, y, 800, 800).
If you have a UIImageView *image instead of a UIImage *image, you are probably getting the frame of the view, not the size of the image.
SOLVED:
Change fullScreenImage to fullResolutionImage in line 33 of your ELCImagePickerController.m.
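For reference, a hedged sketch of what that change amounts to, assuming the picker builds its UIImage from an ALAsset via ALAssetRepresentation (the exact line varies between ELCImagePickerController versions):
ALAssetRepresentation *rep = [asset defaultRepresentation];
// fullScreenImage returns a screen-fitted rendition, not the original pixels:
// UIImage *img = [UIImage imageWithCGImage:[rep fullScreenImage]];
// fullResolutionImage returns the full-size image data instead:
UIImage *img = [UIImage imageWithCGImage:[rep fullResolutionImage]
                                   scale:[rep scale]
                             orientation:(UIImageOrientation)[rep orientation]];
Note that fullResolutionImage is not orientation- or scale-adjusted, hence passing the representation's scale and orientation explicitly.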

You can get the image size like this:
UIImage *imageS = [dict objectForKey:UIImagePickerControllerOriginalImage];
NSLog(@"width = %.2f, height = %.2f", imageS.size.width, imageS.size.height);
NSLog(@"Image Size = %@", NSStringFromCGSize(imageS.size));
Both lines will work and will print the right values. Please check your simulator image once again: if you added it from Google, you may have saved the image thumbnail instead of the original image, which is why you are getting the wrong size. Otherwise, this method of getting the image size is correct.

Related

MKMapView to UIImage iOS 7

The code to render an MKMapView to a UIImage no longer works in iOS 7. It returns an empty image with nothing but the word "Legal" at the bottom and a black compass at the top right; the map itself is missing. Below is my code:
UIGraphicsBeginImageContext(map.bounds.size);
[map.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
map is an IBOutlet that points to an MKMapView. Is there any way to render an MKMapView correctly in iOS 7?
From this SO post:
You can use MKMapSnapshotter and grab the image from the resulting MKMapSnapshot. See the discussion of it in the WWDC 2013 session video, Putting Map Kit in Perspective.
For example:
MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
options.region = self.mapView.region;
options.scale = [UIScreen mainScreen].scale;
options.size = self.mapView.frame.size;

MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
[snapshotter startWithCompletionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
    UIImage *image = snapshot.image;
    NSData *data = UIImagePNGRepresentation(image);
    [data writeToFile:[self snapshotFilename] atomically:YES];
}];
Having said that, the renderInContext: solution still works for me. There are notes about only doing that on the main queue in iOS 7, but it still seems to work. Still, MKMapSnapshotter seems like the more appropriate solution for iOS 7.
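If you prefer a synchronous capture, iOS 7 also added drawViewHierarchyInRect:afterScreenUpdates: on UIView, which is reported to capture map tiles where renderInContext: comes up empty; a minimal sketch:
UIGraphicsBeginImageContextWithOptions(map.bounds.size, YES, 0.0);
[map drawViewHierarchyInRect:map.bounds afterScreenUpdates:YES];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();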

UIImagePickerController check if user edited the image

I am using a UIImagePickerController with the property allowsEditing set to YES.
When the user finishes picking an image, I want to know whether he edited the image he selected (e.g. whether he scaled it). This call:
UIImage *editedImage = [info objectForKey:@"UIImagePickerControllerEditedImage"];
always returns an object, even if the user left the picture as it was. Is there any way to check whether the user edited the image? For example, can I check whether UIImagePickerControllerEditedImage and UIImagePickerControllerOriginalImage are somehow different?
Try this in didFinishPickingMediaWithInfo: (I am not sure about it):
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImage *editedimage = [info objectForKey:UIImagePickerControllerEditedImage];
if ([UIImagePNGRepresentation(image) isEqualToData:UIImagePNGRepresentation(editedimage)]) {
    // not edited
} else {
    // edited
}
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImage *editedimage = [info objectForKey:UIImagePickerControllerEditedImage];
if (editedimage != nil) {
    // then got the edited image
}
Could you not just get and compare the CGSize of the image?
BOOL sizeChanged = NO;
// get the current size of the image
CGSize originalSize = [image size];
// after the user has made the edit, get the new size
CGSize currentSize = [image size];
// if the dimensions have been edited, the condition is true
if (originalSize.width != currentSize.width ||
    originalSize.height != currentSize.height) {
    sizeChanged = YES;
}
Check this out:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIImagePickerControllerDelegate_Protocol/UIImagePickerControllerDelegate/UIImagePickerControllerDelegate.html#//apple_ref/doc/uid/TP40007069
These are the docs for the image picker delegate. As you can see, when the user picks an image this is called:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
info is a dictionary which contains data about what happened and what was picked. If allowsEditing is set to YES, then info contains both the original image and the edited one. Check the link I gave you for the
Editing Information Keys
section; there are a bunch of constants there which can give you the data you seek!
Start from here to see the whole mechanics:
http://developer.apple.com/library/ios/documentation/uikit/reference/UIImagePickerController_Class/UIImagePickerController/UIImagePickerController.html#//apple_ref/occ/instp/UIImagePickerController/allowsEditing
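A minimal sketch of reading those editing-information keys inside didFinishPickingMediaWithInfo: (all three keys used here are standard UIImagePickerController constants):
UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImage *edited = [info objectForKey:UIImagePickerControllerEditedImage]; // nil if no edited rendition exists
NSValue *cropValue = [info objectForKey:UIImagePickerControllerCropRect];
if (cropValue) {
    CGRect cropRect = [cropValue CGRectValue];
    NSLog(@"crop rect: %@", NSStringFromCGRect(cropRect));
}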
I know that this is a very old question with no activity in a while, but it is what comes up in a Google search, and as far as I can tell the question remains unanswered satisfactorily.
Anyway, the way to tell whether the image has been edited is this:
In didFinishPickingMediaWithInfo: you can inspect the width of the CropRect and the width of the original image. If CropRect.width == originalImage.width + 1, then it has not been edited. The reason this is true is that to edit the image the user must pinch and zoom, which scales the image and changes the size of the CropRect. Simply moving the image around will not work, as it bounces back unless the image is scaled.
NSValue *pickerCropRect = info[UIImagePickerControllerCropRect];
CGRect theCropRect = pickerCropRect.CGRectValue;
UIImage *originalImage = info[UIImagePickerControllerOriginalImage];
CGSize originalImageSize = originalImage.size;
if (theCropRect.size.width == originalImageSize.width + 1) {
    NSLog(@"Image was NOT edited.");
} else {
    NSLog(@"Image was edited.");
}
As far as I can tell this works in iOS 9 on the 6S and 6+. I see no real reason it shouldn't work elsewhere.

UIImageView contentMode different in retina and non-retina display

So I have this UIImageView with an image on top of it. When I'm using a normal (non-Retina) iPhone display, the image shows inside the UIImageView exactly as it is supposed to, within the UIImageView's bounds. The problem is that when I'm using a Retina display device, the image becomes big and doesn't fit the UIImageView's bounds; it goes all over the screen.
How can I fix this issue? I want the image on a Retina display to fit inside the UIImageView's size.
This is my code:
- (UIImage *)loadScreenShotImageFromDocumentsDirectory
{
    UIImage *tempimage;
    NSArray *sysPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docDirectory = [sysPaths objectAtIndex:0];
    NSString *filePath = [NSString stringWithFormat:@"%@/MeasureScreenShot.png", docDirectory];
    tempimage = [[UIImage alloc] initWithContentsOfFile:filePath];
    return tempimage;
}

- (void)viewDidLoad
{
    // Setting the image
    UIImage *Image = [self loadScreenShotImageFromDocumentsDirectory];
    theImageView.frame = CGRectMake(0, 166, 290, 334);
    theImageView.contentMode = UIViewContentModeCenter;
    theImageView.image = Image;
}
Thanks in advance!
Try setting the content mode to UIViewContentModeScaleAspectFit.
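In the viewDidLoad code above, that is a one-line change (clipsToBounds is added here as a precaution so nothing draws outside the frame):
theImageView.contentMode = UIViewContentModeScaleAspectFit;
theImageView.clipsToBounds = YES; // keep the image inside the view's bounds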
When I was using an image with Core Plot, I had to scale the image for the Retina display:
CPTImage *fillimage = [CPTImage imageWithCGImage:bubble.CGImage scale:bubble.scale];
The name of the image file was bubble, and I had a bubble file and a bubble@2x file. This lets the image fit the image view on either display. Hope this helps.
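One more thing worth checking, since the screenshot is loaded from the Documents directory: initWithContentsOfFile: gives the image a scale of 1.0 unless the filename carries an @2x suffix, so a Retina-resolution screenshot comes back twice as large in points. A hedged sketch of rebuilding it with the screen's scale:
UIImage *raw = [[UIImage alloc] initWithContentsOfFile:filePath];
UIImage *scaled = [UIImage imageWithCGImage:raw.CGImage
                                      scale:[UIScreen mainScreen].scale
                                orientation:raw.imageOrientation];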

Cropping Image using CGImageCreateWithImageInRect

I'm attempting to implement an iOS camera view that takes pictures that are square in shape (similar to Instagram). My code appears below. The first part, where the frame height is set equal to the frame width, works as expected and the user is given a square view. The problem occurs later, when I attempt to apply the frame (a CGRect property) to the image data using CGImageCreateWithImageInRect. I pass the frame rect to this function along with the image, but the result is not cropped square; instead the image retains the original default dimensions from the iOS camera. Can someone please tell me what I've done wrong? My understanding from the Apple documentation is that CGImageCreateWithImageInRect should select an image area of the given rect from some starting x/y coordinate, but that doesn't seem to be happening.
// Set the frame size to be square
UIView *view = imagePicker.view;
frame = view.frame;
frame.size.height = frame.size.width;
view.frame = frame;

// Crop the image to the frame dimensions using CGImageCreateWithImageInRect
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self.popoverController dismissPopoverAnimated:true];
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissModalViewControllerAnimated:YES];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, frame);
        croppedImage = [UIImage imageWithCGImage:croppedRef];
        CGImageRelease(croppedRef); // Create rule: release the CGImageRef we own
        imageView.image = croppedImage;
    }
    else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        // Code here to support video if enabled
    }
}
You are doing it right. The only thing is that I think you are setting the frame property the same as the picker's view, so the final size is the same as the original size.
Try setting frame smaller than pickerView.view.frame, not equal to it.
Check this out
Cropping an UIImage
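For reference, a minimal center-square crop sketch; the key points are that CGImageCreateWithImageInRect takes a rect in image pixel coordinates (not view coordinates, and not points on a Retina device), and that it follows the Create rule, so the result must be released:
- (UIImage *)squareImageFromImage:(UIImage *)image
{
    // Convert from points to pixels so the rect matches the backing CGImage.
    CGFloat side = MIN(image.size.width, image.size.height) * image.scale;
    CGFloat x = (image.size.width * image.scale - side) / 2.0;
    CGFloat y = (image.size.height * image.scale - side) / 2.0;
    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, CGRectMake(x, y, side, side));
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedRef); // Create rule: we own the returned CGImageRef
    return cropped;
}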
You are setting the frame wrong.
I suggest you take a look at this sample code from Apple on how to create what you are trying to:
https://developer.apple.com/library/mac/#samplecode/VirtualScanner/Listings/Sources_VirtualScanner_m.html#//apple_ref/doc/uid/DTS40011006-Sources_VirtualScanner_m-DontLinkElementID_9
Look at the
- (ICAError)startScanningWithParams:(ICD_ScannerStartPB*)pb
function.

How to Define UIImageView size as UIImage resolution?

I have a scenario in which I am getting images via a web service, and all the images have different resolutions. My requirement is to get the resolution of each image and use it to define the size of the UIImageView, so I can prevent my images from getting blurred.
For example, if the image resolution is 326 pixels/inch, the image view should be sized so that the image can be displayed fully without any blur.
UIImage *img = [UIImage imageNamed:@"foo.png"];
CGRect rect = CGRectMake(0, 0, img.size.width, img.size.height);
UIImageView *imgView = [[UIImageView alloc] initWithFrame:rect];
[imgView setImage:img];
An image's size IS its resolution.
Your problem might be the Retina display!
Check for a Retina display and, if present, make the UIImageView width/height twice smaller (so that each UIImageView point consists of four smaller UIImage pixels on a Retina display).
How to check for a Retina display:
https://stackoverflow.com/a/7607087/894671
How to check the image size (without actually loading the image into memory):
NSString *mFullPath = [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject]
                       stringByAppendingPathComponent:@"imageName.png"];
NSURL *imageFileURL = [NSURL fileURLWithPath:mFullPath];
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageFileURL, NULL);
if (imageSource == NULL)
{
    // Error loading image ...
}
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:NO], (NSString *)kCGImageSourceShouldCache, nil];
CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);
NSNumber *mImgWidth;
NSNumber *mImgHeight;
if (imageProperties)
{
    // loaded image width
    mImgWidth = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
    // loaded image height
    mImgHeight = (__bridge NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
    CFRelease(imageProperties);
}
if (imageSource != NULL)
{
    CFRelease(imageSource);
}
So, for example:
UIImageView *mImgView = [[UIImageView alloc] init];
[mImgView setImage:[UIImage imageNamed:@"imageName.png"]];
[[self view] addSubview:mImgView];
if ([UIScreen instancesRespondToSelector:@selector(scale)])
{
    CGFloat scale = [[UIScreen mainScreen] scale];
    if (scale > 1.0)
    {
        // iPhone Retina screen
        [mImgView setFrame:CGRectMake(0, 0, [mImgWidth intValue] / 2, [mImgHeight intValue] / 2)];
    }
    else
    {
        // iPhone screen
        [mImgView setFrame:CGRectMake(0, 0, [mImgWidth intValue], [mImgHeight intValue])];
    }
}
Hope that helps!
You can get the image size using the following code. First calculate the downloaded image's size, and then size the image view accordingly.
UIImage *Yourimage = [UIImage imageNamed:@"image.png"];
CGFloat width = Yourimage.size.width;
CGFloat height = Yourimage.size.height;
Hope this will help you.
UIImage *oldimage = [UIImage imageWithContentsOfFile:imagePath]; // or you can set it from a URL with NSURL
CGSize imgSize = [oldimage size];
imgview.frame = CGRectMake(10, 10, imgSize.width, imgSize.height);
[imgview setImage:oldimage];
100% working ....
To solve this problem, we need to take care of the device's display resolution.
For example, say you have an image with a resolution of 326 ppi, which is the same as the iPhone 4, iPhone 4S, and iPod touch 4th gen. Then you can simply use the solutions suggested by @Nit and @Peko. But for other devices (or for images with a different resolution on these devices), you will need to apply some math to calculate the size for a better display.
Now suppose you have a 260 ppi image (with dimensions W x H) and you wish to display it on an iPhone 4S. Since the information it contains per inch is less than the display resolution of the iPhone, we need to resize it, reducing the image size by a factor of 326/260. So the size to use for the image view is:
imageViewWidth = W*(260/326);
imageViewHeight = H*(260/326);
In general:
resizeFactor = imageResolution/deviceDisplayResolution;
imageViewWidth = W*resizeFactor;
imageViewHeight = H*resizeFactor;
Here I am assuming that when we set an image in an image view and resize it, the view does not remove or add pixels to the image.
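A sketch of that calculation in code (imageResolution is assumed to come from your web service's metadata, and deviceDisplayResolution is a hypothetical per-device lookup):
CGFloat imageResolution = 260.0;         // assumed: ppi reported for the downloaded image
CGFloat deviceDisplayResolution = 326.0; // assumed: ppi of the target screen
CGFloat resizeFactor = imageResolution / deviceDisplayResolution;
imageView.frame = CGRectMake(0, 0,
                             image.size.width * resizeFactor,
                             image.size.height * resizeFactor);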
Let the UIImageView do the work by utilizing the contentMode property to do your image resizing for you.
You probably want to display your UIImageView with a static size (the frame property) that represents the maximum size of the image you want to display, and allow the images to resize within that frame according to their own particular size requirements (overall size, aspect ratio, etc.). You can let the UIImageView do the heavy lifting of dealing with different-sized images by mastering the contentMode property. It has many different settings; one of them is UIViewContentModeScaleAspectFit, which will downsize your image as necessary to fit within the UIImageView, and if the image is smaller, it will simply be displayed centered. You can play with the settings to get the results you want.
Note that with this approach, there is nothing special you need to do to deal with scaling issues associated with a Retina display.
As per the requirement you stated in the question body, I believe you need not change the UIImageView size.
The image can be displayed fully without any blur using this line of code:
imageView.contentMode = UIViewContentModeScaleAspectFit;