How to display a UIImageJPEGRepresentation image in the iPhone SDK - iphone

I want to use UIImageJPEGRepresentation to display an image on the iPhone. In the code below I am missing something, and I don't know what to add:
NSData *data=[NSData dataWithContentsOfURL:[NSURL URLWithString:gameObj.gameThumbnails]];
UIImage *myImage=[UIImage imageWithData:data];
imageView.image=UIImageJPEGRepresentation(myImage, 90);// I am missing something here
[elementView addSubview:imageView];
[imageView release];
/*gameObj.gameThumbnails is an url like http://static.gamesradar.com/images/mb/GamesRadar/us/Games/X/X-Men%20Arcade/Bulk%20Viewer/360%20PS3/2010-10-11/192.168.30.167-image36_bmp_jpgcopy--game_thumbnail.jpg
*/
Please help me out with this.

You are setting the compression quality to 90; it should be between 0.0 and 1.0.

The UIImageJPEGRepresentation function returns an NSData (an array of bytes) representing the content of the image compressed as JPEG.
To display it, you just need to use the UIImage directly.
imageView.image = myImage;

Make sure that you:
Set your compression quality to 0.9 (it MUST be between 0.0 and 1.0)
Set the frame of the UIImageView correctly
Don't make that UIImageView hidden
And if it still doesn't work then try the following:
[elementView bringSubviewToFront:imageView];
Hope it helps
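For reference, here is a minimal sketch of the corrected code from the question (keeping the question's own names such as gameObj.gameThumbnails, imageView and elementView, and its manual retain/release style); it assigns the UIImage directly and only uses UIImageJPEGRepresentation if the raw JPEG bytes are actually needed:
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:gameObj.gameThumbnails]];
UIImage *myImage = [UIImage imageWithData:data];
imageView.image = myImage;   // assign the UIImage directly; no JPEG conversion is needed for display
[elementView addSubview:imageView];
[imageView release];
// Only if you need the raw JPEG bytes (e.g. for saving or uploading):
NSData *jpegData = UIImageJPEGRepresentation(myImage, 0.9);   // quality must be between 0.0 and 1.0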

Related

Loaded image in UIImage is pixelated on retina

I parsed the data from a web page which also contains a JPG image. The problem is that the image looks blurry/pixelated on a Retina display. Any solution for this? Thanks.
NSData *data = [NSData dataWithContentsOfURL:linkUrl];
UIImage *img = [[UIImage alloc] initWithData:data];
// detailViewController.faces.contentScaleFactor = [UIScreen mainScreen].scale; // Attempt to solve the problem
detailViewController.faces.image=img;
After initializing your image with the data, create a new one from it with the correct scale like this:
img = [UIImage imageWithCGImage:img.CGImage scale:[UIScreen mainScreen].scale orientation:img.imageOrientation];
...but note that the image will now appear half the size on retina displays unless you scale it up, for example by stretching it in an image view.
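For example, a minimal sketch along those lines, assuming faces is the UIImageView from the question and data is the downloaded image data:
UIImage *img = [[UIImage alloc] initWithData:data];
img = [UIImage imageWithCGImage:img.CGImage
                          scale:[UIScreen mainScreen].scale
                    orientation:img.imageOrientation];
detailViewController.faces.contentMode = UIViewContentModeScaleAspectFit;   // let the image view scale it back up
detailViewController.faces.image = img;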

Rotating an image prior to saving in iOS

In my app, I need to save an image. I need the image to always be saved as a portrait, even if the device is in landscape mode. I am checking to see if the device is in landscape mode and if it is, I would like to rotate my image before it's saved as a PNG. Can anyone help me figure this out?
- (void)saveImage {
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();   // balance the begin call above
    if (UIInterfaceOrientationIsLandscape([[UIDevice currentDevice] orientation])) {
        // need to rotate it here
    }
    NSData *data = UIImagePNGRepresentation(viewImage);
    [data writeToFile:savePath atomically:YES];
}
This thread may help you. It shows how to use the imageOrientation property of a UIImage in order to switch the orientation. Hope that helps!
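If that thread is unavailable, here is a minimal sketch (not from the linked thread, and the helper name is made up) of one way to rotate the captured image by 90 degrees before saving, by redrawing it into a portrait-sized context:
- (UIImage *)portraitImageByRotatingImage:(UIImage *)image {
    // Swap width and height so the drawing canvas is portrait.
    CGSize rotatedSize = CGSizeMake(image.size.height, image.size.width);
    UIGraphicsBeginImageContextWithOptions(rotatedSize, NO, image.scale);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Rotate around the centre of the new canvas, then draw the original image centred on it.
    CGContextTranslateCTM(context, rotatedSize.width / 2.0, rotatedSize.height / 2.0);
    CGContextRotateCTM(context, M_PI_2);
    [image drawInRect:CGRectMake(-image.size.width / 2.0, -image.size.height / 2.0,
                                 image.size.width, image.size.height)];
    UIImage *rotated = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return rotated;
}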

Get image width and height before loading it completely in iPhone

I am loading an image in my UITableViewCell using
[NSData dataWithContentsOfURL:imageUrl]
To set a custom height for my table view cell, I need the actual size of the image I am loading.
Can we get the width and height of an image before loading it completely? Thanks in advance.
Try the Image I/O interface as done below. This will allow you to get the image size without having to load the entire file:
#import <ImageIO/ImageIO.h>
NSMutableString *imageURL = [NSMutableString stringWithFormat:@"http://www.myimageurl.com/image.png"];
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL URLWithString:imageURL], NULL);
NSDictionary *imageHeader = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
NSLog(@"Image header %@", imageHeader);
NSLog(@"PixelHeight %@", [imageHeader objectForKey:@"PixelHeight"]);
You can do it like this (note that this downloads the whole image first):
NSData *imageData = [NSData dataWithContentsOfURL:imageUrl];
UIImage *image = [UIImage imageWithData:imageData];
NSLog(@"image height: %f", image.size.height);
NSLog(@"image width: %f", image.size.width);
Take a look at this question, "How do I extract the width and height of a PNG from looking at the header in Objective-C", which shows how to parse image metadata.
I have created an open-source project, Ottran, that extracts the size and type of a remote image by downloading as little as possible; it supports the PNG, JPEG, BMP and GIF formats.
NSData is "opaque" data, so you cannot do much with it before converting it into something more "useful" (e.g., creating a UIImage by means of its -initWithData: method). At that point you could query the image size, but it would be too late for you.
The only approach I see, if you really need to know the image size before the image is fully downloaded, is to implement a minimal server-side API so that you can ask for the image size before trying to download it.
Anyway, why do you need to know the image size before it is actually downloaded? Could you not set the row height at the moment when it has been downloaded (i.e., from your request delegate method)?
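For example, a minimal sketch of that last suggestion (assumed names: imageUrl and indexPath are available, and rowHeights is a dictionary consulted from tableView:heightForRowAtIndexPath:):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Download off the main thread so the UI stays responsive.
    NSData *imageData = [NSData dataWithContentsOfURL:imageUrl];
    UIImage *image = [UIImage imageWithData:imageData];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Remember the height and ask the table view to re-layout that row.
        [rowHeights setObject:@(image.size.height) forKey:indexPath];
        [self.tableView reloadRowsAtIndexPaths:@[indexPath]
                              withRowAnimation:UITableViewRowAnimationNone];
    });
});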
dataWithContentsOfURL is synchronous and will block your UI until the download completes, so please use the header content to get the resolution. Below is Swift 3.0 code:
if let imageSource = CGImageSourceCreateWithURL(url! as CFURL, nil) {
    if let imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as Dictionary? {
        let pixelWidth = imageProperties[kCGImagePropertyPixelWidth] as! Int
        let pixelHeight = imageProperties[kCGImagePropertyPixelHeight] as! Int
        print("the image width is: \(pixelWidth)")
        print("the image height is: \(pixelHeight)")
    }
}

iOS UIImageJPEGRepresentation error: Not a JPEG file: starts with 0xff 0xd9

I am writing a .jpg file to my app's Documents directory like this:
NSData *img = UIImageJPEGRepresentation(myUIImage, 1.0);
BOOL retValue = [img writeToFile:myFilePath atomically:YES];
Later, I load that image back into a UIImage using:
UIImage *myImage = [UIImage imageWithContentsOfFile:path];
I know it works because I can draw the image in a table cell and it is fine. Now if I try to use UIImageJPEGRepresentation(myImage, 1.0), the debugger prints out these lines:
<Error>: Not a JPEG file: starts with 0xff 0xd9
<Error>: Application transferred too few scanlines
And the function returns nil. Does anybody have an idea why this would happen? I haven't done anything to manipulate the UIImage data after it was loaded. I just provided the UIImage to an image view in a cell. I set the image view properties such that all the images in the cells line up and are the same size, but I don't think that should have anything to do with being able to convert the UIImage to NSData.
The images were not corrupt; I was able to display them correctly. The issue is possibly a bug in UIImage, or perhaps the documentation should be clearer about using imageWithContentsOfFile:.
I was able to eliminate the error message by changing
UIImage *myImage = [UIImage imageWithContentsOfFile:path];
to
NSData *img = [NSData dataWithContentsOfFile:path];
UIImage *photo = [UIImage imageWithData:img];
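With the data-based loading above, re-encoding is expected to work again; a minimal sketch of the combined flow, assuming path still points to the saved .jpg:
NSData *img = [NSData dataWithContentsOfFile:path];
UIImage *photo = [UIImage imageWithData:img];
NSData *jpegData = UIImageJPEGRepresentation(photo, 1.0);   // no longer returns nil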

Where does the image go when I take a screenshot?

On the click event of a button:
[mybutton addTarget:self action:@selector(captureView)
   forControlEvents:UIControlEventTouchUpInside];
- (void)captureView {
    UIGraphicsBeginImageContext(CGSizeMake(320, 480));
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.view.layer renderInContext:context];
    UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSLog(@"%@", screenShot);
}
My screenshot prints UIImage: 0x4b249c0. Is this the correct code to take a screenshot of a particular area of an iPhone app? Where is this image stored at that particular time? How can I see those images?
The UIImage is just an in-memory representation of the screenshot you've taken. You will need to write it out somewhere if you want to save it. For example, the following code will write that image to the user's photo library:
UIImageWriteToSavedPhotosAlbum(screenShot, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);
Remember to retain the image and release it within your -image:didFinishSavingWithError:contextInfo:
callback method.
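A minimal sketch of that callback (an assumed implementation, matching the selector above):
- (void)image:(UIImage *)image
    didFinishSavingWithError:(NSError *)error
                 contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Saving the screenshot failed: %@", error);
    } else {
        NSLog(@"Screenshot saved to the photo library");
    }
}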
You can also manually save out an image as a PNG or JPEG using code like the following:
NSData *imageData = UIImagePNGRepresentation(screenShot);
[imageData writeToFile:pathToSaveImage atomically:YES];
Here your image exists only in memory, and you need to save it somewhere, for example to your app's Documents directory or the photo library, if you want to keep it.