How can I get a CGImageRef from a file? - iPhone

So I know you can get a CGImage from a file using UIImage...
UIImage *img = [UIImage imageNamed:@"name.bmp"];
[img CGImage];
But, is there a way to get a CGImageRef without using a UIImage?
(I am trying to do this in a static library that doesn't have access to UIKit.) I could use other frameworks if necessary, just not UIKit. Would CIImage work? Or NSBitmapImageRep?

You can get it from NSData:
CGDataProviderRef imgDataProvider = CGDataProviderCreateWithCFData((CFDataRef)[NSData dataWithContentsOfFile:@""]);
CGImageRef image = CGImageCreateWithPNGDataProvider(imgDataProvider, NULL, true, kCGRenderingIntentDefault); // Or JPEGDataProvider
You would have to be certain that the data was a JPEG or a PNG for this to work.
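If the file might not be a PNG or JPEG (the question mentions BMP), the Image I/O framework can decode most common formats without UIKit. A minimal sketch, assuming a file path of your own:
#import <ImageIO/ImageIO.h>

NSURL *fileURL = [NSURL fileURLWithPath:@"/path/to/name.bmp"]; // hypothetical path
CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)fileURL, NULL);
CGImageRef image = NULL;
if (source) {
    // Index 0 is the first (and usually only) image in the file.
    image = CGImageSourceCreateImageAtIndex(source, 0, NULL);
    CFRelease(source);
}
// ... use image, then CGImageRelease(image) when done.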

Keep in mind that [UIImage imageNamed:@"name.bmp"] goes through UIKit, which is not thread safe. If you need to load the image off the main thread (for example inside dispatch_apply), use the Quartz framework instead.
// Through Quartz
NSString *maskFilePath = [[NSBundle mainBundle] pathForResource:@"mask" ofType:@"png"];
CGDataProviderRef dataProvider = CGDataProviderCreateWithFilename([maskFilePath UTF8String]);
CGImageRef maskRef = CGImageCreateWithPNGDataProvider(dataProvider, NULL, true, kCGRenderingIntentDefault);
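Both snippets create Core Foundation objects that you own under the Create/Copy rule. A sketch of the matching cleanup, assuming manual memory management:
// The image retains what it needs from the data provider, so the provider
// can be released as soon as the image has been created.
CGDataProviderRelease(dataProvider);
// ... use maskRef, then release it when finished:
CGImageRelease(maskRef);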

Related

Converting byte array coming from Web service to UIImage iPhone

I am new to this technology.
I searched a lot but can't find anything relevant.
In my application I am receiving a byte array from a web service. The byte array I receive is
[137,80,78,71,13,10,26,10,0,0,0,13,73,72,68,82,0,0,1,195,0,0,1,195,8,2,0,0,0,215,2... ]
and I want to convert this byte array into a UIImage so I can show it in a UIImageView.
Use the following UIImage constructor:
+ (UIImage *)imageWithData:(NSData *)data;
NSData *data = [NSData dataWithBytes:YOUR_BYTE_ARRAY length:ARRAY_LENGTH];
UIImage *img = [UIImage imageWithData:data];
UIImageView *imgView = [[UIImageView alloc]initWithImage:img];
The first 8 bytes in the byte array above, \211 P N G \r \n \032 \n (or 137,80,78,71,13,10,26,10 in decimal), reveal this to be a PNG file.
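If you want to check this in code before handing the bytes to UIImage, here is a small sketch, assuming the bytes are already in an NSData called data:
// The 8-byte PNG file signature.
static const unsigned char pngSignature[8] = {137, 80, 78, 71, 13, 10, 26, 10};
NSData *signature = [NSData dataWithBytes:pngSignature length:8];
BOOL isPNG = [data length] >= 8 && [[data subdataWithRange:NSMakeRange(0, 8)] isEqualToData:signature];
NSLog(@"Looks like a PNG? %@", isPNG ? @"YES" : @"NO");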
At the very least, you should be able to just save your entire byte sequence to a file, and load it using + (UIImage *)imageNamed:(NSString *)name or + (UIImage *)imageWithContentsOfFile:(NSString *)path. For example:
UIImage *myImage = [UIImage imageNamed:@"myfile.png"];
// myfile.png should be in the main bundle
(Apurv's method is more direct, and better for this reason. But since you are having such difficulty with it, I thought I'd suggest a slightly different approach.)
You need to Base64-decode the data first. Data returned from many SOAP web services is Base64-encoded, as is raw data embedded in some websites. Decoding is simple enough to do yourself, but it's easiest to use a library someone else has already created.
Start by adding this library to your project and importing its .h where you need it: https://github.com/nicklockwood/Base64
NSString *base64String = @"**YOUR BYTE ARRAY HERE**";
UIImage *imageOrig = [UIImage imageWithData:[NSData dataFromBase64String:base64String]];
UIImageView *imageView = [[UIImageView alloc]initWithImage:imageOrig];
That should do it. In my previous experience I just put whatever data blob I get over the web service into a string, then create the image using this method, and it works great. There is a good discussion of the details of Base64 encoding and decoding here: http://cocoadev.com/wiki/BaseSixtyFour, which is what I used to create my own class, but Nick's code on GitHub is better as it's ARC-compliant.
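If you can target iOS 7 or later, Foundation has Base64 decoding built in, so the third-party category is no longer required. A sketch under that assumption:
NSData *decodedData = [[NSData alloc] initWithBase64EncodedString:base64String options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *image = [UIImage imageWithData:decodedData];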
From the web service we get an array of NSNumber. We have to convert it to NSData like this:
NSMutableData *data = [[NSMutableData alloc] initWithCapacity:[strings count]];
for (NSNumber *number in strings) {
    char byte = [number charValue];
    [data appendBytes:&byte length:1];
}
Convert NSData to UIImage:
UIImage *imageOrig = [UIImage imageWithData:data];
We can also get JSON out of NSData:
NSError *error1 = nil;
NSArray *jsonArray = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error1];
if (error1 != nil) {
    NSLog(@"Error parsing JSON.");
} else {
    NSLog(@"Array: %@", jsonArray);
}
// Use this if what you have is raw, already-decoded RGBA pixel data:
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
// YOUR_BYTE_ARRAY must point to w * h pixels of 8-bit-per-channel RGBA data.
CGContextRef bitmapContext = CGBitmapContextCreate(YOUR_BYTE_ARRAY, w, h, 8, 4 * w, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault);
CFRelease(colorSpace);
CGImageRef cgImage = CGBitmapContextCreateImage(bitmapContext);
CGContextRelease(bitmapContext);
UIImage *newimage = [UIImage imageWithCGImage:cgImage];
[yourImageView setImage:newimage];
CGImageRelease(cgImage);
// Only free the buffer once nothing references it any more.
free(YOUR_BYTE_ARRAY);
Maybe it will help you...
#import "NSDataAdditions.h"
NSData *dataObj = [NSData dataWithBase64EncodedString:StringImage];
UIImage *Image = [UIImage imageWithData:dataObj];

Take a screenshot of the screen

I need to take a screenshot of a screen and save it as a PDF. I have accomplished the save-as-PDF task, however the screenshot I take always gives a blank PDF. I have no idea why. My code is as follows:
-(IBAction)takeScreenShot
{
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
    else
        UIGraphicsBeginImageContext(self.view.bounds.size);

    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIImageView *newImage = [[UIImageView alloc] initWithImage:image];
    UIGraphicsEndImageContext();

    [self createPDFfromUIView:newImage saveToDocumentsWithFileName:@"SecondScreen1.pdf"];
}
-(void)createPDFfromUIView:(UIImageView*)aView saveToDocumentsWithFileName:(NSString*)aFilename
{
    // Creates a mutable data object for updating with binary data, like a byte array
    NSMutableData *pdfData = [NSMutableData data];
    //CGSize pageSize = CGSizeMake(612, 792);

    // Points the pdf converter to the mutable data object and to the UIView to be converted
    UIGraphicsBeginPDFContextToData(pdfData, aView.bounds, nil);
    UIGraphicsBeginPDFPage();

    // draws rect to the view and thus this is captured by UIGraphicsBeginPDFContextToData
    //[aView.layer renderInContext:UIGraphicsGetCurrentContext()];

    // remove PDF rendering context
    UIGraphicsEndPDFContext();

    // Retrieves the document directories from the iOS device
    NSArray* documentDirectories = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString* documentDirectory = [documentDirectories objectAtIndex:0];
    NSString* documentDirectoryFilename = [documentDirectory stringByAppendingPathComponent:aFilename];

    // instructs the mutable data object to write its context to a file on disk
    [pdfData writeToFile:documentDirectoryFilename atomically:YES];
    NSLog(@"documentDirectoryFileName: %@", documentDirectoryFilename);
}
The line that's commented out is what should be writing your image into the PDF. Put that code back in and it might work.
[aView.layer renderInContext:UIGraphicsGetCurrentContext()];
If it's still failing (with no errors), make sure that UIImage *image is not blank.
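With the render call restored, the middle of createPDFfromUIView: would look something like this (a sketch):
UIGraphicsBeginPDFContextToData(pdfData, aView.bounds, nil);
UIGraphicsBeginPDFPage();
// Render the image view's layer into the current (PDF) context so the page isn't blank.
[aView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIGraphicsEndPDFContext();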

Convert Screenshot to PDF in Xcode?

How might one convert a screenshot, taken on an iPhone for example, into a PDF file. It's easy enough to take the screenshot and put it into a UIImage, but it's the conversion which has me stumped. I took a look at the Quartz framework which is the only one in Xcode that I know to support the PDF format, but couldn't find anything there to make this work. Is there anything native to Xcode that I'm missing? If not, is there a public framework somewhere that could handle a conversion like this?
I figured it out. It involved saving a screenshot as a UIImage and then following a very helpful tutorial to get going with the PDF conversion. As it turns out, there are functions to handle the creation of PDF documents in the Core Graphics framework.
You wouldn't convert a screenshot to a PDF directly. You would take a screenshot, create a PDF, and draw the screenshot image into the first page of the PDF.
Untested code to create image:
NSString *nameStr = [NSString stringWithFormat:@"myImage.png"];
NSFileManager *fileManager = [NSFileManager defaultManager];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *namePath = [documentsDirectory stringByAppendingPathComponent:nameStr];

CGRect contextRect = CGRectMake(0, 0, 1024, 768);
UIGraphicsBeginImageContext(contextRect.size);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSData *data = UIImagePNGRepresentation(viewImage);
UIImage *myImage = [[UIImage alloc] initWithData:data];
Untested code to add image to a pdf:
CGSize pageSize = CGSizeMake(1024, 768);
NSString *fileName = @"myFile.pdf";
NSString *pdfFileName = [documentsDirectory stringByAppendingPathComponent:fileName];

UIGraphicsBeginPDFContextToFile(pdfFileName, CGRectZero, nil);
UIGraphicsBeginPDFPageWithInfo(CGRectMake(0, 0, pageSize.width, pageSize.height), nil);
[myImage drawInRect:CGRectMake(0, 0, 1024, 768)];
UIGraphicsEndPDFContext();
P.S. The code above assumes an iPad screen of 1024 x 768.

How do I convert a UIImage to NSData or CFDataRef?

How do I convert a UIImage to NSData or CFDataRef? I need to pass a CFDataRef to ABPersonSetImageData.
This worked for me, for a PNG image. For other image types, I assume you just have to find the corresponding UIImage...Representation method.
UIImage *image = [UIImage imageNamed:@"imageName.png"];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
If you need a CFDataRef for a UIImage, it's just one more line.
CFDataRef imgDataRef = (CFDataRef)imageData;
You can use this:
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
and simply cast imageData to CFDataRef
CFDataRef dataRef = (CFDataRef)imageData;
// Or create an explicit copy of the bytes instead of casting:
CFDataRef cfdata = CFDataCreate(NULL, [imageData bytes], [imageData length]);
For those wondering what the difference is between NSData and CFData, here is the explanation from the Apple docs:
CFData is “toll-free bridged” with its Cocoa Foundation counterpart, NSData. What this means is that the Core Foundation type is interchangeable in function or method calls with the bridged Foundation object. In other words, in a method where you see an NSData * parameter, you can pass in a CFDataRef, and in a function where you see a CFDataRef parameter, you can pass in an NSData instance. This also applies to concrete subclasses of NSData. See Toll-Free Bridged Types for more information on toll-free bridging.
This explains why casting NSData to CFData works.
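Note that under ARC the compiler requires an explicit bridge cast; a minimal sketch, assuming you already have an ABRecordRef called person:
NSData *imageData = UIImagePNGRepresentation(image);
// No ownership transfer here, so a plain __bridge cast is sufficient.
CFDataRef dataRef = (__bridge CFDataRef)imageData;
CFErrorRef error = NULL;
ABPersonSetImageData(person, dataRef, &error);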

To add UIImage directly to file

I want to write my UIImage directly to a file instead of converting it with UIImagePNGRepresentation or UIImageJPEGRepresentation (as that takes time), like this:
UIImage *im = [UIImage imageWithCGImage:ref];
[array addObject:im];
NSData *data = [array objectAtIndex:i];
[data writeToFile:path atomically:YES];
But it is showing an error.
Is there any way I can do this?
Thanks in advance.
Your use of the array only obfuscates that you are basically doing:
NSData *data = im;
Which cannot possibly work because im is a UIImage, and not an NSData nor a subclass.
What you want to do is to create a new NSData and initialize it with the content of the image. Since you got a CGImageRef, I suggest using it directly, without using a UIImage in between.
CGDataProviderRef imageDataProvider = CGImageGetDataProvider(ref);
CFDataRef imageData = CGDataProviderCopyData(imageDataProvider);
NSData *data = (NSData*) imageData;
Note that it is OK to cast the CFDataRef to NSData* because CFData is “toll-free bridged” with its Cocoa Foundation counterpart, NSData.
I hope that helps.
(don't forget to release data when done)
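Under manual reference counting that release is a plain CFRelease; under ARC you could instead transfer ownership while bridging. A sketch of both options:
// CGDataProviderCopyData follows the Copy rule, so the caller owns imageData.
CFRelease(imageData);
// Or, under ARC, transfer ownership to the NSData pointer when casting:
// NSData *data = (NSData *)CFBridgingRelease(imageData);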