Add a UIImage directly to a file - iPhone

I want to write my UIImage directly into a file instead of converting it with UIImagePNGRepresentation or UIImageJPEGRepresentation (as that takes time), like this:
UIImage *im = [UIImage imageWithCGImage:ref];
[array addObject:im];
NSData *data = [array objectAtIndex:i];
[data writeToFile:path atomically:YES];
But it shows an error.
Is there any way I can do this?
Thanks in advance.

Your use of the array only obfuscates that you are basically doing:
NSData *data = im;
Which cannot possibly work because im is a UIImage, and not an NSData nor a subclass.
What you want to do is to create a new NSData and initialize it with the content of the image. Since you got a CGImageRef, I suggest using it directly, without using a UIImage in between.
CGDataProviderRef imageDataProvider = CGImageGetDataProvider(ref);
CFDataRef imageData = CGDataProviderCopyData(imageDataProvider);
NSData *data = (NSData*) imageData;
Note that it is OK to cast the CFDataRef to NSData* because CFData is “toll-free bridged” with its Cocoa Foundation counterpart, NSData.
I hope that helps.
(don't forget to release data when done)
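Putting the above together, here is a minimal sketch of writing that copied data to disk, assuming `ref` and `path` are set up as in the question (note that the bytes you get from the data provider are whatever the provider holds internally, typically decoded pixel data rather than a PNG/JPEG file):
CGDataProviderRef imageDataProvider = CGImageGetDataProvider(ref);
CFDataRef imageData = CGDataProviderCopyData(imageDataProvider); // Copy rule: we own this
NSData *data = (NSData *)imageData; // toll-free bridged cast (use __bridge under ARC)
BOOL ok = [data writeToFile:path atomically:YES];
NSLog(@"Wrote %lu bytes: %d", (unsigned long)[data length], ok);
CFRelease(imageData); // balance the Copy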

Related

Save images in an iOS app

I am creating a greeting card app. I have some existing templates, and I can also create new greeting cards by adding cliparts, text styles, etc. I have saved the image as a UIImage object. I am stuck at the point of saving these images for future use. Please help.
To save a UIImage to file do the following
NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/img.jpg"];
//Save
NSData* imageData = UIImageJPEGRepresentation(img, 1.0);
[imageData writeToFile:path atomically:YES];
//Load
NSData* imageData = [NSData dataWithContentsOfFile:path];
UIImage *image = [UIImage imageWithData:imageData];
Check out this link.
How to save picture to iPhone photo library?
You can also save the image in the iPhone file system by converting the image to NSData and writing the data to a file.
NSData *data = UIImagePNGRepresentation(yourImage);
[data writeToFile:path atomically:YES];

Converting a byte array from a web service to UIImage - iPhone

I am new to this technology.
I searched a lot but can't find anything relevant.
In my application I receive a byte array from a web service. The byte array I receive is
[137,80,78,71,13,10,26,10,0,0,0,13,73,72,68,82,0,0,1,195,0,0,1,195,8,2,0,0,0,215,2... ]
and I want to convert this byte array into a UIImage so I can show it in a UIImageView.
Use the following UIImage constructor:
+ (UIImage *)imageWithData:(NSData *)data;
NSData *data = [NSData dataWithBytes:YOUR_BYTE_ARRAY length:ARRAY_LENGTH];
UIImage *img = [UIImage imageWithData:data];
UIImageView *imgView = [[UIImageView alloc]initWithImage:img];
The first 8 bytes in the byte array above, \211 P N G \r \n \032 \n (or 137,80,78,71,13,10,26,10 in decimal), reveal this to be a PNG file.
At the very least, you should be able to just save your entire byte sequence to a file, and load it using + (UIImage *)imageNamed:(NSString *)name or + (UIImage *)imageWithContentsOfFile:(NSString *)path. For example:
UIImage *myImage = [UIImage imageNamed:@"myfile.png"];
// myfile.png should be in the main bundle
(Apurv's method is more direct, and better for this reason. But since you are having such difficulty with it, I thought I'd suggest a slightly different approach.)
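If you do go the save-to-file route, here is a rough sketch; the `bytes` and `length` variables are placeholders for your own byte array, and note that imageNamed: only searches the app bundle, so a file you write yourself should be loaded with imageWithContentsOfFile: instead:
NSData *pngData = [NSData dataWithBytes:bytes length:length]; // your raw byte array
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filePath = [docs stringByAppendingPathComponent:@"myfile.png"];
[pngData writeToFile:filePath atomically:YES];
UIImage *myImage = [UIImage imageWithContentsOfFile:filePath];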
You need to Base64 decode the data first. Data returned from many SOAP web services is Base64 encoded, as is raw data embedded in websites sometimes. It is pretty simple to do, but it is easiest to use a library someone else has created.
Start by including this in your project and importing the .h where you need it: https://github.com/nicklockwood/Base64
NSString *base64String = @"**YOUR BASE64 STRING HERE**";
UIImage *imageOrig = [UIImage imageWithData:[NSData dataFromBase64String:base64String]];
UIImageView *imageView = [[UIImageView alloc]initWithImage:imageOrig];
That should do it. In my previous experience I just put whatever data blob I get over the web service into a string, then create the image using this method, and it works great. There is a good discussion of the details of Base64 encoding and decoding here: http://cocoadev.com/wiki/BaseSixtyFour which is what I used to create my class, but Nick's code on GitHub is better as it is ARC compliant.
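As a side note, if you are targeting iOS 7 or later, NSData can decode Base64 on its own, so the external library is optional. A rough sketch, assuming `base64String` holds the encoded data as above:
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:base64String options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *imageOrig = [UIImage imageWithData:decoded];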
From the web service we get an array of NSNumber. We have to convert it to NSData like this:
NSMutableData *data = [[NSMutableData alloc] initWithCapacity:[strings count]];
for (NSNumber *number in strings) {
    char byte = [number charValue];
    [data appendBytes:&byte length:1];
}
Convert the NSData to a UIImage:
UIImage *imageOrig = [UIImage imageWithData:data];
We can also get JSON out of the NSData:
NSError *error1 = nil;
NSArray *jsonArray = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error1];
if (error1 != nil) {
    NSLog(@"Error parsing JSON.");
} else {
    NSLog(@"Array: %@", jsonArray);
}
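Tying those two pieces together, a rough sketch of the whole flow might look like this (assuming `responseData` is the raw web service response containing a JSON array of numbers, and `imgView` is your UIImageView; both names are placeholders):
NSError *jsonError = nil;
NSArray *numbers = [NSJSONSerialization JSONObjectWithData:responseData options:kNilOptions error:&jsonError];
if (numbers != nil) {
    NSMutableData *imageBytes = [[NSMutableData alloc] initWithCapacity:[numbers count]];
    for (NSNumber *number in numbers) {
        char byte = [number charValue];
        [imageBytes appendBytes:&byte length:1];
    }
    imgView.image = [UIImage imageWithData:imageBytes];
} else {
    NSLog(@"Error parsing JSON: %@", jsonError);
}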
//Use this only if your byte array contains raw RGBA pixel data (width w, height h), not an encoded PNG/JPEG:
CGColorSpaceRef colorSpace=CGColorSpaceCreateDeviceRGB();
CGContextRef bitmapContext=CGBitmapContextCreate(YOUR_BYTE_ARRAY, w, h, 8, 4*w, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault);
CFRelease(colorSpace);
free(YOUR_BYTE_ARRAY);
CGImageRef cgImage=CGBitmapContextCreateImage(bitmapContext);
CGContextRelease(bitmapContext);
UIImage *newimage = [UIImage imageWithCGImage:cgImage];
[yourImageView setImage:newimage];
CGImageRelease(cgImage);
Maybe it will help you...
#import "NSDataAdditions.h"
NSData *dataObj = [NSData dataWithBase64EncodedString:StringImage];
UIImage *Image = [UIImage imageWithData:dataObj];

How do I convert a UIImage to NSData or CFDataRef?

How do I convert a UIImage to NSData or CFDataRef? I need to pass a CFDataRef to ABPersonSetImageData.
This worked for me, for a PNG image. For other image types, I assume you just have to find the corresponding UIImage...Representation method.
UIImage *image = [UIImage imageNamed:@"imageName.png"];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
If you need a CFDataRef for a UIImage, it's just one more line.
CFDataRef imgDataRef = (CFDataRef)imageData;
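Since the question mentions ABPersonSetImageData, here is a rough sketch of passing the data along, assuming `person` is an existing ABRecordRef from the AddressBook framework (under ARC the cast would need __bridge):
CFErrorRef abError = NULL;
if (!ABPersonSetImageData(person, (CFDataRef)imageData, &abError)) {
    NSLog(@"Could not set person image: %@", abError);
}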
You can use this:
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
and simply cast imageData to CFDataRef
CFDataRef imgDataRef = (CFDataRef)imageData;
Or copy the bytes into a new CFData object:
CFDataRef cfdata = CFDataCreate(NULL, [imageData bytes], [imageData length]);
Thank you.
For those wondering what the difference is between NSData and CFData, here is the explanation from the Apple docs:
CFData is “toll-free bridged” with its Cocoa Foundation counterpart, NSData. What this means is that the Core Foundation type is interchangeable in function or method calls with the bridged Foundation object. In other words, in a method where you see an NSData * parameter, you can pass in a CFDataRef, and in a function where you see a CFDataRef parameter, you can pass in an NSData instance. This also applies to concrete subclasses of NSData. See Toll-Free Bridged Types for more information on toll-free bridging.
This explains why casting NSData to CFData works.

Convert UIImage to NSString (and vice-versa)

I need a method to convert a UIImage to an NSString and then convert the NSString back to a UIImage.
Thanks.
For iOS 7 and later:
- (NSString *)imageToNSString:(UIImage *)image
{
    NSData *imageData = UIImagePNGRepresentation(image);
    return [imageData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
}
- (UIImage *)stringToUIImage:(NSString *)string
{
    NSData *data = [[NSData alloc] initWithBase64EncodedString:string
                                                        options:NSDataBase64DecodingIgnoreUnknownCharacters];
    return [UIImage imageWithData:data];
}
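A quick round-trip usage sketch for the two helpers above (assuming they are methods on the same class):
UIImage *original = [UIImage imageNamed:@"imageName.png"];
NSString *encoded = [self imageToNSString:original];
UIImage *restored = [self stringToUIImage:encoded];
NSLog(@"Encoded length: %lu, restored size: %@", (unsigned long)[encoded length], NSStringFromCGSize(restored.size));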
Convert it to a binary stream instead (NSData). This will depend on the format of your UIImage. If it's a JPEG/PNG for instance, you do:
NSData *data1 = UIImageJPEGRepresentation(image, 1.0);
NSData *data2 = UIImagePNGRepresentation(image);
UPDATE: Converting the binary data to NSString is a bad idea, that is why we have the class NSData. The OP wants to be able to send it as a data stream and then reconstruct it again; NSString will not be needed for this.
Convert to PNG or JPEG using UIImagePNGRepresentation or UIImageJPEGRepresentation, which will return an NSData, and then convert the NSData to a string (not sure how you want to do that mapping). How about just dealing with the NSData? You can read/write that to a file.

Race Condition (?) In iPhone Temp File Writing

I'm creating some temporary files in the iPad simulator. To test my file creation, I create the file and then read it back. Here's some code to show this:
-(NSString *) writeToTempFile:(UIImage *)image {
    NSString *path = [self createTemporaryFile];
    NSLog(@"path: %@", path);
    NSData *data = UIImageJPEGRepresentation(image, 1);
    [data writeToFile:path atomically:YES];
    free(data);
    return path;
}
-(UIImage *) readTempFile:(NSString *)path {
    NSData *data = [[NSData alloc] initWithContentsOfFile:path];
    UIImage *image = [[UIImage alloc] initWithData:data];
    return image;
}
I call these methods one after another, before a final function writes out the UIImage to the photo album.
UIImageWriteToSavedPhotosAlbum(image2, self, nil, nil);
The problem is, this always crashes my app the third time it is executed. The first and second times it successfully does all of this and stores to the album. The third time it crashes to the Home screen. Any ideas?
NSData *data = UIImageJPEGRepresentation(image, 1);
[data writeToFile:path atomically:YES];
free(data);
The NSData returned from UIImageJPEGRepresentation is autoreleased; there is no need to free() it. It is also wrong to call free() on any Objective-C object: send it a -release message instead.
Please read through the Memory Management Programming Guide.
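For reference, a corrected sketch of the method from the question would simply drop the free() call and let the autoreleased NSData be cleaned up by the runtime:
-(NSString *) writeToTempFile:(UIImage *)image {
    NSString *path = [self createTemporaryFile];
    NSLog(@"path: %@", path);
    NSData *data = UIImageJPEGRepresentation(image, 1);
    [data writeToFile:path atomically:YES];
    return path; // no free() or release needed: data is autoreleased
}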