CFDataRef to NSData - iPhone

I am having trouble converting a CFDataRef into NSData while using ARC. I have tried both the __bridge_transfer and __bridge casts, but neither works. Could anyone suggest another way of casting between these two types?
I get the following error:
Automatic Reference Counting Issue: Incompatible types casting 'CFDataRef *' (aka 'const struct __CFData **') to 'NSData *' with a __bridge cast

NSData *my_nsdata = (__bridge_transfer NSData *)my_cfdata; // transfers ownership to ARC (-1 on my_cfdata, no CFRelease needed)
or
NSData *my_nsdata = (__bridge NSData *)my_cfdata; // no adjustment of retain counts; the CF side still owns the object
From my blog post here:
http://amattn.com/2011/12/07/arc_best_practices.html
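Incidentally, the 'const struct __CFData **' in the error message suggests the expression being cast is a pointer to a CFDataRef rather than a CFDataRef itself; dereference it (or declare the variable as a plain CFDataRef) before bridging. For context, a minimal sketch under ARC, with CFDataCreate standing in for whichever CF API actually produces your data:
#import <Foundation/Foundation.h>

static NSData *DataFromCFData(void) {
    // Assume the CFDataRef comes from a CF "Create" function, so we own it (+1).
    const UInt8 bytes[] = { 0x01, 0x02, 0x03 };
    CFDataRef cfData = CFDataCreate(kCFAllocatorDefault, bytes, sizeof(bytes));

    // __bridge_transfer hands ownership to ARC; no CFRelease is needed afterwards.
    NSData *owned = (__bridge_transfer NSData *)cfData;

    // If the CFDataRef were merely borrowed (a "Get" function), use __bridge instead,
    // which leaves the retain count untouched and keeps the CF side responsible for it.
    return owned;
}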

Convert NSMutableData to NSData in Swift

I have a method which returns NSMutableData. I need to pass this NSMutableData to another method, but that method expects NSData. I am trying to find a way to convert NSMutableData to NSData, but so far no luck.
In Objective C, it can be done like this
NSData *immutableData = [NSData dataWithData:mutableData];
I am not sure how it can be done in Swift. Can someone help me with this?
Simply pass the NSMutableData to any method that expects NSData. Since it's a subclass, it will work fine.
But if you really want to do the conversion, simply do (Swift 3):
let data = someNSMutableDataVariable.copy() as! NSData
or
let data = NSData(data: someNSMutableDataVariable as Data)
It may make sense to update your code to use Data instead of NSMutableData or NSData. Just like using String instead of NSString and NSMutableString.

How to convert UIImage to JSON file in iPhone?

I have been using NSJSONSerialization class for converting fields of my object to JSON. Sadly only NSString, NSNumber, NSArray, NSDictionary, or NSNull types are supported.
As my object has one additional field that is a UIImage, I am at a loss as to how to deal with it. I am sure many people have encountered this common problem, so what is the best method to approach it?
You can Base64-encode the UIImage data and add it to the JSON object.
To get NSData from a UIImage, you can use UIImagePNGRepresentation or UIImageJPEGRepresentation.
The code looks like this:
NSData *imageData = UIImagePNGRepresentation(image);
NSString *base64encodedStr = [imageData base64EncodedStringWithOptions:0];
[dict setObject:base64encodedStr forKey:@"myImage"];
// then convert dict to a JSON object.
To restore the UIImage, just parse the JSON object and Base64-decode the data.
Hope this can help you.
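To make the round trip concrete, here is a minimal sketch using NSData's built-in Base64 methods (iOS 7 and later); the myImage key is just the placeholder from the snippet above:
#import <UIKit/UIKit.h>

// Encode: UIImage -> PNG data -> Base64 string -> JSON data.
static NSData *JSONDataWithImage(UIImage *image, NSError **error) {
    NSData *imageData = UIImagePNGRepresentation(image);
    NSString *base64encodedStr = [imageData base64EncodedStringWithOptions:0];
    NSDictionary *dict = @{ @"myImage" : base64encodedStr };
    return [NSJSONSerialization dataWithJSONObject:dict options:0 error:error];
}

// Decode: JSON data -> Base64 string -> NSData -> UIImage.
static UIImage *ImageFromJSONData(NSData *jsonData, NSError **error) {
    NSDictionary *dict = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:error];
    NSData *imageData = [[NSData alloc] initWithBase64EncodedString:dict[@"myImage"] options:0];
    return imageData ? [UIImage imageWithData:imageData] : nil;
}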
You could convert your image's data to a string and then write that string. Note that raw PNG bytes are not valid UTF-8, so the string needs to be a Base64 encoding rather than a direct UTF-8 interpretation of the data.
NSData *imageData = UIImagePNGRepresentation(image);
NSString *base64encodedStr = [imageData base64EncodedStringWithOptions:0];
// put base64encodedStr into a dictionary and serialize it with
// [NSJSONSerialization dataWithJSONObject:options:error:]

How do I convert a UIImage to NSData or CFDataRef?

How do I convert a UIImage to NSData or CFDataRef? I need to pass a CFDataRef to ABPersonSetImageData.
This worked for me, for a PNG image. For other image types, I assume you just have to find the corresponding UIImage...Representation method.
UIImage *image = [UIImage imageNamed:@"imageName.png"];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
If you need a CFDataRef for the UIImage, it's just one more line (under ARC you need the explicit __bridge cast):
CFDataRef imgDataRef = (__bridge CFDataRef)imageData;
You can use this:
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
and simply cast imageData to CFDataRef:
CFDataRef imageDataRef = (__bridge CFDataRef)imageData;
CFDataRef cfdata = CFDataCreate(NULL, [imageData bytes], [imageData length]); // copies the bytes; release with CFRelease when done
Thank you.
For those wondering what the difference is between NSData and CFData, here is the explanation from the Apple docs:
CFData is “toll-free bridged” with its Cocoa Foundation counterpart, NSData. What this means is that the Core Foundation type is interchangeable in function or method calls with the bridged Foundation object. In other words, in a method where you see an NSData * parameter, you can pass in a CFDataRef, and in a function where you see a CFDataRef parameter, you can pass in an NSData instance. This also applies to concrete subclasses of NSData. See Toll-Free Bridged Types for more information on toll-free bridging.
This explains why casting NSData to CFData works.
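A quick illustration of that quote, as a hedged sketch under ARC (the explicit __bridge cast satisfies the compiler but costs nothing at runtime, since both types describe the same object):
#import <Foundation/Foundation.h>

static void TollFreeBridgingDemo(void) {
    NSData *nsData = [@"hello" dataUsingEncoding:NSUTF8StringEncoding];

    // View the same object through the Core Foundation type; no copy is made.
    CFDataRef cfData = (__bridge CFDataRef)nsData;
    CFIndex cfLength = CFDataGetLength(cfData);                // CF function on an NSData object
    NSUInteger nsLength = [(__bridge NSData *)cfData length];  // and back again
    NSLog(@"%ld == %lu", (long)cfLength, (unsigned long)nsLength);
}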

send UIImage as bytes?

I have a UIImageView which contains an image taken with the phone's camera.
The next thing I want to do is send this image to a server. I already have the socket connection set up using CFStream.
But I really can't get this to work. I have a UIImage object which I want to send somehow over my TCP connection...
I tried to send the NSData like so, but it gives me an error:
Code:
// Getting image from OpenCV and converting it back to UIImage
NSData *data = UIImagePNGRepresentation([UIImage imageWithCVMat:tpl]);
uint32_t length = (uint32_t)htonl([data length]);
// Send the image? (Doesn't work)
[outputStream write:[data bytes] maxLength:length];
error: Semantic Issue: Cannot initialize a parameter of type 'const uint8_t *' (aka 'const unsigned char *') with an rvalue of type 'const void *'
So, does anyone have any idea how to send an image from a UIImage(View) as a byte array through the output stream?
In this case you can safely cast:
[outputStream write:(const uint8_t *)[data bytes] maxLength:length];
since you know where the (void*) data comes from.
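Building on that cast, a minimal sketch of sending the whole payload; -write:maxLength: can write fewer bytes than requested, so it is usually called in a loop, and the 4-byte length prefix in network byte order is just one common framing convention rather than anything NSOutputStream requires:
#import <Foundation/Foundation.h>
#include <arpa/inet.h>   // htonl

// Writes `data` to the stream, preceded by a big-endian 4-byte length prefix.
// Returns NO if the stream reports an error or closes early.
static BOOL WriteImageData(NSOutputStream *outputStream, NSData *data) {
    uint32_t lengthPrefix = htonl((uint32_t)[data length]);
    NSMutableData *payload = [NSMutableData dataWithBytes:&lengthPrefix length:sizeof(lengthPrefix)];
    [payload appendData:data];

    const uint8_t *bytes = (const uint8_t *)[payload bytes];
    NSUInteger remaining = [payload length];
    while (remaining > 0) {
        NSInteger written = [outputStream write:bytes maxLength:remaining];
        if (written <= 0) {
            return NO; // stream error or end of stream
        }
        bytes += written;
        remaining -= (NSUInteger)written;
    }
    return YES;
}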

ABPersonSetImageData doesn't display image

I use ABUnknownPersonViewController to display a contact view.
I try to set an image with:
NSData *dataRef = UIImagePNGRepresentation([UIImage imageNamed:@"contact3.png"]);
ABPersonSetImageData(newPersonViewController.displayedPerson, (CFDataRef)dataRef, nil);
It doesn't work and I don't know why. Any ideas?
You can't just cast an NSData object to a CFDataRef; as noted in the docs, a CFDataRef is a "reference to an immutable CFData object", which is not the same as an NSData instance:
typedef const struct __CFData *CFDataRef;
To create the CFDataRef from the NSData instance, you need to use the CFDataCreate function, passing the bytes and length:
NSData *dataRef = UIImagePNGRepresentation([UIImage imageNamed:@"contact3.png"]);
CFDataRef dr = CFDataCreate(NULL, [dataRef bytes], [dataRef length]);
Note also that since you create the object yourself, you must also release it, following the Core Foundation Ownership Policy; you use the CFRelease function to release ownership of the Core Foundation object:
CFRelease(dr);
This is similar to memory management in Cocoa: once the retain count of the Core Foundation object reaches zero, it is deallocated.
Edit: Stefan was completely right in his comment: NSData and CFData are also toll-free bridged on the iPhone with Cocoa Touch, just as with Cocoa, so my original answer was wrong. My fault; I should have edited it earlier.
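Given that bridging, the ARC-era version needs only an explicit __bridge cast rather than CFDataCreate; a minimal sketch (error handling added for illustration):
#import <AddressBook/AddressBook.h>
#import <UIKit/UIKit.h>

static BOOL SetPersonImage(ABRecordRef person, UIImage *image) {
    NSData *imageData = UIImagePNGRepresentation(image);
    CFErrorRef error = NULL;
    // Toll-free bridging: the NSData is passed where a CFDataRef is expected.
    bool ok = ABPersonSetImageData(person, (__bridge CFDataRef)imageData, &error);
    if (!ok && error != NULL) {
        NSLog(@"ABPersonSetImageData failed: %@", (__bridge NSError *)error);
        CFRelease(error);
    }
    return ok ? YES : NO;
}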