There are two standard ways to import images (from the camera and from the photo album), but I also want to get images from the iPhone phonebook (the user's contacts). What should I do?
The following can be used to get the image from the AddressBook:
NSData *imgData = (NSData *)ABPersonCopyImageData(person);
UIImage *img = [UIImage imageWithData:imgData];
where person is of type ABRecordRef. Since CFData and NSData are toll-free bridged, you can simply cast the CFDataRef to NSData and get the image.
Here is the Tutorial
http://adeem.me/blog/2009/04/02/get-image-from-contact-stored-in-addressbook/
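Putting the pieces together, here is a minimal sketch of fetching the first contact's image (pre-ARC, manual memory management, matching the era of the code above). `ImageForFirstContact` is a hypothetical helper name, not part of any framework:

```objectivec
#import <AddressBook/AddressBook.h>
#import <UIKit/UIKit.h>

// Sketch: return the image of the first contact that has one, or nil.
UIImage *ImageForFirstContact(void) {
    ABAddressBookRef addressBook = ABAddressBookCreate();
    NSArray *people = (NSArray *)ABAddressBookCopyArrayOfAllPeople(addressBook);
    UIImage *img = nil;
    for (id personObj in people) {
        ABRecordRef person = (ABRecordRef)personObj;
        if (ABPersonHasImageData(person)) {
            NSData *imgData = (NSData *)ABPersonCopyImageData(person);
            img = [UIImage imageWithData:imgData];
            [imgData release]; // Copy rule: we own the returned data
            break;
        }
    }
    [people release];
    CFRelease(addressBook);
    return img;
}
```

Note that functions containing "Copy" or "Create" follow the Core Foundation ownership rules, so the returned data must be released.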
If by "phonebook" you mean the AddressBook framework for accessing contacts data, try using Erica Sadun's AB Cocoa wrapper. It includes a sample project, so it will be easy to get started.
On the iPhone I saved an image to the photo library with modifications (appending data after the image code), using the asset library to save the NSData directly to the photo library. Now I want to read the image back from the photo library as NSData. If I read it as a UIImage, it changes the data inside the image.
How can I read the photo from the photo library as NSData? I tried looking into the reference URL in 4.1, but no luck.
Edit: I edited to explain why I'm saving it as NSData. The URL method in the answers works if you just want pure NSData, but it does not help in the context I am asking about.
Edit 2: The answer with getBytes was the one that did it for me. The code I used was:
NSError *error = nil;
uint8_t *bytes = malloc([representation size]);
NSUInteger length = [representation getBytes:bytes fromOffset:0 length:[representation size] error:&error];
(Note that the buffer type is uint8_t, and the buffer should be freed once you have copied the bytes into an NSData.)
This was able to get me everything inside the file which gave me the image code PLUS what I added in NSData form.
Edit 3:
Is there a way to do this now, in iOS 10, with PHPhotoLibrary, which replaces AssetsLibrary? I need the image as NSData containing the original RAW image with no modifications.
You should look into the getBytes:fromOffset:length:error: method of ALAssetRepresentation.
This gives you the original RAW data of your image, and it also makes it possible to use a buffer to process the image in chunks instead of reading it into memory all at once.
And of course you can always build an NSData from the bytes you read.
Cheers,
Hendrik
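For the iOS 10 / PHPhotoLibrary question in Edit 3, a sketch using the Photos framework's PHImageManager might look like the following. Whether this preserves bytes appended after the image code depends on how the file was originally written, so treat this as a starting point rather than a guaranteed answer:

```objectivec
#import <Photos/Photos.h>

// Sketch: request the original, unmodified file data for a PHAsset.
void FetchOriginalData(PHAsset *asset) {
    PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
    options.version = PHImageRequestOptionsVersionOriginal; // unadjusted bytes
    [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                      options:options
                                                resultHandler:^(NSData *imageData,
                                                                NSString *dataUTI,
                                                                UIImageOrientation orientation,
                                                                NSDictionary *info) {
        // imageData holds the asset's original file data
        NSLog(@"Got %lu bytes (%@)", (unsigned long)imageData.length, dataUTI);
    }];
}
```

Note that requestImageDataForAsset:options:resultHandler: was later deprecated (iOS 13) in favor of requestImageDataAndOrientationForAsset:options:resultHandler:.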
If you already have your ALAsset, you should be able to get an NSData of the default representation using the following:
NSData *data = [NSData dataWithContentsOfURL:[[asset defaultRepresentation] url]];
How can we pull images from a website and display them in our app using the iPhone SDK?
Also, how can we embed audio in our iPhone app using the iPhone SDK?
Can you please point me to any tutorials / videos related to this?
As @PeteRossi said, you can use this code:
NSData * data = [NSData dataWithContentsOfURL:urlOfImage];
UIImage * image = [UIImage imageWithData: data];
However, note that dataWithContentsOfURL: is synchronous and blocks the calling thread. I would look into using ASIHTTPRequest for downloading things from the web: it contains powerful classes for downloading things asynchronously, so your app won't have to wait for the image to finish downloading before it continues.
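If you'd rather not pull in a library, a simple sketch that moves the blocking download off the main thread with GCD looks like this. `urlOfImage` is assumed to be a valid NSURL and `imageView` is a hypothetical UIImageView:

```objectivec
// Download off the main thread so the UI stays responsive.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *data = [NSData dataWithContentsOfURL:urlOfImage];
    UIImage *image = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit calls must happen on the main thread
        imageView.image = image;
    });
});
```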
Pulling an image from a website:
NSData * data = [NSData dataWithContentsOfURL:urlOfImage];
UIImage * image = [UIImage imageWithData: data];
As for your second question, it depends on what you are trying to do. Here is an overview:
http://developer.apple.com/technologies/ios/audio-and-video.html
I'm working on an iPhone App that uses the camera to take pictures, then I'm saving them to the Applications Documents directory. I'm using the following code to convert the UIImage to NSData,
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
Then I write the NSData using
[imageData writeToFile:path atomically:NO]
It all works. The problem is that UIImagePNGRepresentation() is really slow: it takes 8-9 seconds on my 3G to convert the image to NSData. That seems wrong to me. Does anyone have any experience with this? Is the function just slow, or am I doing something terribly wrong?
Thanks
Are you sure you want to save pictures captured with the camera as PNG?
JPEG is a more appropriate format for photographs. Additionally, it's likely much faster!
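As a sketch, the JPEG version of the asker's save code would look like this. The 0.8 quality setting and the `documentsDirectory` variable are assumptions for illustration:

```objectivec
// Encode as JPEG instead of PNG; 0.0 = smallest file, 1.0 = best quality.
NSData *imageData = UIImageJPEGRepresentation(image, 0.8f);
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"photo.jpg"];
[imageData writeToFile:path atomically:NO];
```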
I'm trying to post an image to TwitPic.com using their API.
However, I have never posted an image using an HTTP POST (or any other way) before.
Can anybody enlighten me as to how I can post the NSData from a UIImage using their API?
TwitPic expects multipart/form-data. There is a great open source library called ASIHTTPRequest; you can use its API to post your image as multipart/form-data.
See the link below for samples:
http://allseeing-i.com/ASIHTTPRequest/How-to-use
Use the UIImagePNGRepresentation function or its JPEG equivalent to turn a UIImage into an NSData:
http://developer.apple.com/iphone/library/documentation/UIKit/Reference/UIKitFunctionReference/Reference/reference.html#//apple_ref/c/func/UIImagePNGRepresentation
@Jamie:
Try:
[request setData:twitpicImage forKey:@"media"];
If the image view is actually displaying an image that is stored in a file on disk, use:
[request setFile:theImagePath forKey:@"media"];
This way, ASIHTTPRequest will stream the image data from disk, so you don't have to keep an NSData instance in memory.
Ta
Ben
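Pulling the pieces of this answer together, a full upload might be sketched as below. The endpoint URL and the field names ("media", "username", "password") follow the old TwitPic API and may well be out of date, so check their documentation before relying on them:

```objectivec
#import "ASIFormDataRequest.h"

// Sketch: multipart/form-data upload of a UIImage with ASIFormDataRequest.
NSData *twitpicImage = UIImageJPEGRepresentation(image, 0.8f);
NSURL *url = [NSURL URLWithString:@"http://twitpic.com/api/upload"];
ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:url];
[request setPostValue:username forKey:@"username"]; // plain text fields
[request setPostValue:password forKey:@"password"];
[request setData:twitpicImage forKey:@"media"];     // binary file field
[request startAsynchronous]; // or startSynchronous for a blocking call
```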
For the image itself, use setData:forKey: rather than setPostValue:forKey:.
I am coding an iPhone application where images are transferred from one iPhone to another over Bluetooth. How do I archive an image, send it to another iPhone, and then un-archive it? Archiving the image directly using NSKeyedArchiver doesn't work.
You can turn UIImage objects into an NSData representation using the UIKit functions UIImageJPEGRepresentation or UIImagePNGRepresentation.
You can turn the NSData representation back into a UIImage using the UIImage convenience method imageWithData: or the initializer initWithData:.
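A round-trip sketch of the approach, with the transport step elided since it depends on your Bluetooth session API:

```objectivec
// Sender: flatten the image to bytes suitable for transmission.
NSData *payload = UIImagePNGRepresentation(image); // or UIImageJPEGRepresentation(image, 0.8f)
// ... send `payload` over your Bluetooth session ...

// Receiver: rebuild the image from the received bytes.
UIImage *received = [UIImage imageWithData:payload];
```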