NSData to UIImage - iPhone

I'm trying to save a UIImage and then retrieve it and display it. I've been successful saving an image in the bundle, retrieving it from there, and displaying it. My problem comes when I try to save an image to disk (converting it to NSData) and then retrieve it. I convert the image to NSData like so...
NSData* topImageData = UIImageJPEGRepresentation(topImage, 1.0);
then I write it to disk like so...
[topImageData writeToFile:topPathToFile atomically:NO];
then I tried to retrieve it like so...
topSubView.image = [[UIImage alloc] initWithContentsOfFile:topPathToFile];
which returns no image (the size is zero). So then I tried...
NSData *data = [[NSData alloc] initWithContentsOfFile:topPathToFile];
topSubView.image = [[UIImage alloc] initWithData:data];
to no avail. When I step through the debugger I do see that data contains the correct number of bytes, but I'm confused as to why my image is not being created. Am I missing a step? Do I need to do something with the NSData before converting it to an image?

Try this code. It worked for me.
Saving the data:
Create a path to save the image:
let libraryDirectory = NSSearchPathForDirectoriesInDomains(.libraryDirectory, .userDomainMask, true)[0]
let libraryURL = URL(fileURLWithPath: libraryDirectory, isDirectory: true)
let fileURL = libraryURL.appendingPathComponent("image.data")
Convert the image to a Data object and save it to the file:
let data = UIImageJPEGRepresentation(myImage, 1.0)
try? data?.write(to: fileURL)
Retrieving the saved image:
let newImage = UIImage(contentsOfFile: fileURL.relativePath)
Create an image view and load it with the retrieved image:
let imageView = UIImageView(image: newImage)
self.view.addSubview(imageView)

You should be able to use the class methods of UIImage:
[UIImage imageWithContentsOfFile:topPathToFile];
OR
[UIImage imageWithData:data];
Did that not work?
Hope this helps!

Just in case this helps someone: from iOS 6 we have the imageWithData:scale: method. To get a UIImage with the right scale from an NSData object, use that method, declaring the scale you used to store the original image. For example:
CGFloat screenScale = [[UIScreen mainScreen] scale];
UIImage *image = [UIImage imageWithData:myimage scale:screenScale];
Here, myimage is the NSData object where you stored the original image. In the example, I used the scale of the screen. If you use another scale for the original image, use that value instead.

Check this out: http://www.nixwire.com/getting-uiimage-to-work-with-nscoding-encodewithcoder/.
It has exactly what I think is the solution to your problem. You can't just make an image straight out of NSData, even though that's what common sense suggests; you have to go through UIImagePNGRepresentation. Let me know if this helps.
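If it helps, here is a minimal Swift sketch of that save-and-reload round trip; topImage and fileURL are just placeholder names, not from the question:
import UIKit

// A sketch of the PNG round trip, assuming `topImage` is a UIImage and `fileURL` is writable.
func roundTrip(topImage: UIImage, fileURL: URL) -> UIImage? {
    guard let pngData = UIImagePNGRepresentation(topImage) else { return nil } // encode as PNG
    try? pngData.write(to: fileURL)                                            // write to disk
    guard let savedData = try? Data(contentsOf: fileURL) else { return nil }   // read it back
    return UIImage(data: savedData)                                            // rebuild the image
}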

Use an if-let block with Data to prevent an app crash and ensure safe execution of the code, as the UIImagePNGRepresentation function returns an optional value.
if let img = UIImage(named: "TestImage.png") {
    if let data: Data = UIImagePNGRepresentation(img) {
        // Handle operations with data here...
    }
}
Note: Data is a Swift 3+ class. Use Data instead of NSData with Swift 3+.
Generic image operations (for both PNG and JPEG):
if let img = UIImage(named: "TestImage.png") { //UIImage(named: "TestImage.jpg")
    if let data: Data = UIImagePNGRepresentation(img) {
        handleOperationWithData(data: data)
    } else if let data: Data = UIImageJPEGRepresentation(img, 1.0) {
        handleOperationWithData(data: data)
    }
}
*******
func handleOperationWithData(data: Data) {
    // Handle operations with data here...
    if let image = UIImage(data: data) {
        // Use image...
    }
}
By using an extension:
extension UIImage {
    var pngRepresentationData: Data? {
        return UIImagePNGRepresentation(self)
    }
    var jpegRepresentationData: Data? {
        return UIImageJPEGRepresentation(self, 1.0)
    }
}
*******
if let img = UIImage(named: "TestImage.png") { //UIImage(named: "TestImage.jpg")
    if let data = img.pngRepresentationData {
        handleOperationWithData(data: data)
    } else if let data = img.jpegRepresentationData {
        handleOperationWithData(data: data)
    }
}

How to create a NSImage from a NSData with Swift

Sorry to ask such a simple question, but this is my first program in Swift... I am trying to create an NSImage from an NSData object that contains a JPEG image I loaded from disk (the URLs are in an array named chosenFiles[]).
The compiler issues an error on the second line and I'm stuck: 'NSData' is not implicitly convertible to 'Data'; did you mean to use 'as' to explicitly convert?
Thank you
let imageAsNSData = NSData(contentsOf: chosenFiles[0]) // UIKit/UIImage for iOS not MacOS !
let imageAsNSImage = NSImage(data: imageAsNSData)
if (imageAsNSImage) {
// image could be created from NSData
//
} else {
// image could NOT be created from NSData
//
}
----- EDIT -----
I tried
let imageAsNSImage = NSImage(data: imageAsNSData! as Data)
if (imageAsNSImage != nil) {
which seems to work (at least for the compiler). Am I correct?
You can use Swift's Data type to get your image:
do {
let imageData = try Data(contentsOf: chosenFiles[0])
NSImage(data: imageData)
} catch {
print("Unable to load data: \(error)")
}
In Swift 3, NSData is replaced by Data. When you are downloading an image from a URL, store it as Data, not NSData.
imageView.image = NSImage.init(data: data! as Data)
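If it helps, here is a rough sketch of that download-and-display flow using Data throughout; the URL is a made-up example, not from the question:
import AppKit

// A sketch assuming `imageURL` points at the remote picture (hypothetical URL).
let imageURL = URL(string: "https://example.com/picture.jpg")!

URLSession.shared.dataTask(with: imageURL) { data, _, error in
    guard let data = data, error == nil else { return }
    let image = NSImage(data: data)      // Data works directly, no NSData bridging needed
    DispatchQueue.main.async {
        // use `image` here, e.g. assign it to an NSImageView
    }
}.resume()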

Saving UIImage Into Core Data Always Returns Nil when using ImagePickerView - Swift 3

I've been trying to save a single picture to an entity containing a single property titled "prof" and configured as a Binary Data type.
I go through the hoops to select a picture from a UIImagePickerController, then I call up my method that handles saving the picture in Core Data in the desired NSData format.
My issue stems from loading the picture: in my loadImage method the entity for the image is not nil, meaning it does exist. However, I get nil when I try to parse the fetched NSData into a UIImage to recreate the picture and then be able to use it.
Now, I am using Swift 3 and Xcode 8; so far all the troubleshooting questions on here have the solution of casting the NSData to UIImage like so:
let image : UIImage = UIImage(data: imageData)
However, Xcode gives me a compiler error when I do this, and instead forces me to cast it as:
let image : UIImage = UIImage(data: (imageData as Data?)!)
which is where I get the nil that's throwing my flow up in the air... I've tried saving the data in many different ways, but still nothing.
If anyone could go through my following methods and see if I might be doing something wrong in the saving part, or in the formatting of the NSData in the fetch method... anything would help.
My configuration:
- the prof property has "Allow external storage" set to true
- my persistent store is seeded blank at app installation, meaning all the needed properties are already set up when the app is launched for the first time, but obviously set to nil until changed or modified by my various data flows
- there is no other picture entity in my data model; this is the only one
func saveProfilePicture(_ pic: UIImage) {
    let picData = UIImagePNGRepresentation(pic)
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = (records.first)
        first?.setValue(picData, forKey: "prof")
        try context.save()
    } catch let err {
        print(err)
    }
}
func getProfilePicture() -> UIImage? {
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    var image : UIImage?
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = (records.first?.prof) as NSData
        if let parsedImage : UIImage = UIImage(data: (first as Data?)!) as? UIImage {
            image = parsedImage
        }
    } catch let err {
        print(err)
    }
    return image
}
EDIT
The solution was found by noticing that in Swift 3 the UIImage class conforms to NSCoding, which is what Core Data's Transformable attribute type needs. Swapping my property type for the image from Binary Data to Transformable made it possible to save the UIImage directly into Core Data without converting it to another data type.
func saveProfilePicture(_ image: UIImage) {
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = (records.first)
        first?.prof = image
        print(first)
        coreDataManager.saveData()
    } catch let err {
        print(err)
    }
}
func loadProfilePicture() -> UIImage? {
    var image : UIImage?
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first
        if let img = first?.prof {
            image = img as? UIImage
        } else {
            print("no image")
        }
    } catch let err {
        print(err)
    }
    return image
}

Realm complains 'Binary too big'

I want to save an image in Realm, but it says that the binary is too big. I know that the NSData should be less than 16MB. So how can I handle this issue? Is there any way to resize the NSData?
I had the same problem as well; after doing some research I fixed my error with help from the Realm docs. Here is the link: https://realm.io/docs/tutorials/scanner/#overview
The helpful code snippet:
func data() -> Data {
    var imageData = UIImagePNGRepresentation(self)
    // Resize the image if it exceeds the 2MB API limit
    if (imageData?.count)! > 2097152 {
        let oldSize = self.size
        let newSize = CGSize(width: 800, height: oldSize.height / oldSize.width * 800)
        let newImage = self.resizeImage(self, size: newSize)
        imageData = UIImageJPEGRepresentation(newImage, 0.7)
    }
    return imageData!
}
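Note that the snippet above appears to live in a UIImage extension (it uses self) and calls a resizeImage helper that the tutorial doesn't show. A possible implementation, purely as a sketch and not the tutorial's actual code, could look like this:
extension UIImage {
    // Hypothetical helper matching the call self.resizeImage(self, size: newSize) above.
    func resizeImage(_ image: UIImage, size: CGSize) -> UIImage {
        UIGraphicsBeginImageContextWithOptions(size, false, image.scale)
        image.draw(in: CGRect(origin: .zero, size: size))
        let resized = UIGraphicsGetImageFromCurrentImageContext() ?? image
        UIGraphicsEndImageContext()
        return resized
    }
}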
To add it to Realm, the code can be something like this:
@IBOutlet weak var thumbImg: UIImageView!
let picture = Image()
let imageDownSizing = thumbImg.image?.data()
//thumbImg.image is of type UIImage, so convert UIImage -> Data.
//picture.image is of type Data.
//picture.image = UIImagePNGRepresentation(thumbImg.image!) // full-size PNG; replaced by the down-sized data below
picture.image = imageDownSizing
let item = Item()
item.toImage = picture
do {
    let realm = try Realm()
    try realm.write {
        realm.add(item)
    }
} catch {
    print("Error saving context \(error)")
}
You can reference parts of the file with NSFileHandle and its offsetInFile method, e.g. in 16MB chunks.
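A rough sketch of that chunked approach (the file path here is just a placeholder):
import Foundation

// Read a large file in 16MB chunks with FileHandle; the path is illustrative, not from the question.
let fileURL = URL(fileURLWithPath: "/path/to/large-image.png")
let chunkSize = 16 * 1024 * 1024

if let handle = try? FileHandle(forReadingFrom: fileURL) {
    var chunk = handle.readData(ofLength: chunkSize)
    while !chunk.isEmpty {
        // store `chunk` somewhere (e.g. one Realm object per chunk), then read the next one
        chunk = handle.readData(ofLength: chunkSize)
    }
    handle.closeFile()
}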
You can use
let imageData = image?.jpegData(compressionQuality: 0.5)
which will get your image in Data format and compress its size.

Error when trying to save image in NSUserDefaults using Swift

When I try to save an image in NSUserDefaults, the app crashes with this error.
Why? Is it possible to save an image with NSUserDefaults? If not, then how do I save the image?
Image...
Code...
var image1:UIImage = image1
var save1: NSUserDefaults = NSUserDefaults.standardUserDefaults()
save1.setObject(Info.Image1, forKey: "Image1")
save1.synchronize()
Log error...
libc++abi.dylib: terminating with uncaught exception of type NSException
(lldb)
NSUserDefaults isn't just a big truck you can throw anything you want onto. It's a series of tubes which only accept specific types.
What you can save to NSUserDefaults:
NSData
NSString
NSNumber
NSDate
NSArray
NSDictionary
If you're trying to save anything else to NSUserDefaults, you typically need to archive it to an NSData object and store it (keeping in mind you'll have to unarchive it later when you need it back).
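For the archiving route specifically, a minimal sketch might look like this (it assumes the iOS 11+ secure-coding APIs, and "avatar" is just an example key name):
import UIKit

// Archive a UIImage into Data for UserDefaults, and unarchive it later.
func storeArchivedImage(_ image: UIImage) {
    if let archived = try? NSKeyedArchiver.archivedData(withRootObject: image, requiringSecureCoding: true) {
        UserDefaults.standard.set(archived, forKey: "avatar")
    }
}

func loadArchivedImage() -> UIImage? {
    guard let archived = UserDefaults.standard.data(forKey: "avatar") else { return nil }
    return try? NSKeyedUnarchiver.unarchivedObject(ofClass: UIImage.self, from: archived)
}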
There are two ways to turn a UIImage object into data. There are functions for creating a PNG representation of the image or a JPEG representation of the image.
For the PNG:
let imageData = UIImagePNGRepresentation(yourImage)
For the JPEG:
let imageData = UIImageJPEGRepresentation(yourImage, 1.0)
where the second argument is a CGFloat representing the compression quality, with 0.0 being the lowest quality and 1.0 being the highest quality. Keep in mind that if you use JPEG, each time you compress and uncompress, if you're using anything but 1.0, you're going to degrade the quality over time. PNG is lossless so you won't degrade the image.
To get the image back out of the data object, there's an init method for UIImage to do this:
let yourImage = UIImage(data:imageData)
This method will work no matter how you converted the UIImage object to data.
In newer versions of Swift, the functions have been renamed and reorganized, and are now invoked as:
For the PNG:
let imageData = yourImage.pngData()
For the JPEG:
let imageData = yourImage.jpegData(compressionQuality: 1.0)
Xcode will autocorrect the old versions for you, though.
In order to save a UIImage in NSUserDefaults, you need to convert the UIImage into NSData using UIImageJPEGRepresentation(image, 1), then save it in NSUserDefaults. On the other side, while retrieving it in another view controller (within that same application), you need to get the NSData and then convert it back into a UIImage.
Here I am writing a small code snippet in Swift to demonstrate this. I tried this in Xcode 6.4.
/*Code to save UIImage in NSUserDefaults on viewWillDisappear() event*/
override func viewWillDisappear(animated: Bool)
{
    super.viewWillDisappear(animated)
    //Get some image in image variable of type UIImage.
    var image = ....
    let defaults = NSUserDefaults.standardUserDefaults()
    var imgData = UIImageJPEGRepresentation(image, 1)
    defaults.setObject(imgData, forKey: "image")
}
/*Code to get Image and show it on imageView at viewWillAppear() event*/
override func viewWillAppear(animated: Bool)
{
    super.viewWillAppear(animated)
    let defaults = NSUserDefaults.standardUserDefaults()
    if let imgData = defaults.objectForKey("image") as? NSData
    {
        if let image = UIImage(data: imgData)
        {
            //set image in UIImageView imgSignature
            self.imgSignature.image = image
            //remove cache after fetching image data
            defaults.removeObjectForKey("image")
        }
    }
}
Updated for Swift 3:
If you want to save an image in UserDefaults, use the lines of code below to save and retrieve the image.
To save the image in UserDefaults:
if let image = response.result.value {
    UserDefaults.standard.register(defaults: ["key": UIImageJPEGRepresentation(image, 1.0)!])
    UserDefaults.standard.set(UIImageJPEGRepresentation(image, 1.0), forKey: "key")
}
To retrieve the image from UserDefaults and set it to an image view:
var mLogoImageView = UIImageView()
if let imageData = UserDefaults.standard.value(forKey: "key") as? Data {
    let imageFromData = UIImage(data: imageData)
    mLogoImageView.image = imageFromData!
}
Enjoy..!
In Swift 4 and 5:
Set:
setImage(image: UIImage(named: "12")!)
func setImage(image: UIImage) {
    UserDefaults.standard.set(image.jpegData(compressionQuality: 1.0), forKey: "key")
}
Get
func getImage() -> UIImage? {
    if let imageData = UserDefaults.standard.value(forKey: "key") as? Data {
        if let imageFromData = UIImage(data: imageData) {
            return imageFromData
        }
    }
    return nil
}

UIImageWriteToSavedPhotosAlbum save as PNG with transparency?

I'm using UIImageWriteToSavedPhotosAlbum to save a UIImage to the user's photo album. The problem is that the image doesn't have transparency and is a JPG. I've got the pixel data set correctly to have transparency, but there doesn't seem to be a way to save in a transparency-supported format. Ideas?
EDIT: There is no way to accomplish this; however, there are other ways to deliver PNG images to the user. One of these is to save the image in the Documents directory (as detailed below). Once you've done that, you can email it, save it in a database, etc. You just can't get it into the photo album (for now) unless it is a lossy, non-transparent JPG.
As pointed out in this SO question, there is a simple way to save PNGs to your Photo Albums:
UIImage* image = ...; // produce your image
NSData* imageData = UIImagePNGRepresentation(image); // get png representation
UIImage* pngImage = [UIImage imageWithData:imageData]; // rewrap
UIImageWriteToSavedPhotosAlbum(pngImage, nil, nil, nil); // save to photo album
This is a problem I have noticed before and reported on the Apple Developer Forums about a year ago. As far as I know it is still an open issue.
If you have a moment, please take the time to file a feature request at Apple Bug Report. If more people report this issue, it is more likely that Apple will fix this method to output non-lossy, alpha-capable PNG.
EDIT
If you can compose your image in memory, I think something like the following would work or at least get you started:
- (UIImage *) composeImageWithWidth:(NSInteger)_width andHeight:(NSInteger)_height {
    CGSize _size = CGSizeMake(_width, _height);
    UIGraphicsBeginImageContext(_size);
    // Draw image with Quartz 2D routines over here...
    UIImage *_compositeImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return _compositeImage;
}
//
// cf. https://developer.apple.com/iphone/library/documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/FilesandNetworking/FilesandNetworking.html#//apple_ref/doc/uid/TP40007072-CH21-SW20
//
- (BOOL) writeApplicationData:(NSData *)data toFile:(NSString *)fileName {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    if (!documentsDirectory) {
        NSLog(@"Documents directory not found!");
        return NO;
    }
    NSString *appFile = [documentsDirectory stringByAppendingPathComponent:fileName];
    return ([data writeToFile:appFile atomically:YES]);
}
// ...
NSString *_imageName = @"myImageName.png";
NSData *_imageData = [NSData dataWithData:UIImagePNGRepresentation([self composeImageWithWidth:100 andHeight:100])];
if (![self writeApplicationData:_imageData toFile:_imageName]) {
    NSLog(@"Save failed!");
}
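For reference, a rough Swift equivalent of the Documents-directory approach above; the function and file names are illustrative, not from the answer:
import UIKit

// Write a transparent image as PNG into the app's Documents directory.
func savePNGToDocuments(_ image: UIImage, named fileName: String) -> Bool {
    guard let pngData = image.pngData() else { return false }
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let fileURL = documents.appendingPathComponent(fileName)
    return (try? pngData.write(to: fileURL, options: .atomic)) != nil
}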
In Swift 5:
func pngFrom(image: UIImage) -> UIImage {
    let imageData = image.pngData()!
    let imagePng = UIImage(data: imageData)!
    return imagePng
}
I created an extension of UIImage with safe unwrapping:
Extension
extension UIImage {
    func toPNG() -> UIImage? {
        guard let imageData = self.pngData() else { return nil }
        guard let imagePng = UIImage(data: imageData) else { return nil }
        return imagePng
    }
}
Usage:
let image = //your UIImage
if let pngImage = image.toPNG() {
UIImageWriteToSavedPhotosAlbum(pngImage, nil, nil, nil)
}
As an alternative to creating a secondary UIImage for UIImageWriteToSavedPhotosAlbum, the PNG data can be written directly using PHPhotoLibrary.
Here is a UIImage extension named 'saveToPhotos' which does this:
import Photos

extension UIImage {
    func saveToPhotos(completion: @escaping (_ success: Bool) -> ()) {
        if let pngData = self.pngData() {
            PHPhotoLibrary.shared().performChanges({ () -> Void in
                let creationRequest = PHAssetCreationRequest.forAsset()
                let options = PHAssetResourceCreationOptions()
                creationRequest.addResource(with: PHAssetResourceType.photo, data: pngData, options: options)
            }, completionHandler: { (success, error) -> Void in
                if success == false {
                    if let errorString = error?.localizedDescription {
                        print("Photo could not be saved: \(errorString)")
                    }
                    completion(false)
                }
                else {
                    print("Photo saved!")
                    completion(true)
                }
            })
        }
        else {
            completion(false)
        }
    }
}
To use:
if let image = UIImage(named: "Background.png") {
image.saveToPhotos { (success) in
if success {
// image saved to photos
}
else {
// image not saved
}
}
}