How to get a PNG representation of an NSImage object in Swift

I know I can use "UIImagePNGRepresentation(UIImagexxx)" to get the png representation of a UIImage object.
But I'm not sure how I can do something similar for NSImage. I could only find TIFFRepresentation available for NSImage objects.
Any ideas? Thanks a lot

To help with cross-platform code, I implemented a version of UIImagePNGRepresentation() that runs on Mac (and uses NSImage):
#if os(macOS)
public func UIImagePNGRepresentation(_ image: NSImage) -> Data? {
    guard let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil)
        else { return nil }
    let imageRep = NSBitmapImageRep(cgImage: cgImage)
    imageRep.size = image.size // display size in points
    return imageRep.representation(using: .png, properties: [:])
}
#endif
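A possible call site, as a sketch (the image name "AppIcon" and the temporary-directory destination are placeholders for illustration, not part of the original answer):
#if os(macOS)
// Hypothetical usage of the shim above; the image name and output URL are placeholders.
if let icon = NSImage(named: "AppIcon"),
   let pngData = UIImagePNGRepresentation(icon) {
    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("icon.png")
    try? pngData.write(to: outputURL)
}
#endif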

Related

Using VNRecognizeTextRequest on image drawn in UIImageView returns empty results

I have a UIImageView in which I draw handwritten text, using UIGraphicsBeginImageContext to create the bitmap image.
I pass this image to an OCR func:
func ocrText(onImage: UIImage?) {
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            fatalError("Received invalid observations")
        }
        print("observations", observations.count) // count is 0
        for observation in observations {
            if observation.topCandidates(1).isEmpty {
                continue
            }
        }
    } // end of request

    request.recognitionLanguages = ["fr"]
    let requests = [request]

    DispatchQueue.global(qos: .userInitiated).async {
        let ocrGroup = DispatchGroup()
        guard let img = onImage?.cgImage else { return }

        ocrGroup.enter()
        let handler = VNImageRequestHandler(cgImage: img, options: [:])
        try? handler.perform(requests)
        ocrGroup.leave()

        ocrGroup.wait()
    }
}
The problem is that observations is an empty array.
But if I save the UIImage to the photo album:
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
and read the image back from the album with an image picker and pass it to ocrText, it works.
So it seems there is a format change to the image (or its metadata?) when it is saved to the album, and that the Vision recognizer needs that data.
Is there a way to change the original bitmap image format directly, without going through storage in the photo album?
Or am I missing something in the use of VNRecognizeTextRequest?
I finally found a way to get it working.
I save the image to a file as JPEG and read the file back.
This didn't work with PNG, but it works with JPEG.
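For reference, a minimal sketch of that JPEG round trip (the function name, file name, and compression quality are illustrative, not from the original post):
// Sketch of the save-as-JPEG-and-reload workaround; names are illustrative.
func jpegRoundTrip(_ drawnImage: UIImage) -> UIImage? {
    let fileURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("ocr-input.jpg")
    guard let jpegData = drawnImage.jpegData(compressionQuality: 0.9) else { return nil }
    do {
        try jpegData.write(to: fileURL)
        let reloaded = try Data(contentsOf: fileURL)
        return UIImage(data: reloaded) // pass this image to ocrText(onImage:)
    } catch {
        print(error)
        return nil
    }
}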

PNG/JPEG representation from CIImage always returns nil

I'm currently making a photo editing app.
When a photo is selected by the user, it is automatically converted into black and white using this code:
func blackWhiteImage(image: UIImage) -> Data {
    print("Starting black & white")
    let orgImg = CIImage(image: image)
    let bnwImg = orgImg?.applyingFilter("CIColorControls", withInputParameters: [kCIInputSaturationKey: 0.0])
    let outputImage = UIImage(ciImage: bnwImg!)
    print("Black & white complete")
    return UIImagePNGRepresentation(outputImage)!
}
The problem I am having with this code is that I keep getting this error:
fatal error: unexpectedly found nil while unwrapping an Optional value
I have had my code in a slightly different configuration, but it still breaks when it gets to the UIImagePNG/JPEGRepresentation(xx) section.
Are there any ways to get the PNG or JPEG data from a CIImage for use in an image view / just UIImage in general?
None of the other methods I've found go into enough detail about what code should be used.
Just begin a new graphics context and draw your grayscale image there. On iOS 10 or later you can use UIGraphicsImageRenderer; for the older iOS syntax, please check the edit history:
Xcode 11 • Swift 5.1
func blackWhiteImage(image: UIImage, isOpaque: Bool = false) -> Data? {
    guard let ciImage = CIImage(image: image)?.applyingFilter("CIColorControls", parameters: [kCIInputSaturationKey: 0]) else { return nil }
    let format = image.imageRendererFormat
    format.opaque = isOpaque
    return UIGraphicsImageRenderer(size: image.size, format: format).image { _ in
        UIImage(ciImage: ciImage).draw(in: CGRect(origin: .zero, size: image.size))
    }.pngData()
}
You can also extend UIImage to return a grayscale image:
extension UIImage {
    var coreImage: CIImage? { CIImage(image: self) }
    func grayscale(isOpaque: Bool = false) -> UIImage? {
        guard let coreImage = coreImage?.applyingFilter("CIColorControls", parameters: [kCIInputSaturationKey: 0]) else { return nil }
        let format = imageRendererFormat
        format.opaque = isOpaque
        return UIGraphicsImageRenderer(size: size, format: format).image { _ in
            UIImage(ciImage: coreImage).draw(in: CGRect(origin: .zero, size: size))
        }
    }
}
let profilePicture = UIImage(data: try! Data(contentsOf: URL(string: "http://i.stack.imgur.com/Xs4RX.jpg")!))!
if let grayscale = profilePicture.grayscale(), let data = grayscale.pngData() { // or Swift 4.1 or earlier -> let data = UIImagePNGRepresentation(grayscale)
    print(data.count) // 689035
}

Saving UIImage Into Core Data Always Returns Nil when using ImagePickerView - Swift 3

I've been trying to save a single picture to an entity containing a single property titled "prof" and configured as a Binary Data type.
I go through the hoops to select a picture from UIImagePickerController, then I call up my method that handles saving the picture in Core Data in the desired NSData format.
My issue stems from loading the picture: in my loadImage method the entity for the image is not nil, meaning it does exist. However, I get nil when I try to parse the fetched NSData into a UIImage to recreate the picture and be able to use it.
I am using Swift 3 and Xcode 8; so far all the troubleshooting questions on here have the solution of casting the NSData to UIImage like so:
let image : UIImage = UIImage(data: imageData)
However, Xcode gives me a compiler error when I do this, and instead forces me to cast it as:
let image : UIImage = UIImage(data: (imageData as Data?)!)
which is where I get the nil that throws my whole flow up in the air... I've tried saving the data in many different ways, but still nothing.
If anyone could go through the following methods and see if I might be doing something wrong in the saving part, or in the formatting of NSData in the fetch method... anything would help.
My configuration:
- The prof property has "Allow External Storage" enabled.
- My persistent store is seeded blank at app installation, meaning all the needed properties are already set up when the app is launched for the first time, but they remain nil until changed or modified by my various data flows.
- There is no other picture entity in my data model; this is the only one.
func saveProfilePicture(_ pic: UIImage) {
    let picData = UIImagePNGRepresentation(pic)
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first
        first?.setValue(picData, forKey: "prof")
        try coreDataManager.managedObjectContext.save()
    } catch let err {
        print(err)
    }
}
func getProfilePicture() -> UIImage? {
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    var image: UIImage?
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        if let first = records.first?.prof {
            image = UIImage(data: first as Data)
        }
    } catch let err {
        print(err)
    }
    return image
}
EDIT
The solution was found by noticing that UIImage conforms to NSCoding, which means it can be stored in a Transformable attribute. Swapping my property type for the image from Binary Data to Transformable made it possible to save the UIImage directly into Core Data without parsing it into another data type.
func saveProfilePicture(_ image: UIImage) {
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first
        first?.prof = image
        print(first)
        coreDataManager.saveData()
    } catch let err {
        print(err)
    }
}
func loadProfilePicture() -> UIImage? {
    var image: UIImage?
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first
        if let img = first?.prof {
            image = img as? UIImage
        } else {
            print("no image")
        }
    } catch let err {
        print(err)
    }
    return image
}
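As a side note, if the UsePics subclass is regenerated so that prof is typed as UIImage? (an assumption, not shown in the original post), the Transformable approach lets the load drop the cast entirely. A minimal sketch under that assumption:
// Sketch only: assumes the regenerated UsePics subclass declares
// `@NSManaged public var prof: UIImage?`.
func loadProfilePictureDirect() -> UIImage? {
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    return (try? coreDataManager.managedObjectContext.fetch(request))?.first?.prof
}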

Error when trying to save image in NSUserDefaults using Swift

When I try to save an image in NSUserDefaults, the app crashes with this error.
Why? Is it possible to save an image with NSUserDefaults? If not, then how do I save the image?
Image...
Code...
var image1:UIImage = image1
var save1: NSUserDefaults = NSUserDefaults.standardUserDefaults()
save1.setObject(Info.Image1, forKey: "Image1")
save1.synchronize()
Log error...
libc++abi.dylib: terminating with uncaught exception of type NSException
(lldb)
NSUserDefaults isn't just a big truck you can throw anything you want onto. It's a series of tubes which accept only specific types.
What you can save to NSUserDefaults:
NSData
NSString
NSNumber
NSDate
NSArray
NSDictionary
If you're trying to save anything else to NSUserDefaults, you typically need to archive it to an NSData object and store it (keeping in mind you'll have to unarchive it later when you need it back).
There are two ways to turn a UIImage object into data. There are functions for creating a PNG representation of the image or a JPEG representation of the image.
For the PNG:
let imageData = UIImagePNGRepresentation(yourImage)
For the JPEG:
let imageData = UIImageJPEGRepresentation(yourImage, 1.0)
where the second argument is a CGFloat representing the compression quality, with 0.0 being the lowest quality and 1.0 being the highest quality. Keep in mind that if you use JPEG, each time you compress and uncompress, if you're using anything but 1.0, you're going to degrade the quality over time. PNG is lossless so you won't degrade the image.
To get the image back out of the data object, there's an init method for UIImage to do this:
let yourImage = UIImage(data:imageData)
This method will work no matter how you converted the UIImage object to data.
In newer versions of Swift, the functions have been renamed, and reorganized, and are now invoked as:
For the PNG:
let imageData = yourImage.pngData()
For the JPEG:
let imageData = yourImage.jpegData(compressionQuality: 1.0)
Xcode will offer to autocorrect the old versions for you, though.
In order to save a UIImage in NSUserDefaults, you need to convert the UIImage into NSData using UIImageJPEGRepresentation(image, 1) and then save that in NSUserDefaults. On the other side, while retrieving it in another view controller (within the same application), you need to get the NSData and then convert it back into a UIImage.
Here I am writing a small code snippet in Swift to demonstrate this. I tried this in Xcode 6.4.
/*Code to save UIImage in NSUserDefaults on viewWillDisappear() event*/
override func viewWillDisappear(animated: Bool)
{
    super.viewWillDisappear(animated)
    //Get some image in image variable of type UIImage.
    var image = ....
    let defaults = NSUserDefaults.standardUserDefaults()
    var imgData = UIImageJPEGRepresentation(image, 1)
    defaults.setObject(imgData, forKey: "image")
}

/*Code to get Image and show it on imageView at viewWillAppear() event*/
override func viewWillAppear(animated: Bool)
{
    super.viewWillAppear(animated)
    let defaults = NSUserDefaults.standardUserDefaults()
    if let imgData = defaults.objectForKey("image") as? NSData
    {
        if let image = UIImage(data: imgData)
        {
            //set image in UIImageView imgSignature
            self.imgSignature.image = image
            //remove cache after fetching image data
            defaults.removeObjectForKey("image")
        }
    }
}
Updated for Swift 3:
If you want to save an image in UserDefaults, use the lines of code below to save and retrieve the image.
To save the image in UserDefaults:
if let image = response.result.value {
    UserDefaults.standard.register(defaults: ["key": UIImageJPEGRepresentation(image, 1.0)!])
    UserDefaults.standard.set(UIImageJPEGRepresentation(image, 1.0), forKey: "key")
}
To retrieve the image from UserDefaults and set it on an image view:
var mLogoImageView = UIImageView()
if let imageData = UserDefaults.standard.value(forKey: "key") as? Data {
    let imageFromData = UIImage(data: imageData)
    mLogoImageView.image = imageFromData!
}
Enjoy..!
In Swift 4 - 5
Set:
setImage(image: UIImage(named: "12")!)
func setImage(image: UIImage) {
    UserDefaults.standard.set(image.jpegData(compressionQuality: 1.0), forKey: "key")
}
Get:
func getImage() -> UIImage? {
    if let imageData = UserDefaults.standard.value(forKey: "key") as? Data {
        if let imageFromData = UIImage(data: imageData) {
            return imageFromData
        }
    }
    return nil
}

Saving CGImageRef to a png file?

In my Cocoa application, I load a .jpg file from disk and manipulate it. Now it needs to be written to disk as a .png file. How can I do that?
Thanks for your help!
Using CGImageDestination and passing kUTTypePNG is the correct approach. Here's a quick snippet:
@import MobileCoreServices; // or `@import CoreServices;` on Mac
@import ImageIO;

BOOL CGImageWriteToFile(CGImageRef image, NSString *path) {
    CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:path];
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypePNG, 1, NULL);
    if (!destination) {
        NSLog(@"Failed to create CGImageDestination for %@", path);
        return NO;
    }

    CGImageDestinationAddImage(destination, image, nil);

    if (!CGImageDestinationFinalize(destination)) {
        NSLog(@"Failed to write image to %@", path);
        CFRelease(destination);
        return NO;
    }

    CFRelease(destination);
    return YES;
}
You'll need to add ImageIO and CoreServices (or MobileCoreServices on iOS) to your project and include the headers.
If you're on iOS and don't need a solution that works on Mac too, you can use a simpler approach:
// `image` is a CGImageRef
// `path` is a NSString with the path to where you want to save it
[UIImagePNGRepresentation([UIImage imageWithCGImage:image]) writeToFile:path atomically:YES];
In my tests, the ImageIO approach was about 10% faster than the UIImage approach on my iPhone 5s. In the simulator, the UIImage approach was faster. It's probably worth testing each for your particular situation on the device if you're really concerned with performance.
Here is a macOS-friendly, Swift 3 & 4 example:
@discardableResult func writeCGImage(_ image: CGImage, to destinationURL: URL) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(destinationURL as CFURL, kUTTypePNG, 1, nil) else { return false }
    CGImageDestinationAddImage(destination, image, nil)
    return CGImageDestinationFinalize(destination)
}
Create a CGImageDestination, passing kUTTypePNG as the type of file to create. Add the image, then finalize the destination.
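A possible call site for the function above, as a sketch (the source image "AppIcon" and the temporary output URL are assumptions for illustration):
// Hypothetical usage of writeCGImage(_:to:); the image and URL are placeholders.
if let cgImage = NSImage(named: "AppIcon")?
    .cgImage(forProposedRect: nil, context: nil, hints: nil) {
    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("output.png")
    writeCGImage(cgImage, to: outputURL)
}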
Swift 5+ adopted version
import Foundation
import CoreGraphics
import CoreImage
import ImageIO
import MobileCoreServices

extension CIImage {
    public func convertToCGImage() -> CGImage? {
        let context = CIContext(options: nil)
        if let cgImage = context.createCGImage(self, from: self.extent) {
            return cgImage
        }
        return nil
    }

    public func data() -> Data? {
        convertToCGImage()?.pngData()
    }
}

extension CGImage {
    public func pngData() -> Data? {
        let cfdata: CFMutableData = CFDataCreateMutable(nil, 0)
        if let destination = CGImageDestinationCreateWithData(cfdata, kUTTypePNG as CFString, 1, nil) {
            CGImageDestinationAddImage(destination, self, nil)
            if CGImageDestinationFinalize(destination) {
                return cfdata as Data
            }
        }
        return nil
    }
}
The provided solutions probably still work fine, but there is newer API for this in Core Image which does the same thing and is a bit more "Swifty" to use:
import CoreImage

func write(cgimage: CGImage, to url: URL) throws {
    let cicontext = CIContext()
    let ciimage = CIImage(cgImage: cgimage)
    try cicontext.writePNGRepresentation(of: ciimage, to: url, format: .RGBA8, colorSpace: ciimage.colorSpace!)
}
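And a possible call site for that helper (the source image and destination URL here are assumptions for illustration):
// Possible usage of write(cgimage:to:); the image and destination are placeholders.
if let cgImage = UIImage(named: "photo")?.cgImage {
    let destinationURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("photo.png")
    try? write(cgimage: cgImage, to: destinationURL)
}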