Using VNRecognizeTextRequest on an image drawn in a UIImageView returns empty results - Swift

I have a UIImageView in which I draw handwritten text, using UIGraphicsBeginImageContext to create the bitmap image.
I pass this image to an OCR function:
func ocrText(onImage: UIImage?) {
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            fatalError("Received invalid observations")
        }
        print("observations", observations.count) // count is 0
        for observation in observations {
            if observation.topCandidates(1).isEmpty {
                continue
            }
        }
    } // end of request

    request.recognitionLanguages = ["fr"]
    let requests = [request]

    DispatchQueue.global(qos: .userInitiated).async {
        let ocrGroup = DispatchGroup()
        guard let img = onImage?.cgImage else { return }
        ocrGroup.enter()
        let handler = VNImageRequestHandler(cgImage: img, options: [:])
        try? handler.perform(requests)
        ocrGroup.leave()
        ocrGroup.wait()
    }
}
The problem is that observations is an empty array.
However, if I save the UIImage to the photo album:
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
and read the image back from the album with an image picker, then pass that image to ocrText, it works.
So it seems the image format (or metadata?) changes when it is saved to the album, and that VNRecognizeTextRequest needs that data.
Is there a way to change the original bitmap image's format directly, without going through storage in the photo album?
Or am I missing something in my use of VNRecognizeTextRequest?

I finally found a way to make it work: I save the image to a file as JPEG and read the file back.
This didn't work with PNG, but it works with JPEG.
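A minimal sketch of that workaround, assuming a helper that writes the drawn image to a temporary JPEG file and reloads it before running OCR (the helper name and file name are my own, not from the original post):
import UIKit

func ocrTextViaJpegFile(_ image: UIImage) {
    // Write the drawn image out as JPEG; a plain PNG round-trip did not work here.
    let fileURL = FileManager.default.temporaryDirectory.appendingPathComponent("ocr-input.jpg")
    guard let jpegData = image.jpegData(compressionQuality: 1.0) else { return }
    do {
        try jpegData.write(to: fileURL)
        // Read the file back and feed it to the ocrText(onImage:) function above.
        let reloaded = UIImage(contentsOfFile: fileURL.path)
        ocrText(onImage: reloaded)
    } catch {
        print("Could not write temporary JPEG:", error)
    }
}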

Related

How to download and show a list of images in a collection view so to avoid flickering and images in wrong spots?

I'm struggling to solve the very common problem of loading images inside a collection view given a list of URLs. I have implemented a UIImageView extension. In it I defined a cache variable:
static var imageCache = NSCache<NSString, UIImage>()
In addition, I have created a loadImage method that takes as input the image cache key, the URL itself, a placeholder image, and an error image, and works as follows:
public func loadImage(cacheKey: String?, urlString: String?, placeholderImage: UIImage?, errorImage: UIImage?) {
    image = nil
    if let cacheKey = cacheKey, let cachedImage = UIImageView.imageCache.object(forKey: cacheKey as NSString) {
        image = cachedImage
        return
    }
    if let placeholder = placeholderImage {
        image = placeholder
    }
    if let urlString = urlString, let url = URL(string: urlString) {
        self.downloadImage(url: url, errorImage: errorImage)
    } else {
        image = errorImage
    }
}
The downloadImage function creates a data task, downloads the image, and assigns it to the image view:
private func downloadImage(url: URL, errorImage: UIImage?) {
    let dataTask = URLSession.shared.dataTask(with: url) { [weak self] (data, response, error) in
        if error != nil {
            DispatchQueue.main.async {
                self?.image = errorImage
            }
            return
        }
        if let statusCode = (response as? HTTPURLResponse)?.statusCode {
            if !(200...299).contains(statusCode) {
                DispatchQueue.main.async {
                    self?.image = errorImage
                }
                return
            }
        }
        if let data = data, let image = UIImage(data: data) {
            UIImageView.imageCache.setObject(image, forKey: url.absoluteString as NSString)
            DispatchQueue.main.async {
                self?.image = image
            }
        }
    }
    dataTask.resume()
}
I call the loadImage method inside cellForItemAt so that I don't download all the data at once, but only the images that are actually displayed on screen. I call the function like this:
cell.albumImage.loadImage(
    cacheKey: bestQualityImage,
    urlString: bestQualityImage,
    placeholderImage: UIImage(named: "Undefined"),
    errorImage: UIImage(named: "Undefined")
)
The main problem with my current implementation is that sometimes the images are not displayed in the correct spot. In other words, if I have three elements on screen, I sometimes see all three showing the same image instead of their respective images.
I believe the problem is that by the time the download completes for a specific cell, cellForItemAt has already run again for that (reused) cell, so the cell ends up with the wrong image.
Is there a way I can modify my function to fix this bug, or should I completely change my approach?
EDIT
At first I thought the problem was that I was using an extension, so I tried a function with a closure returning the downloaded image, but that didn't solve the problem.
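For what it's worth, a common way to guard against the cell-reuse race described in the question is to remember which URL an image view last requested and discard any result that arrives for an older request. Below is only a rough sketch of that idea using a hypothetical RemoteImageView subclass (not the extension from the question); all names here are assumptions:
import UIKit

final class RemoteImageView: UIImageView {

    static let cache = NSCache<NSString, UIImage>()

    // The URL this view most recently asked for; reset every time loadImage(from:) is called.
    private var currentURL: URL?

    func loadImage(from url: URL, placeholder: UIImage?) {
        currentURL = url
        if let cached = RemoteImageView.cache.object(forKey: url.absoluteString as NSString) {
            image = cached
            return
        }
        image = placeholder
        URLSession.shared.dataTask(with: url) { [weak self] data, _, _ in
            guard let data = data, let downloaded = UIImage(data: data) else { return }
            RemoteImageView.cache.setObject(downloaded, forKey: url.absoluteString as NSString)
            DispatchQueue.main.async {
                // Only assign if this view still wants this URL; a reused cell will have
                // called loadImage(from:) again with a different URL by now.
                guard self?.currentURL == url else { return }
                self?.image = downloaded
            }
        }.resume()
    }
}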

How to assign a URL to an imageView in Swift?

I have a JSON response for an image after a POST request:
"profile_picture": "uploads/profile_pictures/18/file.jpeg"
If I combine the base URL and the "profile_picture" path, I can view the image in a web browser:
http://ms.XXX.net/uploads/profile_pictures/18/file.jpeg
I want to load that image URL into a UIImageView and show the image. Please guide me on how to do that. Below is what I'm trying:
image = "http://ms.XXX.net/"+"\(LoginSingleton.shared.pathImage!)"
imageView.image = image as? UIImage
It would be ideal to use a third-party framework like SDWebImage or Kingfisher for a better user experience, but you can do it without them by fetching the data from the URL asynchronously and then setting the image:
guard let url = URL(string: "image-url") else { return }
DispatchQueue.global().async {
    guard let data = try? Data(contentsOf: url) else { return }
    DispatchQueue.main.async {
        let image = UIImage(data: data)
        // Set the image to your image view
    }
}

How to use SDWebImage to download and save CGImage

I'm trying to use SDWebImage to download an image from an external URL and return it. I do not want to set it on a view. This is the code I'm using, but it's not working: it returns nil. I know the URL I'm passing in works because I can open it in the browser. What am I doing wrong?
func downloadImage() -> CGImage {
    var myImage: CGImage?
    let myUrl = URL(string: "my-url-here.com")
    SDWebImageDownloader.shared.downloadImage(with: myUrl, completed: { (image, data, error, finished) in
        print("Completed")
        if image != nil {
            myImage = image?.cgImage
        }
    })
    return myImage!
}
I also tried this version, with no luck either:
func downloadImage() -> CGImage {
    var myImage: CGImage?
    let myUrl = URL(string: "my-url-here.com")
    SDWebImageManager.shared.loadImage(with: myUrl, options: .continueInBackground, progress: { (received, expected, url) in
        print(received, expected)
    }, completed: { (downloadedImage, data, error, cacheType, finished, imageUrl) in
        DispatchQueue.main.async {
            if downloadedImage != nil {
                myImage = downloadedImage?.cgImage
            }
        }
    })
    return myImage!
}
SDWebImage is an asynchronous library. You can’t just return the results. Generally one would use an @escaping closure to supply the results to the caller. E.g.
func downloadImage(completion: @escaping (CGImage?) -> Void) {
    let url = URL(string: "https://my-url-here.com")!
    SDWebImageDownloader.shared.downloadImage(with: url) { image, _, _, _ in
        completion(image?.cgImage)
    }
}
And you’d use it like:
downloadImage { image in
    guard let image = image else { return }
    // use image here
}
// but not here
But let’s step back and look at the whole pattern. You say you want to “save” the result. If you’re talking about saving it to persistent storage, you would not want to use CGImage (or UIImage, or whatever) at all. That’s computationally inefficient (converting the asset to an image and then back to Data so you can save it), space inefficient (you have to load the whole asset into memory at once), and likely to introduce problems (e.g. if you download a JPG, convert it to a CGImage, and then try to recreate a JPG, the resulting asset will be slightly different, bigger, and/or have new JPG artifacts). If you’re just pre-downloading assets, just use simple networking like Alamofire or URLSession.
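As an illustration of that last point, here is a rough sketch of pre-downloading an asset as raw bytes with URLSession and writing it to disk untouched; the function name and destination path are my own, not part of the original answer:
import Foundation

// Download the asset to a temporary file and move it into place without ever
// decoding it as an image, so the bytes on disk are exactly what the server sent.
func preDownloadAsset(from url: URL, completion: @escaping (URL?) -> Void) {
    URLSession.shared.downloadTask(with: url) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else {
            completion(nil)
            return
        }
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        do {
            try? FileManager.default.removeItem(at: destination)   // replace any previous copy
            try FileManager.default.moveItem(at: tempURL, to: destination)
            completion(destination)
        } catch {
            completion(nil)
        }
    }.resume()
}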

How to get an image from JSON in Swift

I'm new to Xcode development and really need some help. Below is one of my JSON responses, which I have printed to the output. I already display the text on my screen, but now I'm trying to get the image from the server, and I don't know how to do it.
JSON :
"MoviePhotoL" : "\/Data\/UploadFile\/cnymv-01_1.jpg",
"MoviePhotoP" : "\/Data\/UploadFile\/cnymv-02_1.jpg"
XCODE:
let userImage = iP["MoviePhotoP"] as? String
cell.imageView.image = userImage (??????)
I know that a String cannot be converted into a UIImage, and I already tried converting it to NSData and then the NSData to UIImage(data:), but I still don't get the picture. Can somebody please help me? I really need some help.
Those paths seem to be relative to another source.
You need to generate or obtain an absolute URL that lets you access the image.
Right now you have a simple string and that's all; you can't convert it to data or an image.
You need a string that you can put in a browser to load the image.
Once you're able to do that, you can load the image in your app.
Example:
func getImage(from string: String) -> UIImage? {
    //2. Get valid URL
    guard let url = URL(string: string)
    else {
        print("Unable to create URL")
        return nil
    }
    var image: UIImage? = nil
    do {
        //3. Get valid data
        let data = try Data(contentsOf: url, options: [])
        //4. Make image
        image = UIImage(data: data)
    }
    catch {
        print(error.localizedDescription)
    }
    return image
}

//1. Get valid string
let string = "https://images.freeimages.com/images/large-previews/f2c/effi-1-1366221.jpg"
if let image = getImage(from: string) {
    //5. Apply image
    cell.imageView.image = image
}
NOTE: Data(contentsOf:options:) is synchronous and can reduce performance. The larger the image, the longer it will lock its thread.
Generally you would do such intensive tasks on a background thread and update the UI on the main thread, but to keep this answer simple, I chose not to show that.
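For completeness, a rough sketch of what that background-thread version might look like, reusing the getImage(from:) helper above (this wrapper is my addition, not part of the original answer):
// Run the synchronous helper off the main thread, then hand the result back on the main thread.
func getImageAsync(from string: String, completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let image = getImage(from: string)
        DispatchQueue.main.async {
            completion(image)
        }
    }
}

// Usage:
getImageAsync(from: string) { image in
    cell.imageView.image = image
}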

Saving UIImage Into Core Data Always Returns Nil when using ImagePickerView - Swift 3

I've been trying to save a single picture to an entity containing a single property titled "prof", configured as a Binary Data type.
I go through the hoops to select a picture with UIImagePickerController, then I call my method that saves the picture in Core Data in the desired NSData format.
My issue stems from loading the picture: in my load method, the entity for the image is not nil, meaning it does exist. However, I get nil when I try to convert the fetched NSData back to a UIImage to recreate the picture so I can use it.
I am using Swift 3 and Xcode 8. So far, all the troubleshooting questions on here have the solution of casting the NSData to a UIImage like so:
let image : UIImage = UIImage(data: imageData)
However, Xcode gives me a compiler error when I do this, and instead forces me to cast it as:
let image : UIImage = UIImage(data: (imageData as Data?)!)
which is where I get the nil that throws my flow up in the air. I've tried saving the data in many different ways, but still nothing.
If anyone could go through the following methods and see whether I might be doing something wrong in the saving part, or in the formatting of the NSData in the fetch method, anything would help.
My configuration:
- The prof property has "Allows External Storage" enabled.
- My persistent store is seeded blank at app installation, meaning all the needed properties are already set up when the app is launched for the first time, but set to nil until changed or modified by my various data flows.
- There is no other picture entity in my data model; this is the only one.
func saveProfilePicture(_ pic: UIImage) {
    let picData = UIImagePNGRepresentation(pic)
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first
        first?.setValue(picData, forKey: "prof")
        try context.save()
    } catch let err {
        print(err)
    }
}
func getProfilePicture() -> UIImage? {
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    var image: UIImage?
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first?.prof as NSData?
        if let parsedImage = UIImage(data: (first as Data?)!) {
            image = parsedImage
        }
    } catch let err {
        print(err)
    }
    return image
}
EDIT
The solution was to notice that UIImage conforms to NSCoding, which is what Core Data's Transformable attribute type relies on. Swapping the property type for the image from Binary Data to Transformable made it possible to save the UIImage directly into Core Data without converting it to another data type first.
func saveProfilePicture(_ image: UIImage) {
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first
        first?.prof = image
        print(first)
        coreDataManager.saveData()
    } catch let err {
        print(err)
    }
}
func loadProfilePicture() -> UIImage? {
    var image: UIImage?
    let request: NSFetchRequest<UsePics> = UsePics.fetchRequest()
    do {
        let records = try coreDataManager.managedObjectContext.fetch(request) as [UsePics]
        let first = records.first
        if let img = first?.prof {
            image = img as? UIImage
        } else {
            print("no image")
        }
    } catch let err {
        print(err)
    }
    return image
}