I have been searching Google and could not find a solution to my problem. I have some code that grabs metadata from images the user picks with the image picker. The problem is that my code doesn't grab ALL the metadata from HEIC and RAW images. I set up some print statements to find out which data my code fails to grab from HEIC:
Manufacturer not found
Camera model not found
Camera software not found
Aperture not found
Focal length not found
ISO not found
Shutter speed not found
And this is my metadata-extracting code block:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // Read the picked file from disk and open it as a CGImageSource.
    guard let url = info[.imageURL] as? URL,
          let imageData = try? Data(contentsOf: url),
          let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let metadata = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [AnyHashable: Any]
    else { return }
    print(metadata)
    self.dismiss(animated: true, completion: nil)
}
I'm going to guess that you have not obtained user permission to access the photos library. You are allowed to present the picker even so, but the information that you can receive in response is very limited. If you have user permission, you can receive the image as a PHAsset and you can get the desired metadata from the photos library.
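A minimal sketch of that approach, assuming iOS 11+ and that photo library authorization has already been granted (requestImageData is the pre-iOS 13 API; newer code would use requestImageDataAndOrientation):
import Photos

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // With permission granted, the picker hands back a PHAsset.
    if let asset = info[.phAsset] as? PHAsset {
        let options = PHImageRequestOptions()
        options.isNetworkAccessAllowed = true // fetch from iCloud if needed
        PHImageManager.default().requestImageData(for: asset, options: options) { data, _, _, _ in
            guard let data = data,
                  let source = CGImageSourceCreateWithData(data as CFData, nil),
                  let metadata = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [AnyHashable: Any]
            else { return }
            print(metadata) // now includes the full {Exif}/{TIFF} dictionaries
        }
    }
    dismiss(animated: true, completion: nil)
}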
I have a UIImageView in which I draw handwritten text, using UIGraphicsBeginImageContext to create the bitmap image.
I pass this image to an OCR function:
func ocrText(onImage: UIImage?) {
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            fatalError("Received invalid observations")
        }
        print("observations", observations.count) // count is 0
        for observation in observations {
            if observation.topCandidates(1).isEmpty {
                continue
            }
        }
    } // end of request
    request.recognitionLanguages = ["fr"]
    let requests = [request]

    DispatchQueue.global(qos: .userInitiated).async {
        let ocrGroup = DispatchGroup()
        guard let img = onImage?.cgImage else { return }
        ocrGroup.enter()
        let handler = VNImageRequestHandler(cgImage: img, options: [:])
        try? handler.perform(requests)
        ocrGroup.leave()
        ocrGroup.wait()
    }
}
The problem is that observations is an empty array.
But if I save the UIImage to the photo album:
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
and read the image back from the album with the imagePicker and pass it to ocrText, it works.
So it seems there is a format change to the image (or its metadata?) when it is saved to the album, and the Vision recognizer needs that data.
Is there a way to change the original bitmap image format directly, without going through storage in the photo album?
Or am I missing something in my use of VNRecognizeTextRequest?
I finally found a way to get it working:
I save the image to a file as JPEG and read the file back.
This didn't work with PNG, but it works with JPEG.
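For reference, a minimal sketch of that round trip (the temporary file name is arbitrary):
func jpegRoundTrip(_ image: UIImage) -> UIImage? {
    // Re-encode as JPEG, write to disk, and read it back; per the workaround
    // above, the resulting bitmap is one VNRecognizeTextRequest accepts.
    guard let data = image.jpegData(compressionQuality: 1.0) else { return nil }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("ocr-input.jpg") // arbitrary temp name
    do {
        try data.write(to: url)
        return UIImage(contentsOfFile: url.path)
    } catch {
        return nil
    }
}
An in-memory round trip via UIImage(data:) may work as well, since the JPEG re-encode appears to be the step that matters, but that is untested here.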
How can I get a video from the gallery (Photos) in a custom format and size?
For example, I want to read a video in 360p.
I used the code below to get the video data, but Apple says it isn't guaranteed to be read in the lowest quality.
It's in a PHAsset extension, so self refers to a PHAsset object.
var fileData: Data? = nil
let manager = PHImageManager.default()
let options = PHVideoRequestOptions()
options.isNetworkAccessAllowed = true
options.deliveryMode = .fastFormat

manager.requestAVAsset(forVideo: self, options: options) { (asset: AVAsset?, audioMix: AVAudioMix?, _) in
    if let avassetURL = asset as? AVURLAsset {
        guard let video = try? Data(contentsOf: avassetURL.url) else {
            print("reading video failed")
            return
        }
        fileData = video
    }
}
There is a simple reason it can't be guaranteed: the file in 360p might not exist on the device or in the cloud, so the Photos framework will deliver the format nearest to what you request. If you want exactly 360p, I would recommend re-encoding the video you get from the Photos framework yourself.
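A sketch of that re-encode, using AVAssetExportSession with the built-in 640x480 preset as the nearest match to 360p (an exact 360p target would require AVAssetWriter, not shown here):
import AVFoundation

func reencodeToSmallFormat(asset: AVAsset, completion: @escaping (URL?) -> Void) {
    // AVAssetExportPreset640x480 is the closest built-in preset to 360p.
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPreset640x480) else {
        completion(nil)
        return
    }
    let outputURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("video-small.mp4") // arbitrary temp name
    try? FileManager.default.removeItem(at: outputURL) // export fails if the file exists
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.exportAsynchronously {
        completion(export.status == .completed ? outputURL : nil)
    }
}
You would call this from inside the requestAVAsset completion, passing the AVAsset you receive there.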
I am writing a visual recognition application that uses VisualRecognition.classify to classify images. I have configured my Swift environment and have been able to classify images when including a URL from the internet.
I have now created an application that uses the camera and photo library to let users take photos and have them classified. I am running into issues when passing a fileURL from the device to the VisualRecognition service, though.
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    if let image = info[UIImagePickerController.InfoKey.originalImage] as? UIImage {
        imageView.image = image
        imagePicker.dismiss(animated: true, completion: nil)

        let visualRecongnition = VisualRecognition(version: version, apiKey: apiKey)
        let imageData = image.jpegData(compressionQuality: 0.01)
        let documentURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let fileURL = documentURL.appendingPathComponent("tempImage.jpg")
        try? imageData?.write(to: fileURL, options: [])
        visualRecongnition.classify(imageFile: fileURL, success: { (classifiedImages) in
            print(classifiedImages)
        }) // error here: "Missing argument for parameter 'completionHandler' in call"
    } else {
        print("There was an error picking Image")
    }
}
I have even attempted to pass the NSURL directly into the classify call, as I did with the working external URL, but I still run into the same error. I would really like to see how to use a local image from the device and classify it successfully.
The problem is that your call to classify does not correspond to the signature of the classify method. In this line:
visualRecongnition.classify(imageFile: fileURL, success: { (classifiedImages) in
change success to completionHandler, and add a second parameter in the closure (even if you ignore it), like this:
visualRecongnition.classify(imageFile: fileURL, completionHandler: { classifiedImages,_ in
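In context, the full call could look like the sketch below (the exact closure types depend on your Watson SDK version; here the second closure parameter is assumed to be an optional error):
visualRecongnition.classify(imageFile: fileURL, completionHandler: { classifiedImages, error in
    if let error = error {
        print("Classification failed: \(error)")
        return
    }
    if let classifiedImages = classifiedImages {
        print(classifiedImages)
    }
})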
You must first request permission to use the camera and library. Open your Info.plist file in Source code mode, and add the following lines:
<key>NSCameraUsageDescription</key>
<string>Ask for permission to use camera.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Ask for permission to use Photo Library</string>
If you also want to be able to write images to the Camera Roll, add this too:
<key>NSPhotoLibraryAddUsageDescription</key>
<string>Ask for permission to save images to Photo library</string>
I need to get location data (latitude and longitude coordinates) from an image picked from the photo library using UIImagePickerController. Existing answers propose using fetchAssetsWithALAssetURLs, but it is deprecated as of iOS 11, so I wonder what the alternative is.
Thanks!
This worked for me. For new iOS versions, use the following lines of code.
var asset: PHAsset?
asset = info[UIImagePickerControllerPHAsset] as? PHAsset
Note that asset will be nil if the user did not grant permission to access the photo library, since PHAsset data is sensitive.
To obtain permission, edit Info.plist accordingly and request permission using PHPhotoLibrary.requestAuthorization().
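A minimal sketch of that request (assuming NSPhotoLibraryUsageDescription is already in Info.plist):
import Photos

PHPhotoLibrary.requestAuthorization { status in
    DispatchQueue.main.async {
        if status == .authorized {
            // Safe to present the picker; the PHAsset key will be populated.
        } else {
            // The PHAsset will come back nil; handle the denial gracefully.
        }
    }
}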
If you are only supporting iOS 11 and later, you can get the PHAsset directly using the .phAsset key. You only need PHAsset.fetchAssets(withALAssetURLs:) if you have to support iOS 10 or earlier.
Once you have the PHAsset reference, then you can access the location property to get the coordinates of the image.
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    var asset: PHAsset?
    if #available(iOS 11.0, *) {
        asset = info[.phAsset] as? PHAsset
    } else {
        if let url = info[.referenceURL] as? URL {
            let result = PHAsset.fetchAssets(withALAssetURLs: [url], options: nil)
            asset = result.firstObject
        }
    }
    if let asset = asset {
        if let location = asset.location {
            print("Image location is \(location.coordinate.latitude), \(location.coordinate.longitude)")
        }
    }
}
I have implemented the following code to add the image to the Documents directory and store the path in my SQLite database:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingImage image: UIImage, editingInfo: [String: AnyObject]?) {
    imagePath = imagesDirectoryPath + "/\(Date().description.replacingOccurrences(of: " ", with: "")).png"
    print(imagePath)
    let data = UIImagePNGRepresentation(image) // pre-Swift 4.2 API; newer code uses image.pngData()
    print(data!)
    let success = FileManager.default.createFile(atPath: imagePath, contents: data, attributes: nil)
    print(success)
    dismiss(animated: true) { () -> Void in
    }
}
Can anyone help me get the image back into my table view using this path, as shown in the image?
In the table view's cellForRowAt method, or wherever else you want to display the image, you can simply do this:
let imageURL = URL(fileURLWithPath: student.Image_URL)
cell.imgProfileDisplay.image = UIImage(contentsOfFile: imageURL.path)
This loads the image from the path that we stored in the database.
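One caveat worth adding: the app's sandbox path can change between installs and updates, so storing an absolute path in the database is fragile. A safer sketch (student.imageFileName is a hypothetical column holding only the file name):
let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                            in: .userDomainMask).first!
// Rebuild the absolute URL at read time; only the file name lives in the DB.
let imageURL = documentsURL.appendingPathComponent(student.imageFileName)
cell.imgProfileDisplay.image = UIImage(contentsOfFile: imageURL.path)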