I'm puzzled. I want to retrieve local images from the Photos library and resize them to 266x266 pixels. The code works as expected on my iPhone 12 Pro (iOS 14.6) and on an iPad Pro (iPadOS 14.6), but not on an iPhone 11 running iOS 15.
let imageSize = CGSize(width: 266, height: 266)
print("asset size: \(asset.pixelWidth) x \(asset.pixelHeight)")
let fullsizeOptions = PHImageRequestOptions()
fullsizeOptions.deliveryMode = .highQualityFormat
fullsizeOptions.isSynchronous = false
fullsizeOptions.isNetworkAccessAllowed = true
fullsizeOptions.resizeMode = .exact
let requestId = manager.requestImage(for: asset, targetSize: imageSize, contentMode: .aspectFill, options: fullsizeOptions, resultHandler: { (image, info) -> Void in
    guard image != nil else {
        print("🌈 detailed image fetch (\(asset.localIdentifier)) failed")
        return
    }
    print("real image size: \(image?.size)")
})
On the iPhone 11 I receive the following log:
asset size: 3024 x 4032
real image size: Optional((266.0, 354.0))
As you can see, the real image size is not the expected 266x266. On my iPhone 12 and on an iPad this code works as expected. Any idea what's going wrong here? I had the very same problem with the same iPhone 11 on iOS 14.4; I updated it at the time because I thought it was an iOS bug, but on iOS 15 I can reproduce the same behaviour, and now I'm lost. Any ideas?
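For what it's worth, one workaround would be to force whatever the manager delivers into an exact square myself; a minimal sketch, where squared(_:side:) is a hypothetical helper I would call on the delivered image in the result handler above:
import UIKit

// Hypothetical helper: redraws the delivered image into an exact side x side square,
// scaling so the shorter edge fills the square and centring the overflow (aspect fill).
func squared(_ image: UIImage, side: CGFloat) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: CGSize(width: side, height: side))
    return renderer.image { _ in
        let scale = side / min(image.size.width, image.size.height)
        let drawSize = CGSize(width: image.size.width * scale,
                              height: image.size.height * scale)
        let origin = CGPoint(x: (side - drawSize.width) / 2,
                             y: (side - drawSize.height) / 2)
        image.draw(in: CGRect(origin: origin, size: drawSize))
    }
}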
Thanks a lot!
Cheers
Dennis
I'm using a framework called OpalImagePicker; it allows me to pick several images instead of one. It returns an array of PHAssets.
I want to take these PHAssets, turn them into images, then convert them to base64 strings so that I can send them to my database.
But there's a problem: the images have really low quality when I try to get them from the PHAsset array.
Here's my code:
let requestOptions = PHImageRequestOptions()
requestOptions.version = .current
requestOptions.deliveryMode = .opportunistic
requestOptions.resizeMode = .exact
requestOptions.isNetworkAccessAllowed = true
let imagePicker = OpalImagePickerController()
imagePicker.maximumSelectionsAllowed = 4
imagePicker.allowedMediaTypes = Set([PHAssetMediaType.image])
self.presentOpalImagePickerController(imagePicker, animated: true,
    select: { (assets) in
        for a in assets {
            // print(a)
            // self.img.append(a.image)
            self.img.append(a.imagehd(targetSize: CGSize(width: a.pixelWidth, height: a.pixelHeight), contentMode: PHImageContentMode.aspectFill, options: requestOptions))
and the function:
func imagehd(targetSize: CGSize, contentMode: PHImageContentMode, options: PHImageRequestOptions?) -> UIImage {
    var thumbnail = UIImage()
    let imageManager = PHCachingImageManager()
    imageManager.requestImage(for: self, targetSize: targetSize, contentMode: contentMode, options: options, resultHandler: { image, _ in
        thumbnail = image!
    })
    return thumbnail
}
I tried giving requestOptions.version the .original value, and even setting deliveryMode to high quality, but then it just gives me nothing (the image is nil).
I'm really lost. Can someone help?
Thanks a lot.
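A sketch of a completion-based variant of that extension (assuming, like the original, it lives on PHAsset): requestImage delivers its result asynchronously with these options, so the original imagehd returns its placeholder, or at best the fast degraded pass, before the full-quality image arrives.
import Photos
import UIKit

extension PHAsset {
    // Sketch: hand the image back through a completion handler instead of returning
    // before the asynchronous request has delivered the full-quality result.
    func imagehd(targetSize: CGSize,
                 contentMode: PHImageContentMode,
                 options: PHImageRequestOptions?,
                 completion: @escaping (UIImage?) -> Void) {
        PHImageManager.default().requestImage(for: self,
                                              targetSize: targetSize,
                                              contentMode: contentMode,
                                              options: options) { image, info in
            // With .opportunistic delivery the handler can fire twice; skip the degraded pass.
            let degraded = (info?[PHImageResultIsDegradedKey] as? Bool) ?? false
            if !degraded {
                completion(image)
            }
        }
    }
}
A possible call site inside the picker's select closure (jpegData(compressionQuality:) assumes iOS 12 or later):
a.imagehd(targetSize: CGSize(width: a.pixelWidth, height: a.pixelHeight),
          contentMode: .aspectFill,
          options: requestOptions) { image in
    if let data = image?.jpegData(compressionQuality: 0.9) {
        let base64 = data.base64EncodedString() // ready to send to the backend
    }
}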
After updating to iOS 11, photo assets now load slowly and I get this message in the console:
[ImageManager] Unable to load image data,
/var/mobile/Media/DCIM/103APPLE/IMG_3064.JPG
I use a static function to load the image:
class func getAssetImage(asset: PHAsset, size: CGSize = CGSize.zero) -> UIImage? {
    let manager = PHImageManager.default()
    let option = PHImageRequestOptions()
    option.isSynchronous = true
    var assetImage: UIImage!
    var scaleSize = size
    if size == CGSize.zero {
        scaleSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
    }
    manager.requestImage(for: asset, targetSize: scaleSize, contentMode: .aspectFit, options: option) { (image, _) in
        if let image = image {
            assetImage = image
        }
    }
    if assetImage == nil {
        manager.requestImageData(for: asset, options: option, resultHandler: { (data, _, orientation, _) in
            if let data = data {
                if let image = UIImage(data: data) {
                    assetImage = image
                }
            }
        })
    }
    return assetImage
}
The requestImage call for the asset almost always succeeds, but it prints this message. If I use the requestImageData function only, there is no such message, but photos taken with the Apple camera lose their orientation and I get even more issues while loading a large number of images (I use an image slideshow in my app).
Apple always sucks when it comes to updates; maybe someone has a solution for how to fix this? It even fails to load an asset when there is a big list of them in the user's camera roll. Switching to requestImageData is not an option for me, as it frequently returns nil data now.
I would like to point out that I call this function only once; it is not used in a UITableView or similar. I use other code for thumbnails, with a globally initialised manager and options, so the assets are definitely not nil.
I call this function only when the user taps a certain thumbnail.
When the gallery has around 5000 photos, maybe the connection to the assets is just overloaded, and later it can't handle the request and crashes?
So many questions.
Hey, I was having the warning as well, and here is what worked for me.
Replacing
CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
with
PHImageManagerMaximumSize in the requestImage call
removed the warning log 🎉
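For reference, the call might then look something like this (just a sketch; asset and option are assumed to be the same ones from the question):
PHImageManager.default().requestImage(for: asset,
                                      targetSize: PHImageManagerMaximumSize,
                                      contentMode: .aspectFit,
                                      options: option) { image, _ in
    // image is delivered at the asset's full size
}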
Hope this helps,
I had the same problem. Though this did not completely solve it, it definitely helped.
option.isNetworkAccessAllowed = true
This helps only on devices where the Optimise iPhone Storage option for the Photos app has been turned on.
Your code has some serious issues. You are saying .isSynchronous = true without stepping into a background thread to do the fetch. That is illegal and is what is causing the slowness. Plus, you are asking for a targetSize without also saying .resizeMode = .exact, which means you are getting much bigger images than you are asking for.
However, the warning you're seeing is irrelevant and can be ignored. It in no way signals a failure of image delivery; it seems to be just some internal message that has trickled up to the console by mistake.
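A rough sketch of those two fixes combined (asset and scaleSize are assumed to come from the question's getAssetImage; this is just one way to keep the synchronous fetch off the main thread):
import Photos

let option = PHImageRequestOptions()
option.isSynchronous = true   // fine, but only when not on the main thread
option.resizeMode = .exact    // so the result matches targetSize instead of being bigger

DispatchQueue.global(qos: .userInitiated).async {
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: scaleSize,
                                          contentMode: .aspectFit,
                                          options: option) { image, _ in
        DispatchQueue.main.async {
            // hand the image back to the UI here
        }
    }
}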
This seems to be a bug with iOS 11, but I found I could work around it by setting the synchronous option to false. I reworked my code to deal with the async delivery. You can probably use sync(execute:) as a quick fix.
Also, I believe the problem only occurred with photos delivered by iCloud sharing.
You can try the requestImageData method with the following options. This worked for me in iOS 11.2 (both on device and simulator).
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.resizeMode = .exact
options.isSynchronous = true
PHImageManager.default().requestImageData(for: asset, options: options, resultHandler: { (data, dataUTI, orientation, info) in
    if let data = data, let image = UIImage(data: data) {
        // use the full-quality image here
    }
})
I am using Swift 3, Xcode 8.2, iOS 10
For debugging purposes, I am trying to take an image, crop it, and show what the cropped image looks like. I am trying to write this image to my computer's (Mac OS X) desktop, but all the tutorials I've found have me writing it to my documents directory, which I incorrectly thought would be a directory on my computer; it turns out it's a location on the phone itself that I can't figure out how to access.
I have this so far:
let finalImage: UIImage
let crop_section = CGRect(x: 0.0, y: 0.0, width: 1000.0, height: 1000.0)
let cg_image = screenshot.cgImage?.cropping(to: crop_section)
finalImage = UIImage(cgImage: cg_image!)

let documentsDirectoryURL = try! FileManager().url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
// create a name for your image
print(documentsDirectoryURL)
let fileURL = documentsDirectoryURL.appendingPathComponent("cropped.png")

if !FileManager.default.fileExists(atPath: fileURL.path) {
    do {
        try UIImagePNGRepresentation(finalImage)!.write(to: fileURL)
        print("Image Added Successfully")
    } catch {
        print(error)
    }
} else {
    print("Image Not Added")
}
I want to replace documentDirectory with something like ../Desktop/cropped.png but I can't figure out how to do this. Any help would be greatly appreciated!
I want to replace documentDirectory with something like ../Desktop/cropped.png
You can't. This is an iOS program. An iOS app is sandboxed within its own set of directories on the iOS device / simulator; it cannot reach out and see the Mac OS X desktop. They are two different universes.
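That said, when the app runs in the Simulator its sandbox lives on the Mac's disk, so for debugging you can still find the file from the Mac side; a minimal sketch (same documents-directory lookup as in the question):
// In the Simulator the printed path points somewhere under
// ~/Library/Developer/CoreSimulator/Devices/ on the Mac, and you can open it in
// Finder (or with `open <path>` in Terminal) to grab cropped.png.
let documentsDirectoryURL = try! FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
print(documentsDirectoryURL.path)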
I'm working on a PDF photo report app, and struggling with the low image quality in the PDFs that are generated.
func drawImage(index: Int, rectPos: Int) {
    let image = getImage(index)
    let xPosition = CGFloat(rectArray[rectPos][0])
    let yPosition = CGFloat(rectArray[rectPos][1])
    image.drawInRectAspectFill(CGRectMake(xPosition, yPosition, 325, 244))
}

func getImage(index: Int) -> UIImage {
    var thumbnail = UIImage()
    if self.photoAsset.count != 0 {
        let initialRequestOptions = PHImageRequestOptions()
        initialRequestOptions.resizeMode = .Exact
        initialRequestOptions.deliveryMode = .HighQualityFormat
        initialRequestOptions.synchronous = true
        PHImageManager.defaultManager().requestImageForAsset(self.photoAsset[index], targetSize: CGSizeMake(325, 244), contentMode: PHImageContentMode.Default, options: initialRequestOptions, resultHandler: { (result, info) -> Void in
            thumbnail = result!
        })
    }
    return thumbnail
}
I then use these functions to grab the image and place it into a position on a page after UIGraphicsBeginPDFPageWithInfo(page, nil)...
I'm using BSImagePicker pod to get the images.
And finally my photoAsset is just an array of PHAsset photos that is generated after the user selects the images from the pod's CollectionView...
So far I've tried all the settings for initialRequestOptions.deliveryMode... .HighQualityFormat doesn't seem to make the images any better.
What am I doing wrong here? Thanks!
The targetSize you pass when requesting the image determines the resolution of the image you end up drawing, so requesting it at the exact point size of the PDF rect gives you too few pixels.
instead of:
PHImageManager.defaultManager().requestImageForAsset(self.photoAsset[index], targetSize: CGSizeMake(325, 244)...
I simply doubled the requested size, and the image that is drawn into the PDF is better quality.
PHImageManager.defaultManager().requestImageForAsset(self.photoAsset[index], targetSize: CGSizeMake(650, 488)...
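Put differently, you ask Photos for more pixels than the point size of the rect you draw into. A sketch of the same idea with an explicit scale factor, in current Swift syntax (photoAsset, index, xPosition, yPosition and initialRequestOptions are assumed from the question):
let pdfRect = CGRect(x: xPosition, y: yPosition, width: 325, height: 244)
let pixelScale: CGFloat = 2   // request 2x the point size so the PDF has enough pixels
let targetSize = CGSize(width: pdfRect.width * pixelScale,
                        height: pdfRect.height * pixelScale)

PHImageManager.default().requestImage(for: photoAsset[index],
                                      targetSize: targetSize,
                                      contentMode: .default,
                                      options: initialRequestOptions) { result, _ in
    // result now has enough pixels to be drawn into pdfRect without visible upscaling
}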
I am trying to create a blur effect using the following snippet:
let glowEffectNode = SKEffectNode()
glowEffectNode.shouldRasterize = true
let glowSize = CGSize(width: barSize.width, height: barSize.height)
let glowEffectSprite = SKSpriteNode(color: barColorData.topColor, size: glowSize)
glowEffectNode.addChild(glowEffectSprite)
let glowFilter = CIFilter(name: "CIGaussianBlur")
glowFilter!.setDefaults()
glowFilter!.setValue(5, forKey: "inputRadius")
glowEffectNode.filter = glowFilter
Of course, on iOS 8.x it works perfectly, but from iOS 9.x (I tried both 9.0 and 9.1) the blur is not working properly. (On the simulator the node seems to be a bit transparent but definitely not blurred, and on the device it seems blurred but cropped, and it also has an offset from its centre position :/)
Is there a quick way to fix this using CIFilter?
I fiddled a bit more with this and found a solution...
First of all, it seems that using odd numbers for the blur radius causes the entire node to be rendered with an offset (???), so using 10, for example, fixed the offset issue.
Secondly, it seems that the blur is cropped because the rendered sprite fills the entire node, and a blur effect needs extra space around it, so I use a transparent sprite to provide that extra space; the following code snippet now works:
let glowEffectNode = SKEffectNode()
glowEffectNode.shouldRasterize = true
let glowBackgroundSize = CGSize(width: barSize.width + 60, height: barSize.height + 60)
let glowSize = CGSize(width: barSize.width + 10, height: barSize.height + 10)
let glowEffectSprite = SKSpriteNode(color: barColorData.topColor, size: glowSize)
glowEffectNode.addChild(SKSpriteNode(color: SKColor.clearColor(), size: glowBackgroundSize))
glowEffectNode.addChild(glowEffectSprite)
let glowFilter = CIFilter(name: "CIGaussianBlur")
glowFilter!.setDefaults()
glowFilter!.setValue(10, forKey: "inputRadius")
glowEffectNode.filter = glowFilter
I should have mentioned that I am creating a texture from this node using view.textureFromNode(glowEffectNode) for efficiency purposes, but I tried using the node itself and the problem was still there, so the above should work regardless.
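For completeness, a sketch of what that rasterisation step can look like in current Swift (view, scene and the final position are assumptions; the SKView method is now spelled texture(from:)):
if let glowTexture = view.texture(from: glowEffectNode) {
    // Render the blurred node once and reuse it as a cheap, static sprite.
    let glowSprite = SKSpriteNode(texture: glowTexture)
    glowSprite.position = CGPoint(x: 0, y: 0) // wherever the bar sits
    scene.addChild(glowSprite)
}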