I am currently running into a memory usage problem in my app that occasionally causes it to crash. The memory inspector shows that it is caused by converting the CVPixelBuffer (from my camera) to a UIImage. Below is the code I use:
import VideoToolbox

extension UIImage {
    public convenience init?(pixelBuffer: CVPixelBuffer) {
        var cgImage: CGImage?
        // VTCreateCGImageFromCVPixelBuffer returns a status code; check it
        // rather than only testing the out-parameter.
        let status = VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
        guard status == noErr, let cgImage = cgImage else {
            return nil
        }
        self.init(cgImage: cgImage)
    }
}
Any suggestions? Thanks in advance.
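For what it's worth, one common mitigation when converting every camera frame is to wrap the per-frame work in an autoreleasepool, so intermediate CGImage/UIImage objects are released each frame instead of piling up until the next drain. A minimal sketch (the capture-delegate context and the process(_:) consumer are assumptions, not code from the question):

import AVFoundation
import UIKit

// Hypothetical AVCaptureVideoDataOutputSampleBufferDelegate callback;
// process(_:) is a stand-in for whatever consumes the image.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    autoreleasepool {
        // The UIImage and its backing CGImage are released at the end of each
        // pool iteration instead of accumulating across frames.
        if let image = UIImage(pixelBuffer: pixelBuffer) {
            process(image)
        }
    }
}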
Related
I am using VTPixelTransferSessionTransferImage to change the size and pixel format of a CVPixelBuffer, and I am struggling to get to the bottom of a memory leak in this code block.
I have found several similar issues, but all of the solutions are Objective-C and do not apply in Swift because of the memory-management differences. Any help would be much appreciated.
I should note that when this method is called and the size/format already match (when the buffer comes from AVFoundation), I do not see a leak, but when the CVPixelBuffer comes from a Blackmagic source the leak occurs. Simply returning the sourceBuffer, as on iOS, results in no leak on macOS with either AVFoundation or Blackmagic sources.
private func convertPixelBuffer(_ sourceBuffer: CVPixelBuffer) -> CVPixelBuffer? {
    #if os(macOS)
    guard let session = pixelTransferSession else { return nil }
    let state = self.state
    var destinationBuffer: CVPixelBuffer? = nil
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     state.width,
                                     state.height,
                                     kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                     pixelBufferAttributes as CFDictionary?,
                                     &destinationBuffer)
    guard status == kCVReturnSuccess, let destinationBuffer = destinationBuffer else { return nil }
    // Transfer the image; this call also returns a status worth checking.
    guard VTPixelTransferSessionTransferImage(session, from: sourceBuffer, to: destinationBuffer) == noErr else {
        return nil
    }
    return destinationBuffer
    #else
    return sourceBuffer
    #endif
}
Any suggestions would be much appreciated.
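For reference, a mitigation that is often suggested for per-frame CVPixelBufferCreate churn is to draw destination buffers from a CVPixelBufferPool, which recycles the underlying IOSurfaces instead of allocating a new buffer each frame. A minimal sketch under that assumption (the pool property and its setup call are not part of the original code):

// Assumed additions to the same class as convertPixelBuffer(_:).
private var destinationPool: CVPixelBufferPool?

private func makeDestinationPool(width: Int, height: Int) {
    // Mirror the width/height/format used in the question.
    let attributes: [CFString: Any] = [
        kCVPixelBufferWidthKey: width,
        kCVPixelBufferHeightKey: height,
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
        kCVPixelBufferIOSurfacePropertiesKey: NSDictionary()
    ]
    var pool: CVPixelBufferPool?
    if CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attributes as CFDictionary, &pool) == kCVReturnSuccess {
        destinationPool = pool
    }
}

private func dequeueDestinationBuffer() -> CVPixelBuffer? {
    guard let pool = destinationPool else { return nil }
    var buffer: CVPixelBuffer?
    // Buffers returned here come from the pool's recycled backing stores.
    guard CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer) == kCVReturnSuccess else {
        return nil
    }
    return buffer
}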
Environment:
- Xcode 10.2
- Swift 4.2
- macOS target: 10.11
I have the following extension to NSImage:
extension NSImage {
    func filter(filter: String) -> NSImage? {
        // The autoreleasepool closure is non-escaping, so [weak self] is unnecessary;
        // the force unwraps are replaced with a single guard.
        return autoreleasepool { () -> NSImage? in
            guard let tiffData = self.tiffRepresentation,
                  let image = CIImage(data: tiffData),
                  let ciFilter = CIFilter(name: filter) else {
                return nil
            }
            ciFilter.setDefaults()
            ciFilter.setValue(image, forKey: kCIInputImageKey)
            let context = CIContext(options: [CIContextOption.useSoftwareRenderer: true])
            defer {
                context.clearCaches()
                context.reclaimResources()
            }
            // *** 64 Byte MEMORY LEAK on line below ***
            guard let output = ciFilter.outputImage,
                  let imageRef = context.createCGImage(output, from: image.extent) else {
                return nil
            }
            return NSImage(cgImage: imageRef, size: NSSize(width: 0, height: 0))
        }
    }
}
The image being inverted is set in Interface Builder and is a 16 KB 480x480 PNG with an alpha channel and an sRGB IEC61966-2.1 color profile. I am calling the "filter" function from a subclassed NSButton as follows:
...
override func awakeFromNib() {
    ...
    autoreleasepool {
        image = image!.filter(filter: "CIColorInvert")
    }
    ...
}
...
The filter works as advertised. However, I am getting a 64-byte memory leak at the guarded context.createCGImage call.
Things I have tried:
- moving the autoreleasepool block up and down the filter function
- removing the autoreleasepool block from the subclassed NSButton
- removing one or both of the autoreleasepool blocks
Same result every time: a 64-byte malloc memory leak at the line indicated above. What am I missing?
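One refactor worth trying (my suggestion, not something established in this thread): create the CIContext once and reuse it. CIContext holds internal caches, so building a fresh context on every call is expensive and tends to show allocation noise in Instruments. A sketch, with the shared context stored as an assumed static property:

extension NSImage {
    // Shared software-rendering context reused across calls (assumed addition).
    private static let sharedCIContext = CIContext(options: [CIContextOption.useSoftwareRenderer: true])

    func filtered(named filterName: String) -> NSImage? {
        guard let tiffData = tiffRepresentation,
              let input = CIImage(data: tiffData),
              let ciFilter = CIFilter(name: filterName) else { return nil }
        ciFilter.setValue(input, forKey: kCIInputImageKey)
        guard let output = ciFilter.outputImage,
              let cgImage = NSImage.sharedCIContext.createCGImage(output, from: input.extent) else { return nil }
        return NSImage(cgImage: cgImage, size: NSSize(width: 0, height: 0))
    }
}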
I am trying to convert image snapshots into a video, but I am facing UI-thread problems: my view controller locks up. I would like to know how to handle this. I did a lot of research and tried detaching the work onto different DispatchQueues, but none of them worked, which is why I am not using any queue in the code below:
class ScreenRecorder {
    func renderPhotosAsVideo(callback: @escaping (_ success: Bool, _ url: URL) -> ()) {
        var frames = [UIImage]()
        for _ in 0..<100 {
            let image = self.containerView.takeScreenshot()
            if let imageData = UIImageJPEGRepresentation(image, 0.7),
               let compressedImage = UIImage(data: imageData) {
                frames.append(compressedImage)
            }
        }
        self.generateVideoUrl(frames: frames, complete: { (fileURL: URL) in
            self.saveVideo(url: fileURL, complete: { saved in
                print("animation video save complete")
                callback(saved, fileURL)
            })
        })
    }
}
extension UIView {
func takeScreenshot() -> UIImage {
let renderer = UIGraphicsImageRenderer(size: self.bounds.size)
let image = renderer.image { _ in
self.drawHierarchy(in: self.bounds, afterScreenUpdates: true)
}
return image
}
}
class ViewController: UIViewController {
    let recorder = ScreenRecorder()

    override func viewDidLoad() {
        super.viewDidLoad()
        recorder.renderPhotosAsVideo { success, url in
            if success {
                print("ok")
            } else {
                self.alert(title: "Error", message: "Could not save the video", block: nil)
            }
        }
    }
}
PS: I used this tutorial as reference -> http://www.mikitamanko.com/blog/2017/05/21/swift-how-to-record-a-screen-video-or-convert-images-to-videos/
It really looks like this is not possible, at least not the way you are trying to do it.
There are quite a few ways to render a UIView's content into an image, but all of them must be used from the main thread only. This also applies to the drawHierarchy(in:afterScreenUpdates:) method you are using.
Since you have to call it on the main thread, and the method gets called so many times, I don't think this will ever work in a performant way.
See the profiling in Instruments.
Sources:
How to render view into image faster?
Safe way to render UIView to an image on background thread?
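In other words, keep the snapshot itself on the main thread and push only the expensive JPEG compression and video encoding onto a background queue. A minimal sketch of that split (the queue and the appendToVideo writer step are assumptions):

// Snapshot on the main thread, compress/encode off it.
let encodingQueue = DispatchQueue(label: "video.encoding", qos: .userInitiated)

func captureFrame(from view: UIView) {
    // drawHierarchy(in:afterScreenUpdates:) is main-thread only.
    let image = view.takeScreenshot()
    encodingQueue.async {
        guard let jpegData = UIImageJPEGRepresentation(image, 0.7) else { return }
        // Hand jpegData (or a decoded frame) to the AVAssetWriter here.
        appendToVideo(jpegData) // hypothetical writer step
    }
}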
I am using Firebase to host and download images for my app. Each image on Firebase ranges from 200 KB to 400 KB, and the user downloads about 12 at a time as they scroll through a collectionView. When I launch the VC that downloads the images, my app goes from using about 100 MB of memory to 650 MB after downloading 19 images in total. The images in question are JPEGs and were compressed quite heavily before upload to Firebase. The images are stored in an NSCache, and clearing the cache brings memory usage back down to around 100 MB.
What is going on? Here is some code that may help:
class TripOverviewCell: UICollectionViewCell {
    @IBOutlet weak var imageView: UIImageView!

    // Called when preparing to show an image
    func updateUI(photo: Photo, image: UIImage? = nil) {
        if image != nil {
            print("Loaded from cache")
            imageView.image = image
            photo.assignImage(image: image!)
        } else {
            let url = photo.imageUrl
            let ref = FIRStorage.storage().reference(forURL: url)
            ref.data(withMaxSize: 5 * 1024 * 1024, completion: { [weak self] (data, error) in
                if error != nil {
                    print("Unable to download image")
                } else {
                    print("Image downloaded")
                    if let imageData = data, let image = UIImage(data: imageData) {
                        self?.imageView.image = image
                        photo.assignImage(image: image)
                        TripsVC.imageCache.setObject(image, forKey: photo.uid as NSString)
                    }
                }
            })
        }
    }
}
As Frank commented, displayed images take up a LOT more space than just the compressed data: a decoded UIImage costs roughly width × height × 4 bytes, no matter how small the JPEG is on disk. My mistake was caching UIImages, i.e. the fully decoded bitmaps. Instead, I should have cached the data and created images from that data where I need them. Memory usage is down from 550 MB to about 20 MB.
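A minimal sketch of that fix (the cache and helper names are assumptions): keep the compressed Data in the NSCache and decode a UIImage per cell only when needed.

// In TripsVC: cache the compressed bytes, not the decoded bitmap.
static let imageDataCache = NSCache<NSString, NSData>()

// After a successful download:
// TripsVC.imageDataCache.setObject(imageData as NSData, forKey: photo.uid as NSString)

// When a cell needs the image, decode on demand:
func cachedImage(for photo: Photo) -> UIImage? {
    guard let data = TripsVC.imageDataCache.object(forKey: photo.uid as NSString) else {
        return nil
    }
    return UIImage(data: data as Data)
}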
I fetch a lot of images from the web, and they come in all kinds of sizes: big, small, etc.
I can resize them when I display them in a cell, but this is inefficient. It is much better to resize them once, after SDWebImage has downloaded them, and cache the resized versions, instead of storing the large originals on disk and resizing them for every cell.
So how can I do this with SDWebImage, or do I have to hack on the class a bit?
SDWebImage developer Olivier Poitrey answered this question for me here.
You have to implement the SDWebImageManagerDelegate protocol and set it as the shared manager's delegate like this:
SDWebImageManager.sharedManager.delegate = self;
Then implement the imageManager:transformDownloadedImage:withURL: instance method.
More information.
Worked perfectly for me.
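For illustration, a minimal Swift sketch of that delegate approach (the exact Swift bridging of the delegate method varies between SDWebImage versions, and the 200×200 target size is an assumption, so check your version's headers):

import SDWebImage

class DownscalingDelegate: NSObject, SDWebImageManagerDelegate {
    // Called once per download; whatever is returned here is what gets cached.
    func imageManager(_ imageManager: SDWebImageManager,
                      transformDownloadedImage image: UIImage?,
                      with imageURL: URL?) -> UIImage? {
        guard let image = image else { return nil }
        let targetSize = CGSize(width: 200, height: 200) // assumed cell size
        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            image.draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }
}

// Keep a strong reference to the delegate (the manager's delegate is weak):
// let downscaler = DownscalingDelegate()
// SDWebImageManager.shared().delegate = downscaler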
I had the same problem as you and tried tweaking SDWebImage first, but ended up building my own component that solved the problem. You can take a look at it here: https://github.com/adig/RemoteImageView
SDWebImage 3.8.2
If you are using the UIImageView category sd_setImageWithURL, I have created another UIImageView category (extension):
func modifiedImageFromUrl(url: NSURL?) {
    self.sd_setImageWithURL(url) { (image, error, cacheType, url) in
        if cacheType == SDImageCacheType.None && image != nil {
            dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0)) {
                let modifiedImage = // modify image as you want
                dispatch_async(dispatch_get_main_queue()) {
                    SDWebImageManager.sharedManager().saveImageToCache(modifiedImage, forURL: url)
                    self.image = modifiedImage
                }
            }
        }
    }
}
Expansion on MaeSTRo's answer in Swift 3:
myImageView.sd_setImage(with: imageUrl) { (image, error, cacheType, url) in
    // Only transform freshly downloaded images; cached ones were already processed.
    guard let image = image, cacheType == .none else { return }
    DispatchQueue.global(qos: .userInitiated).async {
        let modifiedImage = myImageProcessor(image)
        SDWebImageManager.shared().saveImage(toCache: modifiedImage, for: imageUrl)
        DispatchQueue.main.async {
            myImageView.image = modifiedImage
            myImageView.setNeedsDisplay()
        }
    }
}