I am using VTPixelTransferSessionTransferImage to change the size and pixel format of a CVPixelBuffer, and I am struggling to get to the bottom of a memory leak in the code block below.
I have found several similar issues, but all of the solutions are Objective-C and do not apply in Swift because of the memory management differences. Any help would be much appreciated.
I should note that when this method is called and the size/format already match (when the buffer comes from AVFoundation), I do not see a leak; the leak only occurs when the CVPixelBuffer comes from a Blackmagic source. Simply returning the source buffer (as on the iOS path) produces no leak on macOS with either AVFoundation or Blackmagic sources.
private func convertPixelBuffer(_ sourceBuffer: CVPixelBuffer) -> CVPixelBuffer? {
    #if os(macOS)
    guard let session = pixelTransferSession else { return nil }
    let state = self.state
    // Allocate a destination buffer with the target size and pixel format
    var destinationBuffer: CVPixelBuffer? = nil
    let status: CVReturn = CVPixelBufferCreate(kCFAllocatorDefault,
                                               state.width,
                                               state.height,
                                               kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                               pixelBufferAttributes as CFDictionary?,
                                               &destinationBuffer)
    guard status == kCVReturnSuccess, let destinationBuffer = destinationBuffer else { return nil }
    // Transfer (scale/convert) the image into the destination buffer,
    // bailing out if the transfer itself fails
    let transferStatus = VTPixelTransferSessionTransferImage(session, from: sourceBuffer, to: destinationBuffer)
    guard transferStatus == noErr else { return nil }
    return destinationBuffer
    #else
    return sourceBuffer
    #endif
}
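For reference, the pixelTransferSession and pixelBufferAttributes referenced above are created along these lines (a simplified sketch; the exact attribute keys and the scaling-mode property are assumptions, not my verbatim setup):
import VideoToolbox

// Sketch: creating the transfer session and destination-buffer attributes
// used by the method above. Attribute keys here are illustrative.
var pixelTransferSession: VTPixelTransferSession?
let pixelBufferAttributes: [String: Any] = [
    kCVPixelBufferIOSurfacePropertiesKey as String: [:],
    kCVPixelBufferMetalCompatibilityKey as String: true
]

let createStatus = VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault,
                                                pixelTransferSessionOut: &pixelTransferSession)
if createStatus == noErr, let session = pixelTransferSession {
    // Optional: control how mismatched aspect ratios are handled
    VTSessionSetProperty(session, key: kVTPixelTransferPropertyKey_ScalingMode, value: kVTScalingMode_Letterbox)
}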
Any suggestions would be much appreciated.
Related
Lately I have been encountering a memory-usage problem in my app that occasionally causes it to crash. I noticed in the memory inspector that it is caused by converting the CVPixelBuffer (from my camera) to a UIImage. Below you can see the code I use:
import UIKit
import VideoToolbox

extension UIImage {
    public convenience init?(pixelBuffer: CVPixelBuffer) {
        var cgImage: CGImage?
        // VTCreateCGImageFromCVPixelBuffer copies the buffer's contents into a new CGImage
        VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
        guard let cgImage = cgImage else {
            return nil
        }
        self.init(cgImage: cgImage)
    }
}
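For context, the initializer runs once per frame from the capture delegate, along these lines (a simplified sketch; the FrameConverter type and imageView property are illustrative names, and the autoreleasepool is a commonly suggested mitigation shown for completeness):
import AVFoundation
import UIKit

// Sketch of a per-frame call site for the initializer above.
// FrameConverter and imageView are illustrative, not my actual code.
final class FrameConverter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let imageView = UIImageView()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Drain temporaries created during conversion after every frame
        autoreleasepool {
            guard let image = UIImage(pixelBuffer: pixelBuffer) else { return }
            DispatchQueue.main.async { self.imageView.image = image }
        }
    }
}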
Any suggestions? Thanks in advance.
I am trying to run a simple oscillator using the new AVAudioSourceNode Apple introduced in the latest release. The code is excerpted from the example code Apple released, available here.
However, whenever I run this in a Swift playground, the callback fires but no sound is emitted. When I move the code to an iOS app, it works fine. Any idea what's happening? As far as I know, other audio nodes work well in playgrounds, so I'm not sure why this specific one fails. See the code below; run with Xcode 11 on macOS 10.15.
import AVFoundation
import PlaygroundSupport

let audioEngine = AVAudioEngine()
let mainMixerNode = audioEngine.mainMixerNode
let outputNode = audioEngine.outputNode
let format = outputNode.inputFormat(forBus: 0)

// Time advances by one sample period per frame
let incrementAmount = 1.0 / Float(format.sampleRate)
var time: Float = 0.0

// 440 Hz sine oscillator
func sineWave(time: Float) -> Float {
    return sin(2.0 * Float.pi * 440.0 * time)
}

let sourceNode = AVAudioSourceNode { (_, _, frameCount, audioBufferList) -> OSStatus in
    let bufferListPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frameIndex in 0..<Int(frameCount) {
        let sample = sineWave(time: time)
        time += incrementAmount
        // Write the same sample to every channel buffer
        for buffer in bufferListPointer {
            let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
            buf[frameIndex] = sample
        }
    }
    return noErr
}

audioEngine.attach(sourceNode)
audioEngine.connect(sourceNode, to: mainMixerNode, format: format)
audioEngine.connect(mainMixerNode, to: outputNode, format: nil)
mainMixerNode.outputVolume = 0.5

audioEngine.prepare()
do {
    try audioEngine.start()
} catch {
    print(error.localizedDescription)
}

PlaygroundPage.current.needsIndefiniteExecution = true
It seems that playground result logging really hurts the performance of real-time render blocks. I had the same problem, and it went away when I moved the AVAudioSourceNode code into a separate .swift file in the playground's Sources folder, as suggested here.
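A sketch of what that split might look like (the Oscillator type and file name are illustrative):
// Sources/Oscillator.swift -- code in a playground's Sources folder is
// compiled normally and is exempt from playground result logging.
import AVFoundation

public final class Oscillator {
    private let engine = AVAudioEngine()
    private var time: Float = 0

    public init() {}

    public func start() throws {
        let output = engine.outputNode
        let format = output.inputFormat(forBus: 0)
        let increment = 1.0 / Float(format.sampleRate)

        let source = AVAudioSourceNode { [weak self] _, _, frameCount, audioBufferList -> OSStatus in
            guard let self = self else { return noErr }
            let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
            for frame in 0..<Int(frameCount) {
                let sample = sin(2.0 * Float.pi * 440.0 * self.time)
                self.time += increment
                // Write the sample to every channel buffer
                for buffer in buffers {
                    let buf = UnsafeMutableBufferPointer<Float>(buffer)
                    buf[frame] = sample
                }
            }
            return noErr
        }

        engine.attach(source)
        engine.connect(source, to: engine.mainMixerNode, format: format)
        engine.mainMixerNode.outputVolume = 0.5
        try engine.start()
    }
}

// The playground page itself then contains only:
//   import PlaygroundSupport
//   let oscillator = Oscillator()
//   try? oscillator.start()
//   PlaygroundPage.current.needsIndefiniteExecution = true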
This routine returns nil on the macOS 10.13.2 beta, but not in a playground. I don't have an older OS to test with at the moment.
import ImageIO

func getImage(_ url: CFURL) -> CGImage? {
    let sourceOptions = [
        kCGImageSourceShouldAllowFloat as String: true
    ] as CFDictionary
    guard let imageSource = CGImageSourceCreateWithURL(url, sourceOptions) else { return nil }
    guard let image = CGImageSourceCreateImageAtIndex(imageSource, 0, nil) else {
        // Log why decoding failed
        let imageSourceStatus = CGImageSourceGetStatus(imageSource)
        Swift.print("image nil, Image Source Status = \(imageSourceStatus)")
        return nil
    }
    return image
}
imageSource is non-nil, but image is nil. The console message is "Image Source Status = CGImageSourceStatus", which is not one of the valid enum values.
Tried with both Swift 3.2 and Swift 4.0. The Dictionary arg in CGImageSourceCreateWithURL can be set to nil; nothing changes.
The correct frameworks are in the project (although it linked without them, so I'm not sure they matter). I did "import ImageIO", but it built without it, so again I'm not sure it matters.
The URL is absolutely a valid file URL -- like I said, this works in the playground, and I've tried a number of different files and file types (tiff, jpg and png).
Any ideas?
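One debugging step that might at least make the message legible is printing the enum's raw value (a small sketch; I haven't been able to try it on the failing OS):
import ImageIO

// Sketch: log the numeric CGImageSourceStatus instead of the enum,
// since Swift 3/4 prints the type name rather than the case.
func logStatus(of imageSource: CGImageSource) {
    let status = CGImageSourceGetStatus(imageSource)
    // 0 is statusComplete; negative codes (e.g. -1 incomplete, -4 invalid data)
    // indicate why CGImageSourceCreateImageAtIndex failed
    Swift.print("Image Source Status raw value = \(status.rawValue)")
}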
Intro and background:
I have been working for some time on a project that lets the user perform custom manipulations on a live feed from their camera.
At the moment, I start the capture session in the following way:
var session: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    videoPreviewLayer!.frame = CameraView.bounds
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    session = AVCaptureSession()
    session!.sessionPreset = AVCaptureSessionPresetPhoto

    let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    var error: NSError?
    var input: AVCaptureDeviceInput!
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error1 as NSError {
        error = error1
        input = nil
    }

    if error == nil && session!.canAddInput(input) {
        session!.addInput(input)
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if session!.canAddOutput(stillImageOutput) {
            session!.addOutput(stillImageOutput)
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
            videoPreviewLayer!.videoGravity = AVLayerVideoGravityResizeAspect
            videoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
            CameraView.layer.addSublayer(videoPreviewLayer!)
            session!.startRunning()
        }
    }
}
where CameraView is a UIView in my view controller. I now have a function called singleTapped() in which I want to grab every frame of the capture, process it, and then put it into the CameraView frame (perhaps I should be using a UIImageView instead?)...
Research:
I have looked here and here, as well as at many other posts, for getting frames from the camera, yet these don't quite get me where I need to be. What's interesting is in the first link I provided: in their answer they have:
self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(
    self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
    var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
    var data_image = UIImage(data: image) // THEY EXTRACTED A UIIMAGE HERE
    self.imageView.image = data_image
}
which does indeed get a UIImage from the camera, but is this a viable method at 30 fps?
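For reference, the API usually suggested for continuous per-frame access is AVCaptureVideoDataOutput rather than repeated still captures. A minimal sketch, using current SDK names (the FrameGrabber type and onFrame callback are illustrative):
import AVFoundation
import CoreImage
import UIKit

// Sketch: AVCaptureVideoDataOutput delivers every captured frame to a delegate.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let videoOutput = AVCaptureVideoDataOutput()
    private let ciContext = CIContext()
    var onFrame: ((UIImage) -> Void)?   // deliver each converted frame to the UI

    func attach(to session: AVCaptureSession) {
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }
    }

    // Called once per captured frame
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        let image = UIImage(cgImage: cgImage)
        DispatchQueue.main.async { self.onFrame?(image) }
    }
}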
Rationale and Constraints:
The reason I need a UIImage is that I am using a library someone else wrote that quickly transforms a UIImage in a custom way. I want to present this transformation to the user "live".
In conclusion
Please let me know if I am missing something, or if I should reword anything. As I said above, this is my first post, so I am not quite up to speed on SO conventions. Thanks, and cheers.
You should maybe reconsider using AVCaptureSession. For what you are doing (I assume), you should try using OpenCV. It's a great utility for image manipulation, especially if you are doing it at 30/60 fps (the actual frame rate after processing might be, and I guarantee will be, lower). Depending on what this manipulation is you have been given, you can easily port it into Xcode using bridging headers, or convert everything to C++ for use with OpenCV.
With OpenCV you can drive the camera through its built-in functions, which can save you a lot of processing time and therefore runtime. For example, take a look at this.
I have used OpenCV in situations similar to the one you just described, and I think you could benefit from it. Swift is nice, but sometimes certain things are handled better through other means...
This bit of code has been working for me ever since I wrote it, yet today it decided to quit working on me. I've been trying to resolve the issue for the majority of the day, and it's slowly driving me crazy. I keep getting a "fatal error: unexpectedly found nil while unwrapping an Optional value" error. I understand what it means, but I can't seem to fix it. I found this thread - "unexpectedly found nil while unwrapping an Optional value" when retriveing PFFile from Parse.com - and tried the solution, but it did not work. Here is my code:
if let userImageFile: AnyObject = content["imageFile"] as? PFFile {
    println("Here is your userImageFile: \(userImageFile)")
    userImageFile.getDataInBackgroundWithBlock { (imageData: NSData!, error: NSError!) -> Void in
        if error == nil {
            var contentImage = UIImage(data: imageData)
            self.feedView.image = contentImage
        }
    }
}
I know that the userImageFile is not nil, because the print line I put in before getDataInBackgroundWithBlock correctly prints out the PFFile I'm trying to access from my Parse data browser. I just don't understand how it can work for a week and then suddenly stop working. If someone could help me, I would really appreciate it. Thanks!
ANSWER
So after banging my head against my desk for many hours, I finally said screw it and deleted my code, only to rewrite it and have it work again... This is the now-working code, updated a bit with the suggestions from the answer below.
if let imageFile: AnyObject = content["imageFile"] as? PFFile {
    imageFile.getDataInBackgroundWithBlock({ (imageData: NSData!, error: NSError!) -> Void in
        if let imageData = imageData {
            dispatch_async(dispatch_get_main_queue()) {
                var contentImage = UIImage(data: imageData)
                self.feedView.image = contentImage
            }
        } else {
            // Do something else
        }
    })
}
Even if userImageFile is not nil, you are assuming that the imageData passed to the closure is not nil. I would check that the image actually exists on the server side, and modify your closure code as follows:
if let imageData = imageData {
    dispatch_async(dispatch_get_main_queue()) {
        var contentImage = UIImage(data: imageData)
        self.feedView.image = contentImage
    }
} else {
    // Something happened
}
Remember that an implicitly unwrapped optional can still be nil - it's still an optional, but you are instructing the compiler to treat it as a non-optional (i.e. to unwrap it automatically).
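A standalone illustration of that point (minimal sketch):
import Foundation

let data: NSData! = nil        // implicitly unwrapped, but still an Optional holding nil
if let unwrapped = data {      // conditional binding is the safe way to use it
    print("got \(unwrapped.length) bytes")
} else {
    print("nil after all")     // this branch runs; touching `data` directly would crash
}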
Also note that you are modifying a UI component from a thread that is probably not the main thread. It's better to wrap that work in a dispatch_async on the main queue.