AudioKit Creating Sinewave Tone When Returning from Background - swift

I'm using AudioKit to run an AKSequencer() that plays both mp3 and wav files through AKMIDISampler(). Everything works great, except when the app has been in the background for 30+ minutes and is then brought back up for use. It seems to lose all of its audio connections and plays the "missing file" sinewave tone mentioned in other threads. The app can happily enter the background momentarily, the user can quit, etc. without the tone. It only seems to happen when the app is left in the background for a long period and then brought up again.
I've tried changing the order of AudioKit.start() and file loading, but nothing seems to completely eliminate the issue.
My current workaround is simply to prevent the user's display from timing out, but that doesn't cover many of the situations in which the issue occurs.
Is there a way to handle whatever error I'm causing that produces this tone? Here is a representative example of what I'm doing with roughly 40 audio files.
// ViewController
let sequencer = Sequencer()

override func viewDidLoad() {
    super.viewDidLoad()
    sequencer.setupSequencer()
}

class SamplerWav {
    let audioWav = AKMIDISampler()

    func loadWavFile() {
        try? audioWav.loadWav("some_wav_audio_file")
    }
}

class SamplerMp3 {
    let audioMp3 = AKMIDISampler()
    let audioMp3_akAudioFile = try! AKAudioFile(readFileName: "some_other_audio_file.mp3")

    func loadMp3File() {
        try? audioMp3.loadAudioFile(audioMp3_akAudioFile)
    }
}

class Sequencer {
    let sequencer = AKSequencer()
    let mixer = AKMixer()
    let subMix = AKMixer()
    let samplerWav = SamplerWav()
    let samplerMp3 = SamplerMp3()
    var callbackTrack: AKMusicTrack!
    let callbackInstr = AKMIDICallbackInstrument()

    func setupSequencer() {
        AudioKit.output = mixer
        try! AudioKit.start()
        callbackTrack = sequencer.newTrack()
        callbackTrack?.setMIDIOutput(callbackInstr.midiIn)
        samplerWav.loadWavFile()
        samplerMp3.loadMp3File()
        samplerWav.audioWav >>> subMix
        samplerMp3.audioMp3 >>> subMix
        subMix >>> mixer
    }

    // Typically run from a callback track
    func playbackSomeSound() {
        try? samplerWav.audioWav.play(noteNumber: 60, velocity: 100, channel: 1)
    }
}
Thanks! I'm a big fan of AudioKit.

After some trial and error, here's a workflow that seems to address the issue for my circumstances:
- create my callback track(s) once, from viewDidLoad
- stop AudioKit and call .detach() on all my AKMIDISampler tracks and any routing in willResignActive
- start AudioKit again, and reload and reroute all of the audio files/tracks from didBecomeActive
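A minimal sketch of that wiring, assuming NotificationCenter observers in the view controller (the tearDownAudio/rebuildAudio names are mine, and the reload calls mirror setupSequencer() above):
// Builds on the classes from the question.
override func viewDidLoad() {
    super.viewDidLoad()
    sequencer.setupSequencer() // callback track(s) created once, here
    NotificationCenter.default.addObserver(self, selector: #selector(tearDownAudio),
        name: UIApplication.willResignActiveNotification, object: nil)
    NotificationCenter.default.addObserver(self, selector: #selector(rebuildAudio),
        name: UIApplication.didBecomeActiveNotification, object: nil)
}

@objc func tearDownAudio() {
    try? AudioKit.stop()
    // Detach the samplers so stale connections aren't reused on resume
    sequencer.samplerWav.audioWav.detach()
    sequencer.samplerMp3.audioMp3.detach()
}

@objc func rebuildAudio() {
    try? AudioKit.start()
    // Reload files and rebuild the routing from scratch
    sequencer.samplerWav.loadWavFile()
    sequencer.samplerMp3.loadMp3File()
    sequencer.samplerWav.audioWav >>> sequencer.subMix
    sequencer.samplerMp3.audioMp3 >>> sequencer.subMix
}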

Related

How to play multiple footstep sounds without blocking animations?

I'm coding a game in which a man runs different distances at different speeds. Because the step durations vary (0.009 s, 0.02 s, 0.05 s, ...), playing a single sound containing multiple steps isn't a good fit. Currently the game plays one step sound each time the footstep moves.
I tried the following approaches. With the first, steps play one by one, but the animation thread is blocked, so the man runs unsmoothly:
DispatchQueue.main.async {
    Sound.shared.play(.step)
}
With the second, the animation runs smoothly, but some step sounds go missing; two or more sounds can play at the same time, so they blend into what sounds like a single step:
DispatchQueue.global(qos: .background).async {
    Sound.shared.play(.step)
}
The third behaves the same as the second:
let soundQueue = DispatchQueue(label: "soundQueue", qos: .default, attributes: [.concurrent],
                               autoreleaseFrequency: .inherit, target: nil)
soundQueue.async {
    Sound.shared.play(.step)
}
Supplementary info: the first 30 step sounds are stored in an array of AVAudioPlayer instances so they can be played quickly.
func playSound(_ type: SoundType) {
    switch type {
    case .key:
        // ....
        break
    case .step:
        self.doPlayStepSound()
    default:
        break
    }
}

var stepCount = 0
var stepPlayerArr = [AVAudioPlayer?]()

func doPlayStepSound() {
    if stepCount >= 30 {
        stepCount = 0
    }
    if let player = stepPlayerArr[stepCount] {
        player.play()
    }
    stepCount += 1
}
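For context, here is a sketch of how such an array might be preloaded (the resource name and the prepareToPlay() call are assumptions, not the poster's code):
// Hypothetical preload: build 30 ready-to-go players so play() starts instantly.
func preloadStepSounds() {
    guard let url = Bundle.main.url(forResource: "step", withExtension: "wav") else { return }
    for _ in 0..<30 {
        let player = try? AVAudioPlayer(contentsOf: url)
        player?.prepareToPlay() // pre-buffers the audio to minimize start latency
        stepPlayerArr.append(player)
    }
}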
How can I play each step sound when the man's footstep moves without blocking the animations?
Thanks.

How to remove sandbox without Xcode

Earlier I asked a question regarding generateCGImagesAsynchronously. Thankfully it got answered and works great.
The issue is that it only works as a Cocoa app in Xcode. I am now trying to move this logic to an executable Swift package, but AVFoundation code such as generateCGImagesAsynchronously won't work. That is, no error is raised, but those functions seem to be mocked. I presume this might have to do with my package being sandboxed? I was able to remove the sandbox from the Cocoa app I previously wrote the code for, but I can't figure out how to do that for this executable.
I am new to Swift and trying to understand it, and it's kind of frustrating to think that what I want my code to do depends on the IDE I am using.
If anyone can point me in the direction of where to read, in the docs or some other source, on how to make programs without using Xcode, that would be great. Thanks!
Here is my code:
import Darwin
import Foundation
import AppKit
import AVFoundation
import Cocoa
@discardableResult func writeCGImage(
    _ image: CGImage,
    to destinationURL: URL
) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(
        destinationURL as CFURL,
        kUTTypePNG,
        1,
        nil
    ) else { return false }
    CGImageDestinationAddImage(destination, image, nil)
    return CGImageDestinationFinalize(destination)
}

func imageGenCompletionHandler(
    requestedTime: CMTime,
    image: CGImage?,
    actualTime: CMTime,
    result: AVAssetImageGenerator.Result,
    error: Error?
) {
    guard let image = image else { return }
    let path = saveToPath.appendingPathComponent(
        "img\(actualTime).png"
    )
    writeCGImage(image, to: path)
}

let arguments: [String] = Array(CommandLine.arguments.dropFirst())

// For now, we assume the second arg, which is the
// path that the user wants us to save to, always exists.
let saveToPath = URL(fileURLWithPath: arguments[1], isDirectory: true)
let vidURL = URL(fileURLWithPath: arguments[0])
let vidAsset = AVAsset(url: vidURL)
let vidDuration = vidAsset.duration
let imageGen = AVAssetImageGenerator(asset: vidAsset)
var frameForTimes = [NSValue]()
let sampleCounts = 20
let totalTimeLength = Int(truncatingIfNeeded: vidDuration.value)
let steps = totalTimeLength / sampleCounts

for sampleCount in 0 ..< sampleCounts {
    let cmTime = CMTimeMake(
        value: Int64(sampleCount * steps),
        timescale: Int32(vidDuration.timescale)
    )
    frameForTimes.append(NSValue(time: cmTime))
}

imageGen.generateCGImagesAsynchronously(
    forTimes: frameForTimes,
    completionHandler: imageGenCompletionHandler
)
As I said in a comment on your previous question, this has nothing to do with Xcode per se. Xcode just generates a lot of code and build commands for you.
macOS is a complex operating system, and programs that want to use its more advanced features must follow certain patterns. One of these patterns is called the run loop. If you create a Cocoa app, you get most of these things for free.
Since you are trying to perform some asynchronous actions, you need a run loop. Appending this should work:
RunLoop.current.run()
Otherwise, your program will simply terminate when the main thread (your code) finishes. The run loop, however, keeps the program running and waiting for asynchronous events (this also includes UI interactions, for example) to occur.
Note that inserting this same line also fixes your issues from the other question.
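Concretely, the end of the script from the question becomes:
imageGen.generateCGImagesAsynchronously(
    forTimes: frameForTimes,
    completionHandler: imageGenCompletionHandler
)

// Keep the process alive so the asynchronous completion handlers can fire.
RunLoop.current.run()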

AVCaptureDevice configuration taking unpredictable time to propagate to connected AVCapturePhotoOutput

I am trying to modify the exposure duration of photos taken.
Within the loop:
for (customDuration, customISO) in zip(exposureDurations, exposureISOs) {
    // exposureDurations and exposureISOs are just arrays of exposures
    // code here
}
I set the AVCapture device's exposure to my custom value:
if let captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInWideAngleCamera,
                                               for: AVMediaType.video,
                                               position: .back) {
    captureSession = AVCaptureSession()
    capturePhotoOutput = AVCapturePhotoOutput()
    do { try captureDevice.lockForConfiguration() }
    catch { print("capturebracket: cannot lock camera for configuration."); return }
    // set the correct exposure
    var exposureModeSet = false
    captureDevice.setExposureModeCustom(duration: customDuration, iso: customISO,
                                        completionHandler: { (_) -> Void in exposureModeSet = true })
    // unlock the capture device (sets changes)
    captureDevice.unlockForConfiguration()
    // wait for changes to propagate
    while !exposureModeSet {}
    // get the correct photo settings
    let photoSettings = AVCapturePhotoSettings(format: [
        AVVideoCodecKey: AVVideoCodecType.jpeg,
        AVVideoCompressionPropertiesKey: [AVVideoQualityKey: jpegQuality]
    ])
Then I print the value to make sure it's been set correctly:
print("duration: \(captureDevice.exposureDuration)") // prints customExposure
print("device duration: \(( captureSession.inputs[0] as? AVCaptureDeviceInput)!.device.exposureDuration)") // prints customExposure
print("device duration: \((capturePhotoOutput.connections[0].inputPorts[0].input as? AVCaptureDeviceInput)!.device.exposureDuration)") // prints customExposure
photoSettings.isAutoStillImageStabilizationEnabled = false
Then I take the photo:
capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)
Somehow, when I loop through the above code with different exposures, the photos end up getting the wrong exposures. Specifically, the first run through works fine. The second will sometimes take a photo with the first exposure, sometimes with the second. The third will sometimes take a photo with the second exposure, sometimes with the third, and so on.
The interesting thing is that when I insert a long pause, say 1-3 seconds, just before taking the photo, all of the exposures are set correctly... so it seems that the changes to AVCaptureDevice take time to propagate, and the photo output rushes ahead and takes the photo before the exposure has been set correctly.
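One way to make that ordering explicit (a sketch, not the poster's code) is to trigger the capture from inside setExposureModeCustom's completion handler, which fires only once the new settings are actually in effect:
// Sketch: capture only after the device reports the exposure change is applied.
// Assumes the same lockForConfiguration()/unlockForConfiguration() bracketing
// and instance properties as in the question.
captureDevice.setExposureModeCustom(duration: customDuration, iso: customISO) { _ in
    let settings = AVCapturePhotoSettings(
        format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    settings.isAutoStillImageStabilizationEnabled = false
    self.capturePhotoOutput.capturePhoto(with: settings, delegate: self)
}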
Does anyone have any ideas? Can't seem to find anything in the docs about it.

CGDisplayStream only capturing a single frame

I'm finally starting to play with Swift (and macOS development) for the first time. I'm trying to write a simple screen capture app to get started. I've already taken a look at and succeeded in using the AVFoundation APIs for doing this (AVCaptureSession, AVCaptureScreenInput, etc). But now I'd like to attempt to go a little lower-level and play with the closer-to-the-metal CGDisplayStream API.
Unfortunately, I've only been able to get it to capture a single frame. I suspect I may be missing something regarding the potential interaction between the main RunLoop and the DispatchQueue I'm passing in? I'm not really clear on whether those things interact in the first place.
Here's a small reproduction of my issue:
import Foundation
import AVFoundation
import CoreGraphics

let mainDisplay = CGMainDisplayID()
let displayBounds = CGDisplayBounds(mainDisplay)
let recordingQueue = DispatchQueue.global(qos: .background)

let displayStreamProps: [CFString: Any] = [
    CGDisplayStream.preserveAspectRatio: kCFBooleanTrue,
    CGDisplayStream.showCursor: kCFBooleanTrue,
    CGDisplayStream.minimumFrameTime: 60,
]

let displayStream = CGDisplayStream(
    dispatchQueueDisplay: mainDisplay,
    outputWidth: Int(displayBounds.width),
    outputHeight: Int(displayBounds.height),
    pixelFormat: Int32(kCVPixelFormatType_32BGRA),
    properties: displayStreamProps as CFDictionary,
    queue: recordingQueue,
    handler: { status, displayTime, frameSurface, updateRef in
        print("is only called once")
    }
)

func quit(_: Int32) {
    displayStream?.stop()
}
signal(SIGINT, quit)

displayStream?.start()
RunLoop.current.run()
Any help would be massively appreciated!!
Removing this line seems to fix the issue:
CGDisplayStream.minimumFrameTime: 60,
The docs don't mention what the unit for this "time" field is, but it appears to be seconds, so you could change it to 1.0/60.0 for 60 fps capture.
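That is, assuming seconds:
let displayStreamProps: [CFString: Any] = [
    CGDisplayStream.preserveAspectRatio: kCFBooleanTrue,
    CGDisplayStream.showCursor: kCFBooleanTrue,
    CGDisplayStream.minimumFrameTime: 1.0 / 60.0, // one frame every 1/60th of a second
]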

iOS: Keep application running in background

How do I keep my application running in the background?
Would I have to jailbreak my iPhone to do this? I just need the app to check something on the internet at a set interval and notify me when needed, for my own use.
No need to jailbreak. Check out the "Implementing long-running background tasks" section of this doc from Apple.
From Apple's doc:
Declaring Your App’s Supported Background Tasks
Support for some types of background execution must be declared in advance by the app that uses them. An app declares support for a service using its Info.plist file. Add the UIBackgroundModes key to your Info.plist file and set its value to an array containing one or more of the following strings: (see Apple's doc from link mentioned above.)
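For example, an app that plays audio in the background would add something like this to its Info.plist (the audio value is just one of the strings listed in Apple's doc):
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>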
I guess this is what you need:
When an iOS application goes to the background, are lengthy tasks paused?
iOS Application Background Downloading
This might help you ...
Enjoy Coding :)
Use local notifications to do that. Note that a local notification won't check anything for you; you have to schedule the times at which you want to be alerted about your specific event, and you can tighten this by shortening the interval. Read more about local notifications to learn how to achieve this at:
http://developer.apple.com/library/mac/#documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Introduction/Introduction.html
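As an updated illustration of that idea, here is a sketch using the newer UserNotifications framework, which has since replaced the API in that link (the interval and text are placeholders):
import UserNotifications

// Schedule a repeating local notification as the periodic "check" reminder.
let center = UNUserNotificationCenter.current()
center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
    guard granted else { return }
    let content = UNMutableNotificationContent()
    content.title = "Time to check"
    // Repeating triggers must be at least 60 seconds apart.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 3600, repeats: true)
    let request = UNNotificationRequest(identifier: "periodic-check",
                                        content: content, trigger: trigger)
    center.add(request)
}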
I found a way to keep an app running in the background by playing silence.
Make sure that you select audio playback under Background Modes.
Also, don't use this method for long periods, since it consumes CPU resources and battery, but I think it's a suitable way to keep an app alive for a few minutes.
Just create an instance of SilencePlayer, call play(), and then stop() when you're done.
// AVAudioSession lives in AVFoundation; the AudioQueue APIs live in AudioToolbox
import AVFoundation
import AudioToolbox
public class SilencePlayer {
    private var audioQueue: AudioQueueRef? = nil

    public private(set) var isStarted = false

    public func play() {
        if isStarted { return }
        print("Playing silence")
        let avs = AVAudioSession.sharedInstance()
        try! avs.setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)
        try! avs.setActive(true)
        isStarted = true
        var streamFormat = AudioStreamBasicDescription(
            mSampleRate: 16000,
            mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked,
            mBytesPerPacket: 2,
            mFramesPerPacket: 1,
            mBytesPerFrame: 2,
            mChannelsPerFrame: 1,
            mBitsPerChannel: 16,
            mReserved: 0
        )
        let status = AudioQueueNewOutput(
            &streamFormat,
            SilenceQueueOutputCallback,
            nil, nil, nil, 0,
            &audioQueue
        )
        print("OSStatus for silence \(status)")
        var buffers = Array<AudioQueueBufferRef?>(repeating: nil, count: 3)
        for i in 0..<3 {
            // Allocate first, then set the byte size on the allocated buffer
            AudioQueueAllocateBuffer(audioQueue!, 320, &buffers[i])
            buffers[i]?.pointee.mAudioDataByteSize = 320
            SilenceQueueOutputCallback(nil, audioQueue!, buffers[i]!)
        }
        let startStatus = AudioQueueStart(audioQueue!, nil)
        print("Start status for silence \(startStatus)")
    }

    public func stop() {
        guard isStarted else { return }
        print("Called stop silence")
        if let aq = audioQueue {
            AudioQueueStop(aq, true)
            audioQueue = nil
        }
        try! AVAudioSession.sharedInstance().setActive(false)
        isStarted = false
    }
}

fileprivate func SilenceQueueOutputCallback(_ userData: UnsafeMutableRawPointer?,
                                            _ audioQueueRef: AudioQueueRef,
                                            _ bufferRef: AudioQueueBufferRef) -> Void {
    // Fill the buffer with zeroes (silence) and keep it cycling through the queue
    let pointer = bufferRef.pointee.mAudioData
    let length = bufferRef.pointee.mAudioDataByteSize
    memset(pointer, 0, Int(length))
    if AudioQueueEnqueueBuffer(audioQueueRef, bufferRef, 0, nil) != 0 {
        AudioQueueFreeBuffer(audioQueueRef, bufferRef)
    }
}
Tested on iOS 10 and Swift 4
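Typical usage from the app delegate might look like this (a sketch; when to start and stop is up to you):
let silence = SilencePlayer()

func applicationDidEnterBackground(_ application: UIApplication) {
    silence.play() // keeps the app alive while backgrounded
}

func applicationWillEnterForeground(_ application: UIApplication) {
    silence.stop()
}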
I know this is not the answer to your question, but I think it is a solution.
This assumes that you're trying to check something or get data from the internet on a regular basis?
Create a service that checks the internet at your set interval for whatever it is you want to know, and have it send a push notification to alert you if the server is down or whatever you're monitoring has changed state. Just an idea.
Yes, you can do something like this. For that, you need to set an entry in Info.plist to tell the OS that your app will run in the background. I did this when I wanted to send the user's location to a server at particular intervals. For that, I set "Required background modes" to "App registers for location updates".
You can write a handler of type UIBackgroundTaskIdentifier. You can do this in the applicationDidEnterBackground method.
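A sketch of that pattern inside the app delegate:
var backgroundTask: UIBackgroundTaskIdentifier = .invalid

func applicationDidEnterBackground(_ application: UIApplication) {
    backgroundTask = application.beginBackgroundTask {
        // Called shortly before the background time allowance expires.
        application.endBackgroundTask(self.backgroundTask)
        self.backgroundTask = .invalid
    }
    // ... perform the finite-length work, then end the task the same way.
}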