I have the following situation: in my app I have created virtual MIDI ports, as explained in some examples on the AudioKit page.
I use MIDI to trigger an AKMIDISampler, and the AKMIDISampler loads .aupreset files for percussive instruments. I create the .aupresets in AU Lab so that NOTE OFF messages should be ignored.
That way the samples can fade out and are not cut off when the next hit comes.
It works as expected with my MIDI keyboard and some other hardware MIDI controllers: MIDI NOTE OFF is ignored, the previous sound can fade out, and the sounds overlap fine. But when I load my app in Audiobus and trigger it over virtual MIDI with a sequencer, every new sample cuts off the previous one. That sounds very unnatural and needs to change.
What is the difference between virtual MIDI and hardware MIDI? What am I doing wrong, and how do I get the same behaviour on both paths? Any help is appreciated! Thank you!
// Open MIDI ports
let midi = AudioKit.midi
midi.createVirtualPorts()
midi.openInput(name: "Session 1")
midi.addListener(self)
// Play the sampler on incoming Note On
func receivedMIDINoteOn(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
    do {
        try self.myMIDISampler.play(noteNumber: myNote, velocity: velocity, channel: myMIDIChannel)
    } catch {
        AKLog("Can't play the file, error: \(error)")
    }
}
It should produce the same results for virtual MIDI and hardware MIDI commands.
In principle, I think I need a way to ignore or filter MIDI NOTE OFF at all levels.
That could also be a solution.
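If filtering NOTE OFF in the app turns out to be the way to go, one option worth trying is to implement the note-off callback of AudioKit's listener protocol and simply drop the events there. A minimal sketch, assuming the class already conforms to the same listener protocol that delivers receivedMIDINoteOn in the code above:

```swift
// Sketch: swallow NOTE OFF at the listener level so it never
// reaches the sampler, regardless of the MIDI source
// (hardware or virtual). Assumes AKMIDIListener conformance.
func receivedMIDINoteOff(noteNumber: MIDINoteNumber,
                         velocity: MIDIVelocity,
                         channel: MIDIChannel) {
    // Intentionally empty: never stop a playing note,
    // so every sample rings out and fades naturally.
}
```

Since both hardware and virtual MIDI input go through the same listener, this should make the two paths behave identically.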
@user3491466,
Without seeing your entire Xcode project and the .aupreset you've saved from AU Lab, I can only guess, but could you try increasing the layer's Release Time in the parameters menu, so that the samples continue to play out and don't get cut off?
Please let me know if this helps at all. Be sure to also check out the Audiobus SDK and the associated classes in the AudioKit Synth One open-source project. That way, you can see if anything is set up differently there:
https://github.com/AudioKit/AudioKitSynthOne
With the updates in iOS 14, there is a green dot that appears in the top corner when the camera is in use. My app sometimes uses the camera for AR, but it isn't needed for other aspects of the app. I'd like to make the green dot go away while the camera isn't needed so users don't think we're doing something with the camera in the background, but I can't seem to find a way to programmatically stop using the camera. Is there a way to do this?
It sounds strange: you're using an AR app but don't need the AR camera. I assume you understand that the RGB sensor (as well as the IMU sensors) must be active while an ARSession is running, because ARSession is built on the data coming from the RGB sensor. So if you turn it off, nothing good happens.
First, let's discuss how to start and stop AR tracking. You start running an ARSession using the standard ARKit procedure:
let config = ARWorldTrackingConfiguration()
sceneView.session.run(config, options: someOptions)
...and you can stop running ARSession using:
sceneView.session.pause()
Stopping the ARSession eliminates the heavy CPU/GPU processing and the considerable battery drain. However, once a session is stopped, the next time you run it, tracking must begin from scratch.
Solution
So the best solution in this case is to use pre-saved scene data stored in an ARWorldMap. Just track the surrounding environment, save the ARWorldMap, and then stop the session. Do whatever you want (with no active ARSession), then run a session again and load the ARWorldMap data. Simple.
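The save/stop/reload cycle can be sketched roughly like this (a sketch only, inside a view controller that owns the sceneView; savedMap is an assumed property name):

```swift
import ARKit

var savedMap: ARWorldMap?

// Capture the current world map, then pause the session
// so the camera (and the iOS 14 green dot) turns off.
func pauseAndSaveSession() {
    sceneView.session.getCurrentWorldMap { worldMap, error in
        guard let worldMap = worldMap else {
            print("Couldn't capture world map: \(String(describing: error))")
            return
        }
        self.savedMap = worldMap
        self.sceneView.session.pause()
    }
}

// Restart tracking, relocalizing from the saved map
// instead of starting from scratch.
func resumeSession() {
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = savedMap
    sceneView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Note that getCurrentWorldMap is asynchronous, so the pause happens in its completion handler once the map has actually been captured.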
After reading about it, I'm still a bit confused.
This function is called while the user swipes a finger on some UI element:
func wasDragged() { signal here }
I would like to produce a small haptic signal every time it's called (like the date picker wheel).
How do I set up the Haptic Engine the first time, and trigger the signals on each call?
Do I have to check the device model? I want this only on iPhone 7 and up.
Using latest Swift.
The documentation on haptic feedback is really descriptive.
But if you want a quick solution, here it is:
var hapticGenerator: UISelectionFeedbackGenerator?

func wasDragged() {
    hapticGenerator = UISelectionFeedbackGenerator()
    hapticGenerator?.selectionChanged()
    hapticGenerator = nil
}
Alternatively, depending on the logic of the screen, you can initialize the generator outside of wasDragged() and, inside it, just call hapticGenerator.prepare() and selectionChanged(). In that case you should not assign nil to it after the dragging is complete, because it wouldn't get triggered again. As the documentation notes, you should release the generator when it's no longer needed, since the Taptic Engine otherwise stays in a prepared state for another call and consumes system resources.
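The longer-lived alternative could look something like this (a sketch with assumed names; the key point is calling prepare() ahead of the event to reduce latency):

```swift
import UIKit

final class DragHandler {
    // Kept alive for the whole interaction instead of per call.
    private let hapticGenerator = UISelectionFeedbackGenerator()

    func dragWillBegin() {
        // Spin up the Taptic Engine before the first tick.
        hapticGenerator.prepare()
    }

    func wasDragged() {
        hapticGenerator.selectionChanged()
        // Keep the engine primed for the next tick.
        hapticGenerator.prepare()
    }
}
```

When the drag gesture ends for good, let the handler (and thus the generator) be released so the Taptic Engine can power down.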
Note that calling these methods does not play haptics directly. Instead, it informs the system of the event. The system then determines whether to play the haptics based on the device, the application's state, the amount of battery power remaining, and other factors.
For example, haptic feedback is currently played only:
On a device with a supported Taptic Engine
When the app is running in the foreground
When the System Haptics setting is enabled
Documentation:
https://developer.apple.com/documentation/uikit/uifeedbackgenerator
I've been scratching my head with this one, to no avail.
I'm just trying to find out how to instantly change the current playback position when using the AKAudioPlayer provided by AudioKit.
player.playhead is read-only, so it can't be changed.
Changing player.startTime while the player is already playing changes the reported .playhead position, but the audio that's actually playing does not change position. Am I missing something here?
Obviously I can stop the audio and restart at the new position, but a several-second, CPU-expensive gap is not desirable for a simple mp3/wav file player!
Any ideas?
We've added a new player called AKPlayer, which also allows scanning through the file. The player streams from disk by default and is designed to fix some of the shortcomings of AKAudioPlayer.
OK, Aurelius pointed me in the right direction: a new class introduced a couple of weeks ago called AKClipPlayer.
Like the standard Apple AV classes, it lets you set the .currentTime property (in seconds), which the AKAudioPlayer class did not allow. Setting it seems to stop playback, though, so don't forget to call .play() on your node right after. There is no glitch or pause; it's seamless even on an old iPhone 6S.
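Based on that behaviour, a seek helper could be as small as this (a sketch; it only relies on the .currentTime and .play() calls described above):

```swift
import AudioKit

// Jump to a new position, then restart playback immediately,
// since setting currentTime stops the player.
func seek(player: AKClipPlayer, to seconds: Double) {
    player.currentTime = seconds
    player.play()
}
```

Wiring a UISlider to this is then just a matter of mapping the slider's value to seconds within the file's duration.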
My initial issue was as described in the title there. When I was recording from the mic, sound effects were playing at the same time even if the device was in mute mode (i.e., had the mute button physically switched on).
I have since found this thread, which totally worked: the mute button now works correctly, and no sounds are played in the app while the mic is recording...
Unfortunately, the mic has now stopped recording!
It seems like I can have one or the other, but not both. Can anyone confirm if I can have the device on mute and record from the mic at the same time? And if so, how?
Thank you so much in advance,
Stew
UPDATE: I'm fairly certain that this isn't possible, and I'm basing this on the fact that the mute switch also doesn't work in GarageBand (another app that requires simultaneous playback and recording).
To try it yourself, simply load up GarageBand, play some music, and note that the mute switch doesn't work.
I'm leaving this open in case anyone does come up with a solution, but I won't be offended if it's closed or deleted.
Can somebody tell me how to scrub the AQPlayer (used in Apple's SpeakHere example) using a UISlider, like the iPod does?
I know how to handle the slider part, but once I have my value from the slider, what do I need to set/change/update in AQPlayer, or the AudioQueue, so that the player moves to that part of the queue and continues playing from that point?
Is there an easy way to do this with a percentage of the playing time, or do I have to make some calculations with the packets?
Thanks for any input.
Al
For anyone who also needs to seek/scrub in an audio file, I found a solution to my question at the following link: Audio Queues
Have a look at the function
-(void)seek:(UInt64)packetOffset;
It worked perfectly after some initial fine-tuning.
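On the percentage question: since that seek: method takes a packet offset, the slider's 0–1 value does need one small calculation against the file's total packet count. A sketch of that mapping (totalPackets is assumed to be read from the audio file, e.g. via the kAudioFilePropertyAudioDataPacketCount property):

```swift
// Map a UISlider value in 0.0...1.0 to a packet offset
// suitable for the seek: method above.
func packetOffset(forSliderValue value: Float, totalPackets: UInt64) -> UInt64 {
    let clamped = min(max(value, 0), 1)
    return UInt64(Double(clamped) * Double(totalPackets))
}
```

This assumes roughly constant packets-per-second, which holds well enough for scrubbing in typical compressed files.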