Programmatically Create Aggregate Audio Devices In Swift Using CoreAudio

I am researching creating multi-output devices on either OS X or iOS, and I found out that CoreAudio allows you to create aggregate devices. My first question is: does iOS allow you to do this? I know that this is certainly possible on OS X, but I've heard that iOS will not allow it. I would really appreciate an example of how you would go about detecting multiple audio output devices and creating an aggregate device, all using Swift. I have checked here, but it doesn't fully answer my question, and the answer it does have is based on Objective-C. I'd appreciate any help, and thanks in advance!

The aggregate audio device API is not publicly available on iOS, so you cannot create those devices yourself.
However, iOS will create aggregate devices* for you, depending on the most recently attached audio hardware and some other rules, if you activate an AVAudioSession that uses the .multiRoute category.
When you get a route change notification because an audio interface was added or removed, you can create a Remote I/O audio unit with the right number of channels. I haven't tried using multi-route audio with AVAudioEngine, nor have I tried using only a subset of the available channels.
* They’re probably aggregate devices, although you never see them or interact with them directly.

Related

Private iOS API to access raw input from the noise canceling mic(s) on iPhone?

Is there a way to use a private iOS API to access the raw input from the noise canceling mic(s) on the iPhone?
I've tried looking through header dumps I found online but couldn't find anything related to the secondary microphones.
1) One interesting thing which I found on this subject is
./System/Library/Frameworks/AudioToolbox.framework/AudioToolbox
It has some class called AUMultiMicNoiseSuppressor.
2) Make sure that you have the newest header dumps, because a lot of the dumps online are for iOS 3.0 (which is outdated).
3) I would recommend looking through the frameworks, choosing promising ones, and running them through a disassembler. Header dumps are usually made with class-dump-z, which dumps only the Objective-C API and not the C API. It could be that the API you are looking for is a C API.
Have you found a solution for accessing multiple microphones simultaneously? I'm working on a project just as you mentioned. What I know so far is that stereo recording using the built-in microphones is possible:
https://audioboo.fm/boos/1102187-recording-in-stereo-from-the-iphone-5#t=0m20s
And it is pretty easy on Android: using the Record class, there are two channels recording the sound from the bottom microphone and the top microphone (Note 3).
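On the iOS side, there is at least public API for choosing which built-in microphone feeds the input. A hedged Swift sketch, assuming a play-and-record session (raw, simultaneous access to every mic is still not exposed; the function name is mine):

```swift
import AVFoundation

/// Prefer the bottom built-in microphone, using only public API.
func preferBottomMicrophone() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord)

    guard let builtInMic = session.availableInputs?
        .first(where: { $0.portType == .builtInMic }) else { return }

    // Data sources distinguish the individual mics (bottom, front, back).
    if let bottom = builtInMic.dataSources?
        .first(where: { $0.orientation == .bottom }) {
        try builtInMic.setPreferredDataSource(bottom)
    }
    try session.setPreferredInput(builtInMic)
    try session.setActive(true)
}
```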

How to record game in cocos2d iPhone

I am developing a cocos2d app.
It's almost completed but now I want to record the activities of my app as a video file, including sound produced by the app.
How can I implement this?
Can anybody help me? Please suggest a way to implement this.
Thanks in advance.
The question isn't new, but since it isn't answered I thought I'd pitch in:
We provide an SDK called "Everyplay" that allows you to do exactly what you're looking for. It's free to use, and is lightweight.
We provide out-of-the-box integrations for Unity3D, cocos2d (1.x, 2.x), cocos2d-x, and you can of course integrate to a custom OpenGL-based game engine.
The documentation is available at https://developers.everyplay.com/doc
The documentation contains an example app key to use when developing, but you can of course sign up for your own client key at https://developers.everyplay.com/
There are many options - and the fact that your app is cocos2d doesn't matter much.
iSimulate works well. You can actually play the app on your device and record the gameplay as well as the touch events. This is important if you want to show user interaction in your app. You run the app in the simulator but you control it from your device.
If you just want to record the app interaction without caring about showing users the touch events, you can use Screenflow or Jing or some other recording software. I used to use Jing (free), but Screenflow works better for me, and it also lets you create more advanced videos, like a trailer with effects. Edit: You should be able to capture touch events through the simulator with Screenflow too. You can choose to show them or not, and you can use different indicators for those events.
Search on Google for Mac or iPhone recording software. There are many options. I had the best experience with Screenflow because I wanted to make a trailer and gameplay video.
I'm developing a similar application that allows the user to record activity within a cocos2d-x app.
I'm using a screen-capture method and then combining the frames using FFmpeg. The performance wasn't too good, though, but it is the easiest way to achieve this.
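These answers predate ReplayKit, which current iOS versions provide for exactly this purpose (recording the running app's screen and audio). A minimal Swift sketch of that alternative; the RPScreenRecorder calls are real API, the wrapper class is mine:

```swift
import ReplayKit
import UIKit

/// Start and stop an in-app recording (screen plus app audio).
final class GameplayRecorder {
    private let recorder = RPScreenRecorder.shared()

    func start() {
        recorder.startRecording { error in
            if let error = error { print("Could not start: \(error)") }
        }
    }

    func stop(from presenter: UIViewController) {
        recorder.stopRecording { preview, error in
            // ReplayKit hands back a preview controller for saving/sharing.
            if let preview = preview {
                presenter.present(preview, animated: true)
            }
        }
    }
}
```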

Reverb with OpenAL on iOS

Is there any possible way to do reverb using OpenAL on iOS? Anyone have any code snippets to achieve this effect? I know it's not included in the OpenAL library for iOS, but I would think there's still a way to program it in.
Thanks.
Reverb is now natively supported in OpenAL (as of iOS 5.0). You can view a sample implementation on the ObjectAL project:
https://github.com/kstenerud/ObjectAL-for-iPhone
Just grab the most recent source from this repository, load "ObjectAL.xcodeproj" and run the ObjectALDemo target on any iOS 5.0 device (should also work on the simulator).
The actual implementation lies in two places:
https://github.com/kstenerud/ObjectAL-for-iPhone/blob/master/ObjectAL/ObjectAL/OpenAL/ALListener.m
https://github.com/kstenerud/ObjectAL-for-iPhone/blob/master/ObjectAL/ObjectAL/OpenAL/ALSource.m
Look for the word 'reverb' in these files (and the corresponding header files) to find the name of the OpenAL properties and constants used to set and control the reverb effect.
Good luck!
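For reference, here is a hedged Swift sketch of the underlying calls ObjectAL makes: Apple's ASA extension is looked up through alcGetProcAddress at run time, and the FourCC property constants are transcribed by hand below because the C multi-character constants don't import into Swift. Treat the exact constants and signatures as assumptions to verify against <OpenAL/MacOSX_OALExtensions.h>:

```swift
import OpenAL

// FourCC property codes transcribed from <OpenAL/MacOSX_OALExtensions.h>.
let ALC_ASA_REVERB_ON: ALuint = 0x72766F6E          // 'rvon'
let ALC_ASA_REVERB_SEND_LEVEL: ALuint = 0x7276736C  // 'rvsl'

typealias ASASetListener = @convention(c)
    (ALuint, UnsafeMutableRawPointer?, ALuint) -> ALenum
typealias ASASetSource = @convention(c)
    (ALuint, ALuint, UnsafeMutableRawPointer?, ALuint) -> ALenum

/// Enable Apple's built-in reverb and set one source's send level.
func enableReverb(on source: ALuint) {
    // The ASA entry points are looked up at run time, as ObjectAL does.
    guard let listenerPtr = alcGetProcAddress(nil, "alcASASetListener"),
          let sourcePtr = alcGetProcAddress(nil, "alcASASetSource") else { return }
    let setListener = unsafeBitCast(listenerPtr, to: ASASetListener.self)
    let setSource = unsafeBitCast(sourcePtr, to: ASASetSource.self)

    var reverbOn: UInt32 = 1
    _ = setListener(ALC_ASA_REVERB_ON, &reverbOn,
                    ALuint(MemoryLayout<UInt32>.size))

    var sendLevel: Float = 0.5   // 0.0 = dry ... 1.0 = fully wet
    _ = setSource(ALC_ASA_REVERB_SEND_LEVEL, source, &sendLevel,
                  ALuint(MemoryLayout<Float>.size))
}
```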
You could use pre-rendered audio if the situation allows it. If you want to do it in real time, look into DSP. There's no way to do this out of the box that I am aware of.
The additional desktop APIs, like EFX and EAX, use hardware signal processing. Maybe in the future these handheld devices will implement the full OpenAL and OpenGL APIs, but for now we have the stripped-down versions, for practical reasons like cost and battery life.
I'm sure there is a way, but it's not going to be easy.

Simple embeddable MidiSynth for iOS?

I have a guitar diagram app for Android that I am porting to iOS. Android has an embedded MIDI synthesizer (Sonivox), so I can generate MIDI files and let Android handle the playback.
Is there a way to do this on iOS? Or are there very lightweight embeddable synths for iOS?
Since iOS 5 there's now the AUSampler audio unit, for which you can load a sound bank (Apple presets and SoundFonts) and then control the sampler/synth via MIDI messages.
See this example app:
https://developer.apple.com/library/ios/#samplecode/LoadPresetDemo/Listings/ReadMe_txt.html
It works great!
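In Swift, AVAudioUnitSampler wraps that same AUSampler, so the LoadPresetDemo flow comes down to a few lines. A minimal sketch, assuming a SoundFont file named GeneralUser.sf2 shipped in the app bundle (the file name is hypothetical):

```swift
import AVFoundation

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()

func playMiddleC() throws {
    engine.attach(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)
    try engine.start()

    // Hypothetical SoundFont shipped in the app bundle.
    guard let bankURL = Bundle.main.url(forResource: "GeneralUser",
                                        withExtension: "sf2") else { return }
    try sampler.loadSoundBankInstrument(
        at: bankURL,
        program: 0,        // GM program 0: acoustic grand piano
        bankMSB: 0x79,     // kAUSampler_DefaultMelodicBankMSB
        bankLSB: 0x00)     // kAUSampler_DefaultBankLSB

    sampler.startNote(60, withVelocity: 100, onChannel: 0)   // middle C
}
```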
Update: My answer is outdated. lukebuehler's answer is more appropriate.
If you don't mind a non-open-source solution, try FMOD. Being a commercial audio engine for games, FMOD includes a simple MIDI synth. I've tried the free evaluation version; it plays GM MIDI files correctly on my iPhone 3G.
If what you want is not just an SMF file player but a full-function GS/GM softsynth that can respond to individual MIDI events in real time, you can try the MidiSynth from CrimsonTech. Its license fee is fair. CrimsonTech provides several demo apps in the App Store, and it also provides an evaluation SDK for free. You don't need to pay a penny for the license until you're actually going to publish your app.
I don't think the MIDI support in iOS 4.2 allows playback of MIDI data on the phone itself. It is merely for sending and receiving MIDI commands to and from other MIDI devices.
From the recent iOS 4.2 docs, it seems that you can use the MIDI support to send MIDI commands to other devices for playback. You can also receive commands from other devices and make changes to those commands, or save them to a file. However, I can't find any support for actually playing a MIDI file on the phone directly. Someone please correct me if I'm wrong (I wish I were wrong!).
There is MIDI support in iOS 4.2. If it is the same as what OS X provides, then there will also be a basic synth included. Check it out.

Outputting audio to paired bluetooth device?

I ride an off-road motorcycle on the unsurfaced road network of ancient byways in the UK. It's great fun, but I've yet to find a good turn-by-turn application suitable for this purpose. So, I figured I'd write one :)
I have a Bluetooth system in my helmet. Is there any way of streaming audio to a paired Bluetooth device from an iPhone app? I can't see any reference to this in the SDK docs. I don't use the TomTom app, but I guess that must do it?
I think this is what you are looking for. It's very well documented in the Apple Docs.
IOBluetoothDevice
Here's a topic that could be of help too:
Topic
I think if you pair the Bluetooth device, then using it to receive audio shouldn't be hard.
Having said that, I modified my answer, because headphones (when paired via Bluetooth) should appear as routes in Core Audio, making it trivial to pass audio to them. If you want to do anything sophisticated with Bluetooth (more than just passing audio), I think GameKit is your best shot.
Hope I helped!
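On the app side, the AVAudioSession options are the key part of making that route happen. A minimal Swift sketch, assuming a voice-prompt-style navigation app (the category and mode choices are assumptions; with the plain .playback category, A2DP headsets are already an allowed route):

```swift
import AVFoundation

/// Ask iOS to allow paired Bluetooth headsets as an output route
/// for spoken turn-by-turn prompts.
func routePromptsToHeadset() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voicePrompt,
                            options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setActive(true)
}
```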