How to listen to MIDI events from AKSequencer

I am playing a MIDI file with the new AKSequencer, but I don't know how to listen to the MIDI events the sequencer plays using an AKMIDIListener.
My goal is to create a MIDI player that can send the data to another instrument with a MIDI IN port, or to a CoreMIDI-compatible RTP-MIDI server.

AKSequencer was renamed to AKAppleSequencer, and a new AKSequencer was introduced in its place. Judging from the AudioKit source code, the function setGlobalMIDIOutput is not implemented in the new AKSequencer yet, so I suggest you use AKAppleSequencer instead.
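A minimal sketch of that approach, assuming AudioKit 4.x (exact method names such as `createVirtualInputPort` and the `AKMIDIListener` callback signatures vary slightly between versions): route the sequencer's output into a virtual MIDI input owned by `AKMIDI`, then attach a listener to it.

```swift
import AudioKit

// Sketch, assuming AudioKit 4.x: route AKAppleSequencer's output through a
// virtual MIDI destination so an AKMIDIListener can observe every event.
class SequencerMonitor: AKMIDIListener {
    let midi = AKMIDI()
    let sequencer = AKAppleSequencer()

    func start(fileURL: URL) {
        sequencer.loadMIDIFile(fromURL: fileURL)
        midi.createVirtualInputPort()                    // virtual destination we can tap
        sequencer.setGlobalMIDIOutput(midi.virtualInput) // sequencer now sends to it
        midi.addListener(self)                           // we hear what the sequencer plays
        sequencer.play()
    }

    // Called for every note-on the sequencer emits.
    func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                            velocity: MIDIVelocity,
                            channel: MIDIChannel) {
        print("note on \(noteNumber), velocity \(velocity)")
    }
}
```

From `receivedMIDINoteOn` (and the matching note-off/controller callbacks) you can forward each event to a hardware MIDI OUT endpoint or to the Core MIDI network session for RTP-MIDI.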

Related

MIDI File playback library for iOS

Is anyone aware of a library that can load a standard MIDI File and send the output to a MIDI interface? I've seen a number of libraries that play MIDI Files into an internal synthesizer, but none that will output to the MIDI interface.
Thanks in advance!
Start with MusicSequence from AudioToolbox.framework.
Create a MusicSequence then set the endpoint using MusicSequenceSetMIDIEndpoint.
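A minimal sketch of that approach, assuming `destination` is a `MIDIEndpointRef` you obtained elsewhere (e.g. via `MIDIGetDestination(0)`):

```swift
import AudioToolbox
import CoreMIDI

// Load a standard MIDI file and route its playback to a Core MIDI endpoint
// instead of the built-in synthesizer.
func playFile(at url: CFURL, to destination: MIDIEndpointRef) {
    var sequence: MusicSequence?
    NewMusicSequence(&sequence)
    guard let sequence = sequence else { return }

    MusicSequenceFileLoad(sequence, url, .midiType, MusicSequenceLoadFlags())
    // Send every track's events to the external endpoint.
    MusicSequenceSetMIDIEndpoint(sequence, destination)

    var player: MusicPlayer?
    NewMusicPlayer(&player)
    guard let player = player else { return }
    MusicPlayerSetSequence(player, sequence)
    MusicPlayerPreroll(player)
    MusicPlayerStart(player)
}
```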

Recording Radio Streaming in iOS

I am working on a radio app in which I want to record the live radio stream when a record button is clicked. It should record just the radio streaming sound. Currently I am recording from the mic, but that's not what I want, as it also captures surrounding/background noise. I did read a few posts, but all of them are about AVAudioRecorder. I have used the wunderradio API for playing the radio from a URL, but I haven't found any method/function in their API that will record that audio. I want to record only the buffered audio from that URL, so that I can play the recorded file later.
Any help would be appreciated.
Recording audio playback by using the microphone is really not the right way to do it. If you're using Audio Queue Services for audio playback, you can use the AudioFileWritePackets function to write audio data to a file.
If you implement recording this way, you can write audio files in the format they're streamed in. So an MP3 audio stream can be written as an MP3 file to disk, since you're writing the encoded audio packets with the AudioFileWritePackets function.
To get a better understanding of how this all should be implemented, have a look at Apple's SpeakHere sample code. The SpeakHere project records from the microphone, but it's still a good example for seeing how the AudioFileWritePackets function works. In your case, you need to use the AudioFileWritePackets function in the output callback of your AudioQueue. This is the callback that's called when a buffer has finished playing, so it's a good place to write the already-played audio buffer's data to a file.
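A sketch of that callback, assuming a hypothetical `PlayerState` class holding the `AudioFileID` (opened earlier with `AudioFileCreateWithURL` using the stream's format) and a running packet count:

```swift
import AudioToolbox

// Hypothetical per-player state; the names are illustrative, not from SpeakHere.
final class PlayerState {
    var audioFile: AudioFileID!   // opened with AudioFileCreateWithURL
    var packetsWritten: Int64 = 0 // running count of packets written so far
}

// AudioQueue output callback: fires when `buffer` has finished playing.
// The buffer still holds the encoded packets, so append them to the file
// before refilling and re-enqueueing it.
let outputCallback: AudioQueueOutputCallback = { userData, queue, buffer in
    let state = Unmanaged<PlayerState>.fromOpaque(userData!).takeUnretainedValue()

    var numPackets = buffer.pointee.mPacketDescriptionCount
    if numPackets > 0 {
        AudioFileWritePackets(state.audioFile,
                              false,                               // don't cache
                              buffer.pointee.mAudioDataByteSize,
                              buffer.pointee.mPacketDescriptions,
                              state.packetsWritten,
                              &numPackets,
                              buffer.pointee.mAudioData)
        state.packetsWritten += Int64(numPackets)
    }

    // ... refill `buffer` with the next packets from the stream and re-enqueue ...
}
```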

Network MIDI Protocol using iOS

Is it possible to output both live audio and RTP MIDI data from an iPhone's Bluetooth device? I'm looking to leverage the Core MIDI library. Crucial is the ability to fire off both sources simultaneously with no latency.
Many thanks.
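On the MIDI side, Core MIDI exposes the network (RTP-MIDI) session directly; a minimal sketch of enabling it (audio output is a separate path, e.g. an AVAudioSession/AudioQueue pipeline):

```swift
import CoreMIDI

// Enable the built-in Core MIDI network session so the device appears as an
// RTP-MIDI session (e.g. in Audio MIDI Setup on a Mac) and can exchange MIDI
// over the network.
let session = MIDINetworkSession.default()
session.isEnabled = true
session.connectionPolicy = .anyone   // accept incoming RTP-MIDI connections

// The session exposes an ordinary Core MIDI endpoint you can send packets to:
let destination: MIDIEndpointRef = session.destinationEndpoint()
```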

How to do a live midi streaming?

I'm looking for help with my research.
I'm trying to generate live MIDI so that people can listen to it via a web browser.
I'm not sure, but I'm guessing there must be a way to set up a MIDI server that accepts connections from my desktop MIDI sequencer and sends that MIDI data to an online server, which people can then connect to in order to listen to the MIDI being generated live, right in their web browser.
Any help appreciated.
You are going to be better off writing a plugin to handle this. That being said, it is possible to play MIDI dynamically with JavaScript. See this question: generating MIDI in javascript
You could read in the data just like any feed from your web server and play it back in chunks.

Save audio stream into file with MPMoviePlayerController

I'm having some trouble with audio streaming using MPMoviePlayerController.
I want to know if it's possible to save the streamed data to a file while MPMoviePlayerController is playing it.
Is there a simple way to do this?
Does anyone have an idea?
According to Apple (http://developer.apple.com/library/ios/#codinghowtos/AudioAndVideo/_index.html), for streaming audio you connect to a network stream using the CFNetwork interfaces from Core Foundation, then parse the network packets into audio packets using Audio File Stream Services (AudioToolbox/AudioFileStream.h), and then play the audio packets using Audio Queue Services (AudioToolbox/AudioQueue.h).
Now my idea is that if we can find some way to write the audio packets to a file between parsing them and sending them to the audio queue, then we can save the audio stream while playing it.
It's just an idea that needs implementation, and I don't know whether it will work for video streams or not.
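The idea above can be sketched in the Audio File Stream Services packets callback, which is where the parsed but still-encoded packets arrive. The `StreamState` type and its fields are hypothetical names for illustration; `audioFile` is assumed to be an `AudioFileID` created with `AudioFileCreateWithURL` once the stream's data format is known.

```swift
import AudioToolbox

// Hypothetical per-stream state for the sketch below.
final class StreamState {
    var audioFile: AudioFileID!   // created once the stream format is known
    var packetsWritten: Int64 = 0
}

// AudioFileStream packets callback: tee the encoded packets into a file
// on their way to the audio queue.
let packetsProc: AudioFileStream_PacketsProc = {
    clientData, numberBytes, numberPackets, inputData, packetDescriptions in
    let state = Unmanaged<StreamState>.fromOpaque(clientData).takeUnretainedValue()

    // 1. Save: append the encoded packets (e.g. MP3 frames) to the file.
    var ioNumPackets = numberPackets
    AudioFileWritePackets(state.audioFile, false, numberBytes,
                          packetDescriptions, state.packetsWritten,
                          &ioNumPackets, inputData)
    state.packetsWritten += Int64(ioNumPackets)

    // 2. Play: copy the same packets into an AudioQueue buffer and enqueue it.
    // ... AudioQueueEnqueueBuffer(...) ...
}
```

Since the packets are written in their original encoded form, the saved file ends up in the same format as the stream, mirroring the AudioFileWritePackets approach described in the radio-recording answer above.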