Programmatically playing MIDI with OpenAL and SF2

I can create a basic MIDI file from scratch (verified with TiMidity++), and I know enough OpenAL to play a streamed source from a file.
I'm wondering whether it's possible to write a program in C that plays MIDI files by reading in an SF2 file and a MIDI file and using OpenAL to play the sound. Do I need to use another library? I'd like to know beforehand, because the SF2 format looks pretty complicated.

Core Audio is where it's at!
If you are doing this on a Mac, DLSMusicDevice does what you want.
If you are on iOS, you will need to do much of it by hand, but it is manageable (if tedious). If you go this route, look at the MixerHost sample code.

You need to use Core Audio. Create an AUGraph with a sampler unit connected to an I/O unit. Look at the header file AUComponent.h in the AudioUnit framework. Once you've got your audio graph set up, you can play a note using the following function:
MusicDeviceMIDIEvent(sampler, status, note, velocity, 0);
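For context, here's a minimal sketch of that setup in C, assuming iOS 5+/OS X 10.7+ where Apple's AUSampler (which can load SF2 presets) is available; the bank path and preset number are placeholders and error checking is omitted:

    #include <AudioToolbox/AudioToolbox.h>

    AUGraph graph;
    AUNode samplerNode, ioNode;
    AudioUnit samplerUnit;

    // Describe a sampler (music device) and an output unit
    AudioComponentDescription samplerDesc = {
        .componentType         = kAudioUnitType_MusicDevice,
        .componentSubType      = kAudioUnitSubType_Sampler,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription ioDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO, // kAudioUnitSubType_DefaultOutput on the Mac
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    // Build the graph: sampler -> output
    NewAUGraph(&graph);
    AUGraphAddNode(graph, &samplerDesc, &samplerNode);
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphOpen(graph);
    AUGraphConnectNodeInput(graph, samplerNode, 0, ioNode, 0);
    AUGraphNodeInfo(graph, samplerNode, NULL, &samplerUnit);
    AUGraphInitialize(graph);

    // Load a preset from an SF2 bank (path is a placeholder)
    CFURLRef sf2URL = CFURLCreateWithFileSystemPath(NULL, CFSTR("/path/to/bank.sf2"),
                                                    kCFURLPOSIXPathStyle, false);
    AUSamplerInstrumentData instData = {
        .fileURL        = sf2URL,
        .instrumentType = kInstrumentType_SF2Preset,
        .bankMSB        = kAUSampler_DefaultMelodicBankMSB,
        .bankLSB        = kAUSampler_DefaultBankLSB,
        .presetID       = 0
    };
    AudioUnitSetProperty(samplerUnit, kAUSamplerProperty_LoadInstrument,
                         kAudioUnitScope_Global, 0, &instData, sizeof(instData));
    CFRelease(sf2URL);

    AUGraphStart(graph);

    // Note-on: status 0x90 (channel 0), middle C (60), velocity 100
    MusicDeviceMIDIEvent(samplerUnit, 0x90, 60, 100, 0);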

Related

Objective-C: record audio session output

I am writing an app that generates music. I am using OpenAL to modify gain, modify pitch, mix audio, and play the resulting audio. I now need to record the audio as it is being played. I understand that OpenAL does not let you record the output audio. The other option I have found is to use Audio Units. However, because I need to mix/pitch/gain the audio and record it, it seems I would have to write all the audio processing myself so I can have access to the output buffer. Is this correct? Or is there a different iOS API I can use to do this? If not, is there a third-party solution that lets me record the output (paid solutions are fine)?
You are correct.
Audio Units are the only iOS public API that allows an app to both process and then record audio.
Trying to record the OpenAL output may well be a violation of Apple's rules against using non-public APIs.
The alternative may be to completely rewrite the portions of OpenAL you need (there may be open source for some portions) running on top of the RemoteIO Audio Unit.
The best way to go is likely Core Audio, since it will give you as much flexibility as you need. Take a look at the Extended Audio File Services reference pages.
Using an extended audio file, you should be able to set up a file format and an audio stream buffer to send the final mixed output to, and then use the ExtAudioFileWrite() function to write the samples to the file.
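As a rough illustration, here is that write path sketched in C; the output URL, the CAF container, and the 16-bit stereo PCM format are my assumptions, and in a real app the samples would come from your render callback rather than a silent buffer:

    #include <AudioToolbox/AudioToolbox.h>
    #include <string.h>

    // Interleaved 16-bit stereo PCM at 44.1 kHz
    AudioStreamBasicDescription fmt = {
        .mSampleRate       = 44100.0,
        .mFormatID         = kAudioFormatLinearPCM,
        .mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
        .mBytesPerPacket   = 4,
        .mFramesPerPacket  = 1,
        .mBytesPerFrame    = 4,
        .mChannelsPerFrame = 2,
        .mBitsPerChannel   = 16
    };

    CFURLRef url = CFURLCreateWithFileSystemPath(NULL, CFSTR("/tmp/mix.caf"),
                                                 kCFURLPOSIXPathStyle, false);
    ExtAudioFileRef file;
    ExtAudioFileCreateWithURL(url, kAudioFileCAFType, &fmt, NULL,
                              kAudioFileFlags_EraseFile, &file);

    // One second of silence stands in for your mixed output
    SInt16 samples[44100 * 2];
    memset(samples, 0, sizeof(samples));
    AudioBufferList abl = {
        .mNumberBuffers = 1,
        .mBuffers[0] = { .mNumberChannels = 2,
                         .mDataByteSize   = sizeof(samples),
                         .mData           = samples }
    };
    ExtAudioFileWrite(file, 44100, &abl);

    ExtAudioFileDispose(file);
    CFRelease(url);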

Is it possible to convert audio file with some music notes?

I have a music file with a particular tone (music.mp4). I want to convert an existing sound file (speech.mp4) into the tone specified in music.mp4. It's like converting speech into a particular tone. I do not want to play both files simultaneously; I want to convert the source file with the help of the music file, so the output is the converted file.
Is this possible? I have searched the Audio Unit Hosting and Multimedia guides but found no clue.
Thanks in advance.
The answer is: it sounds (no pun intended) like it would be possible on iOS. You just need to find someone who knows how to program that specific functionality. I do not know why you would expect to find the answer in the Apple docs. I want to know if it's possible to program music that plays backwards; I want to know if I can program a sound that converts my words into something my dog understands. The documentation cannot possibly cover everything everyone ever wants to program an iPhone to do.
There are no iOS public APIs for frequency analysis of audio files. You would have to write your own DSP code for that. The AVFoundation and Accelerate frameworks have some audio file conversion and math functions that may help, but that is only a small portion of the code needed.
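To give a flavor of what writing your own DSP involves, here is a minimal sketch of one building block, a forward FFT using the Accelerate framework's vDSP routines; the 1024-sample buffer is a placeholder, and this is only the analysis half, not the resynthesis a tone-conversion effect would also need:

    #include <Accelerate/Accelerate.h>

    const vDSP_Length log2n = 10;        // 2^10 = 1024 samples
    const vDSP_Length n = 1 << log2n;

    float signal[1024] = {0};            // your PCM samples go here
    float realp[512], imagp[512];
    DSPSplitComplex split = { realp, imagp };

    FFTSetup setup = vDSP_create_fftsetup(log2n, kFFTRadix2);

    // Pack the real signal into split-complex form, then FFT in place
    vDSP_ctoz((const DSPComplex *)signal, 2, &split, 1, n / 2);
    vDSP_fft_zrip(setup, &split, 1, log2n, kFFTDirection_Forward);

    // Squared magnitude of each frequency bin
    float mags[512];
    vDSP_zvmags(&split, 1, mags, 1, n / 2);

    vDSP_destroy_fftsetup(setup);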

iOS Advanced Audio API for decompressing format

On iOS, is it possible to get the user's audio stream in a decompressed format? For example, an MP3 returned as WAV data that can be used for audio analysis? I'm relatively new to the iOS platform, and I remember reading that this wasn't possible in older iOS versions. I read that iOS 4 brought in some advanced APIs, but I'm not sure where to find documentation/samples for them.
If you don't mind requiring iOS 4.1 and above, you could try using the AVAssetReader class and friends. In this similar question you have a full example of how to extract video frames; I would expect the same approach to work for audio, and the nice thing is that the reader deals with all the details of decompression. You can even use AVComposition to merge several streams.
These classes are part of AVFoundation, which allows not only reading but also creating your own content.
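If you would rather stay at the C level, Extended Audio File Services (mentioned in an answer above) can also hand you decompressed PCM: open the compressed file, set a linear-PCM client format, and every read comes back decoded. A sketch, with the path and format as placeholder assumptions:

    #include <AudioToolbox/AudioToolbox.h>

    CFURLRef url = CFURLCreateWithFileSystemPath(NULL, CFSTR("/path/to/song.mp3"),
                                                 kCFURLPOSIXPathStyle, false);
    ExtAudioFileRef file;
    ExtAudioFileOpenURL(url, &file);

    // Ask for decoded 16-bit stereo PCM regardless of the file's own format
    AudioStreamBasicDescription pcm = {
        .mSampleRate       = 44100.0,
        .mFormatID         = kAudioFormatLinearPCM,
        .mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
        .mBytesPerPacket   = 4,
        .mFramesPerPacket  = 1,
        .mBytesPerFrame    = 4,
        .mChannelsPerFrame = 2,
        .mBitsPerChannel   = 16
    };
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(pcm), &pcm);

    // Pull decoded frames in chunks; numFrames comes back 0 at end of file
    SInt16 buffer[4096 * 2];
    AudioBufferList abl = {
        .mNumberBuffers = 1,
        .mBuffers[0] = { .mNumberChannels = 2,
                         .mDataByteSize   = sizeof(buffer),
                         .mData           = buffer }
    };
    UInt32 numFrames = 4096;
    ExtAudioFileRead(file, &numFrames, &abl);   // analyze buffer here

    ExtAudioFileDispose(file);
    CFRelease(url);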
Apple has an OpenAL example at http://developer.apple.com/library/mac/#samplecode/OpenALExample/Introduction/Intro.html where Scene.m should interest you.
The Apple documentation has a picture where the Core Audio framework clearly shows that it handles MP3 for you. It also states that you can access audio units at a lower level if you need to.
The same Core Audio document also gives some information about using MIDI, if that helps.
Edit:
You're in luck today.
In this example, an audio file is loaded and fed into an AudioUnit graph. You could fairly easily write an AudioUnit of your own to insert into this graph and analyze the PCM stream as you see fit. You could even do it in the render callback, although that's probably not a good idea, because callbacks are encouraged to stay as lightweight as possible.
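If you want to tap the stream without writing a full custom AudioUnit, one option is a render-notify callback on the graph's output unit; this sketch assumes you already have that unit (called ioUnit below) from AUGraphNodeInfo:

    #include <AudioToolbox/AudioToolbox.h>

    // Called before and after each render cycle; post-render, ioData
    // holds the PCM that is about to reach the hardware.
    static OSStatus pcmTap(void *inRefCon,
                           AudioUnitRenderActionFlags *ioActionFlags,
                           const AudioTimeStamp *inTimeStamp,
                           UInt32 inBusNumber,
                           UInt32 inNumberFrames,
                           AudioBufferList *ioData)
    {
        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            // Inspect ioData->mBuffers[i].mData here; keep this cheap and
            // hand the samples to another thread for real analysis.
        }
        return noErr;
    }

    // Somewhere in setup, after AUGraphNodeInfo gave you ioUnit:
    // AudioUnitAddRenderNotify(ioUnit, pcmTap, NULL);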

Play midi file on the iPhone

Is it possible to play .mid files directly via some API, or does one have to convert the MIDI file to e.g. AAC first?
(2 years later…) You can use the MusicPlayer and MusicSequence APIs, available in iOS 5.
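Here is a minimal sketch of those APIs in C; the file path is a placeholder, and a sequence renders through a default synth graph unless you attach your own with MusicSequenceSetAUGraph():

    #include <AudioToolbox/AudioToolbox.h>

    MusicSequence sequence;
    MusicPlayer player;

    // Load the standard MIDI file into a sequence (path is a placeholder)
    NewMusicSequence(&sequence);
    CFURLRef midiURL = CFURLCreateWithFileSystemPath(NULL, CFSTR("/path/to/song.mid"),
                                                     kCFURLPOSIXPathStyle, false);
    MusicSequenceFileLoad(sequence, midiURL, kMusicSequenceFile_MIDIType, 0);
    CFRelease(midiURL);

    // Attach the sequence to a player and start it
    NewMusicPlayer(&player);
    MusicPlayerSetSequence(player, sequence);
    MusicPlayerPreroll(player);
    MusicPlayerStart(player);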
There is no Apple API for this. You could write your own, which I think would depend on what you are hoping it will sound like.
There is plenty of available source code for reading MIDI files, and there are a few open-source synths for the iPhone; or you could use OpenAL to trigger samples. It probably isn't going to sound like GarageBand, though.
If you want it to sound as good as possible you will have to convert it first.

Can iphone mix two sound files or build custom equalizer?

I have studied this problem for weeks, and it seems impossible to use the iPhone SDK to mix two or more sound files or to build a custom equalizer. Does anyone have experience with this?
Yes, you can. AVAudioPlayer can play multiple sounds, and you can control the volume of each. Or you can use Audio Units and have more control over the audio data.
aurioTouch is a good sample app for what you have in mind.
For simple playback of sound files, you can use the AVAudioPlayer class introduced in the 2.2 SDK. It provides playback and volume controls for playing any audio file. As far as I am aware, there is no limit on the number of sound files you can play on the iPhone. The only restriction is that you may only play one AAC- or MP3-compressed file at a time; the rest of the files must be either uncompressed or in the IMA4 format.
If your needs are more low-level (if you need to do DSP), you might want to look at Audio Queue Services or Audio Units, two Mac OS X audio processing APIs that are also available on the iPhone.
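For the Audio Units route, here is a hedged sketch in C of the mixing half: a two-bus MultiChannelMixer feeding the output unit, with per-input gain. The bus count and gain values are arbitrary examples; each bus still needs audio fed to it (e.g. via AUGraphSetNodeInputCallback()), and a real equalizer would additionally need an EQ unit or your own DSP:

    #include <AudioToolbox/AudioToolbox.h>

    AUGraph graph;
    AUNode mixerNode, ioNode;
    AudioUnit mixerUnit;

    AudioComponentDescription mixerDesc = {
        .componentType         = kAudioUnitType_Mixer,
        .componentSubType      = kAudioUnitSubType_MultiChannelMixer,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription ioDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    // Build the graph: mixer -> output
    NewAUGraph(&graph);
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphOpen(graph);
    AUGraphConnectNodeInput(graph, mixerNode, 0, ioNode, 0);
    AUGraphNodeInfo(graph, mixerNode, NULL, &mixerUnit);

    // Give the mixer two input buses, one per sound file
    UInt32 busCount = 2;
    AudioUnitSetProperty(mixerUnit, kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input, 0, &busCount, sizeof(busCount));

    // Per-input volume: mix bus 0 louder than bus 1
    AudioUnitSetParameter(mixerUnit, kMultiChannelMixerParam_Volume,
                          kAudioUnitScope_Input, 0, 0.8f, 0);
    AudioUnitSetParameter(mixerUnit, kMultiChannelMixerParam_Volume,
                          kAudioUnitScope_Input, 1, 0.5f, 0);

    AUGraphInitialize(graph);
    AUGraphStart(graph);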