iPhone headphone output - mono control left / right ear

I'm working on a GPS navigation app (route creation and following) for iOS 6 and above. At the moment I've integrated the OpenEARS framework to provide text-to-speech directions to the user.
I have set up an AVAudioSession and overridden the audio category for headphones. I'm looking for advice on limiting my audio output to the left or right headphone channel, depending on the physical direction the user must travel to reach the destination.
This is my first foray into audio on iOS, and I'm happy to switch from OpenEARS if someone knows how to accomplish this with another TTS library.
Thanks, Ben

To anyone it may help in the future...
After spending hours looking at Core Audio and AudioToolbox methods for balancing, I stumbled upon the pan property of AVAudioPlayer. OpenEARS (and ultimately Flite, its TTS engine) uses AVAudioPlayer to play the converted audio file. Cheers!
The audio player's stereo pan position.
@property float pan
Discussion: By setting this property you can position a sound in the stereo field. A value of -1.0 is full left, 0.0 is center, and 1.0 is full right.
Availability: Available in iOS 4.0 and later.
Declared in: AVAudioPlayer.h
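
For example, a minimal sketch (the method, property, and file URL names are illustrative, not from the original post) of panning the synthesized speech hard left or right depending on the turn direction:

#import <AVFoundation/AVFoundation.h>

// Declared in the interface: the player must be retained (e.g. in a
// property) or it will be deallocated before playback finishes.
@property (nonatomic, strong) AVAudioPlayer *directionPlayer;

- (void)playDirectionAudioAtURL:(NSURL *)fileURL turnLeft:(BOOL)turnLeft {
    NSError *error = nil;
    self.directionPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL
                                                                  error:&error];
    if (!self.directionPlayer) {
        NSLog(@"Failed to create player: %@", error);
        return;
    }
    // -1.0 is full left, 0.0 is center, 1.0 is full right
    self.directionPlayer.pan = turnLeft ? -1.0f : 1.0f;
    [self.directionPlayer prepareToPlay];
    [self.directionPlayer play];
}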

Related

Core Audio (Audio Units) audio session and MPVolumeView

I work on a VOIP app.
I use Core Audio Audio Units for playing and recording audio. I need to be able to manipulate sound volume and output devices. I am trying to use MPVolumeView to set sound volume and choose output devices.
My problem is: when I start using Audio Units (i.e., start playout and capture on the RemoteIO Audio Unit), MPVolumeView seems to no longer control the volume of my session; instead it controls the system-wide sound preferences. At the same time, the hardware buttons do control the volume of sounds played by the Audio Units. Also, once I start using Audio Units, MPVolumeView starts showing a button to change output devices, which it didn't show before.
It seems that MPVolumeView controls the sound volume of some system-wide audio session, but when I start using Audio Units another app-wide (or even Audio-Unit-wide) audio session is created and used to play sound.
So the question is: how do I make MPVolumeView control the sound volume of my Core Audio audio session?
I would appreciate any hints on why this happens. I've spent almost all day googling, and I see that some people have related problems, but none got any answers :(. I can post more details if needed.
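
For reference, a minimal sketch (assumed setup, not from the original post) of the MPVolumeView usage being described:

#import <MediaPlayer/MediaPlayer.h>

// Embed the system volume slider; the route button only appears for
// certain audio routes, which is part of the problem described above.
MPVolumeView *volumeView = [[MPVolumeView alloc] initWithFrame:CGRectMake(20, 100, 280, 40)];
volumeView.showsVolumeSlider = YES;
volumeView.showsRouteButton = YES;
[self.view addSubview:volumeView];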
Confirmed as a bug by an Apple engineer.
In more detail: MPVolumeView should be tied to a specific audio route (in a broader sense, audio route + audio category + mode, etc.), and it is for a couple of the most common routes (e.g. headphones + play category + default mode), but not for every custom route you can create.
So basically, when you create some custom route, MPVolumeView still tries to control its last (workable) or default route.

How to set pan property like AVAudioPlayer in AVPlayer

I am using AVPlayer to play two songs simultaneously. I have just finished everything using AVPlayer. Now my problem is how to direct the sound to the left and right ears in AVPlayer. I want the same behavior as AVAudioPlayer's pan property: when we set pan = -1.0 it routes only through the left ear, and +1.0 routes to the right ear of the headphones. Any help on how to do this?
Please check the documentation of AVAudioPlayer and use its pan property; OpenAL, however, allows pan, gain, location, and balance, along with effects.
Playing Sounds with Positioning Using OpenAL
To provide full-featured audio playback including stereo positioning, level control, and simultaneous sounds, use OpenAL. See "Playing Sounds with Positioning Using OpenAL."
Also check this link:
ObjectAL for iPhone
Hope it helps.
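
As a hedged sketch of the OpenAL positioning approach (the source and buffer setup are assumed to be done elsewhere and are not shown):

#import <OpenAL/al.h>

// Note: OpenAL only spatializes mono sources; stereo buffers
// bypass 3D positioning.
ALuint source; // previously created with alGenSources and attached to a buffer

// Place the listener at the origin; x < 0 is left, x > 0 is right.
alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);
alSource3f(source, AL_POSITION, -1.0f, 0.0f, 0.0f); // hard left
alSourcePlay(source);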

Play mp3 file smoothly upon dragging a scroll using AVToolbox or openAL

I have been facing this for many days now, but I have not reached any conclusion.
My problem is: I want to play an mp3 file, but not simply by clicking a play button.
This is the way I want to play it:
There is a slider that I can drag with my finger. The mp3 should play at the speed with which I am dragging my finger, so that dragging fast gives a fast-forwarding effect (a funny type of voice), while dragging the slider slowly makes the output slow.
The problem is that the sound output is not coming out smooth; it's a very distorted and disturbed voice.
I want the output to be smoother.
Please help. Any suggestions, please? Presently I am using AVAudioPlayer and passing a time value based on the slider input to play the file (it does not seem to be feasible, though).
I feel that this is possible only with OpenAL and no other way, because with OpenAL we can modify the frequency (pitch) of the sound file.
Can someone please refer me to a link to an OpenAL implementation for iPhone? I have never played a sound file using OpenAL.
Help!!
You won't be able to do it with AVAudioPlayer, as it does not support pitch operations.
You can load and decode the entire track into memory for playback with OpenAL (which supports pitch), or you can do realtime loading/decoding and pitch changing using Audio Units (MUCH lower level, and more complicated, though).
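
As a hedged sketch of the OpenAL route (the source setup and mp3 decoding are assumed to be done elsewhere; the property and action names are illustrative):

#import <OpenAL/al.h>

// Hypothetical slider callback. 'self.source' is an ALuint source that was
// previously generated and filled with the decoded mp3 PCM data.
- (IBAction)sliderChanged:(UISlider *)slider {
    // AL_PITCH of 1.0 is normal speed; 2.0 plays twice as fast
    // (the "funny voice" effect), 0.5 plays at half speed.
    alSourcef(self.source, AL_PITCH, slider.value);
}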

Audio/Voice Visualization

Hey you Objective-C bods.
Does anyone know how I would go about changing (transforming) an image based on the input from the microphone on the iPhone?
i.e. When a user speaks into the mic, the image will pulse or skew.
[edit] Anyone have any ideas? I have (what is basically) a voice-recording app. I just want something to change as the voice input is provided. I've seen it in a sample project, but that wasn't with a UIImage. [/edit]
Thanking you!!
Apple put together some great frameworks for this! The AVFoundation framework and Core Audio framework will be the most useful to you.
To get audio level information, AVAudioRecorder is useful. Although it is made for recording, it also provides level data for the microphone. This is useful for deforming your image based on how loud the user is shouting at their phone ;)
Here is the apple documentation for AVAudioRecorder: AVAudioRecorder Class Reference
A bit more detail:
// You will need an AVAudioRecorder object
// (initialized with a file URL and recording settings, not shown)
AVAudioRecorder *myRecorderObject;

// To be able to get level data from the microphone you need
// to enable metering for your recorder object and start recording
myRecorderObject.meteringEnabled = YES;
[myRecorderObject prepareToRecord];
[myRecorderObject record];

// Now you can poll the microphone for level data.
// -updateMeters must be called before each read to refresh the values.
[myRecorderObject updateMeters];
float peakPower = [myRecorderObject peakPowerForChannel:0];
float averagePower = [myRecorderObject averagePowerForChannel:0];
If you want to see a great example of how an AVAudioRecorder object can be used to get levels data, check out this tutorial.
As far as deforming your image goes, that would be up to an image library. There are a lot to choose from, including some great ones from Apple. I am not familiar with them, though, so that might be for someone else to answer.
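
One hedged sketch of tying the two together (the property names are illustrative; the dB-to-linear mapping is one common choice, not from the original answer):

// Called periodically, e.g. from an NSTimer, to pulse an image view.
- (void)updatePulse {
    [self.recorder updateMeters];
    // averagePowerForChannel: returns dBFS (roughly -160 .. 0);
    // convert to a 0..1 linear level.
    float level = powf(10.0f, [self.recorder averagePowerForChannel:0] / 20.0f);
    CGFloat scale = 1.0f + level; // 1.0 at silence, up to ~2.0 at full level
    self.imageView.transform = CGAffineTransformMakeScale(scale, scale);
}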
Best of luck!
You may also try the gl-data-visualization-view extensible framework to visualize your sound levels.

iPhone MPMusicPlayerController playback starting position

I'm using the Media Player framework to access the user's music library on iPhone. I would like to set the playback starting position so that I can start playing a song from the 30-second mark, for example.
I'm having trouble finding out how to do this. MPMusicPlayerController only offers beginSeekingForward, but that's not quite what I'm looking for, as it simply accelerates the playback speed.
There is probably something really simple that I'm missing.
MPMusicPlayerController's currentPlaybackTime property is writable, so the playback starting point can be set with player.currentPlaybackTime = 30.0.
Set the time via player.currentPlaybackTime before you start playing, and playback will start at your desired point.
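
For example, a minimal sketch (the queue setup is assumed; 'collection' is an MPMediaItemCollection built from a library query elsewhere):

#import <MediaPlayer/MediaPlayer.h>

MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
[player setQueueWithItemCollection:collection]; // assumed to be set up elsewhere
player.currentPlaybackTime = 30.0; // start at the 30-second mark
[player play];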
UPDATE
2009 me had some real problems. He didn't really understand properties and missed the fact that MPMusicPlayerController.currentPlaybackTime is writable! And he was angry. Angry because iOS 3.0 had promised iPod Library "Access" and instead delivered MPMusicPlayerController. He had been hoping for speedy access to the music packet data, upon which he would have built many fascinating and magical audio applications. Luckily, iOS 4.1's AVAssetReader came along a year later and he was finally able to stop hating.
WRONG 2009 ANSWER
Nope, this API is deliberately crippled, which is why you don't see any functions for opening, or streaming from, the media file.
Your only hope is lowering the volume and calling beginSeekingForward until currentPlaybackTime returns >= 30s.
Enjoy!