AVAudioPlayer sound files playing out of sync when mixed - iPhone

I am trying to play six separate .mp3 sound files using instances of AVAudioPlayer.
The sounds play at the same time, but they seem to drift out of sync or play at
slightly different speeds. Does anyone know why this might be?
Here is how I initialize a sound:
NSURL *musicurl = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"SoundLoop" ofType:@"mp3"]];
music = [[AVAudioPlayer alloc] initWithContentsOfURL:musicurl error:nil];
[musicurl release];
music.numberOfLoops = -1; // loop indefinitely
music.currentTime = 0;    // start at the beginning
music.volume = 1.0;
music.meteringEnabled = YES;
And here is my audio session code:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
audioSession.delegate = self;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
[[AVAudioSession sharedInstance] setPreferredHardwareSampleRate:44100 error:nil];
[[AVAudioSession sharedInstance] setPreferredIOBufferDuration:30 error:nil];
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
Float32 hardvol = 1.0;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
AudioSessionSetProperty(kAudioSessionProperty_CurrentHardwareOutputVolume, sizeof(hardvol), &hardvol);
UInt32 doSetProperty = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);
[[AVAudioSession sharedInstance] setActive:YES error: nil];
Could it possibly have anything to do with the bit rates of the sounds, or with the fact that I am using .mp3?
Thanks.

I found the solution to this problem by using the playAtTime: method of the AVAudioPlayer class:
NSTimeInterval shortStartDelay = 0.5; // seconds
NSTimeInterval now = player.deviceCurrentTime;
[player playAtTime:now + shortStartDelay]; //these players are instances of AVAudioPlayer
[player2 playAtTime:now + shortStartDelay];
[player3 playAtTime:now + shortStartDelay];
Using playAtTime: with a shared reference time lets all the sounds start at exactly the same moment and stay in sync.
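One detail worth adding (an assumption of mine, not something the original answer states): calling prepareToPlay on each player beforehand preloads its buffers, so no player misses the shared start time because it is still loading.
// Hedged sketch: prime each player before scheduling the shared start time,
// so none of them is still buffering when the deadline arrives.
for (AVAudioPlayer *p in @[player, player2, player3]) {
    [p prepareToPlay];
}
// ...then schedule playAtTime: for all of them as shown above.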

You should check out the Multimedia Programming Guide. The "Using Audio" section has tons of helpful information.
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/MultimediaPG/Introduction/Introduction.html
This sounds like it relates to your issue:
When using hardware-assisted decoding, the device can play only a single instance of one of the supported formats at a time. For example, if you are playing a stereo MP3 sound using the hardware codec, a second simultaneous MP3 sound will use software decoding. Similarly, you cannot simultaneously play an AAC and an ALAC sound using hardware. If the iPod application is playing an AAC or MP3 sound in the background, it has claimed the hardware codec; your application then plays AAC, ALAC, and MP3 audio using software decoding.
To play multiple sounds with best performance, or to efficiently play sounds while the iPod is playing in the background, use linear PCM (uncompressed) or IMA4 (compressed) audio.
Here's another passage claiming that what you are doing should be possible, though Apple adds a caveat with the "most processor-efficient multiple playback" line. I would think that if the processor is strained, that could keep playback from staying perfectly in time.
Starting in iOS 3.0, nearly all supported audio formats can be used for simultaneous playback—namely, all those that can be played using software decoding, as described in Table 1-1. For the most processor-efficient multiple playback, use linear PCM (uncompressed) or IMA4 (compressed) audio.
In terms of debugging, you should start with two sounds and see if there is an issue, then work your way up to the six and figure out whether there is a distinct point at which the problem starts to occur. I would also find (or make) some audio tracks in the formats that Apple recommends (PCM or IMA4) and do the same testing with those. Doing these two things should help you narrow down what the actual problem is.
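As an aside on producing those test files: on a Mac, the bundled afconvert tool can turn an mp3 into an IMA4-encoded .caf. A minimal load-and-check sketch for such a file might look like the following (the file name SoundLoop.caf is hypothetical, and the error handling is the point here):
// Hedged sketch: load a hypothetical IMA4-encoded "SoundLoop.caf" with error
// checking, so codec or path problems are logged instead of failing silently.
NSError *loadError = nil;
NSURL *cafURL = [[NSBundle mainBundle] URLForResource:@"SoundLoop" withExtension:@"caf"];
AVAudioPlayer *testPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:cafURL error:&loadError];
if (testPlayer == nil) {
    NSLog(@"Failed to load test loop: %@", loadError);
}
[testPlayer prepareToPlay];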

Related

AVPlayer not finishing the stream track

I am using AVPlayer to play a long (8-minute) streamed mp3 track. Short tracks (1 to 3 minutes) play perfectly, but with the longer tracks the music starts playing and then, after some random number of minutes (between 2:00 and 3:20), the player restarts the track from the beginning. Although the player restarts the music, the status information (duration and current time) keeps counting normally without restarting; only the audio starts over. Does anyone have an idea?
The file I am playing is this: http://storage-new.newjamendo.com/download/track/57864/mp31/
The player code:
AVAudioSession *mySession = [AVAudioSession sharedInstance];

// Assign the Playback category to the audio session.
NSError *audioSessionError = nil;
[mySession setCategory:AVAudioSessionCategoryPlayback error:&audioSessionError];
if (audioSessionError != nil) {
    NSLog(@"Error setting audio session category.");
    return;
}

// Activate the audio session.
[mySession setActive:YES error:&audioSessionError];
if (audioSessionError != nil) {
    NSLog(@"Error activating audio session during initial setup.");
    return;
}
player = [AVPlayer playerWithURL:[NSURL URLWithString:url]];
[player play];
And here is how I track the information about the current time, which keeps counting normally:
AVPlayerItem *item = player.currentItem;
CMTime duration = [item duration];
CMTime currentTime = [item currentTime];
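As a side note on the tracking itself, AVPlayer can also deliver the current time on a schedule instead of being polled; a minimal sketch (the half-second interval is an arbitrary choice for illustration):
// Hedged sketch: have AVPlayer report playback time periodically.
[player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC)
                                     queue:dispatch_get_main_queue()
                                usingBlock:^(CMTime time) {
    NSLog(@"Playback position: %.1f s", CMTimeGetSeconds(time));
}];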
It seems the problem was that the mp3 file I was streaming was MPEG-1, and the Apple documentation says I have to use MPEG-2, so I have just changed to the correct version and the error is not happening any more.
I have managed to fix this problem; it would appear (for me) that the server was sending the file as one block.
I don't know the ins and outs, but the stream is now provided as a "ranged" stream (it is served in segments, I am told). Not only did this solve the problem, but things like the line below started working for me (as opposed to returning NaN):
float duration = CMTimeGetSeconds(self.avQueuePlayer.currentItem.duration);
I hope this helps; I have not been told much about what was changed on the server. Strangely, my URLs worked on just about all other devices, but I got this problem both when playing from my app AND from Safari.
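If you suspect the same server-side issue, one way to probe it (a sketch of my own, reusing the url string from the question and using the old synchronous NSURLConnection API for brevity; avoid doing this on the main thread) is to send a byte-range request and see whether the server answers with HTTP 206 Partial Content:
// Hedged sketch: a range-capable server answers this partial request with 206.
NSMutableURLRequest *probe = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:url]];
[probe setValue:@"bytes=0-1" forHTTPHeaderField:@"Range"];
NSHTTPURLResponse *response = nil;
[NSURLConnection sendSynchronousRequest:probe returningResponse:&response error:NULL];
NSLog(@"Status: %ld (206 means byte ranges are supported)", (long)response.statusCode);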

How to play, record, and edit audio on the iPhone

I need to develop an iOS app that records, plays, and edits audio. I have done the recording and playback using AVAudioRecorder and AVAudioPlayer.
But I want to know how to edit the audio and save it. Is there any way to do this? Kindly help me.
Recording:
AVAudioRecorder *audioRecorder = [[AVAudioRecorder alloc]
initWithURL:audioFileURL
settings:audioSettings
error:nil];
[audioRecorder record];
Playing:
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL error:nil];
[audioPlayer play];
Here I have included sample code for recording and playing the audio, but how can I edit the audio like in this YouTube video: https://www.youtube.com/watch?v=LWz2x9dXpsM
Audio recording can be done using Apple's AVFoundation framework, but editing audio is a somewhat difficult process, so I suggest doing it with an open-source library project.
You can use The Amazing Audio Engine, which provides good-quality audio recording, filtering, and mixing. The Amazing Audio Engine is built non-ARC and is very lightweight and thread safe.
You can also download a sample of The Amazing Audio Engine from GitHub.
Check this link; the example shows how to play and record audio using AVAudioPlayer from the AVFoundation framework:
http://www.appcoda.com/ios-avfoundation-framework-tutorial/
You can use Extended Audio File Services. Have a look at the reference for ExtAudioFileRead and ExtAudioFileWrite (they have sample code): you can open one audio file, read it, trim it, and then write out the new one, as in the sketch below. See the
Extended Audio File Services Reference
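For the trimming step, here is a minimal sketch (my own illustration, not Apple's sample code) of copying a frame range from one file to another with Extended Audio File Services. It assumes the source is linear PCM (for example a .caf recorded by AVAudioRecorder); a compressed source would also need a client data format, and real code should check every OSStatus:
#include <AudioToolbox/ExtendedAudioFile.h>

void TrimPCMFile(CFURLRef srcURL, CFURLRef dstURL,
                 SInt64 startFrame, SInt64 framesToCopy) {
    ExtAudioFileRef src = NULL, dst = NULL;
    ExtAudioFileOpenURL(srcURL, &src);

    // Read the source format and create the destination file with the same one.
    AudioStreamBasicDescription fmt = {0};
    UInt32 propSize = sizeof(fmt);
    ExtAudioFileGetProperty(src, kExtAudioFileProperty_FileDataFormat, &propSize, &fmt);
    ExtAudioFileCreateWithURL(dstURL, kAudioFileCAFType, &fmt, NULL,
                              kAudioFileFlags_EraseFile, &dst);

    // Jump to the first frame of the range to keep.
    ExtAudioFileSeek(src, startFrame);

    char buffer[32 * 1024];
    while (framesToCopy > 0) {
        AudioBufferList bufList;
        bufList.mNumberBuffers = 1;
        bufList.mBuffers[0].mNumberChannels = fmt.mChannelsPerFrame;
        bufList.mBuffers[0].mDataByteSize = sizeof(buffer);
        bufList.mBuffers[0].mData = buffer;

        UInt32 maxFrames = (UInt32)(sizeof(buffer) / fmt.mBytesPerFrame);
        UInt32 frames = framesToCopy < maxFrames ? (UInt32)framesToCopy : maxFrames;
        ExtAudioFileRead(src, &frames, &bufList);   // frames is updated to what was read
        if (frames == 0) break;                     // reached end of file
        ExtAudioFileWrite(dst, frames, &bufList);
        framesToCopy -= frames;
    }
    ExtAudioFileDispose(src);
    ExtAudioFileDispose(dst);
}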

Has anyone been able to play a video file and show live camera feed at the same time in separate views on iOS?

I have been trying to do this for a few days now, using AVFoundation as well as MPMoviePlayerViewController. The closest I can get is allowing one to play at a time. I would like to think this is possible because of FaceTime, though I know that case is a little different because there is no separate video file.
Any ideas would help, and thanks.
I'm not sure where this is documented, but to get AVCaptureVideoPreviewLayer and MPMoviePlayerViewController to play together at the same time you need to set a mixable audio session category first.
Here's one way to do that:
AVAudioSession* session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:nil];
UInt32 mixable = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(mixable), &mixable);
[session setActive:YES error:nil];
See the Audio Session Programming Guide and Audio Session Cookbook for more info.
Have you tried playing the video on one thread and recording video on another? That would allow both to run while maintaining their separation.

AVAudioRecorder & AVAudioPlayer - Sound output on internal speaker, how to change?

I have a problem with AVAudioRecorder and AVAudioPlayer.
When I use the player and the recorder at the same time (e.g. to play sound while recording), the sound comes out of the quiet internal speaker. I searched Stack Overflow and all I found was this code:
UInt32 *audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,sizeof (audioRouteOverride),&audioRouteOverride);
But this doesn't help me :(
When I copy-paste it, I get errors.
What can I do to record and play through the loud speaker at the bottom?
I don't use anything like SCListener or the like...
Thanks in advance,
Max
This is a bit old, but this post helped me and I wanted to update it for anyone else who might need it in the future. The code posted at the top is correct - it will take the quiet audio that's being played through the phone speaker and route it to the loudspeaker at the bottom. There is a minor typo in the code, which is why it's giving errors. Here is the correct snippet which will solve this issue:
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,sizeof (audioRouteOverride),&audioRouteOverride);
Make sure you also activate the audio session right after setting this, before creating your audio player/recorder:
[[AVAudioSession sharedInstance] setActive:YES error:nil];
Last, if you're going to be playing and recording at the same time you'll probably need to set the category and mixing functions too. Here's the entire snippet which will set the category, enable mixing, route the audio to the main speaker, and activate the session. You'll want to do this only once right after the app launches.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
OSStatus propertySetError = 0;
UInt32 allowMixing = true;
propertySetError = AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(allowMixing), &allowMixing);
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,sizeof (audioRouteOverride),&audioRouteOverride);
NSLog(#"Mixing: %x", propertySetError); // This should be 0 or there was an issue somewhere
[[AVAudioSession sharedInstance] setActive:YES error:nil];
Hope that helps someone!
I answered it here already: How to get AVAudioPlayer output to the speaker
In short, use this before recording:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
...and use this before playback (through the loud speaker, or through headphones if they are plugged in):
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil];
The only thing I have found about this topic is this:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
which must be set when you record your audio if you want to play back at the same time. Give that a try and let me know.
P.S. Make sure you add the AudioToolbox and AVFoundation frameworks to your project and import their headers in your .m files.
If you're currently playing through the quiet speaker and want to play through the loud speaker at the bottom of the iPhone, use this code:
UInt32 doChangeDefaultRoute = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker,
                        sizeof(doChangeDefaultRoute),
                        &doChangeDefaultRoute);
This is an old question, but none of the other answers helped me. However, I found a solution, which I am posting for future reference in case someone needs it.
The solution is described in the following blog post: iOS: Force audio output to speakers while headphones are plugged in.
You need to create a new Objective-C class, AudioRouter, in your project. Then import AudioRouter.h into the header file of the class where you initiate audio functionality. Now, in the corresponding .m file, add the following lines within the viewDidLoad method:
AudioRouter *foobar = [[AudioRouter alloc] init];
[foobar initAudioSessionRouting];
[foobar forceOutputToBuiltInSpeakers];
Now you have audio output (e.g. AVAudioPlayer) forced to the loudspeaker!
AudioSessionSetProperty has been deprecated since iOS 7, but the following worked for me:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *error = nil;
[audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
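For completeness, a sketch (my own, assuming you want the same behavior as the older snippets above) of the whole routine done purely through AVAudioSession on iOS 7 and later: category with mixing and speaker options, then activation:
// Hedged sketch: post-iOS 7 equivalent of the AudioSessionSetProperty routine.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord
         withOptions:AVAudioSessionCategoryOptionMixWithOthers |
                     AVAudioSessionCategoryOptionDefaultToSpeaker
               error:&sessionError];
[session setActive:YES error:&sessionError];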

Sound on simulator but not device

I'm using the following to play an m4a file:
NSString *path = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent: fileName];
SystemSoundID soundID;
NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO];
AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID);
AudioServicesPlaySystemSound(soundID);
It works fine on the simulator, but I hear nothing on the device. The sound files I'm using are all in the bundle. Here is what filePath looks like on the device:
file://localhost/var/mobile/Applications/418945F3-3711-4B4D-BC65-0D78993C77FB/African%20Adventure.app/Switch%201.m4a
Is there an issue with the file path, or anything different I need to do for the device?
Just as a side note: I was having the exact same problem and spent close to an hour converting files to the correct format, etc. Yet the problem was the mute switch on the iPad. Even though the volume was up and I could hear other sounds on the iPad, the mute switch meant it wasn't playing system sounds.
To add to the confusion, this app uses text-to-speech, and the volume of the dictation was perfectly fine; it was only the sounds coming from AudioServicesPlaySystemSound() that weren't being played.
I had trouble with this too. Finally I realised it was because AudioServices can only play audio within the following constraints.
Sound files that you play using this function must be:
- No longer than 30 seconds in duration
- In linear PCM or IMA4 (IMA/ADPCM) format
- Packaged in a .caf, .aif, or .wav file
From Apple docs: http://developer.apple.com/library/ios/#documentation/AudioToolbox/Reference/SystemSoundServicesReference/Reference/reference.html
You might want to use AVAudioPlayer instead of Audio Services.
The following code takes an audio file (.m4a) and plays it once. Don't forget to release audioPlayer when you're done with it.
NSString *urlAddress = [[NSBundle mainBundle] pathForResource:@"filename" ofType:@"m4a"];
NSURL *url = [NSURL fileURLWithPath:urlAddress];
NSError *error;
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
audioPlayer.numberOfLoops = 0;
if (audioPlayer == nil)
{
    NSLog(@"%@", [error description]);
}
else
{
    [audioPlayer play];
}
Hope this example helps you play audio on the actual device. It might also be a good idea to raise the device volume while the file is playing.
Note: You will need to add the AVFoundation framework to your project if you have not already done so. As well as import the header file.
#import <AVFoundation/AVFoundation.h>
Update:
From Apple's Core Audio Overview Document
Audio Session Services
Audio Session Services lets you manage audio sessions in your application—coordinating the audio behavior in your application with background applications on an iPhone or iPod touch. Audio Session Services consists of a subset of the functions, data types, and constants declared in the AudioServices.h header file in AudioToolbox.framework.
The AVAudioPlayer Class
The AVAudioPlayer class provides a simple Objective-C interface for playing sounds. If your application does not require stereo positioning or precise synchronization, and if you are not playing audio captured from a network stream, Apple recommends that you use this class for playback. This class is declared in the AVAudioPlayer.h header file in AVFoundation.framework.
Start by error-checking your returns. Is filePath nil? Do either of the AudioServices functions return an error? The most likely cause is case sensitivity: the iPhone filesystem is case-sensitive while the Mac's is not. But the first step in debugging is to look at the errors the system provides, as in the sketch below.
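A minimal sketch of that check (my own illustration, using the same variables as the question):
// Hedged sketch: inspect the OSStatus results instead of discarding them,
// so device-only failures (bad path, unsupported codec) show up in the log.
SystemSoundID soundID = 0;
OSStatus status = AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID);
if (status != kAudioServicesNoError) {
    NSLog(@"AudioServicesCreateSystemSoundID failed: %d", (int)status);
} else {
    AudioServicesPlaySystemSound(soundID);
}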
The simulator uses regular QuickTime for playback, so it's easy to have media assets that work in the simulator but fail on the device due to missing or unsupported codecs. The test is whether you can play the file at all on the device, e.g. through Safari or the iPod app.