OpenAL streaming & interruptions - iPhone

I made an iPhone application which uses OpenAL to play many sounds.
These sounds are MP3s, quite heavy (more than 1 minute each), and I stream them (2 buffers per sound) in order to use less memory.
To manage interruptions, I use this code:
In my OpenALSupport.c file:
// used to disable OpenAL during a call
void openALInterruptionListener(void *inClientData, UInt32 inInterruptionState)
{
    if (inInterruptionState == kAudioSessionBeginInterruption)
    {
        alcMakeContextCurrent(NULL);
    }
}
// used to restore OpenAL after a call
void restoreOpenAL(void *a_context)
{
    alcMakeContextCurrent(a_context);
}
In my SoundManager.m file:
- (void) restoreOpenAL
{
    restoreOpenAL(mContext);
}
// OpenAL initialization
- (bool) initOpenAL
{
    // Initialization
    mDevice = alcOpenDevice(NULL);
    if (mDevice) {
        ...
        // use the device to make a context
        mContext = alcCreateContext(mDevice, NULL);
        // set my context to the currently active one
        alcMakeContextCurrent(mContext);
        AudioSessionInitialize(NULL, NULL, openALInterruptionListener, mContext);
        NSError *activationError = nil;
        [[AVAudioSession sharedInstance] setActive:YES error:&activationError];
        NSError *setCategoryError = nil;
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&setCategoryError];
        ...
}
And finally in my AppDelegate:
- (void)applicationDidBecomeActive:(UIApplication *)application
{
    [[CSoundManager getInstance] restoreOpenAL];
    ...
}
With this method the sounds come back after a call, but the streams seem to play back randomly.
Is there a specific way to manage interruptions with streaming sounds? I can't find any article about that.
Thanks for your help.

OK, I'll answer my own question.
I solved the problem by handling errors in my streaming method:
- (void) updateStream
{
    ALint processed;
    alGetSourcei(sourceID, AL_BUFFERS_PROCESSED, &processed);
    while (processed--)
    {
        oldPosition = position;
        ALuint buffer; // must be an OpenAL type (ALuint), not NSUInteger
        alSourceUnqueueBuffers(sourceID, 1, &buffer);
        ////////////////////
        // code freshly added
        ALenum err = alGetError();
        if (err != AL_NO_ERROR)
        {
            NSLog(@"Error calling alSourceUnqueueBuffers: %d", err);
            processed++;
            // restore old position for the next buffer
            position = oldPosition;
            usleep(10000);
            continue;
        }
        ////////////////////
        [self stream:buffer];
        alSourceQueueBuffers(sourceID, 1, &buffer);
        ////////////////////
        // code freshly added
        err = alGetError();
        if (err != AL_NO_ERROR)
        {
            NSLog(@"Error calling alSourceQueueBuffers: %d", err);
            processed++;
            usleep(10000);
            // restore old position for the next buffer
            position = oldPosition;
        }
        ///////////////////
    }
}
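One more note I'll add on top of that (my own addition; resumeAfterInterruption is an assumed method name and sourceID is the same ivar used above): after the context is restored, a streaming source can be left in a stopped state, so it may need an explicit restart:
- (void) resumeAfterInterruption
{
    ALint state;
    alGetSourcei(sourceID, AL_SOURCE_STATE, &state);
    // after alcMakeContextCurrent(NULL)/restore, the source may no longer be playing
    if (state != AL_PLAYING)
    {
        alSourcePlay(sourceID);
    }
}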

Related

GKVoiceChat not connecting nor sending voice data on GameCenter Match

I'm trying to set up voice chat in a game I'm developing for iOS. It successfully creates the match, but when I try to set up the voice chat, nothing happens, and it runs without throwing errors. What am I doing wrong? Here's the code I'm using to create the voice chat.
- (void)establishVoice
{
    if (![GKVoiceChat isVoIPAllowed])
        return;
    if (![self establishPlayAndRecordAudioSession])
        return;
    NSLog(@"Did stablish voice chat");
    chat = [match voiceChatWithName:@"GeneralChat"];
    [chat start]; // stop with [chat end];
    chat.active = YES; // disable mic by setting to NO
    chat.volume = 1.0f; // adjust as needed.
    chat.playerStateUpdateHandler = ^(NSString *playerID, GKVoiceChatPlayerState state) {
        switch (state)
        {
            case GKVoiceChatPlayerSpeaking:
                // Highlight player's picture
                NSLog(@"Speaking");
                break;
            case GKVoiceChatPlayerSilent:
                // Dim player's picture
                NSLog(@"Silent");
                break;
            case GKVoiceChatPlayerConnected:
                // Show player name/picture
                NSLog(@"Voice connected");
                break;
            case GKVoiceChatPlayerDisconnected:
                // Hide player name/picture
                NSLog(@"Voice disconnected");
                break;
        }
    };
}
Where establishPlayAndRecordAudioSession is:
- (BOOL)establishPlayAndRecordAudioSession
{
    NSLog(@"Establishing Audio Session");
    NSError *error;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    BOOL success = [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    if (!success)
    {
        NSLog(@"Error setting session category: %@", error.localizedFailureReason);
        return NO;
    }
    else
    {
        success = [audioSession setActive:YES error:&error];
        if (success)
        {
            NSLog(@"Audio session is active (play and record)");
            return YES;
        }
        else
        {
            NSLog(@"Error activating audio session: %@", error.localizedFailureReason);
            return NO;
        }
    }
}
The code successfully logs "Did stablish voice chat", so it does run, but when I start talking, it doesn't seem to capture or send any voice. What am I doing wrong? Am I missing something? P.S. GKVoiceChatPlayerConnected never fires either.

iOS Game Center Programmatic Matchmaking

I'm trying to implement a real-time multiplayer game with a custom UI (no GKMatchmakerViewController). I'm using startBrowsingForNearbyPlayersWithReachableHandler:
^(NSString *playerID, BOOL reachable) to find a local player, and then initiating a match request with the GKMatchmaker singleton (which I have already initialized).
Here's where I'm having trouble. When I send a request, the completion handler fires almost immediately, without an error, and the match it returns has an expected player count of zero. Meanwhile, the other player definitely has not responded to the request.
Relevant code:
- (void)findMatch
{
    GKMatchRequest *request = [[GKMatchRequest alloc] init];
    request.minPlayers = NUM_PLAYERS_PER_MATCH; // 2
    request.maxPlayers = NUM_PLAYERS_PER_MATCH; // 2
    if (nil != self.playersToInvite)
    {
        // we always successfully get in this if-statement
        request.playersToInvite = self.playersToInvite;
        request.inviteeResponseHandler = ^(NSString *playerID, GKInviteeResponse response)
        {
            [self.delegate updateUIForPlayer:playerID accepted:(response == GKInviteeResponseAccepted)];
        };
    }
    request.inviteMessage = @"Let's Play!";
    [self.matchmaker findMatchForRequest:request
                   withCompletionHandler:^(GKMatch *match, NSError *error) {
        if (error) {
            // Print the error
            NSLog(@"%@", error.localizedDescription);
        }
        else if (match != nil)
        {
            self.currentMatch = match;
            self.currentMatch.delegate = self;
            // All players are connected
            if (match.expectedPlayerCount == 0)
            {
                // start match
                [self startMatch];
            }
            [self stopLookingForPlayers];
        }
    }];
}
Figured it out! I needed to call - (void)matchForInvite:(GKInvite *)invite completionHandler:(void (^)(GKMatch *match, NSError *error))completionHandler in my invitation handler so that both players have the same match data.
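For reference, here's a minimal sketch of that wiring (my addition, not from the original post; the inviteHandler block and the currentMatch property are assumptions):
// Assumed invitation handler: accept the invite with matchForInvite: so the
// invitee ends up with the same GKMatch instance as the inviter.
[GKMatchmaker sharedMatchmaker].inviteHandler = ^(GKInvite *invite, NSArray *playersToInvite) {
    if (invite != nil) {
        [[GKMatchmaker sharedMatchmaker] matchForInvite:invite
                                      completionHandler:^(GKMatch *match, NSError *error) {
            if (match != nil) {
                self.currentMatch = match;
                self.currentMatch.delegate = self;
            }
        }];
    }
};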

How to read/update local SQLite data every 30 minutes while the application is suspended / in background mode?

How can I read/update local SQLite data every second while the application is suspended or in background mode?
Below is the code I am trying to use.
Right now I just want to NSLog all the IDs present in the local SQLite table every second, but I am unable to do that.
Can anyone tell me what I am doing wrong, or correct the following code?
Thanks in advance.
- (void)applicationDidEnterBackground:(UIApplication *)app
{
    if ([[UIDevice currentDevice] respondsToSelector:@selector(isMultitaskingSupported)]) { // Check if our iOS version supports multitasking, i.e. iOS 4
        if ([[UIDevice currentDevice] isMultitaskingSupported]) { // Check if device supports multitasking
            UIApplication *application = [UIApplication sharedApplication]; // Get the shared application instance
            __block UIBackgroundTaskIdentifier background_task; // Create a task object
            background_task = [application beginBackgroundTaskWithExpirationHandler:^{
                [application endBackgroundTask:background_task]; // Tell the system that we are done with the tasks
                background_task = UIBackgroundTaskInvalid; // Set the task to be invalid
                // System will be shutting down the app at any point in time now
            }];
            // Background tasks require you to use asynchronous tasks
            dispatch_queue_t backgroundQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
            dispatch_source_t timerSource = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, backgroundQueue);
            dispatch_source_set_timer(timerSource, dispatch_time(DISPATCH_TIME_NOW, 0), 1 * NSEC_PER_SEC, 0 * NSEC_PER_SEC);
            dispatch_source_set_event_handler(timerSource, ^{
                [self BackGroundTaskFunction];
            });
            dispatch_resume(timerSource);
        }
    }
}
- (void)BackGroundTaskFunction
{
    [self GetUserName];
    if (recordArray.count > 0)
    {
        for (int i = 0; i < recordArray.count; i++)
        {
            DB_data *db = (DB_data *)[recordArray objectAtIndex:i];
            NSLog(@"%d", db.Id);
        }
    }
}
- (void)GetUserName {
    self.recordArray = [[NSMutableArray alloc] init];
    {
        const char *sql = "select * from UserName";
        sqlite3_stmt *selectStatement;
        int returnValue = sqlite3_prepare_v2(database, sql, -1, &selectStatement, NULL);
        if (returnValue == SQLITE_OK)
        {
            while (sqlite3_step(selectStatement) == SQLITE_ROW)
            {
                [recordArray addObject:[[DB_data alloc] initwithDataSet:sqlite3_column_int(selectStatement, 0) UserName:[NSString stringWithUTF8String:(char *)sqlite3_column_text(selectStatement, 1)]]];
            }
            sqlite3_finalize(selectStatement); // release the prepared statement so it doesn't leak
        }
        else
        {
            printf("could not prepare statement: %s\n", sqlite3_errmsg(database));
        }
    }
}
No, you cannot do that: only apps that play audio, track the user's location, or act as VoIP clients are allowed to keep running in the background.
If your app does not fall into one of those categories, then it cannot run in the background.
The code you posted lets you run a lengthy process for a maximum of about 10 minutes.
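As a quick sanity check (my addition, not part of the original answer), you can log how much of that window is left from inside the background task:
// backgroundTimeRemaining reports how long the app may keep running in the
// background: roughly 10 minutes here, reduced to about 3 minutes in iOS 7.
NSTimeInterval remaining = [[UIApplication sharedApplication] backgroundTimeRemaining];
NSLog(@"Background time remaining: %.0f seconds", remaining);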
I solved my problem by using an NSRunLoop, which helped me keep the application running in the background.

Is there an event for when the headphones are unplugged?

During a test, a client noticed that video playback on the iPhone pauses when the headphones are unplugged. He wanted similar functionality for audio playback, and maybe the ability to pop up a message.
Does anyone know if there's an event of some kind I could hook into to make this possible?
See Responding to Route Changes from the Audio Session Programming Guide.
This changed with iOS 7: you just need to listen for the notification named AVAudioSessionRouteChangeNotification:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(audioRouteChanged:) name:AVAudioSessionRouteChangeNotification object:nil];
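In the handler (a minimal sketch I'm adding; audioRouteChanged: is just the selector name assumed in the line above), check the route-change reason to detect an unplug:
- (void)audioRouteChanged:(NSNotification *)notification
{
    // Unplugging headphones is reported as "old device unavailable"
    NSNumber *reason = notification.userInfo[AVAudioSessionRouteChangeReasonKey];
    if (reason.unsignedIntegerValue == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        NSLog(@"Headphones were unplugged");
        // pause playback and/or show a message here
    }
}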
Swift 3.0 version of @snakeoil's solution:
NotificationCenter.default.addObserver(self, selector: #selector(YourViewController.yourMethodThatShouldBeCalledOnChange), name: NSNotification.Name.AVAudioSessionRouteChange, object: nil)
Here's the full implementation I eventually used for sending events when the headphones are plugged in (and unplugged).
There was a fair amount of complexity I needed to deal with to ensure things still worked after the app returned from the background.
CVAudioSession.h file
#import <Foundation/Foundation.h>
#define kCVAudioInputChangedNotification @"kCVAudioInputChangedNotification"
#define kCVAudioInterruptionEnded @"kCVAudioInterruptionEnded"
@interface CVAudioSession : NSObject
+ (void)setup;
+ (void)destroy;
+ (NSString *)currentAudioRoute;
+ (BOOL)interrupted;
@end
CVAudioSession.m file
#import "CVAudioSession.h"
#import <AudioToolbox/AudioToolbox.h>
@implementation CVAudioSession
static BOOL _isInterrupted = NO;
+ (void)setup {
    NSLog(@"CVAudioSession setup");
    // Set up the audio session for recording
    OSStatus error = AudioSessionInitialize(NULL, NULL, interruptionListener, (__bridge void *)self);
    if (error) NSLog(@"ERROR INITIALIZING AUDIO SESSION! %ld\n", error);
    if (!error) {
        UInt32 category = kAudioSessionCategory_RecordAudio; // NOTE: CAN'T PLAY BACK WITH THIS
        error = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(category), &category);
        if (error) NSLog(@"couldn't set audio category!");
        error = AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, propListener, (__bridge void *)self);
        if (error) NSLog(@"ERROR ADDING AUDIO SESSION PROP LISTENER! %ld\n", error);
        UInt32 inputAvailable = 0;
        UInt32 size = sizeof(inputAvailable);
        // we do not want to allow recording if input is not available
        error = AudioSessionGetProperty(kAudioSessionProperty_AudioInputAvailable, &size, &inputAvailable);
        if (error) NSLog(@"ERROR GETTING INPUT AVAILABILITY! %ld\n", error);
        // we also need to listen to see if input availability changes
        error = AudioSessionAddPropertyListener(kAudioSessionProperty_AudioInputAvailable, propListener, (__bridge void *)self);
        if (error) NSLog(@"ERROR ADDING AUDIO SESSION PROP LISTENER! %ld\n", error);
        error = AudioSessionSetActive(true);
        if (error) NSLog(@"CVAudioSession: AudioSessionSetActive(true) failed");
    }
}
+ (NSString *)currentAudioRoute {
    UInt32 routeSize = sizeof(CFStringRef);
    CFStringRef route;
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &routeSize, &route);
    // the caller owns the returned CFStringRef, so transfer ownership to ARC to avoid a leak
    NSString *routeStr = (__bridge_transfer NSString *)route;
    return routeStr;
}
+ (void)destroy {
    NSLog(@"CVAudioSession destroy");
    // Very important - remove the listeners, or we'll crash when audio routes etc. change when we're no longer on screen
    OSStatus stat = AudioSessionRemovePropertyListenerWithUserData(kAudioSessionProperty_AudioRouteChange, propListener, (__bridge void *)self);
    NSLog(@".. AudioSessionRemovePropertyListener kAudioSessionProperty_AudioRouteChange returned %ld", stat);
    stat = AudioSessionRemovePropertyListenerWithUserData(kAudioSessionProperty_AudioInputAvailable, propListener, (__bridge void *)self);
    NSLog(@".. AudioSessionRemovePropertyListener kAudioSessionProperty_AudioInputAvailable returned %ld", stat);
    AudioSessionSetActive(false); // disable audio session.
    NSLog(@"AudioSession is now inactive");
}
+ (BOOL)interrupted {
    return _isInterrupted;
}
// Called when audio is interrupted for whatever reason. NOTE: doesn't always call the END one..
void interruptionListener(void *inClientData, UInt32 inInterruptionState) {
    if (inInterruptionState == kAudioSessionBeginInterruption)
    {
        _isInterrupted = YES;
        NSLog(@"CVAudioSession: interruptionListener kAudioSessionBeginInterruption. Disable audio session..");
        // Try just deactivating the audio session..
        OSStatus rc = AudioSessionSetActive(false);
        if (rc) {
            NSLog(@"CVAudioSession: interruptionListener kAudioSessionBeginInterruption - AudioSessionSetActive(false) returned %ld", rc);
        } else {
            NSLog(@"CVAudioSession: interruptionListener kAudioSessionBeginInterruption - AudioSessionSetActive(false) ok.");
        }
    } else if (inInterruptionState == kAudioSessionEndInterruption) {
        _isInterrupted = NO;
        // Reactivate the audio session
        OSStatus rc = AudioSessionSetActive(true);
        if (rc) {
            NSLog(@"CVAudioSession: interruptionListener kAudioSessionEndInterruption - AudioSessionSetActive(true) returned %ld", rc);
        } else {
            NSLog(@"CVAudioSession: interruptionListener kAudioSessionEndInterruption - AudioSessionSetActive(true) ok.");
        }
        [[NSNotificationCenter defaultCenter] postNotificationName:kCVAudioInterruptionEnded object:(__bridge NSObject *)inClientData userInfo:nil];
    }
}
// This is called when microphone or other audio devices are plugged in and out. Is on the main thread
void propListener(void *inClientData,
                  AudioSessionPropertyID inID,
                  UInt32 inDataSize,
                  const void *inData)
{
    if (inID == kAudioSessionProperty_AudioRouteChange)
    {
        CFDictionaryRef routeDictionary = (CFDictionaryRef)inData;
        CFNumberRef reason = (CFNumberRef)CFDictionaryGetValue(routeDictionary, CFSTR(kAudioSession_AudioRouteChangeKey_Reason));
        SInt32 reasonVal;
        CFNumberGetValue(reason, kCFNumberSInt32Type, &reasonVal);
        if (reasonVal != kAudioSessionRouteChangeReason_CategoryChange)
        {
            NSLog(@"CVAudioSession: input changed");
            [[NSNotificationCenter defaultCenter] postNotificationName:kCVAudioInputChangedNotification object:(__bridge NSObject *)inClientData userInfo:nil];
        }
    }
    else if (inID == kAudioSessionProperty_AudioInputAvailable)
    {
        if (inDataSize == sizeof(UInt32)) {
            UInt32 isAvailable = *(UInt32 *)inData;
            if (isAvailable == 0) {
                NSLog(@"AUDIO RECORDING IS NOT AVAILABLE");
            }
        }
    }
}
@end
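A minimal usage sketch (my addition; the observer selector name is an assumption): set the session up once, then watch for the custom notifications it posts.
// e.g. in viewDidLoad: start the session and listen for its notifications
[CVAudioSession setup];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(audioInputChanged:)
                                             name:kCVAudioInputChangedNotification
                                           object:nil];
NSLog(@"Current route: %@", [CVAudioSession currentAudioRoute]);
// ... and call [CVAudioSession destroy] when this screen goes away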

How to play looping sound with OpenAL on iPhone

I'm following a tutorial about playing sound with OpenAL. Everything works fine except that I can't make the sound loop. I believe I've set AL_LOOPING on the source, but it plays only once, and when it finishes the app blocks (it doesn't respond to taps on the play button). Any ideas about what's wrong with the code?
// start up openAL
// init device and context
- (void)initOpenAL
{
    // Initialization
    mDevice = alcOpenDevice(NULL); // select the "preferred device"
    if (mDevice) {
        // use the device to make a context
        mContext = alcCreateContext(mDevice, NULL);
        // set my context to the currently active one
        alcMakeContextCurrent(mContext);
    }
}
// open the audio file
// returns a big audio ID struct
- (AudioFileID)openAudioFile:(NSString *)filePath
{
    AudioFileID outAFID;
    // use the NSURL instead of a CFURLRef cuz it is easier
    NSURL *afUrl = [NSURL fileURLWithPath:filePath];
    // do some platform specific stuff..
#if TARGET_OS_IPHONE
    OSStatus result = AudioFileOpenURL((CFURLRef)afUrl, kAudioFileReadPermission, 0, &outAFID);
#else
    OSStatus result = AudioFileOpenURL((CFURLRef)afUrl, fsRdPerm, 0, &outAFID);
#endif
    if (result != 0) NSLog(@"cannot open file: %@", filePath);
    return outAFID;
}
// find the audio portion of the file
// return the size in bytes
- (UInt32)audioFileSize:(AudioFileID)fileDescriptor
{
    UInt64 outDataSize = 0;
    UInt32 thePropSize = sizeof(UInt64);
    OSStatus result = AudioFileGetProperty(fileDescriptor, kAudioFilePropertyAudioDataByteCount, &thePropSize, &outDataSize);
    if (result != 0) NSLog(@"cannot find file size");
    return (UInt32)outDataSize;
}
- (void)stopSound
{
    alSourceStop(sourceID);
}
- (void)cleanUpOpenAL:(id)sender
{
    // delete the sources
    alDeleteSources(1, &sourceID);
    // delete the buffers
    alDeleteBuffers(1, &bufferID);
    // destroy the context
    alcDestroyContext(mContext);
    // close the device
    alcCloseDevice(mDevice);
}
- (IBAction)play:(id)sender
{
    alSourcePlay(sourceID);
}
#pragma mark -
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
    [super viewDidLoad];
    [self initOpenAL];
    // get the full path of the file
    NSString *fileName = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"caf"];
    // first, open the file
    AudioFileID fileID = [self openAudioFile:fileName];
    // find out how big the actual audio data is
    UInt32 fileSize = [self audioFileSize:fileID];
    // this is where the audio data will live for the moment
    unsigned char *outData = malloc(fileSize);
    // this is where we actually get the bytes from the file and put them
    // into the data buffer
    OSStatus result = noErr;
    result = AudioFileReadBytes(fileID, false, 0, &fileSize, outData);
    AudioFileClose(fileID); // close the file
    if (result != 0) NSLog(@"cannot load effect: %@", fileName);
    //NSUInteger bufferID; // buffer is defined in head file
    // grab a buffer ID from openAL
    alGenBuffers(1, &bufferID);
    // jam the audio data into the new buffer
    alBufferData(bufferID, AL_FORMAT_STEREO16, outData, fileSize, 8000);
    //NSUInteger sourceID; // source is defined in head file
    // grab a source ID from openAL
    alGenSources(1, &sourceID);
    // attach the buffer to the source
    alSourcei(sourceID, AL_BUFFER, bufferID);
    // set some basic source prefs
    alSourcef(sourceID, AL_PITCH, 1.0f);
    alSourcef(sourceID, AL_GAIN, 1.0f);
    alSourcei(sourceID, AL_LOOPING, AL_TRUE);
    // clean up the buffer
    if (outData)
    {
        free(outData);
        outData = NULL;
    }
}
You should be able to release outData right after your alBufferData() call. To exclude it as the culprit, you can try the static buffer extension and manage the memory yourself. It's something like:
alBufferDataStaticProcPtr alBufferDataStaticProc = (alBufferDataStaticProcPtr)alcGetProcAddress(0, (const ALCchar *)"alBufferDataStatic");
alBufferDataStaticProc(bufferID, bitChanFormat, audioData, audioDataSize, dataFormat.mSampleRate);
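One caveat worth noting (my addition, not from the original answer): alBufferDataStatic makes OpenAL use your memory directly instead of copying it, so audioData must stay allocated until the buffer is deleted; you can't free it right after the call the way the original viewDidLoad frees outData.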