I hope someone else has an insight into this problem...
I am trying to play a custom sound that is loaded into my app's resources. I created the sound using the afconvert tool, as recommended in the Converting audio to CAF format for playback on iPhone using OpenAL thread. The specific command executed to produce my custom sound file is:
afconvert ~/Desktop/moof.au ~/Desktop/moof.caf -d ima4 -f caff -v
When I run my app on a test device (iPhone 5, iOS 7.0; or iPhone 4, iOS 6.1.3) and send a push notification containing the name of my custom sound file, the device vibrates but no sound plays.
Here's where it gets a little weird... If I set a breakpoint at the start of the method that plays the sound, and single-step through the code, the device plays the custom sound and vibrates in response to the push notification!
Here's the code in question:
- (void)application:(UIApplication *)application didReceiveRemoteNotification:(NSDictionary *)userInfo
{
    // If the app is running and receives a remote notification, the app calls this method to process
    // the notification payload. Your implementation of this method should use the app state to take an
    // appropriate course of action.
    ALog(@"push data package: %@", userInfo);
    NSDictionary *payload = [userInfo objectForKey:@"aps"];
    NSNumber *badgeNumber = [payload objectForKey:@"badge"];
    NSString *soundName = [payload objectForKey:@"sound"];
    // app is in foreground
    if (application.applicationState == UIApplicationStateActive) {
        ALog(@"active");
        if (badgeNumber) {
            application.applicationIconBadgeNumber = [badgeNumber integerValue];
        }
        [self playPushNotificationAlertSound:soundName];
    }
    // app is in background
    else if (application.applicationState == UIApplicationStateBackground) {
        ALog(@"background");
        if (badgeNumber) {
            application.applicationIconBadgeNumber = [badgeNumber integerValue];
        }
        [self playPushNotificationAlertSound:soundName];
    }
    // app was launched from inactive state
    else if (application.applicationState == UIApplicationStateInactive) {
        ALog(@"inactive");
        if (badgeNumber) {
            application.applicationIconBadgeNumber = [badgeNumber integerValue];
        }
    }
}

- (void)playPushNotificationAlertSound:(NSString *)soundName
{
    ALog(@"soundName = '%@'", soundName);
    if (![soundName isEqualToString:@"default"]) {
        NSURL *soundURL = [[NSBundle mainBundle] URLForResource:soundName withExtension:@"caf"];
        if (soundURL) {
            CFURLRef soundFileURLRef = (__bridge CFURLRef)soundURL;
            SystemSoundID soundFileObject = 0;
            OSStatus status = AudioServicesCreateSystemSoundID(soundFileURLRef, &soundFileObject);
            if (status == kAudioServicesNoError) {
                AudioServicesPlayAlertSound(soundFileObject);
                AudioServicesDisposeSystemSoundID(soundFileObject);
            }
            ALog(@"soundFileURLRef = %@", soundFileURLRef);
            ALog(@"soundFileObject = %i, status = %i", (unsigned int)soundFileObject, (int)status);
        }
    }
}
And here's the console log without breakpoints active, and no custom sound plays:
2013-09-22 11:02:40.123 MyAppUnderDevelopment[2489:60b] push data package: {
"_" = BlyZ4SOYEeOEc5DiugJ6IA;
aps = {
alert = "Test #1";
badge = 1;
sound = moof;
};
}
2013-09-22 11:02:40.124 MyAppUnderDevelopment[2489:60b] active
2013-09-22 11:02:40.136 MyAppUnderDevelopment[2489:60b] soundName = 'moof'
2013-09-22 11:02:40.138 MyAppUnderDevelopment[2489:60b] soundFileURLRef = file:///var/mobile/Applications/31785684-2EFA-4FEB-95F3-3A6B82B16A4A/MyAppUnderDevelopment.app/moof.caf
2013-09-22 11:02:40.139 MyAppUnderDevelopment[2489:60b] soundFileObject = 4100, status = 0
And the console log with breakpoint active, single-stepping, and the sound plays:
2013-09-22 11:03:29.891 MyAppUnderDevelopment[2489:60b] push data package: {
"_" = "I_yGQCOYEeO515DiugJkgA";
aps = {
alert = "Test #2";
badge = 2;
sound = moof;
};
}
2013-09-22 11:03:29.892 MyAppUnderDevelopment[2489:60b] active
2013-09-22 11:03:33.752 MyAppUnderDevelopment[2489:60b] soundName = 'moof'
2013-09-22 11:03:40.757 MyAppUnderDevelopment[2489:60b] soundFileURLRef = file:///var/mobile/Applications/31785684-2EFA-4FEB-95F3-3A6B82B16A4A/MyAppUnderDevelopment.app/moof.caf
2013-09-22 11:03:45.356 MyAppUnderDevelopment[2489:60b] soundFileObject = 4101, status = 0
Any helpful ideas on what's going wrong?
After preparing my question, I was running through my test sequence to ensure I had correctly recorded the steps I had taken. While single-stepping quickly through the code, I discovered that calling...
AudioServicesPlayAlertSound(soundFileObject);
AudioServicesDisposeSystemSoundID(soundFileObject);
...in this way caused the soundFileObject to be disposed before the asynchronous sound had a chance to play. The short-term solution was to turn the soundFileObject into a retained property, and use lazy instantiation to create it as required.
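For reference, here is a minimal sketch of that fix; the property and method names are my own, and since SystemSoundID is a plain integer the property is assign rather than retained:

// Class extension: keep the sound object alive across the asynchronous playback.
// (AppDelegate and alertSoundID are illustrative names, not the exact code from my app.)
@interface AppDelegate ()
@property (nonatomic, assign) SystemSoundID alertSoundID;
@end

// Lazily create the system sound the first time it is needed.
- (SystemSoundID)alertSoundID
{
    if (_alertSoundID == 0) {
        NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"moof" withExtension:@"caf"];
        if (soundURL) {
            AudioServicesCreateSystemSoundID((__bridge CFURLRef)soundURL, &_alertSoundID);
        }
    }
    return _alertSoundID;
}

// Play without disposing immediately; dispose exactly once, when the object goes away.
- (void)playAlert
{
    AudioServicesPlayAlertSound(self.alertSoundID);
}

- (void)dealloc
{
    if (_alertSoundID != 0) {
        AudioServicesDisposeSystemSoundID(_alertSoundID);
    }
}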
Hope this helps someone else who may be banging their head against a wall.
Related
I'm attempting to play an AVAudioFile using AVAudioEngine. The code is largely taken and adapted from the Apple Developer online videos, but there is no playback. I've spent some time going through the forums, but nothing seems to throw any light on it.
I have two methods. The first one calls the standard open-file dialog, opens the audio file, allocates an AVAudioBuffer object (which I will use later) and fills it with the audio data. The second one sets up the AVAudioEngine and AVAudioPlayerNode objects, connects everything up and plays the file. The two methods are listed below.
- (IBAction)getSoundFileAudioData:(id)sender {
    NSOpenPanel *openPanel = [NSOpenPanel openPanel];
    openPanel.title = @"Choose a .caf file";
    openPanel.showsResizeIndicator = YES;
    openPanel.showsHiddenFiles = NO;
    openPanel.canChooseDirectories = NO;
    openPanel.canCreateDirectories = YES;
    openPanel.allowsMultipleSelection = NO;
    openPanel.allowedFileTypes = @[@"caf"];
    [openPanel beginWithCompletionHandler:^(NSInteger result){
        if (result == NSFileHandlingPanelOKButton) {
            NSURL *theAudioFileURL = [[openPanel URLs] objectAtIndex:0];
            // Open the document.
            theAudioFile = [[AVAudioFile alloc] initForReading:theAudioFileURL error:nil];
            AVAudioFormat *format = theAudioFile.processingFormat;
            AVAudioFrameCount capacity = (AVAudioFrameCount)theAudioFile.length;
            theAudioBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format frameCapacity:capacity];
            NSError *error;
            if (![theAudioFile readIntoBuffer:theAudioBuffer error:&error]) {
                NSLog(@"problem filling buffer");
            }
            else {
                playOrigSoundFileButton.enabled = true;
            }
        }
    }];
}
- (IBAction)playSoundFileAudioData:(id)sender {
    AVAudioEngine *engine = [[AVAudioEngine alloc] init];          // set up the audio engine
    AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];  // set up a player node
    [engine attachNode:player];                                    // attach player node to engine
    AVAudioMixerNode *mixer = [engine mainMixerNode];
    [engine connect:player to:mixer format:theAudioFile.processingFormat]; // connect player node to mixer
    [engine connect:player to:mixer format:[mixer outputFormatForBus:0]];  // connect player node to mixer
    NSError *error;
    if (![engine startAndReturnError:&error]) {
        NSLog(@"Problem starting engine ");
    }
    else {
        [player scheduleFile:theAudioFile atTime:nil completionHandler:nil];
        [player play];
    }
}
I have checked that the functions are being executed, that the audio engine is running and the audio file has been read. I have also tried with different audio file formats, both mono and stereo. Any ideas?
I'm running Xcode 7.3.1.
Declare your AVAudioPlayerNode *player as a global (or instance variable) so it isn't deallocated as soon as the method returns.
See this related question: Can't play file from documents in AVAudioPlayer
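For what it's worth, a minimal sketch of that suggestion, using strong properties on a hypothetical AudioController class rather than a true global (theAudioFile is the file opened in the first method):

// Strong references keep the engine graph alive after the action method
// returns; locals would be released by ARC, silencing playback.
@interface AudioController ()
@property (nonatomic, strong) AVAudioEngine *engine;
@property (nonatomic, strong) AVAudioPlayerNode *player;
@end

- (IBAction)playSoundFileAudioData:(id)sender {
    self.engine = [[AVAudioEngine alloc] init];
    self.player = [[AVAudioPlayerNode alloc] init];
    [self.engine attachNode:self.player];
    [self.engine connect:self.player
                      to:self.engine.mainMixerNode
                  format:theAudioFile.processingFormat];
    NSError *error;
    if (![self.engine startAndReturnError:&error]) {
        NSLog(@"Problem starting engine: %@", error);
        return;
    }
    [self.player scheduleFile:theAudioFile atTime:nil completionHandler:nil];
    [self.player play];
}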
I have an app for iPhone and iPad that plays an audio stream using AVPlayer. I am using the same player as Apple's StitchedStreamPlayer sample, but I made some changes to play music instead of video.
When I run the app, I can listen for a few seconds, but then the device restarts and the following error is displayed:
Terminating in response to SpringBoard's termination.
(When I run through Xcode on the device it plays for some minutes, but when I unplug the device and run the app again, it crashes.)
I am using an iPhone 4 and an iPad mini for testing; neither is jailbroken and both are on iOS 6.
The code is quite big, but here are some parts:
header:
@interface NewPlayer : NSObject <AVAudioSessionDelegate>
@property (strong) AVPlayer *player;
@property (strong) AVPlayerItem *playerItem;
Some important methods of the implementation:
-(void)play:(NSString *)audio
{
    /* Has the user entered an audio URL? */
    NSURL *audioUrl = [NSURL URLWithString:audio];
    if ([audioUrl scheme]) /* Sanity check on the URL. */
    {
        /*
         Create an asset for inspection of a resource referenced by a given URL.
         Load the values for the asset keys "tracks", "playable".
         */
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:audioUrl options:nil];
        NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];
        /* Tells the asset to load the values of any of the specified keys that are not already loaded. */
        [asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:
         ^{
             dispatch_async(dispatch_get_main_queue(),
                            ^{
                                /* IMPORTANT: Must dispatch to main queue in order to operate on the AVPlayer and AVPlayerItem. */
                                [self prepareToPlayAsset:asset withKeys:requestedKeys];
                            });
         }];
    }
}
- (void)prepareToPlayAsset:(AVURLAsset *)asset withKeys:(NSArray *)requestedKeys
{
    /* Make sure that the value of each key has loaded successfully. */
    for (NSString *thisKey in requestedKeys)
    {
        NSError *error = nil;
        AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
        if (keyStatus == AVKeyValueStatusFailed)
        {
            [self assetFailedToPrepareForPlayback:error];
            return;
        }
        /* If you are also implementing the use of -[AVAsset cancelLoading], add your code here to bail
           out properly in the case of cancellation. */
    }

    /* Use the AVAsset playable property to detect whether the asset can be played. */
    if (!asset.playable)
    {
        /* Generate an error describing the failure. */
        NSString *localizedDescription = NSLocalizedString(@"Item cannot be played", @"Item cannot be played description");
        NSString *localizedFailureReason = NSLocalizedString(@"The asset's tracks were loaded, but could not be made playable.", @"Item cannot be played failure reason");
        NSDictionary *errorDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   localizedDescription, NSLocalizedDescriptionKey,
                                   localizedFailureReason, NSLocalizedFailureReasonErrorKey,
                                   nil];
        NSError *assetCannotBePlayedError = [NSError errorWithDomain:@"StitchedStreamPlayer" code:0 userInfo:errorDict];
        /* Display the error to the user. */
        [self assetFailedToPrepareForPlayback:assetCannotBePlayedError];
        return;
    }

    /* At this point we're ready to set up for playback of the asset. */

    /* Stop observing our prior AVPlayerItem, if we have one. */
    if (self.playerItem)
    {
        /* Remove existing player item key value observers and notifications. */
        [self.playerItem removeObserver:self forKeyPath:kStatusKey];
        [[NSNotificationCenter defaultCenter] removeObserver:self
                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                      object:self.playerItem];
    }

    /* Create a new instance of AVPlayerItem from the now successfully loaded AVAsset. */
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];

    /* Observe the player item "status" key to determine when it is ready to play. */
    [self.playerItem addObserver:self
                      forKeyPath:kStatusKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:MyStreamingAudioViewControllerPlayerItemStatusObserverContext];

    /* When the player item has played to its end time we'll toggle
       the movie controller Pause button to be the Play button */
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:self.playerItem];

    /* Create new player, if we don't already have one. */
    if (![self player])
    {
        /* Get a new AVPlayer initialized to play the specified player item. */
        [self setPlayer:[AVPlayer playerWithPlayerItem:self.playerItem]];

        /* Observe the AVPlayer "currentItem" property to find out when any
           AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did occur. */
        [self.player addObserver:self
                      forKeyPath:kCurrentItemKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:MyStreamingAudioViewControllerCurrentItemObservationContext];
    }

    /* Make our new AVPlayerItem the AVPlayer's current item. */
    if (self.player.currentItem != self.playerItem)
    {
        /* Replace the player item with a new player item. The item replacement occurs
           asynchronously; observe the currentItem property to find out when the
           replacement will/did occur. */
        [[self player] replaceCurrentItemWithPlayerItem:self.playerItem];
        [self syncPlayPauseButtons];
    }
}
- (void)observeValueForKeyPath:(NSString *)path
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    /* AVPlayerItem "status" property value observer. */
    if (context == MyStreamingAudioViewControllerPlayerItemStatusObserverContext)
    {
        [self syncPlayPauseButtons];
        AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        switch (status)
        {
            /* Indicates that the status of the player is not yet known because
               it has not tried to load new media resources for playback */
            case AVPlayerStatusUnknown:
            {
                NSLog(@"desconhecido"); // "unknown"
            }
            break;

            case AVPlayerStatusReadyToPlay:
            {
                /* Once the AVPlayerItem becomes ready to play, i.e.
                   [playerItem status] == AVPlayerItemStatusReadyToPlay,
                   its duration can be fetched from the item. */
                NSLog(@"ready to play");
                [player play];
                [self.delegate tocandoMusica]; // "music playing"
            }
            break;

            case AVPlayerStatusFailed:
            {
                AVPlayerItem *thePlayerItem = (AVPlayerItem *)object;
                [self assetFailedToPrepareForPlayback:thePlayerItem.error];
                NSLog(@"falhou"); // "failed"
                [self.delegate acabouMusica]; // "music ended"
            }
            break;
        }
    }
    /* AVPlayer "rate" property value observer. */
    else if (context == MyStreamingAudioViewControllerRateObservationContext)
    {
        //[self syncPlayPauseButtons];
    }
    /* AVPlayer "currentItem" property observer.
       Called when the AVPlayer replaceCurrentItemWithPlayerItem:
       replacement will/did occur. */
    else if (context == MyStreamingAudioViewControllerCurrentItemObservationContext)
    {
        AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
        /* New player item null? */
        if (newPlayerItem == (id)[NSNull null])
        {
            //[self disablePlayerButtons];
            //[self disableScrubber];
        }
        else /* Replacement of player currentItem has occurred */
        {
            /* Specifies that the player should preserve the video's aspect ratio and
               fit the video within the layer's bounds. */
            [self syncPlayPauseButtons];
        }
    }
    /* Observe the AVPlayer "currentItem.timedMetadata" property to parse the media stream
       timed metadata. */
    else if (context == MyStreamingAudioViewControllerTimedMetadataObserverContext)
    {
        //NSArray *array = [[player currentItem] timedMetadata];
        //for (AVMetadataItem *metadataItem in array)
        //{
        //}
    }
    else
    {
        [super observeValueForKeyPath:path ofObject:object change:change context:context];
    }
}
If you want to take a deeper look, see the StitchedStreamPlayer sample; I have no idea what is going wrong. I have looked at:
Failed to play audio file using AVPlayer in iPhone
memory leak in AudioToolbox library AVAudioPlayer
AudioToolBox leak in iOS6?
and many others...
I have tried abandoning this whole implementation and using just:
player = [AVPlayer playerWithURL:[NSURL URLWithString:url]];
[player play];
but it crashes!
Any ideas?
EDITED
I have tried MPMoviePlayerController, but the same thing happened: the music started and then the device restarted.
This is the code I have used:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL];
player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:[[arrRadios objectAtIndex:indexPath.row] objectForKey:@"url"]]];
[player play];
I set the "application does not run in background" in my info.plist, so when user tap home button, app quits.
When my [UIApplication -appWillTerminate:] called, I will schedule 64 local notifications to system, all of them are non-repeating.
but that take a seemingly long time(6.17 seconds) on a iPhone4 with iOS6.0.1.
When I look at the time profiler, I found that the curve is very strange, it don't take much CPU time, but it do take a lot of time.
Also when I look at the call tree, 93% of the time is spent on [UIApplication -scheduleLocalNotification:] in the time range showed in the image.
Why?
This is how I generate my notifications:
UILocalNotification *n = [[[UILocalNotification alloc] init] autorelease];
n.alertBody = @"some body";
n.hasAction = YES;
n.alertAction = @"some action";
n.fireDate = someDate; // an NSDate
n.repeatInterval = 0;
n.soundName = @"my sound";
n.userInfo = aDictionaryWithAStringAbout10CharactersLongAnd2NSNumbers;
[self.notifications addObject:n];
This is how I schedule my notifications:
-(void)endProxyAndWriteToSystemLocalNotification
{
    _proxying = NO;
    NSDate *dateAnchor = [NSDate date];
    NSEnumerator *enumerator = [self.notifications objectEnumerator];
    NSInteger i = 0;
    while (i < maxLocalNotifCount) {
        UILocalNotification *n = [enumerator nextObject];
        if (!n) {
            break;
        }
        if ([n.fireDate timeIntervalSinceDate:dateAnchor] >= 0) {
            [[UIApplication sharedApplication] scheduleLocalNotification:n];
            i++;
        }
    }
    [self.notificationDatas removeAllObjects];
}
This would help:
-(void)endProxyAndWriteToSystemLocalNotification {
    [[UIApplication sharedApplication] setScheduledLocalNotifications:self.notifications];
}
This is available in iOS 4.2 and later; read the UIApplication Class Reference for a detailed description.
I think the problem is that you are trying to schedule 64 local notifications one at a time. Is there a reason to do all of these on app termination? Apple's scheduleLocalNotification: was not designed to be called so many times at termination.
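If you still need the fire-date filtering from the question, a sketch combining it with the batch setter might look like this (maxLocalNotifCount and self.notifications are the names from the question):

// Filter to future-firing notifications, then hand the whole batch to the
// system in one call instead of 64 scheduleLocalNotification: round-trips.
- (void)endProxyAndWriteToSystemLocalNotification
{
    NSDate *dateAnchor = [NSDate date];
    NSMutableArray *toSchedule = [NSMutableArray array];
    for (UILocalNotification *n in self.notifications) {
        if ([n.fireDate timeIntervalSinceDate:dateAnchor] >= 0) {
            [toSchedule addObject:n];
            if (toSchedule.count >= maxLocalNotifCount) break;
        }
    }
    [[UIApplication sharedApplication] setScheduledLocalNotifications:toSchedule];
}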
I am writing an application that uses the AVAudioRecorder class. It works great except when a phone call comes in. I am handling this per Apple's guidelines, using the AVAudioRecorderDelegate methods:
- (void)audioRecorderBeginInterruption:
- (void)audioRecorderEndInterruption:
It works great until the interruption ends and I attempt to resume the recording by calling the record method again (per the documentation). However, it does not resume my recording; instead it throws out the old one and starts an entirely new one in its place. I have not been able to find a solution to this problem. If anyone has figured this out, or knows whether it is a bug in Apple's AVAudioRecorder, please let me know. I really hope I do not have to write this using Audio Queues.
Thanks
Looks like it's a bug in Apple's API. Great fun...
This was the response we received from a support ticket.
"The behavior you described is a bug and unfortunately there's nothing in the API that you can change to work around to actually append to the original recording. The interruption is resulting in capturing only the audio recorded after the interruption. You could try and stop the recording after the interruption then creating a new file after which would at least not cause the user to loose any information, but the result would be two separate files.
Please file a bug report at for this issue since bugs filed by external developers are critical when iOS engineering is evaluating critical features of fixes to address. It's easily reproducible but if you have a test app you can include please do, iOS engineering like having apps that show the bug directly.
"
My solution was:
Start recording to a temp file.
Watch for AVAudioSessionInterruptionNotification.
On AVAudioSessionInterruptionTypeBegan - stop the recording.
On AVAudioSessionInterruptionTypeEnded - start a new recording.
When the user stops - merge the files (a sketch of this step follows the full code below).
Full code:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(audioSessionInterruptionNotification:)
                                             name:AVAudioSessionInterruptionNotification
                                           object:audioSession];

-(void)audioSessionInterruptionNotification:(NSNotification *)notification {
    NSString *seccReason = @"";
    // Check the type of notification, especially if you are sending multiple AVAudioSession events here
    NSLog(@"Interruption notification name %@", notification.name);
    if ([notification.name isEqualToString:AVAudioSessionInterruptionNotification]) {
        seccReason = @"Interruption notification received";
        // Check to see if it was a Begin interruption
        if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeBegan]]) {
            seccReason = @"Interruption began";
            NSLog(@"Interruption notification name %@ audio pause", notification.name);
            dispatch_time_t restartTime = dispatch_time(DISPATCH_TIME_NOW, 0.01 * NSEC_PER_SEC);
            dispatch_after(restartTime, dispatch_get_global_queue(0, 0), ^{
                AVAudioRecorder *recorder = [[self recorderPool] objectForKey:lastRecID];
                if (recorder) {
                    if (recorder.isRecording) {
                        [recorder stop];
                        NSLog(@"Interruption notification name Pausing recording %@", lastRecID);
                    } else {
                        NSLog(@"Interruption notification name Already Paused %@", lastRecID);
                    }
                } else {
                    NSLog(@"Interruption notification name recording %@ not found", lastRecID);
                }
                NSLog(@"Interruption notification Pausing recording status %d", recorder.isRecording);
            });
        } else if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeEnded]]) {
            seccReason = @"Interruption ended!";
            NSLog(@"Interruption notification name %@ audio resume", notification.name);
            // Start a new recording in a fresh temp file
            dispatch_time_t restartTime = dispatch_time(DISPATCH_TIME_NOW, 0.1 * NSEC_PER_SEC);
            dispatch_after(restartTime, dispatch_get_global_queue(0, 0), ^{
                AVAudioRecorder *recorder = [[self recorderPool] objectForKey:lastRecID];
                NSLog(@"Interruption notification Resuming recording status %d", recorder.isRecording);
                if (recorder) {
                    if (!recorder.isRecording) {
                        NSString *filePath = [[self orgFileNames] objectForKey:lastRecID];
                        NSArray *fileNames = [[self fileNames] objectForKey:lastRecID];
                        NSString *tmpFileName = [self gnrTempFileName:filePath AndNumber:fileNames.count];
                        [[[self fileNames] objectForKey:lastRecID] addObject:tmpFileName];
                        NSURL *url = [NSURL fileURLWithPath:tmpFileName];
                        NSError *error = nil;
                        recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&error];
                        if (![recorder record]) {
                            NSLog(@"Interruption notification Error Resuming recording %@", error);
                            return;
                        }
                        [[self recorderPool] setObject:recorder forKey:lastRecID];
                        NSLog(@"Interruption notification name Resuming recording %@", lastRecID);
                    } else {
                        NSLog(@"Interruption notification Already Recording %d", recorder.isRecording);
                    }
                } else {
                    NSLog(@"Interruption notification name recording %@ not found", lastRecID);
                }
            });
        }
    }
}
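The merge step itself isn't shown above. A minimal sketch of one way to do it, assuming AVFoundation's composition and export APIs (the method name and mergedFileURL are illustrative):

// Hypothetical helper: append each temp file to one AVMutableComposition,
// then export the result as a single audio file.
- (void)mergeFiles:(NSArray *)paths toURL:(NSURL *)mergedFileURL
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime cursor = kCMTimeZero;
    for (NSString *path in paths) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:path] options:nil];
        AVAssetTrack *srcTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        if (!srcTrack) continue;
        // Insert this segment at the current end of the composition track.
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        [track insertTimeRange:range ofTrack:srcTrack atTime:cursor error:nil];
        cursor = CMTimeAdd(cursor, asset.duration);
    }
    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetAppleM4A];
    export.outputURL = mergedFileURL;
    export.outputFileType = AVFileTypeAppleM4A;
    [export exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"merge finished with status %ld", (long)export.status);
    }];
}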
Try using this piece of code:
-(IBAction)pauseandplay:(id)sender
{
    BOOL status = [player isPlaying];
    if (status)
    {
        [pauseplay setImage:[UIImage imageNamed:@"play.png"]];
        [player pause];
    }
    else
    {
        [pauseplay setImage:[UIImage imageNamed:@"icon-pause.png"]];
        [player play];
        updateTimer = [NSTimer scheduledTimerWithTimeInterval:.01 target:self selector:@selector(updateCurrentTime) userInfo:player repeats:YES];
    }
}
I have used the code from Apple's example on this page: Link, but I can't seem to get the sound to repeat. I have checked other applications, such as Skype (VOIP) and Alarm Clock Pro (audio?), but I cannot get the sound file to repeat.
This is my code:
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    AlarmHandler *AHinstance = getAlarmHandlerInstance();
    UIApplication *app = [UIApplication sharedApplication];
    NSArray *alarmList = [AHinstance getAlarms];
    NSArray *oldNotifications = [app scheduledLocalNotifications];

    if ([oldNotifications count] > 0)
    {
        [app cancelAllLocalNotifications];
    }

    for (Alarm *theAlarm in alarmList) {
        NSDate *alarmDate = [theAlarm getNearestActivationDate];
        Package *alarmPackage = [theAlarm getAlarmPackage];
        NSArray *fileList = [alarmPackage getVoiceFileListForBackgroundNotificationWithHour:theAlarm.larmHour];

        if (alarmDate == nil) continue;

        UILocalNotification *alarm = [[[UILocalNotification alloc] init] autorelease];
        if (alarm)
        {
            NSLog(@"File: %@", [fileList objectAtIndex:0]);
            alarm.fireDate = alarmDate;
            alarm.timeZone = [NSTimeZone defaultTimeZone];
            alarm.soundName = [fileList objectAtIndex:0];
            alarm.alertBody = @"Time to wake up!";
            alarm.repeatInterval = 0;
            [app scheduleLocalNotification:alarm];
        }
    }
}
Any suggestions on how I can fix this?
I have had suggestions to register the app as an audio player and play sounds in the background, but it seems that Apple does not take kindly to such applications when they aren't real audio players, and denies those apps.
Regards,
Paul Peelen
There is no way to do this for local notifications. You can either register as a VOIP app or as a "background audio" app, which have separate APIs. However, if you do not provide appropriate functionality to qualify for those kinds of uses, you'll most likely be rejected.
Yes this is possible, as the documentation states:
Your own applications can schedule up to 128 simultaneous notifications, any of which can be configured to repeat at a specified interval
You just need to configure the repeatInterval property:
The calendar interval at which to reschedule the notification.
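For instance, a minimal sketch (the fire date and sound file name are illustrative) of a notification whose sound replays each minute until the notification is cancelled:

// Schedule a local notification that re-fires, and replays its sound,
// every minute. The sound must be a bundled file of 30 seconds or less.
UILocalNotification *alarm = [[UILocalNotification alloc] init];
alarm.fireDate = [NSDate dateWithTimeIntervalSinceNow:60];
alarm.timeZone = [NSTimeZone defaultTimeZone];
alarm.alertBody = @"Time to wake up!";
alarm.soundName = @"alarm.caf"; // illustrative file name
alarm.repeatInterval = NSCalendarUnitMinute; // NSMinuteCalendarUnit on older SDKs
[[UIApplication sharedApplication] scheduleLocalNotification:alarm];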