In a SpriteKit app, I am playing a click sound when the user moves a block on the screen, but this causes terrible lag (fps drops to near zero).
Code:
• self.audioPlayers is a strong NSMutableArray which holds the currently playing AVAudioPlayers and removes each one when its audioPlayerDidFinishPlaying: delegate method is called.
• The blockShouldMove: method compares the touch location to its previous location and only returns YES if the touch has moved far enough, so at most around 8 sounds play simultaneously.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *oneTouch = [touches anyObject];
    CGPoint location = [oneTouch locationInNode:self];
    if (![self blockShouldMove:location]) {
        return;
    }
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"click_01.mp3" ofType:nil];
    AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:soundFilePath] error:nil];
    [audioPlayer prepareToPlay];
    audioPlayer.volume = 1.0;
    audioPlayer.numberOfLoops = 0;
    audioPlayer.delegate = self;
    [audioPlayer play];
    [self.audioPlayers addObject:audioPlayer];
}
On both simulators and real devices (iOS and tvOS), if I make circles with my finger, the fps drops to almost nothing until well after I have released my finger.
If I remove the AVAudioPlayer code entirely and use [SKAction playSoundFileNamed:@"click_01.mp3" waitForCompletion:NO], everything works fine. Unfortunately, SKAction sound handling is unsuitable for my purpose because the volume cannot be set.
Is there anything that can be done to make this better?
This is an indirect sort of "answer"; it's really a comment, but it's too long to post as one.
I ran into something like this issue with a SpriteKit game. I solved it by dropping the AVAudioPlayers and instead using the scene's audio engine. I connected a bunch of AVAudioPlayerNodes to the main mixer node and started them playing (nothing). Whenever I wanted a sound effect, I'd grab the next audio player node (round robin; there were enough of them that I could be sure the next one would be idle) and schedule a preloaded sound buffer on it.
The point is that for more complicated audio in SpriteKit, you may need to go through the audio engine and build a sound graph appropriate for what you want to do. Multiple AVAudioPlayers existing at once seem to step on each other somehow and cause lag.
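A minimal sketch of that setup, assuming a fixed pool size and the asker's click file (kClickPoolSize and the nextNodeIndex property are placeholder names, not my actual game code):

#import <AVFoundation/AVFoundation.h>

// One-time setup, e.g. in the scene's -didMoveToView:.
static const NSUInteger kClickPoolSize = 8;

AVAudioEngine *engine = self.audioEngine;   // SKScene's built-in engine
NSURL *url = [[NSBundle mainBundle] URLForResource:@"click_01" withExtension:@"mp3"];
AVAudioFile *file = [[AVAudioFile alloc] initForReading:url error:NULL];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat
                                                         frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:NULL];

NSMutableArray<AVAudioPlayerNode *> *pool = [NSMutableArray array];
for (NSUInteger i = 0; i < kClickPoolSize; i++) {
    AVAudioPlayerNode *node = [[AVAudioPlayerNode alloc] init];
    [engine attachNode:node];
    [engine connectNode:node to:engine.mainMixerNode format:buffer.format];
    [node play];   // the node stays silent until a buffer is scheduled
    [pool addObject:node];
}

// Whenever a click is needed, take the next node round robin
// (nextNodeIndex is a hypothetical NSUInteger property):
AVAudioPlayerNode *next = pool[self.nextNodeIndex];
self.nextNodeIndex = (self.nextNodeIndex + 1) % kClickPoolSize;
next.volume = 1.0;   // per-effect volume, which playSoundFileNamed: can't do
[next scheduleBuffer:buffer atTime:nil options:0 completionHandler:nil];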
However, if you haven't tried SKAudioNode yet, I'd say do that first. You should be able to run an action on the audio node to adjust the volume. I didn't go that route since I couldn't get stereo panning to work the way I wanted with that setup, but if it covers what you need, it's probably simpler than setting up your own sound graph.
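For reference, the volume-via-action idea would look roughly like this (an untested sketch using the asker's file name):

SKAudioNode *click = [[SKAudioNode alloc] initWithFileNamed:@"click_01.mp3"];
click.autoplaysLooped = NO;
[self addChild:click];
// Later, set the volume and play in one grouped action:
[click runAction:[SKAction group:@[[SKAction changeVolumeTo:0.5 duration:0],
                                   [SKAction play]]]];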
If you want to see the solution I eventually used, you can look at https://github.com/bg2b/RockRats/blob/master/Asteroids/Sounds.swift
Related
I'm using AVAudioPlayer to play a short shot sound when a user clicks a button. The sound lasts about 3 seconds, and if the user hits the button multiple times, the shot should sound multiple times; if the user clicks twice within 2 seconds, the second sound should overlap the first.
My problem is that the shot only sounds every 3 seconds (if the user clicks rapidly) instead of on every hit of the button.
Inside viewDidLoad:
NSString *path = [[NSBundle mainBundle] pathForResource:@"shot" ofType:@"caf"];
urlShotCaf = [NSURL fileURLWithPath:path];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:urlShotCaf error:nil];
[player prepareToPlay];
And when a person clicks the shot button
- (IBAction)tap:(id)sender {
    clicks++;
    [player play];
}
Can I do this with AVAudioPlayer? Should I use another framework?
As stated in the AVAudioPlayer class reference:
Play multiple sounds simultaneously, one sound per audio player, with precise synchronization
I guess you need an AVAudioPlayer for every sound you want to play simultaneously. Otherwise you could use a simple library like CocosDenshion, which is easy to embed and powerful (it's built on top of OpenAL).
Just do
[[SimpleAudioEngine sharedEngine] playEffect:@"yoursound.wav"];
and you are done.
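If you'd rather stay with AVAudioPlayer, the one-player-per-overlapping-sound idea would look roughly like this (a sketch; activeShots is a hypothetical NSMutableArray iVar that holds players until they finish):

- (IBAction)tap:(id)sender {
    AVAudioPlayer *shot = [[AVAudioPlayer alloc] initWithContentsOfURL:urlShotCaf error:NULL];
    shot.delegate = self;
    [shot play];
    [activeShots addObject:shot];   // hold a strong reference while it plays
}

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    [activeShots removeObject:player];
}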
Please try this. It works for me.
- (IBAction)tap:(id)sender {
    if ([player isPlaying]) {
        [player stop];
        [player setCurrentTime:0.0];
    }
    [player play];
}
If the high-level frameworks fail, you can dip down to AUSampler - an AudioUnit sample player (typically used for playback of sampled instruments, drum sounds, and so on). It should have very fast response times and supports multiple active notes. Configure the sample's playback as one-shot. When the button is pressed, simulate a note-on event. You could also map different samples (audio recordings) to different notes and velocity ranges.
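Triggering the note-on is a one-liner once the sampler is set up (a sketch, assuming samplerUnit is an AUSampler you've already created in an AUGraph and loaded with the shot sample mapped to MIDI note 60):

#import <AudioToolbox/AudioToolbox.h>

// Status 0x90 = note-on on channel 0; note 60, velocity 127, no sample offset.
MusicDeviceMIDIEvent(samplerUnit, 0x90, 60, 127, 0);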
I am creating an application which displays a certain video based on an external event, which may require the playing video to change quickly - once per second or more. However, there must not be a gap or lag between the videos.
What is the best way to do this? There are only four videos, each about two megabytes.
I was considering creating four MPMoviePlayerControllers, and have their views added to the main view but hidden, and switching by pausing and hiding the current video, then unhiding and playing the next video. Is there a more elegant solution?
Edit
Here's some more information about my exact situation:
The different video frames share mostly common pixels, so it's OK for a frame to stick during a switch, but NOT okay for black frames to appear.
Each video is only about ten seconds long, and there are only four videos. The general state transitions are 1<->2<->3<->4->1.
The video playback must be compatible with simultaneous AVAudioRecorder recording. As far as I can tell, MPMoviePlayerController is not.
You'll need to set up and prebuffer all the video streams to avoid hiccups, so I don't think your multiple MPMoviePlayerController solution is too far off the mark.
However, that specific solution is potentially problematic because each movie player has its own UI. The UIs do not synchronize with each other, so one might be showing the control bar, another not; one might be in full screen mode, etc. Switching among them will cause visual discontinuities.
Since it sounds like your video switching is not necessarily user-initiated, I'm guessing you don't care too much about the standard video player UI.
I would suggest dropping down to the underlying layer, AV Foundation. In theory, you can just create an AVPlayerItem for each video. This is a stream-management object with no UI overhead, so it's perfect for what you're doing. You could then -- again, in theory -- create one AVPlayer and one AVPlayerLayer to handle the display. When you wanted to switch from one AVPlayerItem stream to another, you could call the AVPlayer's replaceCurrentItemWithPlayerItem: message to swap out the data stream.
I made a little test project (GitHub) to verify this, and unfortunately the straightforward solution isn't quite perfect. There is no video flow glitching, but in the transition from AVPlayer to AVPlayer, the presentation layer seems to briefly flash the previous movie's last-seen frame at the size appropriate to the next movie. It helps to allocate a separate AVPlayer object for each movie and swap them into a single, constant player layer. There still seems to be an instantaneous flash of background, but at least it's a subtle defect. Here's the gist of the code:
@interface ViewController : UIViewController
{
    NSMutableArray *players;
    AVPlayerLayer *playerLayer;
}
@property (nonatomic) IBOutlet UIView *videoView;
- (IBAction)selectVideo:(id)sender;
@end

@implementation ViewController
@synthesize videoView;
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSArray *videoTitles = [NSArray arrayWithObjects:@"Ultimate Dog Tease",
                            @"Backin Up", @"Herman Cain", nil];
    players = [NSMutableArray array];
    for (NSString *title in videoTitles) {
        AVPlayer *player = [AVPlayer playerWithURL:[[NSBundle mainBundle]
                            URLForResource:title withExtension:@"mp4"]];
        [player addObserver:self forKeyPath:@"status" options:0 context:nil];
        [players addObject:player];
    }
    playerLayer = [AVPlayerLayer playerLayerWithPlayer:[players objectAtIndex:0]];
    playerLayer.frame = self.videoView.layer.bounds;
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.videoView.layer addSublayer:playerLayer];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    [object removeObserver:self forKeyPath:@"status"];
    for (AVPlayer *player in players) {
        if (player.status != AVPlayerStatusReadyToPlay) {
            return;
        }
    }
    // All videos are ready to go
    [self playItemAtIndex:0];
}

- (void)playItemAtIndex:(NSUInteger)idx
{
    AVPlayer *newPlayer = [players objectAtIndex:idx];
    if (newPlayer != playerLayer.player) {
        [playerLayer.player pause];
        playerLayer.player = newPlayer;
    }
    [newPlayer play];
}
- (IBAction)selectVideo:(id)sender
{
    [self playItemAtIndex:((UILabel *)sender).tag];
}
@end
Half the code is there just to observe the state of the players and make sure that playback doesn't start until all videos have been buffered.
Allocating three separate AVPlayerLayers (in addition to three AVPlayers) prevents any sort of flash. Unfortunately, an AVPlayer connected to an undisplayed AVPlayerLayer seems to assume that it doesn't need to maintain a video buffer. Every switch among layers then produces a transient video stutter. So that's no good.
A couple of things to note when using AV Foundation are:
1) The AVPlayer object doesn't have built-in support for looped playback. You'll have to observe for the end of the current video and manually seek back to time zero (see the sketch after this list).
2) There's no UI at all other than the video frame, but again, I'm guessing that this might actually be an advantage for you.
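The looping workaround from note 1 might look like this (a sketch; playerItem stands for whichever AVPlayerItem is current):

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:playerItem];

- (void)playerItemDidReachEnd:(NSNotification *)note {
    [(AVPlayerItem *)note.object seekToTime:kCMTimeZero];
    [playerLayer.player play];
}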
The MPMoviePlayerController is a singleton. Four instances will share the same pointer, the same view, etc. With the native player, I think you have only two alternatives: one is to change the contentURL property when you want a transition. If the latency that way is unacceptable, the other alternative is to produce a longer video with the shorter clips concatenated. You can create very quick jumps within the single, longer clip by setting currentPlaybackTime.
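A sketch of the concatenated-clip alternative (the ten-second clip length comes from the question's edit):

// Jump to the start of clip N inside the single concatenated movie:
moviePlayer.currentPlaybackTime = 10.0 * clipIndex;   // each clip is ~10 s long
[moviePlayer play];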
I'm trying to make an app where a short sound sample is played while the user drags his/her finger(s) across the screen. When the finger(s) are lifted off the screen, the sound should stop.
This is the current function that triggers the sound (I've tried various methods):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"Yes, it's starting...");
    return YES;
}

- (void)ccTouchMoved:(NSSet *)touch withEvent:(UIEvent *)event {
    soundFile = [[CDAudioManager sharedManager] audioSourceForChannel:kASC_Right];
    [soundFile load:@"sound.wav"];
    soundFile.backgroundMusic = NO;
    soundFile = [[[SimpleAudioEngine sharedEngine] soundSourceForFile:@"sound.wav"] retain];
    [soundFile play];
}
This is the function that stops the sound:
- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    [soundFile stop];
}
I first started out using ccTouchBegan (just to get some kind of sound working), which looped the sound seamlessly. At that point, ccTouchEnded worked together with the "Touch Up Inside" event.
The point, as I said, is that the sound is supposed to play while the user drags his/her finger(s) across the screen. But when I tied the playSound function to ccTouchMoved, the sound looped repeatedly over itself instead of one at a time, making it hell to use. The stopSound function stopped working after I changed to ccTouchMoved.
I tried to use NSTimer to create some kind of way to handle the loops, but without any success.
I started this project with the regular iOS SDK and found its limitations when I discovered I wasn't able to handle pitch & gain manipulation without Cocos2d.
I got everything working in the regular SDK by wrapping it in an if statement:
if (![mySound isPlaying]) {
    [mySound play];
}
This, as I said, worked perfectly fine in the regular SDK, but not now when I'm using Cocos2d.
ccTouchMoved will be called continuously as the finger moves along the screen. The problem you are having is that each time it is called, you load a new sound file, and the sounds overlap because they are newly created, individual objects. You only have a reference to the final sound you loaded (which is what soundFile points at), and you aren't freeing up the memory either.
Example:
(as you drag your finger)
LoadedSoundA created and starts playing
soundfile points to LoadedSoundA
// finger moves
LoadedSoundB created and starts playing
soundfile points to LoadedSoundB
// finger moves
LoadedSoundC created and starts playing
soundfile points to LoadedSoundC
... etc
The only sound you have a pointer to at any moment is the last created sound, since you reassign soundFile each time, so you can only 'stop' the sound you created last.
You are also leaking a lot of memory, since you retain all of these sounds and never release them.
I would suggest a different tactic:
In touchesBegan you should load the sound, have it play on loop, and record the time of the touch into a class-level iVar.
Then, in touchesMoved, get the time of the current touch and see if it is close enough to the time you recorded. If it is within, say, 0.5 seconds, just update the recorded timestamp and continue. However, if it has been too long since the last touch, stop the sound that is playing (see the sketch below).
This way you have a seamless sound being played, it is only created once and you maintain your ownership of it.
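A rough sketch of that tactic (illustrative names; lastMoveTime is a CFTimeInterval iVar and dragSound is the looping source created when the touch begins):

- (void)ccTouchMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    lastMoveTime = CACurrentMediaTime();   // just refresh the timestamp
}

// Called from a repeating NSTimer or your update loop:
- (void)checkDragSound {
    if (CACurrentMediaTime() - lastMoveTime > 0.5) {
        [dragSound stop];   // finger has effectively stopped moving
    }
}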
Hope this helps
I have an application which has two kinds of sounds:
1. Background sounds, looping forever until the next view.
2. Sound clips that can be played separately or all in one go.
Previously I tried using AVAudioPlayer, but there was always a small interval of silence between two clips (I didn't find any solution for this; if you have one, please post it in a reply), so I shifted to Finch, e.g. http://github.com/zoul/Finch/tree/master
Now here is the deal:
I am using AVAudioPlayer to play looping forever sounds while Finch to play those back-to-back clips. I am doing this:
- (void)viewDidLoad {
    [super viewDidLoad];
    soundEngine = [[Finch alloc] init];
    /***
    CLIPPED
    ***/
}

- (void)playclip:(id)sender {
    NSString *fullPath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"sitar.wav"];
    NSLog(@"Song is %@", fullPath);
    clip = [[Sound alloc] initWithFile:fullPath];
    NSLog(@"Clip Length: %@", clip.length);
    NSLog(@"Clip Playing ? : %d", clip.playing);
    [clip play];
    NSLog(@"Played Clip");
}
This is almost the same code as in the demo project provided in the git repository. But for some reason I get this on the console:
Song is /Users/shoaibi/Library/Application Support/iPhone Simulator/User/Applications/17FBA26F-6047-4D56-9E45-ADAFE07B7234/Test.app/sitar.wav
Failed to create OpenAL source, error code a004.
Clip Length: (null)
Clip Playing ? : 0
Played Clip
An ls on the directory printed to the console reveals that sitar.wav is there with the correct permissions. What could be the issue? Maybe the background-sound AVAudioPlayer, which I stop using:
- (void)viewWillAppear:(BOOL)animated {
    // Stop the background music before my ears bleed :P
    d = [[UIApplication sharedApplication] delegate];
    if (d.bg.playing)
    {
        [d.bg stop];
        NSLog(@"Background music stopped");
    }
}
is not freeing up the sound resources?
The latest Finch now checks whether OpenAL is initialized when you initialize a new Sound. If the current OpenAL context is not set because you forgot or failed to initialize Finch, you get a better error message. Give it a try; it might be the case that your initialization code is buggy.
I got the same error when I allocated the sound clips in my viewWillAppear - it seems it was called before awakeFromNib finished doing its stuff. Anyway, shifting the allocs so they happened later fixed it.
I'm using AVAudioPlayer to play sound FX on a separate thread in my app. Right now my app is fairly simple: I have a UIScrollView for a scrolling background and a couple of CALayers that represent a player and an enemy in a simple side-scroller type game (think Mario or Kung Fu or whatnot). I can move the player left and right, with the view scrolling appropriately with no problems. I have an "attack" button that plays a sound effect, and many times when I play the sound effect, I get a little bit of a hitch, graphically. Is this just par for the course on the iPhone using Quartz? Is there some way to make this perform a little more smoothly without starting to investigate OpenGL?
Use NSData dataWithContentsOfMappedFile instead of dataWithContentsOfFile.
Also, AVAudioPlayer handles its own threading; you don't need to create one. [AVAudioPlayer play] returns once playback has started, not after it is done playing.
And don't forget to release your AVAudioPlayer when you no longer need it. For example, if you only want to play a sound once, implement the audioPlayerDidFinishPlaying:successfully: delegate method and release the player there.
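Putting those points together, a rough pre-ARC sketch (the file name is a placeholder):

NSString *path = [[NSBundle mainBundle] pathForResource:@"attack" ofType:@"caf"];
AVAudioPlayer *fx = [[AVAudioPlayer alloc]
    initWithData:[NSData dataWithContentsOfMappedFile:path] error:NULL];
fx.delegate = self;
[fx play];   // returns immediately; no extra thread needed

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    [player release];   // one-shot sound: release it here
}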
How are you playing the sound? If I were to make a guess, it would be that each time the sound is played, the OS is actually loading it from disk as well as playing it.
Update:
Ok, you really don't need to create a new thread to play a sound. That alone could cause a stutter in a game, and it's also redundant: the AVAudioPlayer play method is asynchronous, returning immediately rather than when the sound ends.
I had the same problem with slowdown. To solve it, when the sound first loads I play and immediately pause it at volume 0; this works better than prepareToPlay, which still seems to result in slowdown:
player.volume = 0.0;
[player play];
[player pause];
player.volume = 1.0;
When it comes time to play a sound, I just call [player play];. Then, every update cycle, I loop through all my active sounds and catch the ones that are about to end, rewinding them instead of letting them expire; this avoids the overhead and slowdown of starting up the sound from scratch. The example here stops each sound 1/15th of a second before its end:
for (AVAudioPlayer *player in activeSounds)   // your collection of active players
{
    if ((player.duration - player.currentTime) <= (1.0 / 15.0))
    {
        [player pause];
        player.currentTime = 0.0;
    }
}
Now it's ready to go for the next play. Seems to work fine!
Update: Turns out I was still getting some pauses the very first time the sound was played. I think playing it at zero volume wasn't actually doing anything, but I changed it to play at 0.0001 volume and then pause - it's inaudible but seems to do the job of prepping the sound.
Had the same problem at first. Per the documentation, you need to call prepareToPlay every time a sound has finished playing. Adopting the AVAudioPlayerDelegate protocol and adding the following method will do this.
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    [player prepareToPlay];
}
(Using an answer rather than a comment for length and so I can show code):
My sound playing method is as follows:
- (void)playSoundThreaded:(id)soundPlayer
{
    if (soundPlayer == nil)
        return;
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    [(AVAudioPlayer *)soundPlayer play];
    [pool release];
}

- (void)playSound:(AVAudioPlayer *)player
{
    if (player == nil)
        return;
    [NSThread detachNewThreadSelector:@selector(playSoundThreaded:)
                             toTarget:self withObject:(id)player];
}
And the sound is loaded up as follows:
fxPlayer = [[AVAudioPlayer alloc] initWithData:
               [NSData dataWithContentsOfFile:
                   [[NSBundle mainBundle] pathForResource:@"voicehiya" ofType:@"caf"]] error:NULL];
We've had some inkling that the sound is being reloaded from disk every time...but have no idea as to why that is or how to prevent it.
I think the thread is overcomplicating your player. Try this instead:
/* set up the sound player */
AVAudioPlayer *player1 = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:audioFilePath] error:NULL];
player1.numberOfLoops = 0;
player1.volume = 1.0f;
[player1 prepareToPlay];
and then playSound becomes simply
[player1 play];
And don't forget to have the AudioSession category set correctly too.
Why not just use AudioServicesPlaySystemSound? This is well suited to sound effects.
I use it in my game (vConqr) which is also based on a UIScrollView, with numerous other views (so all having their own CALayer), and it doesn't significantly impact performance.
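For completeness, the system-sound route is only a few lines (a sketch; the file name is a placeholder, and the sound needs to be a short, uncompressed file such as .caf or .wav):

#import <AudioToolbox/AudioToolbox.h>

// Create the sound ID once (e.g. at startup) and reuse it:
SystemSoundID attackSound;
NSURL *url = [[NSBundle mainBundle] URLForResource:@"attack" withExtension:@"caf"];
AudioServicesCreateSystemSoundID((CFURLRef)url, &attackSound);

// Whenever the effect is needed:
AudioServicesPlaySystemSound(attackSound);

// When the sound is no longer needed:
AudioServicesDisposeSystemSoundID(attackSound);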
I've been investigating this for a few days now, thanks to a project I'm on at work. I added some home-rolled profiling to my code and found that the calls to play sounds were causing frame-rate issues in my OpenGL game. I've got a few recommendations for you based on what I've found:
Use an AVAudioPlayer playing an mp3 for your background music.
Use OpenAL for your sound effects.
For sound effects, use Apple's audio converter command-line tool (afconvert) to get your sounds into .caf format. This is an uncompressed format that doesn't require the CPU to decompress your sound into memory before sending it to the sound card; it can go directly, saving you time. Instructions on using the tool are in Apple's documentation (scroll to the very bottom of the page).
Use the lowest bit rates that still sound good. You don't really need 128 kbps; you can probably get away with 20 for a lot of things. A lower bit rate means less processing and faster sound.
I'll let you know if I find anything else that's useful! Good luck!