I'm using the exact code from the Multimedia Support chapter of the iPhone Application Programming Guide to record a file to disk with AVAudioRecorder and then load and play that file with AVAudioPlayer.
This is working fine in the simulator but is not working on the device. The file gets loaded (we can see the NSTimeInterval) but does not play (play returns false).
After it didn't work with the sample code from the website, we tried changing to a bunch of different codecs with no success. And of course, the sound is on.
Thanks a bunch.
SOLVED!
The problem was that after you set the AVAudioSession category to Record for recording, you have to switch it back to a playback category before playing. Sounds obvious now, but you know how it is. The Simulator uses a different underlying audio implementation, so this problem does not show up there.
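For reference, a minimal sketch of that fix, assuming recorder and player are the existing AVAudioRecorder and AVAudioPlayer instances (error handling trimmed):

#import <AVFoundation/AVFoundation.h>

NSError *sessionError = nil;

// Before recording: put the session in the Record category.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];
[recorder record];   // recorder = your existing AVAudioRecorder

// ...later, before playback: stop recording and switch to a playback category.
[recorder stop];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
BOOL started = [player play];   // player = your existing AVAudioPlayer; should now return YES on the device too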
I have an Audio Source linked to an Audio Listener looping some music, but I cannot hear anything. There are no error messages. I have tried muting the audio and then un-muting it, restarting Unity, and adjusting the values on the Audio Source, including pitch, volume, and looping, but the problem continues. What is even more confusing is that when I started the project the audio was working well; then I did some adjusting in Project Settings > Physics and the audio suddenly stopped. I do not have any code involved in the audio. When I added another Audio Source to test whether the Audio Source itself was the problem, it still produced no sound, so I believe the problem lies elsewhere (though I might be wrong). Is there any other way to mute audio that I have missed? Thanks for any help.
Check the Global Volume setting under Project Settings > Audio and make sure it is not set to 0. That worked for me.
I'm trying to play an audio file in a cocos2d application. Here is the line which tries to play the sound:
[[SimpleAudioEngine sharedEngine] playEffect:@"pig_squeal.wav"];
If I put a log near this line, the log appears, and I can play the sound with iTunes. But when the sound should be played, there is a message displayed:
AudioStreamBasicDescription: 2 ch, 44100 Hz, 'lpcm' (0x00000C2C) 8.24-bit little-endian signed integer, deinterleaved
What's the problem?
This was discussed in the comments, but I've gathered all the possibilities for why it may not work here:
That's not an error message, just some information.
Is the sound definitely in your library and a part of the target?
Have you changed the volume of the SimpleAudioEngine, or is the volume of your device turned all the way down?
Click the sound in your library. Press Option+Command+1 to bring up the File Inspector. Scroll down to "Target Membership" and ensure the sound is checked for your target.
Try a different sound effect too, and try to narrow the problem down to "is it SimpleAudioEngine?"
Also try playBackgroundMusic for the sound (this was the solution in this case; see the sketch after this list).
And try an MP3.
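A quick way to test the playBackgroundMusic suggestion, reusing the file name from the question (a sketch only; the preload is optional):

[[SimpleAudioEngine sharedEngine] preloadBackgroundMusic:@"pig_squeal.wav"];
[[SimpleAudioEngine sharedEngine] playBackgroundMusic:@"pig_squeal.wav" loop:NO];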
There isn't a problem. It's a status message written to the log when you initialise the current OpenAL context using alcMakeContextCurrent. As far as I'm aware, you can't get rid of the message.
There is no problem with your code for playing the sound. Please check that the sound file has been added to the project, and also check the format of the sound. It should play the sound whenever you call playEffect:. Please also try to preload the sound effect in the init method.
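A sketch of that preloading suggestion, assuming a standard cocos2d layer:

// In the layer's init, before the effect is ever needed:
[[SimpleAudioEngine sharedEngine] preloadEffect:@"pig_squeal.wav"];

// Later, at the point where the effect should fire:
[[SimpleAudioEngine sharedEngine] playEffect:@"pig_squeal.wav"];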
That message means the song was read correctly and should be playing. Possible reasons you don't hear it:
The volume is turned down.
The device is malfunctioning.
The audio has silence in it.
The audio file is large and will take a long time to load.
The sound has been redirected to come out of the headphones or the earpiece (even if none are attached).
But the sound is loading and most likely playing.
If you are getting a crash while running on the device from Xcode, please try this (it worked for me): disconnect the device from Xcode and run the app on the device directly. I don't know why it worked like that, but when I did this there was no crash.
I'm developing an interactive storybook type application for the iPhone and I've recently encountered a frustrating bug concerning audio mixing on the device.
First, I set up an audio session. I set the category to AVAudioSessionCategoryAmbient and then init and play my AVAudioPlayer instance. While the audio is playing, I'm pre-loading a video in the background to play using an MPMoviePlayerController, followed by a call to prepareToPlay. The reason I pre-load the video this way is that I need it to play instantly later, on cue, with fairly strict timing.
In this configuration, the audio/movie works fine and they mix and do not interrupt each other. However, this particular audio session category does not permit audio to continue playing while the device is locked, a feature I really need. As a result I'm forced to consider a different category: AVAudioSessionCategoryPlayback.
By default this category does not permit mixing with other audio, according to the Apple docs. To enable mixing with other audio, I am overriding the relevant session property:
OSStatus propertySetError = 0;
UInt32 setProperty = 1;   // non-zero enables the override
// Ask the Playback-category session to mix with other audio instead of interrupting it.
propertySetError = AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(setProperty), &setProperty);
assert(propertySetError == 0);
Unfortunately, this solves my playing whilst locked issue but introduces another issue: the AVAudioPlayer audio is interrupted briefly as the video loads with a minor stutter. The stutter is small, perhaps less than a second but is enough to disrupt the user experience. I've read this related post which enabled me to pre-load the video with the AVAudioSessionCategoryAmbient, but unfortunately the same approach doesn't seem to work with the new category.
The audio session category is applied successfully, according to the return code. Does anyone know why enabling audio mixing with this category is not the same as the mixing facility provided by the Ambient category?
The best way I've found while working on a similar problem is to use the newer AVPlayer (+1 @adam), set your app to enable background audio, and receive remote control notifications. I was tipped off to this by @MarquelV following "How can you play music from the iPod app while still receiving remote control events in your app?"
If you can get backgrounding working properly, that should enable you to continue playing while the device is locked. Oh, and don't forget to add the keys to Info.plist; they're easy to miss, and then you have no idea why it isn't working.
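Roughly what that setup looks like on a newer SDK (a sketch, not the original poster's code; the options-based category API is available from iOS 6, and the plist entry is the standard UIBackgroundModes "audio" value):

// Info.plist: add "audio" to the UIBackgroundModes array so playback continues while locked.

NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                 withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                       error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];

// To receive remote control events, e.g. in a view controller:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];   // and implement -remoteControlReceivedWithEvent: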
I'm using AVAudioRecorder and Audio Queue Services to record audio from the iPhone's mic, but I want to start recording automatically once sound is detected, so there's no need to press a record button.
I've looked at a tutorial about the AVFoundation:
www.iphoneam.com/blog/index.php?title=using-the-iphone-to-record-audio-a-guide&more=1&c=1&tb=1&pb=1
and also a tutorial that explains how the iPhone mic can detect sound levels. I've tried combining the two tutorials, but I can't get the code to work at all.
I can't figure out what to do, and I'm not sure whether I'm taking the wrong approach.
Any advice?
Thanks
Check out this tutorial: http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/
I got something similar working using the above tutorial. I have one AVAudioRecorder monitoring the mic's volume and another that records the mic's input when a certain threshold is reached.
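Roughly, the monitoring half can look like this (a sketch under assumptions: monitorRecorder, levelTimer, the -20 dB threshold, the timer interval, and startActualRecording are illustrative names and values, not code from either tutorial):

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

- (void)startMonitoring {
    // A throwaway recorder whose only job is metering the mic level.
    NSURL *monitorURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"monitor.caf"]];
    NSDictionary *settings = @{ AVFormatIDKey: @(kAudioFormatAppleIMA4),
                                AVSampleRateKey: @44100.0,
                                AVNumberOfChannelsKey: @1 };
    NSError *error = nil;
    self.monitorRecorder = [[AVAudioRecorder alloc] initWithURL:monitorURL
                                                       settings:settings
                                                          error:&error];
    self.monitorRecorder.meteringEnabled = YES;
    [self.monitorRecorder record];

    // Poll the input level a few times per second.
    self.levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.25
                                                       target:self
                                                     selector:@selector(checkLevel:)
                                                     userInfo:nil
                                                      repeats:YES];
}

- (void)checkLevel:(NSTimer *)timer {
    [self.monitorRecorder updateMeters];
    // averagePowerForChannel: returns dB, roughly -160 (silence) up to 0 (full scale).
    if ([self.monitorRecorder averagePowerForChannel:0] > -20.0f) {   // placeholder threshold
        [self.levelTimer invalidate];
        [self startActualRecording];   // placeholder: start the second, real AVAudioRecorder here
    }
}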
I'm using an AVAudioPlayer to play a mono AIFF file from my app's Documents directory.
On the simulator, [player play] returns YES and I hear the file playing. On the device the play method returns NO and nothing happens. I grabbed the file via the Organizer - it plays back fine on my Mac and seems to be well-formed. I realize the simulator can access codecs that aren't available on the iPhone, but that shouldn't matter in this case, should it?
I'm at a loss as to how to debug this. Any suggestions?
I asked this question on Apple's forum and someone there asked if I had activated an Audio Session. I hadn't, and doing so fixed my problem.
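For anyone hitting the same thing, activating a session looks roughly like this (a sketch; the Playback category is an assumption based on this use case):

NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[session setActive:YES error:&sessionError];
// Then create the AVAudioPlayer and call play as before.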
Make sure you do not use an absolute path to the audio file. When you move to the device, the paths change. You need to get the path to the Documents folder using the standard methods and then append the relative path to your audio file. That way you get a correct path regardless of the hardware or other changes.
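Something along these lines (a sketch; the file name is illustrative):

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *audioPath = [documentsDirectory stringByAppendingPathComponent:@"recording.aiff"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc]
    initWithContentsOfURL:[NSURL fileURLWithPath:audioPath] error:&error];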
Create a delegate for the audio player, and implement -audioPlayerDecodeErrorDidOccur:error: to see if any error happens.
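For example (a sketch; the logging is illustrative):

player.delegate = self;   // the class must conform to <AVAudioPlayerDelegate>

- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error {
    NSLog(@"Decode error: %@", [error localizedDescription]);
}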