I have a wav file that was working fine. It is played using the AudioServices methods. Suddenly it stopped working. The weird thing is that if I change the wav file to a different one, that one works. Any idea what is going on? The non-working sound is slightly longer (still under 10 seconds), but it was originally working, so I just can't figure it out.
Any suggestions of what to try would be most appreciated. Thanks :-)
AudioServices won't play sounds longer than 5 seconds. Playing longer sounds will require a different audio approach.
Here's a decent example of playing sounds with another method:
http://iphoneincubator.com/blog/tutorial/how-to-play-audio-with-the-iphone-sdk
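For example, a minimal AVAudioPlayer sketch for longer files might look like this (the file name longSound.wav and the player property are placeholders, not code from the linked tutorial):

#import <AVFoundation/AVFoundation.h>

// Sketch: play a longer sound with AVAudioPlayer.
// Assumes self.player is declared as a strong AVAudioPlayer property so the
// player is not deallocated before the sound finishes.
- (void)playLongSound {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"longSound" withExtension:@"wav"];
    NSError *error = nil;
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    if (!self.player) {
        NSLog(@"Could not create player: %@", error);
        return;
    }
    [self.player prepareToPlay];
    [self.player play];
}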
I am developing an application in which I want to record sound and then play back the recorded file. I know which frameworks are needed, but how do I actually use them programmatically?
You can refer to this link; I have implemented this code in one of my apps and it works completely fine:
How do I record audio on iPhone with AVAudioRecorder?
For playing the recording back, use AVAudioPlayer rather than AVAudioRecorder.
Hope this helps.
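As a rough sketch of how the two classes fit together (the file name, format settings, and the recorder/player properties below are illustrative placeholders, not the code from the linked answer):

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>   // for kAudioFormatAppleIMA4

// Assumes self.recorder (AVAudioRecorder) and self.player (AVAudioPlayer)
// are strong properties on this class.

- (NSURL *)recordingURL {
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    return [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:@"recording.caf"]];
}

- (void)startRecording {
    // Recording requires the play-and-record session category.
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    NSDictionary *settings = @{ AVFormatIDKey: @(kAudioFormatAppleIMA4),
                                AVSampleRateKey: @44100.0,
                                AVNumberOfChannelsKey: @1 };
    NSError *error = nil;
    self.recorder = [[AVAudioRecorder alloc] initWithURL:[self recordingURL]
                                                settings:settings
                                                   error:&error];
    [self.recorder prepareToRecord];
    [self.recorder record];
}

- (void)stopAndPlayBack {
    [self.recorder stop];
    NSError *error = nil;
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:[self recordingURL] error:&error];
    [self.player play];
}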
The best way to do it - and I am talking from painful experience here - is with the RemoteIO audio unit. You can also do it with AudioQueue, but it has a higher latency, and the queue type approach becomes very problematic.
So, I think that they are really different tools for different jobs. Note that you won't play a sound file as such. You will play the contents of a buffer held in memory. As long as the buffer is not too large, this should not be an issue.
So, going with RemoteIO, you will find this blog and tutorial very useful. It includes code samples.
Using RemoteIO audio unit, by Michael Tyson
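To give an idea of the plumbing involved, here is a bare-bones RemoteIO setup sketch (buffer management is omitted, and the PCM format and static unit variable are assumptions for illustration; see the tutorial above for a complete treatment):

#import <AudioToolbox/AudioToolbox.h>
#import <AudioUnit/AudioUnit.h>

static AudioUnit sIOUnit;   // kept around for the lifetime of playback

// Render callback: fill ioData with samples from your in-memory buffer.
// (The actual copy from your buffer is omitted; this only shows the plumbing.)
static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // Copy inNumberFrames worth of audio from your buffer into ioData->mBuffers here.
    return noErr;
}

static void SetUpRemoteIO(void)
{
    // Find and instantiate the RemoteIO output unit.
    AudioComponentDescription desc = {0};
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    AudioComponent component = AudioComponentFindNext(NULL, &desc);
    AudioComponentInstanceNew(component, &sIOUnit);

    // 16-bit mono PCM at 44.1 kHz; adjust to match your buffer's format.
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    format.mBitsPerChannel   = 16;
    format.mChannelsPerFrame = 1;
    format.mBytesPerFrame    = 2;
    format.mFramesPerPacket  = 1;
    format.mBytesPerPacket   = 2;
    AudioUnitSetProperty(sIOUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &format, sizeof(format));

    // Attach the render callback that supplies audio to the output bus.
    AURenderCallbackStruct callback = { RenderCallback, NULL };
    AudioUnitSetProperty(sIOUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &callback, sizeof(callback));

    AudioUnitInitialize(sIOUnit);
    AudioOutputUnitStart(sIOUnit);
}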
I am developing an app where the user can record their voice and then alter it in some way. I have implemented OpenAL, and I am able to adjust the pitch to speed up and slow down the audio file. The thing is, I want to add filters like echo, reverb, etc. I have scoured the internet for hours and have found nothing to help me. I came across an OpenAL-based library called FreeSL, which has a bunch of filters built in, but I cannot get it to compile in Xcode.
I have also looked into Dirac3, but again all I am seeing is basic pitch/time controls; no echoes or anything.
Can anyone point me in the direction of a good framework, or explain how OpenAL can handle filters like this?
Thanks!
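For reference, the pitch adjustment described above is just a per-source OpenAL property; a minimal sketch (device, context, and buffer setup omitted):

#include <OpenAL/al.h>

// 'source' must be a valid OpenAL source with a buffer already attached.
static void PlayWithPitch(ALuint source, float pitch)
{
    alSourcef(source, AL_PITCH, pitch);   // > 1.0 raises pitch / speeds up, < 1.0 lowers / slows down
    alSourcePlay(source);
}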
I found a library that is exactly what I am looking for, FMOD:
http://www.fmod.org/index.php/fmod
Hi all,
I'm working on an AVPlayer app to run on an iPad and am just getting my head around it. I'm currently stuck on an issue: I can't seem to get the AVPlayer to safely restart a currently playing file. The video doesn't need to have finished before the restart command is called.
[player[currentPage] seekToTime:startTime toleranceBefore:startTime toleranceAfter:startTime];
[player[currentPage] play];
is what I'm using at the moment, and it works the majority of the time, but sometimes the video seems to go out of sync with the sound. That is to say, the sound restarts fine, but the video appears paused and doesn't resume until the sound catches back up with it.
Does anyone have experience with this, or can you point me in the right direction to research it myself? I have gone through the AVPlayer class documentation on the Apple dev site and have had no luck searching for similar problems.
Thanks for any help
G
Check out my answer for this question:
iOS Multiple AVPlayer objects results in loss of audio/video sync
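One pattern worth trying (a sketch of the general idea, not necessarily the exact fix in the linked answer) is to seek with zero tolerance and only resume playback once the seek has actually completed:

#import <AVFoundation/AVFoundation.h>

// Sketch: restart an item from the beginning and resume playback only after the
// (sample-accurate) seek has completed, so audio and video start together.
static void RestartPlayer(AVPlayer *player)
{
    [player pause];
    [player seekToTime:kCMTimeZero
       toleranceBefore:kCMTimeZero
        toleranceAfter:kCMTimeZero
     completionHandler:^(BOOL finished) {
         if (finished) {
             [player play];
         }
     }];
}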
I am using Finch to play sound. Works great. One exception: I get an incoming call, answer the call, hang up. Go back to the app. Now sounds don't seem to play correctly anymore. What is the most resource-friendly way of ensuring they will? I guess the audio session is somehow closed...
Consider just using the CocosDenshion sound library. We have found it solves all of these problems; it's not perfect, but it is very reliable. Hope it helps!
Note there is also the ObjectAL library, which is possibly simply better than CocosDenshion.
You have to set up your own OpenAL audio interruption handler.
An example of how to do this is found in Apple's SDK example called oalTouch.
See:
https://developer.apple.com/library/ios/#samplecode/oalTouch/Introduction/Intro.html
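The core idea is a sketch like the following, using the C audio session interruption callback that the oalTouch sample is built around (the openALContext passed as client data is an assumed placeholder for whatever ALCcontext your sound library created):

#import <AudioToolbox/AudioToolbox.h>
#import <OpenAL/alc.h>

// Called by the system when an interruption (e.g. a phone call) begins or ends.
// 'inClientData' is whatever you passed to AudioSessionInitialize; here it is
// assumed to be the ALCcontext used for OpenAL playback.
static void MyInterruptionListener(void *inClientData, UInt32 inInterruptionState)
{
    ALCcontext *openALContext = (ALCcontext *)inClientData;

    if (inInterruptionState == kAudioSessionBeginInterruption) {
        // Detach the OpenAL context while the call is active.
        alcMakeContextCurrent(NULL);
    } else if (inInterruptionState == kAudioSessionEndInterruption) {
        // Reactivate the audio session, then reattach the OpenAL context.
        AudioSessionSetActive(true);
        alcMakeContextCurrent(openALContext);
    }
}

// Early in app startup, register the listener, e.g.:
// AudioSessionInitialize(NULL, NULL, MyInterruptionListener, openALContext);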
I'm new to programming in Objective-C and on the iPhone, but I am working on an app to teach myself. I have been trying to figure out how to record sounds on the iPhone. Apple provides excellent documentation for recording from the microphone with AVAudioRecorder, but I want to record sounds made by my app, or even just record sounds when buttons are pressed (a button is pressed, certain audio gets recorded).
I have no clue how to do this and can't seem to find anything that would help me along this path, only microphone stuff.
Can anyone share ideas or code that would make this possible?
Thanks
Once you move beyond the convenience classes like AVAudioRecorder, you've got a lot of studying to do.
I would start with:
Getting Started with Audio and Video
Multimedia Programming Guide: Audio
Core Audio Overview
Audio programming is a very large and complex subject. If you want to customize, be prepared to spend some time learning it.
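To give a concrete flavour of what is involved: one common approach is to tap the audio your app generates in a render callback and append each buffer to a file with Extended Audio File Services. A rough sketch, with the file type and call sites as assumptions:

#import <AudioToolbox/AudioToolbox.h>

// Rough sketch: create a CAF file and append PCM buffers to it as your app
// produces them (for example, from a RemoteIO render callback).
static ExtAudioFileRef CreateRecordingFile(CFURLRef fileURL,
                                           const AudioStreamBasicDescription *pcmFormat)
{
    ExtAudioFileRef file = NULL;
    // Write the file in the same linear PCM format the app is producing.
    ExtAudioFileCreateWithURL(fileURL, kAudioFileCAFType, pcmFormat,
                              NULL, kAudioFileFlags_EraseFile, &file);
    // Tell Extended Audio File Services what format the incoming buffers use.
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(*pcmFormat), pcmFormat);
    // Prime the asynchronous writer off the real-time thread.
    ExtAudioFileWriteAsync(file, 0, NULL);
    return file;
}

// Call this with each buffer of app-generated audio.
static void AppendBuffer(ExtAudioFileRef file, UInt32 numFrames,
                         const AudioBufferList *bufferList)
{
    // Asynchronous write, safe to call from a real-time render callback.
    ExtAudioFileWriteAsync(file, numFrames, bufferList);
}

// When you are done recording:
static void FinishRecording(ExtAudioFileRef file)
{
    ExtAudioFileDispose(file);
}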
I've made a sample app that records sounds/audio and has playback. Hope this helps! https://github.com/casspangell/AudioMic