Sound Not Playing In iOS - Not A Source Code Problem

I want to play a sound in my application. Nothing special, just a little WAV file. The file exists on the device in my main bundle, and I've got both the AVFoundation and AudioToolbox frameworks added. I have alternately tried using both of them, testing out every bit of sample code on the net for playing this sound. It won't work. So what else could be wrong, given that the file exists and the code snippet probably isn't the problem?
Is it to do with imports? I import the necessary .h files for the frameworks at the top of the file.
I've noticed that one of the methods requires there to be a delegate that implements a particular protocol. Is this necessary? Some examples require it, others don't. What do I need to implement for this?
This fails to play the sound both on the device and the simulator. Have I overlooked anything else?

Is it to do with imports?
No. Problems with imports will be apparent at build time.
I've noticed that one of the methods requires there to be a delegate that implements a particular protocol. Is this necessary?
No. The AVAudioPlayer delegate is optional.
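For completeness, here is a minimal Swift sketch of what adopting that optional delegate looks like (the class name is mine; you only need this if you want callbacks such as playback-finished):

    import AVFoundation

    // Optional: adopt AVAudioPlayerDelegate only if you want callbacks,
    // e.g. to learn when playback finished or was interrupted.
    final class PlaybackObserver: NSObject, AVAudioPlayerDelegate {
        func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
            print("Finished playing, success: \(flag)")
        }
    }

Assign an instance to the player's delegate property (and keep a strong reference to it) before calling play.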
This fails to play the sound both on the device and the simulator. Have I overlooked anything else?
The most likely explanation is that the file is not being properly referenced. Alternatively, the file may be in a format the player does not support, in which case it fails silently. Try testing with another file type.
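As a concrete test, here is a minimal Swift sketch of that diagnosis, using a hypothetical "tap.wav" resource: if the guard fires, the file reference is the problem; if the catch fires, the format probably is.

    import AVFoundation

    var player: AVAudioPlayer?  // keep a strong reference, or playback stops immediately

    func playTapSound() {
        // "tap.wav" is a hypothetical resource name - substitute your own file.
        guard let url = Bundle.main.url(forResource: "tap", withExtension: "wav") else {
            print("File not found in bundle - check target membership and spelling.")
            return
        }
        do {
            player = try AVAudioPlayer(contentsOf: url)
            player?.prepareToPlay()
            player?.play()
        } catch {
            // URL resolved but the player rejected the data: likely an unsupported format.
            print("AVAudioPlayer failed: \(error)")
        }
    }

Note the player is stored in a property: a common silent failure is letting the AVAudioPlayer be deallocated at the end of the method, which stops playback before it is audible.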

In my case today, an mp3 worked fine on the Simulator (and in iTunes and everywhere else I tested it), but not on a device. I converted the mp3 to an aif and it worked fine on the device.
I know this is more worthy of a comment than an answer, but I lack the rep to comment.

Related

Swift playing sound doesn't seem to work on my app

I've tried using the SwiftySound lib, but for some reason it didn't work. I've also tried to play a sound using AVFoundation, but that doesn't seem to work either (either I get nil for the URL or it simply doesn't play). The device is not muted and the speakers work fine. Does anyone know what it might be?
Make sure you are adding your sound files to your project and also to your application target. It may be that AVFoundation or SwiftySound is not able to find your sound file. Also check out the example project of SwiftySound:
pod try SwiftySound
Also check out other similar questions: iOS Sound not playing in Swift
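A quick sanity check, assuming SwiftySound's documented Sound.play(file:) API and a hypothetical "dog.wav" file:

    import AVFoundation
    import SwiftySound

    // If this prints, the file was never copied into the app bundle, so
    // neither SwiftySound nor AVFoundation can find it.
    if Bundle.main.url(forResource: "dog", withExtension: "wav") == nil {
        print("dog.wav missing from bundle - check Target Membership in Xcode.")
    }

    // SwiftySound's basic call, per its README:
    Sound.play(file: "dog.wav")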

How to encode artist name and image into an audio recording in iOS?

I am working on a Music app, and one of its features is recording the user's voice and playing it back. So far everything is under control. Yesterday I had an idea and straight away started Googling: adding artist names and an album image to my recorded audio using AVAudioRecorder. But there has not been much success with it.
I have also looked at the AV Foundation Audio Settings Constants to set the AVAudioRecorder settings, but failed with this too.
You can probably use an existing audio tagging library, so after creating the file you can use it to add the required data. I did a quick search and found these libraries:
SonatinaTag: It was made for OS X, but it may work on iOS (the project states that it has very few external requirements). Not sure if it supports writing.
TagLib-ObjC: A wrapper for the popular TagLib. Seems to be in development.
TagLib: TagLib, I know, is pure C++ pain, but maybe it's not that hard to use.
Good luck!
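If you would rather stay inside AVFoundation than pull in a tagging library, one route is to re-export the finished recording with common metadata attached via AVAssetExportSession. A sketch, with hypothetical function and file names:

    import AVFoundation

    func tagRecording(at inputURL: URL, to outputURL: URL,
                      artist: String, artworkData: Data) {
        let asset = AVAsset(url: inputURL)
        guard let export = AVAssetExportSession(asset: asset,
                                                presetName: AVAssetExportPresetAppleM4A) else { return }

        // Common metadata identifiers cover artist name and artwork.
        let artistItem = AVMutableMetadataItem()
        artistItem.identifier = .commonIdentifierArtist
        artistItem.value = artist as NSString

        let artItem = AVMutableMetadataItem()
        artItem.identifier = .commonIdentifierArtwork
        artItem.value = artworkData as NSData

        export.metadata = [artistItem, artItem]
        export.outputURL = outputURL
        export.outputFileType = .m4a
        export.exportAsynchronously {
            print("Export status: \(export.status.rawValue), error: \(String(describing: export.error))")
        }
    }

Whether a given player displays these tags depends on the container format, so test with the formats you actually ship.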

AVAssetExportSession missing audio track when exporting on device

I run the export on the simulator and everything works great. I run it on the device and the video gets exported, but there's no audio. This leads me to believe that I must be using an audio format that the device doesn't support but OS X does, as the simulator uses what OS X uses. I've tried m4a, aiff, and aifc and have had no luck! Any ideas?
I have a very similar problem. It does not seem to have to do with codecs, as I made a separate test case that runs fine with the same video. There's a related question that says the problem might be in playing the same assets using MPMoviePlayerController. That got me on the right track (sort of).
In my case the trouble stemmed from using the assets in an AVPlayer during the export. I was not able to find the exact combination that causes the export to drop the audio track – in the separate test project the export runs fine even though the asset plays in an AVPlayer at the same time. After several hours of trying to find the exact cause I gave up and simply popped the asset out of the player using replaceCurrentItemWithPlayerItem:nil during export. It's a hack, but it works (see the sketch below).
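In today's Swift spelling, the workaround amounts to something like this (the player, item, and asset are assumed to already exist in your code; this is a sketch of the hack, not a general fix):

    import AVFoundation

    func exportDetachingPlayer(asset: AVAsset, outputURL: URL,
                               player: AVPlayer, item: AVPlayerItem) {
        player.replaceCurrentItem(with: nil)  // pop the asset out of the player

        guard let export = AVAssetExportSession(asset: asset,
                                                presetName: AVAssetExportPresetPassthrough) else { return }
        export.outputURL = outputURL
        export.outputFileType = .mov
        export.exportAsynchronously {
            DispatchQueue.main.async {
                player.replaceCurrentItem(with: item)  // restore playback afterwards
            }
        }
    }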
AVFoundation is a very powerful framework, but God I wish it wasn’t so finicky or at least logged more errors instead of silently producing garbage.

Sound on the iPhone: Finch: Ensuring that sound actually plays

I am using Finch to play sound. Works great. One exception: I get an incoming call, answer the call, hang up. Go back to the app. Now sounds don't seem to play correctly anymore. What is the most resource-friendly way of ensuring they will? I guess the audio session is somehow closed...
Consider just using the CocosDenshion sound library. We have found it solves all of these problems. Not perfect, but very reliable. Hope it helps!
Note there is also the ObjectAL library, which is possibly simply better than CocosDenshion.
You have to set up your own OpenAL audio interrupter.
An example of how to do this is found in Apple's SDK example called oalTouch.
See:
https://developer.apple.com/library/ios/#samplecode/oalTouch/Introduction/Intro.html
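oalTouch does this at the OpenAL level; the same idea expressed with audio-session interruption notifications looks roughly like this in Swift (a sketch, not Finch-specific):

    import AVFoundation

    final class InterruptionHandler {
        init() {
            NotificationCenter.default.addObserver(
                self,
                selector: #selector(handleInterruption(_:)),
                name: AVAudioSession.interruptionNotification,
                object: AVAudioSession.sharedInstance())
        }

        @objc private func handleInterruption(_ note: Notification) {
            guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
            switch type {
            case .began:
                // Incoming call: pause players / suspend your OpenAL context here.
                break
            case .ended:
                // Call ended: reactivate the session, then resume playback.
                try? AVAudioSession.sharedInstance().setActive(true)
            @unknown default:
                break
            }
        }
    }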

iPhone SDK: Is it possible to process an audio file from the local library

Well, I will try my best not to make this an 'I just want the code' question...
I'm recently working on a project which requires some audio signal processing from local music files (e.g. iTunes Library). The whole work includes:
Get the PCM data of an audio file (normally from iTunes library); <--AudioQueue (?)
Write the PCM data to a new file (it seems that Apple does not allow direct modification of music tracks); <--CoreAudio(?)
Do some processing and modification, like filters, manipulators, etc. <-- Will be developed in C++
Play the processed track. <--RemoteIO
The problem is, after going through some blogs and discussions:
http://lists.apple.com/archives/coreaudio-api/2009/Aug/msg00100.html, http://atastypixel.com/blog/using-remoteio-audio-unit/
http://osdir.com/ml/coreaudio-api/2009-08/msg00093.html
as well as the official sample code, I got the feeling that the CoreAudio SDK allows us to apply audio processing only to voice demos recorded from the mic.
My question is that:
Can I get raw data from iTunes library tracks instead of the mic input?
If the answer to the first question is 'No', is there a way to 'fool' the SDK into thinking it is getting data from the mic input rather than from iTunes? (I have done some similar 'hacking' stuff in C# before XD)
If the whole processing just doesn't work, can anyone provide some alternative ideas?
Any help will be appreciated. Thank you very much :-)
Just found something really cool yesterday.
From iPhone Media Library to PCM Samples in Dozens of Confounding, Potentially Lossy Steps
(http://www.subfurther.com/blog/?p=1103)
And also a class library by MIT:
TSLibraryImport: Objective-C class + sample code for importing files from user's iPod Library in iOS4.
(http://bitbucket.org/artgillespie/tslibraryimport/changeset/a81838f8c78a)
Hope they help!
Cheers,
Manca
1) No. Apple does not allow direct access to PCM data of songs. Otherwise you could create music-sharing apps, which is not in Apple's interests.
2) No. Hacking and getting approved is impossible due to Apple's code approval mechanism.
3) The only alternative I can think of is to do the processing part on a PC/Mac and then transfer the result to the iPhone. Or you would have to store the files in your own application's folder – you should be able to load and process these via CoreAudio.
I know this thread is old but... did this work for you, Manca? And did this app get approved?
EDIT: just discovered the AVAssetReader class, introduced in iOS 4.1, which should help
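For anyone landing here later, a sketch of that AVAssetReader route in Swift, assuming you already obtained the asset URL (e.g. from MPMediaItem's assetURL; the function and variable names are mine):

    import AVFoundation

    func readPCM(from assetURL: URL) throws {
        let asset = AVAsset(url: assetURL)
        let reader = try AVAssetReader(asset: asset)

        // Ask the reader to decode to interleaved 16-bit linear PCM.
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 2,
            AVLinearPCMBitDepthKey: 16,
            AVLinearPCMIsFloatKey: false,
            AVLinearPCMIsNonInterleaved: false,
        ]
        guard let track = asset.tracks(withMediaType: .audio).first else { return }
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
        reader.add(output)
        reader.startReading()

        while let buffer = output.copyNextSampleBuffer() {
            // Each CMSampleBuffer holds raw PCM; hand it to your DSP code here.
            CMSampleBufferInvalidate(buffer)
        }
    }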