AudioStreamer and AVAudioRecorder don't work together - iPhone

I am currently using Matt Gallagher's AudioStreamer (which works great!), but when I try to stop playback and completely remove the streamer, my recording fails. I am unable to record in any way after using the streamer for the first time.
With the streamer no longer in existence, I have no idea what could be completely ruining the recording functionality. Is there any way I can get this working? Any input at all would be extremely valuable.
Thanks in advance!
Matthew

You may have to initialize and configure an Audio Session, or reconfigure the Audio Session type when changing modes (ending the playback streamer, etc.)
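A minimal sketch of what that reconfiguration might look like, assuming AVFoundation is linked (the method name `prepareSessionForRecording` is illustrative, not from the question):

```objc
#import <AVFoundation/AVFoundation.h>

- (void)prepareSessionForRecording {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error = nil;

    // The streamer may have left the session in a playback-only category;
    // switch to one that permits input before creating the AVAudioRecorder.
    [session setCategory:AVAudioSessionCategoryRecord error:&error];
    if (error) {
        NSLog(@"Failed to set category: %@", error);
        return;
    }
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"Failed to activate session: %@", error);
    }
}
```

Calling this after tearing down the streamer (and before allocating the AVAudioRecorder) is the kind of reconfiguration the answer describes.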

Related

Streaming Audio Using the AVPlayer

I want to use AVPlayer to stream audio. I only want to stream single tracks, so I'm not bothered about queuing. Does anyone know of any good examples online of how to do this? I'm also not too bothered about track time, but I will need to know when the track is ready to play, whether it gets interrupted, etc.
Cheers
The Apple demo is good for what you are asking:
http://developer.apple.com/library/ios/#samplecode/AVPlayerDemo/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010101

iPhone SDK - HTML Live Streaming for audio

I've been developing an iPhone application that streams audio using Matt Gallagher's audio streamer found here: GitHub: AudioStreamer
However, I'm having some problems when the iPhone loses its internet connection, because the stream cuts out and doesn't reconnect until the user actually presses the play button again. I've even tried using Apple's Reachability classes to automatically stop and reconnect the stream, but this isn't working 100%.
I've been reading around on the internet and found something called HTTP Live Streaming that can supposedly be used to stream audio on the iPhone. However, I can't seem to find any examples of how to use it, so could anyone help me by giving a brief description and any sources that might help me get this working, please?
Thanks in advance,
Luke
Not enough detail for me to answer this entirely, but I use a set of calls to be notified of reachability changes. When I get a failure, I change the play image to stop. I then wait for a notification that the network is back, and then programmatically press play for the user.
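The flow described above could be sketched roughly as follows, assuming Apple's Reachability sample class is in the project; the `stopStreamAndShowStoppedState` and `startStream` helpers are hypothetical stand-ins for the app's own streamer control code:

```objc
#import "Reachability.h"

- (void)startWatchingNetwork {
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(reachabilityChanged:)
                                                 name:kReachabilityChangedNotification
                                               object:nil];
    self.reachability = [Reachability reachabilityForInternetConnection];
    [self.reachability startNotifier];
}

- (void)reachabilityChanged:(NSNotification *)note {
    Reachability *reach = [note object];
    if ([reach currentReachabilityStatus] == NotReachable) {
        // Connection lost: stop the streamer and flip the button to "stop".
        [self stopStreamAndShowStoppedState];
    } else {
        // Network is back: "press play" for the user programmatically.
        [self startStream];
    }
}
```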

Sound on the iPhone: Finch: Ensuring that sound actually plays

I am using Finch to play sound. It works great, with one exception: I get an incoming call, answer it, and hang up. When I go back to the app, sounds no longer seem to play correctly. What is the most resource-friendly way of ensuring they will? I guess the audio session is somehow closed...
Consider just using the CocosDenshion sound library. We have found that it solves all of these problems. It's not perfect, but it's very reliable. Hope it helps!
Note that there is also the ObjectAL library, which is arguably an improvement on CocosDenshion.
You have to setup your own OpenAL audio interrupter.
An example of how to do this is found in Apple's SDK example called oalTouch.
See:
https://developer.apple.com/library/ios/#samplecode/oalTouch/Introduction/Intro.html
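A rough sketch of the interruption handling that oalTouch demonstrates, using the Audio Session Services C API (since deprecated): suspend the OpenAL context when the interruption begins, then reactivate the session and restore the context when it ends.

```objc
#include <AudioToolbox/AudioToolbox.h>
#include <OpenAL/alc.h>

static void interruptionListener(void *inClientData, UInt32 inInterruptionState) {
    ALCcontext *context = (ALCcontext *)inClientData;
    if (inInterruptionState == kAudioSessionBeginInterruption) {
        // The call has taken over the hardware: detach OpenAL.
        alcSuspendContext(context);
        alcMakeContextCurrent(NULL);
    } else if (inInterruptionState == kAudioSessionEndInterruption) {
        AudioSessionSetActive(true);      // reclaim the audio session
        alcMakeContextCurrent(context);   // then restore the OpenAL context
        alcProcessContext(context);
    }
}

// During setup, register the listener (context is your ALCcontext*):
// AudioSessionInitialize(NULL, NULL, interruptionListener, context);
```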

Simultaneous record and play from same file

I'm attempting to use Audio Queue Services for the first time. After reading all the documentation and playing with some sample code, I think I understand the classes pretty well and have implemented my own playback and recording application without any problems.
I need to simultaneously record and play from the same buffer, but I'm having significant difficulty writing to a file and reading from it at the same time. I can get the file to play back with no problem, but only up to the last buffer written before playback started. I'm hoping it's possible to keep playing the file back for as long as it is being written to. Is this possible?
Thanks in advance!
It might be worthwhile to save the captured data in your own buffers instead of writing to a file. You can then supply these buffers to the playback.
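The buffer approach this answer suggests can be sketched as a simple ring buffer in plain C (names are illustrative): the record callback writes captured samples in, and the playback callback reads them back out.

```c
#include <stdlib.h>

/* Minimal single-producer/single-consumer ring buffer. A real audio
 * callback also needs thread-safety (e.g. atomic indices), which this
 * sketch omits for clarity. */
typedef struct {
    float  *data;
    size_t  capacity;   /* in samples */
    size_t  read_pos;
    size_t  write_pos;
    size_t  count;      /* samples currently stored */
} RingBuffer;

RingBuffer *rb_create(size_t capacity) {
    RingBuffer *rb = calloc(1, sizeof(RingBuffer));
    rb->data = calloc(capacity, sizeof(float));
    rb->capacity = capacity;
    return rb;
}

/* Returns the number of samples actually written (drops the rest if full). */
size_t rb_write(RingBuffer *rb, const float *src, size_t n) {
    size_t written = 0;
    while (written < n && rb->count < rb->capacity) {
        rb->data[rb->write_pos] = src[written++];
        rb->write_pos = (rb->write_pos + 1) % rb->capacity;
        rb->count++;
    }
    return written;
}

/* Returns the number of samples actually read. */
size_t rb_read(RingBuffer *rb, float *dst, size_t n) {
    size_t read = 0;
    while (read < n && rb->count > 0) {
        dst[read++] = rb->data[rb->read_pos];
        rb->read_pos = (rb->read_pos + 1) % rb->capacity;
        rb->count--;
    }
    return read;
}
```

The recording queue's input callback would call rb_write, and the playback queue's output callback would call rb_read, so the file never sits between the two.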
Please make sure that you set the following audio session category (from the AVFoundation framework) for simultaneous capture and playback:
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayAndRecord error:nil];
Hope this helps.

Is it possible to both play a sound and record from the microphone at the same time on the iPhone?

I want to both play sound and record sound at the same time on the iPhone.
All the apps that record sound first disable any music that is playing. However, since music would disrupt their purpose, I don't know whether they do this deliberately or whether the iPhone does it automatically when you record from the microphone.
Does anyone have experience with this?
Refer to this page. Good luck!
I've heard something about this on one episode of the mobile orchard podcast - the one with Michael Tyson, the creator of Loopy. During the discussion, Michael explained that he had to include some code to filter out the sounds that were coming in through the microphone in realtime as other sounds were being played on the speaker.
Based on that discussion, it seems entirely possible to both play and record at the same time, but I'm sure you'll have to do your own filtering to avoid recording the sounds you're playing.
You can use either a Core Audio audio unit or an audio queue, and you need to make sure that your audio session category is set to kAudioSessionCategory_PlayAndRecord. Beware that sound output under this session category is much quieter than under the normal solo session (observed on OS 3.0).
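A sketch using the Audio Session Services C API this answer refers to (AVAudioSession's PlayAndRecord category is the modern equivalent). The speaker-route override is an assumption worth noting: PlayAndRecord routes output to the quiet receiver by default, which is one likely cause of the low volume mentioned above.

```objc
#include <AudioToolbox/AudioToolbox.h>

static void configurePlayAndRecord(void) {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);

    // Allow simultaneous input and output.
    UInt32 category = kAudioSessionCategory_PlayAndRecord;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    // Optional: route output to the main speaker instead of the
    // receiver, which is much quieter.
    UInt32 route = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                            sizeof(route), &route);

    AudioSessionSetActive(true);
}
```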