I'm having some trouble with audio streaming using MPMoviePlayerController.
I want to know if it's possible to save the streamed data to a file while MPMoviePlayerController is playing it.
Is there a simple way to do this?
Does anyone have an idea?
According to Apple (http://developer.apple.com/library/ios/#codinghowtos/AudioAndVideo/_index.html), for streaming audio you connect to a network stream using the CFNetwork interfaces from Core Foundation, parse the network packets into audio packets using Audio File Stream Services (AudioToolbox/AudioFileStream.h), and then play the audio packets using Audio Queue Services (AudioToolbox/AudioQueue.h)…
Now my idea is that if we can find some way to write the raw packets to a file in between parsing them and sending them to the audio queue, then we can save the audio stream while playing it…
It's just an idea that needs implementing, and I don't know whether it will work for a video stream or not.
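For the audio case, the extra step really can be that small. Here is a minimal sketch of the idea, assuming the raw MP3 bytes arrive as Data chunks from your networking layer; CachingStreamParser is a made-up name, and the parser callbacks (which would normally create and feed the AudioQueue) are stubbed out:

```swift
import AudioToolbox
import Foundation

// Stubbed parser callbacks. In a real player the property listener reads the
// stream's AudioStreamBasicDescription and creates the AudioQueue, and the
// packets proc enqueues the parsed packets on that queue for playback.
private let propertyListener: AudioFileStream_PropertyListenerProc = { _, _, _, _ in }
private let packetsListener: AudioFileStream_PacketsProc = { _, _, _, _, _ in }

// Hypothetical class: persists the raw stream to disk and parses it for
// playback in a single pass over the incoming bytes.
final class CachingStreamParser {
    private var fileStream: AudioFileStreamID?
    private let cacheHandle: FileHandle

    init(cacheURL: URL) throws {
        FileManager.default.createFile(atPath: cacheURL.path, contents: nil)
        cacheHandle = try FileHandle(forWritingTo: cacheURL)
        AudioFileStreamOpen(nil, propertyListener, packetsListener,
                            kAudioFileMP3Type, &fileStream)
    }

    // Call this for every chunk of bytes that arrives from the network.
    func didReceive(_ data: Data) {
        cacheHandle.write(data)                      // 1. save the raw stream to disk
        data.withUnsafeBytes { buffer in             // 2. parse for playback as usual
            _ = AudioFileStreamParseBytes(fileStream!, UInt32(buffer.count),
                                          buffer.baseAddress, [])
        }
    }
}
```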
Related
I am working on an iPhone app that needs to stream an audio file to multiple devices (iPhones) connected via Bluetooth and play it in a synchronized way, that is, all the devices start playing the audio file simultaneously.
I am able to make a connection, stream the audio packets, and even play the audio file, but the one issue I am facing is latency. The devices all start playing the audio file at different times; I want all of them to play the same audio packet together. Is there any way I can make them play the audio file synchronously?
I remember glancing over an audio broadcast option in the Bluetooth spec. Not sure whether the iPhone supports it.
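One technique that may help (my suggestion, not something from the Bluetooth spec): have one device broadcast a "start" message, and have every device, on receipt, schedule playback a fixed delay ahead on its own audio clock using AVAudioPlayer's play(atTime:). The residual skew is then the spread in Bluetooth delivery times rather than run-loop jitter:

```swift
import AVFoundation

// A hedged sketch: each device calls this when the "start" message arrives.
// The 0.5 s delay is a placeholder chosen to exceed the worst delivery lag.
func scheduleSynchronizedStart(player: AVAudioPlayer, delay: TimeInterval = 0.5) {
    player.prepareToPlay()                          // pre-buffer so the start is instant
    let startTime = player.deviceCurrentTime + delay
    player.play(atTime: startTime)                  // fires on the audio clock, not the run loop
}
```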
I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
sample code for AVFoundation capturing to file chunks (~10 seconds) that are ready for compression?
sample code for compressing the video and audio for transmission across the Internet?
ffmpeg?
sample code for HTTP Live Streaming that sends files from the iPhone to an Internet server?
My goal is to use the iPhone as a high quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer at the iPhone.
thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length. The AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server. If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server-to-client streaming protocol; there is no notion of push as far as I know. You would have to create a custom server to accept pushing of segmented video streams (which does not really make a lot of sense given the way HLS is designed; see the RFC draft). A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could easily be done with FFmpeg, either on the command line or via a custom program.
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
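For reference, a minimal sketch of the AVAssetWriter setup described above, assuming you append CMSampleBuffers to the two inputs from a capture session; the dimensions and bitrates are placeholders, not recommendations:

```swift
import AVFoundation

// Configure an AVAssetWriter that encodes incoming sample buffers to an
// H.264/AAC MP4 at the given URL.
func makeWriter(outputURL: URL) throws -> AVAssetWriter {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1280,
        AVVideoHeightKey: 720,
    ])
    videoInput.expectsMediaDataInRealTime = true    // required for live capture

    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 1,
        AVSampleRateKey: 44100,
        AVEncoderBitRateKey: 64_000,
    ])
    audioInput.expectsMediaDataInRealTime = true

    writer.add(videoInput)
    writer.add(audioInput)
    return writer
}
```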
Capture video and audio using AVFoundation. You can set the video and audio codecs to kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC, and specify frame sizes and frame rates, in the capture format description. That will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP stack such as Live555 Media Server.
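A hedged sketch of the capture side that could feed either approach; it assumes the delegate's sample-buffer callbacks forward the buffers to your encoder or AVAssetWriter, and production code should also check canAddInput/canAddOutput:

```swift
import AVFoundation

// Build a capture session with video and audio data outputs delivering
// CMSampleBuffers to the supplied delegate on the given queue.
func makeCaptureSession(delegate: AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate,
                        queue: DispatchQueue) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.sessionPreset = .hd1280x720

    guard let camera = AVCaptureDevice.default(for: .video),
          let mic = AVCaptureDevice.default(for: .audio) else {
        throw NSError(domain: "capture", code: -1)  // no devices (e.g. simulator)
    }
    session.addInput(try AVCaptureDeviceInput(device: camera))
    session.addInput(try AVCaptureDeviceInput(device: mic))

    let videoOut = AVCaptureVideoDataOutput()
    videoOut.setSampleBufferDelegate(delegate, queue: queue)
    session.addOutput(videoOut)

    let audioOut = AVCaptureAudioDataOutput()
    audioOut.setSampleBufferDelegate(delegate, queue: queue)
    session.addOutput(audioOut)
    return session
}
```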
I am trying to write an iPhone app that plays back MP3 audio streamed by our audio server over an HTTP socket. I am just wondering if there is any easy solution that plays the MP3 directly from the socket without any local buffering or conversion?
I found some posts about streaming MP3 files over an HTTP connection, but had no luck finding anything useful about socket streaming.
Thank you in advance.
The most trivial solution would be to use MPMoviePlayerController for the playback of streaming audio via HTTP.
Maybe what you could do is implement a mini HTTP server on the iPhone and then simply stream the MP3 from there.
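For what it's worth, a minimal sketch of the trivial solution above, with a placeholder URL (note that MPMoviePlayerController has since been deprecated in favor of AVPlayer):

```swift
import MediaPlayer

// Point MPMoviePlayerController at the HTTP URL your server exposes.
// Keep a strong reference to the player for as long as playback runs.
let url = URL(string: "http://your.com/stream.mp3")!
if let player = MPMoviePlayerController(contentURL: url) {
    player.movieSourceType = .streaming   // tell it this is a network stream
    player.play()
}
```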
I have an app in the App Store for streaming compressed music files over the network. I'm using Audio Queue Services to handle the playback of the files.
I would like to do some processing on the files as they are streamed. I tested the data in the audio buffers, but it is not decompressed audio. If I'm streaming an mp3 file, the buffers are just little slices of mp3 data.
My question is: is it ever possible to access the uncompressed PCM data when using Audio Queue Services? Am I forced to switch to using Audio Units?
You could use Audio Converter Services to do your own MP3-to-PCM conversion, apply your effect, then put PCM into the queue instead of MP3. It's not going to be terribly easy, but you'd still be spared the threading challenges of doing this directly with audio units (which would require you to do your own conversion anyway, and then probably use a CARingBuffer to send samples between the download/convert thread and the realtime callback thread from the IO unit).
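The first step of that approach would look roughly like this; a sketch that only creates the converter, assuming the MP3 format was already read from the stream (driving AudioConverterFillComplexBuffer with an input callback that hands over the MP3 packets is left out):

```swift
import AudioToolbox

// Create an AudioConverter from the stream's MP3 format to 16-bit
// interleaved linear PCM at the same sample rate.
func makeMP3ToPCMConverter(mp3Format: AudioStreamBasicDescription) -> AudioConverterRef? {
    var inFormat = mp3Format            // as reported by the file stream parser
    var outFormat = AudioStreamBasicDescription(
        mSampleRate: inFormat.mSampleRate,
        mFormatID: kAudioFormatLinearPCM,
        mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
        mBytesPerPacket: 4,             // 2 channels * 2 bytes per sample
        mFramesPerPacket: 1,
        mBytesPerFrame: 4,
        mChannelsPerFrame: 2,
        mBitsPerChannel: 16,
        mReserved: 0)
    var converter: AudioConverterRef?
    let status = AudioConverterNew(&inFormat, &outFormat, &converter)
    return status == noErr ? converter : nil
}
```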
Does anybody know of a way to stream audio from a file being played back in an app over the internet, in a standardized streaming audio format? That is, serve up an audio stream from the iPhone.
You need an HTTP server on your device and the audio split into multiple short segment files, like:
http://your.com/audio1.mp3
http://your.com/audio2.mp3
http://your.com/audio3.mp3
And an .m3u playlist file that points to the current file in real time... inevitably with a small delay; try 0.5 sec. The client opens http://your.com/play.m3u and plays the stream with AVPlayer. Like that.
Since iOS 4.0 you can just open the .m3u URL with AVPlayer (AVAudioPlayer cannot play remote streams). It was never simpler to stream audio.
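On the client side that amounts to a couple of lines; the playlist shown in the comment is an HLS-style guess at what the server would serve, not something spelled out in the answer:

```swift
import AVFoundation

// The playlist the device's server might return for /play.m3u — a sliding
// window over the most recent segments (assumed HLS-style format):
//
//   #EXTM3U
//   #EXT-X-TARGETDURATION:10
//   #EXT-X-MEDIA-SEQUENCE:2
//   #EXTINF:10,
//   http://your.com/audio2.mp3
//   #EXTINF:10,
//   http://your.com/audio3.mp3

// AVPlayer fetches the playlist, polls it for new entries, and downloads
// the segments itself.
let playlistURL = URL(string: "http://your.com/play.m3u")!
let player = AVPlayer(url: playlistURL)
player.play()
```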