I'm trying to understand the meaning and status of RTMP streams. I see that some tracks have both MP3 and RTMP streams, but I don't understand why, or which stream I should use.
What are the reasons for offering both streams? Are there any sounds without an MP3 stream? Should I use the RTMP stream when it's available, or not?
Is it okay to play only the MP3 stream?
Not all clients support RTMP, which is why both streams are offered. You can use whichever you like, but there are indeed tracks that do not have an MP3 stream.
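In code, picking a stream is just a fallback check. A minimal sketch in C (the Track struct and its field names are hypothetical, standing in for whatever your API client gives you):

```c
#include <stddef.h>

/* Hypothetical track record: either URL may be absent (NULL). */
typedef struct {
    const char *mp3_url;   /* plain HTTP/MP3 stream, playable everywhere */
    const char *rtmp_url;  /* RTMP stream, needs an RTMP-capable client  */
} Track;

/* Prefer the MP3 stream when it exists; fall back to RTMP for the
   tracks that only offer that. */
const char *StreamURLForTrack(const Track *t) {
    return (t->mp3_url != NULL) ? t->mp3_url : t->rtmp_url;
}
```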
I find that most streaming-audio discussions are about streaming media from an HTTP server, e.g. AudioStreamer from Cocoa with Love, or MPMoviePlayerController. Both are initialized with an NSURL. But my case is different: I use SMB to access the media files on a Windows shared server. The media content is received via SMB messages (through a socket) and accumulated in memory (an NSMutableData).
So is there a way to play them (those NSMutableData) before the download has finished?
Update: so for streaming audio, I understand I need Audio Queue Services.
What about streaming video over something other than HTTP? I think it is doable, because there is a free app called TIOD which streams not only audio but also video from an SMB server.
BTW, I never expect others to do the work for me. I checked all the documentation I could find and couldn't find a way to do it (for video). I had thought that might mean it can't be done, but then I found that TIOD can do it. That's why I raised the question in the first place, to see if others have experience with it.
Yeah, you can stream that as well; it's the same thing as getting the data from an NSURL. If you look at the audio streaming example by Matt Gallagher here, you'll see that he is getting data from some URL, but ultimately, when he calls the parse function, he is giving it bytes of data. The same thing applies in your situation: with the data you get, you should be able to call the parse function and have the audio player stream your audio file.
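A minimal sketch of that in C, using Audio File Stream Services from AudioToolbox (the same API Matt Gallagher's example is built on). The callback bodies are stubs, and the MP3 type hint is an assumption; pass 0 if you don't know the container type:

```c
#include <AudioToolbox/AudioToolbox.h>

/* Fired when the parser discovers stream properties, e.g. the data
   format you need in order to create the Audio Queue. */
static void PropertyListener(void *clientData, AudioFileStreamID stream,
                             AudioFileStreamPropertyID propertyID,
                             AudioFileStreamPropertyFlags *ioFlags) {
    /* Read kAudioFileStreamProperty_DataFormat here. */
}

/* Fired whenever complete audio packets have been parsed out. */
static void PacketsReceived(void *clientData, UInt32 numBytes, UInt32 numPackets,
                            const void *inputData,
                            AudioStreamPacketDescription *packetDescriptions) {
    /* Copy the packets into Audio Queue buffers and enqueue them. */
}

/* Open the parser once, up front. */
AudioFileStreamID OpenParser(void) {
    AudioFileStreamID stream = NULL;
    AudioFileStreamOpen(NULL, PropertyListener, PacketsReceived,
                        kAudioFileMP3Type, &stream);
    return stream;
}

/* Call this each time another chunk of SMB data lands in your
   NSMutableData; only the newly received bytes need to be passed,
   since the parser keeps its own internal state. */
void FeedChunk(AudioFileStreamID stream, const void *bytes, UInt32 length) {
    AudioFileStreamParseBytes(stream, length, bytes, 0);
}
```

The point is that the parser doesn't care where the bytes came from; an SMB socket is as good a source as an HTTP connection.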
I'm having some trouble with audio streaming using MPMoviePlayerController.
I want to know if it's possible to save the streamed data to a file while MPMoviePlayer is playing that file.
Is there a simple way to do this?
Does anyone have an idea?
According to Apple (http://developer.apple.com/library/ios/#codinghowtos/AudioAndVideo/_index.html), for streaming audio you connect to a network stream using the CFNetwork interfaces from Core Foundation, then parse the network packets into audio packets using Audio File Stream Services (AudioToolbox/AudioFileStream.h), and then play the audio packets using Audio Queue Services (AudioToolbox/AudioQueue.h).
Now my idea is that if we can find some way to write the audio packets to a file in between parsing them and sending them to the audio queue, then we can save the audio stream while playing it.
It's just an idea that needs implementing, and I don't know whether it will work for video streams or not.
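A minimal sketch of that tee point in C, assuming a self-framing stream such as MP3 or ADTS AAC, where the raw packet bytes written to disk verbatim produce a playable file. The StreamerState struct and its field names are made up for illustration:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

/* Hypothetical state shared with the callbacks. */
typedef struct {
    FILE         *dump;   /* file the stream is being saved into     */
    AudioQueueRef queue;  /* playback queue, created from the format */
} StreamerState;

/* Audio File Stream Services packets callback: every parsed packet
   passes through here, so it is the natural place to tee the data. */
static void PacketsReceived(void *clientData, UInt32 numBytes, UInt32 numPackets,
                            const void *inputData,
                            AudioStreamPacketDescription *packetDescriptions) {
    StreamerState *state = (StreamerState *)clientData;

    /* 1. Save the raw packet bytes to the file... */
    fwrite(inputData, 1, numBytes, state->dump);

    /* 2. ...then hand the same packets to the Audio Queue as usual,
       i.e. copy them into an AudioQueueBufferRef and call
       AudioQueueEnqueueBuffer(state->queue, buffer, numPackets,
                               packetDescriptions). */
}
```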
Can anyone suggest an open-source Linux program for converting .wav files to a Flash-compatible format for RTMP streaming? Does RTMP support any format other than Flash?
Flash Media Server supports three audio formats for streaming: Nellymoser, MP3, and AAC. You can also play MP3 files directly in the Flash Player via HTTP download; you don't really need to use RTMP (which is more advantageous for video, due to higher bitrates).
Here's a good article on streaming audio with FMS:
http://www.adobe.com/devnet/flashmediaserver/articles/beginner_audio_fms3.html
For conversion, you can use ffmpeg.
http://fosswire.com/post/2007/11/using-ffmpeg-to-convert-to-mp3/
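For example, a WAV-to-MP3 conversion with a reasonably recent ffmpeg build looks like this (the file names are placeholders):

```
ffmpeg -i input.wav -codec:a libmp3lame -b:a 128k output.mp3
```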
You can stream via the crtmpserver to Flash clients. VLC can play RTMP streams.
Does the ALAC format support live streaming on the iPhone? If ALAC-encoded audio recordings are streamed to a server machine, will I be able to play the audio chunk data? Does the ALAC format support this?
Thank you.
Assuming you mean "Apple Lossless" audio...
I don't see why it wouldn't, but I don't know the details. You'll probably need to embed it in a transport stream instead of an MPEG-4 container (but then, I don't know how HTTP Live Streaming works either).
I don't think streaming lossless audio is sensible, though.
Streaming lossless audio is possible: we stream FLAC using Icecast and it works beautifully. However, we are not using HTTP Live Streaming (HLS) to do it. We stream FLAC from the source generator to a number of servers, and they create HLS streams from there.
It is technically possible to mux ALAC into MPEG-TS (ffmpeg can do this; see the example command below) as well as play it back (using ffmpeg), but there isn't a format identifier for other clients. Adding this feature to HLS would be as easy as calling or writing Apple and asking them to add ALAC to this list:
http://www.smpte-ra.org/mpegreg/mpegreg.html
and updating their products accordingly. If you've purchased an Apple product within the last 90 days, or you have AppleCare, give them a call: they have to work on the issue for you if you are covered. The more requests that get escalated to their engineers, the more likely they are to add support for ALAC in HLS.
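If you want to try the ffmpeg muxing mentioned above, something along these lines should work, assuming the input is already ALAC so the codec can simply be copied (the file names are placeholders):

```
ffmpeg -i input.m4a -c:a copy -f mpegts output.ts
```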
Are there any tutorials that discuss how to stream audio from the Internet to the iPhone for OS 3.x? The only one I've seen is very outdated and doesn't compile: http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html.
Are you in control of the content or not? If you are, you could use HTTP Live Streaming, which is trivial to play with MPMoviePlayerController, though you'll lose your GUI. The alternative, if you're playing something like a Shoutcast stream, is to use the gamut of Core Audio services: primarily Audio File Stream Services to parse the stream into packets, and Audio Queue Services to play them back. And yes, that will be hard.
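To give a sense of the moving parts on the Core Audio route, here is a rough C sketch of the Audio Queue side. The function names are illustrative; the stream format comes out of the Audio File Stream Services parser once it has seen enough data:

```c
#include <AudioToolbox/AudioToolbox.h>

/* Fired when a buffer has finished playing and can be refilled. */
static void BufferCompleted(void *userData, AudioQueueRef queue,
                            AudioQueueBufferRef buffer) {
    /* Refill the buffer from your backlog of parsed packets and
       re-enqueue it, or stop the queue when the stream ends. */
}

/* Once the parser has reported the stream's data format, create and
   start the playback queue from it. */
OSStatus StartPlayback(AudioFileStreamID parser, AudioQueueRef *outQueue) {
    AudioStreamBasicDescription format;
    UInt32 size = sizeof(format);
    OSStatus err = AudioFileStreamGetProperty(
        parser, kAudioFileStreamProperty_DataFormat, &size, &format);
    if (err != noErr) return err;

    err = AudioQueueNewOutput(&format, BufferCompleted, NULL,
                              NULL, NULL, 0, outQueue);
    if (err != noErr) return err;

    /* Buffers are then allocated with AudioQueueAllocateBuffer,
       filled with parsed packets, enqueued, and finally: */
    return AudioQueueStart(*outQueue, NULL);
}
```

The hard part the answer alludes to is the glue: buffering packets between the parser callbacks and the queue callbacks, and coping with stalls and discontinuities.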