How to read an HTTP video stream with libavcodec (FFmpeg) - iPhone

I'm trying to read a real-time HTTP video stream (I've set one up using VLC and my webcam) using libavcodec from FFmpeg. So far I've been able to read an MPEG file fine, but when I pass a URL instead of the file name, it hangs (I assume it's waiting for the end of the file).
Does anyone have experience with or ideas on how to do this? I'm also open to alternatives if people have other suggestions.
The end goal is to do live video streaming on the iPhone. Apple's HTTP Live Streaming has too much lag for what I need, but that's for another post :)
Any help is greatly appreciated!
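For reference, a minimal sketch of the kind of open code involved (a sketch only: the function names assume a recent FFmpeg, and the interrupt callback shown is one way to keep a live source from blocking the open/probe step forever):

    /* Minimal sketch: open a network stream with libavformat and bound
     * blocking calls with an interrupt callback, so a live source can't
     * hang the open/probe step forever. Assumes a recent FFmpeg. */
    #include <time.h>
    #include <libavformat/avformat.h>

    static time_t deadline;

    /* Returning nonzero aborts whatever blocking operation is running. */
    static int interrupt_cb(void *opaque)
    {
        (void)opaque;
        return time(NULL) > deadline;
    }

    int open_stream(const char *url, AVFormatContext **out)
    {
        avformat_network_init();

        AVFormatContext *fmt = avformat_alloc_context();
        fmt->interrupt_callback.callback = interrupt_cb;

        deadline = time(NULL) + 10;          /* allow 10 s to open */
        if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
            return -1;                       /* fmt is freed on failure */

        deadline = time(NULL) + 10;          /* and 10 s to probe streams */
        if (avformat_find_stream_info(fmt, NULL) < 0) {
            avformat_close_input(&fmt);
            return -1;
        }
        *out = fmt;
        return 0;
    }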

If you aren't using Apple's way, you can't.

I have to admit I don't quite understand what you want to achieve; however, based on my interpretation of the question, it seems to be related to segmenting the video.
The segmenting can be achieved using ffmpeg and libavcodec (look for an example here; the key line is the one with packet.flags, illustrated in the sketch below). Just remember that the segment length (in time) depends on the keyframe interval (for H.264 at least). If you want an example of a full segmented streaming solution that works (most of the time), check here.
Otherwise, you have to dig into the codec and create the transport stream manually. MPEG-2 TS, which is what iOS supports, can be a little difficult at times, but it is not too bad. Good luck!
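As a rough illustration of that packet.flags line (a sketch only; open_next_segment() and write_packet() are hypothetical helpers standing in for the real muxer plumbing):

    /* Rough sketch of the keyframe-based cut: while remuxing, a new
     * segment may only start on a video keyframe, so every segment is
     * independently decodable. */
    #include <libavformat/avformat.h>

    /* Hypothetical helpers standing in for the real muxer plumbing. */
    extern void open_next_segment(void);
    extern void write_packet(AVPacket *pkt);

    void segment_loop(AVFormatContext *in, int video_stream,
                      int64_t seg_len /* in stream time_base units */)
    {
        AVPacket pkt;
        int64_t seg_start = AV_NOPTS_VALUE;

        while (av_read_frame(in, &pkt) >= 0) {
            if (pkt.stream_index == video_stream &&
                (pkt.flags & AV_PKT_FLAG_KEY)) {     /* the key line */
                if (seg_start == AV_NOPTS_VALUE)
                    seg_start = pkt.dts;
                /* Cuts can only land on keyframes, which is why the
                 * real segment length follows the keyframe interval. */
                if (pkt.dts - seg_start >= seg_len) {
                    open_next_segment();
                    seg_start = pkt.dts;
                }
            }
            write_packet(&pkt);
            av_packet_unref(&pkt);
        }
    }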

You can use libcurl to download the HTTP segment files.
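A minimal sketch of that (the URL and output file name are made-up examples):

    /* Minimal libcurl sketch: fetch one HTTP segment into a local file. */
    #include <stdio.h>
    #include <curl/curl.h>

    int download_segment(const char *url, const char *path)
    {
        FILE *out = fopen(path, "wb");
        if (!out)
            return -1;

        CURL *curl = curl_easy_init();
        curl_easy_setopt(curl, CURLOPT_URL, url);
        /* the default write callback writes straight into this FILE* */
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);

        CURLcode rc = curl_easy_perform(curl);

        curl_easy_cleanup(curl);
        fclose(out);
        return rc == CURLE_OK ? 0 : -1;
    }

    int main(void)
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        int rc = download_segment("http://example.com/stream/segment42.ts",
                                  "segment42.ts");
        curl_global_cleanup();
        return rc;
    }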

Related

Using libvlc to stream a video from memory?

I am generating a video stream in real time, and I've got it as a series of bitmaps in memory.
I'd like to stream these bitmaps over the network using libvlc, but I wasn't able to find the right functions in the API (all streaming functions expect a file or other source).
I even thought of emulating a capture device, but that seemed too convoluted to be the right approach, so I'd rather ask.
My question is: what do I have to do with these bitmaps now to be able to stream them with libvlc?
I found a question that appears to address the same issue.
Another suggestion, with significantly less overhead, is to "emulate" a file with named pipes, i.e. FIFOs.
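A rough sketch of the FIFO approach (a sketch only: the path and frame format are made up, short writes are ignored, and you still have to tell vlc what raw format arrives on the pipe):

    /* Sketch of the FIFO idea: create a named pipe, keep writing frames
     * into it, and point vlc at the pipe as if it were a file. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void)
    {
        const char *fifo = "/tmp/videofifo";

        if (mkfifo(fifo, 0666) < 0)
            perror("mkfifo");            /* may already exist; carry on */

        /* open() blocks until the reader (vlc) opens the other end */
        int fd = open(fifo, O_WRONLY);
        if (fd < 0)
            return 1;

        static unsigned char frame[640 * 480 * 3];  /* one RGB24 bitmap */
        for (;;) {
            /* ... fill `frame` from the real-time source here ... */
            if (write(fd, frame, sizeof frame) < 0)
                break;                   /* reader went away */
        }

        close(fd);
        return 0;
    }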

How can I monitor an MP3 live stream to detect corruption?

Once a month the MP3 stream messes up, and the only way to tell it has messed up is by listening to it as it streams. Is there a script, program, or tool I can use to monitor the live stream at a given URL and raise some kind of flag when it becomes corrupted?
Normally it plays a song or some music, but once a month, every month, at random, the stream corrupts and starts playing random chipmunk-like garbage audio. Any ideas on this? I am just getting started at this with no idea at all.
Typically, this will happen when you play a track of the wrong sample rate.
Most (all that I've seen) SHOUTcast/Icecast encoders (going straight from files) will compress to MP3 just fine, but they assume a fixed sample rate, whatever they are configured for, typically 44.1 kHz. If you drop in a 48 kHz track, or a 22.05 kHz track, it will play at the wrong speed and cause all sorts of random issues with the stream.
The problem is easy enough to verify: simply create a file with a different sample rate and test it. I suspect you will reproduce the problem. If that is the case, to my knowledge there is no way to detect it on the stream, since your stream isn't actually corrupt; it just sounds wrong. You will have to scan all of your files for sample rate. FFmpeg in a script should be able to help you with that, along the lines of the sketch below.
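As a sketch of such a check, here is libavformat being used to report a file's sample rate (assuming a recent FFmpeg with codecpar; the 44100 target is just an example):

    /* Sketch: report an audio file's sample rate so a script can flag
     * anything that doesn't match the encoder's configured rate. */
    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s file.mp3\n", argv[0]);
            return 2;
        }

        AVFormatContext *fmt = NULL;
        if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0 ||
            avformat_find_stream_info(fmt, NULL) < 0)
            return 2;

        int idx = av_find_best_stream(fmt, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);
        if (idx < 0)
            return 2;

        int rate = fmt->streams[idx]->codecpar->sample_rate;
        printf("%s: %d Hz\n", argv[1], rate);

        avformat_close_input(&fmt);
        return rate == 44100 ? 0 : 1;    /* nonzero exit = flag this file */
    }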
Now, if the problem actually is a corrupt MP3 stream, then you have problems on your encoding side. I suspect simply swapping out whatever DLL or module you're using with a recent stable version of LAME will help.
To detect a genuinely corrupt MP3 stream, your encoder must be using CRC protection. If it is enabled, you should be able to read through the header of each frame to find the CRC and then run the check on the audio data. If you get an error (or several frames with errors), you can trigger a warning.
You can find information on the MP3 stream header here:
http://www.mp3-tech.org/programmer/frame_header.html
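To illustrate where that CRC sits, a small sketch that recognizes a CRC-protected frame header (validating the CRC itself, a CRC-16 with polynomial 0x8005 computed over parts of the header and the side information, is left out):

    /* Sketch: recognize an MP3 frame header with CRC protection and
     * pull out the stored CRC-16 that follows it. */
    #include <stdint.h>

    /* `p` must point at least 6 readable bytes into the stream. Returns
     * 1 if it looks like a CRC-protected frame header. */
    int frame_has_crc(const uint8_t *p, uint16_t *crc_out)
    {
        /* all 11 sync bits must be set */
        if (p[0] != 0xFF || (p[1] & 0xE0) != 0xE0)
            return 0;
        /* protection bit is bit 0 of byte 1: 0 means a CRC-16 follows
         * the 4-byte header */
        if (p[1] & 0x01)
            return 0;
        *crc_out = (uint16_t)((p[4] << 8) | p[5]);
        return 1;
    }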

Video recording and saving the video to a server

I've just been handed a task: how should I go about capturing video in my app and saving it directly to a server? I have the sample code discussed at WWDC 2010, but I need some more helpful links or tutorials to complete this task.
Please give me your opinion or share any links you have.
Thanks,
There is nothing in the APIs that will let you do this directly. The only way is to use AVAssetWriter to segment the video; you would then stream the completed segments. These would need to be reassembled on the server side if you require a single file.

MPMoviePlayerController: a way to measure the bandwidth used?

I would like to check the bandwidth used when playing a video with MPMoviePlayerController, to be able to play a video that matches the client's bandwidth.
For now, I download part of my file using NSURLConnection and estimate the bandwidth from that. But I don't think it's a good idea to download more data than necessary (the goal is to use as little bandwidth as possible).
Does a 'current downloaded bytes' property, or something like that, exist? I hope you can help me.
Thanks a lot!
Take a look at the Reachability sample code; it will help you determine whether the client is on WiFi, WWAN (3G/EDGE), etc. You can make certain assumptions based on these findings. If you want the exact speed, you'll have to download a file and check the speed, along the lines of the sketch below.
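A minimal sketch of that with libcurl (the test URL is a made-up example; CURLINFO_SPEED_DOWNLOAD reports the transfer's average download speed):

    /* Sketch of "download a file and check the speed": fetch a small
     * test file, discard the bytes, and read the average download speed
     * from the transfer info. */
    #include <stdio.h>
    #include <curl/curl.h>

    static size_t discard(void *buf, size_t sz, size_t n, void *ud)
    {
        (void)buf; (void)ud;
        return sz * n;                   /* throw the bytes away */
    }

    int main(void)
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL *curl = curl_easy_init();
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/speedtest.bin");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, discard);

        double bytes_per_sec = 0;
        if (curl_easy_perform(curl) == CURLE_OK)
            curl_easy_getinfo(curl, CURLINFO_SPEED_DOWNLOAD, &bytes_per_sec);

        printf("~%.0f bytes/sec\n", bytes_per_sec);
        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return 0;
    }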
You may also want to look into HTTP Live Streaming: you provide different versions of the video (at varying levels of quality), and the player switches to the version appropriate for the measured connection speed.
Some docs on HTTP Streaming:
http://developer.apple.com/iphone/library/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html
http://www.scribd.com/doc/20173481/iPhone-Streaming

RTMP streaming server implementation: connect/createStream/play message sequence passes, but no video/audio in Flash Player

Hi!
I'm writing an RTMP streaming server for streaming AVC+AAC video. It works fine with rtmpdump, but I can't get it to work in Flowplayer and other Flash video players.
The message sequence after the handshake is similar to that of the FMS / Red5 / erlyvideo / haxevideo servers; I've tried a lot of variations.
From the Chrome debug console I can see that all the negotiation messages reach Flowplayer; the last one is onMetaData. After this, the working sample (rtmp://flash.tvwmedia.net/LiveVideo//Live300) gets NetStream.Buffer.Full, but the stream from my server does not.
I start with the AVC header message containing the SPS/PPS, then send the first AVC picture, then the AAC header and an AAC sample, and then the remaining AVC/AAC samples. This is dumped fine by rtmpdump; I get a working FLV at the end. But Flowplayer and the others do not play it.
What could the problem be?
Are there any additional requirements for the streams?
Is it possible that a broken H.264 stream causes Flash Player to stop playing? Is it possible to obtain system messages from Flash Player that would say so?
I hope you can help me :) I've been fighting this problem for over two weeks and have run out of variants to try.
Here is the debug log + FLV from rtmpdump. It contains the negotiation messages and the first few media samples.
Update:
I've fixed one bug: the wrong chunk stream ID was used for "system" messages (e.g. SetChunkSize). But it still doesn't play.
Here is another log, almost the same as what Wowza produces, plus Wowza/Red5 logs for comparison.
I've checked the following things, which differ between RTMP servers:
Different chunk stream IDs (for non-system streams)
Different stream IDs (on createStream)
128-byte and 4 KB chunk sizes
Unpacked/packed chunk headers (the previous log uses unpacked, the new one packed)
Different answers to the connect call (taken from several servers)
Using 57 00 / 57 01 video packets (video info/command frames)
Adding a 09 (access unit delimiter) NALU before each picture
Different orders of the audio/video DCRs/packets
Audio only / video only
But tuning all of that still didn't get my server working :)
Any ideas how to solve this?
Update:
I've captured a log through the Flazr proxy as Peter suggested. The results are the same, and I can't find the solution: both logs look good. Maybe I'm just not seeing something simple...
Thank you!
I am not sure what the issue is, but you want to make sure that you are doing the following:
1. Sending pings
2. Handling bytes read/written reports
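For what it's worth, here is a sketch of the bytes-read report, per the RTMP spec (send_raw() is a hypothetical stand-in for the socket write):

    /* Sketch of the bytes-read report: an RTMP Acknowledgement (message
     * type 3) sent on chunk stream 2 as a single type-0 chunk. */
    #include <stdint.h>
    #include <string.h>

    extern void send_raw(const uint8_t *buf, size_t len);  /* hypothetical */

    void send_acknowledgement(uint32_t bytes_received)
    {
        uint8_t msg[16];

        msg[0] = 0x02;                       /* fmt=0, chunk stream id 2 */
        memset(msg + 1, 0, 3);               /* timestamp = 0            */
        msg[4] = 0; msg[5] = 0; msg[6] = 4;  /* message length = 4       */
        msg[7] = 0x03;                       /* type 3 = Acknowledgement */
        memset(msg + 8, 0, 4);               /* message stream id = 0    */

        /* payload: total bytes received so far, big-endian */
        msg[12] = (uint8_t)(bytes_received >> 24);
        msg[13] = (uint8_t)(bytes_received >> 16);
        msg[14] = (uint8_t)(bytes_received >> 8);
        msg[15] = (uint8_t)(bytes_received);

        send_raw(msg, sizeof msg);
    }

One of these should go out each time you have received another window's worth of bytes, per the Window Acknowledgement Size announced by the peer.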
From the rtmpdump log your flow looks good; I didn't notice anything obvious. There are two more projects which may help you, depending on your experience with either C++ or Ruby. The Izumi server is fairly simple and may be easier to follow if you are a Ruby dev.
http://code.google.com/p/rubyizumi/
If you are a C++ guy, then look at RTMPd; Andrei knows more about the server-side Flash process than anyone else that I know of.
http://www.rtmpd.com/
Lastly, if you want another open-source Java player/server implementation for learning, you can look at Flazr:
http://flazr.com/
I'm the author of Flazr, which Mondain referred to (thanks, Mondain!).
I want to point you to the "proxy server" feature of Flazr. You can connect your Flash player (or rtmpdump) to the proxy server and point the proxy server at your server. If you set the log level to DEBUG, you will get a very detailed trace of all the RTMP messages in both directions. This has been helpful to me in the past for comparing Flazr with other implementations such as Red5. Hope this helps.