Video recording and saving the video on a server - iPhone

I've just been handed a task: how should I go about capturing video in my app and saving it directly to a server? I have the sample code discussed at WWDC 2010, but I need some more helpful links or tutorials to complete this task.
Please give me your opinion or share any links you have.
Thanks,

There is nothing in the APIs that will do this for you directly. The only way is to use AVAssetWriter to segment the video as it is recorded. You would then stream the completed segments; these would need to be reassembled on the server side if you require a single file.
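A rough sketch of that segmenting approach, assuming sample buffers arrive from an AVCaptureVideoDataOutput delegate; the 5-second segment length, the encoding settings, and -uploadSegmentAtURL: are placeholders for your own choices:

```objc
#import <AVFoundation/AVFoundation.h>

@interface SegmentingRecorder : NSObject
@property (nonatomic, strong) AVAssetWriter *writer;
@property (nonatomic, strong) AVAssetWriterInput *videoInput;
@property (nonatomic, assign) CMTime segmentStart;
- (void)appendSampleBuffer:(CMSampleBufferRef)buffer;
@end

@implementation SegmentingRecorder

- (void)startNewSegmentAtTime:(CMTime)time {
    NSString *path = [NSTemporaryDirectory() stringByAppendingFormat:
                      @"segment-%.0f.mp4", CMTimeGetSeconds(time)];
    self.writer = [AVAssetWriter assetWriterWithURL:[NSURL fileURLWithPath:path]
                                           fileType:AVFileTypeMPEG4
                                              error:NULL];
    // Illustrative encoding settings; pick what matches your app.
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @640,
                                AVVideoHeightKey : @480 };
    self.videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                         outputSettings:settings];
    self.videoInput.expectsMediaDataInRealTime = YES;
    [self.writer addInput:self.videoInput];
    [self.writer startWriting];
    [self.writer startSessionAtSourceTime:time];
    self.segmentStart = time;
}

// Call this from -captureOutput:didOutputSampleBuffer:fromConnection:
- (void)appendSampleBuffer:(CMSampleBufferRef)buffer {
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(buffer);
    if (!self.writer) [self startNewSegmentAtTime:pts];

    // Rotate to a fresh writer once the current segment is ~5 s long.
    if (CMTimeCompare(CMTimeSubtract(pts, self.segmentStart), CMTimeMake(5, 1)) >= 0) {
        AVAssetWriter *finished = self.writer;
        [self.videoInput markAsFinished];
        [finished finishWritingWithCompletionHandler:^{
            [self uploadSegmentAtURL:finished.outputURL];
        }];
        [self startNewSegmentAtTime:pts];
    }
    if (self.videoInput.isReadyForMoreMediaData)
        [self.videoInput appendSampleBuffer:buffer];
}

- (void)uploadSegmentAtURL:(NSURL *)url {
    // Placeholder: POST the finished segment to your server here.
}

@end
```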

Related

Extract cue-point or data from live video stream

Does anybody know of a way I can extract timing data from any kind of live video stream?
I have tried with JWPlayer, but it is not capable of doing that.
My encoder is Streamcoders Mediasuite, and I am happy to stream using whatever stream type is necessary in order for me to get cue-point info (or any kind of timing info) from the stream.
Caveat: Flash and Silverlight are not an option, as the viewer base is restricted by policy.
Thanks in advance.
Neil.

MPMoviePlayerController : a way to get used bandwidth?

I would like to measure the bandwidth in use when playing a video with MPMoviePlayerController, so that I can play a video that matches the client's bandwidth.
For now, I download a part of my file using NSURLConnection, which lets me estimate the bandwidth. But I don't think it's a good idea to download more data than necessary (the goal is to use as little bandwidth as possible).
Does a 'current downloaded bytes' property, or something like that, exist? I hope you can help me.
Thanks a lot !
Take a look at the Reachability sample code; it will help you determine whether the client is on WiFi, WWAN (3G/EDGE), etc. You can make certain assumptions based on these findings. If you want the exact speed, you'll have to download a file and measure how long it takes.
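A minimal sketch using the Reachability class from Apple's sample (assumes Reachability.h/.m from the sample code are added to your project):

```objc
#import "Reachability.h"

Reachability *reach = [Reachability reachabilityForInternetConnection];
switch ([reach currentReachabilityStatus]) {
    case ReachableViaWiFi:
        // Assume higher bandwidth: request the high-quality rendition.
        break;
    case ReachableViaWWAN:
        // 3G/EDGE: request a lower-bitrate rendition.
        break;
    case NotReachable:
    default:
        // No connectivity: don't start playback.
        break;
}
```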
You may also want to look into HTTP Live Streaming: you provide several versions of the video at varying quality levels, and the player picks the version that fits the available bandwidth.
Some docs on HTTP Streaming:
http://developer.apple.com/iphone/library/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html
http://www.scribd.com/doc/20173481/iPhone-Streaming
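For illustration only (the bitrates and paths are made up), an HTTP Live Streaming variant playlist looks like this; the player measures its own throughput and switches renditions accordingly:

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=150000
low/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
mid/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1500000
high/index.m3u8
```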

RTMP Streaming Server implementation: connect/createStream/play message sequence passed, but no video/audio in Flash Player

Hi!
I'm writing an RTMP streaming server for AVC+AAC video. It works fine with rtmpdump, but I can't get it to work in Flowplayer or other Flash video players.
The message sequence after the handshake is similar to that of the FMS / Red5 / erlyvideo / haxevideo servers; I've tried a lot of variations.
From the Chrome debug console I can see that all the negotiation messages reach Flowplayer; the last one is onMetaData. After this point the working sample (rtmp://flash.tvwmedia.net/LiveVideo//Live300) gets NetStream.Buffer.Full, while the stream from my server does not.
I start with an AVC sequence header message containing the SPS/PPS, then the first AVC picture, then the AAC header and an AAC sample, and then the AVC/AAC samples. rtmpdump dumps this fine and produces a working FLV, but Flowplayer and the others do not play it.
What could the problem be?
Are there any additional requirements for the streams?
Could a broken H.264 stream cause Flash Player to stop playing? Is there a way to obtain diagnostic messages from Flash Player that would say so?
Hope you can help me :) I've been fighting this problem for over two weeks and have run out of variants to try.
Here is the debug log + FLV from rtmpdump. It contains the negotiation messages and the first few media samples.
Update:
I've fixed one bug: the wrong chunk stream ID was used for "system" messages (e.g. SetChunkSize). But it still doesn't play.
Here is another log, almost the same as what Wowza produces, plus the Wowza/Red5 logs for comparison.
I've checked the following things, which differ between RTMP servers:
Different ChunkStreamIDs (for non-system streams)
Different StreamIDs (on createStream)
128-byte and 4 KB chunk sizes
Unpacked/packed chunk headers (the previous log uses unpacked, the new one packed)
Different answers on connect call (from many servers)
Using 57 00, 57 01 video packets (video info/command frame)
Adding 09 (Access Unit Delimiter) NALU before each picture
Different order of audio/video DCR/packets
Audio only/video only
But none of that tuning made my server work :)
Any ideas how to solve this?
Update:
I've captured a log through the Flazr proxy as Peter suggested. The results are the same, and I can't find the solution: both logs look good. Maybe I'm just missing something simple...
Thank you!
I am not sure what the issue is, but you want to make sure that you are doing the following:
1. Sending pings
2. Handling bytes read/written reports (a sketch of an Acknowledgement message follows below)
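For the second point, a minimal sketch of what an RTMP Acknowledgement looks like on the wire, assuming the conventional chunk stream 2 / message stream 0 for protocol control messages (hand-rolled, not tied to any particular server codebase):

```c
#include <stdint.h>
#include <stddef.h>

/* Build a type-0 chunk carrying an Acknowledgement (message type 3) that
 * reports the total number of bytes received so far. The peer expects one
 * of these every time the acknowledgement window is crossed. */
size_t rtmp_build_ack(uint8_t out[16], uint32_t bytes_received)
{
    uint8_t *p = out;
    *p++ = 0x02;                            /* fmt 0, chunk stream id 2   */
    *p++ = 0; *p++ = 0; *p++ = 0;           /* timestamp 0                */
    *p++ = 0; *p++ = 0; *p++ = 4;           /* message length = 4         */
    *p++ = 0x03;                            /* type 3 = Acknowledgement   */
    *p++ = 0; *p++ = 0; *p++ = 0; *p++ = 0; /* message stream id 0        */
    *p++ = (uint8_t)(bytes_received >> 24); /* payload: big-endian count  */
    *p++ = (uint8_t)(bytes_received >> 16);
    *p++ = (uint8_t)(bytes_received >> 8);
    *p++ = (uint8_t)(bytes_received);
    return (size_t)(p - out);               /* 16 bytes total             */
}
```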
From the rtmpdump log your flow looks good; I didn't notice anything obvious. There are two more projects which may help you, depending upon your experience with either C++ or Ruby. The Izumi server is fairly simple and may be easier to follow if you are a Ruby dev.
http://code.google.com/p/rubyizumi/
If you are a C++ guy then look at RTMPd, Andrei knows more about the server side flash process than anyone else that I know of.
http://www.rtmpd.com/
Lastly, if you want another open-source Java player/server implementation for learning, you can look at Flazr
http://flazr.com/
I'm the author of Flazr, which Mondain referred to (thx Mondain!).
I want to point you to the "proxy server" feature of Flazr. You can connect your flash player (or rtmpdump) to the proxy server and point the proxy server to your server. If you set the log to DEBUG mode, you will get a very detailed log trace of all the RTMP messages in both directions. This has been helpful to me in the past to compare Flazr with other implementations such as Red5. Hope this helps.

how to read a http video stream with libavcodec (ffmpeg)

I'm trying to read a real-time HTTP video stream (I've set one up using VLC and my webcam) using libavcodec from ffmpeg. So far I've been able to read an MPEG file fine, but when I pass a URL instead of the filename it hangs (I assume it's waiting for the end of the file).
Does anyone have experience/ideas on how to do this? I'm also open to alternatives if people have other suggestions.
The end goal is to do live video streaming on the iPhone. Apple's HTTP Live Streaming has too much lag for what I need, but that's for another post :)
Any help is greatly appreciated!
If you aren't using Apple's approach, you can't.
I have to admit I don't quite understand what you want to achieve; however, based on my interpretation of the question, it seems related to segmenting the video.
The segmenting can be achieved using ffmpeg and libavcodec (look for an example here; the key line is the one with packet.flags). Just remember that the segment length (in time) depends on the keyframe interval (for H.264 at least). If you want an example of a full segmented streaming solution that works (most of the time), check here.
Otherwise, you have to dig into the codec and create the transport stream manually. MPEG2-TS, which is what iOS supports, can be a little difficult at times, but it is not too bad. Good luck!
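As a rough illustration of the packet.flags point, a sketch using the libavformat API to read a network stream and spot keyframes, which are the safe cut points for segments (the URL is a placeholder, and older ffmpeg releases spell some of these calls differently):

```c
#include <libavformat/avformat.h>

int main(void)
{
    const char *url = "http://192.168.0.10:8080/stream"; /* placeholder */
    AVFormatContext *fmt = NULL;
    AVPacket pkt;

    av_register_all();
    avformat_network_init();

    if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
        return 1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return 1;

    while (av_read_frame(fmt, &pkt) >= 0) {
        if (pkt.flags & AV_PKT_FLAG_KEY) {
            /* A keyframe: safe point to close the current segment
               and start a new one. */
        }
        /* ...write pkt into the current segment... */
        av_free_packet(&pkt);  /* av_packet_unref() in newer ffmpeg */
    }
    avformat_close_input(&fmt);
    return 0;
}
```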
You can use libcurl to download the HTTP segment files.
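For example, a minimal sketch of fetching one segment with libcurl (URL and output path are placeholders; libcurl's default write callback simply fwrite()s to the FILE* given as CURLOPT_WRITEDATA):

```c
#include <curl/curl.h>
#include <stdio.h>

int fetch_segment(const char *url, const char *path)
{
    FILE *f = fopen(path, "wb");
    if (!f) return -1;

    CURL *curl = curl_easy_init();
    if (!curl) { fclose(f); return -1; }

    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, f);       /* write body to file */
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L); /* follow redirects   */
    CURLcode res = curl_easy_perform(curl);

    curl_easy_cleanup(curl);
    fclose(f);
    return res == CURLE_OK ? 0 : -1;
}
```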

iPhone. How to Track Progress of HTTP Upload

Could you please advise how to track the progress of a file upload to an HTTP server?
I am using NSURLConnection and related classes to handle the file upload,
but I can't find out how to track the upload progress.
Thanks.
I asked something similar a while back:
CFNetwork HTTP timeout?
I don't know if ASIHTTPRequest has this support built in yet, and I didn't have a reason back then to switch over, but you may want to check it out and/or contact the authors.
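That said, NSURLConnection's delegate does receive upload-progress callbacks (this delegate method has been available since iOS 3.0), so a sketch along these lines may be enough; progressView is an assumed UIProgressView property here:

```objc
// NSURLConnection delegate: called repeatedly as the request body is sent.
- (void)connection:(NSURLConnection *)connection
   didSendBodyData:(NSInteger)bytesWritten
 totalBytesWritten:(NSInteger)totalBytesWritten
totalBytesExpectedToWrite:(NSInteger)totalBytesExpectedToWrite
{
    // Fraction of the upload completed so far.
    float progress = (float)totalBytesWritten / (float)totalBytesExpectedToWrite;
    self.progressView.progress = progress;
}
```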