Timestamp on live video after transcoding from RTSP stream to FLV

I have access to an RTSP stream from an IP camera, and the RTSP packets have video timestamps in the header.
If I transcode the video stream to FLV using VLC for the purposes of playing the video on the web, is it safe to assume that I will lose the timestamp information in the transcode process?
If so, is there any way around this?
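One possible workaround, sketched here with FFmpeg rather than VLC (the camera and server URLs are placeholders, and this assumes an FFmpeg build that includes the drawtext filter): the relative presentation timestamps survive a remux or transcode, but the absolute camera time carried in the RTP/RTCP headers generally does not, so you can burn a wall-clock timestamp into the picture during the transcode:
ffmpeg -rtsp_transport tcp -i rtsp://camera.example/stream \
  -vf "drawtext=text='%{localtime}':x=10:y=10:fontsize=24:fontcolor=white" \
  -c:v libx264 -c:a aac -f flv rtmp://server.example/live/stream
This stamps the transcoder's local time rather than the camera's, so it is only a rough substitute for the original header timestamps.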

Related

RTMP request with body

I watch online streaming video with a mobile app, and would like to get the real streaming URL so I can play it in VLC media player on my computer.
The following is a screenshot of a captured network packet from the streaming video:
[screenshot: RTMP streaming request]
I tried to use VLC media player to play the "tcUrl" part of the screenshot, but it failed to open the MRL.
I noticed that when the mobile app sends out this RTMP request, it has an RTMP body.
Is there any way I can send an RTMP request with a body in VLC? Or is there any other tool capable of doing that?
Thanks.
You should take a look at the play command, which specifies the stream.
The tcUrl is similar to a directory, and the play command specifies the stream, like a file. Compare with an HTTP URL:
http://server/dir/livestream.flv
rtmp://server/dir/livestream
Here the tcUrl is rtmp://server/dir and the stream is livestream. Once you have the entire URL, you can play or forward it, for example:
ffplay rtmp://server/dir/livestream
ffmpeg -f flv -i rtmp://server/dir/livestream -c copy dvr.mp4
You can also use VLC to play the RTMP URL.
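For example, with the same hypothetical URL as above:
vlc rtmp://server/dir/livestream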

icecast + Adobe Flash Media Live Encoder

I am working with a community TV channel to stream their TV station in an audio-only format. They currently use Adobe Flash Media Live Encoder to send a WebTV stream to a provider. What we are discussing is creating an Icecast stream of their TV broadcast.
I am wondering: is there a way to take the Adobe Flash Media Live Encoder stream, read its metadata, and send all of that to an Icecast stream, using FFmpeg or other technologies?
It is possible to have FFmpeg act as an RTMP server which you could connect your encoder to:
ffmpeg -listen 1 rtmp://127.0.0.1:1935 …
However, I think you'll find it would be better to run FFmpeg in parallel, encoding from the same source. I'm going to guess that you're not using MPEG-4 video and AAC audio on your Icecast stream, so it would be better to encode from the source rather than transcode already-lossy audio/video.
As for metadata, depending on your media format, you'll have to handle that out-of-band with a separate script.
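As a sketch of the parallel-encode approach (assuming a Linux box with ALSA capture; the device name, mount point, and source password are placeholders, not values from the question), FFmpeg can push an MP3 stream directly to Icecast:
ffmpeg -f alsa -i hw:0 \
  -c:a libmp3lame -b:a 128k \
  -content_type audio/mpeg \
  -f mp3 icecast://source:hackme@127.0.0.1:8000/tv.mp3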

How do I publish a video compressed with h265?

I set up an nginx server and I can broadcast using RTMP. I currently encode with H.264, but I also need to encode with H.265. RTMP and FLV do not support H.265.
So, can you suggest a server and protocol for H.265? Open source if possible.
Edit:
OK, I should explain the problem a little more. I can send video to my nginx server with a client and watch it via HLS and DASH.
But what I need to do is re-encode this video to H.265 with a converter like FFmpeg and then watch it or send it to another client. I couldn't find anything other than RTMP for sending and receiving video.
MP4 and MKV both support H.265 video streams.
You can use DASH or HLS to stream your MP4 segments.
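As a sketch (the stream name and output path are placeholders, and this assumes an FFmpeg build with libx265 and a reasonably recent dash muxer), you could pull the stream from nginx, re-encode it to H.265, and package it as DASH:
ffmpeg -i rtmp://127.0.0.1/live/stream \
  -c:v libx265 -preset fast -c:a aac \
  -f dash -seg_duration 4 /var/www/html/live/manifest.mpd
A DASH player with HEVC support (on a device that can decode it) can then fetch manifest.mpd over plain HTTP.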

how to stream high quality video and audio from iPhone to remote server through Internet

I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
sample code for AVFoundation capturing to file chunks (~10 seconds) that are ready for compression?
sample code for compressing the video and audio for transmission across the Internet?
ffmpeg?
sample code for HTTP Live Streaming sending files from iPhone to Internet server?
My goal is to use the iPhone as a high quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer at the iPhone.
thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length. The AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server. If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server-to-client streaming protocol; there is no notion of push as far as I know, so you would have to create a custom server to accept pushed segmented video streams (which does not really make a lot of sense given the way HLS is designed; see the RFC draft). A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could easily be done with FFmpeg, either on the command line or via a custom program.
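For the server-side segmenting step, a minimal FFmpeg sketch (file names are placeholders):
ffmpeg -i upload.mp4 -c copy \
  -f hls -hls_time 10 -hls_list_size 0 \
  -hls_segment_filename seg%03d.ts playlist.m3u8
With -c copy the H.264/AAC streams are repackaged into segments without re-encoding, so segmentation is cheap.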
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
Capture video and audio using AVFoundation. You can specify the audio and video codecs (kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC), frame sizes, and frame rates in AVCaptureFormatDescription. That will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server over RTP using something like Live555 Media.

Open Source program for converting wave files for RTMP streaming

Can anyone suggest an open-source Linux program for converting .wav files to a Flash format for RTMP streaming? Does RTMP support any formats other than Flash?
Flash Media Server supports three audio formats for streaming: Nellymoser, MP3, and AAC. You can also play MP3 files directly from the Flash Player via HTTP download; you don't really need to use RTMP (which is more advantageous for video due to higher bitrates).
Here's a good article on streaming audio with FMS:
http://www.adobe.com/devnet/flashmediaserver/articles/beginner_audio_fms3.html
For conversion, you can use ffmpeg.
http://fosswire.com/post/2007/11/using-ffmpeg-to-convert-to-mp3/
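A minimal conversion sketch (file names are placeholders):
ffmpeg -i input.wav -c:a libmp3lame -b:a 128k output.mp3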
You can stream via the crtmpserver (C++ RTMP Server) to Flash clients. VLC can also play RTMP streams.
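To push the converted file to an RTMP server, a sketch (the server URL is a placeholder; FLV can carry MP3 audio directly, so no re-encode is needed):
ffmpeg -re -i output.mp3 -c:a copy -f flv rtmp://server.example/live/audio
The -re flag throttles reading to real time, which is what you want for a live stream.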