I have a lot of very large recorded MPEG-TS files (each several hours long) on a server, and I want to stream them to a website.
I was thinking of using Red5 with Xuggler.
My client could request a file, and Red5 could stream it to the client.
Xuggler could transcode the stream on the fly to Flash video.
Is this a good way to go?
Can anybody direct me to demo code?
Thank you
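For reference, the on-the-fly transcode step described here would look roughly like this with plain FFmpeg (a hedged sketch, not Xuggler code; the file name, RTMP URL, and application name "live" are illustrative):

    ffmpeg -re -i recording.ts -c:v libx264 -c:a aac -f flv rtmp://localhost/live/stream

The -re flag paces the input at its native rate so the transcode behaves like a live source, and the FLV container over RTMP is what a Flash client (or Red5 application) expects.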
I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
sample code for AVFoundation capturing to file chunks (~10 seconds) that are ready for compression?
sample code for compressing the video and audio for transmission across the Internet?
ffmpeg?
sample code for HTTP Live Streaming sending files from iPhone to Internet server?
My goal is to use the iPhone as a high-quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer at the iPhone.
thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length. The AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server.

If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server-to-client streaming protocol. There is no notion of push as far as I know. You would have to create a custom server to accept pushing of segmented video streams (which does not really make a lot of sense given the way HLS is designed; see the RFC draft). A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could easily be done with FFmpeg, either on the command line or via a custom program.
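As a hedged sketch of that server-side segmentation step (assuming a reasonably recent FFmpeg build; the file names are illustrative):

    ffmpeg -i upload.mp4 -c copy -f hls -hls_time 10 -hls_list_size 0 stream.m3u8

Here -c copy repackages the already-encoded H.264/AAC into ~10-second .ts segments without re-encoding, and the generated stream.m3u8 playlist can be served to client viewers directly.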
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
Capture video and audio using AVFoundation. You can specify the video and audio codecs (kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC), frame sizes, and frame rates via the capture format description. This will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP server such as Live555 Media Server.
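A minimal Swift sketch of the compression side, assuming the output path, frame size, and sample rate are illustrative (sample buffers from the AVCaptureVideoDataOutput/AVCaptureAudioDataOutput delegates would then be appended to these inputs):

    import AVFoundation

    // Compress captured frames to H.264 video and AAC audio in an MP4 container.
    let writer = try AVAssetWriter(outputURL: URL(fileURLWithPath: "/tmp/chunk.mp4"),
                                   fileType: .mp4)
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,  // H.264 video
        AVVideoWidthKey: 1280,                   // illustrative frame size
        AVVideoHeightKey: 720
    ])
    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,     // AAC audio
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1
    ])
    videoInput.expectsMediaDataInRealTime = true
    audioInput.expectsMediaDataInRealTime = true
    writer.add(videoInput)
    writer.add(audioInput)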
I am using Flash Media Server 4.5, and I read in the tutorial that if I want to stream a live feed, I may need to use Flash Media Live Encoder. But what I found is that in Media Live Encoder I have to set everything up manually, and it only supports camera devices.
In my case, however, I have multiple video files continually arriving from another program, and my goal is to use Flash Media Server to perform a live broadcast with these video files one by one.
That means that when clients watch the stream, they will not notice that the server is playing mov1, then mov2, then mov4, then mov5... and so on.
Also, can FMS dynamically create a new streaming session (invoked by code), so that when client A uploads some video files to the server, FMS opens a new streaming session that streams only client A's video files?
Can FMS achieve such purposes? Any tutorial would be very helpful!
Edit for Open Bounty
I basically want to deliver a live video stream where a list of videos is the source. I am currently using Flash Media Server with the CloudFront CDN to deliver content. So if I have video1, video2, and video3, I want to play them back to back as a live stream (so no skipping ahead in the video). Is it possible to do this? The bounty goes to a clever workaround. Think of this as a television channel.
I have been working on live streaming technologies for the past year and a half. There is no option in Flash Media Live Encoder for file encoding.
1. To encode your files, you can use a DVD player (or some other device that supports USB playback) and use the DVD player's output as the input to Flash Media Live Encoder.
2. Another option is to set up Windows Media Encoder, which does support file encoding (no DVD player needed), but it only works with Windows Media Services.
At present I webcast video files live this way for my company: http://www.malar.tv/live.php
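As a command-line alternative to hardware loopback (my own suggestion, not something from the posts above): FFmpeg's concat demuxer can read a text playlist of files and push them to an RTMP ingest point as one continuous live feed, which matches the "television channel" requirement. The playlist path, RTMP URL, and application name are illustrative:

    # playlist.txt contains one line per clip, e.g.:
    #   file 'video1.mp4'
    #   file 'video2.mp4'
    ffmpeg -re -f concat -safe 0 -i playlist.txt -c copy -f flv rtmp://fms.example.com/live/channel

The -re flag paces reading at real time so viewers cannot skip ahead; -c copy requires all clips to share the same codec parameters, otherwise re-encode with -c:v libx264 -c:a aac. A new playlist per uploading client would give each client its own stream.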
I need software that streams MP3 files and gives me an API (or something similar) that I can use to show which track is currently being played. Essentially, I am creating an online radio station with offline files as the source, showing the playlist and current track in the frontend player.
I've tried Windows Media Encoder, Microsoft Expression Encoder, and Icecast, and none of them does what I need.
Are there any suggestions?
IceCast + ezStream did the job.
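For reference, a minimal legacy-style ezstream configuration (hedged: these element names follow the ezstream 0.x sample config from memory, so verify against the example config shipped with your version; the URL, password, and playlist name are illustrative):

    <ezstream>
        <url>http://localhost:8000/radio.mp3</url>
        <sourcepassword>hackme</sourcepassword>
        <format>MP3</format>
        <filename>playlist.m3u</filename>
        <svrinfoname>My Radio</svrinfoname>
    </ezstream>

For the now-playing API, Icecast 2.4+ exposes the current source metadata (including the track title) as JSON at /status-json.xsl on the server, which a frontend player can poll.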
I have an m3u8 file with all the TS files. MPMoviePlayerController plays them fine via an HTTP request to the streaming server. But I'd like to store the files locally in order to play them again later without any connection.
I managed to download the m3u8 file and all the TS files locally onto my device, and I edited the m3u8 file to point to the local .ts files instead of the HTTP ones, but I can't play them from that location.
(VLC can do it well.)
Is there a way to download the segments while playing (to avoid downloading twice) and then to play them locally with MPMoviePlayerController or something else?
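For reference, an edited local playlist would look roughly like this, with segment URIs relative to the playlist file (segment names and durations are illustrative):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXTINF:10,
    segment0.ts
    #EXTINF:10,
    segment1.ts
    #EXT-X-ENDLIST

Note that MPMoviePlayerController is generally unhappy with file:// playlist URLs; a common workaround (an assumption worth testing, not something confirmed in this thread) is to run a tiny HTTP server inside the app and point the player at http://localhost instead.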
.m3u8 is Apple HTTP Live Streaming, right? I think what you're trying to do simply goes against the design of that technology. You should expose the original file and allow it to be downloaded.
From what I understand, it's in the design of streaming that you don't get explicit access to the pieces in order to put them back together. For instance, Netflix uses streaming via Silverlight, and one of the benefits (to Netflix) is that it protects the data from being saved as if it were downloaded. Also, since HTTP Live Streaming allows a stream to switch bitrates on the fly, it's designed such that each time slice can be encoded at any number of bitrates, and none of them is canonical.
In theory, there might be a way to collect all the slices for a particular bitrate and re-encode them into a single video. But Apple's playback APIs are not going to give you that opportunity.
Instead of HTTP Live Streaming, consider progressive download. Just serve the original video file (transcode it to something the iPhone likes if necessary). If your server is configured properly, the playback APIs will do small requests to get particular chunks of the file, rather than the whole thing in one go, and it's a close second to proper streaming. I wish I could find where I read about this so I could give the proper name for it. Amazon S3 is set up to serve this way, if you need a quick solution.
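(The mechanism being described is HTTP byte-range requests, sometimes called pseudo-streaming; on the wire it looks roughly like this, with illustrative offsets:)

    GET /video.mp4 HTTP/1.1
    Host: example.com
    Range: bytes=1048576-2097151

    HTTP/1.1 206 Partial Content
    Content-Range: bytes 1048576-2097151/31457280
    Content-Length: 1048576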
But beware, Apple's docs say:

"If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)"
I have a program on the server side that keeps generating a series of JPEG files, and I want to play these files in the client browser as a video stream at a desired frame rate (this video should be playing while the new JPEG files are being generated). Meanwhile, I have a WAV file at hand, and I want to play it on the client side while the streaming video is being played.
Is there any way to do this? I have done plenty of research but can't find a satisfactory solution; everything I found is either just for video streaming or just for audio streaming.
I know that mjpg-streamer at http://sourceforge.net/projects/mjpg-streamer/ is capable of serving streaming video in MJPEG format from JPEG files, but it doesn't look like it can stream audio as well.
I am very new to this area, so a more detailed explanation will be extremely appreciated. Thank you so much!
P.S. A solution/library in C++ is preferred, but anything else would help as well. I am working on Linux.
The browser should be able to do this natively, no? Firefox can certainly do this if you simply give it the correct URL of the streaming MJPEG source. The MJPEG stream should be properly formatted.
I figured it out. The proper way of doing it is to use FFmpeg, libav, and an RTMP server such as Red5.
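As a hedged sketch of that pipeline (the mjpg-streamer URL, audio file name, and RTMP URL are illustrative), FFmpeg can pull the MJPEG stream, mux in the WAV audio, and publish the result to the RTMP server:

    ffmpeg -f mjpeg -i http://localhost:8080/?action=stream \
           -i audio.wav \
           -c:v libx264 -c:a aac -f flv rtmp://localhost/live/stream

FFmpeg's default stream selection takes the video from the first input and the audio from the second; clients can then play the resulting RTMP stream with a Flash-based player.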