We have some encrypted HTTP live streams that we have been using in our iOS apps. We would now like to use the same streams to deliver content to various browsers. Safari is not an issue, since the QuickTime plugin works fine and plays the streams, but with Firefox, Chrome, and IE we are running into issues figuring out how to play them.
Will a QuickTime plugin for these browsers (assuming one is available for all of them) be able to play the streams? Is there any other player with browser plugins that can be used to play HTTP Live Streaming (encrypted) streams in browsers?
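For context, the closest thing we have found so far is Media Source Extensions based playback. Below is a minimal sketch of what that would look like with the open-source hls.js library; the stream URL and element id are placeholders, and whether this handles our particular encryption setup is exactly what we are unsure about:

```typescript
import Hls from 'hls.js';

// Placeholder URL for one of our encrypted HLS playlists.
const SRC = 'https://example.com/stream/master.m3u8';

const video = document.querySelector<HTMLVideoElement>('#player')!;

if (Hls.isSupported()) {
  // Firefox/Chrome/IE-Edge: no native HLS, so the stream is fed through
  // Media Source Extensions by hls.js.
  const hls = new Hls();
  hls.loadSource(SRC);
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari (and iOS): native HLS support, the <video> element is enough.
  video.src = SRC;
}
```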
Thanks.
Related
I'm building a frontend for a proprietary IPTV backend, which receives its channel streams via multicast and then uses udpxy to convert the traffic from multicast to HTTP for end users. However, neither ExoPlayer nor AVFoundation seems to accept this stream (both the VLC app and flutter_vlc_player play it nicely). I'm using the video_player plugin for Flutter, so I may be missing some live-stream-specific implementation details. Can anyone point me in the right direction? Thanks!
Media information (VLC):
udpxy: udpxy 1.0-23.12 (prod) standard [Linux 5.10.0-10-amd64 x86_64], the output stream is application/octet-stream (maybe that can cause some problems?)
What direction should I look in to stream Flutter camera audio/video to an RTMP media server?
We are developing a live broadcasting solution.
Our attempts so far:
I. PWA approach: the Ant Media WebRTC SDK was used to establish a WebRTC ICE candidate connection between the mobile web client and the Ant Media server.
Problem faced: the connection seems too heavy for some phones; only one of my phones was able to stream through.
II. Hybrid approach: I have been searching the internet for a Cordova/Ionic SDK for live streaming to an RTMP media server, with no luck so far. I found Bambuser (https://www.npmjs.com/package/cordova-plugin-bambuser), but the SDK seems to be tied to their own live streaming service.
Please, what do I need to know? I would also appreciate an expert review of the idea of using the Flutter camera plugin's image-bytes access plus ffmpeg for encoding and transporting the stream to an RTMP URL such as rtmp://SERVER_NAME/LiveApp/STREAM_ID.
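To make that last idea concrete, here is a rough sketch of the pipeline we are considering: pipe raw camera frames into ffmpeg and push an FLV stream to the RTMP URL. This is not Flutter code; it is a Node/TypeScript stand-in for whatever native bridge would feed the frames, and the resolution, frame rate, and pixel format are assumptions (audio is omitted):

```typescript
import { spawn } from 'node:child_process';

// Assumed frame format; the real values would come from the camera plugin.
const WIDTH = 1280, HEIGHT = 720, FPS = 30;

const ffmpeg = spawn('ffmpeg', [
  '-f', 'rawvideo', '-pix_fmt', 'yuv420p',       // raw frames arrive on stdin
  '-s', `${WIDTH}x${HEIGHT}`, '-r', String(FPS),
  '-i', '-',
  '-c:v', 'libx264', '-preset', 'veryfast', '-tune', 'zerolatency',
  '-f', 'flv', 'rtmp://SERVER_NAME/LiveApp/STREAM_ID',
]);

ffmpeg.stderr.pipe(process.stderr);

// Each captured frame (a yuv420p byte buffer) would be written like:
// ffmpeg.stdin.write(frameBuffer);
```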
I am working on a live streaming project and came across many services like Wowza, Dacast, Ant Media, etc. The one that suits my requirements uses the RTMP protocol, so I would have to use encoding software like OBS to publish the stream. I actually want to publish the stream from the browser/iOS/Android.
I came across this FB presentation, and it seems like they are using the RTMP protocol. FB is somehow successfully broadcasting from the browser.
Can I get some insight into how things work with FB / similar RTMP-based live streaming apps? Thanks in advance.
Facebook supports RTMP ingest of video (used by people who utilize the Live API), as well as WebRTC ingest for browser clients.
RTMP is not used as a distribution protocol. For that, there is DASH.
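To illustrate the browser side of WebRTC ingest, here is a minimal sketch of capturing the camera and exchanging SDP with an ingest endpoint. The endpoint URL and the plain SDP-over-HTTP exchange are assumptions for illustration, not Facebook's actual API:

```typescript
// Capture the camera/microphone and publish the tracks over WebRTC.
async function publish(ingestUrl: string): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Hypothetical ingest endpoint that accepts an SDP offer and returns an SDP answer.
  const res = await fetch(ingestUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/sdp' },
    body: offer.sdp,
  });
  await pc.setRemoteDescription({ type: 'answer', sdp: await res.text() });
}

publish('https://ingest.example.com/live/publish').catch(console.error);
```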
I'm interested in setting up a streaming video server (perhaps on a CloudFront server) with Video.js. I understand that Flash video can be streamed; however, is it possible to stream video using Video.js and a different codec (like H.264)? I looked through the Video.js documentation and forums but did not find any additional info.
Video.js has limited support for RTMP streaming in Flash, but hopefully more in the next few months.
HTTP Live Streaming (HLS) is the most supported streaming format for HTML5 (iOS, Safari, latest Android). Video.js can support that on the devices that support HLS natively.
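A minimal sketch of pointing Video.js at an HLS source looks roughly like this; the playlist URL and element id are placeholders, and on browsers without native HLS support it only works if the player (or a plugin) handles HLS:

```typescript
import videojs from 'video.js';

// Assumes an existing <video id="player" class="video-js" controls></video> element.
const player = videojs('player');

player.src({
  src: 'https://example.com/live/playlist.m3u8',  // placeholder HLS playlist
  type: 'application/x-mpegURL',
});

player.play();
```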
I think you would have to transcode the H.264 file on the fly to get the effect you want. Subsonic is a program that will read your file structure, display your videos and music in a web UI, transcode the audio/video, and stream it, but it uses JW Player, not Video.js.
However, it is open source, so if you want to try to modify it, I'm sure that would be possible.
How can I stream video data from the network and play it on an iPhone?
First, are you developing a web app optimized for iPhone, or a native application?
In the first case, your only option is to transcode your video files to QuickTime H.264 (with an .m4v or .mp4 extension). You can use QuickTime Pro (via the Export menu) or VLC (as a free alternative). Then simply add a hyperlink to the video file on your HTTP server. Make sure the server presents the right content type (read the Safari Web Content Guide for iPhone OS: Configuring Your Server). That will work for both web and native apps (in a native app you would use the MPMoviePlayerController view). So you can "stream" this way (technically called progressive download of a QuickTime movie file).
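As a sketch of the "right content type" point, a minimal static handler (here in Node/TypeScript, purely for illustration; the media directory and port are assumptions) only needs to map the movie extensions to their MIME types:

```typescript
import http from 'node:http';
import { createReadStream, existsSync } from 'node:fs';
import path from 'node:path';

// MIME types Safari/iPhone expects for progressive-download movies.
const TYPES: Record<string, string> = {
  '.mp4': 'video/mp4',
  '.m4v': 'video/x-m4v',
  '.mov': 'video/quicktime',
};

http.createServer((req, res) => {
  const file = path.join('media', path.basename(req.url ?? ''));
  const type = TYPES[path.extname(file)];
  if (!type || !existsSync(file)) {
    res.writeHead(404);
    res.end();
    return;
  }
  res.writeHead(200, { 'Content-Type': type });
  createReadStream(file).pipe(res);
}).listen(8080);
```

A real server would also need to honor HTTP Range requests so the player can seek within the file.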
If you're talking about streaming live content (i.e. content that you produce live or transcode from a live feed), there is currently no official way of doing it (as of iPhone OS 2.2). iPhone OS does not support RTSP/RTP streaming. A number of native iPhone applications (such as UStream.tv and Orb Live) have created their own custom live streaming solutions (most of them transfer a delayed stream, with many seconds of latency, over HTTP and then somehow decode it on the phone using FFmpeg or other libraries).
Are you trying to stream video in your app or just stream it on your iPhone? For streaming video through an app, use MPMoviePlayerController and pass your video's URL to it. MPMoviePlayerController will itself stream the video and play it for you.
If you're looking for a server based solution (with a very affordable Amazon EC2 option), be sure to check out Wowza at http://www.wowzamedia.com/advanced.php
It streams directly to iPhone/iPod Touch without a custom app.
note: I'm not affiliated with them at all... just a fan/customer.
edit: Just noticed how old this question was. :)