Flutter camera plugin to transcode and transport streams to a media server

What direction should I look in to get the Flutter camera audio/video stream to an RTMP media server?
We are developing a live broadcasting solution.
Our attempts so far:
I. PWA approach: the Ant Media WebRTC SDK was used to establish a WebRTC connection (ICE candidate exchange) between the mobile web client and the Ant Media server.
Problem faced: the connection seems too heavy for some phones; only one of my phones was able to stream successfully.
II. Hybrid approach: I have been searching the internet for a Cordova/Ionic SDK for live streaming to an RTMP media server, with no luck so far. I found Bambuser (https://www.npmjs.com/package/cordova-plugin-bambuser), but that SDK seems to be tied to their own live streaming service.
Please, what do I need to know? I would also appreciate an expert review of the idea of using the Flutter camera plugin's image-bytes access plus FFmpeg for encoding and transporting the stream to the RTMP URL rtmp://SERVER_NAME/LiveApp/STREAM_ID.
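Roughly, the pipeline I have in mind looks like the sketch below. It is untested and only a sketch: it assumes the camera and ffmpeg_kit_flutter packages (a GPL variant is needed for libx264), that FFmpegKitConfig.registerNewFFmpegPipe() is available as documented, and that the camera delivers 640x480 YUV420 frames; per-plane row strides and audio are ignored for brevity.

import 'dart:io';

import 'package:camera/camera.dart';
// Depending on the variant you install, these imports may need a suffix
// such as ffmpeg_kit_flutter_full_gpl.
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/ffmpeg_kit_config.dart';

const rtmpUrl = 'rtmp://SERVER_NAME/LiveApp/STREAM_ID';

Future<void> startStreaming() async {
  // Open the first camera at a modest resolution.
  final cameras = await availableCameras();
  final controller = CameraController(cameras.first, ResolutionPreset.medium);
  await controller.initialize();

  // Ask FFmpegKit for a named pipe that raw frames can be written into.
  final pipePath = await FFmpegKitConfig.registerNewFFmpegPipe();
  final pipeSink = File(pipePath!).openWrite();

  // Read raw YUV420 frames from the pipe, encode with x264 and publish as
  // FLV over RTMP. Size, pixel format and frame rate must match what the
  // camera actually delivers on the device.
  FFmpegKit.executeAsync(
    '-f rawvideo -pix_fmt yuv420p -s 640x480 -r 30 -i $pipePath '
    '-c:v libx264 -preset veryfast -tune zerolatency -f flv $rtmpUrl',
  );

  // Push every camera frame into the pipe. A real implementation has to
  // honour the per-plane row strides and add an audio track; both are
  // skipped here to keep the sketch short.
  await controller.startImageStream((CameraImage image) {
    for (final plane in image.planes) {
      pipeSink.add(plane.bytes);
    }
  });
}

The obvious open questions are the cost of copying every frame in Dart on mid-range phones and how best to add audio.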

Related

Does ExoPlayer or AVFoundation work with a udpxy stream?

I'm building a frontend for a proprietary IPTV backend, which receives its channel streams via multicast and then uses udpxy to convert the traffic from multicast to HTTP for the end users. However, neither ExoPlayer nor AVFoundation seems to accept this stream (both the VLC app and flutter_vlc_player play it nicely). I'm using the video_player plugin for Flutter, so I may be missing some live-stream-specific implementation details. Can anyone point me in the right direction? Thanks!
Media information (VLC):
udpxy: udpxy 1.0-23.12 (prod) standard [Linux 5.10.0-10-amd64 x86_64]; the output stream is served as application/octet-stream (maybe that can cause some problems?)
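For reference, here is a minimal sketch of the flutter_vlc_player fallback that does play this stream for me; the host, port, and multicast group in the URL are placeholders.

import 'package:flutter/material.dart';
import 'package:flutter_vlc_player/flutter_vlc_player.dart';

class UdpxyPlayer extends StatefulWidget {
  const UdpxyPlayer({super.key});

  @override
  State<UdpxyPlayer> createState() => _UdpxyPlayerState();
}

class _UdpxyPlayerState extends State<UdpxyPlayer> {
  // udpxy relays the multicast group over plain HTTP on a URL like this one
  // (placeholder host, port and group).
  final VlcPlayerController _controller = VlcPlayerController.network(
    'http://udpxy-host:4022/udp/239.0.0.1:1234',
  );

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    // libVLC probes the stream payload itself (udpxy typically relays
    // MPEG-TS), so the generic application/octet-stream content type does
    // not get in the way here.
    return VlcPlayer(controller: _controller, aspectRatio: 16 / 9);
  }
}

I would still prefer to stay on video_player if possible.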

How to create my own custom RTMP server from scratch and stream to multiple destinations

I am using StreamYard to do a live stream to multiple destinations like Facebook and YouTube. I want to create a mobile app using Flutter that can receive that stream and display it. StreamYard only supports RTMP for a custom destination.
My question is: is there a way to create and host a custom RTMP server online that can be shared between the StreamYard platform and my mobile app? I want it so that whenever I go live from StreamYard, the stream is shared to Facebook, YouTube, and my mobile app as well. I have done a lot of research, but all I have found is that I would need to host it myself on a Windows or Linux machine, whereas I want it hosted online.
Alternative solutions, such as using WebRTC, are also welcome.
Because you use StreamYard, I think you need to use the INVITE feature to start a video chat and then convert it to a live stream. It works like below:
UserA --WebRTC-->---+
                    +--->-- StreamYard -->--RTMP-->-- YouTube/Twitch.tv
UserB --WebRTC-->---+
You need to buy a paid plan to get Custom RTMP destinations, so that you can publish the RTMP stream to your own media server such as SRS or Nginx; then you can broadcast to multiple destinations, like this:
                      +->-- YouTube/Twitch.tv
                      |
StreamYard ->--RTMP---+->-- Custom RTMP destinations ---+--RTMP--> YouTube/Twitch.tv
                            (SRS/Nginx media server)    |
                                                        +--HLS/FLV--> Flutter App
Note: Once you are streaming to your own RTMP server or a video cloud platform, you can convert the stream to HLS/HTTP-FLV for your Flutter app to play. Regarding players and protocols, please read here. It depends on which part you want to build yourself, and it is possible to build it from open-source projects.
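For the Flutter side of that note, here is a minimal sketch using the stock video_player plugin to play the HLS output; the .m3u8 URL follows SRS's default HLS layout (HTTP server on port 8080) and is only a placeholder, so adjust it to your own server's configuration.

import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class LiveHlsPlayer extends StatefulWidget {
  const LiveHlsPlayer({super.key});

  @override
  State<LiveHlsPlayer> createState() => _LiveHlsPlayerState();
}

class _LiveHlsPlayerState extends State<LiveHlsPlayer> {
  late final VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    // HLS playlist produced by the media server; SRS serves it on port 8080
    // by default, while Nginx depends on your http section config.
    _controller = VideoPlayerController.networkUrl(
      Uri.parse('http://your-media-server:8080/live/livestream.m3u8'),
    )..initialize().then((_) {
        setState(() {});
        _controller.play();
      });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return _controller.value.isInitialized
        ? AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          )
        : const Center(child: CircularProgressIndicator());
  }
}

HLS typically adds several seconds of latency; HTTP-FLV, also mentioned above, is the lower-latency option but needs a player that supports it.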
Note: You could use StreamYard to stream to both YouTube and the custom RTMP server, or use FFmpeg to pull the stream from your custom server and then publish it to any other live streaming platform.
In this solution, StreamYard actually acts as a video chat or video conferencing platform, like Zoom. It transcodes each WebRTC stream and mixes all the audio and video into one RTMP stream.
So you could use a WebRTC server to build your own StreamYard-like service and then use FFmpeg to transcode and mix the streams, but that is off topic, so let me stop here.

How to stream video (a local video file) from an Android phone to other Android phones

I am trying to establish multimedia streaming among multiple Android devices, where all of them play a synchronized video. A good example of my idea is Samsung's Group Play for video. Thanks in advance.
Google how to implement an RTSP server on the host device; then all of the client devices can connect to that RTSP stream. For synchronization, either use multicast or serve the same payload with adjusted timestamps to all the clients.

iOS - Develop iPhone app to stream camera video to a computer?

I'm looking for a way to create an app that will allow captured camera video to be streamed to a computer. For example, one person could be walking around a room with an iPhone while another person has that video streamed to their computer. Something kind of like a one-way FaceTime, except the receiver is on a computer. Also, I can't just use an existing app, as later I would like to change the program to do some computer vision processing on the incoming data.
At the moment, I've found that AVFoundation should be the correct option for the video capture (from this question). However, I'm having difficulty finding out how to actually stream this data. In particular, searching for how to build this on the iPhone mostly turns up existing apps that do the task, not how to create such an app.
Can anyone point me to information on how to stream the video capture from the iPhone? Thank you very much.
You can use "Wowza media Server" for Streaming purpose
For wowza media server doenload :
Wowza Download
After installing wowza Now you need to set up live setting in wowza for that purpose you need:
Setting Up Live Application
For iOS side there is library is useful for video streaming using RTMP connection
You can get Library at
RTMP library for Streaming
Library example
RTMP library for Streaming example
In this good example of Streaming from iOS side
I had success with the ANGL library and Wowza Media Server. It gives a smooth RTMP stream.

Cross Platform Video streaming

I'm looking for a video streaming solution that can upload video files to a server and deliver them on demand to multiple receivers across hardware and software platforms (desktop, tablet, mobile; Windows, Android, iOS, etc.). The solution should also support live video streaming.
Can HTML5 be used as the client for the above requirements? If so, what should the server-side streaming solution be? Any feedback and alternatives will be very helpful.
Appreciate it.
You may look at MediaMosa; it is a backend that handles video management, and you can create your own application on the front end.