How can we stream videos to an AWS Kinesis stream with Flutter? - flutter

So I want to stream video to a Kinesis stream from Flutter. I have searched through the Kinesis documentation but couldn't find any SDK available for Flutter.
Is there any library available to do this?
Or if anyone has done it before, I would really appreciate the help.

As far as I know, there is currently no Flutter implementation of the Kinesis Video "Producer" SDK.
However, there is an Android implementation, so what I would suggest is to add that Android native code to your project and call it from the Flutter side.
The Flutter camera library can be modified to work with the Kinesis Producer SDK.
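The Flutter side of such a bridge is just a standard platform channel. A minimal sketch, assuming you have registered a matching MethodCallHandler on the Android side that wraps the Kinesis Video Producer SDK (the channel name and method names below are hypothetical and must match your native code):

```dart
import 'package:flutter/services.dart';

// Hypothetical channel name; must match the channel registered by the
// Android MethodCallHandler that wraps the Kinesis Video Producer SDK.
const MethodChannel _channel = MethodChannel('kinesis_video');

/// Asks the native side to start producing to the given Kinesis stream.
Future<void> startStreaming(String streamName, String region) {
  return _channel.invokeMethod<void>('startStreaming', <String, String>{
    'streamName': streamName,
    'region': region,
  });
}

/// Asks the native side to stop the producer.
Future<void> stopStreaming() => _channel.invokeMethod<void>('stopStreaming');
```

The native side owns the producer lifecycle (camera frames, encoding, retries); Flutter only issues start/stop commands and passes configuration across the channel.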
Or, as @Andrija said, a REST API can be used as a proxy for Kinesis. The downside is that audio won't be streamed; you might need to put the audio and video into a container (MKV/MP4) and send that.
Having said all that, if you can somehow encode the video and audio (MKV/MP4) from Flutter, then you can use putRecord in the aws_kinesis_api Flutter package to send the data to Kinesis. Nowhere does it say the payload has to be video/audio; this is a generic API for putting data into a stream.
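A sketch of that putRecord call, assuming the generated aws_kinesis_api client from the agilord aws_client project (region, stream name, and credentials below are placeholders; check the package docs for the exact generated names):

```dart
import 'dart:typed_data';

import 'package:aws_kinesis_api/kinesis-2013-12-02.dart';

/// Sends one chunk of bytes (e.g. a fragment of an MKV/MP4 container you
/// produced yourself) to a Kinesis data stream.
Future<void> putChunk(Uint8List chunk) async {
  final kinesis = Kinesis(
    region: 'us-east-1', // placeholder region
    credentials: AwsClientCredentials(
      accessKey: 'YOUR_ACCESS_KEY', // placeholder credentials
      secretKey: 'YOUR_SECRET_KEY',
    ),
  );
  // putRecord accepts arbitrary bytes; Kinesis does not care whether the
  // payload is media data or anything else.
  await kinesis.putRecord(
    streamName: 'my-stream',
    partitionKey: 'camera-1',
    data: chunk,
  );
}
```

Note this targets Kinesis Data Streams, not Kinesis Video Streams; a consumer on the other end would have to reassemble the container fragments itself.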
EDIT: This confirms that, as of the current SDK, there is no Kinesis producer code for Flutter: https://github.com/agilord/aws_client/issues/242#issuecomment-860731956

Related

Flutter Video_Player - Play video from asp.net web API using stream

Flutter is new to me. I'm creating an app using the video_player library, where the video plays from a local resource as well as from the internet using a direct URL like https://flutter.github.io/assets-for-api-docs/assets/videos/bee.mp4.
However, when I try to read a stream in chunks and feed them to video_player, it does not seem to work.
I have created an API with the help of "Asynchronously streaming video with ASP.NET Web API", which returns the Stream outputStream as its return value.
Note: the video is stored on my file server, and the API server is a different machine; both are on the same network. The API server can read the data in chunks from the actual URL, which is not publicly accessible, and it works fine in an HTML video tag.
Does video_player support such streaming, or do I need to use another player?
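For reference, a minimal video_player setup against a network endpoint looks like this (a sketch, assuming a recent video_player version with the networkUrl constructor; the endpoint URL is a placeholder for the ASP.NET streaming API):

```dart
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class StreamPlayer extends StatefulWidget {
  const StreamPlayer({super.key});

  @override
  State<StreamPlayer> createState() => _StreamPlayerState();
}

class _StreamPlayerState extends State<StreamPlayer> {
  late final VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    // Placeholder endpoint: your ASP.NET Web API streaming action.
    _controller = VideoPlayerController.networkUrl(
        Uri.parse('https://my-api.example.com/api/video/stream'))
      ..initialize().then((_) {
        setState(() {}); // rebuild once the first frame is available
        _controller.play();
      });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return _controller.value.isInitialized
        ? AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          )
        : const Center(child: CircularProgressIndicator());
  }
}
```

One caveat: the underlying players (ExoPlayer on Android, AVPlayer on iOS) generally expect a progressive file with HTTP range support, or an adaptive format such as HLS/DASH. An arbitrary chunked Stream response may not be recognized even though an HTML video tag plays it, which is a likely cause of the behavior described above.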
Please let me know whether I'm heading in the right direction.
I am sure the details provided are not enough, but as a Flutter beginner this is what I can provide here.
Best Regards,
Hi ten

Does ExoPlayer or AVFoundation work with a udpxy stream?

I'm building a frontend for a proprietary IPTV backend, which receives its channel streams via multicast and then uses udpxy to convert the traffic from multicast to HTTP for end users. However, neither ExoPlayer nor AVFoundation seems to accept this stream (both the VLC app and flutter_vlc_player play it nicely). I'm using the video_player plugin for Flutter, so I may be missing some live-stream-specific implementation details. Can anyone point me in the right direction? Thanks!
Media information (VLC):
udpxy: udpxy 1.0-23.12 (prod) standard [Linux 5.10.0-10-amd64 x86_64], the output stream is application/octet-stream (maybe that can cause some problems?)
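Since VLC-based players are confirmed to handle the stream above, one pragmatic workaround is to use flutter_vlc_player directly. A sketch, assuming that package's controller API (the udpxy URL below is a placeholder in the usual http://gateway:port/udp/group:port form):

```dart
import 'package:flutter/material.dart';
import 'package:flutter_vlc_player/flutter_vlc_player.dart';

class UdpxyPlayer extends StatefulWidget {
  const UdpxyPlayer({super.key});

  @override
  State<UdpxyPlayer> createState() => _UdpxyPlayerState();
}

class _UdpxyPlayerState extends State<UdpxyPlayer> {
  // Placeholder udpxy endpoint relaying a multicast group over HTTP.
  late final VlcPlayerController _controller = VlcPlayerController.network(
    'http://192.168.1.1:4022/udp/239.1.1.1:1234',
    hwAcc: HwAcc.full,
    autoPlay: true,
    options: VlcPlayerOptions(),
  );

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return VlcPlayer(
      controller: _controller,
      aspectRatio: 16 / 9,
      placeholder: const Center(child: CircularProgressIndicator()),
    );
  }
}
```

libVLC probes the raw MPEG-TS bytes itself, so the unhelpful application/octet-stream content type does not matter to it the way it can to ExoPlayer/AVFoundation.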

Nest Camera devices returning RTSPS stream URL - not able to play rtsps: stream

Does anybody have an idea how to play an RTSPS stream in a Flutter/Android player? I am fetching the stream URL from a Nest Camera API device, but it returns an RTSPS stream URL which is not playable. Can anyone guide me on how to play it? Just for info: it's an RTSPS URL stream, not RTSP.
I came across the following library for playing RTSPS streams; however, I'm experiencing a "Read timed out" error when attempting to play the rtsps:// stream returned by the Google Nest API.
https://github.com/alexeyvasilyev/rtsp-client-android
I would be very interested to know if this library works for you or if anyone has managed to play these rtsps streams on Android or iOS.

How to get audio volume (as stream) from WebRTC's MediaStream in Flutter?

I am using a Flutter mediasoup client to stream video+audio from the server.
This works well in general.
However, I now want to measure the audio level (i.e., loud/soft) of the incoming audio stream so that I can display an audio level indicator widget.
The underlying stream is this WebRTC class, but there doesn't seem to be any API to directly extract the audio level.
I found this thread in the flutter-webrtc repo, but it led to no concrete solution.
So I wonder if anyone has had success extracting the audio level from a WebRTC media stream.
Thank you.
I also posted this question in the flutter-webrtc GitHub repo and got a response that eventually led to a solution in my case:
https://github.com/flutter-webrtc/flutter-webrtc/issues/833
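For reference, the usual approach is to poll the peer connection's WebRTC stats and read the audio level field. A hedged sketch against flutter_webrtc's getStats surface; the stats key names ('audioLevel', 'audioOutputLevel') are assumptions that vary by platform and plugin version:

```dart
import 'dart:async';

import 'package:flutter_webrtc/flutter_webrtc.dart';

/// Polls the peer connection's stats at [interval] and emits audio levels,
/// nominally in the range [0, 1], suitable for driving a level indicator.
Stream<double> audioLevelStream(
  RTCPeerConnection pc, {
  Duration interval = const Duration(milliseconds: 200),
}) async* {
  while (true) {
    await Future.delayed(interval);
    final reports = await pc.getStats();
    for (final report in reports) {
      // Key names differ across platforms and WebRTC versions; inspect
      // report.type/report.values on your target platform and adjust.
      final level =
          report.values['audioLevel'] ?? report.values['audioOutputLevel'];
      if (level != null) {
        yield double.tryParse(level.toString()) ?? 0.0;
      }
    }
  }
}
```

A StreamBuilder listening to this stream can then drive the indicator widget; smoothing (e.g. a moving average) usually makes the meter less jumpy.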

How to create my own custom RTMP server from scratch and stream to multiple destinations

I am using the StreamYard service to live stream to multiple destinations like Facebook and YouTube. I want to create a mobile app using Flutter that can receive and display that live stream. StreamYard only supports RTMP for a custom destination.
My question is: is there a way to create and host a custom RTMP server online that can be shared between the StreamYard platform and my mobile app? I want it to work so that whenever I live stream from StreamYard, the stream goes to Facebook, YouTube, and my mobile app as well. I have done a lot of research, but the only way I have found is to use a Windows or Linux machine as a host, whereas I want it hosted online.
Also, alternative solutions are welcome like using webRTC.
Because you use StreamYard, I think you need to use the INVITE feature to start a video chat and then convert it to a live stream. It works like below:
UserA --WebRTC-->--+
                   +-->-- StreamYard -->--RTMP-->-- YouTube/Twitch.tv
UserB --WebRTC-->--+
You need to buy a non-free plan to get support for custom RTMP destinations, so you can publish the RTMP stream to your own media server, such as SRS or Nginx. Then you can broadcast to multiple destinations, like this:
StreamYard -->--RTMP--+->-- YouTube/Twitch.tv
                      |
                      +->-- Custom RTMP destination --+--RTMP--> YouTube/Twitch.tv
                            (SRS/Nginx media server)  |
                                                      +--HLS/FLV--> Flutter App
Note: once you are streaming to your RTMP server or a video cloud platform, you can convert the stream to HLS/HTTP-FLV for your Flutter app to play. About players and protocols, please read here. It depends on which parts you want to build yourself, and it's possible to build them from open-source projects.
Note: you could use StreamYard to stream to both YouTube and your custom RTMP server, or use FFmpeg to pull the stream from your custom server and then publish it to any other live streaming platform.
For this solution, StreamYard actually acts as a video chat or video conferencing platform, like Zoom. It transcodes each WebRTC stream and mixes all the audio and video into one RTMP stream.
So you could use a WebRTC server to build your own StreamYard, then use FFmpeg to transcode and mix the streams; but since that's off topic, let me stop here.