I have done a lot of research. Let me know how to live stream without a camera view using an RTMP server. I have tried the flutter_live, flutter_webrtc, and rtmp_publisher plugins.
Please describe your use scenario:
What do you want to use to publish to the RTMP server? OBS? FFmpeg? Chrome? Android? iOS?
What do you want to use to play the stream from the RTMP server? VLC? Chrome? Android? iOS?
Why do you need WebRTC?
What did you connect the camera to?
Which app or platform do you want it to be like? YouTube? Twitter? Discord? Zoom? TikTok?
And a conflicting question, because WebRTC always uses the camera:
In Flutter, how do you use WebRTC to stream video to RTMP?
How do you live stream without a camera?
Didn't get you.
Related
I want to know how I can get GStreamer to work in a Flutter application to show a video in a media player.
Any help and tips would be appreciated.
Do you really need GStreamer on the device to play the video?
The main point is the source of the video. You can play:
- a video file (https://pub.dev/packages/video_player)
- a WebRTC stream (https://pub.dev/packages/flutter_webrtc)
- an HLS stream (https://pub.dev/packages/video_player_web_hls)
all without GStreamer.
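For the plain video-file case, a minimal video_player sketch looks roughly like this (the URL is a placeholder; recent plugin versions take a Uri via networkUrl, older ones a String via network):

```dart
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class SimplePlayer extends StatefulWidget {
  const SimplePlayer({super.key});

  @override
  State<SimplePlayer> createState() => _SimplePlayerState();
}

class _SimplePlayerState extends State<SimplePlayer> {
  late final VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    // Placeholder URL; point this at your own file or HLS playlist.
    _controller = VideoPlayerController.networkUrl(
        Uri.parse('https://example.com/sample.mp4'))
      ..initialize().then((_) {
        setState(() {}); // rebuild once the first frame is available
        _controller.play();
      });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return _controller.value.isInitialized
        ? AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          )
        : const Center(child: CircularProgressIndicator());
  }
}
```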
If you need GStreamer, just use platform views to embed it and do the rendering natively on Android and iOS. That worked for me; a sketch of the Dart side follows.
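The Dart side of that approach is only a thin wrapper. A minimal sketch, assuming you have registered a native view factory under the hypothetical id 'gst-player' in your Android (Kotlin) and iOS (Swift) code, where the GStreamer pipeline actually renders:

```dart
import 'dart:io' show Platform;
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

/// Embeds a native view that a GStreamer pipeline renders into.
/// 'gst-player' is a hypothetical viewType; it must match the id you
/// register with a platform view factory on Android and iOS.
Widget gstPlayerView(String pipelineDescription) {
  final creationParams = <String, dynamic>{'pipeline': pipelineDescription};
  if (Platform.isAndroid) {
    return AndroidView(
      viewType: 'gst-player',
      creationParams: creationParams,
      creationParamsCodec: const StandardMessageCodec(),
    );
  }
  return UiKitView(
    viewType: 'gst-player',
    creationParams: creationParams,
    creationParamsCodec: const StandardMessageCodec(),
  );
}
```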
I am new to Flutter and I have to create a WebRTC-based video and audio calling app for both iOS and Android, without using a TURN server, with FCM for creating the connection between two users.
Could anyone please help me out with how we can achieve this functionality?
I need something very similar to PeerJS.
Any docs or videos explaining it would also help.
There is actually an existing plugin specifically for WebRTC in Flutter: flutter_webrtc. As for tutorials, try the WebRTC codelab and the good video tutorial "Realtime Communication with WebRTC in Flutter & Dart".
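To give a feel for the moving parts, here is a rough sketch of the caller side with flutter_webrtc. sendToPeer is a hypothetical callback that would deliver each payload to the other user as an FCM data message (FCM needs a backend that can send to the other device), and only a public STUN server is configured, since the question rules out TURN:

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

// Signalling half of a call; sendToPeer is a hypothetical helper that
// forwards the map to the callee through an FCM data message.
Future<RTCPeerConnection> startCall(
    void Function(Map<String, dynamic>) sendToPeer) async {
  // STUN only, no TURN server.
  final pc = await createPeerConnection({
    'iceServers': [
      {'urls': 'stun:stun.l.google.com:19302'},
    ],
  });

  // Attach the local camera/microphone stream.
  final stream = await navigator.mediaDevices
      .getUserMedia({'audio': true, 'video': true});
  for (final track in stream.getTracks()) {
    await pc.addTrack(track, stream);
  }

  // Trickle ICE: forward each candidate to the callee over FCM.
  pc.onIceCandidate = (candidate) {
    sendToPeer({'type': 'candidate', 'candidate': candidate.toMap()});
  };

  // Create and send the offer; the answer comes back the same way
  // and is applied with pc.setRemoteDescription(...).
  final offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToPeer({'type': 'offer', 'sdp': offer.sdp});
  return pc;
}
```

Note that without TURN, calls will fail whenever both peers sit behind symmetric NATs, which is the trade-off the question accepts.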
In which direction should I look for streaming Flutter camera audio/video to an RTMP media server?
We are developing a live broadcasting solution.
Our attempts so far:
I. PWA approach: the Ant Media WebRTC SDK was used to establish a WebRTC ICE candidate connection between the mobile web client and the Ant Media server.
Problem faced: the connection seems too heavy for some phones; only one of my phones was able to stream.
II. Looked in the direction of hybrid apps: I have been searching the internet for a Cordova/Ionic SDK for live streaming to an RTMP media server, with no luck so far. I found Bambuser (https://www.npmjs.com/package/cordova-plugin-bambuser), but the SDK seems tied to their own live streaming service.
Please, what do I need to know? I would also appreciate an expert review of the idea of combining the Flutter camera plugin's image-bytes access with FFmpeg for encoding and transporting the stream to an RTMP URL such as rtmp://SERVER_NAME/LiveApp/STREAM_ID (a sketch of the capture half follows).
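On the capture side, the camera plugin can hand you raw frames. A minimal sketch, where encodeAndSend is a hypothetical hook that would pass each frame to an encoder (an FFmpeg pipe or a native H.264 encoder) pushing to the RTMP URL above; the hard part, encoding and muxing, is deliberately left behind that hook:

```dart
import 'package:camera/camera.dart';

// Capture half only. encodeAndSend is a hypothetical callback where raw
// frames would be handed to an encoder that publishes to
// rtmp://SERVER_NAME/LiveApp/STREAM_ID.
Future<CameraController> startCapture(
    void Function(CameraImage) encodeAndSend) async {
  final cameras = await availableCameras();
  final controller =
      CameraController(cameras.first, ResolutionPreset.medium);
  await controller.initialize();

  // Delivers uncompressed frames (YUV420 on Android, BGRA on iOS);
  // this is the "image-bytes access" mentioned above.
  await controller.startImageStream((CameraImage image) {
    // image.planes[n].bytes holds the raw pixel data per plane.
    encodeAndSend(image);
  });
  return controller;
}
```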
I need to implement iPhone video streaming to a server. I've googled a lot, but I have only found how to receive video streams from a server, using UIWebView or MPMoviePlayer.
But how can I stream my captured media to a server in real time?
How can it be done?
Check out this Apple sample code; it uses AVFoundation:
StitchedStreamPlayer
I have just created an iPhone web app which has some x264 (MP4) video files on it. When I link directly to the file on the iPhone and the user taps the link, the video player loads and the video starts playing.
Using the app on an Android phone causes the browser to download the video instead of just playing it. Is there a way to force a video player to just boot up and play the video rather than downloading it?
Thanks in advance.
You should know that Android is quite strict about which video streams it will play. To be able to stream a video (progressively watch it while it downloads), the video container must be correctly formatted; for MP4/3GP this means the moov atom (the index metadata) has to sit at the start of the file, which a simple remux like `ffmpeg -i in.mp4 -c copy -movflags +faststart out.mp4` takes care of.
There are many ways to create a container suitable for progressive streaming. You can look it up here: http://groups.google.com/group/android-beginners/browse_thread/thread/2a801ce5f71b5aaf?pli=1
I have successfully created a streamable video. Try to open it from your browser: http://students.mimuw.edu.pl/~nh209484/Video000.3gp