I need to build a live broadcast app using Flutter, for both iOS and Android.
While broadcasting, the video can be recorded by both the broadcaster and the viewers (up to 500).
I have done some research and it seems like WebRTC is the way to go, but I need some additional info:
How is the broadcast done? Do I send to a server, and the server then sends to each viewer separately?
Can I put an overlay on the video while it is playing, e.g. text, icons, buttons, etc.?
I am using the StreamYard service to live stream to multiple destinations like Facebook and YouTube. I want to create a mobile app using Flutter that can receive that stream and display it. StreamYard only supports RTMP for a custom destination.
My question is: is there a way to create and host a custom RTMP endpoint online that can be shared between the StreamYard platform and my mobile app? I want it so that whenever I go live from StreamYard, the stream is shared to Facebook, YouTube, and my mobile app as well. I have done a lot of research, but all I have found is hosting it myself on a Windows or Linux machine, whereas I want it hosted online.
Alternative solutions, such as using WebRTC, are also welcome.
Because you use StreamYard, I think you need to use the invite feature to start a video chat and then convert it to a live stream. It works like below:
UserA --WebRTC--->---+
                     +--->- StreamYard -->-RTMP-->- YouTube/Twitch.tv
UserB --WebRTC--->---+
You need to buy a paid plan to get custom RTMP destinations, so that you can publish the RTMP stream to your own media server such as SRS or Nginx; then you can broadcast to multiple destinations, like this:
                     +->-- YouTube/Twitch.tv
                     |
StreamYard ->-RTMP---+->- Custom RTMP destination ---+--RTMP---> YouTube/Twitch.tv
                          (SRS/Nginx media server)   |
                                                     +--HLS/FLV--> Flutter App
Note: Once you are streaming to your RTMP server or a video cloud platform, you can convert the stream to HLS/HTTP-FLV for your Flutter app to play. About players and protocols, please read here. It depends on which parts you want to build yourself, and it is possible to build them with open-source projects.
Note: You could use StreamYard to stream to both YouTube and your custom RTMP server, or use FFmpeg to pull the stream from your custom server and then publish it to any other live streaming platform.
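As a rough illustration of the FFmpeg relay mentioned in the note above, here is a minimal sketch that pulls the stream from a custom RTMP server and republishes it to another platform without re-encoding. The server URL, stream key, and ffmpeg path are placeholders; it is wrapped in a small Swift Process call purely for illustration, and you could just as well run the same ffmpeg command directly from a shell.

import Foundation

// Sketch only: relay a stream from your own RTMP server to another
// platform using FFmpeg's stream copy (no transcoding).
// All URLs, the stream key, and the ffmpeg path are placeholders.
let ffmpeg = Process()
ffmpeg.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
ffmpeg.arguments = [
    "-i", "rtmp://your-server.example/live/livestream",  // pull from SRS/Nginx
    "-c", "copy",                                         // copy audio/video as-is
    "-f", "flv",
    "rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY"     // republish destination
]

do {
    try ffmpeg.run()
    ffmpeg.waitUntilExit()
} catch {
    print("Failed to launch ffmpeg: \(error)")
}

Because the relay uses -c copy, nothing is re-encoded, so it is cheap enough to run on the same machine as the media server.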
In this solution, StreamYard actually acts as a video chat or video conferencing platform, like Zoom. It transcodes each WebRTC stream and mixes all the audio and video into one RTMP stream.
So you could use a WebRTC server to build your own StreamYard and then use FFmpeg to transcode and mix the streams, but that is off topic, so let me stop here.
Hi everybody. I am trying to develop a web application that can control a Smart TV, as in this guide: http://samsungdforum.com/Guide/tut00024/index.html. That works fine, but now I would like to upload a video from a computer and have it display on the Smart TV, like the image shown in the tutorial. Does anyone have an idea, example, or suggestion about how to modify the code to do that? I would like to modify the Convergence tutorial code, which can send messages, so that the client application can also send video to the Smart TV application.
Sending files is covered by the tutorial. You can find the API reference for this here.
Sending a video file is not exactly a wise thing to do, because there is a 3 MB limit on files sent using the Convergence API. This API is designed for sending messages between the TV and an external client rather than files. If you want to launch video playback, send the video URL from the web app to the TV and let the TV download the video by itself.
I'm looking for a way to create an app that will allow captured camera video to be streamed to a computer. For example, one person could be walking an iPhone around a room and another person could have that video streamed to their computer. Something kind of like a one-way FaceTime, except the receiver is on a computer. Also, I can't just use an existing app, as later I would like to change the program to do some computer vision processing on the incoming data.
At the moment, I've found that AVFoundation should be the correct option for the video capture (from this question). However, I'm having difficulty finding the method by which I can actually stream this data. In particular, searching for how to create such an app on the iPhone frequently turns up existing apps that do the task, but not how to create the app.
Can anyone give me a pointer to the information on how to stream the video capture from the iPhone? Thank you much.
You can use "Wowza media Server" for Streaming purpose
For wowza media server doenload :
Wowza Download
After installing Wowza, you need to set up the live streaming settings; for that, see:
Setting Up Live Application
On the iOS side, there is a library that is useful for video streaming over an RTMP connection.
You can get the library at:
RTMP library for Streaming
For an example of using the library, see:
RTMP library for Streaming example
This is a good example of streaming from the iOS side.
I had success with the ANGL lib and Wowza Media Server. It gives a smooth RTMP stream.
I am working with a group on developing an app that will essentially be a 'radio' app: one view that just plays whatever audio is streaming at the time, and another view or two of archives for listening to past programs. What I am working on right now is how to assemble the view that plays the stream. The site in question is on-this-rock.org, and the source for playing is here.
Any suggestions for how best to go about building the player to stream the audio, without needing the rest of the site's graphics?
Thanks
The stream URL is actually:
http://s4.voscast.com:8080/
This is just a SHOUTcast stream. You can build your radio player to connect directly to it. No need for the HTML/Flash on the website itself.
You can find this easily by looking at the network tab in your browser's developer tools, or by using a tool such as Fiddler or Wireshark.
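If the app is an iOS app, one way to connect directly to that stream is to hand the URL to AVPlayer. Below is a minimal sketch, assuming the server delivers a plain MP3/AAC stream over HTTP; the RadioPlayer class name is just an illustration, and because the URL is plain HTTP you will need an App Transport Security exception in Info.plist.

import AVFoundation

final class RadioPlayer {
    private var player: AVPlayer?

    // Stream URL from the answer above. Being plain HTTP, it needs an
    // App Transport Security exception (NSAppTransportSecurity) in Info.plist.
    private let streamURL = URL(string: "http://s4.voscast.com:8080/")!

    func play() {
        // Configure the audio session so playback continues with the
        // silent switch on.
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
        try? AVAudioSession.sharedInstance().setActive(true)

        player = AVPlayer(playerItem: AVPlayerItem(url: streamURL))
        player?.play()
    }

    func stop() {
        player?.pause()
        player = nil
    }
}

For the archive views mentioned in the question, the same player can simply be pointed at the URLs of the recorded programs.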
How can I send video from an iPhone to an iPad?
I'm building a robot that is an iPhone controlling an Arduino. For the next phase, I would like to be able to send live streaming video from the iPhone to be viewed on an iPad, and have the iPad send commands to the iPhone.
So, how can I send live streaming video from one device to the other (Wi-Fi preferred, or Bluetooth), and how can I control one device wirelessly from the other?
EDIT:
The best example of what I intend to do is the Parrot AR.Drone and another app for the toy:
app clone to pilot the Quadracopter
The difference is that I would be getting the video from an iPhone and sending the control commands to the iPhone [from an iPad] as well, rather than using separate hardware.
Thanks a lot!
Most of the apps I've seen that do this use AVFoundation to capture data from the video camera, then push the frames to a server somewhere. You probably won't want to push every frame. For the receiving side of things, I would have a server hosting a web page with an HTML5 video tag pointing at an .m3u8 playlist. Have the files from the iPhone go into the playlist folder.
<video src="http://yourserver.com/path/to/stream/yourPlaylist.m3u8">
Your browser does not support the VIDEO tag
</video>
Then point your view on the iPad or computer at that web page. There is surely a more direct way of sending the files straight to the iPad for viewing, but I like being able to view the video from any browser :)
If you want to stay away from a web view on the iPad, you can also fetch the files as you would retrieve any file over a network. The web view is just the easiest way, in my opinion.
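For the capture side described at the top of this answer, a minimal AVFoundation sketch looks something like the following. Encoding, segmenting into the playlist folder, and uploading are left out, and the class name and the placeholder inside the delegate method are just illustrations.

import AVFoundation

// Sketch: grab frames from the camera and hand them to a delegate,
// where you would encode/segment them and upload them to your server.
// Requires an NSCameraUsageDescription entry in Info.plist.
final class CameraStreamer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        output.setSampleBufferDelegate(self, queue: frameQueue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every captured frame. As noted above, you probably won't
    // push every frame: throttle here, encode the rest, and upload the
    // resulting segments to the folder your .m3u8 playlist points at.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // encodeAndUpload(sampleBuffer)  // placeholder for your upload path
    }
}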
How to integrate Live555 in XCode (iOS SDK)
Hope this helps!