Smart home 2-way communication like Google Home Hub and Nest Hello - actions-on-google

Is there a trait for implementing real-time two-way communication with Google smart home devices, like the two-way audio/video on the Google Home Hub and Nest Hello?
I'm trying to develop an app that supports two-way audio/video with Google Home.
Which protocols should I support for two-way audio/video?

Related

Which devices support action.devices.traits.CameraStream based on WebRTC?

I have implemented CameraStream based on WebRTC. It doesn't work on Chromecast and says, "Sorry, it looks like the cameraName doesn't support streaming to remote screens."
But it works fine on the Google Nest Hub.
I'd like to know which devices support WebRTC.
Right now, Chromecast devices do not support the WebRTC protocol for camera streaming. However, we are working to expand the protocols our devices support. You can post a feature request at https://issuetracker.google.com/issues/new?component=655104&template=1284148
The new generation of the Google Nest Hub family of devices should provide support for WebRTC streaming.
Right now, Chromecast devices don't support the WebRTC protocol. If you want to use the CameraStream trait with your devices and stream to a Chromecast device, you can use one of the other supported protocols (such as HLS, DASH, Smooth Streaming, or progressive MP4). Teams are looking into expanding Chromecast's capabilities.
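For reference, the protocols a camera offers are declared through the cameraStreamSupportedProtocols attribute of the CameraStream trait in the SYNC response. A minimal Node.js fulfillment sketch (the device ID, name, and agentUserId are placeholders) could look like this:

// Sketch of a SYNC handler for a camera that advertises several stream protocols.
// Only the trait and attribute names come from the CameraStream documentation;
// everything else (IDs, names) is illustrative.
import { smarthome } from 'actions-on-google';

const app = smarthome();

app.onSync((body) => ({
  requestId: body.requestId,
  payload: {
    agentUserId: 'user-123', // placeholder
    devices: [
      {
        id: 'camera-1', // placeholder
        type: 'action.devices.types.CAMERA',
        traits: ['action.devices.traits.CameraStream'],
        name: { name: 'cameraName' },
        willReportState: false,
        attributes: {
          // Advertise every protocol your backend can actually serve. A Nest Hub
          // can negotiate "webrtc", while a Chromecast will pick one of the others.
          cameraStreamSupportedProtocols: ['webrtc', 'hls', 'dash', 'progressive_mp4'],
          cameraStreamNeedAuthToken: false,
        },
      },
    ],
  },
}));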

How to create my own custom RTMP server from scratch and stream to multiple destinations

I am using StreamYard to live stream to multiple destinations such as Facebook and YouTube. I want to create a mobile app using Flutter that can receive and display that live stream. StreamYard only supports RTMP for a custom destination.
My question is: is there a way to create and host a custom RTMP server online that can be shared between the StreamYard platform and my mobile app? I want it to work so that whenever I live stream from StreamYard, the stream goes to Facebook, YouTube, and my mobile app as well. I have done a lot of research, but the only way I have found is to host it myself on a Windows or Linux machine, whereas I want it hosted online.
Alternative solutions are also welcome, such as using WebRTC.
Because you use StreamYard, I think you need to use the invite feature to start a video chat and then convert it to a live stream; it works like the diagram below:
UserA --WebRTC--->---+
                     +--->- StreamYard -->-RTMP-->- YouTube/Twitch.tv
UserB --WebRTC--->---+
You need to buy a paid plan to get custom RTMP destinations, which lets you publish the RTMP stream to your own media server such as SRS or Nginx; then you can broadcast to multiple destinations, like this:
                    +->-- YouTube/Twitch.tv
                    |
StreamYard ->-RTMP--+->- Custom RTMP destinations --+--RTMP---> YouTube/Twitch.tv
                         (SRS/Nginx media server)   |
                                                    +--HLS/FLV--> Flutter App
Note: Once the stream is on your RTMP server or a video cloud platform, you can convert it to HLS/HTTP-FLV for your Flutter app to play. About players and protocols, please read here. It depends on which parts you want to build yourself; it is possible to build them from open-source projects.
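For example, a small open-source media server such as node-media-server can ingest the RTMP push and remux it to HTTP-FLV; a minimal sketch, assuming the default ports and an illustrative stream path:

// Sketch: ingest RTMP on port 1935 and remux it to HTTP-FLV on port 8000 using
// the open-source node-media-server package. Ports and stream names are
// illustrative; SRS or Nginx-rtmp could fill the same role.
import NodeMediaServer from 'node-media-server';

const nms = new NodeMediaServer({
  rtmp: {
    port: 1935,        // point StreamYard's custom RTMP destination here
    chunk_size: 60000,
    gop_cache: true,   // cache a GOP so players can start faster
    ping: 30,
    ping_timeout: 60,
  },
  http: {
    port: 8000,        // the Flutter app plays http://HOST:8000/live/STREAM_ID.flv
    allow_origin: '*',
  },
});

nms.run();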
Note: You could use StreamYard to stream to both YouTube and your custom RTMP server, or use FFmpeg to pull the stream from your custom server and then publish it to any other live streaming platform.
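A rough sketch of that FFmpeg relay, spawned from Node.js (the source URL, destination URL, and stream key are placeholders):

// Sketch: pull the stream from the custom RTMP server and republish it to
// another live streaming platform without re-encoding. URLs are placeholders.
import { spawn } from 'child_process';

const relay = spawn('ffmpeg', [
  '-i', 'rtmp://your-server/live/STREAM_ID',              // pull from your media server
  '-c', 'copy',                                           // copy codecs, no transcoding
  '-f', 'flv', 'rtmp://live.example.com/app/STREAM_KEY',  // push to the other platform
]);

relay.stderr.on('data', (chunk) => process.stderr.write(chunk)); // FFmpeg logs to stderr
relay.on('exit', (code) => console.log(`ffmpeg exited with code ${code}`));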
In this solution, StreamYard really acts as a video chat or video conferencing platform, like Zoom: it transcodes each WebRTC stream and mixes all the audio and video into one RTMP stream.
So you could build your own StreamYard with a WebRTC server and then use FFmpeg to transcode and mix the streams, but that is off topic, so let me stop here.

Flutter camera plugin to transcode and transport streams to a media server

What direction should I look in to send the Flutter camera's audio/video stream to an RTMP media server?
We are developing a live broadcasting solution. Our attempts so far:
I. PWA approach: the Ant Media WebRTC SDK was used to establish a WebRTC ICE-candidate connection between the mobile web page and the Ant Media server.
Problem faced: the connection seems too heavy for some phones; only one of my phones was able to stream through.
II. Hybrid approach: I have been searching the internet for a Cordova/Ionic SDK for live streaming to an RTMP media server, with no luck so far. I found Bambuser (https://www.npmjs.com/package/cordova-plugin-bambuser), but that SDK seems tied to their own live streaming service.
Please, what do I need to know? I also need an expert review of this idea: Flutter camera plugin image-bytes access + FFmpeg for encoding and transporting the stream to the RTMP URL rtmp://SERVER_NAME/LiveApp/STREAM_ID.
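To make that last idea concrete, here is a rough sketch of the encode-and-transport half only: raw camera frames are written to FFmpeg's stdin, encoded as H.264, and pushed to the RTMP URL. The frame size, pixel format, frame rate, and how the Flutter side delivers the image bytes are all assumptions:

// Sketch: feed raw frames into FFmpeg via stdin and push the encoded stream to RTMP.
// Frame geometry and pixel format are assumptions and must match what the app sends.
import { spawn } from 'child_process';

const WIDTH = 640, HEIGHT = 480, FPS = 30;

const ffmpeg = spawn('ffmpeg', [
  '-f', 'rawvideo', '-pix_fmt', 'yuv420p',
  '-s', `${WIDTH}x${HEIGHT}`, '-r', String(FPS),
  '-i', '-',                                        // raw frames arrive on stdin
  '-c:v', 'libx264', '-preset', 'veryfast', '-tune', 'zerolatency',
  '-f', 'flv', 'rtmp://SERVER_NAME/LiveApp/STREAM_ID',
]);

// Wherever the frames come from (e.g. a local socket fed by the Flutter camera
// plugin), write each raw frame buffer into the encoder:
function pushFrame(frame: Buffer) {
  ffmpeg.stdin.write(frame);
}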

Using video chat APIs for iOS without managing the server side

I am looking for ready-to-go APIs provided by platforms to handle video chatting and the like. There are examples like Twilio or OpenTok, but you have to build the server side yourself.
Are there any other options where I don't have to deal with the server?
I recommend trying ConnectyCube; they have a Swift code sample for video calling.
You do not need to bother with any server-side work and can concentrate solely on client-side app development.
Code samples https://developers.connectycube.com/ios/code-samples
Video Chat swift code sample https://github.com/ConnectyCube/connectycube-ios-samples/tree/master/sample-videochat-swift
Getting Started guide https://developers.connectycube.com/ios/
Video Chat API documentation https://developers.connectycube.com/ios/videocalling
The following features are supported:
1-1 video chat
Group video chat
Cross-platform
Screen sharing
WebRTC based
End-to-end Encryption by default
VP8/H264 video codecs supported
Mute/Unmute audio/video stream
Switch video input devices (cameras)
Video recording
Vidyo.io
You can use this; it's free for 2,000 minutes, and it works without your own server:
https://vidyo.io/platform/mobile/

How do media browser plugins function?

If I want to use Google Video Chat in my browser, I have to download and install a plugin for it to work. I would like to make a piece of software that creates some interactions with a video displayed in the browser. I assume it might be problematic to do this with one solution for all browsers, so if I need to focus on only one browser, let's talk about Firefox, although I think the Firefox Add-on SDK would not let me do something as complex as video interaction.
But how does the Google Video Chat plugin work in the browser? It's just one example of those plugins that let you do things with your browser (media, in this case) that are normally impossible.
As I understand it, Google Video Chat uses Flash.
I'm looking for something official-looking to back that up now...
Edit: I think this explains it pretty well.
Flash Player exposes certain audio/video functions to the (SWF) application, but it does not give the application access to the raw real-time audio/video data. There are some ActionScript API classes and methods: the Camera class allows you to capture video from your camera, the Microphone class allows you to capture audio from your microphone, the NetConnection/NetStream classes allow you to stream video from Flash Player to a remote server and vice versa, and the Video class allows you to render video either captured by a Camera or received on a NetStream. Given these, to display video in Flash Player it must be either captured by a Camera object or received from a remote server on a NetStream. Luckily, ActionScript allows you to choose which Camera to use for capture.
When the Google plugin is installed, it exposes itself as two Camera devices, actually virtual device drivers. These devices are called 'Google Camera Adaptor 0' and 'Google Camera Adaptor 1', which you can see in the Flash Player settings when you right-click on the video. One of the devices is used to display the local video and the other to display the remote participant's video. The Google plugin also implements the full networking protocol and stack, which I think is based on the GTalk protocol. In particular, it implements XMPP with the (P2P) Jingle extension, and a UDP-based media transport for real-time audio/video. The audio path is completely independent of the Flash Player. In the video path, the plugin captures video from the actual camera device installed on your PC and sends it to the Flash Player via one of the virtual camera device drivers. It also encodes the video and sends it to the remote user. In the reverse direction, it receives video (over UDP) from the remote user and gives it to the Flash Player via the second virtual camera device driver. The SWF application running in the browser creates two Video objects and attaches them to two Camera objects, one for each of the two virtual video devices, instead of attaching them to your real camera device. This way, the SWF application can display both the local and the remote video in the Flash application.