Which devices are supported by action.devices.traits.CameraStream with WebRTC? - actions-on-google

I have implemented CameraStream based on WebRTC. It doesn't work on Chromecast and says "Sorry, it looks like the cameraName doesn't support streaming to remote screens."
But it works fine on the Google Nest Hub.
I'd like to know which devices are supported with WebRTC.

Right now, Chromecast devices do not support the WebRTC protocol for camera streaming. However, we are working to expand the supported protocols on our devices. You can post a feature request at https://issuetracker.google.com/issues/new?component=655104&template=1284148
The new generation of the Google Nest Hub family of devices should provide support for WebRTC streaming.

Right now, Chromecast devices don't support the WebRTC protocol. If you want to use the CameraStream trait with your devices and stream to a Chromecast device, you can use one of the other supported protocols. Teams are looking into expanding Chromecast's capabilities.
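For context, the protocols a camera offers are declared in the CameraStream trait's attributes in the SYNC response; if I understand the trait correctly, listing a fallback protocol alongside "webrtc" is what lets displays without WebRTC support still receive a stream. A minimal sketch in Python (the device id and name are placeholders, and treating "hls" as the Chromecast-compatible fallback is my assumption):

```python
# Sketch of a smart home SYNC response fragment for a camera device.
# The attribute names follow the CameraStream trait; the id, name, and
# the choice of "hls" as fallback protocol are illustrative assumptions.
camera_device = {
    "id": "camera-1",                       # placeholder device id
    "type": "action.devices.types.CAMERA",
    "traits": ["action.devices.traits.CameraStream"],
    "name": {"name": "Backyard camera"},    # placeholder name
    "willReportState": False,
    "attributes": {
        # Advertise WebRTC for devices that support it (e.g. newer
        # Nest Hubs) and HLS as a fallback for displays that don't.
        "cameraStreamSupportedProtocols": ["webrtc", "hls"],
        "cameraStreamNeedAuthToken": True,
    },
}
```

The platform then requests a stream in one of the advertised protocols, so the EXECUTE handler has to be prepared to return a URL for whichever protocol is negotiated.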

Related

Flutter camera plugin to transcode and transport streams to Media Server

What direction should I look in for streaming Flutter camera audio/video to an RTMP media server?
We are developing a live broadcasting solution.
Our attempts so far:
I. PWA approach: the Ant Media WebRTC SDK was used to establish a WebRTC ICE candidate connection between the mobile web client and the Ant Media server.
Problem faced: the connection seems too heavy for some phones; only one of my phones was able to stream through.
II. Looked in the direction of a hybrid app: I have been searching the internet for a Cordova/Ionic SDK for live streaming to an RTMP media server, with no luck so far. I found Bambuser (https://www.npmjs.com/package/cordova-plugin-bambuser), but the SDK seems tied to their own live streaming service.
Please, what do I need to know? I also need an expert review of using the Flutter camera plugin's image-bytes access plus ffmpeg for encoding and transporting streams to an RTMP URL such as rtmp://SERVER_NAME/LiveApp/STREAM_ID
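The image-bytes-plus-ffmpeg idea in the question can be sketched as piping raw frames into an ffmpeg subprocess that encodes H.264 and publishes FLV over RTMP. A hedged sketch in Python (the frame size, pixel format, and encoder settings are assumptions to adjust against what the Flutter camera plugin actually delivers):

```python
import subprocess

def ffmpeg_rtmp_command(width: int, height: int, fps: int, rtmp_url: str) -> list:
    """Build an ffmpeg command line that reads raw frames from stdin and
    publishes them as H.264/FLV to an RTMP endpoint.

    The rgba input format is an assumption; Flutter camera image streams
    are often YUV420, in which case -pix_fmt must be changed to match.
    """
    return [
        "ffmpeg",
        "-f", "rawvideo",        # input: raw, headerless frames on stdin
        "-pix_fmt", "rgba",      # assumed frame format; adjust to the plugin's output
        "-s", f"{width}x{height}",
        "-r", str(fps),
        "-i", "-",               # read from stdin
        "-c:v", "libx264",
        "-preset", "veryfast",
        "-tune", "zerolatency",  # favor latency over compression for live use
        "-pix_fmt", "yuv420p",   # widely compatible output pixel format
        "-f", "flv",             # RTMP carries an FLV container
        rtmp_url,
    ]

# Usage sketch: spawn ffmpeg and write each frame's bytes to its stdin.
# proc = subprocess.Popen(
#     ffmpeg_rtmp_command(1280, 720, 30, "rtmp://SERVER_NAME/LiveApp/STREAM_ID"),
#     stdin=subprocess.PIPE,
# )
# proc.stdin.write(frame_bytes)  # repeat per frame
```

Note this handles video only; muxing in microphone audio would need a second ffmpeg input, which is beyond this sketch.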

Which protocol should be used to send video stream to a media server for live streaming?

I am working on an iOS application to stream live video from an iPhone to a media server and then make it available to a larger audience using RTSP.
Which protocol or method should I use to send the video stream to the server?
Thanks.
HTTP Live Streaming is not designed for your needs; it's for server-to-client delivery, and I won't comment on the huge delay it implies.
You had better check the RTSP or RTMP protocols, and the LivU blog.
For cellular:
Apple seems to make a distinction between apps that are used just for streaming content from servers and those that are used for some type of conferencing.
I think VoIP-type apps are safe, and it seems like GoCoder presenter-type apps don't have issues either. There's no official page detailing this, but there is some mention under what Apple considers a VoIP app.
No app has issues if it streams over Wi-Fi only.

Stream video on Android

I have a video stream that I used in an iPhone application. I'm now porting the application to Android, so I want to use the same stream.
As Apple requires, I created an HTTP Live Stream (media segmenter, m3u8 file, etc.). You can find the stream here: http://envue.insa-lyon.fr/smartphone/aloun_stream/prog_index.m3u8
I want to use this same stream on Android. Has anyone had the same or a similar experience?
Honeycomb/Android 3.0 has limited support for HLS. Anything before that does not have built-in support, but there are supposedly third-party SDKs that will do it; searching, however, shows a lot of people who can't get hold of the third-party developers.
Check the Android dev docs to find out what is not supported.
I've given up on the m3u8 stream. I just used MP4s with Android's streaming capabilities.
You have to use a WebSocket to continuously fetch the TS files, as Apple defines, and send them to a player to decode the H.264+AAC within the TS packets.
Check Android 4.0 - it claims to support HTTP Live Streaming 3.0 fully, including HTTPS. For older versions I've seen some people recommending it, but I haven't tried it myself.
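For anyone unfamiliar with what the media segmenter produces: an m3u8 playlist is just a text index of the TS segments, and a device "supports HLS" if it can fetch and play segments from such an index. A minimal illustrative playlist (segment filenames and durations are made up, not taken from the stream above):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fileSequence0.ts
#EXTINF:10.0,
fileSequence1.ts
#EXT-X-ENDLIST
```

A live stream omits the #EXT-X-ENDLIST tag and keeps appending segments, which is why players must re-poll the playlist.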

Possible to Stream Video on iphone/iPad Using Windows Encoder?

Hello all. I use Windows Encoder to stream video online and have a server that I use to broadcast this stream. I am trying to make an app that streams video to the iPhone/iPad using a unique link. I have seen apps that stream from their own DVR cameras, so there must be a type of converter or encoder to use. Any suggestions?
The short answer is no, not at this time. The iPhone/iPad/iPod Touch work natively with the Apple HTTP Adaptive segmented streaming protocols. MMS (Windows Media) streams are not compatible with "i" devices and will not play. You will need to look into encoding your video with this other format. Check out the Apple specs for a full description of the protocol. Future versions of Windows Media Services (4.0) are claiming that they will support the Apple protocols but this is only a preview/beta at this time and may not truly support the Apple specs.
If you're trying to do on-demand iPhone video, you can utilize a service such as Encoding.com to pre-encode your files in the adaptive segmented format for your users to view. For live encoding, Telestream has a product called Wirecast which can encode in an H.264 Apple-approved baseline format, which can be sent to a service such as Akamai, Multicast Media, or Wowza Server for distribution to your clients.

What's the equivalent of FMS for iphone?

Since the iPhone doesn't support Flash at all, what is the equivalent of FMS for the iPhone?
Is it Darwin Streaming Server?
You'll want to use what Apple refers to as HTTP Live Streaming. Follow that link to Apple's Overview documentation. That should give you enough to start with, as well as links to more detailed documentation.
In particular, take note of the Requirements for Apps section. Here Apple lays out the required use of HTTP Live Streaming for iOS apps that will be delivering video over the cellular network.
You can stream H.264 video on the iPhone with Wowza Media Server 2.