I want to add a feature that plays a YouTube live broadcast in my Unity WebGL project. How should I implement it? - unity3d

Every plugin I have found for playing video runs into a CORS cross-domain problem. Is there another way to implement YouTube live playback in WebGL? One option is to set up a server that downloads the video and then relays it, but that is too much traffic for the server.

Related

upload video on a Convergence Application

Hi everybody. I am trying to develop a web application that can control a Smart TV, following this guide: http://samsungdforum.com/Guide/tut00024/index.html. That works fine, but now I would like to upload a video from the computer and have it displayed on the Smart TV, as shown in the tutorial's image. Does anyone have an idea, example, or suggestion for modifying the Convergence tutorial code so that the client application can send a message or a video to the Smart TV application?
Sending files is covered by the tutorial. You can find the API reference for this here.
Sending a video file is not exactly a wise thing, because there is a 3 MB limit on files sent using the Convergence API. The API is designed for sending messages between the TV and an external client rather than files. If you want to launch video playback, send the video URL from the web app to the TV and let the TV download the video by itself.

iOS - Develop iPhone app to stream camera video to a computer?

I'm looking for a way to create an app that will allow captured camera video to be streamed on a computer. For example, one person could be walking an iPhone around a room and another person could have that video streamed on their computer. Something kind of like a one-way Facetime except the receiver is on a computer. Also, I can't just use an existing app as later I would like to change the program to do some computer vision processing on the incoming data.
At the moment, I've found that AV Foundation should be the correct option for the video capture (from this question). However, I'm having difficulty finding the method by which I can actually stream this data. In particular, searching for how to create the apps on the iPhone frequently results in existing apps that do the task, but not how to create the app.
Can anyone give me a pointer to the information on how to stream the video capture from the iPhone? Thank you much.
You can use "Wowza media Server" for Streaming purpose
For wowza media server doenload :
Wowza Download
After installing wowza Now you need to set up live setting in wowza for that purpose you need:
Setting Up Live Application
For iOS side there is library is useful for video streaming using RTMP connection
You can get Library at
RTMP library for Streaming
Library example
RTMP library for Streaming example
In this good example of Streaming from iOS side
I had success with ANGL lib and Wowza media server. It gives smooths RTMP stream.
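To sanity-check the Wowza side of such a setup, a minimal Flash subscriber can play back whatever the phone publishes. This is only a sketch: the host, the application name "live", and the stream name "myStream" are assumptions that must match your Wowza live application configuration.

    // AS3 sketch: watch the live stream the iPhone publishes to Wowza.
    // Assumes this runs inside a Sprite/MovieClip (for addChild).
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.events.NetStatusEvent;
    import flash.media.Video;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://your-wowza-host/live"); // placeholder host/app

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            // Empty onMetaData handler avoids async errors on live metadata.
            ns.client = { onMetaData: function(md:Object):void {} };
            var video:Video = new Video(640, 360);
            video.attachNetStream(ns);
            addChild(video);
            ns.play("myStream"); // name the encoder publishes under
        }
    }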

Stream Audio off site for iOS app?

I am working with a group developing an app that will essentially be a 'radio' app: one view that just plays whatever audio is streaming at the time, and another view or two of archives for listening to past programs. What I am working on right now is how to assemble the playback view. The site in question is on-this-rock.org, and the source for playing is here.
Any suggestions for how I can best go about building the player to stream in the audio, without needing the rest of the site graphics?
Thanks
The stream URL is actually:
http://s4.voscast.com:8080/
This is just a SHOUTcast stream. You can build your radio player to connect directly to it. No need for the HTML/Flash on the website itself.
You can find this easily by looking at your browser tool's network tab, or by using a tool such as Fiddler or Wireshark.
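For reference, the site's existing Flash player is doing essentially nothing more than the following sketch; your iOS player only needs to open the same URL with its own audio API. The "/;" suffix is the usual trick to get SHOUTcast's raw MP3 stream rather than its HTML status page.

    // AS3 sketch: connect straight to the SHOUTcast stream.
    import flash.media.Sound;
    import flash.media.SoundChannel;
    import flash.net.URLRequest;

    var request:URLRequest = new URLRequest("http://s4.voscast.com:8080/;");
    var radio:Sound = new Sound();
    radio.load(request);                      // begins buffering the live stream
    var channel:SoundChannel = radio.play();  // starts playback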

Flash Media Server live streaming with multiple video files

I am using Flash Media Server 4.5, and I read in the tutorial that if I want to stream a live feed, I may need to use Flash Media Live Encoder. But what I found is that in Media Live Encoder I have to set everything up manually, and it only supports camera devices.
In my case, however, I keep receiving multiple video files from another program. My goal is to use Flash Media Server to perform a live broadcast with these video files, one by one.
That means that while clients watch the stream, they will not notice that the server is playing mov1, then mov2, then mov4, then mov5, and so on.
Also, can FMS dynamically create a new streaming session (invoked by code), so that when client A uploads some video files to the server, FMS opens a new streaming session that streams only client A's video files?
Can FMS achieve these purposes? Any tutorial would be very helpful!
Edit for Open Bounty
I basically want to deliver a live stream of video where a list of videos is the source. I am currently using Flash Media Server with the CloudFront CDN to deliver content. So if I have video1, video2, and video3, I want to play them back to back as a live stream (so no skipping ahead in the video). Is it possible to do this? The bounty goes to a clever workaround. Think of this as a television channel.
I have been working with live streaming technologies for the past year and a half. There is no option in Flash Media Live Encoder for any kind of file encoding.
1. To encode your files, you can use a DVD player (or some other device that supports USB playback) and broadcast the DVD player's output using Flash Media Live Encoder.
2. The other option is to set up Windows Media Encoder, which supports file encoding (no DVD player needed), but it only works with Windows Media Services.
At present I webcast video files live this way for my company: http://www.malar.tv/live.php
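Note that FMS itself can also queue recorded files onto a single server-side stream with Server-Side ActionScript, which behaves exactly like the television channel described in the bounty. A minimal sketch (the stream name "tvchannel" and the file names are assumptions; the videos are assumed to sit in the application's streams folder):

    // main.asc -- Server-Side ActionScript for the FMS application.
    application.onAppStart = function() {
        // Create a server-side stream that clients subscribe to as "live".
        application.channel = Stream.get("tvchannel");
        if (application.channel) {
            // reset=true on the first call clears the playlist;
            // reset=false on later calls appends, so the files play
            // back to back and viewers cannot seek ahead.
            application.channel.play("mp4:video1.mp4", 0, -1, true);
            application.channel.play("mp4:video2.mp4", 0, -1, false);
            application.channel.play("mp4:video3.mp4", 0, -1, false);
        }
    };

Clients then play "tvchannel" like any live stream and receive whatever the server is currently feeding it. Because Stream.get() can be called at any time from application logic (for example, from a server-side method the client invokes over NetConnection.call), the server can also create a separate stream per uploading client, which covers the second half of the question.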

How do media browser plugins function?

If I want to use Google Video Chat in my browser, I have to download and install a plugin for it to work. I would like to make a piece of software that creates some interactions with a video displayed in the browser.
I assume it might be problematic to do this with one solution for all browsers, so if I need to focus on only one browser, let's talk about Firefox, although I think the Firefox Add-on SDK would not let me do something as complex as video interaction.
But how does the Google Video Chat plugin work in the browsers? It's only an example of one of those plugins that lets you do activities (media, in this case) with your browser which are normally impossible.
As I understand it, Google Video Chat uses Flash.
I'm looking for something official-looking to back that up now...
Edit: I think this explains it pretty well.
Flash Player exposes certain audio/video functions to the (SWF) application, but it does not give the application access to the raw real-time audio/video data. There are some ActionScript API classes and methods: the Camera class allows you to capture video from your camera, the Microphone class allows you to capture audio from your microphone, the NetConnection/NetStream classes allow you to stream video from Flash Player to a remote server and vice versa, and the Video class allows you to render video either captured by a Camera or received on a NetStream. Given these, to display video in Flash Player it must either be captured by a Camera object or received from a remote server on a NetStream. Luckily, ActionScript allows you to choose which Camera to use for capture.
When the Google plugin is installed, it exposes itself as two Camera devices; actually virtual device drivers. These devices are called 'Google Camera Adaptor 0' and 'Google Camera Adaptor 1', which you can see in the Flash Player settings when you right-click on the video. One device is used to display the local video and the other to display the remote participant's video. The Google plugin also implements the full networking protocol and stack, which I think are based on the GTalk protocol. In particular, it implements XMPP with the (P2P) Jingle extension, and UDP-based media transport for transporting real-time audio/video. The audio path is completely independent of the Flash Player. In the video path, the plugin captures video from the actual camera device installed on your PC and sends it to the Flash Player via one of the virtual camera device drivers. It also encodes and sends the video to the remote user. In the reverse direction, it receives video (over UDP) from the remote user and gives it to the Flash Player via the second virtual camera device driver. The SWF application running in the browser creates two Video objects and attaches them to two Camera objects, one for each virtual video device, instead of attaching them to your real camera device. This way, the SWF application can display both the local and remote video.
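In ActionScript terms, the SWF side of that arrangement is ordinary Camera/Video wiring. A sketch of how an application would attach one of the plugin's virtual devices (the device name is taken from the explanation above; the video size is arbitrary):

    // AS3 sketch: render one of the plugin's virtual camera devices.
    // Assumes this runs inside a Sprite/MovieClip (for addChild).
    import flash.media.Camera;
    import flash.media.Video;

    // Camera.names lists every video device Flash Player can see,
    // including virtual ones such as "Google Camera Adaptor 0".
    var index:int = Camera.names.indexOf("Google Camera Adaptor 0");
    if (index >= 0) {
        // Camera.getCamera() takes the device index as a String.
        var cam:Camera = Camera.getCamera(String(index));
        var video:Video = new Video(320, 240);
        video.attachCamera(cam); // shows whatever frames the plugin injects
        addChild(video);
    }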