Multiple videos for a player in Agora Unity SDK - unity3d

I have the Agora SDK integrated in my current project. Whenever a user joins, I instantiate a prefab with a VideoSurface component and render that user's video on it. Now, if I create another prefab and try to render the same player's video on it as well, things get messed up and the video for that player stops working. The main issue: I can't have multiple video surfaces rendering a single user's video. The reason I need this is that at one stage in my game I want to display the video in a rectangle and elsewhere in a round masked image, so I create separate prefabs. Any help would be much appreciated.

This is actually possible and has been tested many times. Please share how you manage the VideoSurface creation and the assignment of the joined remote user.
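For reference, here is a minimal sketch of binding two surfaces to the same remote uid when the user joins (assuming the Agora Unity SDK 3.x C# API with VideoSurface.SetForUser/SetEnable; the prefab and parent fields are placeholders for your rectangle and round-mask prefabs):

using agora_gaming_rtc;
using UnityEngine;

public class RemoteVideoSpawner : MonoBehaviour
{
    // Placeholders: your own rectangle prefab and round-masked prefab,
    // each with a VideoSurface component on a RawImage.
    public GameObject rectVideoPrefab;
    public GameObject roundVideoPrefab;
    public Transform rectParent;
    public Transform roundParent;

    private IRtcEngine mRtcEngine;

    void Start()
    {
        mRtcEngine = IRtcEngine.QueryEngine();
        mRtcEngine.OnUserJoined += HandleUserJoined;
    }

    private void HandleUserJoined(uint uid, int elapsed)
    {
        // Two independent surfaces can be bound to the same remote uid.
        BindSurface(Instantiate(rectVideoPrefab, rectParent), uid);
        BindSurface(Instantiate(roundVideoPrefab, roundParent), uid);
    }

    private void BindSurface(GameObject go, uint uid)
    {
        VideoSurface surface = go.GetComponent<VideoSurface>();
        surface.SetForUser(uid);                                    // bind this surface to the remote user
        surface.SetVideoSurfaceType(AgoraVideoSurfaceType.RawImage);
        surface.SetEnable(true);
    }
}

If one of the surfaces stays blank, check that SetForUser is called on every instantiated copy, not just the first one.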

Related

Technology to use for creating a mobile app which can detect and track the position of random objects in real time?

I would like to know which technology I can use to create an Android application that detects a random object, such as a hoarding board, and, after detection, plays a video over that object. If the user moves the camera, the video must also move with the object's position.
There is an example of what I want to achieve in this Big Burger video: https://www.youtube.com/watch?v=lhXW8_7CaHM . However, I want to play the video on any type of hoarding the camera detects, not only on a specific trained object.
I have studied some technologies like TensorFlow and Vuforia, which can be used to build AR applications.
But I'm not sure whether they can detect objects in real time while tracking their position.

Video player in Google Daydream VR SDK

I am trying to render 2D video content using the Google VR SDK. I tried to follow the documentation on video viewports on the dev page, but it seems obsolete and incomplete, and the samples on GitHub give no reference on how to create 2D quads and render an external texture onto them; only a 360° video sample is available. Any reference or help would be appreciated, thanks!
Playing a video is pretty easy using the Google VR SDK. All you need to do is create a Quad object (from 3D Objects), drag the GvrVideoTexturePlayer script onto it (from the downloaded SDK), and set the mesh's material to VideoMaterial (also from the downloaded SDK). Then just select the video type (Dash/HLS) and enter the URL, and the video should play.
Make sure you test on a device, as the video won't play in the emulator. For 360° videos, replace the quad with a sphere; the rest is the same.
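If you prefer to wire this up from code instead of the Inspector, here is a hedged sketch of the same steps; the component and field names (GvrVideoPlayerTexture, videoType, videoURL) are taken from the SDK's video demo and may differ in your SDK version, and the stream URL is a placeholder:

using UnityEngine;

public class FlatVideoQuad : MonoBehaviour
{
    public Material videoMaterial;   // assign VideoMaterial from the downloaded SDK

    void Start()
    {
        // 1. Create the quad (swap in a sphere for 360° content).
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.transform.SetParent(transform, false);
        quad.GetComponent<MeshRenderer>().material = videoMaterial;

        // 2. Attach the SDK's video texture player and point it at a stream.
        //    Component and field names assumed from the SDK's video demo.
        GvrVideoPlayerTexture player = quad.AddComponent<GvrVideoPlayerTexture>();
        player.videoType = GvrVideoPlayerTexture.VideoType.Dash;   // or HLS
        player.videoURL = "https://example.com/stream.mpd";        // placeholder URL
    }
}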

How to implement 360° video and sound in Unity3D for VR

Can someone please give me some insight into the pipeline for implementing 360° video and sound in VR? I know a thing or two about video editing, sound and Unity3D, but I would like to know more about how to do all of these things for VR. Let's say I want to shoot a 360° video and then put it into VR, but I also want to incorporate the captured sound. I would also like to have some interactive spots on it.
Edit: If I want to add interactive spots, does that mean I need separate 360° cameras shooting from the spots where I want the interaction to happen, or will a single video shot with one camera allow for that?
Thank you
First you have to choose a target platform, e.g. iOS, Android, etc. Then you have to find a video player that supports 360° video, such as AVPro Media Player from Unity3D's Asset Store.
For interactive spots in the video you have to make a local, database-like store, e.g. an XML file holding the position of each trigger and the time at which it should perform an activity, as sketched below. Hope this helps.
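As a rough illustration of that XML idea (everything here, from the file layout to the OnHotspotTriggered hook, is a made-up example rather than part of AVPro or any other plugin), each hotspot could carry a direction on the video sphere plus a trigger time, and a small script fires the action when playback reaches it:

<!-- hotspots.xml -->
<hotspots>
  <hotspot time="12.5" yaw="40" pitch="-10" action="showMenu" />
</hotspots>

using System.Collections.Generic;
using System.Xml;
using UnityEngine;

public class HotspotTimeline : MonoBehaviour
{
    private struct Hotspot { public float time; public float yaw; public float pitch; public string action; }

    public TextAsset hotspotXml;                          // drop hotspots.xml here in the Inspector
    private readonly List<Hotspot> hotspots = new List<Hotspot>();
    private int next;

    void Start()
    {
        var doc = new XmlDocument();
        doc.LoadXml(hotspotXml.text);
        foreach (XmlNode node in doc.SelectNodes("/hotspots/hotspot"))
        {
            hotspots.Add(new Hotspot
            {
                time   = float.Parse(node.Attributes["time"].Value),
                yaw    = float.Parse(node.Attributes["yaw"].Value),
                pitch  = float.Parse(node.Attributes["pitch"].Value),
                action = node.Attributes["action"].Value
            });
        }
        hotspots.Sort((a, b) => a.time.CompareTo(b.time));
    }

    // Call every frame with the current playback time
    // (e.g. from the media player's position/seek API).
    public void Tick(float videoTimeSeconds)
    {
        while (next < hotspots.Count && hotspots[next].time <= videoTimeSeconds)
        {
            OnHotspotTriggered(hotspots[next]);
            next++;
        }
    }

    private void OnHotspotTriggered(Hotspot h)
    {
        // Placeholder: place a gaze target on the video sphere at (yaw, pitch)
        // and run whatever the action string names.
        Debug.Log("Trigger '" + h.action + "' at " + h.time + "s");
    }
}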

Easy Movie Texture and Google Cardboard Buttons

I'm making an app for 360° video with Google Cardboard. I have the Easy Movie Texture plugin, but I can't find a way to make a UI. The controls appear on the left side. Is there a way to make them follow the movement of the head?
Create a canvas and attach it as a child of CardboardMain.
I use this method along with a reticle canvas to allow selection via gaze.
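A minimal sketch of that setup done at runtime (the CardboardMain hierarchy names vary between SDK versions, so here the head camera is simply assigned in the Inspector):

using UnityEngine;

public class HeadLockedUI : MonoBehaviour
{
    public Camera headCamera;       // the camera under CardboardMain (the head)
    public Canvas controlsCanvas;   // the canvas holding your Easy Movie Texture controls
    public float distance = 2f;     // metres in front of the viewer

    void Start()
    {
        controlsCanvas.renderMode = RenderMode.WorldSpace;
        controlsCanvas.worldCamera = headCamera;

        // Parent the canvas to the head so it follows every head movement.
        Transform t = controlsCanvas.transform;
        t.SetParent(headCamera.transform, false);
        t.localPosition = new Vector3(0f, 0f, distance);
        t.localRotation = Quaternion.identity;
        t.localScale = Vector3.one * 0.002f;   // scale pixel sizes down to world units
    }
}

The reticle-based gaze selection mentioned above then works against this canvas the same way as against any other world-space canvas.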

Pause/freeze a scene with a trackable active in Vuforia Unity 3D

I am developing an app with Vuforia Cloud Recos. I want to add a feature that allows the user to pause the page so she does not have to keep pointing the device at the target to view the trackable. This is pretty useful when I want to show text. Is there any way to achieve that in Unity3D? A good example is Microsoft's Here City Lens app, which includes a button to pause the page, as the screenshot shows.
You could take a screenshot of the screen and apply it to a UI Image object. That is, if you do not need the camera feed anymore.
If you need interaction with the elements, I would take a screenshot of the camera feed only, without the items. Get the AR camera's transform, apply it to a new camera, and disable the AR camera. Then apply the screenshot to a background plane covering the whole screen. Keep the items enabled as well; they simply stop listening to Vuforia. You are pretty much recreating a basic Unity scene: the items are not moved by Vuforia, the camera is, so they are still in the middle and you only need to know where the camera was when you took the shot. Your scene is complete.
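Here is a hedged sketch of the simple screenshot-to-UI variant (the arCamera and freezeImage references are placeholders you wire up yourself; the frame is read back with ReadPixels at the end of the frame and shown on a full-screen RawImage):

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class FreezeFrame : MonoBehaviour
{
    public Camera arCamera;        // Vuforia's ARCamera
    public RawImage freezeImage;   // full-screen RawImage on a screen-space canvas, disabled by default

    public void Pause()
    {
        StartCoroutine(CaptureAndFreeze());
    }

    public void Resume()
    {
        freezeImage.gameObject.SetActive(false);
        arCamera.gameObject.SetActive(true);
    }

    private IEnumerator CaptureAndFreeze()
    {
        // Wait until rendering has finished before reading the screen.
        yield return new WaitForEndOfFrame();

        var shot = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        shot.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        shot.Apply();

        freezeImage.texture = shot;
        freezeImage.gameObject.SetActive(true);

        // For the interactive variant described above, copy arCamera's transform
        // to a plain camera here before disabling the AR camera.
        arCamera.gameObject.SetActive(false);
    }
}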