Rendering video overlay with other unity cameras - unity3d

I have a question about the video overlay function of Tango in Unity.
I want to display the video overlay image together with NGUI elements.
When I turn on the 'Enable video overlay' option in the TangoApplication script, the output of every other camera in the Unity scene disappears, except for UI created by Unity itself (OnGUI and UGUI components).
Is there a way to use other Unity cameras together with the video overlay?
Thank you.

Related

How to pause/freeze the AR Camera in Unity?

I am making an AR game with Unity.
I used the common 'Time.timeScale = 0' approach to implement pausing, but this turns the AR Camera screen into a black screen.
In addition, if I add the AR occlusion manager in this state, the app crashes with a corrupted screen.
When using an AR Camera, what is the proper way to implement a temporary pause?
The view currently shown by the AR camera should remain visible as a frozen frame.
What I would do is take a screenshot:
ScreenCapture.CaptureScreenshot("TestImage.png");
then display it on a RawImage:
RawImage qrRenderer;
qrRenderer.texture = screenshotTexture; // texture loaded from the saved screenshot
and at that point you can disable your camera:
camera.enabled = false;
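Putting that together, here is a minimal sketch of the idea (field names such as arCamera and freezeImage are illustrative, not from the question): capture the rendered frame as a texture at the end of the frame, show it on a full-screen RawImage, then disable the AR camera and pause time.

using UnityEngine;
using UnityEngine.UI;
using System.Collections;

public class FreezeFramePause : MonoBehaviour
{
    public Camera arCamera;       // the AR camera whose view should be frozen
    public RawImage freezeImage;  // full-screen RawImage that will show the frozen frame

    public IEnumerator Pause()
    {
        // wait so the current frame is fully rendered before grabbing it
        yield return new WaitForEndOfFrame();

        // capture the screen into a Texture2D instead of writing a file to disk
        Texture2D shot = ScreenCapture.CaptureScreenshotAsTexture();

        freezeImage.texture = shot;
        freezeImage.gameObject.SetActive(true);

        arCamera.enabled = false; // stop rendering the live camera feed
        Time.timeScale = 0f;      // pause gameplay
    }
}

Start it with StartCoroutine(Pause()); to resume, re-enable the camera, hide the RawImage, destroy the captured texture, and set Time.timeScale back to 1.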

Transparent video in Android AR Scene

I am trying to display a transparent video in an AR scene using Unity ARFoundation on the Android platform.
Specifically, I want the same simple effect shown here for the iOS platform: https://www.youtube.com/watch?v=vralbqaeqrk
In a normal 3D application I use a transcoded .webm file and achieve the intended result.
Using the same solution in an AR (ARCore) scene, the background color is visible.
Are there specialized/dedicated assets for this? Or should I stop dreaming about such a result with Unity and Android?
You need to make sure that your video clip actually has an alpha channel; then just enable the 'Keep Alpha' property in the video import settings and hit Apply. The alpha will only show up if the source video really contains it.
Then attach a VideoPlayer component to the GameObject that has a MeshRenderer.
Make sure the Render Mode is 'Material Override'; the Material Property tells Unity which map of the material the video output will be written to.
If you want to play it on UI, just create a RenderTexture, assign it to a RawImage, and point the VideoPlayer at it with the following settings.
Lastly, make sure the RenderTexture you created supports alpha.
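For the UI route described above, a rough sketch of a script-based setup (the clip, resolution, and field names are assumptions, not from the answer): play the clip into a RenderTexture with an alpha-capable format and show that texture on a RawImage.

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class AlphaVideoOnUI : MonoBehaviour
{
    public VideoClip alphaClip;   // .webm clip imported with "Keep Alpha" enabled
    public RawImage targetImage;  // UI element that will display the video

    void Start()
    {
        // ARGB32 keeps the alpha channel; a format without alpha would drop it
        var rt = new RenderTexture(1920, 1080, 0, RenderTextureFormat.ARGB32);

        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = alphaClip;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = rt;
        player.isLooping = true;
        player.Play();

        targetImage.texture = rt; // the default UI material respects the texture's alpha
    }
}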

Unity how to play video on plane without being affected by lighting

I'm trying to play a video in Unity. I figured it can be done with the VideoPlayer, so I put a VideoPlayer with a RenderTexture on a Plane and created a Material for the Plane that utilises this texture to render the video.
This works fine.
However, the video appears way too bright compared to the original content. You can see this in these pictures:
I found this video https://www.youtube.com/watch?v=KG2aq_CY7pU which looked promising, but after all it didn't change the result; the video still appears way too bright.
Here is how I configured the Material
This is how the GO structure is set up
Config of the VideoPlayer
and the config of the Plane
How can I play the video just "normal" like I would see it in VLC Player or any other video player?
P.S. Not sure if it is important, but I'm working on an AR project with Vuforia, where the video is supposed to play on the image target. However, it makes no difference whether I play the video on the target or in a plain Unity scene; the effect is the same.
Thank you!
Don't use emission. Instead, simply use a shader that is Unlit (basically meaning no lighting is applied; the texture is always rendered at full brightness).
In your case you can simply use the built-in shader
Unlit -> Texture and you should be fine.
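If you prefer to set it up from script, a small sketch along those lines (the videoTexture field is an assumption, pointing at the RenderTexture your VideoPlayer writes to):

using UnityEngine;

public class UnlitVideoPlane : MonoBehaviour
{
    public RenderTexture videoTexture; // the RenderTexture the VideoPlayer renders into

    void Start()
    {
        // Unlit/Texture ignores scene lighting, so the video keeps its original brightness
        var mat = new Material(Shader.Find("Unlit/Texture"));
        mat.mainTexture = videoTexture;
        GetComponent<MeshRenderer>().material = mat;
    }
}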

How can I turn off camera video background in Unity ARKit

I'm trying to build a "lights-off" feature in my ARKit app, where I basically turn off the camera video background and only show my 3D content, sort of like VR.
Could someone please help me figure out how to turn off the video background? I can see that a material called YUVMaterial is being used to render the camera texture, but setting it to a single color covers the entire screen and doesn't show my 3D content either.
You can disable (uncheck) the UnityEngine.XR.ARFoundation.ARCameraBackground component on the main AR camera; it will then just render a black background.
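If you want to toggle it at runtime rather than in the Inspector, a minimal sketch (the component reference is assumed to be wired up in the Inspector):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class LightsOffToggle : MonoBehaviour
{
    public ARCameraBackground cameraBackground; // the component on the AR camera

    public void SetLightsOff(bool lightsOff)
    {
        // with the background disabled, the camera clears to its background color (black here)
        cameraBackground.enabled = !lightsOff;
    }
}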
In Unity, you can switch between cameras while using ARKit. The main camera has a runtime video texture applied to it, so what you see is not the raw device camera, but a rendering of what the device camera sees. By switching cameras, you can effectively "turn off" the background video image but still take advantage of the ARKit tracking. Have fun.
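A sketch of that camera-switching idea (both fields are assumptions; the content camera would need to be a child of the tracked AR camera so it inherits the pose, with its clear flags set to a solid color):

using UnityEngine;

public class BackgroundSwitcher : MonoBehaviour
{
    public Camera arCamera;      // camera that renders the video background
    public Camera contentCamera; // child of arCamera, Clear Flags = Solid Color (black)

    public void ShowVideoBackground(bool show)
    {
        arCamera.enabled = show;
        contentCamera.enabled = !show;
    }
}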

How to play an Alpha video inside an Unlit sphere in Unity?

I'm trying to play an alpha video inside a sphere that has a StereoPanoSphereMaterial using the shader "GoogleVR/Demo/VideoDemo InsideShader", which is used to play a 360-degree video.
I'm using the GoogleVR/Unlit/TransparentOverlay shader on a Quad to show my alpha video inside the sphere. It appears to run fine in the editor, but on the device the Quad, which is supposed to be transparent, shows up blank and the video cannot be seen.
I've tried other built-in shaders as well, such as FX/Flare and Unlit/Transparent, to no avail.
Any tips on how to play an alpha video inside my 360-degree video sphere?
Game View in the Editor
Scene View in the Editor
View on Phone
So, for anybody following this thread: the problem is that the GVR SDK apparently does not support the .mov format at the moment, even though Unity does. We converted the video to .webm, and while it does not work to the best of our expectations, it does the job of playing the alpha video inside the sphere with the stereoscopic video for now.
Also note that, of the shaders that ship with Unity, only the FX/Flare shader works well with this format for playing videos in VR.
P.S. I also tried placing a PNG image as a component inside the view; even that is not working on the device for now, even though it works in the Unity editor.