I am trying to display a transparent video in an AR scene using Unity ARFoundation on the Android platform.
Specifically, I want to reproduce the simple effect shown for the iOS platform here: https://www.youtube.com/watch?v=vralbqaeqrk
In a normal 3D application I use a transcoded .webm file and achieve the intended result.
Using the same solution in an AR (ARCore) scene, however, the video's background color is visible.
Are there specialized/dedicated assets for this? Or should I stop dreaming of such a result with Unity and Android?
First make sure your video clip actually has an alpha channel, then enable the Keep Alpha property in the video import settings and hit Apply. Transparency will only show if the source video has an alpha channel.
Then attach a Video Player component to the GameObject that has a Mesh Renderer.
Make sure the Render Mode is set to Material Override; the Material Property field tells Unity which texture map of the material the video output will be written to.
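If you'd rather wire this up from script than in the Inspector, a minimal sketch could look like the following (it assumes the Video Player and Mesh Renderer sit on the same GameObject and that your material actually exposes a "_MainTex" property; adjust the names to your setup):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch: route the video into the renderer's material via Material Override.
// Assumes the VideoPlayer and MeshRenderer are on the same GameObject and the
// material has a "_MainTex" texture property.
[RequireComponent(typeof(VideoPlayer), typeof(MeshRenderer))]
public class VideoOnMaterial : MonoBehaviour
{
    void Start()
    {
        var player = GetComponent<VideoPlayer>();
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<MeshRenderer>();
        player.targetMaterialProperty = "_MainTex"; // which map of the material receives the video
        player.isLooping = true;
        player.Play();
    }
}
```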
If you want to play it on UI, create a Render Texture, assign it to a RawImage, and point the Video Player at it with the following settings.
Lastly, make sure the Render Texture you created supports alpha.
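For the UI route, a rough sketch (the RawImage and VideoPlayer references are assumptions about your scene; the important part is creating the Render Texture with a format that keeps alpha):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Sketch: play the video into a RenderTexture that keeps alpha, then show it on a RawImage.
public class VideoOnRawImage : MonoBehaviour
{
    public RawImage target;    // assign your RawImage in the Inspector
    public VideoPlayer player; // assign your VideoPlayer in the Inspector

    void Start()
    {
        // ARGB32 keeps an alpha channel; some default formats discard it.
        var rt = new RenderTexture(1280, 720, 0, RenderTextureFormat.ARGB32);
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = rt;
        target.texture = rt;
        player.Play();
    }
}
```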
I'm trying to play a video in Unity. I figured it can be done with the VideoPlayer, so I put a VideoPlayer with a RenderTexture on a Plane and created a Material for the Plane that utilises this texture to render the video.
This works fine.
However, the video appears way too bright compared to the original content. You can see this in these pictures:
I found this video https://www.youtube.com/watch?v=KG2aq_CY7pU which looked promising, but in the end it didn't change the result; the video still appears way too bright.
Here is how I configured the Material
This is how the GO structure is set up
Config of the VideoPlayer
and the config of the Plane
How can I play the video just "normal" like I would see it in VLC Player or any other video player?
P.S. Not sure if it is important, but I'm working on an AR project with Vuforia, where the video is supposed to be played on the image target. It makes no difference whether I play the video on the target or in a plain Unity scene; the effect is the same.
Thank you!
Don't use emission. Instead, simply use an Unlit shader (meaning it ignores scene lighting and shadows and always appears fully lit).
In your case you can simply use the built-in shader
Unlit -> Texture and you should be fine.
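If you prefer to do it from code instead of swapping the material in the Inspector, something like this sketch should work (assuming the plane has a Mesh Renderer and you pass in the Render Texture your VideoPlayer writes to):

```csharp
using UnityEngine;

// Sketch: use the built-in Unlit/Texture shader so the video is shown at its
// original brightness, unaffected by scene lighting or emission.
public class UseUnlitTexture : MonoBehaviour
{
    public RenderTexture videoTexture; // the texture your VideoPlayer renders into

    void Start()
    {
        var unlit = new Material(Shader.Find("Unlit/Texture"));
        unlit.mainTexture = videoTexture;
        GetComponent<MeshRenderer>().material = unlit;
    }
}
```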
Basically, I have a problem with the ARFoundation package in Unity3D.
First, I generate an avatar model with the AvatarSDK. That part is not really important for this problem, but anyway: it returns a Skinned Mesh Renderer with blendshapes.
What I want to do next is control those blendshapes with my own face through the front camera. For this, my scene contains ARSession/ARInputManager and ARSessionOrigin/ARFaceManager.
Then, when my model is generated, I enable the ARFaceManager component to track my face.
What I see next is the front camera image with my virtual model rendered on top of it.
The problem is that I need to track the face with the front camera, but I do not want to see the front camera image on screen.
Can this be solved?
In order to do this, you need to create a new skybox cubemap material first.
After you've done so, navigate to AR Session Origin > Main Camera in your hierarchy and find the AR Camera Background component.
Click the Use Custom Material checkbox and load your newly created skybox material as the custom material.
This will override the default camera material setting.
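The same override can also be done from script if that fits your workflow better; here is a sketch, assuming your ARFoundation version exposes useCustomMaterial/customMaterial on ARCameraBackground:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: replace the default camera feed material with your own skybox material.
// Attach this to the AR camera that has the ARCameraBackground component.
public class OverrideCameraBackground : MonoBehaviour
{
    public Material skyboxMaterial; // the cubemap/skybox material you created

    void Start()
    {
        var background = GetComponent<ARCameraBackground>();
        background.useCustomMaterial = true;
        background.customMaterial = skyboxMaterial;
    }
}
```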
Good luck!
I'm using Unity 2019.3.0a2 and created the project with the HDRP template. In my scene I have a plane with a material using the HDRP/Lit shader. The plane has a Video Player component with the Render Mode set to "Material Override", the renderer target set to the plane's own renderer, and the Material Property set to "_MainTex".
The problem is that I see no video; I can only hear the audio. The video properties are as follows:
format .mp4, length 6:00, size 1280x720, 30 fps.
I already updated to the latest Unity version and to the latest High Definition Render Pipeline available, which for me is v6.5.3, and there is still no video. Is there any fix for this?
You need to do the following in HDRP:
Create a new material
Change the material Shader to HDRP/Unlit
Drag and drop your RenderTexture onto Surface Inputs --> Color
The HDRP/Lit shader does not have a "_MainTex" property. The property you want to render to is "_BaseColorMap".
Change the Material Property value to "_BaseColorMap".
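From script, that would roughly look like the sketch below (assuming the VideoPlayer and the plane's Mesh Renderer are on the same GameObject):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: point the Material Override at HDRP/Lit's base color map instead of "_MainTex".
public class HdrpVideoTarget : MonoBehaviour
{
    void Start()
    {
        var player = GetComponent<VideoPlayer>();
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<MeshRenderer>();
        player.targetMaterialProperty = "_BaseColorMap"; // HDRP/Lit has no "_MainTex"
        player.Play();
    }
}
```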
Found the problem. Apparently the best approach is to create a Render Texture, set the Video Player's render mode to "Render Texture", and assign the newly created Render Texture to it; on the material you then just assign that same texture to the albedo. The details are in this video:
https://youtu.be/KG2aq_CY7pU
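In script form the same idea is roughly this (a sketch only; "_BaseColorMap" is the HDRP/Lit albedo property, while for the built-in pipeline you would set mainTexture instead):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: render the video into a RenderTexture, then feed that texture to the material's albedo.
public class VideoViaRenderTexture : MonoBehaviour
{
    public RenderTexture videoTexture; // the Render Texture asset you created
    public VideoPlayer player;

    void Start()
    {
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = videoTexture;
        GetComponent<MeshRenderer>().material.SetTexture("_BaseColorMap", videoTexture);
        player.Play();
    }
}
```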
I'm trying to build a "lights-off" feature in my ARKit app, where I basically turn off the camera video background and only show my 3D content, sort of like VR.
Could someone please help me figure out how to turn off the video background? I can see that a material called YuVMaterial is being used to render the camera texture, but setting that to a single color covers the entire screen and doesn't show my 3D content either.
You can disable the UnityEngine.XR.ARFoundation.ARCameraBackground component on the main AR camera; it will then just render a black background.
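In code, a minimal sketch of the same idea (attach it to the AR camera):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: toggle the camera feed for a "lights-off" mode.
public class LightsOff : MonoBehaviour
{
    ARCameraBackground cameraBackground;

    void Awake()
    {
        cameraBackground = GetComponent<ARCameraBackground>();
    }

    public void SetLightsOff(bool off)
    {
        // With the background disabled the camera simply clears to its background color.
        cameraBackground.enabled = !off;
    }
}
```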
In Unity, you can switch between cameras while using ARKit. The main camera has a runtime spherical video applied to it, so it's not actually your device camera, but a rendering of what the device camera sees. By switching cameras, you can effectively "turn off" the background video image but still take advantage of the ARKit properties. Have fun.
I'm trying to play an alpha video inside a sphere that has a StereoPanoSphereMaterial using the shader "GoogleVR/Demo/VideoDemo InsideShader" and is used to play a 360-degree video.
I'm using the GoogleVR/Unlit/TransparentOVerlay shader on a Quad to run my alpha video inside the sphere. It appears to run fine in the editor, but when I run it on my device it just shows a blank Quad that is supposed to be transparent, and the video cannot be seen either.
I've tried playing it with other built-in shaders as well, like FX/Flare and Unlit/Transparent, to no avail.
Any tips on how to play an alpha video inside my 360-degree video sphere?
Game View in the Editor
Scene View in the Editor
View on Phone
So, for anybody following this thread: the problem is that the GVR SDK apparently does not support the .mov format as of now, even though Unity does. We converted the video to .webm format, and while it doesn't quite meet our expectations, it does the job of playing the alpha video inside the sphere that plays the stereoscopic video for now.
Also note that of the shaders that ship with Unity, only the FX/Flare shader works well with this format for playing videos in VR.
P.S. I also tried placing a PNG image as a component inside the view; even that is not working for now, even though it works in the Unity editor.