I'm trying to build a "lights-off" feature in my ARKit app, where I basically turn off the camera video background and only show my 3D content, sort of like VR.
Could someone please help me figure out how to turn off the video background? I can see that a material called YUVMaterial is being used to render the camera texture, but setting it to a single color covers the entire screen and doesn't show my 3D content either.
You can uncheck the UnityEngine.XR.ARFoundation.ARCameraBackground component on the main AR camera; the camera will then just render a black background.
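If you want to toggle that at runtime, here is a minimal sketch, assuming an ARFoundation scene where the component sits on the AR Camera (the script and field names are hypothetical):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper: toggles the camera video background at runtime.
public class LightsOffToggle : MonoBehaviour
{
    [SerializeField] ARCameraBackground cameraBackground; // component on the AR Camera

    public void SetLightsOff(bool lightsOff)
    {
        // With ARCameraBackground disabled, the video feed is no longer drawn,
        // leaving only the camera's clear color behind your 3D content.
        cameraBackground.enabled = !lightsOff;
    }
}

You may also want to set the AR camera's Clear Flags to Solid Color (black) so nothing from the last video frame lingers.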
In Unity, you can switch between cameras while using ARKit. The main camera doesn't show your device camera directly; it shows a runtime rendering of what the device camera sees, applied as the video background. By switching cameras, you can effectively "turn off" the background video image but still take advantage of ARKit tracking. Have fun.
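As a sketch of that camera switch, assuming a second camera parented under the AR camera with its Clear Flags set to Solid Color (black); the names here are my own, not part of any API:

using UnityEngine;

// Hypothetical helper: swaps rendering from the AR camera to a plain black-background camera.
public class CameraSwitcher : MonoBehaviour
{
    [SerializeField] Camera arCamera;  // the ARKit-driven camera with the video background
    [SerializeField] Camera vrCamera;  // child of arCamera, Clear Flags = Solid Color (black)

    public void SetLightsOff(bool lightsOff)
    {
        // Because vrCamera is a child of the tracked AR camera, it inherits the
        // device pose, so the 3D content still follows ARKit tracking.
        arCamera.enabled = !lightsOff;
        vrCamera.enabled = lightsOff;
    }
}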
I am making an AR game with Unity.
I used the common 'Time.timeScale = 0' approach to implement the pause function, but this turns the AR Camera view into a black screen.
In addition, if the AR Occlusion Manager is enabled in this state, the app crashes with a glitched screen.
When using an AR Camera, what is the proper way to implement a temporary pause?
Ideally, the view seen through the AR camera should stay frozen on the last frame.
What I would do is capture the current frame as a texture and display it on a full-screen RawImage, for example:

Texture2D frozenFrame = ScreenCapture.CaptureScreenshotAsTexture();
pauseImage.texture = frozenFrame; // pauseImage is a full-screen RawImage

and at that point you can disable your camera:

arCamera.enabled = false;
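Note that CaptureScreenshotAsTexture only gives a reliable result once the frame has finished rendering, so in practice I would wrap this in a coroutine. A minimal sketch, with hypothetical component and field names:

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical pause helper: freezes the view on the last rendered frame.
public class ARPause : MonoBehaviour
{
    [SerializeField] RawImage pauseImage; // full-screen RawImage, initially inactive
    [SerializeField] Camera arCamera;

    public void Pause()
    {
        StartCoroutine(FreezeFrame());
    }

    IEnumerator FreezeFrame()
    {
        // Wait until the current frame has been fully rendered before grabbing it.
        yield return new WaitForEndOfFrame();
        pauseImage.texture = ScreenCapture.CaptureScreenshotAsTexture();
        pauseImage.gameObject.SetActive(true);
        arCamera.enabled = false;
        Time.timeScale = 0f; // optionally freeze game time as well
    }
}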
I am trying to display a transparent video in an AR scene using Unity ARFoundation on the Android platform.
Specifically, I want to reproduce a simple effect like the one shown here for iOS: https://www.youtube.com/watch?v=vralbqaeqrk
In a normal 3D application I use a transcoded .webm file and achieve the intended result.
Using the same solution in an AR (ARCore) scene, however, the video's background color is visible.
Are there specialized/dedicated assets I could use, or should I stop dreaming about such a result with Unity and Android?
First, make sure your video clip actually has an alpha channel, then enable the Keep Alpha property in the video import settings and hit Apply. Transparency will only show up if the source video really contains alpha.
Then attach a Video Player component to the GameObject that has the Mesh Renderer.
Make sure the Render Mode is set to Material Override; the Material Property field tells Unity which texture map of the material the video output is written to.
If you want to play it on UI instead, create a Render Texture, assign it to a RawImage, and point the Video Player at that Render Texture (Render Mode set to Render Texture).
Lastly, make sure the Render Texture you created uses a format that supports alpha.
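If you prefer to wire the UI route up from script rather than in the Inspector, a rough sketch could look like this (the class and field names are my own, and the 1920x1080 size is just an example):

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Hypothetical setup: plays a video with alpha into a RawImage via a RenderTexture.
public class TransparentVideoUI : MonoBehaviour
{
    [SerializeField] VideoPlayer videoPlayer;
    [SerializeField] RawImage targetImage;

    void Start()
    {
        // ARGB32 keeps the alpha channel; some default formats discard it.
        var rt = new RenderTexture(1920, 1080, 0, RenderTextureFormat.ARGB32);
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = rt;
        targetImage.texture = rt;
        videoPlayer.Play();
    }
}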
I was going to use the VideoPlayer to render to Camera Near Plane, but I also want to display subtitles for the video for the sake of accessibility. I'm wondering what the best way to do that is.
I can't see anything on a canvas if I render to the Near Plane. I'd like the video to appear in front of the scene so that the scene is already there once the video is complete.
Do I need to be using a render texture to achieve this? Seems like a render texture might incur some unnecessary overhead for my purposes, but I could be wrong.
The idea is this:
Far Background - Scene
Background - Black Image (so I can fade to the scene)
Middleground - Video
Foreground - Subtitles
More info:
This is a 2D point and click adventure game with a pre-rendered cutscene.
You could do this with a render texture placed in front of the camera at an exact distance and size, but I wouldn't. It would probably need to be a separate camera anyway for lighting or clipping purposes.
I would use a second Camera, rendering over top of the Main Camera, with the subtitle UI's canvas targeting the second camera's screen space, and clearing depth only. It will render what it sees, but with a totally transparent background. Then, you can render your video on either the main camera's near plane or the new subtitle camera's far plane.
You could put your black square in front of this camera too, though it would be in front of the video. It could instead be UI on the main camera, or you could stick a third camera in between them. You might have to worry about performance if there are too many cameras, but I have used two or three before with no noticeable performance hit.
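As a rough sketch of that second-camera setup (the component and field names are mine, and the canvas is assumed to be Screen Space - Camera with the subtitles on the UI layer):

using UnityEngine;

// Hypothetical setup: an overlay camera that renders only the subtitle UI
// on top of the main camera, with a transparent background.
public class SubtitleOverlaySetup : MonoBehaviour
{
    [SerializeField] Camera mainCamera;
    [SerializeField] Camera subtitleCamera;
    [SerializeField] Canvas subtitleCanvas;

    void Start()
    {
        subtitleCamera.clearFlags = CameraClearFlags.Depth;   // clear depth only, keep what's behind
        subtitleCamera.cullingMask = LayerMask.GetMask("UI"); // render just the UI layer
        subtitleCamera.depth = mainCamera.depth + 1;          // draw after the main camera

        subtitleCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        subtitleCanvas.worldCamera = subtitleCamera;
    }
}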
Robert Mocks's answer is perfectly tenable and makes sense to me. Thank you for that!
What I decided to do instead was use a RawImage so that I wouldn't have to deal with extra cameras. This way I can use the canvas as I normally would and don't have to deal with render textures.
This involves using the API Only setting along with the following code:
rawImage.texture = videoPlayer.texture;
That seems to work well for me.
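For completeness, this is roughly what that looks like in context; it is a sketch that assumes the VideoPlayer is set to API Only, since videoPlayer.texture is only valid once the player has been prepared:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Hypothetical glue: shows an API Only VideoPlayer's output on a RawImage.
public class VideoOnRawImage : MonoBehaviour
{
    [SerializeField] VideoPlayer videoPlayer;
    [SerializeField] RawImage rawImage;

    void Start()
    {
        videoPlayer.renderMode = VideoRenderMode.APIOnly;
        videoPlayer.prepareCompleted += OnPrepared; // texture becomes valid after preparation
        videoPlayer.Prepare();
    }

    void OnPrepared(VideoPlayer vp)
    {
        rawImage.texture = vp.texture;
        vp.Play();
    }
}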
I have a question about the video overlay function of Tango in Unity.
I want to display the video overlay image together with NGUI elements.
When I turn on the 'Enable video overlay' option in the TangoApplication script, the output of the other cameras in my Unity scene disappears, except for UI created by Unity itself (OnGUI and UGUI components).
Is there a way to use other Unity cameras together with the video overlay?
Thank you.
I am developing an app with Vuforia Cloud Recos. I want to add a feature that lets the user pause the page so she does not have to keep pointing the device at the target to view the trackable. This is particularly useful when I want to show text. Is there any way to achieve that in Unity3D? A good example is Microsoft's Here City Lens app, which includes a button to pause the page, as the screenshot shows.
You could take a screenshot of the screen and apply it to an Image UI object. That works if you do not need the camera feed anymore.
If you need interaction with the elements, I would take a screenshot of the camera feed only, without the items. Get the AR camera's transform, apply it to a new camera, and disable the AR camera. Then apply the screenshot to a background plane covering the whole screen. Keep the items active as well; they simply stop listening to Vuforia. You are pretty much recreating a basic Unity scene: the items are not moved by Vuforia, the camera is, so they are still in the middle and you only need to know where the camera was when you took the shot. Your scene is complete.
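A rough sketch of the camera part of that idea (the names are mine, and the screenshot/background-plane step is only hinted at in the comments):

using UnityEngine;

// Hypothetical pause: freezes the viewpoint by copying the AR camera's pose
// onto a regular camera, then disabling the AR (Vuforia) camera.
public class VuforiaPause : MonoBehaviour
{
    [SerializeField] Camera arCamera;     // the Vuforia AR camera
    [SerializeField] Camera frozenCamera; // a normal camera, initially inactive

    public void PauseView()
    {
        // Keep the exact viewpoint from the moment of pausing.
        frozenCamera.transform.SetPositionAndRotation(arCamera.transform.position,
                                                      arCamera.transform.rotation);
        frozenCamera.fieldOfView = arCamera.fieldOfView;

        arCamera.gameObject.SetActive(false);
        frozenCamera.gameObject.SetActive(true);

        // A screenshot of the camera feed (without the items) can then be shown
        // on a full-screen background plane or UI image behind the 3D items.
    }
}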