Error on Main Camera in Unity: "The renderer used by this camera doesn't support camera stacking. Only Base camera will render."

I just updated my Unity version from 2018 to 2019. I've also set up the new render pipeline for the 2D lights feature, and now there is a weird white flickering when my character moves, and sometimes even without movement. I think the Main Camera warning quoted in the title is the cause, but I could be wrong. Does anyone have a fix?

I am not sure about the flickering you describe, but as for the error in your title: I had a similar problem, and it was because I had two cameras in the scene, one Overlay and one Base. The Overlay camera was for my UI, and I was using the Universal Render Pipeline with the 2D Renderer, which does not yet support camera stacking. So I cannot stack multiple cameras; when I try, only the Base camera renders.
https://forum.unity.com/threads/does-urp-support-ui-canvas-overlay.768077/

You may try selecting the Main Camera, then in the Inspector open the "Rendering" section and set the "Renderer" to the one from your UniversalRenderPipelineAsset.
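If you prefer to make the same change from a script, here is a minimal sketch; it assumes URP is installed and that the renderer you want is at index 0 of the pipeline asset's Renderer List:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

// Minimal sketch: switch the camera's renderer from code instead of
// the Inspector. Attach to the Main Camera.
public class SetCameraRenderer : MonoBehaviour
{
    void Start()
    {
        var cameraData = GetComponent<Camera>().GetUniversalAdditionalCameraData();
        // 0 is an assumption: the index into the Renderer List of the
        // active UniversalRenderPipelineAsset.
        cameraData.SetRenderer(0);
    }
}
```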

This is because you are using the 'Deferred' rendering path, and camera stacking is only supported with Forward rendering.
Either select Forward rendering on your Universal Render Pipeline asset, or create a new render pipeline asset that uses Forward rendering and assign it to the camera.
If you are unsure which one to choose, read: https://docs.unity3d.com/Manual/RenderingPaths.html
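For completeness, a minimal sketch of building a camera stack from code once the pipeline asset uses a compatible (Forward) renderer; the two camera references are placeholders you would assign in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class StackOverlayCamera : MonoBehaviour
{
    // Placeholder references, assigned in the Inspector.
    public Camera baseCamera;
    public Camera overlayCamera;

    void Start()
    {
        // Mark the second camera as an Overlay camera.
        overlayCamera.GetUniversalAdditionalCameraData().renderType = CameraRenderType.Overlay;

        // Add it to the Base camera's stack. This only works when the
        // Base camera's renderer supports stacking (Forward path).
        baseCamera.GetUniversalAdditionalCameraData().cameraStack.Add(overlayCamera);
    }
}
```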

Related

Make a URP renderer feature affect only the current camera

I'm making a renderer feature with a single ScriptableRenderPass. This renderer feature is present on a single 2D Renderer, and I have a single camera that uses this renderer and only affects a particular layer: the camera renders everything on the PixelPerfect layer and ignores anything else. This camera is part of a camera stack.
But, somehow, the renderer feature on the Downscaled Camera affects the Background Camera. I suspect that the render pass somehow sees everything from the previous cameras, but I have no idea how that even makes sense, as when singling out only the Downscaled Camera, I only see the layer that I have set the camera to cull.
I'm Blitting to renderingData.cameraData.renderer.cameraColorTarget in Execute.
I've found this post on the GameDev Stack Exchange; it predates URP and scriptable renderer features, but it describes my problem perfectly. Any thoughts?
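For context, a minimal sketch of a pass like the one described, with a per-camera guard in Execute. The targetCamera field is a hypothetical addition; checking renderingData.cameraData.camera is one way to keep a renderer feature shared across a stack from running for every camera:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

class DownscalePass : ScriptableRenderPass
{
    // Hypothetical: the one camera this pass should affect,
    // handed in by the renderer feature.
    readonly Camera targetCamera;

    public DownscalePass(Camera target)
    {
        targetCamera = target;
        renderPassEvent = RenderPassEvent.AfterRenderingTransparents;
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        // Without this guard the pass also runs for the other cameras
        // sharing the renderer (e.g. the Background Camera above).
        if (renderingData.cameraData.camera != targetCamera)
            return;

        // ... Blit to renderingData.cameraData.renderer.cameraColorTarget here ...
    }
}
```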
I am having a similar issue: I have selected a custom renderer on my camera, but it refuses to use my custom renderer and only uses the default. I have yet to figure out why.
EDIT: For future reference, I fixed my problem. It turns out there was no problem: the Scene view (and consequently any camera previews for cameras you have selected) always renders with the default renderer. My render target was being rendered with the correct renderer.

(Unity3D) When the Camera is inside an object, the Camera stops rendering the object

I'm using Unity 2018.4.14f1 Personal (I don't use 2019 or 2020 because they lag my computer).
I'm using the Unity Standard Assets player prefab and a Cinemachine FreeLook camera. I have some water, and when my player walks into it, it's fine. However, when the camera goes into the water, the water stops rendering. Is there any way I can fix it?
Update: I've somewhat got it working, but the water is hollow when you're inside it. Is there any way to fix that?
Video : https://easyupload.io/2b0p3a
(I'm quite a noob, so if you need any screenshots please ask.)
The problem here is that the water is only rendered when viewed from the outside, because the mesh's normals are modeled that way: the renderer culls faces it considers to be facing away from the camera (back-face culling). You can load the model into a 3D program, then copy and invert the model so your camera can see the water from the inside, or I believe there are shader options (such as disabling back-face culling) to stop this optimization. You can also look in this Reddit thread.
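If you would rather not round-trip through a 3D program, here is a minimal sketch of inverting the normals (and the triangle winding) at runtime instead; it assumes the water has a readable mesh on a MeshFilter:

```csharp
using UnityEngine;

// Minimal sketch: flip a mesh's normals and winding so its faces are
// visible from the inside. Attach to the water object (needs a MeshFilter).
public class InvertMeshNormals : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh; // instance copy, not the shared asset

        // Point every normal the other way.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse the winding order so back-face culling keeps the
        // now-inward faces instead of discarding them.
        for (int sub = 0; sub < mesh.subMeshCount; sub++)
        {
            int[] tris = mesh.GetTriangles(sub);
            for (int i = 0; i < tris.Length; i += 3)
            {
                int tmp = tris[i];
                tris[i] = tris[i + 1];
                tris[i + 1] = tmp;
            }
            mesh.SetTriangles(tris, sub);
        }
    }
}
```

To keep the water visible from the outside as well, you would run this on a duplicate of the mesh rather than the original, as suggested above.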

Unity LWRP does not see some layers

I have a 2D Unity project with two cameras: the main one and one for a parallax effect.
After I installed LWRP to set up lights, the second camera stopped showing its layer in game.
Is there a way to fix this?
Rendering with two cameras at the same time, as you describe, is called "camera stacking" and is not currently supported on LWRP or URP. There is some discussion about adding support for it again.
You could try using the second camera to render onto a render texture and display that as your background.
As for the 2D light, it is present but greyed out; you should be able to enable experimental feature use in the Player Settings for the project.
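A minimal sketch of that render-texture approach, assuming a second parallax camera and a full-screen RawImage under a Canvas; the names are placeholders:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ParallaxToTexture : MonoBehaviour
{
    // Placeholder references, assigned in the Inspector.
    public Camera parallaxCamera;
    public RawImage backgroundImage;

    void Start()
    {
        // Render the parallax layer into a texture instead of the screen...
        var rt = new RenderTexture(Screen.width, Screen.height, 16);
        parallaxCamera.targetTexture = rt;

        // ...and display that texture behind the rest of the scene.
        backgroundImage.texture = rt;
    }
}
```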

Pause/Freeze a scene with a trackable active in vuforia unity 3d

I am developing an app with Vuforia Cloud Recos. I want to add a feature that lets the user pause the page so she does not have to keep pointing the device at the target to view the trackable. This is pretty useful when I want to show text. Is there any way to achieve that in Unity3D? A good example is Microsoft's Here City Lens app, which includes a button to pause the page, as the screenshot shows.
You could take a screenshot of the screen and apply it to an Image UI object, if you do not need the camera feed anymore.
If you need interaction with the elements, I would take a screenshot of only the camera feed, without the items: get the AR camera's transform, apply it to a new camera, and disable the AR camera. Then apply the screenshot to a background plane covering the whole screen, and keep the items on; they no longer listen to Vuforia. You are pretty much recreating a basic Unity scene. The items should not be moving with Vuforia (the camera is), so they are still in the middle, and you need to know where the camera was when you took the shot. Your scene is then complete.
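A minimal sketch of the screenshot part, assuming a Unity version recent enough to have ScreenCapture.CaptureScreenshotAsTexture and a full-screen RawImage to hold the frozen frame; the names are placeholders, and the capture has to wait for the end of the frame:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class PauseView : MonoBehaviour
{
    // Placeholder references, assigned in the Inspector.
    public Camera arCamera;
    public RawImage frozenFrame;

    public void Pause()
    {
        StartCoroutine(CaptureAndFreeze());
    }

    IEnumerator CaptureAndFreeze()
    {
        // Screen capture must happen once the frame has finished rendering.
        yield return new WaitForEndOfFrame();

        Texture2D shot = ScreenCapture.CaptureScreenshotAsTexture();
        frozenFrame.texture = shot;
        frozenFrame.gameObject.SetActive(true);

        // Stop the live AR feed; the user now looks at the frozen frame.
        arCamera.gameObject.SetActive(false);
    }
}
```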

NGUI invisible after tracking with Vuforia

I am using Vuforia 4-2-3, the latest NGUI version, and Unity 5.0.1p3.
My GUI works fine until I track a target. After that, my GUI is invisible; however, the collision still works. So buttons are working, only I can't see sprites, textures or labels.
There is a 3D building that shows up while tracking. That 3D object uses the Standard shader. The NGUI atlas uses the Unlit/Transparent Colored shader.
I guess there is a conflict between those? Has someone else had this problem before?
EDIT:
This is what my hierarchy looks like
I have an Image Target with several 3D objects.
The NGUI and the ARCamera are two different objects as well.
This is what my NGUI looks like when I start tracking
Where have you linked your NGUI to? The ARCamera, or another GameObject?
I would suggest you link your script to the ARCamera at all times. This helps ensure that it shows; a GameObject placed below the ARCamera in the hierarchy might not show, given the hierarchy pattern generally followed by Unity users.
EDIT: If you've used OnGUI() for your GUI needs, then the script in which it's contained should be attached to the ARCamera. Also, try putting the ARCamera on top of the Image Target in the Hierarchy.
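For illustration, a minimal sketch of an OnGUI()-based script of the kind described, meant to be attached to the ARCamera:

```csharp
using UnityEngine;

// Minimal sketch: an OnGUI-based UI that lives on the ARCamera itself,
// so it draws regardless of what tracking does to the rest of the hierarchy.
public class ArCameraGui : MonoBehaviour
{
    void OnGUI()
    {
        if (GUI.Button(new Rect(10, 10, 160, 40), "Example button"))
        {
            Debug.Log("Button pressed");
        }
    }
}
```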