Make a URP renderer feature affect only the current camera

I'm making a renderer feature with a single ScriptableRenderPass. The renderer feature is added to a single 2D Renderer asset, like so:
I have a single camera that uses this renderer and culls only a particular layer:
The camera renders only what is on the PixelPerfect layer and ignores everything else. This camera is part of a camera stack, like so:
But somehow the renderer feature on the Downscaled Camera also affects the Background Camera. I suspect that the render pass somehow sees the output of the earlier cameras in the stack, but I have no idea how that even makes sense, because when I single out the Downscaled Camera I only see the layer the camera is set to cull.
Here's how the Downscaled Camera is set up:
In Execute, I'm blitting to renderingData.cameraData.renderer.cameraColorTarget.
I've found this post on the GameDev Stack Exchange; it predates URP and scriptable renderer features, but it describes my problem perfectly. Any thoughts?
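For reference, one common way to keep a pass from touching the other cameras in a stack is to early-out in Execute based on renderingData.cameraData. A minimal sketch, assuming the pass can identify its target camera; the class name, pass name, and the camera-name check are placeholders, not the asker's actual code:

    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    class PixelArtRenderPass : ScriptableRenderPass
    {
        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            var cameraData = renderingData.cameraData;

            // Ignore scene-view and preview cameras, which always use the default renderer.
            if (cameraData.cameraType != CameraType.Game)
                return;

            // Only run for the camera this pass is intended for (matched by name here for simplicity).
            if (cameraData.camera.name != "Downscaled Camera")
                return;

            CommandBuffer cmd = CommandBufferPool.Get("PixelArtPass");
            // ... blit to cameraData.renderer.cameraColorTarget here ...
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }

The cameraType check also keeps the pass out of the Scene view and camera previews, which (as noted in the reply below) always render with the default renderer.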

I am having a similar issue where I have selected a custom renderer on my camera, but it refuses to use my custom renderer and only uses the default. I have yet to figure out why.
EDIT: For future reference, I fixed my problem. It turns out there was no problem: the Scene view (and consequently any camera previews for cameras you have selected) always renders with the default renderer. My render target was being rendered with the correct renderer.

Related

Is there a way to apply bloom to a specific object?

I've noticed that if I uncheck the "Is Global" checkbox on the Bloom effect of a Post Processing Volume, the bloom doesn't apply to the layer I've set as the post-processing layer, even though I adjusted it to affect that one layer in particular. In fact, it doesn't apply at all. Either it applies bloom to everything in the scene, or to nothing.
Extra: I have no pipeline asset; maybe that's the issue. I've tried setting up an LWRP asset (for some reason URP doesn't exist in my 2019.2.17f1 version), but it breaks all the materials I use for Particle Systems (Particles/Standard Unlit), even when I upgrade them to LWRP materials.
Any ideas? If it's possible to deliver a solution to both of these problems, excellent, but the main one is the title question.
Note: The "camera stacking" approach mentioned here applies only to Unity URP. For the Unity Built-in Render Pipeline or Unity versions prior to 2019.3.0f3 you can achieve a similar effect with RenderTextures. Though Unity HDRP has no explicit "camera stacking" feature it does allow for the same net effect via the HDRP-specific Graphics Compositor.
"Is there a way to apply bloom to a specific object?"
You could take a leaf out of Unity's camera stacking book, whereby one set of objects is rendered by one camera and another set by a different camera. The results of each camera's rendering are merged together automatically by Unity and presented to the screen.
But don't take my word for it, this is what Unity has to say:
In the Universal Render Pipeline (URP), you use Camera Stacking to layer the output of multiple Cameras and create a single combined output. Camera Stacking allows you to create effects such as a 3D model in a 2D UI, or the cockpit of a vehicle. Tell me more...
...and (my emphasis):
A Camera Stack overrides the output of the Base Camera with the combined output of all the Cameras in the Camera Stack. As such, anything that you can do with the output of a Base Camera, you can do with the output of a Camera Stack. For example, you can render a Camera Stack to a given render target, apply post-process effects, and so on. Tell me more...
When you consider that each camera has the potential for its own rendering settings (including bloom), the solution is clear:
ensure there are two cameras in the scene, say My Default Camera and Bloomin' Camera
create a custom layer called "Bloom"
assign whatever objects you want to be rendered with a bloom to layer Bloom
set up the camera stack as per "Adding a Camera to a Camera Stack" (a script sketch of the same setup follows after this list).
My Default Camera should be set to "Base":
Bloomin' Camera should be set to "Overlay":
Add Bloomin' Camera to My Default Camera Stack settings:
ensure that the Culling mask for My Default Camera has the Bloom layer unticked. This ensures that the objects to be bloomed are only drawn once on the Bloom layer
ensure that the Culling mask for Bloomin' Camera has a single ticked entry for the Bloom layer and nothing else. You don't want to double up on rendering; otherwise you will get funky and undesirable z-order effects, apart from hurting game performance. Other layers will be rendered by My Default Camera.
apply bloom effects to camera Bloomin' Camera
run game, celebrate
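For reference, the same stack setup can also be done from script instead of the Inspector. A minimal sketch, assuming URP, the two cameras wired into the fields below, and a layer named "Bloom"; the component and field names are placeholders:

    using UnityEngine;
    using UnityEngine.Rendering.Universal;

    public class BloomStackSetup : MonoBehaviour
    {
        public Camera baseCamera;   // My Default Camera
        public Camera bloomCamera;  // Bloomin' Camera

        void Start()
        {
            int bloomMask = LayerMask.GetMask("Bloom");

            // Base camera renders everything except the Bloom layer.
            baseCamera.cullingMask = ~bloomMask;

            // Overlay camera renders only the Bloom layer.
            bloomCamera.cullingMask = bloomMask;
            var bloomData = bloomCamera.GetUniversalAdditionalCameraData();
            bloomData.renderType = CameraRenderType.Overlay;

            // Add the overlay camera to the base camera's stack.
            var baseData = baseCamera.GetUniversalAdditionalCameraData();
            if (!baseData.cameraStack.Contains(bloomCamera))
                baseData.cameraStack.Add(bloomCamera);
        }
    }

Note that cameraStack is only meaningful on a Base camera, which is why the overlay camera is added to the base camera's additional camera data.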
The Is Global setting might sound confusing at first. Ultimately it does not control where the post-processing effect is applied, but when it is applied. If it is set to Global, the effect is always applied; otherwise you can set a layer and a collider-based boundary that triggers the effect.
The general approach is to enable emission only on the materials where you want the effect to take place. If your materials are too dark otherwise, you should adjust the ambient lighting settings.
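To illustrate the emission approach, a minimal sketch, assuming a Standard or URP Lit material on the object; the component name and values are placeholders:

    using UnityEngine;

    public class EnableBloomEmission : MonoBehaviour
    {
        public Color emissionColor = Color.white;
        public float intensity = 2f;   // HDR intensity, so the bloom threshold is exceeded

        void Start()
        {
            var material = GetComponent<Renderer>().material;
            material.EnableKeyword("_EMISSION");
            material.SetColor("_EmissionColor", emissionColor * intensity);
        }
    }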
At least in URP there are some workarounds for older versions, like this one, but as far as I know this does not work in 2020.3, since they made some changes to URP and the camera system.
Edit: In the comments on the video, Chris Hull gave an answer for how to do it with the new system:
@Mezzanine Add your actual game objects to a bloom layer you create. Create two cameras and set one of them to cull everything except that bloom layer you made, and set the other to cull only the bloom layer. Then you can set the second camera to Overlay and it will be added to the other. You can then use separate post-process stacks on these cameras. Note that you can only bloom objects in the background with this technique, because if you add bloom to an overlay camera, for some reason it just adds bloom to everything rather than just the things in that camera's view. That doesn't make much sense and makes the purpose of the layers redundant, in my opinion. If you can find a way to apply post-processing to the overlay camera before it is added to the final image, do let me know.
I have not tested that yet, but I presume it's still valid.

Unity VideoPlayer with Subtitles

I was going to use the VideoPlayer to render to Camera Near Plane, but I also want to display subtitles for the video for the sake of accessibility. I'm wondering what the best way to do that is.
I can't see anything on a canvas if I render to Near Plane. I'd like the video to appear in front of the scene so that I can have the scene there once the video is complete.
Do I need to be using a render texture to achieve this? Seems like a render texture might incur some unnecessary overhead for my purposes, but I could be wrong.
The idea is this:
Far Background - Scene
Background - Black Image (so I can fade to the scene)
Middleground - Video
Foreground - Subtitles
More info:
This is a 2D point and click adventure game with a pre-rendered cutscene.
You could do this with a render texture placed in front of the camera at an exact distance and size, but I wouldn't. It would probably need to be a different camera anyway, for lighting or clipping purposes.
I would use a second Camera, rendering over top of the Main Camera, with the subtitle UI's canvas targeting the second camera's screen space, and clearing depth only. It will render what it sees, but with a totally transparent background. Then, you can render your video on either the main camera's near plane or the new subtitle camera's far plane.
You could put your black square in front of this camera too, though it would be in front of the video. It could be UI on the main camera, or you could stick a third camera in between them. You might have to worry about performance if there are too many cameras, but I have used two or three before with no noticeable performance hit.
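A minimal sketch of that second-camera setup (depth-only clear plus a camera-space canvas), assuming the built-in render pipeline's depth ordering (in URP you would make it an Overlay camera in a stack instead); the component and field names are placeholders:

    using UnityEngine;

    public class SubtitleCameraSetup : MonoBehaviour
    {
        public Camera subtitleCamera;   // renders on top of the main camera
        public Canvas subtitleCanvas;   // the canvas holding the subtitle text

        void Start()
        {
            // Render after the main camera and keep the background transparent
            // by clearing depth only.
            subtitleCamera.depth = Camera.main.depth + 1;
            subtitleCamera.clearFlags = CameraClearFlags.Depth;

            // Make the canvas target the subtitle camera in screen space.
            subtitleCanvas.renderMode = RenderMode.ScreenSpaceCamera;
            subtitleCanvas.worldCamera = subtitleCamera;
        }
    }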
Robert Mocks's answer is perfectly tenable and makes sense to me. Thank you for that!
What I decided to do instead was use a RawImage so that I wouldn't have to deal with extra cameras. This way I can use the canvas as I normally would and don't have to deal with render textures.
This involves using the API Only setting along with the following code:
rawImage.texture = videoPlayer.texture;
That seems to work well for me.
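Expanded into a fuller sketch, assuming the VideoPlayer's Render Mode is set to API Only and the RawImage lives on an ordinary screen-space canvas; the field wiring is assumed to be done in the Inspector:

    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.Video;

    public class VideoOnRawImage : MonoBehaviour
    {
        public VideoPlayer videoPlayer;
        public RawImage rawImage;

        void Start()
        {
            videoPlayer.renderMode = VideoRenderMode.APIOnly;
            // Wait until the player actually has a texture before assigning it.
            videoPlayer.prepareCompleted += source =>
            {
                rawImage.texture = source.texture;
                source.Play();
            };
            videoPlayer.Prepare();
        }
    }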

Error on Main Camera in Unity: "The renderer used by this camera doesn't support camera stacking. Only Base camera will render."

I just updated my Unity version from 2018 to 2019. I've also got a new render pipeline for the 2D lights feature, and there is this weird white flickering when my character moves, and sometimes even without movement. I think the Main Camera warning shown in the title is the cause, but I could be wrong. Does anyone have a fix?
I am not sure about the problem you referenced in your description, but as for the error in your title, I had a similar problem. It was because I had two cameras in the scene, one Overlay and one Base; the Overlay camera was for my UI, and I was using the Universal Render Pipeline with the 2D Renderer, which does not yet support camera stacking. So I cannot stack multiple cameras; when I try, only the Base camera renders.
https://forum.unity.com/threads/does-urp-support-ui-canvas-overlay.768077/
You may try selecting the Main Camera, then in the Inspector open the "Rendering" section and set the "Renderer" to the one from your Universal Render Pipeline asset.
This is because you are using 'deferred' rendering, and camera stacking is only supported on forward rendering.
Either select forward rendering on your Universal pipeline asset, or create a new render pipeline asset with forward rendering and assign it to the camera.
If you are unsure which one to choose, read: https://docs.unity3d.com/Manual/RenderingPaths.html

Is setting the camera's culling mask to Nothing a good idea when UI covers the screen?

I have a pretty large and full scene, and it therefore generates a lot of draw calls.
Sometimes I display a video in the game, which covers the entire screen.
When I tested my game with Unity's profiler tool I noticed that the camera still renders everything (although occlusion culling is enabled and calculated), and it causes the video to lag.
My question is how can I disable the camera?
When I disable the Camera component or the camera's GameObject, I get a warning ⚠ in the Game view that says No camera is rendering to this display, which, I guess, is not good (correct me if I'm wrong).
So I was wondering whether cancelling the culling mask on the camera (by setting it to Nothing) would force Unity to stop rendering the scene.
Or does it still do some work in the background?
(Like UI elements, which are still rendered even though they are fully transparent.)
Thanks in advance
I have a pretty large and full scene, and it therefore generates a lot of draw calls.
I recommend enabling GPU Instancing on your materials; it can greatly reduce draw calls.
When the UI pops open, it can also help to remove the "Default" layer (or whatever layer the majority of your renderers are on) from the active cameras' culling masks. You can do this easily with layer masks. Or you can just set Camera.main.farClipPlane to 1 or some other low number.
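A minimal sketch of the culling-mask idea, assuming the camera should keep running (so the "No camera is rendering to this display" warning does not appear) while the full-screen video or UI is up; the method names are placeholders you would call from your own UI code:

    using UnityEngine;

    public class PauseSceneRendering : MonoBehaviour
    {
        Camera cam;
        int savedCullingMask;
        float savedFarClip;

        void Awake()
        {
            cam = Camera.main;
        }

        // Call when the full-screen UI/video opens.
        public void OnFullScreenUIOpened()
        {
            savedCullingMask = cam.cullingMask;
            savedFarClip = cam.farClipPlane;

            cam.cullingMask = 0;     // "Nothing": the camera still runs, but culls every layer
            cam.farClipPlane = 1f;   // optional: also shrink the far clip plane
        }

        // Call when the UI/video closes.
        public void OnFullScreenUIClosed()
        {
            cam.cullingMask = savedCullingMask;
            cam.farClipPlane = savedFarClip;
        }
    }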

I set the depth for the camera, but it will be rendered first

I am using Unity 5.4.1f1.
There are multiple cameras in the scene, and there is one camera that I would like to draw last for post effects.
But whatever value I set for that camera's depth, it is rendered first according to the Profiler.
As a result, the rendered object is not displayed.
How can I change the rendering order?
I solved it myself: when Camera.targetTexture is set, the camera renders first, regardless of its depth.
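For completeness, a sketch of one way to take explicit control of the order in the built-in pipeline (the asker is on Unity 5.4): disable the render-texture camera so Unity does not render it automatically (and therefore first), and drive it manually from a camera that renders earlier. The component placement and timing here are assumptions:

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class RenderPostCameraAfterMe : MonoBehaviour
    {
        public Camera postEffectCamera;   // the camera with the targetTexture assigned

        void Start()
        {
            // Stop Unity from rendering it automatically (which would happen first).
            postEffectCamera.enabled = false;
        }

        void OnPostRender()
        {
            // Render the post-effect camera explicitly once this camera has finished.
            postEffectCamera.Render();
        }
    }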