Multiple cameras assigned different depths - unity3d

A similar question was asked in the past, but the new version of Unity does not solve my problem. When I set up multiple cameras with different depths, each one handles a different layer. But when I add a blur shader to the second camera, which handles only one layer, the blur is applied to all elements. Why is that, and how do I fix it?

Changing the cameras' depth and clear settings changes the behavior. For example, the camera with the blur shader should be in the background, so that the main camera can see everything, and only the second (shader) camera should keep its Clear Flags set to Depth Only.

Related

Is there a way to apply bloom to a specific object?

I've noticed that if I uncheck the "Is Global" checkbox on the Bloom effect of a Post Processing Volume, the bloom doesn't apply to the layer I've set in the Post-process Layer, even though I adjusted the volume to affect that one layer in particular. In fact, it doesn't apply at all: either it sets bloom for everything in the scene, or for nothing.
Extras: I have no Pipeline asset, maybe that's the issue, but I've tried setting up an LWRP one (because for some reason URP doesn't exist in my 2019.2.17f1 version) and it just breaks all the materials I use for Particle Systems (Particles/Standard Unlit), even if I upgrade them to LWRP materials.
Any ideas? If it's possible to deliver a solution to both of these problems, excellent, but the main one is the title question.
Note: The "camera stacking" approach mentioned here applies only to Unity URP. For the Unity Built-in Render Pipeline or Unity versions prior to 2019.3.0f3 you can achieve a similar effect with RenderTextures. Though Unity HDRP has no explicit "camera stacking" feature it does allow for the same net effect via the HDRP-specific Graphics Compositor.
"Is there a way to apply bloom to a specific object?"
You could take a leaf out of Unity's camera stacking, whereby one set of objects is rendered by one camera and another set by a different camera. The results of each camera render are merged together automatically by Unity and presented to the screen.
But don't take my word for it, this is what Unity has to say:
In the Universal Render Pipeline (URP), you use Camera Stacking to layer the output of multiple Cameras and create a single combined output. Camera Stacking allows you to create effects such as a 3D model in a 2D UI, or the cockpit of a vehicle. Tell me more...
...and (my emphasis):
A Camera Stack overrides the output of the Base Camera with the combined output of all the Cameras in the Camera Stack. As such, anything that you can do with the output of a Base Camera, you can do with the output of a Camera Stack. For example, you can render a Camera Stack to a given render target, apply post-process effects, and so on. Tell me more...
When you consider that each camera has the potential for its own rendering settings (including bloom), the solution is clear:
ensure there are two cameras in the scene, say My Default Camera and Bloomin' Camera
create a custom layer called "Bloom"
assign whatever objects you want to be rendered with a bloom to layer Bloom
set up the camera stack as per "Adding a Camera to a Camera Stack".
My Default Camera should be set to "Base":
Bloomin' Camera should be set to "Overlay":
Add Bloomin' Camera to My Default Camera Stack settings:
ensure that the Culling mask for My Default Camera has the Bloom layer unticked. This ensures that the objects to be bloomed are only drawn once on the Bloom layer
ensure that the Culling mask for Bloomin' Camera has a single ticked entry for the Bloom layer and nothing else. You don't want to double up on rendering, otherwise you will get funky and undesirable z-order effects, apart from hurting game performance. Other layers will be rendered by My Default Camera.
apply bloom effects to camera Bloomin' Camera
run game, celebrate
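If you'd rather wire the stack up from a script than through the Inspector, here is a minimal sketch using URP's camera-stacking API (the BloomStackSetup class and its fields are my own names; it assumes a recent URP version where Camera.GetUniversalAdditionalCameraData() is available and a layer named "Bloom" exists):

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class BloomStackSetup : MonoBehaviour
{
    public Camera baseCamera;   // "My Default Camera"
    public Camera bloomCamera;  // "Bloomin' Camera"

    void Start()
    {
        int bloomMask = LayerMask.GetMask("Bloom");

        // Base camera renders everything except the Bloom layer.
        baseCamera.cullingMask = ~bloomMask;

        // Overlay camera renders only the Bloom layer.
        bloomCamera.cullingMask = bloomMask;
        bloomCamera.GetUniversalAdditionalCameraData().renderType = CameraRenderType.Overlay;

        // Add the overlay camera to the base camera's stack.
        baseCamera.GetUniversalAdditionalCameraData().cameraStack.Add(bloomCamera);
    }
}
```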
The Is Global option might sound confusing at first. Ultimately it does not determine where the post-processing effect is applied, but when. If it is set to Global, it is always applied; otherwise you can set a layer and a border that triggers the effect.
The general approach is to set emission only on the materials where you want the effect to take place. If your materials are too dark otherwise, you should adjust the ambient lighting settings.
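As a minimal sketch of enabling emission on a material at runtime, assuming the Standard shader (or URP/Lit) with its usual _EMISSION keyword and _EmissionColor property:

```csharp
using UnityEngine;

public class EnableEmission : MonoBehaviour
{
    void Start()
    {
        Material mat = GetComponent<Renderer>().material;

        // Only emissive materials will be picked up by the bloom effect;
        // HDR colours above 1 make the bloom more pronounced.
        mat.EnableKeyword("_EMISSION");
        mat.SetColor("_EmissionColor", Color.white * 2f);
    }
}
```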
At least in URP there are some workarounds for older versions, like this one, but as far as I know it no longer works in 2020.3, since they made some changes to URP and the camera system.
Edit: on the video, Chris Hull gave an answer for how to do it with the new system:
@Mezzanine Add your actual game objects to a created bloom layer. Create two cameras and set one of them to cull everything except that bloom layer you made. Set the other to only cull the bloom layer. Then you can set your camera to overlay and it will be added to the other. You can then use separate post-process stacks on these cameras. Note that you can only bloom objects in the background with this technique, as if you add bloom to an overlay camera, for some reason it just adds bloom to everything rather than just the things in that camera's view. Doesn't make much sense and makes the purpose of the layers redundant in my opinion. If you can find a way to add post-process to the overlay camera before it is added to the final image, do let me know.
I have not tested that yet, but I presume it's still valid.

Recommendations for clipping an entire scene in Unity

I'm looking for ways to clip an entire Unity scene to a set of 4 planes. This is for an AR game, where I want to be able to zoom into a terrain yet still have it only take up a given amount of space on a table (i.e., not extend over the edges of the table).
Thus far I've got clipping working as I want for the terrain and a water effect:
The above shows a much larger terrain being clipped to the size of the table. The other scene objects aren't clipped, since they use unmodified standard shaders.
Here's a pic showing the terrain clipping in the editor.
You can see the clipping planes around the visible part of the terrain, and that other objects (trees etc) are not clipped and appear off the edge of the table.
The way I've done it involves adding parameters to each shader to define the clipping planes. This means customizing every shader I want to clip, which was fine when I was considering just terrain.
While this works, I'm not sure it's a great approach for hundreds of scene objects. I would need to modify whatever shaders I'm using, and then I'd have to be setting additional shader parameters every update for every object.
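To illustrate, the per-object update amounts to something like this (a sketch; the _ClipPlane0.._ClipPlane3 property names are just examples of what my customized shaders expose):

```csharp
using UnityEngine;

// Sketch: pushes four world-space clipping planes (xyz = normal, w = distance)
// into this object's material every frame.
public class ClipPlaneUpdater : MonoBehaviour
{
    public Vector4[] clipPlanes = new Vector4[4];

    void Update()
    {
        Material mat = GetComponent<Renderer>().material;
        for (int i = 0; i < clipPlanes.Length; i++)
            mat.SetVector("_ClipPlane" + i, clipPlanes[i]);
    }
}
```

(Shader.SetGlobalVector would at least avoid the per-object loop, since it sets the property once for every shader that declares it, but it still requires customizing every shader.)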
Not being an expert in Unity, I'm wondering if there are other approaches that are not "per shader" based that I might investigate?
The end goal is to render a scene within the bounds of some plane.
One easy way would be to use Box Colliders as triggers on each side of your plane. You could then turn off the Renderers of objects entering or staying in a trigger with OnTriggerEnter/OnTriggerStay and turn them back on with OnTriggerExit (see the sketch below).
You can also use Bounds.Contains.
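A minimal sketch of the trigger approach (the HideOutsideTable name is mine; note the moving objects need a Collider, and one side of each pair needs a Rigidbody, for trigger events to fire):

```csharp
using UnityEngine;

// Attach to each Box Collider (with Is Trigger enabled) placed along the table edges.
public class HideOutsideTable : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        SetRenderers(other, false); // object crossed the table edge: hide it
    }

    void OnTriggerExit(Collider other)
    {
        SetRenderers(other, true);  // object came back inside: show it
    }

    static void SetRenderers(Collider target, bool visible)
    {
        foreach (Renderer r in target.GetComponentsInChildren<Renderer>())
            r.enabled = visible;
    }
}
```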

How can I use baked lighting on sprites? / How to light up a large area in 2D?

I'm having trouble figuring out how to light up large area(s) of sprites in Unity 2D. My previous knowledge on Unity's lighting is zero.
I first tried using a large number of point lights with the "Sprites/Diffuse" material, but only about five would actually render at a time, so I guess there's a limit on that.
Then I tried putting in an area light. That didn't do anything, so that's when I started researching baked lighting on sprites (and baked lighting in general). I found stuff like this, but I couldn't get it to work, either because it's outdated or because I don't know what I'm doing. Other answers I've come across seem to assume the reader already knows something about lighting in Unity which, to be honest, I don't. Unity's documentation website had some information on it, but no tutorials on how to set up baked lighting.
I've tried a bunch of different combinations of materials (like using the "Standard" shader for the sprites instead of "Sprites/Diffuse", emission, etc.) and I enabled "Baked Global Illumination" in Lighting > Settings.
If baked lighting isn't possible on sprites (or isn't worth the trouble), what are the alternatives?
Edit: I made sure not to have the lights pointing in the wrong direction, and I do realise that Unity 2D is just like painting onto a piece of paper in Unity 3D. I was able to get point lights to work, but only a few at a time. I don't need to light the entire screen at once, just a large specific area.
Some tips:
Working with sprites, you're in 2D. When you add a light, switch to 3D mode and rotate it to make sure the light is pointed at your objects and is not on the same plane or level with them, as that would cast all the light behind them.
If you're trying to light up everything on the screen (in camera), attach an area light to the camera at the camera's position, point it where the camera points, and then in the Inspector you can change its variables: intensity, range, width, height, etc.
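As a rough sketch of that second tip from a script attached to the camera (the CameraLight name is mine; Unity's area lights are baked-only, so this sketch swaps in a realtime point light instead):

```csharp
using UnityEngine;

public class CameraLight : MonoBehaviour
{
    void Start()
    {
        // Add a light at the camera's position so it follows the view.
        Light light = gameObject.AddComponent<Light>();
        light.type = LightType.Point;
        light.range = 50f;       // large enough to cover the visible area
        light.intensity = 1.5f;  // tweak to taste in the Inspector
    }
}
```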
Emissive Texture:
https://www.youtube.com/watch?v=oa6kW5HhRd4
For some reason, I never even thought about going into the asset store. I found this for free, and it looks like it will work: Light2D.

Unity3D: Why is particle lighting making directional shadows disappear?

UPDATE: As @BenHayward suspected, this is a bug. <link>
I have a very simple setup of cubes on a plane comprising a grid of quads. A directional light is shining down at the scene at an angle, producing a set of shadows from the cubes onto the quads.
Now I'm trying to produce an explosion effect with Unity's particle system, but when I add a point light to the particle system it causes all the directional-lighting shadows to disappear, whether they're in line of sight of the particle or not.
The shadows reappear when the particle is destroyed. Replicating the particle effect with pure C# doesn't cause any problems.
(Oh, and obviously I'm using the deferred rendering path.)
Any ideas? This is driving me up the wall.
[EDIT: I should have mentioned that the point light added to the particle system is set to cast shadows. The Unity standard particle pack has shadow-casting disabled by default. They too cause the problem when I turn the shadow-casting on.]
Based on the project that you linked to, it seems as though the particle system is causing the shadow cast from the directional light to flicker on and off quickly. I suspect this is a bug, since if it were intended behaviour, I wouldn't expect it to flicker in this manner.
In cases where this is not a bug, the problem can be caused by a couple of issues:
You can only have a certain number of dynamic (shadow-casting) lights in your scene visible within the camera frustum. By default this number is quite low (I think it's 4). You can increase it by going to Edit > Project Settings > Quality and raising the Pixel Light Count from its default value; it can also be set from code, as sketched after this list. You will need to increase this value beyond the total number of lights in your effect. Higher values allow more lights to be rendered on screen, but reduce performance.
It depends on the shaders you are using to receive the shadows. Some shaders will only render shadows for one directional light, and which light is used isn't necessarily easy to determine. If you are using the standard Unity shader this shouldn't be a problem, but if you are using a mobile-compatible surface shader, or something you've written yourself, then this could be the cause of the problem.
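If you prefer to raise the Pixel Light Count from code rather than through the Quality settings window, a minimal sketch (the script name is mine):

```csharp
using UnityEngine;

public class RaisePixelLights : MonoBehaviour
{
    void Start()
    {
        // Allow more per-pixel (shadow-casting) lights than the default
        // quality level permits; the trade-off is rendering performance.
        QualitySettings.pixelLightCount = 8;
    }
}
```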
Also, for an explosion I'd recommend using just a single point light (not a light attached to each particle), as this is all that is required. Any more lights would have a considerable performance impact on the GPU, especially if there is more than one explosion in the scene at any one time.
I recreated the scene as you described, and I can't reproduce your issue.
I mostly followed this tutorial and added a few cubes on a plane:
https://unity3d.com/learn/tutorials/topics/graphics/adding-lighting-particles
I will need a screenshot of your light components, both the directional and the point light, the particles, and the cubes (mostly the material). I cannot comment because I don't have enough reputation yet, so I'll delete this once you add the screenshots.

Unity Particle System is rendering outside a masked viewport

I have a ScrollView with a working mask that blocks images, text, etc. when they are not in the viewport (visible area).
The problem I have is that all particle systems are always rendering and visible on screen, whether they are inside the viewport or not.
I would like to know:
1) whether masking is possible on Particle Systems,
2) and, if it is, what I have overlooked or missed that makes the particles visible.
FYI, I have tried layers, adding a specific mask to the object with the particle system, adding a mask to the parent of that object, and randomly altering renderer settings, and I'm ready to cry.
The problem is not the particle systems themselves, but with the shader the particles use.
The way Unity's Mask system works is through the stencil buffer, which only works if your shader plays nicely with it. If you want to try to modify your shader for this, here is the relevant documentation. Otherwise, try changing to a different shader, or use a different method to hide your particles, such as modifying Camera.rect (a sketch follows below), for which the documentation is here.
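A minimal sketch of the Camera.rect route (the normalized rectangle values here are placeholders; match them to your ScrollView's visible area):

```csharp
using UnityEngine;

public class ClampParticleCamera : MonoBehaviour
{
    void Start()
    {
        // Restrict this camera to a normalized sub-rectangle of the screen;
        // everything it renders (particles included) is clipped to that rect.
        GetComponent<Camera>().rect = new Rect(0.1f, 0.1f, 0.8f, 0.6f);
    }
}
```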
By the way, if we're being a stickler for terminology here, "viewport" doesn't mean what you think it means (within the context of computer graphics).