Are masked sprites and stencil buffers in Unity only visible in one eye when deployed on HoloLens 2?

Is it possible that Sprite Renderers in a HoloLens 2 Unity project which are masked via a SpriteMask are only visible in one eye in the final HoloLens 2 build (UWP via Visual Studio 2019, deployed to the HoloLens 2 device)?
I also experienced the same behaviour with elements that are masked via a stencil shader.
I am using a 24-bit depth buffer for my Unity project, if that helps; otherwise the stencil shader wouldn't work.

To display certain objects to only one eye, a different approach is needed for each stereo rendering mode.
With Multi Pass, you can set a camera to render to only one eye (Camera component -> Output -> Target Eye), so per-eye content is straightforward; a sketch follows. With Stereo Instancing (Single Pass Instanced), you handle it in the shader instead, for example by branching on unity_StereoEyeIndex, and you will need to multiply by the projection matrix of the current eye.
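For the Multi Pass case, a minimal sketch of the Target Eye setting done from script (assuming a second camera supplies the right eye):

```
using UnityEngine;

// Multi Pass only: restrict this camera to the left eye, the scripted
// equivalent of Camera component -> Output -> Target Eye. A second camera
// (with Target Eye set to Right) supplies the other eye's image.
[RequireComponent(typeof(Camera))]
public class LeftEyeOnly : MonoBehaviour
{
    void Start()
    {
        GetComponent<Camera>().stereoTargetEye = StereoTargetEyeMask.Left;
    }
}
```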

Related

Unity 2D Renderer URP Makes my UI Camera have a background color

I am trying to add lighting to my 2D project, so I created a new Universal Render Pipeline asset that uses a 2D Renderer. I have 2 separate cameras, one for UI and one for game elements. The moment I add the pipeline to my project, my UI Camera has a background color, although the clear flags are set to depth only.
What I've tried so far:
Turning off post-processing: did not work.
Camera stacking: not available on the 2D Renderer, according to the docs.
Making the camera background color transparent: the alpha channel does not seem to affect the color in any way.
The Unity version I'm working on is 2019.3.4f1.
I found the solution. Just in case anyone runs into this problem in the future: make sure you have the latest version of URP in the Package Manager; then you will have the option to use camera stacking, which works just fine for this case.

Is there a way to apply bloom to a specific object?

I've noticed that if I uncheck the "Is Global" checkbox on the Bloom effect of a Post Processing Volume, the bloom doesn't apply to the layer I've set in the post-processing layer, even though I adjusted it to affect that one in particular. In fact, it doesn't apply at all: either bloom is applied to everything in the scene, or to nothing.
Extras: I have no pipeline asset; maybe that's the issue. I've tried setting up an LWRP asset (because for some reason URP doesn't exist in my 2019.2.17f1 version), but it just breaks all the materials I use for particle systems (Particles/Standard Unlit), even if I upgrade them to LWRP materials.
Any ideas? If it's possible to deliver a solution to both of these problems, excellent, but the main one is the title question.
Note: the "camera stacking" approach mentioned here applies only to Unity URP. For the Unity built-in render pipeline, or Unity versions prior to 2019.3.0f3, you can achieve a similar effect with RenderTextures; a sketch follows. Though Unity HDRP has no explicit "camera stacking" feature, it does allow for the same net effect via the HDRP-specific Graphics Compositor.
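A minimal sketch of the built-in pipeline alternative: render the second camera into a RenderTexture and draw it over the main camera's output. All names here are illustrative.

```
using UnityEngine;
using UnityEngine.UI;

// Built-in pipeline sketch: a second camera renders only the layers to be
// composited into a RenderTexture, which a RawImage on a screen-space canvas
// then draws over the main camera's output.
public class RenderTextureOverlay : MonoBehaviour
{
    public Camera overlayCamera;  // culls only the layers you want composited
    public RawImage overlayImage; // stretched across a screen-space canvas

    void Start()
    {
        var rt = new RenderTexture(Screen.width, Screen.height, 24);
        overlayCamera.targetTexture = rt;
        overlayCamera.clearFlags = CameraClearFlags.SolidColor;
        overlayCamera.backgroundColor = Color.clear; // transparent where nothing is drawn
        overlayImage.texture = rt;
    }
}
```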
"Is there a way to apply bloom to a specific object?"
You could take a leaf out of Unity's camera stacking book, whereby one set of objects is rendered by one camera and another set by a different camera. The results of each camera render are merged together automatically by Unity and presented to the screen.
But don't take my word for it, this is what Unity has to say:
In the Universal Render Pipeline (URP), you use Camera Stacking to layer the output of multiple Cameras and create a single combined output. Camera Stacking allows you to create effects such as a 3D model in a 2D UI, or the cockpit of a vehicle. Tell me more...
...and (my emphasis):
A Camera Stack overrides the output of the Base Camera with the combined output of all the Cameras in the Camera Stack. As such, anything that you can do with the output of a Base Camera, you can do with the output of a Camera Stack. For example, you can render a Camera Stack to a given render target, apply post-process effects, and so on. Tell me more...
When you consider that each camera has the potential for its own rendering settings (including bloom), the solution is clear:
ensure there are two cameras in the scene, say My Default Camera and Bloomin' Camera
create a custom layer called "Bloom"
assign whatever objects you want to be rendered with a bloom to layer Bloom
set up the camera stack as per "Adding a Camera to a Camera Stack".
My Default Camera should be set to "Base":
Bloomin' Camera should be set to overlay:
Add Bloomin' Camera to My Default Camera Stack settings:
ensure that the Culling mask for My Default Camera has the Bloom layer unticked. This ensures that the objects to be bloomed are only drawn once, on the Bloom layer
ensure that the Culling mask for Bloomin' Camera has a single ticked entry for the Bloom layer and nothing else. You don't want to double up on rendering, otherwise you will get funky and undesirable z-order effects, apart from hurting game performance. Other layers will be rendered by My Default Camera.
apply bloom effects to camera Bloomin' Camera
run game, celebrate
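The same setup can also be done from script; a minimal URP sketch, assuming a layer named "Bloom" exists and both cameras are assigned in the Inspector:

```
using UnityEngine;
using UnityEngine.Rendering.Universal;

// URP sketch of the steps above: base camera skips the Bloom layer,
// overlay camera renders only the Bloom layer with its own post-processing.
public class BloomStackSetup : MonoBehaviour
{
    public Camera baseCamera;  // "My Default Camera"
    public Camera bloomCamera; // "Bloomin' Camera"

    void Start()
    {
        int bloomMask = 1 << LayerMask.NameToLayer("Bloom");

        // Base camera renders everything except the Bloom layer.
        var baseData = baseCamera.GetUniversalAdditionalCameraData();
        baseData.renderType = CameraRenderType.Base;
        baseCamera.cullingMask = ~bloomMask;

        // Overlay camera renders only the Bloom layer, with post-processing on.
        var bloomData = bloomCamera.GetUniversalAdditionalCameraData();
        bloomData.renderType = CameraRenderType.Overlay;
        bloomData.renderPostProcessing = true;
        bloomCamera.cullingMask = bloomMask;

        // Stack the overlay camera onto the base camera.
        baseData.cameraStack.Add(bloomCamera);
    }
}
```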
The Is Global setting might sound confusing at first. Ultimately it does not control where the post-processing effect is applied, but when. If it is set to Global, the effect is always applied; otherwise you can set a layer and a boundary that triggers the effect.
The general approach is to set emission only on the materials where you want the effect to take place. If your materials are otherwise too dark, you should adjust the ambient lighting settings.
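As a minimal sketch, assuming the Standard or URP Lit shader, emission can be enabled per material like this:

```
using UnityEngine;

// Sketch: enable HDR emission on this renderer's material so a bloom effect
// picks it up. Property names assume the Standard or URP Lit shader.
[RequireComponent(typeof(Renderer))]
public class MakeEmissive : MonoBehaviour
{
    public Color emissionColor = Color.white;
    public float intensity = 2f; // values above 1 push the color past the bloom threshold

    void Start()
    {
        var mat = GetComponent<Renderer>().material; // instantiates a per-object copy
        mat.EnableKeyword("_EMISSION");
        mat.SetColor("_EmissionColor", emissionColor * intensity);
    }
}
```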
At least in URP there are some workarounds for older versions, like this one, but AFAIK this does not work in 2020.3, since they made some changes to URP and the camera system.
Edit: in the comments on the video, Chris Hull gave an answer for how to do it with the new system:
@Mezzanine Add your actual game objects to a created bloom layer. Create two cameras and set one of them to cull everything except that bloom layer you made. Set the other to only cull the bloom layer. Then you can set your camera to overlay and it will be added to the other. You can then use separate post-process stacks on these cameras. Note that you can only bloom objects in the background with this technique, as if you add bloom to an overlay camera, for some reason it just adds bloom to everything rather than just the things in that camera view. That doesn't make much sense and makes the purpose of the layers redundant in my opinion. If you can find a way to add post-process to the overlay camera before it is added to the final image, do let me know.
I have not tested that yet, but I presume it's still valid.

VR headset issue: objects displayed in one eye because of the default shader (in my case the TextMeshPro shader)

I have:
Unity
High Definition Render Pipeline (HDRP)
Single pass rendering (no ability to change)
VR Headset
Some objects are displayed in only one eye.
Investigation showed that the reason for this is shaders that are not compatible with such an environment. For the particle system I resolved the problem with my own shader created using Shader Graph (the solution is below).
But the problem in this question is:
I have no idea how to write a correct shader for TextMeshPro text.
Can anybody help with this?
THIS FIX WORKS FOR THE PARTICLE SYSTEM one-eye issue: a Shader Graph shader (works well with transparency, Color over Lifetime, etc.)

What objects and APIs would I use to fill the unity game window with custom graphics?

I need to build an app using Unity which doesn't use a traditional camera to generate the graphics. I'll build them using some custom shaders and a few cameras whose results get stuffed in rendertextures and then frobbed. (Think http://www.purplefrog.com/~thoth/art/kaleidescope/kaleid1.html but even weirder)
I'm not sure what objects I would put in the scene to accomplish this. In any normal app you just put a camera and point it at the right spot and Unity gets the pixels into the window, but that is just not how this thing will work.
I'm not sure if I should be using a UI Canvas or what APIs would be used to copy various render textures into the proper locations.
If you are not targeting WebGL, you can create a RenderTexture of the proper size (perhaps using RenderTexture.GetTemporary) and use Graphics.CopyTexture or other techniques to assemble the image you want displayed in the game window.
Once you have the pixels you want in the RenderTexture, you can use Graphics.Blit(src, (RenderTexture)null), which will copy the pixels into the game window. These pixels will be stretched if the game window is not the same size as the RenderTexture.
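A minimal sketch of that approach ("compositeMaterial" is a placeholder for whatever shader/material generates your picture):

```
using System.Collections;
using UnityEngine;

// Sketch: assemble an image in a RenderTexture each frame and blit it
// straight to the game window (the null render target).
public class BackbufferBlit : MonoBehaviour
{
    public Material compositeMaterial;
    RenderTexture rt;

    IEnumerator Start()
    {
        var endOfFrame = new WaitForEndOfFrame();
        while (true)
        {
            yield return endOfFrame; // run after all cameras have finished
            if (rt == null || rt.width != Screen.width || rt.height != Screen.height)
            {
                if (rt != null) rt.Release();
                rt = new RenderTexture(Screen.width, Screen.height, 0);
            }
            // Build the image; Graphics.CopyTexture could paste other
            // render textures into rt here instead.
            Graphics.Blit(null, rt, compositeMaterial);
            // Copy the result into the game window (stretched if sizes differ).
            Graphics.Blit(rt, (RenderTexture)null);
        }
    }

    void OnDestroy() { if (rt != null) rt.Release(); }
}
```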
This technique worked for me in the editor's game window, but when I compile it to WebGL, all I get is a mostly-grey screen with a really big black rectangle in the bottom left.

Sphere 360-texture low quality

I'm creating an app to show 360 images with Cardboard.
I created a scene in Unity using Cardboard camera and sphere. I mapped 360-image to a sphere texture.
When viewing, the texture is low quality and has jagged edges, so the details are poor.
Any ideas on how to solve this texture problem? I tried a script which creates a different kind of sphere, but it didn't solve the problem.
You need to use an icosphere for this to work. You're still going to get some distortion near the poles, but it's far better than the UV spheres that Unity provides.
The second thing is that you'll need a high-detail icosphere, as you'll need more vertices.
The third thing is the texture's quality and size. I think the default FOV in Unity is around 60, but you'll map the texture across a FOV of 360, so you'll need a much higher-resolution texture than an on-screen texture covering the same apparent area.
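One common culprit is the import size cap: Unity downscales imported textures to 2048 by default, which visibly blurs a large equirectangular image. A minimal editor sketch (the asset path is a placeholder for your own 360 image):

```
using UnityEditor;
using UnityEngine;

// Editor-only sketch (place in an Editor folder): raise the import size cap
// so a large equirectangular image is not downscaled on import.
public static class PanoramaImportFix
{
    [MenuItem("Tools/Fix 360 Texture Import")]
    static void Fix()
    {
        var importer = (TextureImporter)AssetImporter.GetAtPath("Assets/Textures/pano360.jpg");
        importer.maxTextureSize = 8192; // default cap of 2048 blurs a 360 panorama
        importer.mipmapEnabled = true;  // smoother minification, fewer jagged edges
        importer.anisoLevel = 9;        // reduces shimmer at glancing angles
        importer.SaveAndReimport();
    }
}
```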
You can look over this article if you want more details about the differences between icospheres and UV spheres, or just go to the bottom of the article and download the Unity project. The project includes ready-made icospheres, and you can experiment with them to find out which one is best suited for your project. I'm using the Octahedron Sphere 4 R1; with fewer polys there is too much distortion, and with a higher one the FPS drops too much.