How to get Order Independent Transparency in VR/Unity working?

I'm using this GitHub project to create order independent transparency in my projects. It works fine with a normal camera [no OIT] [OIT], but when I try to use it with a VR setup, it just doesn't render any object on the "transparent" layer that uses the above project. [VR no OIT] [VR not working with OIT].
On top of not rendering anything marked as transparent, the left and right cameras become offset in a way they shouldn't, so you get a disorienting effect, as if your eyes are in the wrong place.
I'm using the Oculus SDK, but I don't think it's that. The same thing happens if I just use a camera that feeds to the Oculus headset.
Here is the Unity project, if you want to see for yourself.
Thank you
EDIT: I was also occasionally getting a weird effect where the spheres would render all black and stay centered on my left eye, while the rest of the scene rendered upside down. The right eye rendered everything not marked as transparent correctly. I believe I was using the single-camera setting in the OVR camera rig when this happened, but I couldn't get it to produce the same error when I went back to record these errors.
Additionally, I am using Unity 2018.2.0f2 and Oculus SDK v1.30.1.

Related

Unity: Alternative for camera stacking/layering?

Unity 2021.3.16f1/URP 12.1.8
I started with Unity a few weeks ago and am still getting to grips with how everything works, so please don't assume I know everything there is to know about Unity. Treat me as a n00b. 😉
I'm building a VR game for the Quest/Quest 2. I have a scene with a keypad on a wall. When the player "clicks" it, I want the scene to go dark and a large version of the (3D) keypad to appear, which he can then interact with (enter numbers). This keypad must always stay in the middle of his view.
What I did was create a canvas and add a black plane with 50% transparency plus the large version of the keypad. I've set up the canvas as follows:
This works somewhat, but it has two major disadvantages: 1) the keypad receives lighting from the scene while I want it to be fully lit all the time, and 2) the canvas and all its children clip through walls and objects, while I always want it rendered in front of everything else (yes, I know this will mess with your depth perception in VR, but I already have a solution for that).
So the next thing I tried was stacking cameras. I created a second camera and set it as an overlay camera. I also set its Culling Mask to UI:
Additionally, I added the new camera to my Main Camera as a stacked camera. I changed the Culling Mask of the Main Camera to everything but UI:
This works the way I want but at a cost: performance takes a huge hit. My frame rate actually halved. I've read everywhere that this is a known problem on mobile devices (which the Quest really is).
Another solution I read about is using RenderObjects, but I can't really find out how to use it, and I'm not even sure it is a solution to what I'm trying to achieve.
So can anyone tell me how I should go about doing this? Thanks in advance!
The solution for the lighting is to set the layer of your keypad to something like "Keyboard", and then uncheck that layer in the Directional Light's Culling Mask.
The solution to the second problem is to change the camera's culling mask at run time, like this:
camera.cullingMask = ~(1 << LayerMask.NameToLayer("Keyboard"));
renders everything except the Keyboard layer.
camera.cullingMask = 1 << LayerMask.NameToLayer("Keyboard");
renders only the Keyboard layer.
NOTE: You can create your own layer and check/uncheck whatever you want; this is just an example.
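As a minimal sketch of how those masks might be wired up (the KeypadOverlay name, its methods and the "Keyboard" layer are illustrative assumptions, not from the original project):

using UnityEngine;

// Hypothetical helper: switches the camera between rendering the whole
// scene and rendering only the enlarged keypad.
public class KeypadOverlay : MonoBehaviour
{
    [SerializeField] private Camera mainCamera;
    private int keyboardMask;

    private void Awake()
    {
        // Assumes a layer named "Keyboard" exists in the project settings.
        keyboardMask = 1 << LayerMask.NameToLayer("Keyboard");
    }

    // Call when the wall keypad is clicked.
    public void ShowKeypad()
    {
        mainCamera.cullingMask = keyboardMask; // only the keypad
    }

    // Call when the player has finished entering the code.
    public void HideKeypad()
    {
        mainCamera.cullingMask = ~keyboardMask; // everything except the keypad
    }
}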

Unity Oculus Rift S - Unable to change camera's field of view

In my current project, I am using the Oculus Integration package to interface my app with my Oculus Rift S headset and Unity 2021.3.6f1 URP.
While working on the project I played with the camera settings to get better visuals, and noticed that the field of view attribute always goes back to 90 (even when I set it at run time).
I went over the scripts that were imported from the Oculus package, mainly the following ones (since those are the ones used in the project): OVRCameraRig, OVRManager, OVRHeadsetEmulator.
But I wasn't able to find the cause anywhere in the code. I even searched through all the scripts (using my IDE) for any piece of code that changes the fieldOfView property, and found some, but none of them is used in the project, and commenting those lines out made no difference…
So, my question is: why can't I change my camera's field of view? What causes it to constantly be set to 90?
It's a bad idea to change the default FOV on a VR camera: it's meant to match the actual FOV of the headset, and most users will experience heavy nausea if you change it more than a few degrees from the correct value. If you want to experience just how bad this feels, place a quad in front of the main camera with an unlit textured material whose texture is written to by another camera (with a different FOV). This simple setup will not give you stereoscopy, but it should be enough to experience just how bad of an idea this is.
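As a rough sketch of that experiment (the FovDemo name and fields are illustrative; it assumes a quad already parented in front of the main camera and a material using an unlit shader):

using UnityEngine;

// Hypothetical demo: a second, non-VR camera renders the scene with a
// custom FOV into a RenderTexture, which is shown on a quad that sits
// in front of the VR camera.
public class FovDemo : MonoBehaviour
{
    [SerializeField] private Camera fovCamera;  // second camera, not the VR one
    [SerializeField] private Renderer quad;     // quad parented to the main camera
    [SerializeField] private float customFov = 60f;

    private void Start()
    {
        var rt = new RenderTexture(1024, 1024, 24);
        fovCamera.stereoTargetEye = StereoTargetEyeMask.None; // render mono
        fovCamera.fieldOfView = customFov;  // works here, unlike on the VR camera
        fovCamera.targetTexture = rt;
        quad.material.mainTexture = rt;     // material should use an unlit shader
    }
}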

Why is the Motion Blur post processing effect not working in Unity (mac)?

I just can't seem to get motion blur to work in my Unity game. I have added the Post Processing package, created a custom layer for it, added a Post Process Layer to my camera and a Post Process Volume object to my scene, and I've linked them together using the dedicated layer. I've also added the effects I want to the Post Processing Profile.
I'm pretty sure I've done this all correctly, as I have successfully added many other post-processing effects, including Bloom, Vignette, Depth of Field and Lens Distortion. These work fine. But when I add Motion Blur, despite turning the Sample Count up to maximum, there simply doesn't seem to be any difference.
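For what it's worth, a quick way to sanity-check the effect from code (a sketch assuming the Post Processing v2 package; the MotionBlurCheck name and the serialized profile field are mine, not from the project):

using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Sketch: logs whether the profile actually contains an enabled MotionBlur
// effect and what its settings are (assumes Post Processing v2).
public class MotionBlurCheck : MonoBehaviour
{
    [SerializeField] private PostProcessProfile profile;

    private void Start()
    {
        if (profile.TryGetSettings(out MotionBlur blur))
        {
            Debug.Log($"Motion blur enabled: {blur.enabled.value}, " +
                      $"shutter angle: {blur.shutterAngle.value}, " +
                      $"samples: {blur.sampleCount.value}");
        }
        else
        {
            Debug.Log("No MotionBlur settings found on this profile.");
        }
    }
}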
My scene is very simple at the moment, containing only a sphere with its normals inverted, centred around a user-controlled camera - the standard '360 Degree Photo' setup, basically. There are no lights and I am using a bright white ambient light to illuminate everything equally and optimally. The render pipeline is the default one for 3D games.
I have tried both spinning the camera AND spinning the sphere using the mouse. Neither seems to result in any appreciable blur. Anyone know why this is not working?
Unity Version: 2020.3.3f1 (Personal)
Computer: 2013 iMac

Unity LWRP does not see some layers

I have a 2D Unity project with two cameras: the main one and one designed for a parallax effect.
After I installed LWRP to set up lights, the second camera stopped showing its layer in game.
Is there a way to fix this?
Rendering with two cameras at the same time, as you are describing ("camera stacking"), is not currently supported in LWRP or URP. There is some discussion about adding support for it again.
You could try using a camera to render onto a render texture and display that as your background.
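A minimal sketch of that approach (names are illustrative; it assumes the parallax layers are only on the second camera's culling mask and that the RawImage sits on a canvas sorted behind the rest of the scene):

using UnityEngine;
using UnityEngine.UI;

// Sketch: the parallax camera draws its layers into a RenderTexture,
// which a RawImage displays as the background.
public class ParallaxBackground : MonoBehaviour
{
    [SerializeField] private Camera parallaxCamera;
    [SerializeField] private RawImage background;

    private void Start()
    {
        var rt = new RenderTexture(Screen.width, Screen.height, 16);
        parallaxCamera.targetTexture = rt; // camera no longer renders to screen
        background.texture = rt;           // canvas must be sorted behind the scene
    }
}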
As for the 2D light: it is present but greyed out. You should be able to enable experimental features in the Player settings for the project.

Physics messed up with cardboard scene in Unity

I am in the process of putting together an app using the Google Cardboard SDK. The user will be able to use the app with or without Cardboard, so there is a switch button inside the app that activates and deactivates stereo rendering.
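(For context, if I recall the 0.5x-era Cardboard API correctly, such a switch boils down to flipping Cardboard.SDK.VRModeEnabled; treat the exact property as an assumption:)

using UnityEngine;

// Sketch of the stereo on/off switch, assuming the 0.5x Cardboard SDK
// exposes VR mode through Cardboard.SDK.VRModeEnabled.
public class StereoToggle : MonoBehaviour
{
    // Hook this up to the in-app switch button.
    public void ToggleStereo()
    {
        Cardboard.SDK.VRModeEnabled = !Cardboard.SDK.VRModeEnabled;
    }
}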
The app also uses the Vuforia SDK to track image targets. If a specific target is recognized, some 3D objects appear above the target and a particle system starts to emit particles.
Everything works fine in non-stereo mode. Particles are emitted and fall correctly, as intended; they are supposed to simulate snow. Also, if the user tilts the image target, the 3D objects above it fall down.
When switching to stereo mode, the physics are messed up completely. The snow particles no longer fall; they seem to "teleport" around the screen. The 3D objects fall upwards, as if under heavy negative gravity. The timescale seems multiplied several times, but it is not - I double-checked that. Gravity also does not change when switching between non-stereo and stereo rendering.
Everything works fine in the Unity Editor in both modes. The problem only appears on the device, which is an iPhone 5.
Cardboard SDK is version 0.52, which is the newest.
Unity is version 5.3.1.
Vuforia was 5.0.6, which is not the newest, but the release notes do not indicate a fix concerning physics. I updated it anyway as a next step; Vuforia is now 5.0.10, which is the latest version.
I double checked gravity and timescale, which are not changing when switching between modes. I have a hard time figuring out what might cause the physics to mess up.
EDIT:
I did some further investigation. I made a little gizmo that always sits in front of the camera but takes the rotation of the Unity world-space axes, so I know how the 3D world is oriented in relation to the camera. It turns out that in VR mode, with the Google Cardboard camera system, the world spins heavily around the camera. I managed to hold the test device in a way that slows it down until it almost freezes, but I have no explanation for the effect yet.
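For reference, a minimal sketch of such a gizmo (the WorldAxesGizmo name is mine; attach it to an object with some visible axis geometry):

using UnityEngine;

// Debug gizmo: follows the camera at a fixed distance but keeps world-space
// orientation, so any unexpected rotation of the world becomes visible.
public class WorldAxesGizmo : MonoBehaviour
{
    [SerializeField] private Transform cameraTransform;
    [SerializeField] private float distance = 2f;

    private void LateUpdate()
    {
        transform.position = cameraTransform.position
                           + cameraTransform.forward * distance;
        transform.rotation = Quaternion.identity; // stay aligned with world axes
    }
}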
I managed to get my setup right again. Unfortunately, I did not find the source of the weird behavior, but deleting the Vuforia prefab and the Cardboard prefab and adding them back to the scene solved the problem.