Unity Oculus Rift S - Unable to change camera's field of view - unity3d

In my current project, I am using the Oculus Integration package to interface my app with my Oculus Rift S headset and Unity 2021.3.6f1 URP.
While working on the project I played with the camera settings to get better visuals, and noticed that the field of view attribute always goes back to 90 (even when I set it at run time).
I went over the scripts that were imported from the Oculus package, mainly the following ones (since those are the ones used in the project): OVRCameraRig, OVRManager, OVRHeadsetEmulator.
However, I wasn't able to find the cause anywhere in that code. I even searched through all the imported scripts (using my IDE) for any piece of code that changes the fieldOfView property, and found some scripts, but none of them is used in the project, and commenting those lines out made no difference…
So, my question is: why can't I change my camera's field of view? What causes it to constantly be reset to 90?
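For reference, this is roughly what I'm trying (a minimal hypothetical repro, not code from the Oculus package):

    using UnityEngine;

    public class FovTest : MonoBehaviour
    {
        void Start()
        {
            Camera.main.fieldOfView = 60f;   // attempted runtime change
        }

        void Update()
        {
            // With the Rift S active this keeps reporting 90, the headset-driven value.
            Debug.Log(Camera.main.fieldOfView);
        }
    }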

It's a bad idea to change the default FOV on a VR camera: it's meant to match the actual FOV of the headset, and while an XR device is active Unity overrides the camera's fieldOfView with the value reported by the headset, which is why it keeps snapping back to 90. Most users will experience heavy nausea if you change it more than a few degrees from the correct value. If you want to experience just how bad this feels, place a quad in front of the main camera with an unlit textured material, and write to that texture from another camera with a different FOV. This simple setup will not give you stereoscopy, but it should be enough to show just how bad an idea this is.
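A minimal sketch of that experiment, assuming a second camera and a quad you assign in the Inspector (FakeFovQuad and its fields are placeholders, not part of the Oculus Integration package):

    using UnityEngine;

    public class FakeFovQuad : MonoBehaviour
    {
        public Camera fovCamera;     // extra camera whose FOV we control ourselves
        public Renderer screenQuad;  // quad parented ~1 m in front of the main VR camera, unlit textured material
        public float fakeFov = 110f;

        void Start()
        {
            var rt = new RenderTexture(1024, 1024, 24);
            fovCamera.stereoTargetEye = StereoTargetEyeMask.None;  // keep this camera off the HMD
            fovCamera.fieldOfView = fakeFov;                       // not overridden, since it isn't an XR eye camera
            fovCamera.targetTexture = rt;
            screenQuad.material.mainTexture = rt;                  // show the distorted view on the quad
        }
    }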

Related

Hololens + Unity: GameObjects are invisible

After I build my Unity project and send it to the Hololens, I have the following problem:
The splash screen appears, followed by a debugging window at the bottom. In the background there is a white net. However, you can't see any game objects. I've tested a lot but haven't found a solution. Visual Studio does not display any error messages. What I've looked at, roughly:
These are my modules. I'm using Unity 2019.4.22f1 and the MRTK Foundation Toolkit 2.7.2.
- My build settings
- My project settings
- I tried to place the objects in the middle of the camera and changed the colors.
- MRTK settings (I haven't changed anything here for the most part)
- Main camera settings
- My scene
When I start the scene I get this error in the console. I don't know if this has anything to do with my problem.
I have two possible solutions (no guarantee):
You could spawn the objects on input directly in front of the camera and add a Debug.Log("object in front of you"); so you can track down the issue (a minimal sketch of this is below).
If this doesn't work, I would try testing different types of materials, like you do with HDRP.
If this does not work either, I probably can't help you out for now.
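A minimal sketch of the first suggestion, assuming a plain MonoBehaviour and a placeholder testPrefab field (replace Input.anyKeyDown with your MRTK input event if you use one):

    using UnityEngine;

    // Spawns a test object right in front of the camera and logs, so you can
    // tell whether objects exist but are simply not visible.
    public class SpawnInFront : MonoBehaviour
    {
        public GameObject testPrefab;   // placeholder: any simple cube/sphere prefab

        void Update()
        {
            if (Input.anyKeyDown)
            {
                Transform cam = Camera.main.transform;
                Instantiate(testPrefab, cam.position + cam.forward * 1.5f, Quaternion.identity);
                Debug.Log("object in front of you");
            }
        }
    }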
It seems like your GameObject is far enough away that it is hidden behind the spatial mesh. Please make the spatial mesh invisible by setting the Display Option property of the Spatial Mesh Observer settings to None; this item can be found under the Spatial Awareness profile of the MRTK profile.
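If you prefer to do the same thing from code, a sketch based on the MRTK 2.x spatial awareness API (adjust to your profile setup) could look like this:

    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.SpatialAwareness;

    public static class SpatialMeshUtil
    {
        // Turns off spatial mesh rendering at runtime (equivalent to setting
        // Display Option to None in the Spatial Mesh Observer profile).
        public static void HideSpatialMesh()
        {
            var access = CoreServices.SpatialAwarenessSystem as IMixedRealityDataProviderAccess;
            var meshObserver = access?.GetDataProvider<IMixedRealitySpatialAwarenessMeshObserver>();
            if (meshObserver != null)
            {
                meshObserver.DisplayOption = SpatialAwarenessMeshDisplayOptions.None;
            }
        }
    }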

How to get Order Independent Transparency in VR/Unity working?

I'm using this GitHub project to create order-independent transparency in my projects. It works fine with a normal camera [no OIT] [OIT], but when I try to use it with a VR setup, it just doesn't render any object on the "transparent" layer that uses the above project. [VR no OIT] [VR not working with OIT].
On top of not rendering anything marked as transparent, the left and right cameras become offset in a way they shouldn't be, so you get a disorienting effect as if your eyes are in the wrong place.
I'm using the Oculus SDK, but I don't think it's that. The same thing happens if I just use a camera that feeds to the Oculus headset.
Here is the Unity project, if you want to see for yourself.
Thank you
EDIT: I was also occasionally getting a weird effect where the spheres would render all black and remain centered on my left eye, and the rest of the scene was rendered upside down. The right eye would render everything not marked as transparent correctly. I believe I was using the single-camera setting in the OVR camera rig when this happened, but I couldn't get it to reproduce the same error when I went back to record these errors.
Additionally, I am using Unity 2018.2.0f2 and Oculus SDK v1.30.1.

Unity gear VR reticle pointer shows double when focusing on close objects

I'm developing a VR app in Unity for the Samsung Gear VR and I'm trying to implement a pointer so the user can interact with the objects in the scene. When you look at distant objects it looks fine, but when you focus on close objects (which is highly needed for the app mechanics) the pointer appears to be duplicated, so you need to center the desired object in the middle of the points :P
What I've tried:
- Using the GvrReticlePointer that comes with the GoogleVR package for Cardboard
- Creating my own pointer by adding a canvas to the main camera with an image in the center
- Changing some of the camera settings like field of view, stereo separation, etc.
- Configuring my phone via a QR code http://imgur.com/fVrNrQk
Steps to reproduce (with a canvas added to the camera)
1.- Create a simple scene with a few objects to look at in Unity
2.- Set the build settings for Android
3.- Configure the player settings to enable "Virtual Reality Supported"
4.- Add Oculus as a Virtual Reality SDK
5.- Set the package name and minimum API level
6.- Add a canvas to the camera
7.- Add an image to the canvas; a cross will do the job
Observations
I'm using Unity 5.6.0b10 since Google Cardboard's site recommends this version for the GoogleVR package. I'm using the Samsung Gear VR with a Samsung Galaxy S6 edge+ phone.
Solved
Apparently this is a well-documented issue called voluntary diplopia, and it's a human bug, not a software one (see Unity's documentation, section "The Reticle Interaction in VR").
The problem is trying to put the reticle at a fixed point in the user interface, like in traditional 3D games. When looking at closer objects in VR, this causes the seeing-double problem.
The solution is to position the reticle at the point in 3D space the user is looking at: if they're looking at something closer, the reticle is drawn closer. Of course, you also have to scale the reticle accordingly, so users see it at the same apparent size no matter where they're looking (a rough sketch is below).
Unity also provides some example scripts for this; you can find them in the Asset Store, in a package called VR Samples.
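A minimal version of the same idea (not the VR Samples code; field names like reticle and sizePerMeter are placeholders):

    using UnityEngine;

    // Raycast from the camera, park the reticle at the hit point, and scale it
    // with distance so it keeps the same apparent size.
    public class WorldSpaceReticle : MonoBehaviour
    {
        public Transform reticle;           // a small quad/sprite in world space, not on a screen-space canvas
        public float defaultDistance = 10f; // used when nothing is hit
        public float sizePerMeter = 0.02f;  // tune so the reticle looks right at 1 m

        void LateUpdate()
        {
            Transform cam = Camera.main.transform;
            float distance = defaultDistance;

            RaycastHit hit;
            if (Physics.Raycast(cam.position, cam.forward, out hit))
                distance = hit.distance;

            reticle.position = cam.position + cam.forward * distance;
            reticle.rotation = Quaternion.LookRotation(cam.forward);
            reticle.localScale = Vector3.one * sizePerMeter * distance;
        }
    }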
Now I have performance issues (I'm working on mobile platforms): sometimes, when you turn your head fast, you can see the reticle where it was drawn before. But it looks way better than the double-reticle version.

Physics messed up with cardboard scene in Unity

I am in the process of putting together an app using the Google Cardboard SDK. The user will be able to use the app with or without Cardboard, so there is a switch button inside the app that activates and deactivates stereo rendering.
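For reference, the switch does nothing more exotic than toggling VR mode; with the 0.5-era Cardboard prefab it looks roughly like the sketch below (assuming the scene contains the Cardboard prefab that exposes the Cardboard.SDK singleton; the property name may differ in other SDK versions):

    using UnityEngine;

    public class StereoToggle : MonoBehaviour
    {
        // Hook this up to the switch button's onClick / toggle event.
        public void SetStereo(bool enabled)
        {
            Cardboard.SDK.VRModeEnabled = enabled;  // turns stereo rendering on/off
        }
    }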
The app also uses the Vuforia SDK to track image targets. If a specific target is recognized, some 3D objects appear above the target and a particle system starts to emit particles.
Everything works fine in non-stereo mode. Particles are emitted and fall correctly, as intended; they are meant to simulate snow. Also, if the user tilts the image target, the 3D objects above fall down.
When switching to stereo mode, the physics are messed up completely. The snow particles no longer fall; they seem to "teleport" around the screen. Also, the 3D objects fall upwards, as if under really heavy negative gravity. The timescale seems to be multiplied several times, but it is not; I double-checked that. Gravity also does not change when switching between non-stereo and stereo rendering.
Everything works fine in the Unity Editor in both modes. The problem only appears on the device, which is an iPhone 5.
Cardboard SDK is version 0.52, which is the newest.
Unity is version 5.3.1.
Vuforia was 5.0.6, which was not the newest, but its release notes did not indicate a fix concerning physics. I have since updated it to 5.0.10, which is the latest version.
I double-checked gravity and timescale, which do not change when switching between modes. I'm having a hard time figuring out what might be causing the physics to break.
EDIT:
I did some further investigation. I made myself a little gizmo that always sits in front of the camera but takes on the rotation of the Unity world-space axes, so I can see how the 3D world is oriented in relation to the camera. It turns out that when in VR mode with the Google Cardboard camera system, the world spins heavily around the camera. I managed to hold the test device in a way that slows the spinning down until it almost freezes, but I have no explanation for the effect yet.
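The gizmo itself is trivial; a sketch of the idea (axesModel is a placeholder for any small XYZ-axes model):

    using UnityEngine;

    // Keeps the axes model in front of the camera but locked to world-space
    // rotation, so any unexpected world spin becomes visible at a glance.
    public class WorldAxesGizmo : MonoBehaviour
    {
        public Transform axesModel;
        public float distance = 0.5f;

        void LateUpdate()
        {
            Transform cam = Camera.main.transform;
            axesModel.position = cam.position + cam.forward * distance;
            axesModel.rotation = Quaternion.identity;  // world axes, not camera axes
        }
    }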
I managed to get my setup working again. Unfortunately, I did not find the source of the weird behavior, but by deleting the Vuforia prefab and the Cardboard prefab and adding them to the scene again, the problem was solved.

Unity3d animated cursor

To change the cursor I am using this:
UnityEngine.Cursor.SetCursor(CursorTexture,
new Vector2(CursorTexture.width, CursorTexture.height) * 0.5f,
CursorMode.ForceSoftware);
I want to animate the cursor when something happens.
Is it possible to animate the cursor using Cursor.SetCursor?
You can do it like LearnCocos2D says. The problem is that it will flicker a lot, and the other problem you will most likely have is that the mouse pointer will feel really sluggish. This is because software mouse pointers are not rendered by hardware, so the cursor is always a couple of frames behind the user's actual input on the pointing device.
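A rough sketch of that approach, assuming you have the animation frames as an array of textures (frames and frameTime are placeholder fields), and accepting the flicker/lag caveats above:

    using System.Collections;
    using UnityEngine;

    public class AnimatedCursor : MonoBehaviour
    {
        public Texture2D[] frames;      // cursor animation frames
        public float frameTime = 0.1f;  // seconds per frame

        public void Play()
        {
            StartCoroutine(Animate());
        }

        IEnumerator Animate()
        {
            int i = 0;
            while (true)
            {
                Texture2D tex = frames[i];
                // Same call as above, just swapping the texture each frame.
                Cursor.SetCursor(tex, new Vector2(tex.width, tex.height) * 0.5f, CursorMode.ForceSoftware);
                i = (i + 1) % frames.Length;
                yield return new WaitForSeconds(frameTime);
            }
        }
    }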
Then, for the animated texture to work in a web browser, you need to make sure you export the shaders you need, for example by putting them in a Resources folder of your web player project, since many shaders are not exported to the web build by default. It should work if you are using a standard diffuse shader, but since a mouse pointer most likely uses transparency, it may not work by default. You'll need to find the actual shader being used and export it manually for your build.
Unity should have support for hardware animated cursors at least on PC, but sadly it doesn't...