I have a problem with my VR game.
I have built a simple scene, and when I start the game RivaTuner shows 80-90 frames per second, but when I put on the headset the frame rate is very low. I want to know why this happens.
I am using an Oculus Quest 2 with Oculus Link.
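For reference, here is a minimal logging sketch (assuming the built-in UnityEngine.XR API available in recent Unity versions) that prints the refresh rate the headset reports and the frame time the app actually achieves, which makes it easier to compare the in-headset frame rate with what an external tool such as RivaTuner reports for the desktop window:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs the headset's reported refresh rate and the app's current frame rate once per second,
// so the in-headset performance can be compared with what desktop tools report.
public class VrFrameRateLogger : MonoBehaviour
{
    private float timer;

    private void Update()
    {
        timer += Time.unscaledDeltaTime;
        if (timer >= 1f)
        {
            timer = 0f;
            float fps = 1f / Time.unscaledDeltaTime;
            Debug.Log("Headset refresh rate: " + XRDevice.refreshRate + " Hz, current app fps: " + fps.ToString("F1"));
        }
    }
}
```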
Related
I am facing a problem after exporting the project. I do not know what the reason is, but my mobile phone becomes hot after a few minutes.
The project is new and does not have any scripts; I just added the ARKit XR Plugin.
This "thermal condition" is quite common for any device running an Augmented Reality app. The tracking stage of ARKit, RealityKit, ARCore, Vuforia, or MRTK is highly CPU-intensive. Your phone not only tracks and reconstructs the surrounding environment at 60 fps but also simultaneously renders 3D geometry with PBR shaders, textures, shadows, animation, and physics.
In some cases, face tracking is even more CPU-intensive than world tracking. This is because the RGB channels coming from the selfie camera are processed in tandem with a segmented alpha channel and a ZDepth channel coming from the TrueDepth sensor, and more than 50 facial blendshapes deform the geometry every 1/60 of a second.
Pay particular attention to the fact that native Xcode builds of ARKit apps written in Swift (using UIKit or, especially, SwiftUI) run considerably faster than Unity builds of ARKit apps.
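If the immediate goal is just to keep the phone cooler while testing, a small sketch like the following (ordinary Unity quality settings only, nothing ARKit-specific; treat the exact values as assumptions to tune) caps the app's frame rate and trims the heaviest rendering features mentioned above:

```csharp
using UnityEngine;

// Rough mitigation sketch: lower the render load so the device heats up more slowly.
// These are standard Unity settings, not ARKit-specific ones.
public class ThermalBudget : MonoBehaviour
{
    private void Start()
    {
        Application.targetFrameRate = 30;                // render at 30 fps instead of 60
        QualitySettings.shadows = ShadowQuality.Disable; // drop real-time shadows
        QualitySettings.antiAliasing = 0;                // disable MSAA
    }
}
```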
I'm creating a VR game for the Gear VR. I'm using Unity 2019.1.2. Even though it shows a high frame rate and a low number of draw calls in my game, it responds slowly when playing. I reduced the number of materials and am now using only 2 to 3 materials for the whole scene. I tested on a Samsung S6. I can't figure out what's wrong here.
Many thanks for your consideration.
I am kind of new to Unity in general. I am building a 2-player online board game using PUN 2, where the players' cameras face each other from opposite sides. I rotate the game board 180 degrees for the second player, but I have run into a problem with positions changing, so rotating the board 180 degrees does not seem to be the correct approach.
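One common alternative, shown here as a minimal sketch (the `boardCenter` reference is illustrative, and using PUN's master client to distinguish the two players is an assumption about the room setup): rotate the second player's camera rig around the board instead of rotating the board itself, so the networked piece positions stay identical on both clients.

```csharp
using Photon.Pun;
using UnityEngine;

// Attach to the local camera rig. Instead of rotating the shared board for player 2,
// spin this client's camera 180 degrees around the board centre, so the board's
// networked positions remain the same on both clients.
public class OpponentCameraSetup : MonoBehaviour
{
    [SerializeField] private Transform boardCenter;  // illustrative: an empty object at the board's middle

    private void Start()
    {
        // Assumption: the master client is player 1; the joining client is player 2.
        if (!PhotonNetwork.IsMasterClient)
        {
            transform.RotateAround(boardCenter.position, Vector3.up, 180f);
        }
    }
}
```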
Whenever I set the camera's field of view in Unity to 100 (instead of the default 60), either Oculus or Unity limits the FOV to 60 on my Samsung S7 with the Gear VR headset. The 100 is ignored and 60 is used when viewing the VR app in the Gear VR.
Apparently, support for customizing the FOV of a photosphere via the camera in Unity was disabled in March 2017 (or slightly before).
I've heard that the reason you can't change the FOV is that it is a device-specific setting (the S6 differs from the S7, and the Oculus Rift differs from the Gear VR in the FOV value used by the Unity camera).
Why?
The word "Virtual Reality" gives the answer to this question.
Just like you can't change fov of your eyes, you shouldn't be able to mess with fov in the app. Depending on device screen (or other factors), they have pre-setup the value so its most resembling to reality.
Moreover there could be other usability problems like VR-Sickness. So Unity took control of this.
Hope this helps.
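A small sketch that makes this visible (assuming the legacy UnityEngine.XR settings API available in those Unity versions): set a custom FOV, wait one frame, and log what the camera actually reports once the headset has taken over.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

// Illustrative check: when a VR device is active, Unity overwrites the camera's
// fieldOfView with the headset's own value, so manual changes are ignored.
public class FovCheck : MonoBehaviour
{
    private IEnumerator Start()
    {
        Camera cam = Camera.main;
        cam.fieldOfView = 100f;   // attempt a custom FOV
        yield return null;        // wait one frame for the XR system to update the camera

        Debug.Log("XR device: " + XRSettings.loadedDeviceName + ", enabled: " + XRSettings.enabled);
        Debug.Log("Camera FOV after one frame: " + cam.fieldOfView);  // reports the device value, not 100
    }
}
```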
I have downloaded the Leap Motion core assets from the official website. What I am trying to do is see my hands in the Oculus Rift. The core assets include some predefined scenes, for example 500Blocks. However, when I load this scene I just get a scene with blocks, and my hands are not detected. I'm pretty sure that the Oculus Rift and Leap Motion are both turned on. You can see what I get in the picture.
What I want is simply to have my hands detected and to be able to interact with the cubes. How can I do this?
I have Leap Motion 2.2.7, Oculus Rift 2, and Unity 5.1.1. I built the scene and launched the directToRift version.
Is your Leap Motion plugged into the Oculus Rift (won't work) or directly into your machine (more likely to work)?
Is your Leap Motion working correctly on its own? For example, try the Visualizer while the Oculus Rift is running.
Do you have "Allow Images" enabled in the Leap Motion Control Panel (accessible through the taskbar icon)? The white background suggests that passthrough is turned off. 500 Blocks uses our Image Hands assets, so passthrough is needed to see your hands.