Are there optimizations in the render pipeline for virtual reality? - virtual-reality

I'm trying to get my head around the render pipeline on a head-mounted display, given that we have a target refresh rate of 90 Hz per screen.
Is there any efficiency built into the pipeline that reduces the compute load by exploiting the smaller delta from one frame to the next in VR?
What I'm wondering is: given the same on-screen movement, fewer pixels change between frame A and frame B at 90 fps than between frame A and frame B at 45 fps.
I.e., is the workload of advancing from one frame to the next reduced in any way by these extra frames?
http://imgur.com/6rRiWGM

AFAIK all frames on VR HMDs are rendered from scratch, as in other 3D applications. If there were a good method to magically interpolate the rendering, why would it only be used in VR?
There is, however, another trick called timewarp. With a proper asynchronous timewarp implementation, if you don't provide a new frame in time, the last one is rotated by the delta of your headset's rotation.
So when you look around, the head movement still looks as if your app were running at a high frame rate.
If you aren't moving and there is nothing "stuck" to the camera, like a GUI, this is a very good illusion.
Currently timewarp works well on Gear VR and the Vive, and possibly on the production-ready Oculus Rift (but not on the DK2 with the 0.8 drivers; I still haven't got my new headset).
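As a rough illustration of what rotational timewarp does, here is a minimal, engine-agnostic Python sketch. The function name and the small-angle pixel-shift approximation are mine, not any SDK's API: it just estimates how far to shift the last rendered image sideways when only the head's yaw has changed since that image was rendered.

```python
def timewarp_shift(last_frame_yaw_deg, current_yaw_deg,
                   screen_width_px, horizontal_fov_deg):
    """Crude 2D approximation of rotational timewarp: how many pixels
    to shift the last rendered frame so it matches the new head yaw.
    Real implementations re-project the image through the new view
    matrix on the GPU; this is only the intuition."""
    delta_deg = current_yaw_deg - last_frame_yaw_deg
    # pixels per degree under a small-angle approximation
    px_per_deg = screen_width_px / horizontal_fov_deg
    return delta_deg * px_per_deg

# Head turned 1 degree since the last frame on a 1000 px wide,
# 100-degree FOV display: shift the old image by 10 px.
shift = timewarp_shift(0.0, 1.0, 1000, 100.0)
```

The key point is that this correction needs only the old image plus the new head orientation, which is why it is so much cheaper than rendering a fresh frame.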

Related

Unity Oculus Quest game stutters/lags when head is moved from side to side

Firstly, I've built for the Quest and checked the Unity profiler - everything seems good.
The weird thing is, when I move my head from side to side I don't see any framerate drops in the profiler but I see a noticeable stutter/lag in the headset. When my head is still or even just rotating, everything looks great, there's no stutter at all.
This stutter only occurs when I'm close to my UI and moving my head. When my head is static or rotating, it's all good. If I move back from the UI, there's no stutter either.
I don't think that it's an issue with too complex geometry or materials, as surely this would show up as a framerate drop in the profiler. I'm guessing that the camera movement is causing some kind of frame mismatch which (for some weird reason) isn't showing up in the profiler.
Does anybody have any ideas as to what might be causing this?
Well, I found the issue after narrowing the issue down to a specific set of GameObjects. It seems that the issue was being caused by a custom 'Dissolve' shader I was using. After setting the shaders back to 'standard' the problem went away! Weird...
This happens if your game FPS is lower than the device refresh rate.
For example if the headset is displaying 90 frames per second and your game is only able to render 70 frames per second, the headset needs to reuse a rendered frame every few frames.
When the game doesn't provide a rendered frame in time, the device takes the last rendered image and adjusts it to the changed headset position and rotation. If it's a rotation change only, you won't notice, because the headset can easily just rotate the last image. But if there is also movement, the device can't simulate how objects close to the camera shift more on the image than far objects do. So if you are moving an object close to the camera (e.g. your hand), it will stay at the same position for two frames (every few frames), which makes it look like it's stuttering.
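The parallax effect described above can be quantified with a pinhole-camera approximation. This Python sketch (the names and numbers are illustrative, not any device's API) shows why a hand at arm's length shifts far more on screen than a distant wall for the same sideways head movement, which is exactly what a rotation-only warp cannot reproduce:

```python
def screen_shift_from_translation(camera_move_m, depth_m, focal_px):
    """On-screen shift (in pixels) of a point at a given depth when
    the camera translates sideways: shift = focal * move / depth."""
    return focal_px * camera_move_m / depth_m

# 1 cm of sideways head movement, assuming a focal length of 800 px:
near = screen_shift_from_translation(0.01, 0.3, 800)   # hand at 30 cm
far  = screen_shift_from_translation(0.01, 10.0, 800)  # wall at 10 m
```

The near object moves dozens of pixels while the far one moves less than one, so a warp that applies a single uniform correction to the whole image is bound to leave near objects visibly lagging.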

Unity 2D - Animation drops FPS dramatically

I created a 2D sprite animation using the 2D Animation package and the 2D IK package. My character is one sprite sheet (PSB file). In the PSB file all the sprites (eyes, mouth, etc. - the character is basically a square with a face) are arranged and bones are attached. Then I animated the character's idle animation in Unity.
The animation is complex and is a total of 1028 frames (about 17 seconds).
The scene is almost empty otherwise. There are a few sprites with colliders and rigid bodies for simple platforms. There is a background image which is 1024 px. x 1024 px.
In play mode the FPS drops down to around 30 FPS (and under).
I have another scene without the animation but with a HUGE number of assets (for a 2D scene: hundreds of sprites, many of them constantly in motion). This scene runs at 210+ FPS consistently.
Why does this one animation kill the FPS? I'm just getting started with creating animation for all the characters. If I add similar animations to NPCs in the scene (enemies) then this thing will probably not function at all.
Any suggestions are appreciated.
For anyone having the same issue: make sure you have the Burst and Collections packages installed (via the Package Manager); this dramatically improves the performance of 2D Animation.
So I spent a couple of days this week and created sprite sheet animations to replace the skeletal animation I made in Unity. As I suspected, the performance is light years ahead. I have MORE animations now, and some of them are as long as 270 frames, yet I consistently see greater than 220 FPS. It's too bad that Unity's 2D Animation package is so slow. To accomplish the same animation I was looking for, I ended up rigging and animating my character in Blender 2.8 using Andreas Esau's COA Tools (Cut-Out Animation Tools) add-on.
COA Tools is an awesome tool, but I wish it could export a rendered sprite sheet to Unity. I ended up exporting each frame as a separate image and using TexturePacker to make the 2048 px square sprite sheets I needed. Once I got the sprite sheets into Unity, I was able to quickly set up animation clips and test it out.
As I said, the sprite sheets are far and away better than animation created directly in Unity.
For those who read all the comments on my original post: I will say that there are a lot of scripts running and a lot of computation happening, and it is there in the profiler for sure. But this was definitely not the main culprit for taking down the FPS. It was absolutely the animations.
Looking at your profiler screenshot, SpriteSkin.LateUpdate() takes a large chunk of your frame time. To reduce the amount of time the CPU spends on deformation, you can limit the number of vertices that are used for each sprite part - you can adjust it in the Skinning Editor with the Edit Geometry tool. Basically, the fewer vertices, the better the performance.
Also, make sure that each Sprite Skin component has the Enable Batching option enabled. This allows the Burst and Collections packages to speed up the calculations. For more details, check out the 2D Animation documentation.
Recently, Unity released a free e-book that covers in detail best practices and how to prepare art for 2D animation. You can find it here.

How to have a reference frame for markerless inside out tracking in VR, to achieve absolute positional tracking and prevent drift

We have the new Vive Focus headset, which has markerless inside-out tracking. By default these headsets can only do relative tracking from their initial position. But this isn't totally accurate: you get some drift, and the positions in the virtual world and the real world go out of sync. For the y position, this can mean ending up at the wrong height in the virtual world as well. Also, you don't know the user's absolute position, which you would need for a multiplayer game, for instance, so your players don't run into each other.
Now the ZED camera from Stereolabs has an option to use a reference frame (which I assume is a point map), which it will then use to do absolute positional tracking by calculating your position relative to the reference frame instead of to the last frame (which I assume normal markerless inside-out tracking does). Of course the ZED code is in a DLL, so my question is: how difficult is it to code this system using a reference frame for the Vive Focus or another markerless inside-out tracked headset? Preferably in C#, preferably using the Unity plugin, but any example would help.
And what I'm wondering about this reference frame system is would one reference frame be enough? The ZED documentation says you need to look at approximately the same scene as you were when you first made the reference frame. This makes sense, otherwise how would the system find its reference. But doesn't that mean you would need more references, for the other sides of your room as well? The ZED documentation also says that using the reference frame setting can cause jumps in VR, when syncing to the reference. Would this be a big problem? Because if it would jump all the time, that would only increase motion sickness, which is a big enough problem as it is in VR. And finally, would it require a lot of processing power to track using a reference frame? Because we're dealing with standalone headsets here powered by mobile processors, they have a hard enough time of it as it is.
Or would it be feasible to make something using markers and maybe Vuforia and do absolute positional tracking that way?
Thanks for any input!

Unity 3D low fps

I'm using Unity3D version 5.3. I'm working on a 2D endless-runner game. It works normally on PC, but when I build it for my phone, all of my GameObjects shake while they are moving. The GameObjects are in a respawn loop, and I'm increasing my camera's transform position x. So when my camera is in motion, all of the other objects look like they are shaking a lot, and my game runs slowly on my phone as a result. I tried playing my game on several Samsung phones. It works normally on some of them, but even on some Samsung devices it still shakes. So I don't understand what the problem is. Can you help me with this?
One thing you can do is start optimising, if you have a game that is either finished or close to it. If you open the profiler, click "Deep Profile" and then run it in the editor on your PC, you'll get a very detailed breakdown of what is using the most resources within your game. Generally it's something like draw calls or the physics engine doing unnecessary work.
Another thing that might help is to use Time.deltaTime, if you aren't already. If the script that increases the transform doesn't multiply the increase by Time.deltaTime, then you're moving your camera by an amount per frame rather than per second, which means that if you have any framerate drops for any reason, the camera will move a smaller distance, and that could be throwing off some of your other calculations. Using Time.deltaTime won't improve your framerate, but it will make your game framerate independent, which is very important.
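To see why multiplying by the elapsed time makes movement frame-rate independent, here is a small Python sketch of the idea (in Unity you would multiply by Time.deltaTime in C#; the function names here are illustrative). Simulating one second of movement at different frame rates covers the same distance:

```python
def update_position(x, speed_per_second, delta_time):
    """Frame-rate-independent movement: advance by speed * elapsed
    time, not by a fixed amount per frame."""
    return x + speed_per_second * delta_time

def simulate(fps, seconds=1.0, speed=5.0):
    """Run the update loop at a fixed frame rate and return the
    final position after the given number of seconds."""
    x, dt = 0.0, 1.0 / fps
    for _ in range(int(seconds * fps)):
        x = update_position(x, speed, dt)
    return x

# One second at 90 fps and one second at 30 fps travel the same
# distance (5 units), even though the frame counts differ.
```

If you instead added a fixed amount per frame, the 30 fps run would travel a third of the distance of the 90 fps run for the same wall-clock time, which is exactly the kind of inconsistency that shows up as stutter on slower devices.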

Stuttering animation in iPhone OpenGL ES although fps is high

I am building a 2D OpenGL ES application for iPad. It displays a background texture and numerous textures on top of it which are always in motion.
Every frame, their locations are recalculated based on the time delta and speed, and the entire thing renders at 60 fps successfully, but as the movement speed of the sprites rises, things still look stuttery.
Any ideas? Are there inherent problems with what I'm doing? Are there known design patterns for smooth animation?
Try computing the time delta as an average of the last N frames, because it's possible to have some frames that take more time than others. Do you use the time delta for animations? It is very important to use it! Also, try to load all resources at loading time instead of loading them when you use them; loading on demand can also slow down some frames.
If you take a look at the time deltas, you'll probably find they're not very consistent frame-to-frame. This probably isn't because the frames are taking different amounts of time to render, it's just an artifact of the batched, pipelined nature of the GLES driver.
Try using some kind of moving average of the time deltas instead: http://en.wikipedia.org/wiki/Moving_average
Note that even though the time deltas you're observing in your application aren't very consistent, the fact that you're getting 60fps implies the frames are probably getting displayed at fixed intervals (I'm assuming you're getting one frame per display v-sync). What this means is that the delay between your application making GL calls and the resulting frame appearing on the display is changing. It's this change that you're trying to hide with a moving average.
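A minimal Python sketch of the moving-average suggestion (the class name and window size are illustrative): feed in the raw per-frame delta and use the smoothed value to drive your animation, so one noisy driver-side delta doesn't translate into a visible jump.

```python
from collections import deque

class DeltaSmoother:
    """Moving average over the last N frame times, to hide
    frame-to-frame jitter in the measured time deltas."""
    def __init__(self, window=8):
        self.samples = deque(maxlen=window)

    def smooth(self, raw_delta):
        self.samples.append(raw_delta)
        return sum(self.samples) / len(self.samples)

# Usage: call once per frame with the measured delta,
# and advance animations by the returned smoothed value.
smoother = DeltaSmoother(window=8)
```

Since the display is refreshing at fixed v-sync intervals anyway, the smoothed delta is usually closer to the true presentation interval than any individual measurement.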