I have an application that uses the Unity terrain engine to view the terrain (and models on the terrain) with a few different fields of view. This is essentially a camera with telescopic zoom that transitions from 1x to 3x then to 9x.
The problem I'm having is that the various detail roll-off settings (Detail Distance, Tree Distance, Billboard Start, etc.) are all based on the distance from the camera to the detail. At 3x and 9x zoom the view starts at 200 units and goes out to 2000 units. The landscapes look pretty rubbish: none of the grass shows up, and the trees are all billboarded (like a mid-90s game :-))
I'm trying to set a min & max range for detail based on what I can see in my viewport, not how far the camera is from that detail.
Has anybody got any suggestions as to how I can ramp up distant detail when I'm at the tighter FOVs?
Thanks in advance.
Try adding mipmaps to all the textures in the scene, disable LODs on objects, and then check the detail again. Things to verify:
Make sure you are using a standard scale for your objects (1 unit ≈ 1 meter).
Make sure all texture sizes are powers of two and have mipmaps enabled.
Go to Quality Settings and make sure Maximum LOD Level is 0.
Change the LOD Bias and see if it helps (values above 1 give more detail, values below 1 give less).
Check the Detail Distance on the terrain itself and make sure it is far enough. If the sliders top out too low for your zoom levels, you can also push the distances from a script, as in the sketch below.
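A minimal sketch of that, assuming a single active terrain and a base FOV of 60 degrees at 1x zoom; the property names are from the Unity Terrain API, but the base distances are placeholder values to tune:

```csharp
using UnityEngine;

// Sketch: scale the terrain's distance-based detail settings with the zoom
// factor, so detail rolls off by apparent size rather than raw distance.
public class ZoomDetailScaler : MonoBehaviour
{
    public float baseFov = 60f;              // assumed FOV at 1x zoom
    public float baseDetailDistance = 80f;   // placeholder 1x distances, tune to taste
    public float baseTreeDistance = 500f;
    public float baseBillboardStart = 200f;

    public void ApplyZoom(Camera cam)
    {
        Terrain terrain = Terrain.activeTerrain;
        float zoom = baseFov / cam.fieldOfView;  // e.g. FOV 20 => 3x zoom

        // Push every distance threshold out in proportion to the zoom.
        // Note: the engine may clamp some of these values internally.
        terrain.detailObjectDistance = baseDetailDistance * zoom;
        terrain.treeDistance = baseTreeDistance * zoom;
        terrain.treeBillboardDistance = baseBillboardStart * zoom;
    }
}
```

Call ApplyZoom whenever you change the camera's field of view.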
I want to place virtual objects (holograms) at far distances (20+ meters) in the HoloLens 1. However, at such distances holograms become unstable and appear to "swim" in the display. Has anyone had success with this? What worked for you?
Some potential fixes include:
Ensure 60 FPS
Adjust Stabilization Plane
Employ visual markers (Vuforia)
Use static room scan (may not scale well)
For me, frame rate is not an issue. And I am using Unity 2017.4.4f1. Currently, I have a single world anchor and all objects are set relative to this anchor.
20+ meters is a lot, and I am not sure whether it will work well enough.
Ensuring 60 fps, or at least 50-55+, is important, but this won't solve the swimming at this distance. A low frame rate would only add extra swimming on top :)
Everything that should appear statically placed in the room should be on, or very close to, the stabilization plane. What you want to avoid is having the far objects at very different distances from the user; that would cause the ones farthest from the stabilization plane to swim.
If you only have the far-away object, try placing the stabilization plane at the same distance as that object. If the distance changes a lot, you can also update the stabilization plane at runtime so it always matches the current distance to the object.
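As a rough sketch of that runtime update (this assumes the UnityEngine.XR.WSA.HolographicSettings API that ships with Unity 2017.4; target is a placeholder for your distant hologram):

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;

// Sketch: keep the stabilization plane on a chosen far-away object.
public class StabilizationPlaneFollower : MonoBehaviour
{
    public Transform target;  // placeholder: the distant hologram to stabilize

    void LateUpdate()
    {
        if (target == null) return;

        // The plane normal should point back toward the user.
        Vector3 toCamera = (Camera.main.transform.position - target.position).normalized;
        HolographicSettings.SetFocusPointForFrame(target.position, toCamera);
    }
}
```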
Would be interesting to hear if it worked out :)
One more thing: if I remember correctly, objects should ideally be placed directly at, or in close proximity to, their world anchor to help stabilization.
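For example (a minimal sketch; WorldAnchor is the built-in component, the rest is just illustration):

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;

// Sketch: give a hologram its own anchor at its current pose, instead of
// positioning it far away from one shared anchor.
public static class AnchorHelper
{
    public static void AnchorInPlace(GameObject hologram)
    {
        // A WorldAnchor locks the object's current position/rotation to the
        // real world; keeping content close to its anchor helps stability.
        hologram.AddComponent<WorldAnchor>();
    }
}
```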
20 metres is too far. The docs say:

Best practices: When holograms cannot be placed at 2m and conflicts between convergence and accommodation cannot be avoided, the optimal zone for hologram placement is between 1.25m and 5m. In every case, designers should structure content to encourage users to interact 1+ m away (e.g. adjust content size and default placement parameters).
I'm trying to figure out why my object's textures keep turning white once I scale the object down to 1% (or less) of its normal size.
I can manipulate the objects in real time with my fingers, and there is a threshold where all the textures (except a few) turn completely ghost white, as shown below:
https://imgur.com/wMykeFw
Any input on a fix is appreciated!
One potential cause of this issue is that certain shaders can miscalculate how to render textures when an object's scale is set to very low values.
To render the asset this small while using the same shader, re-import the mesh with a smaller scale factor (in the mesh import settings); that may fix it.
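If you'd rather script the re-import than click through the inspector, a minimal editor-side sketch (the asset path and scale value are placeholders):

```csharp
using UnityEditor;

// Sketch: lower the mesh's import scale factor so the object can stay near
// scale (1,1,1) in the scene while appearing 100x smaller.
public static class MeshRescaler
{
    public static void Rescale()
    {
        var importer = (ModelImporter)AssetImporter.GetAtPath("Assets/Models/MyMesh.fbx"); // placeholder path
        importer.globalScale = 0.01f;  // placeholder: 1% of the original import scale
        importer.SaveAndReimport();
    }
}
```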
Select the ARCamera, then the camera. In the Inspector, select the camera's clipping planes and increase the far value. (You want to find the minimum clipping distance that still works, to save on memory, so start at 20000 and work your way backwards until it stops working, then back up a notch.)
Next (still in the camera's Inspector), set the Rendering Path to Legacy Vertex Lit.
This should clear it up for you.
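The same two changes from script, if that's easier to experiment with (a sketch; the 20000 starting value is just the suggested upper bound to tune down from):

```csharp
using UnityEngine;

// Sketch: apply the far clip plane and rendering path suggested above.
// Attach to the camera under the ARCamera.
public class CameraFixup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.farClipPlane = 20000f;                    // tune downward until clipping reappears
        cam.renderingPath = RenderingPath.VertexLit;  // "Legacy Vertex Lit"
    }
}
```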
I'm new to Swift, and I'm trying to build a sky map app like the application "Star Chart".
I already have a sky map image from NASA and have wrapped it around an SCNSphere, and I've set a camera node at the center of the sphere so that it looks like a 360-degree view. Furthermore, I used the accelerometer to check which direction the camera is looking.
I know that a sky map app like "Star Chart" doesn't need the internet to update its data. So now the biggest problem is that I don't know how to correct the position of my sky map according to the user's current time and location.
Any good advice or help? Thanks in advance! I've tried very hard to find related information but have been stuck here for three weeks.
You just need to rotate your map by time + longitude around Earth's rotation axis, and by latitude around the axis at longitude = 90 degrees, with the Earth placed at the center of your sphere. For stars the offset does not matter, so you can ignore the Sun-Earth distance and Earth's radius as well.
The time rotation must combine the daily and yearly rotations. On top of that, you have to apply precession and nutation if you want higher precision.
Of course the stars are moving too, so if you need really high precision and/or need to cover a long time interval (hundreds or thousands of years), then this approach is not good and you should use a stellar catalog with the proper motions implemented.
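For the time + longitude part, a commonly used first-order approximation of the local sidereal time (the angle by which to rotate the star sphere around the polar axis) is, in degrees:

LST ≈ 100.46 + 0.985647 * d + lon + 15 * UT   (mod 360)

where d is the number of days (including the fraction) since J2000.0 (12:00 UT on 1 January 2000), lon is the east longitude in degrees, and UT is the universal time in hours. This is only an approximation, good to a fraction of a degree for dates within a century or so of 2000; apply the latitude rotation described above afterwards.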
For more info see related:
How to draw sky chart
Plotting a star chart efficiently
If you want to use a catalog and real colors, then you will also need:
Star B-V color index to apparent RGB color
simplified atmospheric scattering GLSL shader
And finally, here are some hints for such applications:
Is it possible to make realistic n-body solar system simulation in matter of size and mass?
I'm struggling with performance in Unity. I created a very simple game scene: very low poly, with 2 light sources (1 directional, 1 point), and the Standard shader with Albedo and Occlusion textures set (this goes for all of the few 3D objects in the scene).
The issue is, I was expecting the fps to be around 60, but it is 29ish.
What do I have to consider regarding performance in this scenario? It is very frustrating, since it is a very, very simple scene.
see images:
In your Quality Settings, as mentioned by Hamza, set the shadow resolution to Medium and set V Sync Count to "Don't Sync".
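If you want to apply those from code instead of the Quality Settings window (a sketch using the built-in QualitySettings API):

```csharp
using UnityEngine;

// Sketch: apply the two quality tweaks suggested above at startup.
public class PerfSettings : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;                              // "Don't Sync"
        QualitySettings.shadowResolution = ShadowResolution.Medium;  // medium shadow maps
    }
}
```

With VSync off, you can also cap the frame rate yourself via Application.targetFrameRate if needed.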
UPDATE: the problem was the floor size. I just made it smaller and the problem is solved.
Hello there, I've spent almost a week googling already... hope that you can help me with this. I baked my shadows; the result and all the needed settings are shown in the picture. The problem is that the quality of the shadow on the tree is much, much better than that of the shadow on the ground. Do you have any guesses why?
I just made this one tree bigger than the others so that it's easy to see the problem...
(Shadow quality is set to the highest; the rendering path is Deferred Lighting.)
Reducing the size of the ground object solved the issue.
The likely reason this works is that the texel budget per object is limited by the lightmap size, regardless of the user-defined texels per unit. For example, if lightmaps are 2048x2048 and the ground is 2x2 km, there is only ~1 texel per square meter available, assuming 1 unit is 1 meter.
Although it was not the case here, similar inconsistencies between shadows can also show up when dynamic and static shadows use different resolutions.