iPhone 5S - Possible Depth Buffer Issue

In my application, I render one plane over another. The lower plane has Z = 0, the second one has Z = 0.5. If I render them (lower plane first), part of the render is missing, as shown in the picture.
On iPhone 4 and on desktop (using an ES emulator), everything renders correctly. What could cause this behaviour?
The same problem also occurs for other parts of the scene, like the tracks and tubes (green and blue in the picture). The problem appears when I move the camera.

OK... I have solved this. There was a problem in my shader that caused the depth buffer to be filled incorrectly.
I had used
precision mediump float;
and that caused the geometry to lose precision, so Z = 0 and Z = 0.5 were mixed together.
Changing the precision to highp solved the issue.
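For reference, a minimal GLSL ES vertex-shader skeleton with the corrected qualifier (the attribute and uniform names are illustrative, not from the original project):

// Vertex shader (OpenGL ES 2.0). Use highp so transformed positions, and
// therefore the depth values derived from them, keep full precision.
precision highp float;

attribute vec3 aPosition;   // illustrative attribute name
uniform mat4 uMvpMatrix;    // illustrative uniform name

void main()
{
    gl_Position = uMvpMatrix * vec4(aPosition, 1.0);
}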
Bottom line: this "optimization" was a huge mistake. Never use mediump in a vertex shader unless you are facing a real performance problem, and even then it is rarely worth it; the difference in rendering is not noticeable.

(This is in response to your own answer, which is only partially correct)
You've got a case of Z-fighting going on, due to the mapping of your scene's Z values to the z-buffer. This may be a non-linear mapping (a 1/Z-style mapping is common), but I'm not sure about floating-point z-buffers.
Your scene is really simple, and while throwing more z-buffer precision at the problem is a partial fix, it comes at a performance cost and doesn't address the underlying issue. You may well run into the same problem again, even with the highest-precision z-buffer your platform supports!
Look at your scene: you want to map the Z range of the 3D scene to the maximum possible range of values the z-buffer can store; otherwise you're wasting chunks of that range. Recalculating this mapping per frame can be useful, depending on what you want to do with the z-buffer later on.
Have a look here for some calculations. Note that with a floating-point z-buffer you may well be worse off than with an integer one if you're throwing away a lot of small numbers - that's where the vast majority of a floating-point format's representable values are!
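To make the non-linear mapping concrete, here is a small sketch (assuming a standard OpenGL-style perspective projection with near plane n and far plane f; the numbers in the comment are illustrative, not from the question):

// Depth value written to a [0, 1] depth buffer for an eye-space distance d,
// given near plane n and far plane f (standard perspective projection).
static float DepthBufferValue(float d, float n, float f)
{
    return f * (d - n) / (d * (f - n));
}

// With n = 0.1 and f = 1000, surfaces at d = 10 and d = 10.5 map to roughly
// 0.9901 and 0.9906: almost the entire buffer range is spent near the camera,
// which is exactly where Z-fighting shows up when precision runs out.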

Related

FPS drops when scaling an object - how to fix? (Unity, Android)

There is an empty scene with one standard cube. If I change its scale to (5, 5, 1), the FPS does not drop.
But if I change it to (5, 10, 1), my FPS drops to ~30.
If I move the camera away from the cube with scale (5, 10, 1), the FPS is 60 again.
Maybe my camera settings are wrong, or something else is going on.
How can I achieve high FPS without moving the camera away?
P.S. The FPS does not drop in the editor, only after launching on Android.
Unity version 2020.3.18f1. I tried another version and had the same problem.
Screenshots: cube with scale (5, 5, 1); cube with scale (5, 10, 1); cube with scale (5, 10, 1) with the camera moved further away.
The problem may be due to the fact that more pixels are rendered (the fragment shader is executed for every one of them) when the object is scaled up. The other hint is that when you move the camera away from the object the frame rate increases, because the rendered object then covers fewer pixels.
Since you mentioned that the program runs on Android, changing a regular shader to a mobile shader may improve performance.
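As a sketch of that suggestion (Built-in Render Pipeline assumed; attach to the cube, and swap in whichever of Unity's mobile shaders fits your material):

// Replace the material's shader with Unity's lighter built-in mobile variant.
Renderer rend = GetComponent<Renderer>();
rend.material.shader = Shader.Find("Mobile/Diffuse");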
From some of Unity's documentation on transforms:
Performance Issues and Limitations with Non-Uniform Scaling
Non-uniform scaling is when the Scale in a Transform has different values for x, y, and z; for example (2, 4, 2). In contrast, uniform scaling has the same value for x, y, and z; for example (3, 3, 3). Non-uniform scaling can be useful in a few select cases but should be avoided whenever possible.
Non-uniform scaling has a negative impact on rendering performance. In order to transform vertex normals correctly, we transform the mesh on the CPU and create an extra copy of the data. Normally we can keep the mesh shared between instances in graphics memory, but in this case you pay both a CPU and memory cost per instance.
I'm not certain whether the Z scale matters in this case, because you're only rendering the X-Y plane. I can't say for certain why the performance hit is reduced as you increase the camera distance. I suspect Unity does some intelligent vertex manipulation to simplify the rendering of distant objects, saving you CPU cost.
That being said, try to avoid non-uniform scaling. Primitives should typically only be used as placeholders.
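To illustrate the difference in code (a sketch only; 'cube' and 'panel' are assumed GameObject references, and the properly proportioned panel mesh is hypothetical):

void ApplyScales(GameObject cube, GameObject panel)
{
    // Non-uniform scale: per the quoted docs, the mesh is re-transformed on the
    // CPU and copied per instance.
    cube.transform.localScale = new Vector3(5f, 10f, 1f);

    // Preferred: a mesh authored at the desired proportions, kept at uniform
    // scale, so the shared mesh stays in graphics memory.
    panel.transform.localScale = Vector3.one;
}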

Does changing Physics.defaultContactOffset have a significant impact on performance?

As usual, the documentation is lacking some information that we have to gather elsewhere: Physics.defaultContactOffset.
Physics.defaultContactOffset is used by the collision detection system to predictively enforce the contact constraint.
Unity explains that you should use 1 unit = 1 meter for the physics simulation.
I needed a lot of small spheres and cubes, 10 cm wide, thus 0.1 units.
What they don't say is that when you're working at a small scale (I'm using objects 0.1 m = 10 cm wide), you have to change Physics.defaultContactOffset to a smaller value than the default one.
Hence my question: is Physics.defaultContactOffset important for the calculations? That is, if I change it to a very small value, does it have a negative impact on performance?
I have to change it from 0.001 to 0.00001 to get acceptable collision detection, and I'm worried about a negative impact on performance.
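For reference, the change under discussion is a single static property; a minimal sketch (the component name is made up) of applying it from code instead of Project Settings > Physics:

using UnityEngine;

public class PhysicsSetup : MonoBehaviour
{
    void Awake()
    {
        // The value the question settled on for 10 cm objects; not a general recommendation.
        Physics.defaultContactOffset = 0.00001f;
    }
}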
From Unity3D documentation on Default Contact Offset:
Use this to set the distance the collision detection system uses to generate collision contacts. The value must be positive, and if set too close to zero, it can cause jitter. This is set to 0.01 by default. Colliders only generate collision contacts if their distance is less than the sum of their contact offset values.
So we can assume the physics engine is calculating distances between colliders and checking whether the distance counts as a contact. I don't think the value matters much for performance, since that calculation is done either way.
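To put rough numbers on that (using the 0.01 default from the quote above and the question's 10 cm objects):

float offsetA = 0.01f, offsetB = 0.01f;       // default contact offset on each collider
float contactDistance = offsetA + offsetB;    // contacts start being generated 0.02 units apart
float objectSize = 0.1f;                      // a 10 cm sphere
float ratio = contactDistance / objectSize;   // 0.2 -> contacts begin at 20% of the object's size

At that scale the offset is a fifth of the object itself, which is why the default feels far too coarse for these colliders.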
With all that being said, Unity3D's physics engine doesn't really do well with tiny objects, so it's better to scale the spheres up to 1 unit and scale everything else to compensate. You will most likely run into issues with such tiny colliders otherwise.

Weird Lines 3D Unity

I'm working on a project using Unity 5.4.
In this project, blocks are stacked next to each other.
However, some annoying weird lines appear. On Android these lines occur more often than on PC.
For illustration purposes I have added an image and a video.
Please zoom in on the picture to see the lines I'm speaking of clearly.
Could anyone please provide a solution to get rid of this nuisance?
Thanks in advance.
Block alignment code snippet:
for (int x = 0; x < xSize; x++)
{
    for (int z = 0; z < zSize; z++)
    {
        // Blocks are placed one unit apart on an integer grid at depth -layerDepth.
        Vector3 pos = new Vector3(x, -layerDepth, z);
        InstantiateBlock(pos);
    }
}
Video link: https://youtu.be/5wN1Wn51d_Y
You have object seams!
This occurs when there is a physical or perceived gap between objects.
There are multiple causes for this.
1. Floating Point Imprecision
This could be because you are setting the positions of the cubes with ints while they have floating-point dimensions. The symptom of this is usually no white seams when the camera is close to the objects, with the seams gradually appearing as you get further away, due to floating-point imprecision.
Most of these blocks appear to line up exactly from most camera positions. But from the occasional unfortunate position, the exact value of cube A's position plus its vertex at (0.5, 0.5, -0.5) might be slightly different from cube B's position plus its vertex at (-0.5, 0.5, -0.5). The result is that Unity shows a tiny gap, within which you can see the shadowed side of cube A.
If you consider it on paper, 3 == 1/3 * 3 is mathematically correct; however, using floats, 1/3 == 0.333333... and consequently 3 * 0.333333... == 0.999999... Bingo: a random gap between objects!
So how do you solve it? Use floats to calculate the positions of your objects: new Vector3(1,1,1); should be new Vector3(1f,1f,1f);, for example. For further reading on this, try this Stack Overflow post.
2. Texture Wrap-mode
If you are using textures on your objects, try changing the wrap mode of your texture from Repeat to Clamp, or try increasing the texture padding.
3. Shadow Acne - (Lighting and Shadow artifacts)
This shows up as arbitrary patterns of pixels that are in shadow when they should really be lit, or lit when they should be in shadow.
To prevent shadow acne, a bias value can be added to the distance in the shadow map to ensure that pixels on the borderline definitely pass the comparison as they should (source).
In Unity, go to your light source and increase Shadow Type > Bias. I would suggest doubling the default value of 0.05 and continuing until the acne is fixed. You don't want to crank this value to the maximum, because...
Do not set the Bias value too high, because areas around a shadow near the GameObject casting it are sometimes falsely illuminated. This results in a disconnected shadow, making the GameObject look as if it is flying above the ground.
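If you prefer to tweak it from a script while experimenting, the same setting is exposed as Light.shadowBias (a sketch; attach it to the light, and 0.1 simply doubles the 0.05 default mentioned above):

// Double the default shadow bias as a starting point for fixing acne.
Light sun = GetComponent<Light>();
sun.shadowBias = 0.1f;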
Are you using different blocks that you put against each other? Your problem sounds like the blocks are not completely flush against each other, which causes you to see the side of the next block (this would explain the effect of changing the camera's Y: you might see that side better from higher up). That side will have different lighting and appear as a different, lighter colour. To check whether this is the problem, try overlapping the blocks slightly by hand in the editor and see if the problem still occurs.
Making the blocks kinematic solves that. The issue is the rigid bodies bumping up against one another.
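A sketch of that fix, assuming InstantiateBlock returns the created GameObject and the blocks carry Rigidbody components:

GameObject block = InstantiateBlock(pos);
Rigidbody rb = block.GetComponent<Rigidbody>();
if (rb != null)
{
    // Kinematic bodies are not moved by the physics solver, so stacked
    // blocks can no longer be nudged out of alignment by collisions.
    rb.isKinematic = true;
}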

Unity Terrain Stitching Gaps

So, I'm attempting to create a simple dynamic endless terrain using simplex noise.
So far I've got the noise working just fine - however I am having issues with the terrain having discontinuities at the edges. At first I thought this was due to the fact that I was not calling SetNeighbors on the Terrain objects, but adding this did not seem to yield any improvement.
terrain.GetComponent<Terrain>().SetNeighbors(left, top, right, bottom);
This problem seems to be caused by slight differences in height at matching positions along the terrain edges - but forcing those heights to be the same would affect the terrain quality (it would reduce how jagged the terrain can be in certain cases) and generally seems inelegant. I've been going through the Unity docs trying to find how to address this, but have yet to find anything.
Is there something I'm missing? Or is my only option to fiddle the heights on one of the sides to match the other?
Thanks for reading, appreciated as always.
Terrain image for reference
A couple of things:
First, make sure you're setting SetNeighbors() on ALL the terrain objects, not just one.
Secondly, if the terrains don't match up exactly, it means either that the terrains aren't calculating their data quite correctly, or that there's some floating-point error going on. I suspect it's the first one, given that manually changing the points affects the quality. Make sure you know that terrain heightmaps have 2^n + 1 points per side, and also make sure that the point you query your simplex function with is calculated in world space.
If you can't figure it out, post your code and I'll take a look.
Also, your terrain might look better if you used octaved (a.k.a. fractal) noise on top of your simplex noise function, depending on what you're looking for.
Cheers!
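For what it's worth, here is a minimal sketch of the world-space sampling idea - filling each tile's heightmap by sampling the noise in world space, so that the shared edge points of neighbouring tiles query the noise at identical positions and get identical heights. Noise.Simplex2D stands in for whatever noise function you use and is assumed to return values already normalised to the 0..1 range.

void FillTile(Terrain terrain, Vector3 tileOrigin, float tileSize)
{
    TerrainData data = terrain.terrainData;
    int res = data.heightmapResolution;        // always 2^n + 1 points per side
    float[,] heights = new float[res, res];

    for (int y = 0; y < res; y++)
    {
        for (int x = 0; x < res; x++)
        {
            // Convert heightmap indices to world-space coordinates.
            float worldX = tileOrigin.x + (x / (float)(res - 1)) * tileSize;
            float worldZ = tileOrigin.z + (y / (float)(res - 1)) * tileSize;

            // Because sampling happens in world space, the last column of one
            // tile and the first column of its neighbour get the same value.
            heights[y, x] = Noise.Simplex2D(worldX, worldZ);
        }
    }

    data.SetHeights(0, 0, heights);
}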

Unity3d, baked shadows quality

UPDATE: the problem was the floor size. I just made it smaller and the problem is solved.
Hello there, I have spent almost a week googling already... I hope you can help me with this. I baked my shadows; the result and all the relevant settings are shown in the picture. The problem is that the quality of the shadow on the tree is much, much better than that of the shadow on the ground. Do you have any guesses why?
I just made this one tree bigger than the others so that the problem is easy to see...
(shadow quality is set to the highest, rendering path is Deferred Lighting)
Reducing the size of the ground object solved the issue.
The likely reason this works is that the texel budget per object is limited by the lightmap size, regardless of the user-defined texels per unit. For example, if lightmaps are 2048x2048 and the ground is 2x2 km, you get roughly 1 texel per square meter, assuming 1 unit is 1 meter.
Although it was not the case here, inconsistencies between shadows may be observed when dynamic and static shadows use inconsistent resolutions.
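A quick back-of-the-envelope version of that example (the numbers are the illustrative ones above, not values taken from the original scene):

float lightmapSize = 2048f;                        // texels per lightmap side
float groundSize   = 2000f;                        // metres per side (2 km)
float texelsPerMeter = lightmapSize / groundSize;  // ~1 texel per metre of ground
// Shrinking the ground to 200 m raises this to ~10 texels per metre,
// i.e. a roughly 10x sharper baked shadow on the floor.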