Texels vs Units vs Pixels in Lightmaps? - unity3d

So in the Lightmapping configuration there are a lot of strange settings, and I can't quite understand how they work together:
First off, texels = pixels, right? So is a texel just a fancy name for a pixel?
Lightmap Resolution is in texels per unit. What is a unit in this context? And how does that fit with the constant Lightmap Size (let's say 256 texels = pixels)?
What is the difference between Indirect Resolution and Lightmap Resolution? And why can't I adjust Indirect Resolution in Progressive CPU?

Indirect resolution is the fidelity of indirect lighting - light that bounces off things, such as a bright surface being lit by the sun. Lightmap resolution is the fidelity of the general lighting (basically direct lights).
I'm fairly sure you can't set it in Progressive CPU because it will eventually reach your lightmap resolution anyway. Progressive CPU does a low-quality lighting pass as you edit your scene and improves the quality over time, so you can get a rough idea of your lighting as you edit; eventually (particularly in areas you aren't playing around in) the lighting reaches your max resolution. It's purely to improve your workflow, so you can edit your scene without waiting for a slow light pass.
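To make the texels/units/size relationship concrete: "units" are world-space units (by convention, metres), and the resolution setting says how many lightmap texels each world unit receives. A back-of-the-envelope sketch with made-up numbers:

```csharp
using UnityEngine;

public class LightmapTexelMath : MonoBehaviour
{
    void Start()
    {
        // Hypothetical numbers, purely for illustration.
        float texelsPerUnit = 40f;  // the "Lightmap Resolution" setting
        float wallSizeInUnits = 4f; // a 4 x 4 world-unit wall
        float texelsPerSide = wallSizeInUnits * texelsPerUnit; // 160 texels

        // "Lightmap Size" is the atlas each object's patch is packed into:
        // a 160x160 patch fits inside a single 256x256 lightmap; bigger or
        // more numerous objects simply get packed into additional atlases.
        Debug.Log($"This wall occupies {texelsPerSide}x{texelsPerSide} texels.");
    }
}
```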
I recommend reading the relevant pages:
https://docs.unity3d.com/Manual/ProgressiveLightmapper.html
https://docs.unity3d.com/Manual/Lightmapping.html
Unity's lighting system is powerful but also quite complex. Take your time and read up on the documentation; when I first started working with it I spent a few days just absorbing all the information. There's a lot to take in.

Related

Unity - how to provide diffuse lighting

I have a simple scene of the interior of a house (minus the roof). It does not need to look realistic in any way, just to be geometrically correct, so the walls, furnishings, and fittings are simply constructed from primitive objects - cubes, cylinders, etc.
The layout is fine, the problem is the lighting - very black shadows. The scene has the standard single directional light source.
What I need to do is provide overall diffuse lighting - equivalent to an overcast day.
I should point out that I am pretty much a novice on all this - lighting, shaders etc, though I have been reading a lot.
From what I've read, it appears that this is controlled by shaders: shaders are attached to materials, and materials are applied to objects. However, it doesn't seem to make much sense to me. Surely a shader, being part of the object by virtue of being attached to the material, can only deal with how light is reflected off the surface - but the light has to get there first.
Therefore, there must be a way of providing an overall diffuse light in the first place?
Or have I got this completely wrong? How does one get rid of the blackness on the non-illuminated side of an object? So far the only way I have found is to make the surface emit light, i.e. glow a bit, which surely can't be right.
Your general understanding of how this all works is correct. One way to look at it: an object requests rendering and looks up its material; the material binds a shader to a set of parameters. The shader then gets executed once per light in the scene that affects the object (this is simplifying things, but we'll get to that in a bit). This is why lights are expensive (in forward rendering, that is): until optimizations start to kick in, it means rendering the scene n times.
So yes, you could just add a constant factor in the shader to achieve the effect of 'ambient' or 'diffuse' lighting. But that shader, in order to support all the features like reflectivity etc., would have to be crazy complicated.
Fortunately, with Unity we also get a middle layer called the Standard Shader, which does pretty much all of the math underneath and frees you from having to write shader code.
For a gentle, diffused look, you definitely want to look at Unity's baked indirect illumination features - maybe even light everything with area lights only.
It's probably also a good idea to look into light probe groups. They work with spherical harmonics, encoding only the low-frequency components of the lighting data - effectively only slowly changing factors, like the general direction of the light.
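If you're curious what a probe actually stores, you can sample the interpolated spherical harmonics at a point. A minimal runtime sketch using Unity's LightProbes API (assumes the object has a Renderer and that probes have been baked):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class ProbeSampler : MonoBehaviour
{
    void Update()
    {
        // Interpolate the baked probes around this object's position.
        SphericalHarmonicsL2 probe;
        LightProbes.GetInterpolatedProbe(transform.position, GetComponent<Renderer>(), out probe);

        // Evaluate the harmonics in one direction (straight up) to see the
        // low-frequency light colour arriving from above.
        Vector3[] directions = { Vector3.up };
        Color[] results = new Color[1];
        probe.Evaluate(directions, results);
        Debug.Log($"Indirect light from above: {results[0]}");
    }
}
```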
Finally, look into reflection probes (and skyboxes while you're at it). There are a few good free HDR skyboxes available that will emit light into your scene (when baking lightmaps and light probes), enabling surprising realism compared to the default Unity skybox.
If you don't want a harsh directional light, just disable it (although it's often useful to know what the strongest light source in your scene is). Even if it's a skybox with some clouds, I would probably keep a scene light just to notice faster if anything goes wrong.
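And to answer the "overall diffuse light" part directly: Unity has a scene-wide ambient setting in the Lighting window, which you can also drive from a script. A minimal sketch (the colour value is made up; a flat grey-ish ambient reads like an overcast day):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class OvercastAmbient : MonoBehaviour
{
    void Start()
    {
        // Flat ambient: every surface receives this colour from all
        // directions, so nothing is ever completely black.
        RenderSettings.ambientMode = AmbientMode.Flat;
        RenderSettings.ambientLight = new Color(0.6f, 0.6f, 0.65f); // illustrative value
    }
}
```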

Unity light baking - How to manage mixed lighting

I have a pretty large scene, where I have 1 directional light (sun) set to mixed.
I mainly used realtime lighting during development in order to save time.
Now before release I want to bake, to improve performance for my users.
My questions are:
Is there any rule of thumb for any of the values in the Lighting settings?
Which objects should I set to static (I set everything that doesn't move to static), and is something wrong if my bake takes 3-4 hours?
Should I use Realtime GI since I have mixed lighting, or is Baked GI enough?
I use fairly high settings for my final bake because I want it to look nice, and I bake everything that doesn't move in the scene (thousands of objects). While it does take hours, the lightmap size ends up at around 60 MB after compression, which doesn't seem bad.
My settings are like the image below, except I've increased:
Direct samples: 200
Indirect samples: 1000
Resolution: 24
Parameters: Default High
The reason for increasing these values is simply because my objects did not look good in a bake with any lower settings.
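(For reference, these values can also be set from an editor script rather than the Lighting window. A rough sketch, assuming Unity 2020+ where the UnityEngine.LightingSettings object exposes them; the values simply mirror the ones above and are not recommendations:)

```csharp
using UnityEditor;
using UnityEngine;

public static class BakeSettingsUtil
{
    [MenuItem("Tools/Apply High Bake Settings")]
    static void Apply()
    {
        // LightingSettings exists from Unity 2020.1; older versions used
        // the LightmapEditorSettings API instead.
        var settings = new LightingSettings
        {
            directSampleCount = 200,
            indirectSampleCount = 1000,
            lightmapResolution = 24f, // texels per unit
        };
        Lightmapping.lightingSettings = settings; // apply to the active scene
    }
}
```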
I want to bake, to improve performance
Notes:
Baking usually improves visuals.
But this creates a lot of textures, so it may lower performance.
Using ONLY the baked lights, it's basically unlit - very good performance.
Mixed means you bake the indirect lights (because without realtime raytracing you won't have indirect lighting*) and still use realtime lights for some things like hard shadows.
If you don't want to rotate the directional light (sun) or move point lights around, there is no use in baking the realtime GI.
[*] : We can fake some realtime GI / soft indirect lighting by pre-baking this. So "Realtime GI" allows you to rotate the Directional Light (Sun) or even move lights around, but not objects. And you need to bake it, usually including some light-probes. So "realtime" doesn't mean it's bake-less. And it's not the same as real raytracing.
Duration:
With 2018.3 or 2019.1 (not sure which) you can try the GPU Lightmapper (preview) - it is a lot faster. However, with the CPU lightmapper, 3-4 hours seems quite normal to me.
Static?
Marking things as static is always good for performance, but you can also mark non-static things as "Lightmap Static" if you want, for example, movable houses to be pre-baked.
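If you are doing this for thousands of objects, an editor script can help. A minimal sketch (the checkbox was called "Lightmap Static" before Unity 2019.2 and maps to the ContributeGI flag afterwards):

```csharp
using UnityEditor;
using UnityEngine;

public static class StaticFlagUtil
{
    [MenuItem("Tools/Mark Selection As Lightmap Static")]
    static void MarkSelection()
    {
        foreach (GameObject go in Selection.gameObjects)
        {
            var flags = GameObjectUtility.GetStaticEditorFlags(go);
            // ContributeGI is the old "Lightmap Static" checkbox.
            GameObjectUtility.SetStaticEditorFlags(go, flags | StaticEditorFlags.ContributeGI);
        }
    }
}
```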
Lightmap Debug Mode:
The reason for increasing these values is simply because my objects did not look good
In the Scene view, you can select a "Lightmap" draw mode (not sure exactly what it's called).
It displays a basic checkerboard texture on all objects with lightmap UVs. Use this view when scaling lightmap UVs - for example, to increase detail on objects the player gets close to.
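Scaling an object's share of the lightmap can also be done per renderer via the "Scale In Lightmap" field. A hedged editor sketch; note that "m_ScaleInLightmap" is the internal serialized name behind that inspector field and could change between Unity versions:

```csharp
using UnityEditor;
using UnityEngine;

public static class LightmapUVScaleUtil
{
    [MenuItem("Tools/Double Scale In Lightmap On Selection")]
    static void DoubleScale()
    {
        foreach (Object obj in Selection.GetFiltered(typeof(MeshRenderer), SelectionMode.Deep))
        {
            var so = new SerializedObject(obj);
            // Internal property name; verify it in your Unity version.
            var prop = so.FindProperty("m_ScaleInLightmap");
            if (prop != null)
            {
                prop.floatValue *= 2f; // give this object more lightmap texels
                so.ApplyModifiedProperties();
            }
        }
    }
}
```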
edit:
Mixed Lighting - Performance?
This will bake the light for all static (or at least lightmap-static) objects in the scene. During runtime, lighting is not computed for the static ones anymore.
At least that's what the Manual says:
Selecting the ‘Mixed’ baking mode, GameObjects marked as static will still include this light in their Baked GI lightmaps. However, unlike lights marked as ‘Baked’, Mixed lights will still contribute realtime, direct light to non-static GameObjects within your scene.
I just tested:
"Baked" - after baking, rotating the sun didn't affect the scene until a rebake.
"Mixed" - I can rotate the sun and it affects the shadows. Note: "static" objects still get some indirect light (reflected from the surroundings), while non-static objects are not baked and are therefore completely black on the side not facing the sun.
The Lighting tab says
Mixed lights provide realtime direct lighting. Indirect lighting gets baked into lightmaps and light probes. [...]
Performance:
This is hard to answer without a test, as I am relying only on theory, and maybe assumptions.
I would say the Mixed light mode exists to give better visuals (indirect lighting). This is precomputed, but the textures still need to be multiplied/added to the per-pixel lighting in the shader - this costs a bit of performance.
And the direct light is still computed in realtime - so if I understood the docs correctly, Mixed is always worse than realtime-only lighting in terms of performance.
But: if your sun doesn't move at all, you can use it in "Baked" mode instead of "Mixed" - this bakes the shadows into textures and saves the calculation at runtime. This is the only option that is better for performance for sure; I'm not 100% sure about the Mixed mode.
And realtime lights will still affect baked objects - a torch, for example.
However, your non-static player will not get the baked sunlight/shadow because it cannot be baked. You could try using an unlit shader on it and adding a fake shadow underneath, or place a point light above it to fake the sun.

How to get rid of "shadow teeth" in Unity?

I tried everything, but nothing affects this. The only thing is that when I change the shadow resolution to "low" it becomes smoother (obviously), but still not the best. The shadows also look better when the viewing angle is less acute. Quality settings are at maximum, the light source is a spotlight, and the material on those objects uses the Standard shader. What am I doing wrong?
Image is enlarged.
You...can't. :(
The problem is that the shadows being cast are essentially just a texture, and texture points (aka "pixels") are square. This shadow texture is "cast" from the light source (think of the light as a camera: every pixel it can see that belongs to an object becomes a "dark" pixel in the shadow map; it's a bit more complicated than that, but not by much).
Your objects and light are definitely not square with each other - and in fact they never can be, since your cubes are rotated five to ten degrees from each other, forming a curve. That means some edge, somewhere, is going to get jaggy. This also explains why changing the light's position and orientation affects the result: those edges align more (or less) closely with where the shadow-map pixels are.
You can try various settings, such as Stable Fit or higher-quality shadows (this really just means "use a bigger texture", so the jaggies get smaller as the same volume is covered by more shadow pixels), but fundamentally you're not going to get a better result.
Unless...
You use baked lighting. Open the Lighting window (Window -> Lighting), set your lights to Baked rather than Realtime (this means they will not be realtime and may not move or otherwise change) and then, in the Lighting window, bake your lights.
This essentially creates a second texture that is wrapped around your objects and gives them their shadows. Its pixels line up differently and generally give smoother shadow edges where an object's faces align with the shadow-casting edge (such as your stacked cubes). These textures can also be much larger than the realtime shadow textures, because they don't have to be recomputed every frame (realtime lights are restricted so they don't consume gigabytes of video RAM).
Baking will take a while, let it do its thing.
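If you'd rather experiment with those quality settings from code than through the Quality window, a minimal sketch using UnityEngine.QualitySettings (the exact values are illustrative):

```csharp
using UnityEngine;

public class ShadowQualityTweak : MonoBehaviour
{
    void Start()
    {
        // Stable Fit keeps shadow edges from swimming as the camera moves;
        // Close Fit packs the shadow map tighter around visible casters.
        QualitySettings.shadowProjection = ShadowProjection.StableFit;
        // "Use a bigger texture": smaller jaggies, more memory.
        QualitySettings.shadowResolution = ShadowResolution.VeryHigh;
        // A shorter shadow distance spends those texels on a smaller area.
        QualitySettings.shadowDistance = 50f; // illustrative value
    }
}
```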
Have you tried with Stable Fit (under Quality settings)?

How to fix low resolution lightmaps in Unity

I'm currently testing how baked lightmaps work with models I made in Blender. After building the lightmaps I noticed that they are really pixelated in some areas. I then tried to figure out which part of the lightmap settings is causing this effect. It seems to be Indirect Resolution, because when I turned it down as low as possible, the pixelated parts completely disappeared.
The problem is that, from what I've seen in other projects, Indirect Resolution is usually much lower than Baked Resolution, so I don't know why it looks like this in my project. I also tried to crank up Indirect Resolution, but the results weren't satisfying.
These lightmaps might seem fine to you, but you can clearly see darker areas that look like "splashes" and don't match the resolution of the rest.
Here are screenshots of how the lighting looks with two different setups:
This is the setup I used for the lightmap in the first screenshot:
First screenshot:
This is an example of a low-res part in the screenshot above:
Second screenshot (the only change is Indirect Resolution set to 20):
Try adjusting the lightmap settings of the object in the Lighting window, especially the Advanced Parameters.

Why is the texture size changed?

I've made a small PNG (100x30) and added it as a texture in Unity. I've also referenced it from a script so I can poll it, but if I print out its width and height it now reports 128x32. What happened?
I've tried adjusting the camera, but that does not seem to have anything to do with it. Any clue?
Generally, textures are scaled to a power of two on both sides for performance reasons.
For example, multiplication and division are faster if the GPU can assume you are working with POT textures. Even the mipmap generation process relies on that, because division by 2 never produces a remainder.
Old GPUs (and, I don't know exactly, but probably even several mobile GPUs these days) strictly require POT textures.
By default, Unity will try to scale the texture to a power of two. You can disable or tweak that behaviour by switching the texture import settings to "Advanced".
For more details, check out the docs.
Note that generally you want NPOT textures only for GUI, where you need to control the exact resolution on screen and mipmaps aren't used. For everything in the 3D scene, power-of-two textures perform better.
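You can reproduce the importer's rounding with Mathf.NextPowerOfTwo; a tiny sketch showing where 128x32 comes from:

```csharp
using UnityEngine;

public class PotDemo : MonoBehaviour
{
    void Start()
    {
        // NextPowerOfTwo rounds up; Unity's default import mode rounds to
        // the nearest power of two - same result for these sizes:
        Debug.Log(Mathf.NextPowerOfTwo(100)); // prints 128
        Debug.Log(Mathf.NextPowerOfTwo(30));  // prints 32
    }
}
```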