How to get rid of "shadow teeth" in Unity?

I tried everything but nothing affects this. The only thing that helps is changing the shadow resolution to "low", which makes them smoother (obviously), but still not great. Those shadows also look better when the viewing angle is less acute. Quality settings are at their highest, the light source is a spotlight, and the material on those objects uses the Standard shader. What am I doing wrong?
Image is enlarged.

You...can't. :(
The problem is that the shadows being cast are essentially just a texture, and texture points (aka "pixels") are square. This shadow texture is then "cast" from the light source (think of the light as a camera: every pixel it can see that belongs to an object becomes a "dark" pixel in the lightmap; it's a bit more complicated than that, but not by much).
Your objects and light are definitely not squared up with each other. In fact, they never can be, since your cubes are rotated five to ten degrees from each other, forming a curve. That means some edge, somewhere, is going to get jaggy. It also explains why changing the light's position and orientation affects the result: those edges align more closely (or less closely) with where the lightmap pixels are.
You can try various settings, such as Stable Fit or higher-quality shadows (this really just means "use a bigger texture", so the jaggies get smaller as the same volume is covered by more shadow pixels), but fundamentally you're not going to get a better result.
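For what it's worth, the same knobs the Quality settings expose can also be tweaked from a script. A minimal sketch (values are illustrative, not a recommendation):

    using UnityEngine;

    // Sketch: set shadow projection and resolution from code instead of the
    // Quality settings UI. A shorter shadow distance also packs more shadow-map
    // pixels into the visible area, which reduces the "teeth".
    public class ShadowQualityTweaks : MonoBehaviour
    {
        void Awake()
        {
            QualitySettings.shadowProjection = ShadowProjection.StableFit;
            QualitySettings.shadowResolution = ShadowResolution.VeryHigh;
            QualitySettings.shadowDistance = 30f; // illustrative value
        }
    }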
Unless...
You use baked lighting. Open up the Lighting window (Window -> Lighting), set your lights to Baked rather than Realtime (this means they will not be realtime and may not move or otherwise change), and then, in the Lighting window, bake your lights.
This essentially creates a second texture that is wrapped around your objects and gives them shadows, and its pixels line up differently, generally giving smoother shadow edges when the object's faces align with the shadow-casting edge (such as your stacked cubes). The textures are also much larger than the runtime light textures because they don't have to be recomputed every frame (realtime lights are restricted so they don't consume gigabytes of video RAM).
Baking will take a while, let it do its thing.
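If you'd rather script that step than click through the Lighting window, here is a rough editor-only sketch (the menu path and class name are made up for illustration; the file must live in an Editor folder):

    using UnityEditor;
    using UnityEngine;

    // Sketch: mark every light in the open scene as Baked and start a bake.
    // Equivalent to setting each light's Mode to "Baked" and pressing
    // "Generate Lighting" in the Lighting window.
    public static class BakeAllLights
    {
        [MenuItem("Tools/Bake All Lights")]
        public static void Bake()
        {
            foreach (Light light in Object.FindObjectsOfType<Light>())
            {
                light.lightmapBakeType = LightmapBakeType.Baked;
            }
            Lightmapping.BakeAsync();
        }
    }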

Have you tried with Stable Fit (under Quality settings)?

Related

Unity - how to provide diffuse lighting

I have a simple scene of the interior of a house (less roof). It does not in any way need to look realistic, just to be geometrically correct, therefore the walls and furnishings and fittings are simply constructed from primitive objects - cubes and cylinders etc.
The layout is fine, the problem is the lighting - very black shadows. The scene has the standard single directional light source.
What I need to do is provide overall diffuse lighting - equivalent to an overcast day.
I should point out that I am pretty much a novice on all this - lighting, shaders etc, though I have been reading a lot.
From what I read it appears that this is controlled by shaders, shaders being attached to materials, materials being applied to the objects. However, it doesn't seem to make much sense to me. Surely, a shader, if part of the object by virtue of being attached to the material, can only deal with how light might be reflected off the surface - but the light has to get there first.
Therefore, there must be a way of providing an overall diffuse light in the first place?
Or have I got this completely wrong? How does one get rid of the blackness on the non-illuminated side of an object? So far the only way I have found is to make the surface emit light, ie glow a bit, which surely must not be right.
Your general understanding of how this all works is correct. One way to look at it: an object requests rendering, which looks up the material, and the material binds a shader to a set of parameters. The shader then gets executed, once per light in the scene that affects the object (this is simplifying things, but we'll get to that in a bit). This is why lights are expensive (in forward rendering, that is): until optimizations start to kick in, it means rendering the scene n times.
So yes, you could just add a constant factor in the shader to achieve the effect of 'ambient' or 'diffuse' lighting. But that shader, in order to support all the features like reflectivity etc., would have to be crazy complicated.
Fortunately, with Unity we also get a middle layer called the Standard Shader, which does pretty much all of the math underneath and frees you from the need to write shader code.
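Before reaching for baking, note that Unity already exposes a flat ambient term (Lighting window > Environment) that the Standard Shader picks up automatically; it can also be set from a script. A minimal sketch with illustrative colour values for an "overcast day" feel:

    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: flat grey-blue ambient light, roughly an overcast sky.
    public class OvercastAmbient : MonoBehaviour
    {
        void Start()
        {
            RenderSettings.ambientMode = AmbientMode.Flat;              // single colour, no gradient
            RenderSettings.ambientLight = new Color(0.6f, 0.6f, 0.65f); // tweak to taste
        }
    }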
For a gentle, diffused look, you definitely want to look at the baked Indirect Illumination features of Unity, and maybe even light everything with area lights only.
It's probably also a good idea to look into light probe groups. They work with spherical harmonics, encoding only the low-frequency components of the lighting data, effectively capturing only slowly changing factors like the general direction of the light.
Finally, look into reflection probes (and skyboxes while you're at it); there are a few good free HDR probes available that will emit light into your scene (when baking lightmaps and light probes), enabling surprising realism compared to the default Unity skybox.
If you don't want a harsh directional light, just disable it (although it's often useful to know what your strongest light source is in your scene - even if it's a skybox with some clouds, I would probably keep a scene light just so you notice sooner if anything goes wrong).

How can I use baked lighting on sprites? / How to light up a large area in 2D?

I'm having trouble figuring out how to light up large area(s) of sprites in Unity 2D. My previous knowledge on Unity's lighting is zero.
I first tried using a large amount of point lights and using the "Sprites/Diffuse" material, but about only five would actually render at a time, so I guess there's a limit on that.
Then I tried putting in an area light. That didn't do anything, so that's when I started doing research about baked lighting on sprites (and baked lighting in general). I found stuff like this but I couldn't get it to work either because it's outdated or because I don't know what I'm doing. Other answers I've come across seem to assume that the reader knows anything about lighting in Unity in the first place which, to be honest, I don't. Unity's documentation website had some information on it, but no tutorials that go into how to set up baked lighting.
I've tried a bunch of different combinations of materials (like using the "Standard" shader for the sprites instead of "Sprites/Diffuse", emission, etc.) and I enabled "Baked Global Illumination" in Lighting > Settings.
If baked lighting isn't possible on sprites (or isn't worth the trouble), what are the alternatives?
Edit: I made sure not to have the lights pointing the wrong direction, and I do realise that Unity2D is just like painting onto a piece of paper in Unity3D. I was able to get point lights to work, but only a few at a time. I don't need to do the entire screen at once, I need to do a large specific area at once.
some tips...
Working with sprites you're in 2D... when you add a light, switch to 3D mode and rotate to make sure your light is pointed at your objects and oriented so as not to be on the same plane, or level with them, as this will cast all the light behind them.
If you're trying to light up everything on the screen (in camera), attach an area light to the camera at the camera's position, point it where the camera points, and then, in the Inspector on the right, you can change its variables: intensity, range, width, height, etc.
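As for the question's "only about five lights render at a time": that is usually the Pixel Light Count quality setting, which can be raised under Edit > Project Settings > Quality or from code. A hedged sketch (the value is illustrative; more lights cost more GPU time):

    using UnityEngine;

    // Sketch: raise the per-object limit on how many realtime pixel lights
    // can affect a sprite at once (the default is quite low).
    public class RaisePixelLightCount : MonoBehaviour
    {
        void Awake()
        {
            QualitySettings.pixelLightCount = 8;
        }
    }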
Emissive Texture:
https://www.youtube.com/watch?v=oa6kW5HhRd4
For some reason, I never even thought about going into the asset store. I found this for free, and it looks like it will work: Light2D.

Unity3D: Why is particle lighting making directional shadows disappear?

UPDATE: As #BenHayward suspected, this is a bug. <link>
I have a very simple setup of cubes on a plane comprising a grid of quads. A directional light is shining down at the scene at an angle, producing a set of shadows from the cubes onto the quads.
Now I'm trying to produce an explosion effect with Unity's particle system, but when I add a point light to the particle system it causes all the directional-lighting shadows to disappear, whether they're in line of sight of the particle or not.
The shadows reappear when the particle is destroyed. Replicating the particle effect with pure C# doesn't cause any problems.
(Oh, and obviously I'm using the deferred rendering path.)
Any ideas? This is driving me up the wall.
[EDIT: I should have mentioned that the point light added to the particle system is set to cast shadows. The Unity standard particle pack has shadow-casting disabled by default. They too cause the problem when I turn the shadow-casting on.]
Based on the project that you linked to, it seems as though the particle system is causing the shadow cast from the directional light to flicker on and off quickly. I suspect this is a bug, since if it were intended behaviour, I wouldn't expect it to flicker in this manner.
In cases where this is not a bug, the problem can be caused by a couple of issues:
You can only have a certain number of dynamic (shadow casting) lights in your scene which are seen by the camera frustum. By default, this number is quite low (I think it's 4). You can increase this number by going to Edit > Project Settings > Quality. Set the Pixel Light count higher from its default value. You will need to increase this value to be greater than the total number of lights in your effect. Higher values will allow more lights to be rendered on the screen, but this reduces performance.
It depends on the shaders you are using to receive the shadows. Some shaders will only render shadows for one directional light, and which light gets used isn't necessarily easy to determine. If you are using the standard Unity shader this shouldn't be a problem, but if you are using a mobile-compatible surface shader or something you've written yourself, then this could be the cause of the problem.
Also, for an explosion, I'd recommend using just one single point light (not lights attached to each particle), as this is all that is required. Any more lights would result in considerable performance impact on the GPU especially if there are more than one explosion in the scene at any one time.
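A rough sketch of that idea, assuming the explosion prefab carries a single Light component (names and numbers are illustrative):

    using UnityEngine;

    // Sketch: one point light that flashes and fades out over the life of the
    // explosion, instead of a shadow-casting light on every particle.
    [RequireComponent(typeof(Light))]
    public class ExplosionFlash : MonoBehaviour
    {
        public float duration = 0.5f;
        public float peakIntensity = 8f;

        float elapsed;
        Light flash;

        void Awake()
        {
            flash = GetComponent<Light>();
            flash.type = LightType.Point;
        }

        void Update()
        {
            elapsed += Time.deltaTime;
            flash.intensity = Mathf.Lerp(peakIntensity, 0f, elapsed / duration);
            if (elapsed >= duration) flash.enabled = false;
        }
    }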
I recreated the scene as you described; I can't reproduce your issue.
I mostly followed this tutorial, and added a few cubes on a plane:
https://unity3d.com/learn/tutorials/topics/graphics/adding-lighting-particles
I will need a screenshot of your light components, both the directional and the point light, the particles, and the cubes (mostly the material). I cannot comment because I don't have enough reputation yet, so I'll delete this once you add the screenshots.

Purpose of mipmaps for 2D sprites?

In current Unity, for use in Unity.UI as conventional UI: for any texture set to "Sprite (2D and UI)", "Generate Mip Maps" in fact always defaults to ON. Every time you drop an image in, you have to turn that off and apply.
As noted in the comments, these days you can actually use world-space UI canvases, and advanced users may indeed have (say) buttons that float over the head of Zelda far off in the distance. However, if you're an everyday Unity user adding a button, just turn it off :)
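If flipping that switch by hand gets tedious, a small editor script can do it on import. A sketch (the class name is made up; the file must sit in an Editor folder):

    using UnityEditor;

    // Sketch: automatically disable "Generate Mip Maps" for any texture
    // imported as a Sprite, so UI images don't need the manual toggle.
    public class SpriteMipmapDisabler : AssetPostprocessor
    {
        void OnPreprocessTexture()
        {
            TextureImporter importer = (TextureImporter)assetImporter;
            if (importer.textureType == TextureImporterType.Sprite)
            {
                importer.mipmapEnabled = false;
            }
        }
    }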
In Unity, "sprites" can still be positioned in 3D space. For example, on a world space canvas. Furthermore, mipmaps are used when the sprite is scaled. This is because the mipmap sampling is determined by the texel size rather than the distance.
If a sprite is flat and perfectly scaled then there is no reason to use mipmaps. This would likely apply to your icon example.
I suspect that it is enabled by default for 2D games, where sprites will often not be perfectly scaled. To clarify, a sprite does not need to be on a canvas. Sprites can exist as their own GameObject with a Sprite Renderer (not on a canvas). When this is the case, scaling the camera view will change the sprite's size on the screen, resulting in mipmapping due to the texel size changing. This makes keeping a sprite perfectly scaled without a canvas challenging.

Metallic sparkle effect in OpenGL ES?

I'm working on an Android and iPhone app. I'm rendering lots of smallish (about 32 pixels) billboards to the screen for a particle system and want to give a glitter-like sparkle to each billboard e.g. as the particles are falling, random ones will briefly light up and sparkle as they catch the light. Is there a simple way to achieve this effect? As a limitation, I cannot use pixel/vertex shaders.
I was thinking something along the lines of giving each billboard a metal-like lighting effect (although I'm not sure how to do this part), coupled with giving each billboard a random and constantly rotating normal with flat shading, so that each billboard would randomly light up. I'm having trouble making it look nice.
Disclaimer: I don't know OpenGL, and I didn't actually try anything I write below.
You can have another, 'brightly lit', texture and substitute it when the normal is nearly at the 'shine' position.
Take a piece of metal and rotate it. Once the normal is close to the 'full shine' position, the metal shines a bit brighter, and a muted reflex travels through it, with a bright flash in the middle; then it is dull again.
If you can, apply a second bright texture of a narrow 'reflex' band and move it through the surface of the billboards that are in a near-shine position, shifting it according to the normal angle. When the normal is at the shine position (± epsilon), apply the 'full shine' texture.
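A language-agnostic sketch of the texture-swap test, written here in C# (all names are illustrative; the actual drawing would still be done with your fixed-function OpenGL ES calls):

    using System.Numerics;

    // Sketch: each billboard carries a randomly rotating fake normal; when it
    // lines up with the half-vector between light and viewer, draw the bright
    // "shine" texture instead of the dull one.
    class SparkleBillboard
    {
        public Vector3 Normal;             // per-particle rotating normal
        const float ShineEpsilon = 0.05f;  // how exact the alignment must be

        public bool IsShining(Vector3 lightDir, Vector3 viewDir)
        {
            Vector3 half = Vector3.Normalize(lightDir + viewDir);
            float alignment = Vector3.Dot(Vector3.Normalize(Normal), half);
            return alignment > 1.0f - ShineEpsilon;
        }
    }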
Also, unless your plates fly in a vacuum, there will be a halo due to the atmosphere. Add a rectangle, say 50% bigger than the plate, right behind it, and apply to it a semi-transparent halo texture that becomes fully transparent closer to the edges. You only need it at the full-shine moment.