This is probably a noob question, but I am playing with lightmaps in Unity and I am finding that with baked lights and the same lighting settings (intensity etc.) the scene looks much darker.
This means that previewing the scene in realtime is not a good indicator of how it will actually look, and baking every time I change a color or increase a light's intensity would be an extremely time-consuming process.
Is this normal? Is there a good workflow for this that someone can share?
I also have the issue of a character which needs realtime lights. If I increase the light intensity to compensate for the above, the character appears very bright (since it's lit in realtime).
Hope this makes sense and somebody can help me out!
Cheers,
Think of lightmapping as a pre-setup of the scene, in the same way you prepare any prefab or character with its textures and UV map; the only difference is that the light and shadows are pre-rendered into a texture, which lets you skip realtime light and shadow calculations.
In addition, you can add some realtime lighting effects without modifying your lightmaps, and use some GI to get the illumination you want across the whole scene, including your characters.
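Not a full workflow fix, but if the bottleneck is kicking off bakes by hand, a small editor script can make it a one-click, non-blocking action. This is just a sketch: it assumes your Lighting settings are already configured and that the file lives in an Editor folder; the menu path is made up.

using UnityEditor;
using UnityEngine;

// Editor-only helper: must be placed in an "Editor" folder.
public static class QuickBake
{
    [MenuItem("Tools/Quick Bake Lightmaps")]
    public static void Bake()
    {
        if (Lightmapping.isRunning)
        {
            Debug.Log("A bake is already in progress.");
            return;
        }
        // Starts an asynchronous bake using the active scene's current Lighting settings.
        Lightmapping.BakeAsync();
    }
}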
I just started using URP in Unity for a game in progress. I'm doing a sort of sprites-in-3d thing, so I'm rendering some sprite sheets on quads. To do this, I create a Material with the sprite sheet and use tiling/offset to render the proper frame of animation by making a call like:
CombatMaterial?.SetTextureOffset("_BaseMap", new Vector2( (AnimationDefinitions[animationDefinition] % 16) * .0625f, CombatMaterial.mainTextureOffset.y));
I'm currently trying to add some feedback into my game for when characters use abilities or get hit by flickering the material. Because the base color starts at white and goes to black, that won't really work; the only other thing I seem to have available to me is emission, which looks great. Using a 0xAAAish color achieves the effect I'm looking for. I've been using the Feel Unity asset to do this, but I've also attempted using something like this:
CombatMaterial?.SetColor("_EmissionColor", Color.white);
The problem is, once I've set the _EmissionColor, the main texture offset no longer updates in game, thereby ruining all animations. If I change the texture offset manually through the Inspector at runtime, animations don't work AND the _EmissionColor flickering stops working. If I mess around with the color of the _BaseMap in the Inspector, _EmissionColor flickering starts working again.
Before I start diving into some unsightly color adjustments in an attempt to make this work again, I would love to know if I'm doing something that is simply unsupported by URP/Materials/whatever, or if there is some alternative to what I'm doing that's a little more straightforward.
Thank you!
After trying a bunch of random stuff, I don't have a "real" solution, but the game IS working how I want it to.
What worked for me was setting the _EmissionColor on the Material to (1,1,1). For some reason, when the _EmissionColor is set to (0,0,0) it's a black (ha) hole and won't accept future changes to the _EmissionColor. I assume this is some shader nonsense (with the base Lit Shader that URP uses) that I am clearly unfamiliar with.
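For what it's worth, the usual explanation (an assumption on my side, not something I've verified against this exact setup) is that the Lit shader strips emission out unless the _EMISSION keyword is enabled on the material, which is what toggling the color in the Inspector does for you. A minimal sketch, using the same CombatMaterial as above:

// Sketch: enable the emission keyword before driving _EmissionColor from code.
// Without it, the URP Lit shader may ignore runtime emission changes entirely.
if (CombatMaterial != null)
{
    CombatMaterial.EnableKeyword("_EMISSION");
    CombatMaterial.SetColor("_EmissionColor", Color.white);
}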
Hopefully this helps anyone doing something as pointlessly against the grain as I am!
I have a simple scene of the interior of a house (less the roof). It does not in any way need to look realistic, just to be geometrically correct; the walls, furnishings and fittings are therefore simply constructed from primitive objects (cubes, cylinders, etc.).
The layout is fine, the problem is the lighting - very black shadows. The scene has the standard single directional light source.
What I need to do is provide overall diffuse lighting - equivalent to an overcast day.
I should point out that I am pretty much a novice on all this - lighting, shaders etc, though I have been reading a lot.
From what I read it appears that this is controlled by shaders, shaders being attached to materials, materials being applied to the objects. However, it doesn't seem to make much sense to me. Surely, a shader, if part of the object by virtue of being attached to the material, can only deal with how light might be reflected off the surface - but the light has to get there first.
Therefore, there must be a way of providing an overall diffuse light in the first place?
Or have I got this completely wrong? How does one get rid of the blackness on the non-illuminated side of an object? So far the only way I have found is to make the surface emit light, ie glow a bit, which surely must not be right.
Your general understanding of how this all works is correct. One way to look at it: an object requests rendering, the renderer looks up its material, and the material binds a shader to a set of parameters. The shader then gets executed once per light in the scene that affects the object (this is simplifying things, but we'll get to that in a bit). This is why lights are expensive (in forward rendering, that is): until optimizations start to kick in, it means rendering the scene n times.
So yes, you could just add a constant factor in the shader to achieve the effect of 'ambient' or 'diffuse' lighting. But that shader, in order to support all the features like reflectivity etc., would have to be crazy complicated.
Fortunately, with Unity we also get a middle layer called the Standard Shader, which does pretty much all of the math underneath and frees you from having to write shader code.
For a gentle, diffused look, you definitely want to look at the baked indirect illumination features of Unity; maybe even light everything with area lights only.
It's probably also a good idea to look into light probe groups. They work with spherical harmonics, encoding only the low-frequency components of the lighting data, effectively capturing only slowly changing factors like the general direction of the light.
Finally, look into reflection probes (and skyboxes while you're at it). There are a few good free HDR probes available that will emit light into your scene (when baking lightmaps and light probes), enabling surprising realism compared to the default Unity skybox.
If you don't want harsh directional light, just disable it (although it's often useful to know what your strongest light source in the scene is; even if it's just a skybox with some clouds, I would probably keep a scene light so it's faster to tell if anything goes wrong).
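If all you want for now is flat, overcast-style fill light, the quickest experiment is simply raising the ambient light (Lighting window > Environment). The same thing can be done from a script; a minimal sketch, assuming the built-in render pipeline and that a flat grey is good enough (the color value is just an example):

using UnityEngine;
using UnityEngine.Rendering;

public class OvercastAmbient : MonoBehaviour
{
    void Start()
    {
        // Flat ambient color instead of the skybox-driven default, so faces
        // pointing away from the directional light are no longer pure black.
        RenderSettings.ambientMode = AmbientMode.Flat;
        RenderSettings.ambientLight = new Color(0.45f, 0.45f, 0.5f); // tweak to taste
    }
}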
I'm having trouble figuring out how to light up large area(s) of sprites in Unity 2D. My previous knowledge on Unity's lighting is zero.
I first tried using a large number of point lights with the "Sprites/Diffuse" material, but only about five would actually render at a time, so I guess there's a limit on that.
Then I tried putting in an area light. That didn't do anything, so that's when I started doing research about baked lighting on sprites (and baked lighting in general). I found stuff like this but I couldn't get it to work either because it's outdated or because I don't know what I'm doing. Other answers I've come across seem to assume that the reader knows anything about lighting in Unity in the first place which, to be honest, I don't. Unity's documentation website had some information on it, but no tutorials that go into how to set up baked lighting.
I've tried a bunch of different combinations of materials (like using the "Standard" shader for the sprites instead of "Sprites/Diffuse", emission, etc.) and I enabled "Baked Global Illumination" in Lighting > Settings.
If baked lighting isn't possible on sprites (or isn't worth the trouble), what are the alternatives?
Edit: I made sure not to have the lights pointing the wrong direction, and I do realise that Unity2D is just like painting onto a piece of paper in Unity3D. I was able to get point lights to work, but only a few at a time. I don't need to do the entire screen at once, I need to do a large specific area at once.
some tips...
Working with sprites, you're in 2D... when you add a light, switch to 3D mode and rotate to make sure your light is pointed at your objects and not oriented on the same plane or level as them, as that will cast all the light behind them.
If you're trying to light up everything on the screen (in camera), attach an area light to the camera at the camera's position, point it where the camera points, and then in the Inspector on the right you can change its variables: intensity, range, width, height, etc.
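One caveat worth noting: area lights in Unity only contribute to baked lightmaps, so for a light that follows the camera at runtime a point (or spot) light is the realtime stand-in. A rough sketch of wiring one up from a script (the values are placeholders to tune in the Inspector):

using UnityEngine;

public class CameraFillLight : MonoBehaviour
{
    void Start()
    {
        // Parent a point light to the main camera so it lights whatever is in view.
        var lightGO = new GameObject("CameraFillLight");
        lightGO.transform.SetParent(Camera.main.transform, false);

        var fill = lightGO.AddComponent<Light>();
        fill.type = LightType.Point;
        fill.range = 20f;       // placeholder
        fill.intensity = 1.2f;  // placeholder
    }
}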
Emissive Texture:
https://www.youtube.com/watch?v=oa6kW5HhRd4
For some reason, I never even thought about going into the asset store. I found this for free, and it looks like it will work: Light2D.
UPDATE: As #BenHayward suspected, this is a bug. <link>
I have a very simple setup of cubes on a plane comprising a grid of quads. A directional light is shining down at the scene at an angle, producing a set of shadows from the cubes onto the quads.
Now I'm trying to produce an explosion effect with Unity's particle system, but when I add a point light to the particle system it causes all the directional-lighting shadows to disappear, whether they're in line of sight of the particle or not.
The shadows reappear when the particle is destroyed. Replicating the particle effect with pure C# doesn't cause any problems.
(Oh, and obviously I'm using the deferred rendering path.)
Any ideas? This is driving me up the wall.
[EDIT: I should have mentioned that the point light added to the particle system is set to cast shadows. The Unity standard particle pack has shadow-casting disabled by default. They too cause the problem when I turn the shadow-casting on.]
Based on the project that you linked to, it seems as though the particle system is causing the shadow cast from the directional light to flicker on and off quickly. I suspect this is a bug, since if it were intended behaviour, I wouldn't expect it to flicker in this manner.
In cases where this is not a bug, the problem can be caused by a couple of issues:
You can only have a certain number of dynamic (shadow casting) lights in your scene visible to the camera frustum. By default, this number is quite low (I think it's 4). You can increase it by going to Edit > Project Settings > Quality and setting the Pixel Light Count higher than its default value; you will need it to be greater than the total number of lights in your effect. Higher values allow more lights to be rendered on screen, but reduce performance. (There is a small script snippet after these two points that sets the same value from code.)
It depends on the shaders which you are using to receive the shadows. Some shaders will only render shadows for one directional light. The light which is used isn't necessarily too easy to determine. If you are using the standard Unity shader this shouldn't be a problem. But if you are using a mobile compatible surface shader or something you've written yourself then this could be the cause of the problem.
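For the first point, the same quality setting can also be changed from code, which is handy for quick testing; a small sketch (8 is just an example value):

using UnityEngine;

public class RaisePixelLightCount : MonoBehaviour
{
    void Awake()
    {
        // Equivalent to Edit > Project Settings > Quality > Pixel Light Count.
        // Higher values let more per-pixel lights render, at a GPU cost.
        QualitySettings.pixelLightCount = 8;
    }
}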
Also, for an explosion, I'd recommend using just one single point light (not lights attached to each particle), as this is all that is required. Any more lights would have a considerable performance impact on the GPU, especially if there is more than one explosion in the scene at any one time.
I recreated the scene as you described, and I can't reproduce your issue.
I mostly followed this tutorial and added a few cubes on a plane:
https://unity3d.com/learn/tutorials/topics/graphics/adding-lighting-particles
I will need a screenshot of your light components (both the directional and the point light), the particles, and the cubes (mostly the material). I cannot comment because I don't have enough reputation yet, so I'll delete this once you add the screenshots.
I'm generating a dungeon out of prefabs, which means I design a room, save it in the Resources folder, and instantiate it at a random position with a random rotation while the game is running.
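Roughly what I mean, stripped down (the Resources path and the random ranges here are just placeholders, not my real layout):

using UnityEngine;

public class DungeonSpawner : MonoBehaviour
{
    void Start()
    {
        // Load a room prefab from Resources and drop it in at a random spot/orientation.
        GameObject roomPrefab = Resources.Load<GameObject>("Rooms/Room01");
        Vector3 position = new Vector3(Random.Range(-50f, 50f), 0f, Random.Range(-50f, 50f));
        Quaternion rotation = Quaternion.Euler(0f, Random.Range(0, 4) * 90f, 0f);
        Instantiate(roomPrefab, position, rotation);
    }
}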
The problem I have is the lighting.
Because of the above-mentioned generation process the lighting has to be dynamic, but it doesn't seem to work. Below you can see the comparison between a baked and a realtime-rendered room:
Baked (I also don't know where these strange lighting borders on the walls, which look like someone painted the light with watercolors, are coming from):
Realtime:
As you can see, the realtime room doesn't seem to reflect light in any way.
These are my lighting settings:
And this is my 'sun':
What am I doing wrong?
Your lighting settings have Ambient Light set to 0. With realtime lighting, this means anything that can't directly see a light source won't be lit at all. The screenshot with baked lighting looks different because it has a baked lightmap.
If you're trying to get the realtime lighting to look exactly like the baked result, sorry, but Unity refuses to bake lightmaps at runtime. The closest you can probably get is by setting your Ambient Light to a color with its intensity above zero. Playing around with Light Probes probably won't be much good, since you need to light an entire room in a vacuum.
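Since your rooms are spawned at runtime anyway, you can set that ambient value from code too. A minimal sketch using a gradient (Trilight) ambient so the rooms get a bit of directionality; all the colors are example values:

using UnityEngine;
using UnityEngine.Rendering;

public class DungeonAmbient : MonoBehaviour
{
    void Start()
    {
        // Gradient ambient: brighter from above, darker from below, so surfaces
        // facing away from the 'sun' are dim instead of completely black.
        RenderSettings.ambientMode = AmbientMode.Trilight;
        RenderSettings.ambientSkyColor = new Color(0.40f, 0.40f, 0.45f);
        RenderSettings.ambientEquatorColor = new Color(0.25f, 0.25f, 0.30f);
        RenderSettings.ambientGroundColor = new Color(0.10f, 0.10f, 0.12f);
    }
}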
An alternate solution, depending on how well you know Unity, would be to Frankenstein together different scenes, which is mentioned briefly in Unity's Intro to Global Illumination, though I can't find it anywhere else.
Relevant links:
Baked Lightmaps: http://docs.unity3d.com/Manual/GIIntro.html
Light Probes: http://docs.unity3d.com/Manual/LightProbes.html
Ambient Light: http://docs.unity3d.com/Manual/GlobalIllumination.html