I am following this video to start learning about Unity.
https://www.youtube.com/watch?v=VV3sSM0gjQ8
In this video, his scene is pretty dark aside from the lights that he put in, while my scene is like this:
https://i.gyazo.com/5c038a72e47e37234ab56cecba29bee0.png
No matter what I do with the light toggle, I can never get it like his. How do I do this?
You are not seeing lighting in your Scene view. Click the light icon in the Scene view toolbar (it's next to the Shaded and 2D buttons) to preview how the lighting looks in your scene.
For even better results, use lightmapping. This means you place lights in the scene and let Unity precalculate the lighting for you; what you are using now is realtime lighting. Lightmapping will also let you use soft shadows. Read the basics here: http://docs.unity3d.com/410/Documentation/Manual/Lightmapping.html
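For reference, the shadow setting also lives on the Light component itself, so it can be adjusted from a script. A minimal sketch of my own (not part of the original answer), assuming it sits on the same GameObject as the light:

using UnityEngine;

public class LightSetupExample : MonoBehaviour
{
    void Start()
    {
        Light sceneLight = GetComponent<Light>();
        sceneLight.shadows = LightShadows.Soft; // same as the Shadow Type dropdown in the Inspector
        sceneLight.intensity = 1.0f;            // overall brightness of the light
    }
}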
I'm developing a simple zombie game with a day-night cycle. So, for the player to handle the night, I am making a flashlight. I modeled it and added a real Unity light to make it work as a flashlight (so it is a flashlight anyway, just a broken one..?). The light works in neither the Scene nor the Game window. Here are some screenshots (and a test video):
I tried clearing the cache, as I found suggested on the Internet, but it only worked until I switched tabs.
EDIT for #BugFinder [30.01.2023]
Light component for the flashlight:
Switching between Point and Spot light isn't doing anything.
Your light's Mode is currently set to Baked. This instructs Unity to prerender the light into static lighting when you bake lighting in the Lighting window. Obviously you don't want something static for your flashlight, so change the Mode to Realtime.
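For completeness, here is a minimal toggle script, a sketch of my own (the F key binding and the spot angle are assumptions, not from the original posts), assuming the Light component sits on the flashlight object and its Mode is set to Realtime:

using UnityEngine;

public class FlashlightToggle : MonoBehaviour
{
    Light flashlight;

    void Start()
    {
        flashlight = GetComponent<Light>();
        flashlight.type = LightType.Spot; // a spot light gives a flashlight-style cone
        flashlight.spotAngle = 45f;       // assumed beam width
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.F))  // assumed key binding
            flashlight.enabled = !flashlight.enabled;
    }
}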
Unity 2021.3.16f1/URP 12.1.8
I've just started with Unity a few weeks ago and am still getting to grips with how everything works. So please don't assume I know everything there is to know about Unity. Treat me as a n00b. 😉
I'm building a VR game for the Quest/Quest 2. I have a scene with a keypad on a wall. When the player "clicks" it, I want the scene to go dark and a large version of the (3D) keypad to appear, which he can then interact with (enter numbers). This keypad must always stay in the middle of his view.
What I did was create a canvas, and added a black plane with 50% transparency and the large version of the keypad. I've set up the canvas as follows:
This works somewhat. It has two major disadvantages: 1) the keypad is receiving lighting from the scene while I want it to be fully lit all the time, and 2) the canvas and all its children clip through walls and objects while I always want it to be rendered in front of everything else (yes I know this will mess with your depth perception in VR, but I already have a solution for that).
So the next thing I tried was stacking cameras. I created a second camera and set it as an overlay camera. I also set its Culling Mask to UI:
Additionally, I added the new camera to my Main Camera as a stacked camera. I changed the Culling Mask of the Main Camera to everything but UI:
This works the way I want, but at a cost: performance takes a huge hit. My frame rate actually halved. I read everywhere that this is a known problem for mobile devices (which the Quest really is).
Another solution I read about is using RenderObjects, but I can't really find how to use it. I'm not even sure it really is a solution to what I'm trying to achieve.
So can anyone tell me how I should go about doing this? Thanks in advance!
The solution for the lighting issue is to set the layer of your keypad to something like "Keyboard".
Then, in the Directional Light's Culling Mask, you can uncheck the Keyboard layer.
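If you'd rather do that from code, here is a small sketch of my own (the "Keyboard" layer name is just the example above), assuming the directional light is assigned in the Inspector:

using UnityEngine;

public class KeypadLightMask : MonoBehaviour
{
    public Light directionalLight; // assign the scene's directional light here

    void Start()
    {
        // Remove the Keyboard layer from the light's culling mask so the keypad is not lit by it.
        directionalLight.cullingMask &= ~(1 << LayerMask.NameToLayer("Keyboard"));
    }
}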
The solution for the second problem is that you can change the camera's culling mask at runtime, like this:
~(1 << LayerMask.NameToLayer("Keyboard"))
renders everything except the Keyboard layer.
1 << LayerMask.NameToLayer("Keyboard")
renders only the Keyboard layer.
NOTE: You can set up your own layer and check/uncheck whatever you want; this is just an example.
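Putting those two lines into context, a minimal runtime sketch of my own (the method and layer names are just examples) that switches the main camera between the normal scene and the keypad-only view:

using UnityEngine;

public class KeypadView : MonoBehaviour
{
    Camera cam;
    int keypadMask;

    void Start()
    {
        cam = Camera.main;
        keypadMask = 1 << LayerMask.NameToLayer("Keyboard");
    }

    public void ShowKeypad() { cam.cullingMask = keypadMask; }  // render only the Keyboard layer
    public void HideKeypad() { cam.cullingMask = ~keypadMask; } // render everything except Keyboard
}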
Ok so I've been trying to make a custom 2D lighting system in Unity, and I'm at that annoying stage where I know what I want to do but I'm not sure how to do it.
Here's the plan:
There will be dedicated light objects with their own meshes. These meshes determine the shape of the light.
Before the camera renders the whole scene, it does an extra render of just the light meshes with a black background to create a lightmap.
Then the camera renders the scene as normal (does NOT render the light meshes this time). Every object has a shader that will access the lightmap and shade itself appropriately depending on the color of the lightmap at that point.
That's the idea anyway. I sorta threw together a botched form of this. I used a separate camera to render the lightmap into a render texture with a culling mask so that it only rendered the light meshes, which are on their own layer. I then manually passed that texture to the shaders which use their screen uvs to sample from it.
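For reference, here is a rough sketch of that setup as I understand it (the lightCam reference, the texture size, and the _CustomLightmap property name are my own assumptions), including Shader.SetGlobalTexture so the lightmap can be read by every shader without passing it to each material:

using UnityEngine;

public class LightmapCapture : MonoBehaviour
{
    public Camera lightCam; // second camera whose culling mask only includes the light-mesh layer
    RenderTexture lightmapRT;

    void Start()
    {
        lightmapRT = new RenderTexture(Screen.width, Screen.height, 0);
        lightCam.targetTexture = lightmapRT;              // the light camera renders into this texture
        lightCam.clearFlags = CameraClearFlags.SolidColor;
        lightCam.backgroundColor = Color.black;           // unlit areas stay black

        // Make the lightmap available as a global shader property instead of per-material.
        Shader.SetGlobalTexture("_CustomLightmap", lightmapRT);
    }
}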
This works sorta OK, but the Scene view is completely messed up, since it tries to light things as if you were looking at them from the perspective of the lighting camera. I feel like this would make the system hard to use, so I want to try to make something that feels a bit more cohesive.
Here's some screenshots to explain:
The tan-ish box is my "light," which gets rendered to the light cam, visible in scene. This next shot is what renders to the lightmap:
The black background is not from the big black box; the light camera's clear flags are just set to solid black.
Now, according to this lightmap, the middle of the screen should be lit up, and that's exactly what happens:
Notice that in the game view, since the light camera is set up with the same position/rotation/perspective settings as the game camera, it looks fine:
The main problem is figuring out that extra render. Is there any way to create an extra pass for the main camera, before the scene render, that only renders the light meshes? I could probably figure out the rest from there. It would also be nice if I could make the lightmap a global shader variable, so that I don't have to pass it to each individual material, but one thing at a time, right?
Thanks so much to anyone who can shed some light on this subject. I'm still pretty new to shaders and rendering, so any help is much appreciated.
If I understand correctly, your problem is the appearance of your lights in the Scene View, right?
For that, you can create custom gizmos for them and hide the original objects. Here's a tutorial:
https://learn.unity.com/tutorial/creating-custom-gizmos-for-development-2019-2#5fa30655edbc2a002192105c
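A minimal gizmo sketch of my own (not from the tutorial), assuming it sits on each light object so you can hide the mesh renderer and still see the light in the Scene View:

using UnityEngine;

public class LightGizmo : MonoBehaviour
{
    // Editor-only marker drawn in the Scene View in place of the hidden light mesh.
    void OnDrawGizmos()
    {
        Gizmos.color = Color.yellow;
        Gizmos.DrawWireSphere(transform.position, 0.25f);
    }
}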
I just started using URP in Unity for a game in progress. I'm doing a sort of sprites-in-3d thing, so I'm rendering some sprite sheets on quads. To do this, I create a Material with the sprite sheet and use tiling/offset to render the proper frame of animation by making a call like:
CombatMaterial?.SetTextureOffset("_BaseMap", new Vector2( (AnimationDefinitions[animationDefinition] % 16) * .0625f, CombatMaterial.mainTextureOffset.y));
I'm currently trying to add some feedback into my game for when characters use abilities or get hit by flickering the material. Because the base color starts at white and goes to black, that won't really work; the only other thing I seem to have available to me is emission, which looks great. Using a 0xAAAish color achieves the effect I'm looking for. I've been using the Feel Unity asset to do this, but I've also attempted using something like this:
CombatMaterial?.SetColor("_EmissionColor", Color.white);
The problem is, once I've set the _EmissionColor, the main texture offset no longer updates in game, thereby ruining all animations. If I change the texture offset manually through the Inspector at runtime, animations don't work AND the _EmissionColor flickering stops working. If I mess around with the color of the _BaseMap in the Inspector, the _EmissionColor flickering starts working again.
Before I start diving into some unsightly color adjustments in an attempt to make this work again, I would love to know if I'm doing something that is simply unsupported by URP/Materials/whatever, or if there is some alternative to what I'm doing that's a little more straightforward.
Thank you!
After trying a bunch of random stuff, I don't have a "real" solution, but the game IS working how I want it to.
What worked for me was setting the _EmissionColor on the Material to (1,1,1). For some reason, when the _EmissionColor is set to (0,0,0) it's a black (ha) hole and won't accept future changes to the _EmissionColor. I assume this is some shader nonsense (with the base Lit Shader that URP uses) that I am clearly unfamiliar with.
Hopefully this helps anyone doing something as pointlessly against the grain as I am!
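For reference, here is a small sketch of that workaround in script form (my own example; the _EMISSION keyword line is an assumption about how the URP Lit shader gates emission, not something from the original post):

using UnityEngine;

public class EmissionFlicker : MonoBehaviour
{
    public Material CombatMaterial; // the sprite quad's material from the post

    void Start()
    {
        // Assumption: enabling the _EMISSION keyword and starting from a non-black
        // value keeps the emission path active so later changes take effect.
        CombatMaterial.EnableKeyword("_EMISSION");
        CombatMaterial.SetColor("_EmissionColor", Color.white);
    }

    public void Flash(float strength)
    {
        CombatMaterial.SetColor("_EmissionColor", Color.white * strength);
    }
}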
Just in case, what I meant is depicted in the image below. "Video" is from the Video Player component and "image" is a screenshot of the video opened in VLC. Even in the project folder the video looks redder than it does on the Video Player component. It's just really weird, and I'm wondering if it's somehow fixable? I read somewhere about colour space in the Player settings, and mine is already set to Gamma.
You should check in the Lighting window (Window > Lighting) that Fog is turned off.
If that doesn't help, I found this answer on the Internet:
I think that's because you are using a directional light (or another light) on your plane; disable it for the plane in whatever way works. For the dull color, you can try this:
The main idea is to make your plane act like a multiply layer over a white background.
Set the plane's material to be more metallic, with smoothness around 0.5 (you can play with this), and tick the reflection option in the forward rendering options. The next step is to make our mirror (plane) reflect white.
Go to Window > Lighting and set your skybox material to Sprites-Default; it will make your entire world white.
Please see if that works for you.
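If it's easier to test from code, the same two settings are exposed on RenderSettings; a small sketch of my own (assuming the Sprites-Default material is assigned in the Inspector):

using UnityEngine;

public class WhiteWorldSetup : MonoBehaviour
{
    public Material skyboxMaterial; // e.g. Sprites-Default, as suggested above

    void Start()
    {
        RenderSettings.fog = false;             // same as unchecking Fog in the Lighting window
        RenderSettings.skybox = skyboxMaterial; // same as setting the Skybox Material there
    }
}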