In Unity, is there a way to stop the camera from rendering a few objects, or let's say far-away objects, so that it is more performant? I can achieve this by adjusting the far clipping plane, but I also want the objects to be seen by the camera, such that the objects are visible in the scene but not rendered by the camera. This is specifically for VR purposes. Is there a way to achieve such an unusual thing?
If I understand correctly, you're asking for something to be visible but unrendered. This poses a conflicting problem, as rendering something is the process of making it visible.
Normally, you'd work with not rendering things outside of the viewport and LOD (level of detail).
LOD
Basically, rendering things that are far away at a lower level of detail, making the game more performant. You can see how to set it up here: https://docs.unity3d.com/Manual/LevelOfDetail.html
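LOD levels are usually configured in the Inspector as described in the link above, but they can also be built from a script. Here is a minimal, hypothetical sketch assuming an object that already has a high-detail and a low-detail child renderer (the field names and thresholds are placeholders, not anything from your question):

```csharp
using UnityEngine;

// Hypothetical sketch: building an LODGroup from code with two LOD levels.
public class LodSetupExample : MonoBehaviour
{
    public Renderer highDetailRenderer; // assumed high-poly version
    public Renderer lowDetailRenderer;  // assumed low-poly version

    void Start()
    {
        LODGroup lodGroup = gameObject.AddComponent<LODGroup>();

        LOD[] lods = new LOD[2];
        // Use the high-detail renderer while the object covers more than 50% of the screen.
        lods[0] = new LOD(0.5f, new Renderer[] { highDetailRenderer });
        // Use the low-detail renderer down to 10% of the screen, then cull it entirely.
        lods[1] = new LOD(0.1f, new Renderer[] { lowDetailRenderer });

        lodGroup.SetLODs(lods);
        lodGroup.RecalculateBounds();
    }
}
```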
Occlusion Culling
Culling is partly done by Unity, as can be read here: https://docs.unity3d.com/Manual/OcclusionCulling.html
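Not mentioned in the docs link above, but closely related to culling and possibly the closest fit to what you describe: the camera has per-layer culling distances, so objects on a chosen layer stop being rendered beyond a given distance while still existing in the scene. A small sketch, assuming a placeholder layer called "FarProps":

```csharp
using UnityEngine;

// Hypothetical sketch: cull objects on one layer beyond a custom distance.
public class LayerCullingExample : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();

        float[] distances = new float[32];                      // one entry per layer; 0 means "use the far clip plane"
        int farPropsLayer = LayerMask.NameToLayer("FarProps");  // placeholder layer name
        if (farPropsLayer >= 0)
            distances[farPropsLayer] = 75f;                     // stop rendering this layer beyond 75 units

        cam.layerCullDistances = distances;
        cam.layerCullSpherical = true;  // spherical culling tends to look less abrupt, which may help in VR
    }
}
```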
There might be more answers than this; I'm not sure I fully understand what you wish, and I am not a game designer by profession, so my optimization skills are sub-par.
Related
I'm new to Unity and trying to figure out the best way to create animated backgrounds. To be clear, I'm not asking you to give me an exact solution or instructions, and I would be grateful if you just tell me which direction to look in, and I will figure it out by reading the documentation.
I'm interested in how animated backgrounds are created in 2D Unity (for example, as here: https://youtu.be/OxiGlmV6ByA?t=1075 flying leaves are visible on the background). I only thought of using particles or just creating standard animations in Unity. But, the second way seems too long and complex, and about the particles, I'm not sure how much it affects the performance in a mobile game. Google searches mostly give instructions on how to create parallax backgrounds or moving backgrounds in Unity.
In general, I will be grateful if you tell me which approach is the most optimal for creating an animated background in a mobile 2D game in Unity.
The particle system is well optimized; for what you want to do, it will not noticeably affect performance, even on mobile.
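If you want a safety margin on mobile anyway, the particle system can be capped from code. A minimal sketch, assuming a ParticleSystem on the same object (the numbers are placeholders):

```csharp
using UnityEngine;

// Hypothetical sketch: keep a background leaf effect cheap by capping particle counts.
public class BackgroundLeavesExample : MonoBehaviour
{
    void Start()
    {
        ParticleSystem leaves = GetComponent<ParticleSystem>();

        var main = leaves.main;
        main.maxParticles = 100;        // hard cap on simultaneous particles

        var emission = leaves.emission;
        emission.rateOverTime = 5f;     // a few leaves per second is plenty for ambience
    }
}
```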
I'm having trouble figuring out how to light up large area(s) of sprites in Unity 2D. My previous knowledge on Unity's lighting is zero.
I first tried using a large amount of point lights with the "Sprites/Diffuse" material, but only about five would actually render at a time, so I guess there's a limit on that.
Then I tried putting in an area light. That didn't do anything, so that's when I started doing research about baked lighting on sprites (and baked lighting in general). I found stuff like this but I couldn't get it to work either because it's outdated or because I don't know what I'm doing. Other answers I've come across seem to assume that the reader knows anything about lighting in Unity in the first place which, to be honest, I don't. Unity's documentation website had some information on it, but no tutorials that go into how to set up baked lighting.
I've tried a bunch of different combinations of materials (like using the "Standard" shader for the sprites instead of "Sprites/Diffuse", emission, etc.) and I enabled "Baked Global Illumination" in Lighting > Settings.
If baked lighting isn't possible on sprites (or isn't worth the trouble), what are the alternatives?
Edit: I made sure not to have the lights pointing the wrong direction, and I do realise that Unity2D is just like painting onto a piece of paper in Unity3D. I was able to get point lights to work, but only a few at a time. I don't need to do the entire screen at once, I need to do a large specific area at once.
Some tips...
Working with sprites, you're in 2D. When you add a light, switch to 3D mode and rotate it to make sure your light is pointed at your objects and oriented so it is not on the same plane as (or level with) them, as that will cast all the light behind them.
If you're trying to light up everything on the screen (in camera), attach an area light to the camera at the camera's position, point it where the camera points, and then in the Inspector on the right you can change its variables: intensity, range, width, height, etc.
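One note on the "only five lights render" part of the question: in the built-in pipeline, the pixel light count in the Quality Settings usually limits how many lights are rendered per-pixel at once, which may be what you're hitting. Below is a hypothetical sketch that raises that limit and parents a configurable light to the camera; it uses a point light rather than an area light, since area lights in the built-in pipeline only work with baked lighting, and all the values are placeholders:

```csharp
using UnityEngine;

// Hypothetical sketch: raise the per-pixel light limit and parent a light to the camera.
public class CameraLightExample : MonoBehaviour
{
    void Start()
    {
        // Default quality levels use a low pixel light count, which caps how many lights render at once.
        QualitySettings.pixelLightCount = 16;

        GameObject lightObject = new GameObject("Camera Light");
        lightObject.transform.SetParent(Camera.main.transform, false); // follow the camera

        Light cameraLight = lightObject.AddComponent<Light>();
        cameraLight.type = LightType.Point;
        cameraLight.range = 20f;
        cameraLight.intensity = 1.5f;
    }
}
```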
Emissive Texture:
https://www.youtube.com/watch?v=oa6kW5HhRd4
For some reason, I never even thought about going into the asset store. I found this for free, and it looks like it will work: Light2D.
So I've hit a bit of an oddball in my project. I'm creating a horror scene, and to support the atmosphere I've used the fog from Unity's lighting settings. For my camera to see this fog, I need it to be on forward rendering. However, I'm creating a hallway with different rooms and lights, and these lights seem to shine through my wall objects when forward rendering is on. That's something I can fix by using deferred rendering instead (but then there's no fog).
It feels weird I'm in a position where I have to choose between the two and can't have it both ways. I tried messing around with some of the Legacy rendering, but no dice. All lights have a shadow strength of 1, and walls even have additional "Shadows only" walls, just to be sure nothing gets through.
It should be mentioned that I'm using one plane for all rooms (Not prefabs, one BIG object) if that has any impact at all.
Anyone experienced similar issues who has any workarounds?
If anyone runs into a similar issue, it's important to have different planes for different rooms. I was using 1 plane for the whole level, which caused this lighting bug to occur.
I had a similar problem... I solved it with:
Setting my camera's rendering path to Deferred (on the Camera component)
Adding the Post-process Layer component (from the Post Processing asset)
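For reference, the camera part of this can also be applied from a script; here is a small sketch (the Post-process Layer itself comes from the Post Processing package and is normally added in the Inspector):

```csharp
using UnityEngine;

// Hypothetical sketch: switch the camera's rendering path to deferred from code,
// equivalent to changing the Rendering Path dropdown on the Camera component.
public class DeferredCameraExample : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.renderingPath = RenderingPath.DeferredShading;
    }
}
```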
I am building a game where everything behind the player is greyed out, as if it lives in a memory, but I don't know how I can achieve this effect. Is this what shaders are used for?
Now, I can create an SKLightNode to create the lighting and make it dark around the edges, but I'd like the field of view for the character to be 120 degrees. Everything outside of that angle should be greyed out.
Of course, in the future I'd like the view to be blocked by obstacles, but that is outside the scope of this question.
A desaturation shader for SpriteKit can be found at my blog post on that subject. Note that this works in terms of an input texture, so you may need to adapt things to work on top of a tiled background. Also note that there is a new iOS 9 API to support capture of the output of a whole node, which may be useful to you in implementing this.
I'm new to Lighting in 3D. Just started working with Unity3D. I was creating a sample for myself to test shadows and there is a problem.
As you can see, I have created two simple walls with two cubes. I have also set up a directional light. Let's go to the backside of the walls to view the problem.
Technically, the front wall should be blocking the shadow of the back wall, but it is not. I have painted a red line to show where the shadow of the front wall is overlapping the shadow of the other, meaning it is going all the way through the wall. Why is that happening? Help, please...
Set your shader to Diffuse. I had the same problem and solved it this way; my spotlight was passing through as well.
That is interesting indeed; I have used Unity3D for 5+ years and have never seen or noticed this. However, and this might seem like a weird request, could you set the ground and the two cubes to Bumped Diffuse and make sure the cubes are touching the ground, since the shader that is used might allow shadows to pass?
Secondly, could you go to the Player Settings and check whether you are using forward or deferred rendering? Since their lighting techniques are very different, they might give different results.
But all in all, my best guess is that the shader you are using allows shadows to pass through.
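If you want to rule out the renderer and light settings while testing, here is a hypothetical sketch that forces a wall to cast and receive shadows and makes sure the light has shadows enabled (the same options exist in the Inspector; this is just the script equivalent):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical sketch: make sure a wall casts and receives shadows and the light casts them.
public class ShadowSetupExample : MonoBehaviour
{
    public Light directionalLight; // assumed reference to the scene's directional light

    void Start()
    {
        Renderer wallRenderer = GetComponent<Renderer>();
        wallRenderer.shadowCastingMode = ShadowCastingMode.On;
        wallRenderer.receiveShadows = true;

        if (directionalLight != null)
            directionalLight.shadows = LightShadows.Soft;
    }
}
```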