Decreasing decoration GameObjects' performance cost - unity3d

A lot of the GameObjects in my scene serve only as decoration. Blender shows me that I have about 100k vertices, while Unity reports 240k. After initialization nothing about them changes, not even their shadows.
Ideas:
Some of the meshes are duplicated, but I assumed it wouldn't change much if they derived from the same mesh, because the mesh would still need to be loaded two times.
I already made them static, but they are still pretty performance-heavy.
The camera doesn't always move, so I thought about rendering the current background into a render texture once, using that as the backdrop, and deactivating the background objects for as long as the camera isn't moving.
I hope, however, that there's a better solution.
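For reference, the render-texture idea above could be sketched roughly like this. This is only a minimal illustration, assuming a dedicated camera that sees only the decoration objects; the names (`snapshotCamera`, `backgroundRoot`, `Freeze`) are hypothetical, not from the post:

```csharp
using UnityEngine;

public class BackgroundSnapshot : MonoBehaviour
{
    public Camera snapshotCamera;     // camera that renders only the decoration layer
    public GameObject backgroundRoot; // parent of all decoration objects

    RenderTexture snapshot;

    public void Freeze()
    {
        if (snapshot == null)
            snapshot = new RenderTexture(Screen.width, Screen.height, 24);

        snapshotCamera.targetTexture = snapshot;
        snapshotCamera.Render();               // render the background once
        snapshotCamera.targetTexture = null;

        backgroundRoot.SetActive(false);       // stop rendering the live objects
        // display `snapshot` behind the scene, e.g. on a quad or a UI RawImage
    }

    public void Unfreeze()
    {
        backgroundRoot.SetActive(true);
    }
}
```

You would call `Freeze()` when the camera stops moving and `Unfreeze()` before it moves again; the trade-off is the memory for the RenderTexture versus the per-frame cost of the decoration.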
My settings for the gameobjects' mesh renderer:
All of them have the same material:
The graphics stats in the editor:

Related

Is setting the camera's culling mask to Nothing a good idea when UI covers the screen?

I have a pretty large and full scene and therefore get a lot of draw calls.
Sometimes I display a video in the game, which covers the entire screen.
When I tested my game with Unity's profiler tool I noticed that the camera still renders everything (although occlusion culling is enabled and calculated), and it causes the video to lag.
My question is how can I disable the camera?
When I disable the Camera component or the camera's GameObject, I get a warning ⚠ in the Game view that says "No camera is rendering to this display", which, I guess, is not good (correct me if I'm wrong).
So I was wondering whether clearing the camera's culling mask (by setting it to Nothing) would force Unity to stop rendering the scene.
Or does it still do some work in the background?
(Like UI elements, which are still rendered even when they are fully transparent.)
Thanks in advance
I have a pretty large and full scene and therefore get a lot of draw calls.
I recommend enabling "Instancing" on your materials; it can greatly reduce draw calls.
When the UI pops open, it can help to remove the "Default" layer (or whichever layer the majority of your renderers are on) from the active cameras. You can do this easily with layer masks. Or you can just set Camera.main.farClipPlane to 1 or any other low value.
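The layer-mask suggestion can be sketched as below. `cullingMask`, `LayerMask.NameToLayer` and `farClipPlane` are real Unity APIs; the helper names are mine, and "Default" is just the layer from the answer:

```csharp
using UnityEngine;

public static class CameraPause
{
    // Remove the "Default" layer from the camera's culling mask while the video plays.
    public static void HideDefaultLayer(Camera cam)
    {
        cam.cullingMask &= ~(1 << LayerMask.NameToLayer("Default"));
    }

    // Restore the layer when the video closes.
    public static void ShowDefaultLayer(Camera cam)
    {
        cam.cullingMask |= 1 << LayerMask.NameToLayer("Default");
    }
}
```

Unlike disabling the Camera component, this keeps a camera rendering to the display, so the "No camera is rendering to this display" warning never appears.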

Recommendations for clipping an entire scene in Unity

I'm looking for ways to clip an entire Unity scene to a set of 4 planes. This is for an AR game, where I want to be able to zoom into a terrain, yet still have it only take up a given amount of space on a table (i.e. not extend over the edges of the table).
Thus far I've got clipping working as I want for the terrain and a water effect:
The above shows a much larger terrain being clipped to the size of the table. The other scene objects aren't clipped, since they use unmodified standard shaders.
Here's a pic showing the terrain clipping in the editor.
You can see the clipping planes around the visible part of the terrain, and that other objects (trees etc) are not clipped and appear off the edge of the table.
The way I've done it involves adding parameters to each shader to define the clipping planes. This means customizing every shader I want to clip, which was fine when I was considering just terrain.
While this works, I'm not sure it's a great approach for hundreds of scene objects. I would need to modify whatever shaders I'm using, and then I'd have to be setting additional shader parameters every update for every object.
Not being an expert in Unity, I'm wondering if there are other approaches that are not "per shader" based that I might investigate?
The end goal is to render a scene within the bounds of some plane.
One easy way would be to use Box Colliders as triggers on each side of your plane. You could then turn off the Renderers of objects inside the trigger with OnTriggerEnter/OnTriggerStay and turn them back on with OnTriggerExit.
You can also use Bounds.Contains.
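A minimal sketch of the Bounds.Contains variant: hide any renderer whose position falls outside the table volume. `tableBounds` and `sceneRenderers` are hypothetical fields you would size and fill yourself:

```csharp
using UnityEngine;

public class TableClipper : MonoBehaviour
{
    // Axis-aligned volume matching the table top (sized in the Inspector).
    public Bounds tableBounds = new Bounds(Vector3.zero, new Vector3(1f, 0.5f, 1f));
    public Renderer[] sceneRenderers; // the objects to clip

    void Update()
    {
        foreach (Renderer r in sceneRenderers)
            r.enabled = tableBounds.Contains(r.transform.position);
    }
}
```

Note that, unlike the shader approach, this toggles whole objects on and off rather than clipping per pixel, so objects straddling the table edge will pop instead of being cut.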

Unity transparent sprite render order

I've been trying to fix this issue for a while. It concerns the render order of transparent particle sprites: regardless of the shader used, some of the sprites from the background nebula are rendered above the foreground ones. The image should clarify the situation. The sprites are Quads with materials on them which happen to use the Legacy Shaders/Particles/Alpha Blended shader.
I've even tried setting the renderQueue of the foreground quads' materials to a value higher than that of the background quads, but even that didn't help.
It seems that whatever I do, the render order of the transparent sprites is messed up. The shader currently used is Particles/Additive Blend, but similar shaders didn't really help either.
Particle system geometry is batched, so the render order is determined by the particle system itself. In the settings for the particle system, go to the last category "Rendering". In there you should find a field called "Sort Mode" that determines which particles are put in front of others. It sounds like you want the "By Distance" option.
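The same setting can be applied from script if you prefer; `sortMode` lives on the ParticleSystemRenderer component, and `ParticleSystemSortMode.Distance` corresponds to the "By Distance" option in the Inspector:

```csharp
using UnityEngine;

public class NebulaSorting : MonoBehaviour
{
    void Start()
    {
        // Sort particles back-to-front by distance from the camera.
        var psRenderer = GetComponent<ParticleSystemRenderer>();
        psRenderer.sortMode = ParticleSystemSortMode.Distance;
    }
}
```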

Unity3D: Why is particle lighting making directional shadows disappear?

UPDATE: As @BenHayward suspected, this is a bug. <link>
I have a very simple setup of cubes on a plane comprising a grid of quads. A directional light is shining down at the scene at an angle, producing a set of shadows from the cubes onto the quads.
Now I'm trying to produce an explosion effect with Unity's particle system, but when I add a point light to the particle system it causes all the directional-lighting shadows to disappear, whether they're in line of sight of the particle or not.
The shadows reappear when the particle is destroyed. Replicating the particle effect with pure C# doesn't cause any problems.
(Oh, and obviously I'm using the deferred rendering path.)
Any ideas? This is driving me up the wall.
[EDIT: I should have mentioned that the point light added to the particle system is set to cast shadows. The Unity standard particle pack has shadow-casting disabled by default. They too cause the problem when I turn the shadow-casting on.]
Based on the project that you linked to, it seems as though the particle system is causing the shadow cast from the directional light to flicker on and off quickly. I suspect this is a bug, since if it were intended behaviour, I wouldn't expect it to flicker in this manner.
In cases where this is not a bug, the problem can be caused by a couple of issues:
You can only have a certain number of dynamic (shadow-casting) lights visible in the camera frustum. By default, this number is quite low (I think it's 4). You can increase this number by going to Edit > Project Settings > Quality and raising the Pixel Light Count from its default value. You will need to increase this value to be greater than the total number of lights in your effect. Higher values allow more lights to be rendered on screen, but reduce performance.
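The Pixel Light Count quality setting can also be raised from script via the real `QualitySettings.pixelLightCount` property, which is handy if only the explosion effect needs the higher limit (the value 8 below is just an example):

```csharp
using UnityEngine;

public class RaisePixelLights : MonoBehaviour
{
    void Start()
    {
        // Allow up to 8 per-pixel lights instead of the default 4.
        QualitySettings.pixelLightCount = 8;
    }
}
```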
It also depends on the shaders you are using to receive the shadows. Some shaders will only render shadows for one directional light, and which light gets used isn't always easy to determine. If you are using the standard Unity shader this shouldn't be a problem, but if you are using a mobile-compatible surface shader, or something you've written yourself, then this could be the cause of the problem.
Also, for an explosion, I'd recommend using just one single point light (not lights attached to each particle), as this is all that is required. Any more lights would have a considerable performance impact on the GPU, especially if there is more than one explosion in the scene at any one time.
I recreated the scene as you described, but I can't reproduce your issue.
I mostly followed this tutorial and added a few cubes on a plane:
https://unity3d.com/learn/tutorials/topics/graphics/adding-lighting-particles
I will need a screenshot of your light components (both the directional and the point light), the particles, and the cubes (mostly the material). I cannot comment because I don't have enough reputation yet, so I'll delete this once you add the screenshots.

Does the entire scene get drawn on screen?

OK, so I'm building a game and currently just using the WebPlayer.
It plays just fine for me and others with no performance issues.
I loaded it to an iPhone 5 to see how it would handle performance and it's far from acceptable. Likely due to the nature of it and all of the objects and effects being drawn.
Is the entire scene loaded at once or are the items that are out of range only drawn when the camera is in that area?
Here is the game
http://burningfistentertainment.com/3D/DevilsNote/index.html
Any pointers would be great.
Is the entire scene loaded at once
Yes, it is. When you load a scene, everything it contains is loaded into memory. If necessary, try splitting the scene.
are the items that are out of range only drawn when the camera is in that area
Each active GameObject inside the camera frustum that has a Renderer component will be rendered. Animator components that are always animating can affect performance even if their objects are not rendered.
I loaded it to an iPhone 5 to see how it would handle performance and it's far from acceptable. Likely due to the nature of it and all of the objects and effects being drawn.
It's hard to say what the bottleneck could be without knowing more about your code. If you think the problem is GPU related, here are some tips for mobile:
reduce draw calls (it depends on the device; I'd say no more than 50-70) => reduce the material count (use atlases), mark non-moving objects as static (so they can be static-batched), ...
limit overdraw: reduce the size and number of transparent objects (what about the rain? how did you implement it?)
consider using occlusion culling (probably not a problem in your game, but if the depth complexity increases it could save you a lot of GPU workload)
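The static-batching tip above can also be applied at runtime with `StaticBatchingUtility.Combine` (a real Unity API), which is useful when scenery is spawned from script rather than placed in the editor. `scenery` here is a hypothetical root object holding the non-moving meshes:

```csharp
using UnityEngine;

public class BatchScenery : MonoBehaviour
{
    public GameObject scenery; // parent of all static decoration meshes

    void Start()
    {
        // Combines the child meshes so they render in far fewer draw calls.
        // The combined objects must not move, rotate, or scale afterwards.
        StaticBatchingUtility.Combine(scenery);
    }
}
```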