Does anyone know whether I can use trees created in Blender or 3ds Max? For example, why can't I use the wind with them, and why don't the leaves move?
Custom tree meshes can be used, they just need a bit more work.
It would be relatively easy to add a hinge constraint and apply the wind force directly on the tree mesh, but the tree would simply pivot around the hinge, and not bend as expected.
The built-in trees use shaders that deform the tree mesh according to the wind forces. You would need to apply the same kind of shaders to your tree to make the geometry bend instead of just swinging.
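If you just want a rough approximation without writing shaders, you can also bend the mesh on the CPU. This is a minimal, unoptimized sketch (the component name and parameters are made up for illustration) that sways vertices sideways with a sine wave, weighted by height so the base stays put:

using UnityEngine;

// Crude CPU-side wind sway: offsets vertices sideways by a sine wave,
// weighted by vertex height so the base of the trunk stays fixed.
[RequireComponent(typeof(MeshFilter))]
public class SimpleWindSway : MonoBehaviour
{
    public float amplitude = 0.1f;   // maximum sideways offset in local units
    public float frequency = 1.5f;   // sway speed

    Mesh mesh;
    Vector3[] baseVertices;

    void Start()
    {
        // MeshFilter.mesh returns a per-instance copy, so the shared asset is untouched.
        mesh = GetComponent<MeshFilter>().mesh;
        baseVertices = mesh.vertices;
    }

    void Update()
    {
        var vertices = new Vector3[baseVertices.Length];
        float sway = Mathf.Sin(Time.time * frequency) * amplitude;

        for (int i = 0; i < baseVertices.Length; i++)
        {
            Vector3 v = baseVertices[i];
            // Higher vertices bend more; vertices at y = 0 do not move at all.
            vertices[i] = v + new Vector3(sway * v.y, 0f, 0f);
        }

        mesh.vertices = vertices;
        mesh.RecalculateNormals();
    }
}

Bear in mind this touches every vertex on the CPU every frame, so for more than a handful of trees the shader-based approach used by the built-in trees is far cheaper.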
The Unity Asset Store has a number of plugins available:
Advanced Terrain Shaders v2 (Free)
Custom Tree Importer, an extension that can simplify importing custom tree meshes (Paid)
Another workaround for a simple tree might be to use a character rig on the tree body, and use particles for the leaves / branches. This would be computationally expensive, and difficult to modify.
I have some problems with batching trees. I'm using the default Unity terrain system and trees. My problem is that the trees won't get batched together (I've set up static batching, dynamic batching, and GPU instancing), and after inspecting the frame debugger I've come to these results:
I used trees with the optimized bark material and optimized leaf material
What causes a distinct draw call:
1- Wind
2- Color and size variations for trees
Draw call reason reported by the frame debugger: "Non-instanced properties set for instanced shader"
If I remove the wind or the variations, GPU instancing works (but I don't want to remove the wind and variations). Is there any way to batch trees in this case?
One of the best solutions is to use trees built with Unity's own tree system rather than importing trees from outside Unity.
In my game, I modify meshes at runtime using a damage algorithm.
After that, I would like to save them.
I get all meshes with:
MeshFilter[] meshfilters = MyObject.GetComponentsInChildren<MeshFilter>();
Then I modify each individual mesh.
How can I save that back into my original FBX file?
Thanks
Unity's internal mesh type is not the same as an FBX. FBX is a file storage format, while Unity uses a different internal structure at runtime for greater efficiency.
As far as I know, there is no standard way to save out a runtime mesh as an FBX (exporting FBX is a feature of 3D modelling software, not of a game engine).
You can save your deformations somehow and re-apply them when you reload the model. When I did a similar thing (custom terrain meshes generated at runtime), I would generate a mesh at runtime, save the data required to generate that mesh, and then pass that data back into the same flow I used to generate the mesh in the first place.
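For example, a minimal sketch of that idea (the class names and the file path are placeholders): save the deformed vertex positions to disk, then re-apply them to the freshly loaded mesh.

using System.IO;
using UnityEngine;

// Placeholder container for the deformation data to persist.
[System.Serializable]
public class MeshDeformation
{
    public Vector3[] vertices;
}

public static class DeformationStore
{
    // Save the current (deformed) vertex positions to a JSON file.
    public static void Save(MeshFilter filter, string path)
    {
        var data = new MeshDeformation { vertices = filter.mesh.vertices };
        File.WriteAllText(path, JsonUtility.ToJson(data));
    }

    // Load the saved positions and apply them to the mesh again.
    public static void Load(MeshFilter filter, string path)
    {
        var data = JsonUtility.FromJson<MeshDeformation>(File.ReadAllText(path));
        Mesh mesh = filter.mesh;
        mesh.vertices = data.vertices;
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();
    }
}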
It is possible to serialise mesh data in the Unity editor, since mesh data is actually a different format from FBX (Unity reads an FBX, generates a mesh asset from the FBX's data, then associates that mesh with the FBX). You do this with AssetDatabase.CreateAsset(): https://docs.unity3d.com/ScriptReference/AssetDatabase.CreateAsset.html
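A minimal editor-only sketch of that approach (the menu name and asset path are arbitrary), assuming the deformed object is selected in the hierarchy:

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class SaveMeshAsset
{
    [MenuItem("Tools/Save Selected Mesh As Asset")]
    static void SaveSelected()
    {
        var filter = Selection.activeGameObject.GetComponent<MeshFilter>();

        // Duplicate the runtime mesh so the saved asset is independent of the scene object.
        Mesh copy = Object.Instantiate(filter.sharedMesh);

        AssetDatabase.CreateAsset(copy, "Assets/SavedMesh.asset");
        AssetDatabase.SaveAssets();
    }
}
#endif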
But I don't think this is what you're intending to do.
If you want the full FBX suite of features (skinning, animation, use in other Unity projects) and you want them at runtime, you don't really have any options that I know of, short of implementing your own FBX serialisation library (not really a reasonable solution).
I've built a working surface shader (call it "wonderland") that renders as invisible unless a companion "lookingGlass" shader intersects with it from the viewpoint of the camera. Simple stencil shader arrangement.
Easy peasy.
I can add shader settings to specify a plane, or even just a minimum worldspace Z value, and use clip() to only render pixels on one side of that plane... (in other words, I could use that to trim the content that's allowed by the Stencil.)
What I want to do is use the stencil on surfaces "through the looking glass" (to reveal geometry that's inside the looking glass), and to always render those surfaces when they're on "our" side of the looking glass (to always show them if they're on this side of the portal). E.g., if z < 0, render only if the Stencil Ref value is satisfied; if z >= 0, render regardless.
Now, in Unity I can attach two materials to the MeshRenderer component (one with a stencil shader, one with a "plane cutoff" shader) - that works fine. It's pretty awesome, actually, at least visually. But while I haven't benchmarked it yet, I instinctively believe it's going to massively impact framerate if there are a number of objects, fairly complicated geometry, etc., set up with this arrangement.
(I can also manage shader attachment in code, and only do this when I expect something to transition, but I'm really hoping to get a unified shader out of this to avoid unnecessary draw calls.)
As it turns out, what I was looking to do is impossible.
The two shaders I wish to combine are both surface shaders. While you can combine multiple surface shaders into a multipass shader, you cannot combine multiple surface shaders with a Stencil and a clip() where the clip() is applied to passes the Stencil is not, and vice versa.
There are combinations that can achieve parts of this, or can achieve the entire goal with surface and vert (or other non-surf) shaders, but the combination of requirements stipulated by this question isn't supported as desired.
While this does not answer the question, the workaround in Unity is to create two materials that each provide one piece of the functionality. Both can exist on the object that needs both pieces, and code can manage whether one, the other, or both is actively in use.
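A minimal sketch of what that management code might look like (the material fields and method names are placeholders; when and how you call them depends on your transition logic):

using UnityEngine;

// Switches between the stencil material and the plane-cutoff material,
// or uses both, depending on which side of the portal the object is on.
public class PortalMaterialSwitcher : MonoBehaviour
{
    public Material stencilMaterial;   // placeholder: the "wonderland" stencil surface shader
    public Material cutoffMaterial;    // placeholder: the clip()-based plane cutoff shader

    Renderer rend;

    void Awake()
    {
        rend = GetComponent<Renderer>();
    }

    // Call when the object straddles the portal plane.
    public void UseBoth()
    {
        rend.materials = new[] { stencilMaterial, cutoffMaterial };
    }

    // Call when the object is entirely through the looking glass.
    public void UseStencilOnly()
    {
        rend.materials = new[] { stencilMaterial };
    }
}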
Similar solutions would be available in other packages.
This question is (mostly) game engine independent but I have been unable to find a good answer.
I'm creating a turn-based tile game in 3D space using Unity. The levels will have slopes, occasional non-planar geometry, depressions, tunnels, stairs, etc. Each level is static and handcrafted, so tiles should never move. I need a good way to keep track of tile-specific variables for these static levels, and I'd like to verify whether my approaches make sense.
My ideas are:
1. Create two meshes: the first is the complex game world; the second is a reference overlay with minimal geometry that will not be rendered and will only be used for the tiles. I would then overlay the two and use the second mesh as a grid reference.
2. Hard-code the tiles for each level. While tedious, this would work as a brute-force approach. I would, however, like to avoid it since it's not very easy to deal with visually.
3. Workaround approach: convert the 3D data to 2D textures and only use one mesh.
4. "Project" a plane down onto the level and record height/slope to minimize complexity. Also not ideal.
5. Create individual tile objects for each tile manually (non-rendered). The easiest solution I could think of.
Now for the Unity3D specific question:
Does Unity allow selecting individual vertices/triangles/quads of a mesh and adding components, scripts, or variables to those selections? For example, selecting one square of the default 10x10 Unity plane and telling Unity that square now has a boolean attached to it? This mostly refers to idea #1 above, where I would use a reference mesh whose positional and variable information is assigned directly to the mesh. I have a feeling that if I do choose to have a reference mesh, I'd need to make the tiles individual objects, snap them into place using the reference, and then attach the relevant scripts to those tiles.
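To make the last idea concrete, something like this hypothetical per-tile component is what I have in mind (the fields are just examples):

using UnityEngine;

// Hypothetical per-tile component: one non-rendered GameObject per tile,
// snapped into place against the reference grid, holding tile-specific variables.
public class Tile : MonoBehaviour
{
    public Vector2Int gridPosition;   // logical coordinates in the level grid
    public bool isWalkable = true;    // example per-tile flag
    public float slopeAngle;          // example per-tile data for slopes
}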
I have found a ton of excellent resources (like http://www-cs-students.stanford.edu/~amitp/gameprog.html) on tile generation (mostly procedural), but I'm a bit stuck on the basics because I'm new to Unity, and I'm not looking for procedural design.
I have a very simple but large scene containing lots of objects, and many of them are small but curved, so they have large polygon counts. The FPS in the scene is really horrible. I learned that a Level of Detail optimization should help a lot.
I am using three.js, and it has an option to set up LOD. But the model doesn't have any LOD information (alternate meshes for each object corresponding to distance from the object). Is there a tool to generate this information automatically by decimating the original mesh to create the alternate meshes?
But I can't imagine how textures would be applied to the decimated meshes. Do I have to create the LOD information manually? 3D editors like Blender, 3ds Max, and the Unity editor let me set these meshes up individually, but I have about 200 meshes in my scene.
Level of Detail information generally cannot be generated automatically. And yes, it is a painstaking process to create the LOD info. You can look at the LOD Book site for help.
The accepted answer to this question is actually not quite correct anymore.
While it's true that creating LOD data is a painstaking process, it becomes easy when using InstaLOD. InstaLOD is a fully automatic 3D optimization solution that can optimize any static or skeletal mesh while maintaining all vertex attributes, such as texture coordinates. Besides polygon optimization, InstaLOD also features remeshing, occlusion culling, imposter creation, and other methods related to the optimization of individual 3D models and complex scenes.
DISCLAIMER: I am one of the devs of InstaLOD.