When you create a tessellation shader in HLSL, is there a way to track the newly generated triangles with IDs? I have tried using semantics such as SV_PrimitiveID in the geometry shader along with SV_InstanceID in the domain shader, but they seem to output the original mesh's data. I hope to be able to store the data in a compute buffer. Any suggestions would help!
Thank you
You could look at the stream-output stage (https://msdn.microsoft.com/en-us/library/windows/desktop/bb205121(v=vs.85).aspx) to gather the generated triangles into a buffer from the geometry-shader stage.
By sending the tessellation coordinates (SV_DomainLocation) along with the vertices to the geometry shader, it becomes possible to uniquely identify each triangle by its three barycentric coordinate triples.
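To sketch how those triples could become IDs suitable for keying a compute buffer, here is an illustrative CPU-side C# version of the idea; the quantization width, the mixing constant, and all names here are arbitrary choices, not anything mandated by the API:

```csharp
using System;

public static class TessTriangleId
{
    // Quantize one barycentric component (0..1) to a 10-bit integer.
    // This should be stable for typical tessellation factors (up to 64).
    static int Quantize(float c) => (int)Math.Round(c * 1023f);

    // Pack a (u, v) pair into 20 bits; w is implied since u + v + w == 1.
    static int Pack(float u, float v) => (Quantize(u) << 10) | Quantize(v);

    // Build a key for one tessellated triangle from the patch's primitive ID
    // and the barycentric coords of its three vertices. Sorting makes the
    // key independent of vertex order.
    public static long Key(int patchId, (float u, float v)[] verts)
    {
        int[] packed =
        {
            Pack(verts[0].u, verts[0].v),
            Pack(verts[1].u, verts[1].v),
            Pack(verts[2].u, verts[2].v),
        };
        Array.Sort(packed);

        long key = patchId;
        foreach (int p in packed)
            key = key * 1_048_583 + p; // simple mix; fine for a sketch
        return key;
    }
}
```

The same packing could instead be done in the geometry shader before writing to the buffer; the point is only that the quantized barycentric triples, combined with the patch's SV_PrimitiveID, form a stable per-triangle key.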
I think I have a difficult problem here...
I want to be able to get the surfaces of, for example, the orange object in this three.js example: https://threejs.org/examples/?q=stl#webgl_loader_stl
I want to click with the mouse and find the correct surface, which should then be highlighted, so I can make sure it is the surface I want.
(I have already implemented the raycaster successfully, so that's not an issue.)
The intersectObject method returns an array of intersections, each of which has a face property. The face contains the vertex indices.
For STL files containing multiple solids, each solid is assigned to a different group, and the groups are available in the geometry object that is returned from STLLoader. Each group is defined by a range of vertex indices.
So, I think you can correlate the vertex indices returned from the raycaster with the vertex indices in the geometry groups.
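The correlation itself is just a range check. A minimal sketch in C# (GroupRange is a made-up stand-in for a three.js geometry group, which exposes a start index and a count):

```csharp
using System.Collections.Generic;

// Stand-in for a three.js geometry group: a range of vertex indices.
public struct GroupRange
{
    public int Start; // first vertex index in the group
    public int Count; // number of vertex indices in the group
}

public static class SolidPicker
{
    // Given the vertex index of a raycast hit (e.g. face.a) and the
    // geometry's groups, return the index of the solid that was hit,
    // or -1 if the index falls outside every group.
    public static int FindGroup(int hitVertexIndex, IList<GroupRange> groups)
    {
        for (int i = 0; i < groups.Count; i++)
        {
            GroupRange g = groups[i];
            if (hitVertexIndex >= g.Start && hitVertexIndex < g.Start + g.Count)
                return i;
        }
        return -1;
    }
}
```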
I am currently developing an asteroid mining/exploration game with fully deformable, smooth voxel terrain using marching cubes in Unity 3D. I want to implement an "element ID" system that is kind of similar to Minecraft's, as in each type of material has a unique integer ID. This is simple enough to generate, but right now I am trying to figure out a way to render it, so that each individual face represents the element its voxel is assigned to. I am currently using a triplanar shader with a texture array, and I have gotten it set up to work with pre-set texture IDs. However, I need to be able to pass in the element IDs into this shader for the entire asteroid, and this is where my limited shader knowledge runs out. So, I have two main questions:
How do I get data from a 3D array in an active script to my shader, or otherwise how can I sample points from this array?
Is there a better/more efficient way to do this? I thought about creating an array with only the surface vertices and their corresponding ID, but then I would have trouble sampling them correctly. I also thought about possibly bundling an extra variable in with the vertices themselves, but I don't know if this is even possible. I appreciate any ideas, thanks.
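For what it's worth, the second idea is possible: Unity can carry an extra per-vertex value in a spare UV channel, which the shader reads as an extra TEXCOORD. A minimal sketch (the channel and names are illustrative; vertices shared between faces of different elements would need duplicating so each face reads a uniform ID):

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class ElementIdMesh
{
    // Writes one element ID per vertex into UV channel 2 (TEXCOORD2),
    // where a triplanar shader could read it as uv2.x.
    public static void ApplyElementIds(Mesh mesh, int[] elementIdPerVertex)
    {
        var ids = new List<Vector2>(elementIdPerVertex.Length);
        foreach (int id in elementIdPerVertex)
            ids.Add(new Vector2(id, 0f));

        mesh.SetUVs(2, ids);
    }
}
```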
I am trying to create an advanced shader for an AR application in Unity. For that I need all vertices and their neighbors from my GameObject (within a C# script). Getting the vertices is not the problem, but how do I get their neighbors (maybe via the index buffer)?
I am not new to shaders, but I am new to shaders within Unity.
Once I have the neighbors, I would like to pass them from a C# script to a function within a shader file. I assume that should be possible in Unity, shouldn't it?
The easiest way I can think of is searching the triangle index array. The pattern is always the same: every 3 consecutive indices define one triangle. Since you know the index of your vertex, you can scan the triangle array and, for every triangle that contains it, return the other two indices. That gives you all vertices that share a triangle with the desired one.
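A minimal sketch of that scan (note that Unity splits vertices at UV/normal seams, so positions that look identical can appear under several different indices):

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class MeshNeighbors
{
    // Returns the indices of all vertices that share a triangle
    // with the given vertex.
    public static HashSet<int> GetNeighbors(Mesh mesh, int vertexIndex)
    {
        var neighbors = new HashSet<int>();
        int[] triangles = mesh.triangles; // 3 consecutive indices per triangle

        for (int i = 0; i < triangles.Length; i += 3)
        {
            if (triangles[i] == vertexIndex ||
                triangles[i + 1] == vertexIndex ||
                triangles[i + 2] == vertexIndex)
            {
                neighbors.Add(triangles[i]);
                neighbors.Add(triangles[i + 1]);
                neighbors.Add(triangles[i + 2]);
            }
        }

        neighbors.Remove(vertexIndex); // keep only the other vertices
        return neighbors;
    }
}
```

This is linear in the number of triangles per query; if you need neighbors for every vertex, build a vertex-to-triangle map in a single pass instead.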
I want glow on my sprites using the UV coordinates, but the problem is that if the sprite comes from an atlas created by Unity's sprite packer, the UVs aren't normalized from 0 to 1 but instead run between two arbitrary values. How do I normalize the UV data for a single sprite that resides in an atlas? Am I required to pass additional information into the shader, or should I already have the necessary information to do this? The image below describes the situation:
The hand on the left is a sprite not from an atlas. The hand on the right is a sprite from an atlas. I want the right hand to look the same as the hand on the left.
I am not that familiar with shaders yet, so I am reliant on Shader Forge. I am using the following Shader Forge layout:
You probably already know this, but the fundamental problem is the output of your "UV Coords" node. The other nodes in your shader are expecting normalized UVs ranging from 0 to 1, but that's not what you're getting when you use the texture atlas.
I can think of two ways to solve that. They're both viable, so I'd recommend trying whichever one fits more cleanly into your workflow.
Add a second UV channel
It's easy to treat UV0 as the only UV channel, but for certain techniques it can be helpful to add multiple UV coords to each vertex.
As an example, lightmapping is a popular feature where each model has its own individual textures (diffuse/normal/etc), but each scene has a pre-baked lightmap texture that is shared between multiple models -- sort of like an atlas for lighting information. UVs for these will not match, so the lightmap UVs are stored on a second channel (UV1).
In a similar fashion, you could use UV0 for atlas UVs and UV1 for local UVs. That gives you clean input on the [0,1] range that you can use for that multiply effect.
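If you build or post-process the sprite mesh yourself, deriving the local UVs from the bounds of the atlas UVs might look roughly like this (a sketch; it assumes the sprite's atlas rect is axis-aligned and non-degenerate):

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class SpriteLocalUVs
{
    // Derives [0,1] UVs from the bounds of the atlas UVs and stores them
    // in UV1 (TEXCOORD1), leaving the atlas UVs untouched in UV0.
    public static void AddLocalUVs(Mesh mesh)
    {
        Vector2[] atlasUVs = mesh.uv;

        // Bounds of this sprite's region within the atlas.
        Vector2 min = atlasUVs[0], max = atlasUVs[0];
        foreach (Vector2 uv in atlasUVs)
        {
            min = Vector2.Min(min, uv);
            max = Vector2.Max(max, uv);
        }

        // Remap each atlas UV into [0,1].
        var localUVs = new List<Vector2>(atlasUVs.Length);
        foreach (Vector2 uv in atlasUVs)
        {
            localUVs.Add(new Vector2(
                (uv.x - min.x) / (max.x - min.x),
                (uv.y - min.y) / (max.y - min.y)));
        }

        mesh.SetUVs(1, localUVs);
    }
}
```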
Add material params
You could scale and offset the UVs so that they are normalized.
Before rendering, find the min and max values for the mesh's UV coords
Pass those values in as material parameters
Add shader nodes to scale and offset the input UV, such that the range is normalized
For example, you could subtract min from each UV (offset), then divide by max - min (scale), as in the sketch below.
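Script-side, that might look like the following sketch (the property name _UVBounds is made up; match whatever your Shader Forge graph exposes):

```csharp
using UnityEngine;

[RequireComponent(typeof(SpriteRenderer))]
public class UVBoundsSetter : MonoBehaviour
{
    // Property name is illustrative; match your shader's parameter.
    static readonly int UVBounds = Shader.PropertyToID("_UVBounds");

    void Start()
    {
        var sr = GetComponent<SpriteRenderer>();
        Vector2[] uvs = sr.sprite.uv; // this sprite's atlas UVs

        // Find the min and max UV coords of the sprite.
        Vector2 min = uvs[0], max = uvs[0];
        foreach (Vector2 uv in uvs)
        {
            min = Vector2.Min(min, uv);
            max = Vector2.Max(max, uv);
        }

        // Pass them in as one material parameter,
        // packed as (min.x, min.y, max.x, max.y).
        sr.material.SetVector(UVBounds, new Vector4(min.x, min.y, max.x, max.y));
    }
}
```

The shader side then computes (uv - _UVBounds.xy) / (_UVBounds.zw - _UVBounds.xy). Note that reading .material instantiates a per-renderer copy of the material; a MaterialPropertyBlock avoids that if it matters.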
So, I have a functioning voxel engine that creates smoothed terrain in chunks of 1x1x1 meter in a 1024 radius around my player.
I wanted to create a geometry shader that not only continues to texture the ground appropriately, but also creates grass (preferably waving with the wind).
I have found some basic billboard geometry shaders to get me started, but they seem to cause the mesh to stop texturing. Is there any way to do both from one shader?
Do I pass the mesh triangles and the new grass triangles on to the fragment shader with a flag? Thanks in advance!
As it turns out, you can do this by giving the shader two passes. My first pass is a simple surface shader, and my second pass is the geometry shader. The multi-pass version still runs at 130 FPS, so it seems to be adequate.