Unity3D fragment shader - distance from pixel to vertex?

I'm trying to create a star field fragment shader in which the vertices of a mesh are treated as stars. I want the pixel color to go from (0,0,0,1) to (1,1,1,1) as the distance from the pixel (interpolated from the position output of the vertex shader) to the vertex (output from the vertex shader in some way that prevents it from being interpolated) goes from some threshold value down to 0. I want that threshold to be calculated from the z value of the vertex.
Is this possible? How can it be done?
I'm having trouble computing the distance from the interpolated vertex position to the non-interpolated one in the fragment shader. Maybe it could be done by somehow getting the screen coordinate of the vertex in the vertex shader and passing that along as a color value or something? I seem to be able to access the screen coordinates of a fragment, but not the world coordinates. (If I try to use anything with position semantics in the fragment shader, I get an error, as I guess you'd expect.)
Additionally, I'd really like to also pass a (star) color value and size value along with a vertex, from the vertex shader, and have those affect the output of the fragment shader.
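One way to do what's described above is to duplicate the vertex position into an extra interpolator flagged nointerpolation, so the fragment shader receives both the smoothly interpolated position and the star's own position unchanged. Below is a minimal sketch under some assumptions, not a tested solution: the star size is assumed to be packed into UV0.x, the star color into the vertex color channel, and nointerpolation needs a Shader Model 4+ target.

Shader "Custom/StarField"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma target 4.0
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float4 color  : COLOR;     // per-star color, authored into the mesh
                float2 uv     : TEXCOORD0; // x = star size (assumed packing)
            };

            struct v2f
            {
                float4 pos      : SV_POSITION;
                float3 worldPos : TEXCOORD0;                   // interpolated per pixel
                nointerpolation float3 starPos   : TEXCOORD1;  // constant across the triangle
                nointerpolation float4 starColor : TEXCOORD2;
                nointerpolation float  starSize  : TEXCOORD3;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
                o.starPos = o.worldPos;   // same value, but not interpolated
                o.starColor = v.color;
                o.starSize = v.uv.x;      // could instead be derived from the vertex z
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // 0 at the star's vertex, growing toward the triangle interior.
                float d = distance(i.worldPos, i.starPos);
                float falloff = saturate(1.0 - d / i.starSize);
                return lerp(fixed4(0, 0, 0, 1), i.starColor, falloff);
            }
            ENDCG
        }
    }
}

One caveat: with nointerpolation each triangle takes the value of its provoking vertex only, so this works cleanly when a star's triangles don't share vertices with other stars.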

Related

Is it possible to pass a custom position to calculate lighting in unity surface shaders?

I want to create a shader that uses different coordinates for light calculations than for what's being displayed. This probably sounds strange, but I would like to do this for lighting a top-down 2D game.
I want to write a vertex shader that offsets the Z coordinate by the value of the Y coordinate for display, but uses unmodified coordinates for lighting calculation.
Is this possible to do, and if so, where would I start?
So far I have a surface shader that offsets the Z coordinate by the value of the Y coordinate, but Unity uses the modified coordinates to calculate lighting; I would like Unity to use the unmodified coordinates for the lighting calculations.
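For reference, the setup described above would look roughly like this as a surface shader with a custom vertex function; this is a sketch of the starting point (shader and property names are illustrative), not of the solution:

Shader "Custom/TopDownYOffset"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert

        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        void vert (inout appdata_full v)
        {
            // Offset Z by Y for display. Because this runs before lighting,
            // Unity lights the modified position too, which is exactly the
            // problem being asked about.
            v.vertex.z += v.vertex.y;
        }

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}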

How to get a pixel color from texture by the position of the object in shader graph?

I want to create a shader that grabs a pixel color based on the object's position.
Basically Texture2D.GetPixel(transform.position.x, transform.position.z), but in a shader.
You need a Sample Texture 2D node to get a pixel color from a texture. Use a Position node as the sample UV; since the Position node outputs a 3D vector, use Split and Combine nodes to extract the axes you want and recombine them, then connect the result to the UV input of the Sample Texture 2D node.
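That node setup corresponds roughly to the following HLSL, which could also live in a Custom Function node; the function name is made up here, and the 1:1 mapping from world XZ units to UV space is an assumption (in recent Unity versions Shader Graph passes textures to custom functions as UnityTexture2D):

// Sample a texture using the XZ components of a position as the UV.
// PositionWS would come from a Position node (or the Object node's
// Position output, if you want transform.position specifically).
void SampleByPosition_float(UnityTexture2D Tex, float3 PositionWS, out float4 Color)
{
    float2 uv = PositionWS.xz; // assumes 1 world unit == 1 UV unit
    Color = SAMPLE_TEXTURE2D(Tex.tex, Tex.samplerstate, uv);
}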

Rotate vertices selected using weight map on UVs in Unity3D's Shader Graph around pivot point

TLDR: Can't figure out the correct Shader Graph setup for using UV and vertex displacement to cheaply animate a (unrigged) mesh.
I am trying to rotate a part of the mesh based on the UV coordinates, e.g: fromX 0 toX 0.4, fromY 0 toY 0.6. The mesh is created uv-mapped with this in mind.
I have no problem getting the affected vertices in this area. The problem is that I want to rotate these vertices around a customizable axis, e.g. axis(X:1, Y:0, Z:1), using a weight so that the rotation takes place around a pivot point. I want the bottom of the selection to stay connected to the rest of the mesh while the other affected vertices neatly rotate around this point.
The weight can be painted by using split UV channels as seen in the picture:
I multiply the weighted area with a rotation node to rotate it.
And I add that to the position multiplied by the inverted weight (the rest of the verts, excluding the rotated area) to get the final output displacement.
But the rotated mesh is bent. I need it to be rigid, as in the whole part rotating with weight=1, except for the pivoting vertex itself.
I can get it as described using a weight=1 based rotation, but the pivot point becomes the center of the mesh, not the desired point.
How can I do this correctly?
Been at it for days, please help :')
I started using Unity about a month ago, and this is one of the first issues I faced.
The node you are using will always transform the vertices around the origin.
I think you have two options available:
Translate the vertices by the offset of where you want to rotate the wings. This would require storing the pivot point of the wings in the mesh somehow - this could be done by utilizing a spare UV channel, or by using the vertex color channel.
Use bones and paint the weights in your chosen 3D package. This way, you can record the animation, and use Unity's skinned mesh shader to render it.
Hope that helps.
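In code form, the first option amounts to translating by the pivot, rotating, and translating back. A minimal sketch (the helper name is illustrative; the rotation itself is what a Rotate About Axis node computes, here via Rodrigues' formula):

// Rotate a point around an arbitrary pivot and a normalized axis.
// angle is in radians.
float3 RotateAroundPivot(float3 positionOS, float3 pivotOS, float3 axis, float angle)
{
    float3 p = positionOS - pivotOS; // move the pivot to the origin
    float s = sin(angle);
    float c = cos(angle);
    // Rodrigues' rotation formula
    p = p * c + cross(axis, p) * s + axis * dot(axis, p) * (1.0 - c);
    return p + pivotOS;              // move back
}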
Try this:
I've used the UV ranges from your example applied to a sphere of unit size. The sphere's original pivot is in the centre, and its adjusted pivot is shifted 0.5 on the Y axis.
The only variable the shader doesn't know is the adjusted pivot position, so I pass this in through the material.
I've not implemented your weight in the graph, as I just wanted to show you the process. You can easily plug that in.
The color output is just being used for debug purposes.
The first image is with the default object pivot.
The second image is with the adjusted pivot.
The final image is the graph. (Note the logic group is driving the vertex rotation based on the UV mask).
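Expressed as code, the masking logic in that graph boils down to blending the original and rotated positions by the painted weight. A sketch using the RotateAroundPivot helper above, with the UV thresholds from the question; _PivotOS, _Axis and _Angle are assumed material properties:

// weight == 1 rotates fully, weight == 0 leaves the vertex untouched.
float weight = step(uv.x, 0.4) * step(uv.y, 0.6); // 1 inside the masked UV range
float3 displaced = lerp(positionOS,
                        RotateAroundPivot(positionOS, _PivotOS, _Axis, _Angle),
                        weight);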

Do I need triangle information in my surface shader?

Update
The main question is: how can I pass the world-space vertex positions of the triangle to the surface shader in a Unity shader?
As mentioned in a comment, it might be possible to pass them from a geometry shader. But I read somewhere that implementing a custom geometry shader overrides Unity's logic for calculating shadows etc.
I would add the triangle information to the Input structure. But before I change my mesh generation logic for it, I would like to know whether this is feasible. For this solution, the vertex positions of the triangle must be constant for every pixel in a triangle and must not be interpolated.
This is the original question:
I am writing a surface shader for a triangle mesh. I assign a custom vertex attribute with a texture id to every vertex. Now I want the surface shader to apply the textures as seen in the following image. (Note that each color represents a texture.)
In the surface shader I need the three vertices that define the triangle and their texture ids. Furthermore, I need the position of the pixel I am drawing.
If all texture ids are the same I pick this texture for all pixels.
If one or two texture ids differ, I calculate the pixel's distance to the triangle vertices and pick the texture as seen in the next image:
The surface shader needs to be aware of the pixel's triangle. With this logic I should get the shading I am looking for. I am creating my mesh programmatically, so I can add the triangle vertices and their texture ids as vertex attributes and pass them to the surface shader.
But I am not sure if this is feasible with how surface/vertex shaders work. Is there a relationship between the vertex and the pixel to get my custom triangle information from? Is there a better way of doing this?
I am using Unity's ShaderLab for my shaders.
No, you cannot use (nor do you have access to) vertex data in a fragment shader. In a fragment shader you only have access to data about that given pixel; you cannot go back and look at the mesh that formed it (this is the way the pipeline is constructed).
What you can do (and it is a common practice) is to bake the data into one of the available channels (i.e. other UV mapping channels) of the verts within the vertex shader. This way the fragment shader will have access to the value via the interpolators.
OK, I think I found a solution. Thank you for the comments, they were useful.
First I change my grid topology to not use shared vertices. With this I can use a vertex color channel to set the texture ids.
vertexColor.r = vertexTextureId0; // Texture to use for vertex 0
vertexColor.g = vertexTextureId1; // Texture to use for vertex 1
vertexColor.b = vertexTextureId2; // Texture to use for vertex 2
I do not have to worry about interpolation because all vertices of the triangle have the same color information.
Now I create a texture to look up which vertex my pixel belongs to. This texture looks similar to the images I posted in the question. I have to transpose the UV coordinates according to the two- or three-texture case. This solution gives me the freedom to easily change the edge and make it more ragged.
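A sketch of how the fragment side of this might look, assuming the lookup texture stores per-vertex ownership weights in its RGB channels; shader and property names are illustrative, and the final texture selection by id is left out:

Shader "Custom/TriangleTextureIds"
{
    Properties { _MaskTex ("Ownership Mask", 2D) = "white" {} }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert

        sampler2D _MaskTex;

        struct Input
        {
            float2 uv_MaskTex;
            float4 color : COLOR; // r,g,b = texture ids of vertices 0,1,2
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            // The mask says which of the triangle's vertices "owns" this
            // pixel; reading the ids is safe despite interpolation because
            // all three vertices of a triangle carry identical color data.
            fixed3 owner = tex2D(_MaskTex, IN.uv_MaskTex).rgb;
            float texId = dot(owner, IN.color.rgb);
            o.Albedo = texId.xxx; // debug: visualize the selected id
        }
        ENDCG
    }
    FallBack "Diffuse"
}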

How to apply a texture image in GLGravity Teapot iphone?

As an OpenGL example, Xcode provides the GLGravity project. But instead of showing a plain yellow color, how can I apply a texture image without texture coordinates?
You need some kind of texture coordinate, otherwise the whole concept of textures makes no sense: a texture is a function mapping a set of n coordinates to some value (depth, luminance, alpha, colour, or a combination of those), defined by the data the samples are taken from and interpolated.
You can generate the texture coordinates, either statically from your mesh or in the vertex shader, or you can supply them directly. But you need some texture coordinates to make this work. A very cheap and simple texture coordinate generator is using the vertex position as the texture coordinate; this will project your texture along the coordinate axes onto the model. So if you've got a 2D texture, it will be applied in the XY plane, as if there were a parallel-projecting slide projector at coordinates (0, 0, ∞).
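A minimal sketch of that position-as-texcoord trick, written in Unity-style HLSL for consistency with the rest of this page (GLGravity itself uses OpenGL ES, but the idea translates directly):

Shader "Custom/PlanarProjection"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert (float4 vertex : POSITION)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(vertex);
                o.uv = vertex.xy; // object-space XY as UV: projects along Z
                return o;
            }

            fixed4 frag (v2f i) : SV_Target { return tex2D(_MainTex, i.uv); }
            ENDCG
        }
    }
}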