For an OpenGL example, Xcode provides a project called GLGravity. Instead of showing a plain yellow color, how can I apply a texture image without texture coordinates?
You need some kind of texture coordinate, otherwise the whole concept of textures makes no sense: a texture is a function that maps a set of n coordinates to some value (depth, luminance, alpha, colour, or a combination of those), defined by the data the samples are taken from and interpolated.
You can generate the texture coordinates, either statically from your mesh or in the vertex shader, or you can supply them directly. But you need some texture coordinates to make this work. A very cheap and simple texture coordinate generator is to use the vertex position as the texture coordinate; this projects your texture along the coordinate axes onto the model. So if you've got a 2D texture it will be applied in the XY plane, as if a parallel-projecting slide projector were sitting at coordinates (0, 0, ∞).
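As a minimal sketch of that last idea, written in HLSL/Cg-style syntax for consistency with the Unity examples further down (the same one-liner works in GLSL; if the GLGravity project uses fixed-function OpenGL ES, pointing glTexCoordPointer at the vertex position array achieves a similar effect), the vertex stage simply reuses the object-space XY position as the UV. ModelViewProjection is a placeholder uniform, not something taken from the GLGravity project:

// Position-as-texcoord sketch: the texture is projected onto the XY plane.
float4x4 ModelViewProjection;

void vert(float4 position : POSITION,
          out float4 clipPos : SV_POSITION,
          out float2 uv      : TEXCOORD0)
{
    clipPos = mul(ModelViewProjection, position);
    uv      = position.xy;   // planar projection along the Z axis
}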
Related
I want to create a shader that uses different coordinates for light calculations than for what's being displayed. This probably sounds strange, but I would like to do this for lighting a top-down 2D game.
I want to write a vertex shader that offsets the Z coordinate by the value of the Y coordinate for display, but uses unmodified coordinates for lighting calculation.
Is this possible to do, and if so, where would I start?
So far I have a surface shader that offsets the Z coordinate by the value of the Y coordinate, but Unity uses the modified coordinates to calculate lighting; I would like Unity to use the unmodified coordinates for the lighting calculation.
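For reference, here is a minimal sketch of the kind of surface shader described above (shader and property names are illustrative, not from the original project). The vertex function offsets Z by Y for display; because Unity generates the lighting passes from the same modified vertex data, lighting follows the offset position, which is exactly the behaviour being complained about:

Shader "Custom/YOffsetZSketch"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        CGPROGRAM
        #pragma surface surf Lambert vertex:vert

        sampler2D _MainTex;

        struct Input
        {
            float2 uv_MainTex;
        };

        // Offset Z by Y for display. Unity's generated lighting passes
        // see this modified position too, hence the unwanted lighting.
        void vert (inout appdata_full v)
        {
            v.vertex.z += v.vertex.y;
        }

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}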
I want to create a shader that grabs a pixel color based on the object's position.
Basically Texture2D.GetPixel(transform.position.x, transform.position.z), but in a shader.
You need a Sample Texture 2D node to read a pixel color from the texture. Use a Position node as the sample UV: the Position node gives a 3D vector, so use a Split node and a Combine node to extract the axes you want and recombine them, then connect the result to the UV socket of the Sample Texture 2D node.
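If you prefer to write it by hand instead of using Shader Graph, here is a rough HLSL equivalent (names like _Scale are illustrative): the fragment shader samples the texture with the world-space X and Z of the pixel, which mirrors the Position → Split → Combine → Sample Texture 2D chain described above.

Shader "Custom/SampleByWorldXZ"
{
    Properties
    {
        _MainTex ("Lookup Texture", 2D) = "white" {}
        _Scale ("World To UV Scale", Float) = 1
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float _Scale;

            struct v2f
            {
                float4 pos      : SV_POSITION;
                float3 worldPos : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos      = UnityObjectToClipPos(v.vertex);
                o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Keep X and Z of the world position and use them as UVs.
                float2 uv = i.worldPos.xz * _Scale;
                return tex2D(_MainTex, uv);
            }
            ENDCG
        }
    }
}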
I'm pretty new to Shader Graph and shaders in general. I'm working on a 2D project and I'm trying to make a shader that rotates an arrow to make a flow-like material and use it on a sprite shape.
Basically what I want to do is make a proper version of this:
What I'm currently doing is multiplying the Y position from the Position node by an exposed Vector1 and using it in a Rotate node (which I know is pretty hacky and won't work if the shape is not an arc).
Aligning UVs with an arbitrary mesh seems a bit hard. Why not bend a pre-made mesh instead? The graph below bends vertex positions around the Z axis at a given point and strength (a strength of 0 makes the mesh invisible, though), but you can easily replace that Position node with a UV node and plug the result into a Sample Texture 2D node. I just guess that bending a mesh will give you better/easier results.
Create a subdivided and well UV-mapped rectangle plane
Bend that plane with a vertex shader (the attached graph bends around the Z axis; a rough HLSL equivalent is sketched below)
The graph is based on code from the Blender source
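Here is a rough HLSL sketch of such a bend, loosely modelled on Blender's simple-deform bend (the actual graph may differ in details; _BendStrength and _BendOrigin are illustrative names, and a strength of 0 is degenerate, which matches the "makes mesh invisible" note above):

// Bend object-space positions around the Z axis. Call this on v.vertex.xyz
// in the vertex function before the usual transform.
float3 BendAroundZ(float3 p, float strength, float origin)
{
    float theta = (p.x - origin) * strength; // bend angle grows along X
    float r = 1.0 / strength;                // bend radius (diverges at strength 0)
    float s, c;
    sincos(theta, s, c);

    float3 q;
    q.x = -(p.y - r) * s + origin;
    q.y =  (p.y - r) * c + r;
    q.z =  p.z;                              // the Z axis itself is untouched
    return q;
}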
Update
The main question is: how can I pass the world-space vertex positions of the triangle to the surface shader in a Unity shader?
As mentioned in a comment it might be possible to pass them from a geometry shader. But I read somewhere that implementing a custom geometry shader overrides Unity's logic for calculating shadows etc.
I would add the triangle information to the Input structure. But before I change my mesh generation logic for it, I would like to know if this is feasible. For this solution the vertex positions of the triangle must be constant for every pixel in a triangle and must not be interpolated.
This is the original question:
I am writing a surface shader for a triangle mesh. I assign a custom vertex attribute with a texture id to every vertex. Now I want the surface shader to apply the texture as seen in the following image. (Note that each color represents a texture.)
In the surface shader I need the 3 vertices that define the triangle and their texture ids. Furthermore I need the position of the pixel I am drawing.
If all texture ids are the same I pick this texture for all pixels.
If one or two texture ids differ I calculate the pixel's distance to the triangle vertices and pick the texture as seen in the next image:
The surface shader needs to be aware of the pixel's triangle. With this logic I should get the shading I am looking for. I am creating my mesh programmatically, so I can add the triangle vertices and their texture ids as vertex attributes and pass them to the surface shader.
But I am not sure if this is feasible with how surface/vertex shaders work. Is there a relationship between the vertex and the pixel to get my custom triangle information from? Is there a better way of doing this?
I am using Unity's ShaderLab for my shaders.
No, you should not use (nor do you have access to) vertex data in a fragment shader. In a fragment shader you only have access to data about that given pixel; you cannot go back and look at the mesh that formed it (this is how the pipeline is constructed).
What you can do (and it is common practice) is to bake the data into one of the available channels (i.e. the other UV mapping channels) of the vertices within the vertex shader. This way the fragment shader has access to the value via the interpolators.
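A minimal sketch of that pattern in a surface shader (assuming the data was baked into the vertex colour during mesh generation; the field name triData is illustrative): the custom vertex function copies the per-vertex data into the Input struct, and the surface function reads it back through the interpolators.

// Used with: #pragma surface surf Lambert vertex:vert
struct Input
{
    float2 uv_MainTex;
    float3 triData;                  // custom per-vertex data
};

void vert (inout appdata_full v, out Input o)
{
    UNITY_INITIALIZE_OUTPUT(Input, o);
    o.triData = v.color.rgb;         // e.g. texture ids baked into the vertex colour
}

void surf (Input IN, inout SurfaceOutput o)
{
    // IN.triData arrives here via the interpolators; if all three vertices
    // of a triangle carry the same value, it is constant across the triangle.
    o.Albedo = IN.triData;
}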
OK, I think I found a solution. Thank you for the comments, they were useful.
First I changed my grid topology to not use shared vertices. With this I can use a vertex color channel to set the texture ids.
vertexColor.r = vertexTextureId0; // Texture to use for vertex 0
vertexColor.g = vertexTextureId1; // Texture to use for vertex 1
vertexColor.b = vertexTextureId2; // Texture to use for vertex 2
I do not have to worry about interpolation because all vertices of the triangle have the same color information.
Now I create a texture to look up which vertex my pixel belongs to. This texture looks similar to the images I posted in the question. I have to transpose the UV coordinates according to the TWO or THREE case. This solution gives me the freedom to easily change the edge and make it more ragged.
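A hypothetical sketch of how the surface function could combine the baked ids with such a lookup texture (_CornerMask and the one-hot mask convention are my illustrative assumptions, not part of the original answer):

struct Input
{
    float2 uv_MainTex;
    float4 color : COLOR;          // r,g,b = texture ids of vertices 0, 1, 2
};

sampler2D _CornerMask;             // lookup mask resembling the images above

void surf (Input IN, inout SurfaceOutput o)
{
    // The mask channel says which triangle corner "owns" this pixel;
    // dot() then selects that corner's texture id.
    fixed3 corner = tex2D(_CornerMask, IN.uv_MainTex).rgb;
    float id = dot(IN.color.rgb, corner);
    o.Albedo = id.xxx;             // visualise the selected id for debugging
}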
A Texture2D is mapped on a sphere. Thanks to GetPixel(raycastHit.textureCoord) I can get the pixel value of this texture when, for instance, a ray hits the sphere. So if I convert an image I to a texture T and map T onto a GameObject G, I can get the pixel value from G to I (G --textureCoord--> T --GetPixel--> I). If I convert those (x, y) coordinates into world coordinates, I can trigger events on certain colors, so it works. I use this solution to go from pixel to world position.
But I can't do the opposite. Imagine that I have a list of bounding boxes for different objects on the image I (so with coordinates within I.width and I.height). If I convert those coordinates (bx1, bx2) into world coordinates, it simply doesn't work.
I noticed that when I compare the GetPixel value when I target a given color on G with my controller, I don't get the same pixel coordinates as the ones on the original image.
In the end I want to get the texture coordinate from an image (G <--textureCoord_to_world_coordinates-- T <--?????-- I).