As you can see in the image, on larger tiles (n > 1) the texture should repeat to fill the current rect size. I don't know how I can achieve this!
FYI, I'm getting the tile texture ID from the alpha value of the vertex color.
Here is the shader I'm using:
[UPDATE]
Thanks for clarifying the UV coordinates; unfortunately that doesn't answer my question. Take a look at the following picture...
Your shader is fine; it's actually the vertex UVs that are the problem:
So for all rectangles the UV coordinates are as follows: [0, 0] / [0, rect.height] / [rect.width, 0] / [rect.width, rect.height]. So the UVs go beyond 1.
Your shader is designed to support the standard UV space, in which case you should replace rect.width and rect.height with 1.
By using UV coords greater than one, you're effectively asking for texels outside of the specified texture. When used with a texture atlas, that means you're asking for texels outside of the specified tile -- in this case, those happen to be white, and that's what you're seeing in the rendered output.
Tiling with an atlas texture
Updating because I missed an important detail: you want a tiling material.
Usually, UVs interpolate linearly:
For tiling, you essentially want more of a "sawtooth" output:
For a non-atlas texture, you can adjust material scale/wrap settings and call it done. For an atlas texture, it's possible but you'll end up with a shader and/or geometry that aren't quite standard.
The "most standard" solution would be if your larger quads are on a separate mesh from the smaller ones:
Add a float material param named uv_scale or some such
Add a Multiply node that scales incoming UVs by uv_scale
Pass output from that into a Frac node
Pass output from that into the UV Tile node
Pseudocode is roughly: uv = frac(uv * uv_scale)
If you need all of your quads to be in the same mesh, you end up needing non-standard geometry:
Change your UVs again (going back to rect.width and rect.height)
Add a Frac node before the UV Tile node
This is a simpler shader change, but has the downside that your geometry will no longer be cleanly supported in other shaders.
Thanks rutter!
I've implemented your solution in my shader and now it works perfectly!
So for everyone looking for this, here is the shader I'm using now:
Cheers, M
I've had an issue with normal maps not behaving correctly in my custom shader and finally managed to find the cause. It turns out it was the way the UVs of my objects were mapped. In UV0 I stored a mapping to a color palette texture - the UVs were all scrambled together, as the only thing that mattered was that they landed on a pixel with the correct color. In UV1 I stored the traditional UV unwrap, which I used to apply the normal map. To get the normal map I used a set-up like this:
I'm doing my own light calculations so I need to transform the normal from tangent space to world space before using it.
This approach was causing two issues - weird artifacts and the normals being "stuck" to the object:
The sphere on the right is upside down and if you look at the normals they are also upside down. The artifacts are on both spheres, but they are visible on the right one from this perspective.
What seems to be the cause is the way I used UV0 to map the object to a color palette. It somehow affects the tangent-to-world-space transformation done by the Transform node (I know it's this node because removing it makes the artifacts disappear). Also, swapping the UV channels so that the traditional unwrap is in UV0 and the palette mapping is in UV1 fixes the issue:
There are no artifacts and the normals aren't stuck to the object.
So why is the transform node affected by UV mapping? I thought it does the transformation based on the geometry of the object. And if it uses UV maps, why is there no dropdown to select which UV it's going to use?
The tangent space (aka texture space) is partially defined by the geometry (the normal) but also by the uv coordinates (the tangent).
Just as the normal is derived from the vertex position, the tangent is inferred from the UVs. It's essentially an object-space vector (xyz) that points along the U axis (horizontal in your UV space) and is perpendicular to the normal.
The normal map is a vector in texture space, and the channels of the bitmap can be seen as the offsets from the vertex defined tangent space for each of its vectors.
// Reconstructs an object-space vector from a tangent-space sample (e.g. a
// normal map texel), using the tangent-space basis: tangent, binormal, normal.
half3 TangentToObjectSpace(half3 input, half3 nml, half3 tgt, half3 btg) {
    return tgt * input.x + btg * input.y + nml * input.z;
}
As you can see, input.x (the normal map's red channel, which defines the horizontal part of the vector) modulates the tangent. We need another vector for the green channel, which we can generate with a cross product: given the two existing vectors, you get a new one perpendicular to both. The catch: for flipped UV shells this generated vector points in the wrong direction. In Unity, tangents are actually vector4, and the last component (w) is the flip of the binormal.
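The cross-product-plus-flip described above can be sketched on the CPU like this (a C sketch of the shader math; the vec3 type and function names are illustrative, not Unity API):

```c
typedef struct { float x, y, z; } vec3;

/* Cross product: yields a vector perpendicular to both inputs. */
static vec3 cross3(vec3 a, vec3 b)
{
    vec3 r = { a.y * b.z - a.z * b.y,
               a.z * b.x - a.x * b.z,
               a.x * b.y - a.y * b.x };
    return r;
}

/* Reconstruct the binormal from the normal and tangent. tangent_w is the
 * fourth component of Unity's vector4 tangent (+1 or -1), which flips the
 * binormal for mirrored UV shells. */
vec3 binormal(vec3 normal, vec3 tangent, float tangent_w)
{
    vec3 b = cross3(normal, tangent);
    b.x *= tangent_w; b.y *= tangent_w; b.z *= tangent_w;
    return b;
}
```

For example, with normal (0,0,1) and tangent (1,0,0), the binormal comes out as (0,1,0), or (0,-1,0) when w = -1.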
Unity uses the tangent space of the first UV coordinate by default, which is why you fixed your issue by swapping them :)
It's very, very likely that the Transform node depends on the UV selection only insofar as it depends on the output of the Texture Sample node. So it's probably a subtle difference in the sample node's output that makes the Transform node behave this way, and not some hidden UV setting on the Transform node.
So, I want to make a scene like this Sphere Scene.
Now I have a randomly generated mesh as a ground, and a sphere. But I don't know how to cull the sphere's geometry above the mesh. I tried using a stencil and a height map. The stencil rendered the ground in front, but the sphere above the ground was still rendered. Using a height map to decide whether a fragment needs to be rendered (I compared the height map against worldPos) is problematic, because the texture is superimposed over the whole sphere rather than projected onto it. Can you help? Is there a shader function to cull everything above the mesh?
I did something similar for an Asteroids demo a few years ago. Whenever an asteroid was hit, I used a height map - really, just a noise map - to offset half of the vertices on the asteroid model to give it a broken-in-half look. For the other half, I just duplicated the asteroid model and offset the other half using the same noise map. The effect is that the two "halves" matched perfectly.
Here's what I'd try:
Your sphere model should be a complete sphere.
You'll need a height map for the terrain.
In your sphere's vertex shader, for any vertex north of the equator:
Sample the height map.
Set the vertex's Y coordinate to the height from the height map. This will effectively flatten the top of the sphere, and then offset it based on your height map. You will likely have to scale the height value here to get something reasonable.
Transform the new x,y,z as usual.
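The steps above can be sketched in C as follows (a CPU-side sketch of the vertex-shader logic, assuming a unit sphere centered at the origin; sample_height, flat_height, and height_scale are hypothetical names):

```c
typedef struct { float x, y, z; } vertex;

/* Example height lookup: flat terrain at a constant height (illustrative
 * stand-in for sampling the real height map texture). */
static float flat_height(float u, float v) { (void)u; (void)v; return 0.25f; }

void flatten_north(vertex *vert, float height_scale,
                   float (*sample_height)(float, float))
{
    if (vert->y > 0.0f) {                   /* only vertices north of the equator */
        float u = (vert->x + 1.0f) * 0.5f;  /* project x/z into [0,1] map coords */
        float v = (vert->z + 1.0f) * 0.5f;
        vert->y = sample_height(u, v) * height_scale;
    }
}
```

Vertices on or below the equator pass through untouched, which is what keeps the bottom half of the sphere intact.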
Note that you are not texturing the sphere. You're modifying the geometry. This needs to happen in the geometry part of the pipeline, not in the fragment shader.
The other thing you'll need to consider is how to add the debris - rocks, etc. - so that it matches the geometry offset on the sphere. Since you've got a height map, that should be straightforward.
To start with, I'd just get your vertex shader to flatten the top half of the sphere. Once that works, add in the height map.
For this to look convincing, you'll need a fairly high-resolution sphere and height map. To cut down on geometry, you could use a plane for the terrain and a hemisphere for the bottom part. Just discard any fragment for the plane that is not within the spherical volume you're interested in. (You could also use a circular "plane" rather than a rectangular plane, but getting the vertices to line up with the sphere and filling in holes at the border can be tricky.)
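The "discard fragments outside the spherical volume" test mentioned above boils down to a distance check against the sphere's footprint. A C sketch (in a real fragment shader you would call discard instead of returning false; center and radius would be uniforms):

```c
#include <stdbool.h>

/* Keep only plane fragments whose world-space x/z position falls inside
 * the sphere's circular footprint of the given radius around (cx, cz). */
bool inside_sphere_footprint(float px, float pz,
                             float cx, float cz, float radius)
{
    float dx = px - cx, dz = pz - cz;
    return dx * dx + dz * dz <= radius * radius;
}
```

Comparing squared distances avoids a square root per fragment.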
As I realised, there's no standard way to cull it without artifacts. The only way it can be done is with raymarched rendering.
I want glow on my sprites using the UV coordinates, but the problem is, if the sprite originates from an atlas created by Unity's sprite packer, then the UVs aren't normalized from 0 to 1, but run between two arbitrary values. How do I normalize UV data for a single sprite that resides in an atlas? Am I required to pass additional information into the shader, or should I already have the necessary information to do this? The image below describes the situation:
The hand to the left is a sprite not from an atlas. The hand on the right is a sprite from an atlas. I want the right hand to look the same as the hand on the left.
I am not that familiar with shaders yet, so I am reliant on ShaderForge. I am using the following ShaderForge layout:
You probably already know this, but the fundamental problem is the output of your "UV Coords" node. The other nodes in your shader are expecting normalized UVs ranging from 0 to 1, but that's not what you're getting when you use the texture atlas.
I can think of two ways to solve that. They're both viable, so I'd recommend trying whichever one fits more cleanly into your workflow.
Add a second UV channel
It's easy to treat UV0 as the only UV channel, but for certain techniques it can be helpful to add multiple UV coords to each vertex.
As an example, lightmapping is a popular feature where each model has its own individual textures (diffuse/normal/etc), but each scene has a pre-baked lightmap texture that is shared between multiple models -- sort of like an atlas for lighting information. UVs for these will not match, so the lightmap UVs are stored on a second channel (UV1).
In a similar fashion, you could use UV0 for atlas UVs and UV1 for local UVs. That gives you clean input on the [0,1] range that you can use for that multiply effect.
Add material params
You could scale and offset the UVs so that they are normalized.
Before rendering, find the min and max values for the mesh's UV coords
Pass those values in as material parameters
Add shader nodes to scale and offset the input UV, such that the range is normalized
For example, you could subtract min from each UV (offset), then divide by max - min (scale).
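The offset-and-scale above is a one-liner. A C sketch of the math, with uv_min and uv_max standing in for the material parameters:

```c
/* Normalize an atlas UV into [0,1]: subtract the minimum (offset), then
 * divide by the range (scale). Assumes uv_max > uv_min. */
float normalize_uv(float uv, float uv_min, float uv_max)
{
    return (uv - uv_min) / (uv_max - uv_min);
}
```

For a sprite occupying [0.25, 0.75] in the atlas, a UV of 0.25 maps to 0 and 0.75 maps to 1, restoring the clean [0,1] range the rest of the shader expects.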
I have a standard plane created with Unity and replaced its mesh filter (which had 121 tris, 202 vertices) with a mesh filter made in Blender that has 2 tris / 4 vertices.
If I set the material up with a texture, I only get a very small portion of the texture drawn on the plane. How can I draw the full texture on the new plane?
You need to adjust your UV mapping so that the 4 vertices cover the whole image. Take a look at this demo file, especially at the UV scene layout.
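Concretely, covering the whole image means the quad's four vertices carry UVs spanning the full [0,1] range. A C sketch of that layout (the struct and array names are illustrative):

```c
typedef struct { float u, v; } uv2;

/* UVs for a 2-triangle quad that draw the texture exactly once:
 * each corner of the quad maps to the matching corner of the image. */
static const uv2 quad_uvs[4] = {
    { 0.0f, 0.0f },  /* bottom-left  */
    { 1.0f, 0.0f },  /* bottom-right */
    { 0.0f, 1.0f },  /* top-left     */
    { 1.0f, 1.0f },  /* top-right    */
};
```

If the imported mesh's UVs only cover a small sub-rectangle of this range, you see only that small portion of the texture, which matches the symptom described.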
If a texture shows that way, it means either the UVs of the imported model are wrong, or the texture tiling or offset in the material is wrong.
Instead of importing a mesh for such simple shape you can create one procedurally in code like this: https://github.com/doukasd/Unity-Components/blob/master/ProceduralPlane/Assets/Scripts/Procedural/ProceduralPlane.cs
I am attempting to create a tile engine using a pixel shader and two textures. One texture will hold the tileset and one the map.
Is it possible to read the texture data as actual (unsampled) data so I can pull indexes from the map?
What is the best way to read that pixel data?
I have tried just tex2D, but that leaves something to be desired (I am a bit new to pixel shaders, to be honest).
Basically, I need a way to read the actual data from a specific pixel in my map texture and use that as an integer index into the tile texture. Assume I have managed to create and pass the appropriate textures to the shader.
Any thoughts?
(using monogame for metro so dx level 9_1)
If you use tex2D and pass in (x + 0.5) / width and (y + 0.5) / height, you should get the exact pixel value at (x, y). More information here: Texture memory-tex2D basics
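The (x + 0.5) / width formula converts an integer texel index into the UV coordinate of that texel's center, so point/nearest sampling returns exactly that texel. A C sketch of the conversion:

```c
/* Map an integer texel index to the normalized coordinate of that
 * texel's center. With point sampling, tex2D at this coordinate reads
 * exactly texel i, with no filtering bleed from neighbors. */
float texel_center(int i, int size)
{
    return ((float)i + 0.5f) / (float)size;
}
```

For a 2-texel-wide texture, the centers land at 0.25 and 0.75 rather than 0.0 and 0.5, which is why sampling at x / width alone can land on a texel edge and blend neighbors.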