How to convert world position to UV space? - unity3d

I have a gameObject1 which transmits its position into the shader of a ScreenGameObject. ScreenGameObject has position (0,0,0), rotation (0,0,0), scale (1,1,1).
In the shader I need to make a texture follow this gameObject1.
In general, I think I need to convert the world position of gameObject1 into the UV space of ScreenGameObject.
Here I found a question that is very close to mine: link
In the vertex shader I wrote:
o.uvOffset = mul(unity_WorldToObject, _FrameTexPos).xy;
In the frag function I have:
return tex2D(_AimTex, i.uvOffset).rrrr;
As a result, the _AimTex texture does not move correctly at all:
_FrameTexPos.x makes the texture move along OY, and _FrameTexPos.y makes it move along the local OX axis.
I guess this:
mul(unity_WorldToObject, _FrameTexPos).xy;
does not convert _FrameTexPos correctly (from world space to the object's UV space).
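For a default Unity Quad at an identity transform, object space runs from -0.5 to +0.5 on X and Y while UVs run from 0 to 1, so the world-to-object result still needs remapping, and the position has to be multiplied as a point (w = 1) so translation is applied. A minimal sketch of that idea, assuming ScreenGameObject is such a Quad and _FrameTexPos holds a world-space position (if the object is actually a built-in Plane, the relevant object axes are X and Z rather than X and Y, which could explain the swapped movement):

// vertex shader: bring the tracked world position into this object's UV range
float4 objPos = mul(unity_WorldToObject, float4(_FrameTexPos.xyz, 1.0)); // w = 1 so translation applies
o.uvOffset = objPos.xy + 0.5;  // default Quad: object space -0.5..0.5 maps to UV 0..1

// fragment shader: center _AimTex on that point instead of sampling at the offset itself
return tex2D(_AimTex, i.uv - i.uvOffset + 0.5).rrrr;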

Related

Reconstruct near plane ray intersection with camera

I am trying to reconstruct the point where the ray of the camera rendering the current pixel intersects the near plane.
I need the coordinates of the intersection point in the local coordinates of the object being rendered.
This is my current implementation:
float4 nearClipLS = mul(inv_modelViewProjectionMatrix, float4(i.vertex.x / i.vertex.w, i.vertex.y / i.vertex.w, -1.0, 1.0));
nearClipLS /= nearClipLS.w;
There's got to be a more efficient way to do it, but the following should, in theory, work.
Find the offset vector from the camera to the pixel:
float3 cam2pos = v.worldPos - _WorldSpaceCameraPos;
Get the camera's forward vector:
float3 camFwd = UNITY_MATRIX_IT_MV[2].xyz;
Get the dot product of the two to determine how far the point projects in the direction of the camera's forward axis:
float projDist = dot(cam2pos, camFwd);
Then, you should be able to use that data to re-project the point onto the near clip plane:
float nearClipZ = _ProjectionParams.y;
float3 nearPos = _WorldSpaceCameraPos + (cam2pos * (nearClipZ / projDist));
This solution doesn't address edge cases (like when the point is level with or behind the camera, which could cause problems), so you may want to check for those once you get it working.
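Putting those pieces together, and converting back into the object's local space as the question asks, might look roughly like this (a sketch; worldPos is assumed to be the interpolated world-space position passed in from the vertex shader):

float3 cam2pos  = worldPos - _WorldSpaceCameraPos;   // camera -> surface point
float3 camFwd   = UNITY_MATRIX_IT_MV[2].xyz;         // camera forward axis, as above
float  projDist = dot(cam2pos, camFwd);              // distance along that axis

float  nearClipZ = _ProjectionParams.y;              // near clip plane distance
float3 nearPosWS = _WorldSpaceCameraPos + cam2pos * (nearClipZ / projDist);

// back into the object's local coordinates, as the question asks for
float3 nearPosLS = mul(unity_WorldToObject, float4(nearPosWS, 1.0)).xyz;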

How do I decode a depthTexture into linear space in the [0-1] range in HLSL?

I've set a RenderTexture as my camera's target texture and chosen DEPTH_AUTO as its format so it renders the depth buffer.
I'm reading this texture in my shader with float4 col = tex2D(_DepthTexture, IN.uv); and, as expected, it doesn't show up linearly between my near and far planes, since depth textures have more precision towards the near plane. So I tried Linear01Depth(), as recommended in the docs:
float4 col = tex2D(_DepthTexture, IN.uv);
float linearDepth = Linear01Depth(col);
return linearDepth;
However, this gives me an unexpected output. If I sample just the red channel, I get the non-linear depth map, and if I use Linear01Depth(), it goes mostly black.
Question: What color format is chosen when using DEPTH_AUTO in the Render Texture inspector, and how do I convert it to linear [0, 1] space? Is there a manual approach, like a logarithmic or exponential function, that I could use?
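For reference, a depth render texture is effectively single-channel, so the usual approach in the built-in pipeline is to sample the red channel and pass that scalar to Linear01Depth(); the manual equivalent is written in terms of _ZBufferParams. A sketch using the question's _DepthTexture:

// the depth render texture is single-channel, so take the red channel as a scalar
float rawDepth = tex2D(_DepthTexture, IN.uv).r;
float linear01 = Linear01Depth(rawDepth);      // helper from UnityCG.cginc

// manual equivalent of Linear01Depth, written out with _ZBufferParams:
// float linear01 = 1.0 / (_ZBufferParams.x * rawDepth + _ZBufferParams.y);

return float4(linear01, linear01, linear01, 1);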

Decal wrap around mesh

I'm working on a tattoo simulator program. I need to know if there's a way for the decal (tattoo) to wrap around the target mesh, like having a tattoo that goes from one side of, let's say, a leg to the other, or even behind it.
Not at runtime, using a projected decal, no.
What you need here instead is a procedural tattoo map. Think of it as another texture, like a lightmap. You may need a custom shader, but it could possibly be done with the secondary albedo channel of the standard shader.
The tricky part is writing to that texture. I'll outline the basic algorithm, but leave it up to you to implement:
The first thing you need to be able to do is unwrap the mesh's triangles in code. You need to identify which edges are contiguous on the UV map, and which are separate. Next, you need a way to identify the tattoo and the initial transform. First, you'll want to define an origin on the tattoo source texture that it will rotate around. Then you'll want to define a structure that references the source texture, and the UV position (Vector2) / rotation (float) / scale (float) to apply it to in the destination texture.
Once you have the tattoos stored in that format, then you can start building the tattoo mask texture for the skin. If your skin uvs have a consistent pixel density, this is a lot easier because you can work primarily in uv-space, but if not, you'll need to re-project to get the scale for each tri. But, basically, you start with the body triangle that contains the origin, and draw onto that triangle normally. From there, you know where each vertex and edge of that triangle lies on the tattoo source texture. So, loop through each neighboring triangle (I recommend a breadth-first recursive method) and continue it from the edge you already know. If all three verts fall outside the source texture's rect, you can stop there. Otherwise, continue with the next triangle's neighbors. Make sure you're using the 3D mesh when calculating neighbors so you don't get stuck at seams.
That algorithm is going to have an edge case you'll need to deal with for when the tattoo wraps all the way around and overlaps itself, but there are a couple different ways you can deal with that.
Once you've written all tattoos to the tattoo texture, just apply it to the skin material and voila! Not only will this move all the calculations out of real-time rendering, but it will let you fully control how your tattoos can be applied.
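As a small illustration of that last step, applying a pre-built tattoo mask on top of the skin could be done with something like the following surface shader (a sketch; _TattooTex and the alpha-as-coverage convention are assumptions, not part of the answer above):

Shader "Custom/SkinWithTattooMask" {
    Properties {
        _MainTex ("Skin Albedo (RGB)", 2D) = "white" {}
        _TattooTex ("Tattoo Mask (RGBA)", 2D) = "black" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Standard
        #pragma target 3.0

        sampler2D _MainTex;
        sampler2D _TattooTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutputStandard o) {
            fixed4 skin = tex2D(_MainTex, IN.uv_MainTex);
            fixed4 tattoo = tex2D(_TattooTex, IN.uv_MainTex); // baked in the skin's UV space
            // the mask's alpha controls how much ink covers the skin
            o.Albedo = lerp(skin.rgb, tattoo.rgb, tattoo.a);
            o.Alpha = 1;
        }
        ENDCG
    }
    FallBack "Diffuse"
}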
You can use a decal projector from Unity's official preview tool, Render Pipelines - High Definition.
Here's how I used it to project a "tattoo" onto a bucket. You can apply it to your own model, of course.
(Child the decal projector to the model so that the tattoo follows it.)
The best way to import the Render Pipelines - High Definition package is to use Unity Hub to create a new project, choosing it as a template. If it's an existing project, this official blog might help you.
Once you successfully set up the package, follow this tutorial and you'll be able to project tattoos onto your models anywhere you want.
I've done something similar with a custom shader. I think it would do what you want. Mine dynamically renders flags based on the rank and type of a unit for an iPad game prototype. Exactly how you'll do it depends a bit on how things are set up in your project, but here's what mine looks like - the first image is the wireframe showing the mesh, and the second is with the shaders turned on, showing them adding the colors and emblem based on rank and unit. I've just included the shader for the top flag, since that has the unit emblem added in a way similar to how you want your tattoo applied:
Note that you can attach multiple shaders to a particular mesh.
And the emblem is just an image with transparency that is added to the shader and referenced as a texture within the shader:
You can see we also have a picture that has some shadow texture that's used as the background for the banner.
This is my first shader and was written a while ago, so I'm sure it's sub-optimal in all kinds of ways, but it should hopefully be enough to get you started (and it still works in Unity 2018.3.x, though I had to hack in some changes to get it to compile):
Shader "Custom/TroopFlagEmblemShader" {
Properties {
_BackColor ("Background Color", Color) = (0.78, 0.2, 0.2) // scarlet
_MainTex ("Background (RGBA)", 2D) = "" {}
_EmblemTex("Emblem (RGBA)", 2D) = "" {}
_Rank ( "Rank (1-9)", Float ) = 3.0
}
SubShader {
Pass {
CGPROGRAM
#pragma exclude_renderers xbox360 ps3 flash
#pragma target 3.0
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct appdata {
float4 vertex: POSITION;
float4 texcoord: TEXCOORD0;
};
struct v2f {
float4 pos: SV_POSITION;
float2 uv: TEXCOORD0;
};
uniform sampler2D _MainTex;
uniform sampler2D _EmblemTex;
uniform float3 _BackColor;
uniform float _Rank;
v2f vert( appdata v )
{
v2f o;
o.pos = UnityObjectToClipPos( v.vertex );
o.uv = v.texcoord.xy;
return o;
}
float4 frag( v2f IN ) : COLOR
{
float4 outColor;
float4 backTextureColor = tex2D( _MainTex, IN.uv.xy );
float4 emblemTextureColor = tex2D( _EmblemTex, IN.uv.xy );
// not drawing the square at all above rank 5
if ( _Rank >= 6.0 )
discard;
if ( _Rank < 5 ) // 4 and below
{
outColor = float4( (emblemTextureColor.rgb * emblemTextureColor.a) +
(((1.0 - emblemTextureColor.a) * backTextureColor.rgb) * _BackColor.rgb) , 1 );
// float4(_BackColor.rgb, 1 ));
}
else if ( _Rank >= 5.0 ) // but excluded from 6 above
{
// 5 is just solid backcolor combined with background texture
outColor = float4( backTextureColor.rgb * _BackColor.rgb, 1 );
}
return outColor;
}
ENDCG
}}
}
Shaders are a bit maddening to learn how to do, but pretty fun once you get them working - like most programming :)
In my case the overlay texture was the same size/shape as the flag, which makes it a bit easier. As a first thought, I'm thinking you'll need to add some parameters to the shader that indicate where you want the overlay to be drawn relative to the mesh, and do nothing for vertices/fragments outside your tattoo bounds.
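As a rough sketch of that idea, a made-up _TattooRect parameter (UV offset in xy, size in zw) could be used to remap the mesh UV into the overlay's space and leave everything outside it untouched:

float4 _TattooRect; // xy = UV offset of the tattoo, zw = its size in UV space

float4 frag (v2f IN) : COLOR
{
    float4 skin = tex2D(_MainTex, IN.uv);

    // remap the mesh UV into the tattoo's 0..1 space
    float2 tattooUV = (IN.uv - _TattooRect.xy) / _TattooRect.zw;
    if (any(tattooUV < 0.0) || any(tattooUV > 1.0))
        return skin; // outside the tattoo bounds: plain skin

    float4 tattoo = tex2D(_EmblemTex, tattooUV);
    return float4(lerp(skin.rgb, tattoo.rgb, tattoo.a), 1);
}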

Would this shader code snippet still be converted as i.uv in Unity?

I'm following this tutorial: https://www.youtube.com/watch?v=CzORVWFvZ28 to convert some code from ShaderToy to Unity. This is the shader that I'm attempting to convert: https://www.shadertoy.com/view/Ws23WD.
I've been going through the tutorial and I noticed that one of the changes to make is to take 'fragCoord.xy/iResolution.xy' and use 'i.uv' instead. But what if I had 'fragCoord.x/iResolution.x' and 'fragCoord.y/iResolution.y'? Does 'i.uv' replace both of those statements?
Here's how it appears in my code:
float2 uv = float2(fragCoord.x / iResolution.x, fragCoord.y / iResolution.y);
uv -= 0.5;
uv /= float2(iResolution.y / iResolution.x, 1);
iResolution.x and iResolution.y refer to the pixel dimensions of the render window. In Unity, you have _ScreenParams.x and _ScreenParams.y for the screen's pixel dimensions, but you only need to worry about this if you are rendering screen-space effects. If you want to render this as an effect on a plane, you use i.uv.x and i.uv.y instead for the same purpose. fragCoord / iResolution gives you the position of the fragment in the range 0-1 on your screen, which is effectively equivalent to a UV coordinate. Subtracting 0.5 gives you the range -0.5 to 0.5.
The last line, uv /= float2(iResolution.y / iResolution.x, 1), corrects for aspect ratio. If you are using a UV-mapped, square plane, you don't need to bother with this.
To summarize, these lines are equivalent to:
float2 uv = i.uv - (float2)0.5;
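If you did want the screen-space behaviour instead of a plane, the same remapping could be written with _ScreenParams (a sketch; screenPixel stands for the fragment's pixel coordinate, e.g. an SV_POSITION or VPOS input):

float2 uv = screenPixel.xy / _ScreenParams.xy;      // fragCoord / iResolution
uv -= 0.5;
uv /= float2(_ScreenParams.y / _ScreenParams.x, 1); // aspect-ratio correction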

2D Water in Unity

I've programmed a 2D water effect with springs, similar to this one. Now I want to implement it in a vertex shader (in Unity). But for wave propagation I need to know the left and right neighbors of the current vertex (to calculate the affecting force) and somehow save the resulting force for the next iteration. I have no idea how to do that.
You should create a texture representation of the offsets you need for vertex manipulation and then use tex2Dlod() in your vertex shader - these are supported in shader model 3.0 and up.
You then use a fragment shader to generate or update the texture.
Your vertex shader could look something like this:
sampler2D _OffsetMap;

// vertex shader
vert_data vert (appdata_base v) {
    vert_data v_out;
    float4 vertexPos = v.vertex;
    vertexPos += tex2Dlod(_OffsetMap, float4(vertexPos.xz, 0, 0));
    v_out.position = UnityObjectToClipPos(vertexPos);
    return v_out;
}
You can of course also use the same fetch to manipulate the vertex normals.
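The update pass mentioned above could be a Graphics.Blit ping-pong between two render textures. A sketch of what that fragment shader might look like, with made-up names (_StateTex, _Stiffness, _Spread, _Damping) and height/velocity packed into the red and green channels:

sampler2D _StateTex;          // r = current height offset, g = current velocity
float4 _StateTex_TexelSize;   // x = 1 / texture width
float _Stiffness;             // spring force pulling each column back to rest
float _Spread;                // how strongly neighbors pull on each other
float _Damping;               // energy loss per step

// fragment shader for the update pass (render into the second texture, then swap)
float4 frag (v2f_img i) : SV_Target {
    float2 state = tex2D(_StateTex, i.uv).rg;
    float left   = tex2D(_StateTex, i.uv - float2(_StateTex_TexelSize.x, 0)).r;
    float right  = tex2D(_StateTex, i.uv + float2(_StateTex_TexelSize.x, 0)).r;

    // spring back to rest plus propagation from the two neighbors
    float force    = -_Stiffness * state.r + _Spread * (left + right - 2.0 * state.r);
    float velocity = (state.g + force) * _Damping;
    float height   = state.r + velocity;

    return float4(height, velocity, 0, 1);
}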