How to make things "scale independent" in shader? - unity3d

I tried to make a border using a rectangular SDF, but at different scales it gets distorted.
I was thinking about using the object scale, but I can't figure out how to do this myself.
Object scales: (1, 1), (1, 0.125)
How it looks
Desired result (made in Photoshop, slightly out of proportion)
Here are the vertex and fragment shaders and the rectangle function:
// signed distance from a point to an axis-aligned rectangle centered at the origin
float rectangle(float2 position, float2 size){
    float2 component_wise_edge_distance = abs(position) - size;
    float outside_distance = length(max(component_wise_edge_distance, 0));
    float inside_distance = min(max(component_wise_edge_distance.x, component_wise_edge_distance.y), 0);
    return outside_distance + inside_distance;
}
v2f vert(appdata v){
    v2f o;
    o.position = UnityObjectToClipPos(v.vertex);
    o.uv = v.uv;
    return o;
}
float4 frag(v2f i) : SV_TARGET
{
    // extract the object's scale from the columns of the model matrix
    half3x3 m = UNITY_MATRIX_M;
    half3 objectScale = half3(
        length( half3( m[0][0], m[1][0], m[2][0] ) ),
        length( half3( m[0][1], m[1][1], m[2][1] ) ),
        length( half3( m[0][2], m[1][2], m[2][2] ) )
    );
    float4 coords = i.uv;
    coords *= 8;
    float2 position = float2(coords.x - 4, coords.y - 4);
    float sdf = step(0, rectangle(position, float2(3, 3)));
    float4 col = float4(sdf.xxx, 1);
    return col;
}

Yes, you need to use the scale in your calculations.
First, center your coords so they run from -1 to 1 instead of 0 to 1.
float2 coords = i.uv * 2 - 1;
Scale your coords by your scale.
coords *= objectScale;
Now define your rectangle size as the scale minus the border.
float border = 0.2;
float2 size = float2(objectScale.x - border, objectScale.y - border);
float sdf = rectangle(coords, size);
sdf = step(0, sdf);
This will give you a fixed width border regardless of scale, though from your images it looks like you're expecting it to shrink with the scale.
To do that, you could scale the border by the smallest axis.
border = border * min(objectScale.x, objectScale.y);
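Putting the steps together, a minimal sketch of the full fragment shader might look like this (the 0.2 border width is just an example value; the objectScale extraction and rectangle function are from the question):
float4 frag(v2f i) : SV_TARGET
{
    half3x3 m = UNITY_MATRIX_M;
    half3 objectScale = half3(
        length( half3( m[0][0], m[1][0], m[2][0] ) ),
        length( half3( m[0][1], m[1][1], m[2][1] ) ),
        length( half3( m[0][2], m[1][2], m[2][2] ) ));

    // center the coords around the origin and stretch them by the object scale
    float2 coords = (i.uv.xy * 2 - 1) * objectScale.xy;

    // shrink the border with the smaller axis so it scales with the object
    float border = 0.2 * min(objectScale.x, objectScale.y);
    float2 size = objectScale.xy - border;

    float sdf = step(0, rectangle(coords, size));
    return float4(sdf.xxx, 1);
}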
Note that using the object scale will not work with UI elements since by the time they get to your shader Unity has batched them all together and their object scale is meaningless. If that's the case you'll need to pass the scale in as a variable from a script.
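If you go that route, a minimal sketch of such a script might look like this. The _ObjectScale property name and the component are illustrative assumptions, not part of the original answer; the shader would declare float2 _ObjectScale; and use it in place of the matrix-derived scale. Note that UI elements sharing one material will also share this value.
using UnityEngine;

[ExecuteAlways]
public class PassScaleToShader : MonoBehaviour
{
    [SerializeField] private Material material;

    void Update()
    {
        var rt = transform as RectTransform;
        Vector2 scale = rt != null
            ? rt.rect.size                      // UI: use the rect's dimensions
            : (Vector2)transform.lossyScale;    // regular objects: world scale
        material.SetVector("_ObjectScale", new Vector4(scale.x, scale.y, 1, 1));
    }
}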

Related

Dynamically recalculating normals after vertex displacement

Can anyone let me know if I'm on the right tack with this: I have a vertex shader that bumps outward dynamically depending on a point passed in (think a mouse running under a rug). In order for the lighting to update properly, I need to recalculate the normals after modifying the vertex position. I have access to each vertex point as well as the origin.
My current thinking is I do some sort of math to determine the tangent / bitangent and use a cross product to determine the normal. My math skills aren't great, what would I need to do to determine those vectors?
Here's my current vert shader:
void vert(inout appdata_full v)
{
    float3 worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
    float distanceToLift = distance(worldPos, _LiftOrigin);
    v.vertex.y = smoothstep(_LiftHeight, 0, distanceToLift / _LiftRadius) * 5;
}
A simple solution is covered in this tutorial by Ronja, which I'll summarize here with modifications that reflect your specific case.
First, find two points offset from your current point by a small amount along the tangent and bitangent (the bitangent can be calculated from the normal and tangent):
float3 posPlusTangent = v.vertex.xyz + v.tangent.xyz * 0.01;
worldPos = mul(unity_ObjectToWorld, float4(posPlusTangent, 1)).xyz;
distanceToLift = distance(worldPos, _LiftOrigin);
posPlusTangent.y = smoothstep(_LiftHeight, 0, distanceToLift / _LiftRadius) * 5;
float3 bitangent = cross(v.normal, v.tangent.xyz);
float3 posPlusBitangent = v.vertex.xyz + bitangent * 0.01;
worldPos = mul(unity_ObjectToWorld, float4(posPlusBitangent, 1)).xyz; // note: transform the offset point, not the raw bitangent
distanceToLift = distance(worldPos, _LiftOrigin);
posPlusBitangent.y = smoothstep(_LiftHeight, 0, distanceToLift / _LiftRadius) * 5;
Then, find the difference between these offsets and the new vertex pos to find the new tangent and bitangent, then do another cross product to find the resulting normal:
float3 modifiedTangent = posPlusTangent - v.vertex.xyz;
float3 modifiedBitangent = posPlusBitangent - v.vertex.xyz;
float3 modifiedNormal = cross(modifiedTangent, modifiedBitangent);
v.normal = normalize(modifiedNormal);
Altogether:
float find_offset(float3 localV)
{
    float3 worldPos = mul(unity_ObjectToWorld, float4(localV, 1)).xyz;
    float distanceToLift = distance(worldPos, _LiftOrigin);
    return smoothstep(_LiftHeight, 0, distanceToLift / _LiftRadius) * 5;
}

void vert(inout appdata_full v)
{
    v.vertex.y = find_offset(v.vertex.xyz);

    float3 posPlusTangent = v.vertex.xyz + v.tangent.xyz * 0.01;
    posPlusTangent.y = find_offset(posPlusTangent);

    float3 bitangent = cross(v.normal, v.tangent.xyz);
    float3 posPlusBitangent = v.vertex.xyz + bitangent * 0.01;
    posPlusBitangent.y = find_offset(posPlusBitangent);

    float3 modifiedTangent = posPlusTangent - v.vertex.xyz;
    float3 modifiedBitangent = posPlusBitangent - v.vertex.xyz;
    float3 modifiedNormal = cross(modifiedTangent, modifiedBitangent);
    v.normal = normalize(modifiedNormal);
}
This is a method of approximation, but it may be good enough!

Image size independent shader

I'm trying to create a shader for an image material that draws a circle regardless of the aspect ratio of the image itself.
In Shadertoy (GLSL) I can do the following to create a round circle, regardless of aspect ratio:
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord/iResolution.xy;
    uv -= 0.5;
    uv.x *= iResolution.x/iResolution.y; // < this compensates for the aspect ratio
    float l = length(uv);
    float s = smoothstep(0.5, 0.55, l);
    vec4 col = vec4(s);
    fragColor = vec4(col);
}
Which gives the following output
If I remove the line uv.x *= iResolution.x/iResolution.y; the circle will warp based on the current aspect ratio.
Now I want to create the same effect in Unity, so I tried the (to me seemingly) same approach.
_MainTex_TexelSize contains the width/height of the texture (from the docs):
{TextureName}_TexelSize - a float4 property contains texture size information:
- x contains 1.0/width
- y contains 1.0/height
- z contains width
- w contains height
Shader "Unlit/Shader"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
}
SubShader
{
Tags { "RenderType"="Opaque" }
LOD 100
Blend SrcAlpha OneMinusSrcAlpha
Cull off
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};
struct v2f
{
float2 uv : TEXCOORD0;
float4 vertex : SV_POSITION;
};
sampler2D _MainTex;
float4 _MainTex_ST;
float4 _MainTex_TexelSize;
v2f vert (appdata v)
{
v2f o;
o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = TRANSFORM_TEX(v.uv, _MainTex);
o.uv -= 0.5;
o.uv.x *= _MainTex_TexelSize.z / _MainTex_TexelSize.w;
return o;
}
float DrawCircle(float2 uv, float radius, float fallOff)
{
float d = length(uv);
return smoothstep(radius, fallOff, d);
}
fixed4 frag (v2f i) : SV_Target
{
fixed4 col = tex2D(_MainTex, i.uv);
float c = DrawCircle(i.uv, 0.5, 0.55);
col = lerp(col, fixed4(1,0,0,1), c);
return col;
}
ENDCG
}
}
}
The shader compiles as is, but the circle will still stretch based on the aspect ratio of the image.
I thought this may be due to the way the uv's are set up using o.uv = TRANSFORM_TEX(v.uv, _MainTex); so I tried dividing that by the image's size:
o.uv = TRANSFORM_TEX(v.uv, _MainTex);
o.uv / _MainTex_TexelSize.zw;
o.uv -= 0.5;
However, this did nothing. Setting up the UVs differently, like so,
o.uv = v.uv / _MainTex_TexelSize.zw;
o.uv / _MainTex_TexelSize.zw;
o.uv -= 0.5;
results in the circle's center moving to the upper right, but it still warps when the aspect ratio changes.
What step am I missing or doing wrong to get the aspect-ratio independent result I get in Shadertoy?
The aspect ratio of the input texture _MainTex has nothing to do with the aspect ratio of the output. In the Shadertoy example the output is the screen, and iResolution gives you the screen dimensions (the Unity equivalent is _ScreenParams). If you want to draw a quad that is not full screen, you either have to match the quad's aspect ratio to the _MainTex aspect ratio so that _MainTex_TexelSize is usable, or else provide the aspect ratio or dimensions in a shader property (which is basically what _ScreenParams does):
float _Aspect;

fixed4 frag(v2f i) : SV_Target
{
    i.uv -= .5;
    i.uv.x *= _Aspect;
    fixed4 col = tex2D(_MainTex, i.uv);
    float c = DrawCircle(i.uv, .5, .55);
    col = lerp(col, fixed4(1,0,0,1), c);
    return col;
}
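A minimal sketch of feeding _Aspect from a script might look like the following. The component name and the choice of using the quad's world scale are illustrative assumptions, not part of the original answer:
using UnityEngine;

[ExecuteAlways]
public class SetAspect : MonoBehaviour
{
    [SerializeField] private Renderer targetRenderer;

    void Update()
    {
        // for a quad scaled in X/Y, the world scale gives its aspect ratio
        Vector3 s = targetRenderer.transform.lossyScale;
        targetRenderer.sharedMaterial.SetFloat("_Aspect", s.x / s.y);
    }
}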
You could also calculate the aspect ratio with derivatives. Here dx and dy are the amount of uv change per pixel. This would also be useful if you want, for example, fallOff to always be 10 pixels.
fixed4 frag(v2f i) : SV_Target
{
    i.uv -= .5;
    float dx = ddx(i.uv.x);
    float dy = ddy(i.uv.y);
    float aspect = dy/dx;
    i.uv.x *= aspect;
    fixed4 col = tex2D(_MainTex, i.uv);
    float c = DrawCircle(i.uv, .5, .55);
    col = lerp(col, fixed4(1,0,0,1), c);
    return col;
}
</fixed4>
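Building on that last point, here is a small sketch of a fixed-pixel-width falloff under the same setup. fwidth(d) is roughly how much d changes across one screen pixel, so multiplying it by a pixel count converts pixels into uv units:
float d = length(i.uv);
float edgeWidth = 10 * fwidth(d);            // a ~10 pixel wide edge
float c = smoothstep(.5, .5 + edgeWidth, d); // anti-aliased, resolution independent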

Shader that transforms a mercator projection to equirectangular?

I am trying to make a shader in Unity that takes a mercator projection texture as a source and converts it to an equirectangular projection texture.
Input example:
Output example:
This example does the opposite with an equirectangular as source.
If you look at the source of the above example:
// mercator
float latClamped = clamp(lat, -1.4835298641951802, 1.4835298641951802);
float yMerc = log(tan(PI / 4.0 + latClamped / 2.0)) / PI2;
float xMerc = xEqui / 2.0;
vec4 mercatorPos = vec4(xMerc, yMerc, 0.0, 1.0);
Can anyone help me reverse this so I'm able to go from a mercator map as a source to equirectangular (or even better, azimuthal)?
Looking for a way to do 2D texture deformations going from x/y to longitude(x)/latitude(y) and back.
I appreciate your input.
If you want to output the equirectangular projection, you need to convert from equirectangular coordinates to mercator coordinates and then sample the mercator projection at those coordinates.
This is what it would look like in a fragment shader from uvs:
// uv to equirectangular (note: 'lat' here is the x/longitude angle and 'lon' the y/latitude angle)
float lat = (uv.x) * 2 * PI;   // from 0 to 2PI
float lon = (uv.y - .5f) * PI; // from -PI/2 to PI/2
// equirectangular to mercator
float x = lat;
float y = log(tan(PI / 4. + lon / 2.));
// bring x,y into [0,1] range
x = x / (2*PI);
y = (y + PI) / (2*PI);
// sample mercator projection
fixed4 col = tex2D(_MainTex, float2(x,y));
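For the "and back" direction the question asks about, the inverse of that mercator step is the Gudermannian function. A one-line sketch, using the same naming as above, where y is the unnormalized mercator ordinate:
float lonBack = 2. * atan(exp(y)) - PI / 2.; // recovers the latitude angle from mercator y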
The same thing applies to the azimuthal projection: you can go azimuthal -> equirectangular -> mercator and sample the image, or you can find a formula to go directly from azimuthal -> mercator. The wiki pages have a bunch of formulas for going back and forth between projections. Here is a full shader to play around with; the input is a mercator projection, and it outputs an equirectangular or azimuthal projection (choose from the dropdown menu):
Shader "Unlit/NewUnlitShader 1"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
[Enum(Equirectangular,0,Azimuthal,1)]
_Azimuthal("Projection", float) = 0
}
SubShader
{
Tags { "RenderType"="Opaque" }
LOD 100
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};
struct v2f
{
float2 uv : TEXCOORD0;
float4 vertex : SV_POSITION;
};
sampler2D _MainTex;
float4 _MainTex_ST;
float _Azimuthal;
v2f vert (appdata v)
{
v2f o;
o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = TRANSFORM_TEX(v.uv, _MainTex);
return o;
}
#define PI 3.141592653589793238462f
#define PI2 6.283185307179586476924f
float2 uvToEquirectangular(float2 uv) {
float lat = (uv.x) * PI2; // from 0 to 2PI
float lon = (uv.y - .5f) * PI; // from -PI to PI
return float2(lat, lon);
}
float2 uvAsAzimuthalToEquirectangular(float2 uv) {
float2 coord = (uv - .5) * 4;
float radius = length(coord);
float angle = atan2(coord.y, coord.x) + PI;
//formula from https://en.wikipedia.org/wiki/Lambert_azimuthal_equal-area_projection
float lat = angle;
float lon = 2 * acos(radius / 2.) - PI / 2;
return float2(lat, lon);
}
fixed4 frag(v2f i) : SV_Target
{
// get equirectangular coordinates
float2 coord = _Azimuthal ? uvAsAzimuthalToEquirectangular(i.uv) : uvToEquirectangular(i.uv);
// equirectangular to mercator
float x = coord.x;
float y = log(tan(PI / 4. + coord.y / 2.));
// brin x,y into [0,1] range
x = x / PI2;
y = (y + PI) / PI2;
fixed4 col = tex2D(_MainTex, float2(x,y));
// just to make it look nicer
col = _Azimuthal && length(i.uv*2-1) > 1 ? 1 : col;
return col;
}
ENDCG
}
}
}

Not getting uniform thickness in wireframe shader

Shader "Custom/Geometry/Wireframe"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
_WireframeVal ("Wireframe width", Range(0.000, 0.035)) = 0.05
_Color ("color", color) = (1, 1, 1, 1)
_BackColor ("Back color", color) = (1, 1, 1, 1)
}
SubShader
{
Tags { "RenderType"="Opaque" "Glowable" = "True" }
Pass
{
Cull Back
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#pragma geometry geom
#include "UnityCG.cginc"
struct v2g {
float4 pos : SV_POSITION;
};
struct g2f {
float4 pos : SV_POSITION;
float3 center : TEXCOORD0;
};
v2g vert(appdata_base v) {
v2g o;
o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
return o;
}
[maxvertexcount(3)]
void geom(triangle v2g IN[3], inout TriangleStream<g2f> triStream) {
float2 p0 = IN[0].pos.xy / IN[0].pos.w;
float2 p1 = IN[1].pos.xy / IN[1].pos.w;
float2 p2 = IN[2].pos.xy / IN[2].pos.w;
float2 edge0 = p1 - p0;
float2 edge1 = p2 - p1;
float2 edge2 = p0 - p2;
float area = abs(edge1.x * edge2.y - edge1.y * edge2.x);
g2f o;
o.pos = IN[0].pos;
o.center = float3(area/length(edge1) , 0, 0);
triStream.Append(o);
o.pos = IN[1].pos;
o.center = float3(0, 0, area/length(edge2) );
triStream.Append(o);
o.pos = IN[2].pos;
o.center = float3(0, area/length(edge0), 0);
triStream.Append(o);
}
float _WireframeVal;
fixed4 _BackColor;
float4 _Color;
fixed4 frag(g2f i) : SV_Target
{
if(min(i.center.x ,(min(i.center.y,i.center.z))) > _WireframeVal)
{
discard;
}
return _BackColor;
}
ENDCG
}
}
}
Here is my wireframe shader code.
I am not getting uniform edges throughout.
Also, when I reduce the width almost to zero (0.0001), some pixels of the wires are not drawn. What can I do about that? I want to make the wireframe shader look like Unity's built-in wireframe mode; how can I achieve that?
Actually, that is uniform thickness; it only looks uneven because of how the object is being drawn.
Take the edge of the cube facing the camera, where it looks thicker than the other edges. Well, you're half right: there are more white pixels drawn there.
But the reason for that is that there are two edges there! And each one is contributing the desired value towards the thickness of the wireframe.
Both the blue outlined face and the red outlined face are contributing to the thickness of that edge, whereas on a corner where the adjacent face is pointing away from the camera (say, the bottom edge of the blue outlined face), only one face contributes to the overall effect.
This isn't noticeable when the thickness is very small or the object is very far away (because 0.2 pixels + 0.2 pixels = 0.4 pixels, rounded up: 1 pixel), but it does become apparent at higher thickness values, or if you scale the object non-uniformly. For example, I have this in a project I was working on a few weeks ago (mind, mine draws the backfaces too, and the alpha depth sorting is off):
The reason for this is that the "edges" are located by computing barycentric coordinates for each triangle, which are just an approximation and will give skewed results if the triangles aren't equilateral.
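A common way around both the uneven thickness and the dropout at small widths (not from the original answer, but standard practice) is to compare the edge distance against its own screen-space derivative, so the line is a constant number of pixels wide instead of a constant clip-space distance. A sketch, where _WireframePixels is a hypothetical property giving the line width in pixels:
fixed4 frag(g2f i) : SV_Target
{
    float minDist = min(i.center.x, min(i.center.y, i.center.z));
    // fwidth(minDist) ~ how much minDist changes across one screen pixel
    float pixelDist = minDist / fwidth(minDist);
    if (pixelDist > _WireframePixels)
        discard;
    return _BackColor;
}
Because the comparison happens in pixels, thin lines no longer vanish at small width values the way a fixed threshold does.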

Get the tangent from a normal direction cg unity3d

I am writing a shader in Cg where I displace the vertices. Because I displace the vertices, I recalculate the normals so they point away from the surface and pass that information to the fragment function. I have also implemented a normal map in the shader; now I am wondering: shouldn't I also recalculate the tangents? And is there a formula to calculate the tangent? I read that it is at a 90-degree angle to the normal; could I use the cross product for that?
I want to pass the right tangent to VOUT.tangentWorld. This is my vertex function:
VertexOutput vert (VertexInput i)
{
    VertexOutput VOUT;

    // put the vert in world space
    float4 newVert = mul(_Object2World, i.vertex);

    // create fake vertexes
    float4 v1 = newVert + float4(0.05, 0.0, 0.0, 0.0); // X
    float4 v2 = newVert + float4(0.0, 0.0, 0.05, 0.0); // Z

    // sample the displacement map at each position
    float4 disp  = tex2Dlod(_Displacement, float4(newVert.x + (_Time.x * _Speed), newVert.z + (_Time.x * _Speed), 0.0, 0.0));
    float4 disp2 = tex2Dlod(_Displacement, float4(v1.x + (_Time.x * _Speed), newVert.z + (_Time.x * _Speed), 0.0, 0.0));
    float4 disp3 = tex2Dlod(_Displacement, float4(newVert.x + (_Time.x * _Speed), v2.z + (_Time.x * _Speed), 0.0, 0.0));

    // offset the main vert
    newVert.y += _Scale * disp.y;

    // offset fake vertexes (only their height should move)
    v1.y += _Scale * disp2.y;
    v2.y += _Scale * disp3.y;

    // calculate the new normal direction
    float3 newNor = cross((v2 - newVert).xyz, (v1 - newVert).xyz);

    // return world position of the vert for frag calculations
    VOUT.posWorld = newVert;

    // set the vert back in object space
    float4 vertObjectSpace = mul(_World2Object, newVert);

    // apply unity mvp matrix to the vert
    VOUT.pos = mul(UNITY_MATRIX_MVP, vertObjectSpace);

    // return the tex coords for frag calculations
    VOUT.tex = i.texcoord;

    // return normal, tangent, and binormal information for frag calculations
    VOUT.normalWorld = normalize( mul(float4(newNor, 0.0), _World2Object).xyz );
    VOUT.tangentWorld = normalize( mul(_Object2World, i.tangent).xyz );
    VOUT.binormalWorld = normalize( cross(VOUT.normalWorld, VOUT.tangentWorld) * i.tangent.w );

    return VOUT;
}
Isn't it just the vector v2 - newVert or v1 - newVert, because they point along the surface? And how do I know which one of the two it is?
I've used something like the below:
Vector3 tangent = Vector3.Cross( normal, Vector3.forward );
if( tangent.magnitude == 0 ) {
    tangent = Vector3.Cross( normal, Vector3.up );
}
Source: http://answers.unity3d.com/questions/133680/how-do-you-find-the-tangent-from-a-given-normal.html
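For completeness, a hedged translation of that idea into the question's Cg vertex function (this is an adaptation, not code from the linked answer): pick a fixed axis that is not parallel to the recalculated normal, cross to get a tangent, and cross again for the binormal.
float3 n = normalize(newNor);
float3 t = cross(n, float3(0, 0, 1)); // try world forward first
if (dot(t, t) < 1e-8)
    t = cross(n, float3(0, 1, 0));    // fall back to world up when parallel
VOUT.tangentWorld = normalize(t);
VOUT.binormalWorld = normalize(cross(n, VOUT.tangentWorld));
Note that a tangent chosen this way is consistent with the surface but not with the mesh's UV layout, so normal mapping may appear rotated compared to tangents imported with the mesh.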