Surface shader - More transparency for distant points - unity3d

I am trying to create my first shader in Unity3D. My goal is to add more alpha to pixels the closer they are to some point in world space, but I can't get it right. My pixels are not getting smooth transparency (from the min value to the max value); they are just either min or max.
Here is my code:
Shader "Custom/Shield" {
Properties {
_MainTex ("Color (RGB) Alpha (A)", 2D) = "white" {}
_TexUsage ("Text usage", Range(0.1, 0.99)) = 0
_HitPoint ("Hit point", Vector) = (1, 1, 1, 1)
_Distance ("Distance", float) = 4.0
}
SubShader {
Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
LOD 200
CGPROGRAM
// Physically based Standard lighting model, and enable shadows on all light types
// And generate the shadow pass with instancing support
#pragma surface surf Standard fullforwardshadows alpha
// Use shader model 3.0 target, to get nicer looking lighting
#pragma target 3.0
sampler2D _MainTex;
half _TexUsage;
float3 _HitPoint;
fixed _Distance;
struct Input {
float2 uv_MainTex;
float3 worldPos;
};
void surf (Input IN, inout SurfaceOutputStandard o) {
IN.uv_MainTex.x = frac(IN.uv_MainTex.x + frac(_Time.x));
o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgba;
float dist = distance(_HitPoint, IN.worldPos);
float minAlpha = 0.2;
float st = step(_Distance, dist);
float blend = (dist / _Distance) * (1 - st) + minAlpha * st;
o.Alpha = blend;
}
ENDCG
}
FallBack "Diffuse"
}
And here is an example of how it works now:
But this area should be less visible when it is not so close to the hit point.
What am I doing wrong?
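For comparison, a common way to get a smooth distance-based fade rather than a binary cutoff is smoothstep. This is a sketch, not the original code; it keeps the same property names and assumes the intent is "opaque near the hit point, fading to minAlpha far away" (swap the lerp endpoints for the opposite direction):

```hlsl
void surf (Input IN, inout SurfaceOutputStandard o) {
    o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
    float dist = distance(_HitPoint, IN.worldPos);
    float minAlpha = 0.2;
    // smoothstep ramps smoothly from 0 (at the hit point) to 1 (at _Distance),
    // instead of the hard 0/1 jump produced by step()
    float t = smoothstep(0.0, _Distance, dist);
    // fully opaque near the hit point, fading to minAlpha far away
    o.Alpha = lerp(1.0, minAlpha, t);
}
```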

Related

How can i convert uv coordinates to world space?

I am trying to implement a shader. It will be used with the Unity LineRenderer. The shader has noise that scrolls over time relative to the texture coordinates, for example parallel to the x axis of the texture's UV space. I have an implementation, but I don't know how to get the direction relative to the texture UV (taking the texture rotation into account) in the vert function. I only have world-space-relative scrolling.
The main problem: how do I convert UV coordinates (for example (0, 0) or (1, 0)) to world space?
Here is my shader:
Shader "LineRendering/Test"
{
Properties
{
[PerRendererData] _MainTex("Sprite Texture", 2D) = "white" {}
_Freq("Frequency", Float) = 1
_Speed("Speed", Float) = 1
}
SubShader
{
Tags
{
"Queue" = "Transparent"
"IgnoreProjector" = "True"
"RenderType" = "Transparent"
"PreviewType" = "Plane"
"CanUseSpriteAtlas" = "True"
}
LOD 200
Cull Off
Lighting Off
ZWrite Off
Fog { Mode Off }
Blend One OneMinusSrcAlpha
Pass
{
CGPROGRAM
#pragma target 3.0
#pragma vertex vert
#pragma fragment frag
#pragma enable_d3d11_debug_symbols
#include "noiseSimplex.cginc"
struct appdata_t
{
fixed4 vertex : POSITION;
fixed2 uv : TEXCOORD0;
};
struct v2f
{
fixed4 vertex : SV_POSITION;
fixed2 texcoord : TEXCOORD0;
fixed2 srcPos : TEXCOORD1;
};
uniform fixed _Freq;
uniform fixed _Speed;
v2f vert(appdata_t IN)
{
v2f OUT;
OUT.vertex = UnityObjectToClipPos(IN.vertex);
OUT.texcoord = IN.uv;
OUT.srcPos = mul(unity_ObjectToWorld, IN.vertex).xy;
OUT.srcPos *= _Freq;
//This is my attempt to convert UV coordinates to world coordinates, but it is still unsuccessful.
//fixed2 v0Pos = mul(unity_WorldToObject, fixed3(0, 0, 0)).xy;
//fixed2 v1Pos = mul(unity_WorldToObject, fixed3(1, 0, 0)).xy;
//fixed2 scrollOffset = v1Pos - v0Pos;
fixed2 scrollOffset = fixed2(1, 0);
OUT.srcPos.xy -= fixed2(scrollOffset.x, scrollOffset.y) * _Time.y * _Speed;
return OUT;
}
fixed4 frag(v2f IN) : COLOR
{
fixed4 output;
float ns = snoise(IN.srcPos) / 2 + 0.5f;
output.rgb = fixed3(ns, ns, ns);
output.a = ns;
output.rgb *= output.a;
return output;
}
ENDCG
}
}
}
The noise library was taken from here: https://forum.unity.com/threads/2d-3d-4d-optimised-perlin-noise-cg-hlsl-library-cginc.218372/#post-2445598. Please help me.
Texture coordinates are already in texture space. If I understand correctly, you should be able to just do this:
v2f vert(appdata_t IN)
{
v2f OUT;
OUT.vertex = UnityObjectToClipPos(IN.vertex);
OUT.texcoord = IN.uv;
OUT.srcPos = IN.uv;
OUT.srcPos *= _Freq;
fixed2 scrollOffset = fixed2(1, 0);
OUT.srcPos.xy -= fixed2(scrollOffset.x, scrollOffset.y) * _Time.y * _Speed;
return OUT;
}
Option 1
Each of your UVs is associated with a specific vertex. Once you can establish which UV is assigned to which vertex, then look up the world position of the vertex.
Option 2
Another way to do this may be with a texture that is a pre-baked image of the local-space coordinates of the object. In the texture, the XYZ coords would map to RGB. Then you do a texture lookup and get the local object coordinates. You then have to multiply that vector by the object-to-world matrix to get the actual world-space value.
When you create the texture, you'll have to account for the inability to store negative values. So first you'll have to set up the object so that it fits entirely inside the coordinate range [-1, 1] on all three axes. Then, as part of the baking procedure, divide all values by two and add 0.5. This ensures that all your negative coordinate values are stored in [0, 0.5) and all positive values in [0.5, 1].
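The decode step of Option 2 could be sketched like this (`_CoordTex` is a hypothetical texture holding the baked coordinates, remapped from [-1, 1] to [0, 1] as described above):

```hlsl
// Hypothetical sketch of Option 2's decode step
float3 encoded  = tex2D(_CoordTex, IN.uv).rgb;          // baked coordinates in [0, 1]
float3 localPos = encoded * 2.0 - 1.0;                  // undo the /2 + 0.5 bake remap
float3 worldPos = mul(unity_ObjectToWorld, float4(localPos, 1)).xyz; // local -> world
```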
Note
I had a hard time understanding your exact request. I hope these options help with your program.

How to add emission to Unity custom shader?

I use the following shader in Unity.
A ring-like sonar spreads from the place where game objects collided, as shown in the reference image.
I would like to add emission to make this ring visible even in the dark, but I am new to shaders. I am in trouble because I do not know where to change.
The settings for making the scene dark are as follows.
Set the Intensity Multiplier of Environment Lighting to 0.
Remove skybox.
Set the Background of Camera to black.
The shader I use is below.
Shader "MadeByProfessorOakie/SimpleSonarShader" {
Properties{
_Color("Color", Color) = (1,1,1,1)
_MainTex("Albedo (RGB)", 2D) = "white" {}
_Glossiness("Smoothness", Range(0,1)) = 0.5
_Metallic("Metallic", Range(0,1)) = 0.0
_RingColor("Ring Color", Color) = (1,1,1,1)
_RingColorIntensity("Ring Color Intensity", float) = 2
_RingSpeed("Ring Speed", float) = 1
_RingWidth("Ring Width", float) = 0.1
_RingIntensityScale("Ring Range", float) = 1
_RingTex("Ring Texture", 2D) = "white" {}
}
SubShader{
Tags{ "RenderType" = "Opaque" }
LOD 200
CGPROGRAM
// Physically based Standard lighting model, and enable shadows on all light types
#pragma surface surf Standard fullforwardshadows
// Use shader model 3.0 target, to get nicer looking lighting
#pragma target 3.0
sampler2D _MainTex;
sampler2D _RingTex;
struct Input {
float2 uv_MainTex;
float3 worldPos;
float3 uv_Illum;
};
// The size of these arrays is the number of rings that can be rendered at once.
// If you want to change this, you must also change QueueSize in SimpleSonarShader_Parent.cs
half4 _hitPts[20];
half _StartTime;
half _Intensity[20];
half _Glossiness;
half _Metallic;
fixed4 _Color;
fixed4 _RingColor;
// Added
fixed4 _EmissionLM;
half _RingColorIntensity;
half _RingSpeed;
half _RingWidth;
half _RingIntensityScale;
void surf(Input IN, inout SurfaceOutputStandard o) {
fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
o.Albedo = c.rgb;
half DiffFromRingCol = abs(o.Albedo.r - _RingColor.r) + abs(o.Albedo.b - _RingColor.b) + abs(o.Albedo.g - _RingColor.g);
// Check every point in the array
// The goal is to set RGB to highest possible values based on current sonar rings
for (int i = 0; i < 20; i++) {
half d = distance(_hitPts[i], IN.worldPos);
half intensity = _Intensity[i] * _RingIntensityScale;
half val = (1 - (d / intensity));
if (d < (_Time.y - _hitPts[i].w) * _RingSpeed && d >(_Time.y - _hitPts[i].w) * _RingSpeed - _RingWidth && val > 0) {
half posInRing = (d - ((_Time.y - _hitPts[i].w) * _RingSpeed - _RingWidth)) / _RingWidth;
// Calculate predicted RGB values sampling the texture radially
float angle = acos(dot(normalize(IN.worldPos - _hitPts[i]), float3(1,0,0)));
val *= tex2D(_RingTex, half2(1 - posInRing, angle));
half3 tmp = _RingColor * val + c * (1 - val);
// Determine if predicted values will be closer to the Ring color
half tempDiffFromRingCol = abs(tmp.r - _RingColor.r) + abs(tmp.b - _RingColor.b) + abs(tmp.g - _RingColor.g);
if (tempDiffFromRingCol < DiffFromRingCol)
{
// Update values using our predicted ones.
DiffFromRingCol = tempDiffFromRingCol;
o.Albedo.r = tmp.r;
o.Albedo.g = tmp.g;
o.Albedo.b = tmp.b;
o.Albedo.rgb *= _RingColorIntensity;
}
}
}
o.Metallic = _Metallic;
o.Smoothness = _Glossiness;
}
ENDCG
}
FallBack "Diffuse"
}
It's always worth having a look at the built-in shaders when you create custom shaders.
Just pick the ones that behave the way you want and see what they do.
Here is an example for the BlinnPhong surface:
//Properties
_EmissionLM ("Emission (Lightmapper)", Float) = 0
[Toggle] _DynamicEmissionLM ("Dynamic Emission (Lightmapper)", Int) = 0
//Output
o.Emission = c.rgb * tex2D(_Illum, IN.uv_Illum).a;
Hope it helps you
The problem was solved. Thank you very much to everyone who gave it thought. I rewrote the code as follows.
// Determine if predicted values will be closer to the Ring color
half tempDiffFromRingCol = abs(tmp.r - _RingColor.r) + abs(tmp.b - _RingColor.b) + abs(tmp.g - _RingColor.g);
if (tempDiffFromRingCol < DiffFromRingCol)
{
// Update values using our predicted ones.
//DiffFromRingCol = tempDiffFromRingCol;
/*
o.Albedo.r = tmp.r;
o.Albedo.g = tmp.g;
o.Albedo.b = tmp.b;
o.Albedo.rgb *= _RingColorIntensity;
*/
// I Changed here
o.Emission = tmp;
}

Spherical mapping

I'm having trouble projecting objects (for example a plane) onto a spherical surface.
The shader just has to take the vertex local position (P0), convert it to world coordinates (P1), then find the vector from a given center (C) to P1 (P1 - C). Then normalize this vector, multiply it by a given coefficient, and finally convert back to local coordinates.
I'm working in Unity with surface shaders
Shader "Custom/testShader" {
Properties {
_MainTex("texture", 2D) = "white" {}
_Center("the given center", Vector) = (0, 0, 0, 0)
_Height("the given coefficient", Range(1, 1000)) = 10
}
Subshader {
CGPROGRAM
#pragma surface surf Standard vertex:vert
sampler2D _MainTex;
float3 _Center;
float _Height;
struct Input { float2 uv_MainTex; };
// IMPORTANT STUFF
void vert (inout appdata_full v) {
float3 world_vertex = mul(unity_ObjectToWorld, v.vertex) - _Center;
world_vertex = normalize(world_vertex) * _Height;
v.vertex = mul(unity_WorldToObject, world_vertex);
}
// END OF IMPORTANT STUFF
void surf (Input IN, inout SurfaceOutputStandard o) {
o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
}
ENDCG
}
}
Now the problem is that in the scene, where I have some planes with this shader, they look split and much smaller than they are supposed to be. Any ideas?
EDIT
Here are some screenshots:
You are transforming world_vertex as a direction (X, Y, Z, 0) instead of a position (X, Y, Z, 1). See this for more info.
So this line
v.vertex = mul(unity_WorldToObject, world_vertex); should be
v.vertex = mul(unity_WorldToObject, float4(world_vertex, 1) );
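Put into the vertex function from the question, the corrected version would read (a sketch, keeping everything else unchanged):

```hlsl
void vert (inout appdata_full v) {
    float3 world_vertex = mul(unity_ObjectToWorld, v.vertex).xyz - _Center;
    world_vertex = normalize(world_vertex) * _Height;
    // w = 1 marks this as a position, so the matrix's translation is applied;
    // w = 0 would treat it as a direction and drop the translation entirely
    v.vertex = mul(unity_WorldToObject, float4(world_vertex, 1));
}
```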

Unity worldPos relative direction

I'm trying to code a shader similar to this one from the Unity manual, which "slices" the object by discarding pixels in nearly horizontal rings via the clip() function.
Shader "Example/Slices" {
Properties {
_MainTex ("Texture", 2D) = "white" {}
_BumpMap ("Bumpmap", 2D) = "bump" {}
}
SubShader {
Tags { "RenderType" = "Opaque" }
Cull Off
CGPROGRAM
#pragma surface surf Lambert
struct Input {
float2 uv_MainTex;
float2 uv_BumpMap;
float3 worldPos;
};
sampler2D _MainTex;
sampler2D _BumpMap;
void surf (Input IN, inout SurfaceOutput o) {
clip (frac((IN.worldPos.y+IN.worldPos.z*0.1) * 5) - 0.5);
o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
}
ENDCG
}
Fallback "Diffuse"
}
Rather than just horizontal lines I want to be able to slice at an arbitrary angle. I discovered through experimentation (since I'm new to coding shaders) that the multiplier on worldPos Z does indeed change the angle of the slice, so I set a property variable to it:
clip (frac((IN.worldPos.y+IN.worldPos.z*_MYANGLEVARIABLE) * 5) - 0.5);
This has two problems however. 1) Values up to 1.0 rotate the lines by up to 45 degrees but beyond this the lines start to go "squiggly" and convolve into all sorts of patterns rather than neat lines and 2) this only works if the face is oriented toward the positive or negative X axis. When facing Z the lines don't move and when facing Y they get bigger but don't rotate.
Changing IN.worldPos.y to IN.worldPos.x does what you might expect - similar situation but working as expected in Z rather than X.
Any ideas how to
1) Achieve arbitrary angles?
2) Have them work regardless of facing direction?
I'm using worldPos because I always want the lines to be relative to the object rather than screen space, but perhaps there's another way? My actual shader is a fragment rather than a surface shader, and I'm passing worldPos from the vert to the frag.
Many thanks
To get arbitrary angles you can define a plane using a normal vector and clip using the dot product.
Shader "Example/Slices" {
Properties {
_MainTex ("Texture", 2D) = "white" {}
_BumpMap ("Bumpmap", 2D) = "bump" {}
_PlaneNormal ("Plane Normal", Vector) = (0, 1, 0, 0)
}
SubShader {
Tags { "RenderType" = "Opaque" }
Cull Off
CGPROGRAM
#pragma surface surf Lambert
struct Input {
float2 uv_MainTex;
float2 uv_BumpMap;
float3 worldPos;
};
sampler2D _MainTex;
sampler2D _BumpMap;
float3 _PlaneNormal;
void surf (Input IN, inout SurfaceOutput o) {
float d = dot(_PlaneNormal, IN.worldPos);
clip (frac(d) - 0.5);
o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
}
ENDCG
}
Fallback "Diffuse"
}
The longer the normal vector here, the more frequent the slices.
If you want the slices to be relative to the object, you'll need to use a set of coordinates other than worldPos. Possibly this answer would help: http://answers.unity3d.com/questions/561900/get-local-position-in-surface-shader.html
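Following the linked answer's idea, one hedged sketch of an object-relative variant is to pass the local vertex position through a custom vertex function instead of relying on worldPos (`localPos` is a name introduced here, not from the original shader):

```hlsl
// Sketch: slice in object space so the pattern follows the object's transform
#pragma surface surf Lambert vertex:vert
struct Input {
    float2 uv_MainTex;
    float3 localPos;    // custom member, filled in the vertex function
};
void vert (inout appdata_full v, out Input o) {
    UNITY_INITIALIZE_OUTPUT(Input, o);
    o.localPos = v.vertex.xyz;   // object-space position, unaffected by movement/rotation
}
void surf (Input IN, inout SurfaceOutput o) {
    // same plane-normal trick as above, but against the local position
    clip (frac(dot(_PlaneNormal, IN.localPos)) - 0.5);
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}
```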

Removing a part of a tex2D before combining it

I'm coding a Unity surface shader to slowly apply a rust effect like this:
Take 1 base color texture.
Take 1 rust decal texture and 1 greyscale map.
Take 1 float range value.
Then:
Use the range to remove from the greyscale map all the pixels that are darker than the value itself, then make this greyscale map the rust alpha, and apply the composited rust layer over the color texture.
I managed to do this:
void surf (Input IN, inout SurfaceOutputStandard o) {
half4 C = tex2D (_MainTex, IN.uv_MainTex); //Color Texture
half4 R = tex2D (_RustTex, IN.uv_RustTex); //Rust texture
half4 RG = tex2D (_RustGuide, IN.uv_RustGuide); //Greyscale texture
//Here i need to compose the rust layer
half4 RustResult = //??? Maybe a Clip() function or what? and how?
//Here i apply the previusly composed layer over the color texture. Already tested and working.
half4 Final = lerp (C, RustResult, RustResult.a);
o.Albedo = Final.rgb;
o.Alpha = Final.a;
}
So how can I complete this shader?
I can't find detailed documentation about the functions usable in surface shaders.
EDIT: I almost get what I need using the saturate() function, like the following:
Properties {
_MainTex ("Base (RGB)", 2D) = "" {} //the color texture
_RustTex ("Rust Texture (RGB)", 2D) = "" {} //the rust texture
_RustGuide ("Rust Guide (A)", 2D) = "" {} //the rust greyscale texture
_RustAmount ("Rust Amount", range(0.0, 1.0)) = 0.0 //the rust amount float value
_RustMultiplier ("Rust Multiplier", float) = 2
}
SubShader {
Tags { "RenderType"="Opaque" }
LOD 200
CGPROGRAM
#pragma target 3.0
#include "UnityPBSLighting.cginc"
#pragma surface surf Standard
sampler2D _MainTex;
sampler2D _RustTex;
sampler2D _RustGuide;
float _RustAmount;
float _RustMultiplier;
struct Input {
float2 uv_MainTex;
float2 uv_RustTex;
float2 uv_RustGuide;
};
void surf (Input IN, inout SurfaceOutputStandard o) {
half4 M = tex2D (_MainTex, IN.uv_MainTex);
half4 R = tex2D (_RustTex, IN.uv_RustTex);
half4 RG = tex2D (_RustGuide, IN.uv_RustGuide);
half4 RustResult = 0; // initialize so alpha is defined when _RustAmount is 0
RustResult.rgb = R.rgb;
if (_RustAmount > 0) {
RustResult.a = trunc(saturate(RG.a * _RustAmount * _RustMultiplier));
}
half4 Final = lerp (M, RustResult, RustResult.a);
o.Albedo = Final.rgb;
o.Alpha = Final.a;
}
ENDCG
}
FallBack Off
}
This gives the effect that I need. The only problem now is: how can I blur the edges of the alpha?
Use the range to remove from the grayscale map all the pixels that are
darker than the value itself
Can't you simply clamp the values below _RustAmount float? Something like:
float greyScaleMapValue = tex2D(_RustGuide, IN.uv_RustGuide).a; //assuming rust guide is stored as a single channel
float clampedMap = clamp(greyScaleMapValue , _RustAmount, 1); //clamped map stores a value clamped between _RustAmount and 1 -> every pixel darker than _RustAmount are 0
half3 albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
half3 rust = tex2D (_RustTex, IN.uv_RustTex).rgb;
float3 finalCol = lerp(albedo, rust, clampedMap); // all values in the map below _RustAmount will have plain albedo value, the other will be blended with rust using the map
return float4(finalCol,1);
Note that the code above produces an abrupt transition from texture to rust (the more abrupt, the higher _RustAmount is above zero). You eventually want to remap each value higher than zero after the clamp into the [0, 1] range.
If you need to smooth the transition you can remap the interval [_RustAmount,1] into [0,1]:
float clampedMapNormalized = (clampedMap - _RustAmount) / (1 - _RustAmount);
Hope this helps
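Combining the clamp and the remap inside the questioner's surf function might look like this sketch (note the remap divides by (1 - _RustAmount), so _RustAmount must stay below 1):

```hlsl
void surf (Input IN, inout SurfaceOutputStandard o) {
    float greyScaleMapValue = tex2D(_RustGuide, IN.uv_RustGuide).a;
    // pixels darker than _RustAmount collapse to the clamp floor...
    float clampedMap = clamp(greyScaleMapValue, _RustAmount, 1);
    // ...and the remap stretches [_RustAmount, 1] back to [0, 1] for a smooth blend
    float blend = (clampedMap - _RustAmount) / (1 - _RustAmount);
    half3 albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
    half3 rust = tex2D(_RustTex, IN.uv_RustTex).rgb;
    o.Albedo = lerp(albedo, rust, blend);
}
```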
Side note:
avoid branching in shaders (even if branching on uniforms shouldn't be such a pain on modern hardware)
if the map uses the same set of UV coordinates (and tiling) as one of the other two textures, then you can pack it inside that texture's alpha channel, saving one texture sample operation.
since your shader is opaque, I guess the final alpha value isn't relevant, so I just used a float3 to minimize the values to be lerped.
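The packing suggestion in the second note could be sketched like this (assuming the greyscale guide has been baked into _MainTex's alpha channel and shares its UVs and tiling):

```hlsl
// One lookup instead of two: the greyscale guide lives in _MainTex's alpha channel
half4 baseAndGuide = tex2D(_MainTex, IN.uv_MainTex);
half3 albedo = baseAndGuide.rgb;
float greyScaleMapValue = baseAndGuide.a;  // replaces the separate _RustGuide sample
```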