I'm applying a "burning" dissolve shader to an object that already uses the Standard shader. Because I need the Standard shader for its better lighting, I looked into how to use both at once, or how to extend the Standard shader.
According to the Unity docs, I can define which lighting model to use in the #pragma, so I chose Standard. That must not work the way I thought, because I see none of the lighting I get when using the Standard shader directly.
I'm using the burning dissolve shader from Harry Alisavakis:
Shader "Custom/BurnDissolve" {
    Properties {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _SliceGuide ("Slice Guide (RGB)", 2D) = "white" {}
        _SliceAmount ("Slice Amount", Range(0.0, 1.0)) = 0
        _BurnSize ("Burn Size", Range(0.0, 1.0)) = 0.15
        _BurnRamp ("Burn Ramp (RGB)", 2D) = "white" {}
        _BurnColor ("Burn Color", Color) = (1,1,1,1)
        _EmissionAmount ("Emission amount", float) = 2.0
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200
        Cull Off

        CGPROGRAM
        #pragma surface surf Standard

        fixed4 _Color;
        sampler2D _MainTex;
        sampler2D _SliceGuide;
        sampler2D _BumpMap;
        sampler2D _BurnRamp;
        fixed4 _BurnColor;
        float _BurnSize;
        float _SliceAmount;
        float _EmissionAmount;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutputStandard o) {
            fixed4 c = tex2D (_MainTex, IN.uv_MainTex);
            half test = tex2D(_SliceGuide, IN.uv_MainTex).rgb - _SliceAmount;
            clip(test);
            if (test < _BurnSize && _SliceAmount > 0) {
                o.Emission = tex2D(_BurnRamp, float2(test * (1 / _BurnSize), 0)) * _BurnColor * _EmissionAmount;
            }
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Standard"
}
I also tried adding this as another SubShader to a copy of the Standard shader source, but I couldn't get that working either: if I put it before the others, it behaves as if it were alone; if I put it after, the Standard shader behaves as if it were alone.
I'm still new to shaders, so I'm sure I'm misunderstanding something. How can I get the dissolve effect to apply along with the Standard shader's lighting, ideally without having to rewrite everything manually?
Per request, here are the images I'm using (which were originally provided by the linked dissolve shader article).
As _SliceGuide:
As _BurnRamp:
Using the #pragma target 3.0 and o.Smoothness = 0.5f changes Shaman mentioned, I get this:
Which is definitely better, but I guess there are still effects Standard is giving me, because here's how Standard by itself looks:
The Standard shader uses the shader model 3.0 target to get nicer looking lighting, so you need to add this: #pragma target 3.0.
Also, the Standard shader's default configuration has its smoothness set to 0.5, whereas in your shader it's 0. To make the shaders match in terms of lighting, add the following line: o.Smoothness = 0.5f;.
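Applied to the shader from the question, the two changes would look like this (a sketch showing only the relevant parts; the dissolve logic stays as it was):

```hlsl
CGPROGRAM
#pragma surface surf Standard
// Shader model 3.0, the same target the Standard shader compiles against
#pragma target 3.0

// ... property declarations and Input struct unchanged ...

void surf (Input IN, inout SurfaceOutputStandard o) {
    fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
    // ... dissolve clip/emission logic unchanged ...
    o.Albedo = c.rgb;
    o.Smoothness = 0.5; // matches the Standard shader's default smoothness
    o.Alpha = c.a;
}
ENDCG
```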
Related
In Unity I have modified a cube surface shader that works properly while the associated object is stationary, but when the object moves, the texture moves across the object. I would like the texture not to move when the object moves, or ideally even when the object's vertices are moved around at runtime. Here is a gif, captured at runtime, of what happens when the object is stationary vs. moving:
GiphyLink
This is the shader code I've been working with (this is the code used in the gif that I linked above):
Shader "CShader" {
    Properties {
        _CubeMap ("Cube Map", Cube) = "white" {}
        _CubeMap2 ("Cube Map 2", Cube) = "white" {}
        _Color ("Color", Color) = (1,1,1,1)
        _Color3 ("Color 1", Color) = (1,1,1,1)
        _Color4 ("Color 2", Color) = (1,1,1,1)
        _Blend ("Texture Blend", Range(0,1)) = 0.0
        _Glossiness ("Smoothness", Range(0,1)) = 0.0
        _Metallic ("Metallic", Range(0,1)) = 0.0
    }
    SubShader {
        Tags { "RenderType" = "Fade" }

        CGPROGRAM
        #pragma target 4.5
        #pragma surface surf Standard alpha:fade vertex:vert

        struct Input {
            float2 uv_CubeMap;
            float3 customColor;
        };

        fixed4 _Color3;
        fixed4 _Color4;
        half _Blend;
        half _Glossiness;
        half _Metallic;
        samplerCUBE _CubeMap;
        samplerCUBE _CubeMap2;

        void vert(inout appdata_full v, out Input oo) {
            UNITY_INITIALIZE_OUTPUT(Input, oo);
            oo.customColor = v.vertex.xyz;
        }

        void surf(Input INN, inout SurfaceOutputStandard oo) {
            fixed4 d = texCUBE(_CubeMap2, INN.customColor) * _Color3;
            d = lerp(d, texCUBE(_CubeMap, INN.customColor) * _Color4, 1 / (1 + exp(100 * (-(INN.uv_CubeMap.y)))));
            oo.Albedo = d.rgb;
            oo.Metallic = _Metallic;
            oo.Smoothness = _Glossiness;
            oo.Alpha = d.a;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
I've tried many things, including setting vertices in a C# script and passing them to the shader, but nothing has worked so far, likely because I've coded it wrong or taken the wrong approach.
Any help would be greatly appreciated. Thank you.
I'm trying to make an animated material with the MRTK in Unity. The goal is an effect that starts as a circle and propagates the texture to the rest of the plane.
For now I use the MixedRealityToolkit standard shader with the Round Corners option. With an animation I got this:
My problem is that I can't tile the texture to reduce its size and repeat it. Also, on non-square objects the texture is stretched, which doesn't look good.
If I change the tiling settings, the texture is not repeated (the texture is in Repeat mode; tiling works when I untick the Round Corners option).
(If I display Unity's selection outline, I see the repeated texture, but it's not actually rendered.)
Does anyone have a good idea how to do this with the MRTK shaders, or how to write a specific shader for this effect?
I found a solution by writing my own shader:
Shader "Custom/testShader"
{
    Properties
    {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _ForegroundMask ("Foreground Mask", 2D) = "white" {}
        _RadiusSize ("Radius size", Range(0,2)) = 0
        _BorderWidth ("Smooth Edge width", Range(0,1)) = 0.5
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
        _Metallic ("Metallic", Range(0,1)) = 0.0
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        LOD 200
        ZWrite Off
        Blend SrcAlpha OneMinusSrcAlpha

        CGPROGRAM
        // Physically based Standard lighting model, and enable shadows on all light types
        #pragma surface surf Standard fullforwardshadows alpha:fade
        // Use shader model 3.0 target, to get nicer looking lighting
        #pragma target 3.0

        sampler2D _MainTex;
        sampler2D _ForegroundMask;

        struct Input
        {
            float2 uv_MainTex;
            float2 uv_ForegroundMask;
        };

        half _Glossiness;
        half _Metallic;
        fixed4 _Color;
        float _RadiusSize;
        float _BorderWidth;

        // Add instancing support for this shader. You need to check 'Enable Instancing' on materials that use the shader.
        // See https://docs.unity3d.com/Manual/GPUInstancing.html for more information about instancing.
        // #pragma instancing_options assumeuniformscaling
        UNITY_INSTANCING_BUFFER_START(Props)
            // put more per-instance properties here
        UNITY_INSTANCING_BUFFER_END(Props)

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            fixed2 dist;
            dist.x = IN.uv_ForegroundMask.x - 0.5;
            dist.y = IN.uv_ForegroundMask.y - 0.5;
            fixed circle = 1.0 - smoothstep(_RadiusSize - (_RadiusSize * _BorderWidth), _RadiusSize + (_RadiusSize * _BorderWidth), dot(dist, dist) * 4.0);
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
            o.Albedo = c.rgb;
            // Metallic and smoothness come from slider variables
            o.Metallic = _Metallic;
            o.Smoothness = _Glossiness;
            o.Alpha = c.a * circle;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
I've only just started learning Unity, but because I come from a background of coding in C#, I've found the standard scripting to be very quick to learn. Unfortunately, I've now come across a problem for which I believe a custom shader is required and I'm completely lost when it comes to shaders.
Scenario:
I'm using a custom distance scaling process so that really big, far away objects are moved within a reasonable floating-point precision range of the player. This works great and handles scaling of the objects based on their adjusted distance so they appear to actually be really far away. The problem occurs, though, when two of these objects pass close to each other in game space (they would still be millions of units apart at real scale), because they visibly collide.
Ex: https://www.youtube.com/watch?v=KFnuQg4R8NQ
Attempted Solution 1:
I've looked into flattening the objects along the player's view axis, and this fixes the collision, but it affects shading and particle effects, so it wasn't a good option.
Attempted Solution 2:
I've tried changing the RenderOrder, but because sometimes one object is inside the mesh of another (though the centre of this object is still closer to the camera), it doesn't fix the issue, and particle effects are problematic again.
Attempted Solution 3:
I've tried moving the colliding objects to their own layer, spawning a second camera with a higher depth at the same position as my main camera, and forcing each camera to see only the items on its respective layer. But this caused lighting issues, since some objects light others, and with only a limited number of layers available this solution was quite limiting: it forced me to have only a small number of overlapping objects at a time. NOTE: this is probably the closest I was able to come to what I need, though.
Ex: https://www.youtube.com/watch?v=CyFDgimJ2-8
Attempted Solution 4:
I've tried downloading the Standard shader code from Unity's downloads page and creating my own custom shader that lets me modify the ZWrite and ZTest properties, but because I have no real understanding of how these work, I'm not getting anywhere.
Request:
I would greatly appreciate a shader code example of how I can programmatically force one object, whose mesh is either colliding with or completely inside another mesh, to render in front of that mesh. I'm hoping I can then take that example and apply it to all the shaders I'm currently using (Standard, Particle Additive) to achieve the effect I'm looking for. Thanks in advance for your help.
In the gif below, both objects are colliding, and according to the camera position the cube is in front of the sphere, but I can change their visibility with the render queue:
If that's what you want, you only have to add ZWrite Off in your SubShader before the CGPROGRAM block starts. The following is the Standard surface shader including that line:
Shader "Custom/Shader" {
    Properties {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
        _Metallic ("Metallic", Range(0,1)) = 0.0
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200
        ZWrite Off

        CGPROGRAM
        // Physically based Standard lighting model, and enable shadows on all light types
        #pragma surface surf Standard fullforwardshadows
        // Use shader model 3.0 target, to get nicer looking lighting
        #pragma target 3.0

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        half _Glossiness;
        half _Metallic;
        fixed4 _Color;

        // Add instancing support for this shader. You need to check 'Enable Instancing' on materials that use the shader.
        // See https://docs.unity3d.com/Manual/GPUInstancing.html for more information about instancing.
        // #pragma instancing_options assumeuniformscaling
        UNITY_INSTANCING_BUFFER_START(Props)
            // put more per-instance properties here
        UNITY_INSTANCING_BUFFER_END(Props)

        void surf (Input IN, inout SurfaceOutputStandard o) {
            // Albedo comes from a texture tinted by color
            fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
            o.Albedo = c.rgb;
            // Metallic and smoothness come from slider variables
            o.Metallic = _Metallic;
            o.Smoothness = _Glossiness;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
Now for sorting particles: look at the shadows, how the objects collide, and how we can change their visibility regardless of their position.
Here's the shader for particles. I'm using the Unity built-in shader; the only thing added is ZTest Always:
Shader "Particles/Alpha Blended Premultiply Custom" {
    Properties {
        _MainTex ("Particle Texture", 2D) = "white" {}
        _InvFade ("Soft Particles Factor", Range(0.01,3.0)) = 1.0
    }
    Category {
        Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" "PreviewType"="Plane" }
        ZTest Always
        Blend SrcAlpha OneMinusSrcAlpha
        ColorMask RGB
        Cull Off Lighting Off ZWrite Off

        SubShader {
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma target 2.0
                #pragma multi_compile_particles
                #pragma multi_compile_fog

                #include "UnityCG.cginc"

                sampler2D _MainTex;
                fixed4 _TintColor;

                struct appdata_t {
                    float4 vertex : POSITION;
                    fixed4 color : COLOR;
                    float2 texcoord : TEXCOORD0;
                    UNITY_VERTEX_INPUT_INSTANCE_ID
                };

                struct v2f {
                    float4 vertex : SV_POSITION;
                    fixed4 color : COLOR;
                    float2 texcoord : TEXCOORD0;
                    #ifdef SOFTPARTICLES_ON
                    float4 projPos : TEXCOORD1;
                    #endif
                    UNITY_VERTEX_OUTPUT_STEREO
                };

                float4 _MainTex_ST;

                v2f vert (appdata_t v)
                {
                    v2f o;
                    UNITY_SETUP_INSTANCE_ID(v);
                    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    #ifdef SOFTPARTICLES_ON
                    o.projPos = ComputeScreenPos (o.vertex);
                    COMPUTE_EYEDEPTH(o.projPos.z);
                    #endif
                    o.color = v.color;
                    o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
                    return o;
                }

                UNITY_DECLARE_DEPTH_TEXTURE(_CameraDepthTexture);
                float _InvFade;

                fixed4 frag (v2f i) : SV_Target
                {
                    #ifdef SOFTPARTICLES_ON
                    float sceneZ = LinearEyeDepth (SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.projPos)));
                    float partZ = i.projPos.z;
                    float fade = saturate (_InvFade * (sceneZ - partZ));
                    i.color.a *= fade;
                    #endif

                    return i.color * tex2D(_MainTex, i.texcoord) * i.color.a;
                }
                ENDCG
            }
        }
    }
}
I'm trying to fluctuate between two values inside a shader to achieve a glowing effect.
I need it to be done inside the shader itself and not via C# scripting.
I've tried using the _Time value that Unity gives us for shader animation but it isn't working:
Shader "Test shader" {
    Properties {
        _ColorTint ("Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _GlowColor ("Glow Color", Color) = (1,0,0,1)
        _GlowPower ("Glow Power", Float) = 3.0
        _UpDown ("Shine Emitter Don't Change", Float) = 0
    }
    SubShader {
        Tags {
            "RenderType"="Opaque"
        }

        CGPROGRAM
        #pragma surface surf Lambert

        struct Input {
            float4 color : Color;
            float2 uv_MainTex;
            float3 viewDir;
            float4 _Time;
        };

        float4 _ColorTint;
        sampler2D _MainTex;
        float4 _GlowColor;
        float _GlowPower;
        float _UpDown;

        void surf(Input IN, inout SurfaceOutput o) {
            if (_UpDown == 0) {
                _GlowPower += _Time.y;
            }
            if (_UpDown == 1) {
                _GlowPower -= _Time.y;
            }
            if (_GlowPower <= 1) {
                _UpDown = 0;
            }
            if (_GlowPower >= 3) {
                _UpDown = 1;
            }
            IN.color = _ColorTint;
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * IN.color;
            half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
            o.Emission = _GlowColor.rgb * pow(rim, _GlowPower);
        }
        ENDCG
    }
    FallBack "Diffuse"
}
This makes the glow grow to the infinite.
What am I doing wrong?
Extending my comment slightly:
You can't use _Time.y in this case, as it is the elapsed time since the game started, so it increases indefinitely.
You can use _SinTime instead, which holds the sine of time at several frequencies (_SinTime.w is sin(_Time.y)), so it oscillates between -1 and 1. You can assign it (or a scaled version of it) to your variable: _GlowPower = C * _SinTime.w.
More on built-in shader variables: http://docs.unity3d.com/Manual/SL-UnityShaderVariables.html
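Applied to the shader from the question, the surf function could be reduced to something like this (a sketch reusing the question's properties; the remap to [0, 1] keeps the glow power oscillating between the original bounds of 1 and 3):

```hlsl
void surf(Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _ColorTint;
    half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
    // _SinTime.w = sin(_Time.y), oscillating smoothly in [-1, 1];
    // remapping to [0, 1] and lerping keeps the power between 1 and 3
    float glowPower = lerp(1.0, 3.0, _SinTime.w * 0.5 + 0.5);
    o.Emission = _GlowColor.rgb * pow(rim, glowPower);
}
```

This also removes the need for the _UpDown bookkeeping, since the sine already bounces between its extremes on its own.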
For a pulsing glow, I'd have a script outside the shader send in a parameter (_GlowPower) calculated in a C# script like this:
glowPow = Mathf.Sin(time);
Then you only need to calculate it once per frame. If you put it in the vertex shader, it's computed once per vertex, and in the surface shader once per pixel, which is a waste of performance.
You can send variables to your shader like this (very handy):
material.SetFloat(propertyName, valueToSend);
So you can send time, strength, glow or whatever you want.
If you really need to do the glow calculation per vertex or per pixel, then use
_glowPow = sin(_Time.y);
inside the shader.
I have a scene where I really need depth of field.
Apparently, Unity's depth of field doesn't work with any shader, built-in or custom, that processes alpha.
So this happens, for example, with the Transparent/Diffuse shader. Transparent/Cutout works instead.
Here's the simplest custom shader I made that triggers this behaviour:
Shader "Custom/SimpleAlpha" {
    Properties {
        _MainTex ("Base (RGBA)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType"="Transparent" "Queue"="Transparent" }
        //Tags { "RenderType"="Opaque" }
        LOD 300
        ZWrite Off

        CGPROGRAM
        #pragma surface surf Lambert alpha
        #include "UnityCG.cginc"

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            half4 c = tex2D (_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
If you try the code in a project, you'll notice that EVERY object using the shader is blurred by the very same amount, instead of being blurred based on its Z value.
Any help is much appreciated.
Thanks in advance.
I posted the same question on Unity Answers: http://answers.unity3d.com/questions/438556/my-shader-brakes-depth-of-field.html
Since depth of field is a post processing effect that uses the values stored in the Z-buffer, the following line is the culprit:
ZWrite Off
For transparent objects, Z-buffer writes are usually disabled, because they need to be blended over whatever lies behind them rather than occluding it.
So if you remove that line, you should see depth of field correctly applied to transparent objects. But objects lying behind fully transparent areas will now be blurred using the wrong Z value. As a quick fix, you could try an alpha test such as AlphaTest Greater 0.1.
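Put together, the shader from the question might look like this (a sketch; the shader name and the _Cutoff property are additions for illustration, and the alphatest:_Cutoff surface option is used as the surface-shader way of expressing the suggested alpha test):

```hlsl
Shader "Custom/SimpleAlphaDoF" {
    Properties {
        _MainTex ("Base (RGBA)", 2D) = "white" {}
        _Cutoff ("Alpha cutoff", Range(0,1)) = 0.1
    }
    SubShader {
        Tags { "RenderType"="Transparent" "Queue"="Transparent" }
        LOD 300
        // ZWrite Off removed: depth values are now written,
        // so the depth-of-field effect reads a meaningful Z per pixel

        CGPROGRAM
        // alphatest:_Cutoff discards nearly transparent pixels, so they
        // don't write depth values that would blur whatever shows through them
        #pragma surface surf Lambert alpha alphatest:_Cutoff

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            half4 c = tex2D (_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```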