I'm trying to oscillate a value between two bounds inside a shader to achieve a glowing effect.
I need it to be done inside the shader itself, not via C# scripting.
I've tried using the _Time value that Unity provides for shader animation, but it isn't working:
Shader "Test shader" {
Properties {
_ColorTint ("Color", Color) = (1,1,1,1)
_MainTex ("Base (RGB)", 2D) = "white" {}
_GlowColor("Glow Color", Color) = (1,0,0,1)
_GlowPower("Glow Power", Float) = 3.0
_UpDown("Shine Emitter Don't Change", Float) = 0
}
SubShader {
Tags {
"RenderType"="Opaque"
}
CGPROGRAM
#pragma surface surf Lambert
struct Input {
float4 color : Color;
float2 uv_MainTex;
float3 viewDir;
float4 _Time;
};
float4 _ColorTint;
sampler2D _MainTex;
float4 _GlowColor;
float _GlowPower;
float _UpDown;
void surf(Input IN, inout SurfaceOutput o) {
if (_UpDown == 0) {
_GlowPower += _Time.y;
}
if (_UpDown == 1) {
_GlowPower -= _Time.y;
}
if (_GlowPower <= 1) {
_UpDown = 0;
}
if (_GlowPower >= 3) {
_UpDown = 1;
}
IN.color = _ColorTint;
o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * IN.color;
half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
o.Emission = _GlowColor.rgb * pow(rim, _GlowPower);
}
ENDCG
}
FallBack "Diffuse"
}
This makes the glow grow to infinity.
What am I doing wrong?
Extending my comment slightly:
You can't use _Time.y here: it is the elapsed time since the game started, so it only ever increases.
You can use _SinTime instead, which holds the sine of time at several speeds: (sin(t/8), sin(t/4), sin(t/2), sin(t)). Its w component is sin(_Time.y), so it oscillates between the values -1 and 1. You can assign a (possibly scaled) version of it to your variable: _GlowPower = C * _SinTime.w
More on built-in shader variables: http://docs.unity3d.com/Manual/SL-UnityShaderVariables.html
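For example, here is a minimal sketch of the surf function from the question rewritten this way (I dropped _Time from the Input struct and the _UpDown bookkeeping; remapping with 2.0 + _SinTime.w is my own choice to keep the glow between the original bounds of 1 and 3):

void surf(Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _ColorTint.rgb;
    half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
    // _SinTime.w = sin(t) lies in [-1, 1], so this oscillates between 1 and 3
    float glow = 2.0 + _SinTime.w;
    o.Emission = _GlowColor.rgb * pow(rim, glow);
}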
For a pulsing glow, I'd have a script outside the shader and send in a parameter (_GlowPower) calculated in a C# script like this:
glowPow = Mathf.Sin(time);
Then you only need to calculate it once. If you put it in the vertex shader, it is computed once per vertex, and in a surface shader once per pixel, which is wasted performance.
You can send variables to your shader like this (very handy):
material.SetFloat(propertyName, valueToSend);
So you could send time, strength, glow or whatever you want.
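Putting that together, a small driver script might look like this (a sketch: the class name, the field values and the _GlowPower property name are my assumptions, not from the original answer):

using UnityEngine;

public class GlowPulse : MonoBehaviour {
    public float amplitude = 1f; // half the distance between min and max glow
    public float midpoint = 2f;  // center of the pulse, so glow runs from 1 to 3
    public float speed = 1f;

    private Material material;

    void Start() {
        // instance material of this object's renderer
        material = GetComponent<Renderer>().material;
    }

    void Update() {
        float glowPow = midpoint + amplitude * Mathf.Sin(Time.time * speed);
        material.SetFloat("_GlowPower", glowPow);
    }
}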
If you really need to do the glow calculation per vertex or per pixel, then use
_glowPow = sin(_Time.y);
inside the shader.
Related
In Unity I have modified a cube surface shader that works properly while the associated object is stationary, but when the object is moved, the texture moves on the object. I would like the texture not to move when the object moves, or ideally even when vertices of the object are moved around at runtime. This is a gif created at runtime of what happens when the object is stationary vs. moving:
GiphyLink
This is the shader code I've been working with (this is the code used in the gif that I linked above):
"CShader" {
Properties{
_CubeMap("Cube Map", Cube) = "white" {}
_CubeMap2("Cube Map 2", Cube) = "white" {}
_Color("Color", Color) = (1,1,1,1)
_Color3("Color 1", Color) = (1,1,1,1)
_Color4("Color 2", Color) = (1,1,1,1)
_Blend("Texture Blend", Range(0,1)) = 0.0
_Glossiness("Smoothness", Range(0,1)) = 0.0
_Metallic("Metallic", Range(0,1)) = 0.0
}
SubShader{
Tags { "RenderType" = "Fade" }
CGPROGRAM
#pragma target 4.5
#pragma surface surf Standard alpha:fade vertex:vert
struct Input {
float2 uv_CubeMap;
float3 customColor;
};
fixed4 _Color3;
fixed4 _Color4;
half _Blend;
half _Glossiness;
half _Metallic;
samplerCUBE _CubeMap;
samplerCUBE _CubeMap2;
void vert(inout appdata_full v, out Input oo) {
UNITY_INITIALIZE_OUTPUT(Input, oo);
oo.customColor = v.vertex.xyz;
}
void surf(Input INN, inout SurfaceOutputStandard oo) {
fixed4 d = texCUBE(_CubeMap2, INN.customColor) * _Color3;
d = lerp(d, texCUBE(_CubeMap, INN.customColor) * _Color4, 1 / (1 + exp(100 * (-(INN.uv_CubeMap.y)))));
oo.Albedo = d.rgb;
oo.Metallic = _Metallic;
oo.Smoothness = _Glossiness;
oo.Alpha = d.a;
}
ENDCG
}
Fallback "Diffuse"
I've tried many things, including setting vertices in a C# script and passing them to the shader, but nothing has worked so far, likely because I've coded it wrong or used the wrong procedure.
Any help would be greatly appreciated. Thank you.
As the title describes, I want to render only some parts of the texture. For example, I have a 1024x1024 texture and I want to render the square between the pixels 0/0 and 50/50, and the area from 600/600 to 1024/1024.
Is something like that possible?
Maybe you can help me with the logical steps I need to take, because I don't really know how to start.
I think I need a shader with two texture slots and a script that renders only some parts. I think it has something to do with this: https://answers.unity.com/questions/529814/how-to-have-2-different-objects-at-the-same-place.html
The following basic surface shader "removes" all pixels inside the two squares from 0/0 to 50/50 and from 600/600 onwards (this is called clipping):
Shader "Custom/ClippedTexture" {
Properties {
_MainTex ("Texture", 2D) = "white" {}
}
SubShader {
Tags { "RenderType" = "Opaque" }
CGPROGRAM
#pragma surface surf Lambert
struct Input {
float2 uv_MainTex;
};
sampler2D _MainTex;
float4 _MainTex_TexelSize;
void surf (Input IN, inout SurfaceOutput o) {
float u = IN.uv_MainTex.x * _MainTex_TexelSize.x;
float v = IN.uv_MainTex.y * _MainTex_TexelSize.y;
if((u < 51 && v < 51) || (u > 599 && v > 599)) {
clip(-1); // skip pixel
} else {
o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}
}
ENDCG
}
Fallback "Diffuse"
}
You can save this shader into the Shader directory of your project and create a material that uses it. Hope this helps.
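If you instead want to keep only those two regions and discard everything else (which is how I read the original question), the condition can simply be inverted. A sketch under that assumption:

void surf (Input IN, inout SurfaceOutput o) {
    float u = IN.uv_MainTex.x * _MainTex_TexelSize.z; // pixel coordinates
    float v = IN.uv_MainTex.y * _MainTex_TexelSize.w;
    bool inLowerSquare = (u <= 50 && v <= 50);
    bool inUpperArea = (u >= 600 && v >= 600);
    if (!inLowerSquare && !inUpperArea) {
        clip(-1); // discard everything outside the two requested regions
    } else {
        o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
    }
}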
I want to draw a horizontal line on an object with shader code (HLSL).
The clipping shader simply takes the distance to a given Y coordinate in the surface shader and checks whether it is larger than a given value.
If so, it discards the pixel. The result is a shader that clips away all pixels that are not on the line.
void surf (Input IN, inout SurfaceOutputStandard o) {
    // Albedo comes from a texture tinted by color
    fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
    float d = abs(_YClip - IN.worldPos.y); // _YClip is in the Properties block and can be changed
    if (d > _LineThickness) {
        discard;
    }
}
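(For reference, a complete minimal shader around that surf function could look like the sketch below; the Properties block, the declarations and the Albedo assignment are my reconstruction from the snippet above, not the asker's full code.)

Shader "Custom/YLine" {
    Properties {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _YClip ("Y Clip (world space)", Float) = 0
        _LineThickness ("Line Thickness", Float) = 0.05
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Standard
        #pragma target 3.0

        struct Input {
            float2 uv_MainTex;
            float3 worldPos; // built-in: world-space position of the fragment
        };

        sampler2D _MainTex;
        fixed4 _Color;
        float _YClip;
        float _LineThickness;

        void surf (Input IN, inout SurfaceOutputStandard o) {
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
            float d = abs(_YClip - IN.worldPos.y);
            if (d > _LineThickness) {
                discard; // keep only pixels on the line
            }
            o.Albedo = c.rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}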
Can I somehow combine this shader with the standard Unity shader without changing its code?
I plan to have a gizmo shader that renders lines and all kinds of stuff. It would be very practical if I could just tell Unity to render this gizmo shader on top.
I believe you might be able to use or adapt this shader for your purpose.
Image showing the object before the cutoff Y value is reached.
Image showing the transition, where one half is above the cutoff Y value and the other half is below. Note that the pattern it dissolves in depends on a texture pattern you supply yourself, so it should be possible to have a strict cutoff instead of an uneven pattern.
Image showing the object after it has fully passed the cutoff Y value. What I did in this case is hide a slightly smaller object inside the first one. If you don't have anything inside, the object will simply be invisible, or clipped.
Shader "Dissolve/Dissolve"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
_DissolveTexture("Dissolve Texture", 2D) = "white" {}
_DissolveY("Current Y of the dissolve effect", Float) = 0
_DissolveSize("Size of the effect", Float) = 2
_StartingY("Starting point of the effect", Float) = -1 //the number is supposedly in meters. Is compared to the Y coordinate in world space I believe.
}
SubShader
{
Tags { "RenderType"="Opaque" }
LOD 100
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
// make fog work
//#pragma multi_compile_fog
#include "UnityCG.cginc"
struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};
struct v2f
{
float2 uv : TEXCOORD0;
//UNITY_FOG_COORDS(1)
float4 vertex : SV_POSITION;
float3 worldPos : TEXCOORD1;
};
sampler2D _MainTex;
float4 _MainTex_ST;
sampler2D _DissolveTexture;
float _DissolveY;
float _DissolveSize;
float _StartingY;
v2f vert (appdata v) //"The vertex shader"
{
v2f o;
o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = TRANSFORM_TEX(v.uv, _MainTex);
o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
//UNITY_TRANSFER_FOG(o,o.vertex);
return o;
}
fixed4 frag (v2f i) : SV_Target //"For drawing the pixel on top"
{
float transition = _DissolveY - i.worldPos.y; //Cutoff value where world position is taken into account.
clip(_StartingY + (transition + (tex2D(_DissolveTexture, i.uv)) * _DissolveSize)); //Clip = cutoff if above 0.
//My understanding: If StartingY for dissolve effect + transition value and uv mapping of the texture is taken into account, clip off using the _DissolveSize.
//This happens to each individual pixel.
// sample the texture
fixed4 col = tex2D(_MainTex, i.uv);
// apply fog
//UNITY_APPLY_FOG(i.fogCoord, col);
//clip(1 - i.vertex.x % 10); //"A pixel is NOT rendered if clip is below 0."
return col;
}
ENDCG
}
}
}
Here you see the Inspector fields available.
I have a similar version of this shader for the X axis.
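To animate the effect over time, a small script could drive _DissolveY every frame. A hypothetical sketch (not part of the original answer; the class name and speed are mine, the property name comes from the shader above):

using UnityEngine;

public class DissolveDriver : MonoBehaviour {
    public float speed = 0.5f; // world units per second the cutoff moves down

    private Material material;

    void Start() {
        material = GetComponent<Renderer>().material;
    }

    void Update() {
        float y = material.GetFloat("_DissolveY");
        material.SetFloat("_DissolveY", y - speed * Time.deltaTime);
    }
}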
I am looking for a glass shader for Unity that only refracts the objects behind it, or ideas for how to modify an existing glass shader to do that.
This screenshot shows what happens when I use FX/Glass/Stained BumpDistort on a curved plane mesh.
As you can see, the glass shader refracts both the sphere in front of the mesh and the ground behind it. I am looking for a shader that will only refract the objects behind it.
Here is the code for that shader, for reference:
// Per pixel bumped refraction.
// Uses a normal map to distort the image behind, and
// an additional texture to tint the color.
Shader "FX/Glass/Stained BumpDistort" {
    Properties {
        _BumpAmt ("Distortion", range (0,128)) = 10
        _MainTex ("Tint Color (RGB)", 2D) = "white" {}
        _BumpMap ("Normalmap", 2D) = "bump" {}
    }
    Category {
        // We must be transparent, so other objects are drawn before this one.
        Tags { "Queue"="Transparent" "RenderType"="Opaque" }
        SubShader {
            // This pass grabs the screen behind the object into a texture.
            // We can access the result in the next pass as _GrabTexture
            GrabPass {
                Name "BASE"
                Tags { "LightMode" = "Always" }
            }
            // Main pass: Take the texture grabbed above and use the bumpmap to perturb it
            // on to the screen
            Pass {
                Name "BASE"
                Tags { "LightMode" = "Always" }
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma multi_compile_fog
                #include "UnityCG.cginc"

                struct appdata_t {
                    float4 vertex : POSITION;
                    float2 texcoord : TEXCOORD0;
                };

                struct v2f {
                    float4 vertex : SV_POSITION;
                    float4 uvgrab : TEXCOORD0;
                    float2 uvbump : TEXCOORD1;
                    float2 uvmain : TEXCOORD2;
                    UNITY_FOG_COORDS(3)
                };

                float _BumpAmt;
                float4 _BumpMap_ST;
                float4 _MainTex_ST;

                v2f vert (appdata_t v)
                {
                    v2f o;
                    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                    #if UNITY_UV_STARTS_AT_TOP
                    float scale = -1.0;
                    #else
                    float scale = 1.0;
                    #endif
                    o.uvgrab.xy = (float2(o.vertex.x, o.vertex.y*scale) + o.vertex.w) * 0.5;
                    o.uvgrab.zw = o.vertex.zw;
                    o.uvbump = TRANSFORM_TEX( v.texcoord, _BumpMap );
                    o.uvmain = TRANSFORM_TEX( v.texcoord, _MainTex );
                    UNITY_TRANSFER_FOG(o,o.vertex);
                    return o;
                }

                sampler2D _GrabTexture;
                float4 _GrabTexture_TexelSize;
                sampler2D _BumpMap;
                sampler2D _MainTex;

                half4 frag (v2f i) : SV_Target
                {
                    // calculate perturbed coordinates
                    half2 bump = UnpackNormal(tex2D( _BumpMap, i.uvbump )).rg; // we could optimize this by just reading the x & y without reconstructing the Z
                    float2 offset = bump * _BumpAmt * _GrabTexture_TexelSize.xy;
                    i.uvgrab.xy = offset * i.uvgrab.z + i.uvgrab.xy;

                    half4 col = tex2Dproj( _GrabTexture, UNITY_PROJ_COORD(i.uvgrab));
                    half4 tint = tex2D(_MainTex, i.uvmain);
                    col *= tint;
                    UNITY_APPLY_FOG(i.fogCoord, col);
                    return col;
                }
                ENDCG
            }
        }

        // ------------------------------------------------------------------
        // Fallback for older cards and Unity non-Pro
        SubShader {
            Blend DstColor Zero
            Pass {
                Name "BASE"
                SetTexture [_MainTex] { combine texture }
            }
        }
    }
}
My intuition is that it has to do with the way that _GrabTexture is captured, but I'm not entirely sure. I'd appreciate any advice. Thanks!
No simple answer for this.
You cannot think about refraction without thinking about the context in some way, so let's see:
Basically, it's not easy to define when an object is "behind" another one. There are different ways even to measure a point's distance to the camera, let alone to account for the whole geometry. There are many strange situations where geometry intersects, and centers and bounds can be anywhere.
Refraction is usually easy to think about in raytracing algorithms (you just march a ray and calculate how it bounces/refracts to get the colors). But here in raster graphics (used for 99% of real-time graphics), the objects are rendered as a whole, and in turns.
What is going on with that image is that the background and ball are rendered first, and the glass later. The glass doesn't "refract" anything, it just draws itself as a distortion of whatever was written in the render buffer before.
"Before" is key here. You don't get "behinds" in raster graphics, everything is done by being conscious of rendering order. Let's see how some refractions are created:
Manually set render queue tags for the shaders, so you know at what point in the pipeline they are drawn.
Manually set each material's render queue.
Create a script that walks the scene every frame, calculates what should be drawn before or after the glass (by position or any method you want), and sets up the render queues on the materials accordingly.
Create a script that renders the scene while filtering out (through various methods) the objects that shouldn't be refracted, and use that as the texture to refract (depending on the complexity of the scene, this is sometimes necessary).
These are just some options off the top of my head; everything depends on your scene.
My advice:
Select the ball's material.
Right-click on the Inspector window and tick "Debug" mode.
Set the Custom Render Queue to 2200 (after the regular geometry is drawn).
Select the glass's material.
Set its Custom Render Queue to 2100 (after most geometry, but before the ball).
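The same overrides can also be set from a script instead of the Debug inspector. A minimal sketch (the two material variables stand for however you reference the materials):

// Opaque geometry's default queue is 2000; higher values are drawn later.
glassMaterial.renderQueue = 2100; // drawn after most geometry, before the ball
ballMaterial.renderQueue = 2200;  // drawn after the glass, so the GrabPass never sees it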
I'm attempting to build a website made in Unity to WebGL using the Unity 5 beta.
A custom shader I wrote (or, more accurately, edited from an existing one) no longer works in Unity 5.
Here's what the shader is supposed to do: create a metaball effect where the alpha ramps up in a circular curve.
The shader turns this...
into this... (via a render texture)
Here's the whole thing:
//Water Metaball Shader effect by Rodrigo Fernandez Diaz-2013
//Visit http://codeartist.info/ for more!!
Shader "Custom/Metaballs" {
    Properties {
        _MyColor ("Some Color", Color) = (1,1,1,1)
        _MainTex ("Texture", 2D) = "white" { }
        _botmcut ("bottom cutoff", Range(0,1)) = 0.1
        _topcut ("top cutoff", Range(0,4)) = 0.8
        _constant ("curvature constant", Range(0,5)) = 1
    }
    SubShader {
        Tags { "Queue" = "Transparent" }
        Pass {
            Blend SrcAlpha OneMinusSrcAlpha
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 _MyColor;
            float4 _Color;
            sampler2D _MainTex;
            float _botmcut, _topcut, _constant;

            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            float4 _MainTex_ST;

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX (v.texcoord, _MainTex);
                return o;
            }

            half4 frag (v2f i) : COLOR {
                half4 texcol, finalColor;
                texcol = tex2D (_MainTex, i.uv);
                //finalColor = _Color * texcol;
                finalColor = _MyColor;
                if (texcol.a < _botmcut)
                {
                    finalColor.a = 0;
                }
                else if (texcol.a > _topcut)
                {
                    finalColor.a = 0;
                }
                else
                {
                    float r = _topcut - _botmcut;
                    float xpos = _topcut - texcol.a;
                    finalColor.a = 1 - (_botmcut + sqrt((xpos*xpos) - (r*r))) / _constant;
                }
                return finalColor;
            }
            ENDCG
        }
    }
    Fallback "VertexLit"
}
The problem I'm having in Unity 5 is that the resulting texture is blank, i.e. 0 alpha.
The bit that seems to be causing the problem is this one:
else
{
    float r = _topcut - _botmcut;
    float xpos = _topcut - texcol.a;
    finalColor.a = 1 - (_botmcut + sqrt((xpos*xpos) - (r*r))) / _constant;
}
If I comment out the last line ("finalColor.a = ...") then I see something.
This is the line that normally creates that circular alpha curve, but in Unity 5 it seems to always resolve to 0. Has there been some API change? The math should work out identically to how it did in Unity 4.
P.S. I don't know much about shaders!
A few things that I normally do when tracking down shader issues.
Option 1
Try using PIX or some other standard program to debug the shader. You just need to capture the frame, right-click on the pixel and hit debug. I'd pay close attention to what each value is, and make sure none are set to 0 that shouldn't be. Also verify in this tool that the right textures are being used.
Option 2
If you set finalColor.a to 0.5, does this do anything? If it does, you know the issue is one of your variables being 0. Should _constant even allow a value of 0? I think that range should honestly be from >0 to 5. Also verify you haven't overridden any of the constants or variables on the material; make sure they are all still set to their defaults. You might even want to hard-set them in the shader to see if that fixes the problem.
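For example, the quickest form of that test is a one-line change in the else branch (my own illustration of the suggestion above):

// Debug: temporarily bypass the curve. If the sprite now renders at half
// opacity, the problem is in the curve math or its inputs, not the pipeline.
finalColor.a = 0.5;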
Finally, solving shader problems is not easy, but the fact that it worked in Unity 4 and doesn't in 5 tells me that you are probably just resolving something to 0, so I would check that first.
I have no idea why, but changing this line:
finalColor.a = 1 - (_botmcut + sqrt((xpos*xpos) - (r*r))) / _constant;
to this:
finalColor.a = 1 - (_botmcut + sqrt((r*r) - (xpos*xpos))) / _constant;
worked.
It doesn't make sense!
(A likely explanation: in that branch texcol.a lies between _botmcut and _topcut, so xpos = _topcut - texcol.a is always smaller than r = _topcut - _botmcut. That makes (xpos*xpos) - (r*r) negative, and the square root of a negative number is NaN, which Unity 5 apparently ends up rendering as 0 alpha. With the operands swapped, the argument is always non-negative.)
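A defensive variant that guards against a negative argument explicitly (my own suggestion, not part of the original answer) would be:

finalColor.a = 1 - (_botmcut + sqrt(max(0.0, (r*r) - (xpos*xpos)))) / _constant;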