Render only some parts of the texture in Unity

As the title says, I want to render only some parts of a texture. For example, I have a 1024×1024 texture and I want to render only the square area between pixels (0,0) and (50,50) and the area between (600,600) and (1024,1024).
Is something like that possible?
Maybe you can help me with the logical steps I need to take, because I don't really know where to start.
I think I need a shader with two texture slots and a script that renders only some parts ^^
I think it has something to do with this: https://answers.unity.com/questions/529814/how-to-have-2-different-objects-at-the-same-place.html

The following basic surface shader "removes" all pixels inside the two squares from the question, (0,0)-(50,50) and (600,600)-(1024,1024) (this is called clipping); invert the test to render only those areas instead:
Shader "Custom/ClippedTexture" {
Properties {
_MainTex ("Texture", 2D) = "white" {}
}
SubShader {
Tags { "RenderType" = "Opaque" }
CGPROGRAM
#pragma surface surf Lambert
struct Input {
float2 uv_MainTex;
};
sampler2D _MainTex;
float4 _MainTex_TexelSize;
void surf (Input IN, inout SurfaceOutput o) {
float u = IN.uv_MainTex.x * _MainTex_TexelSize.x;
float v = IN.uv_MainTex.y * _MainTex_TexelSize.y;
if((u < 51 && v < 51) || (u > 599 && v > 599)) {
clip(-1); // skip pixel
} else {
o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}
}
ENDCG
}
Fallback "Diffuse"
}
You can save this shader into the shader directory of your project and create a material that uses it. Hope this helps.
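For the behaviour the question actually asks for (keeping only those two squares), here is a minimal sketch with the test inverted and the pixel bounds exposed as material properties; the property names _MinXY and _MaxXY are illustrative, not standard:
Shader "Custom/ClippedTextureParam" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _MinXY ("Keep up to (pixels)", Vector) = (50, 50, 0, 0)
        _MaxXY ("Keep from (pixels)", Vector) = (600, 600, 0, 0)
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert
        sampler2D _MainTex;
        float4 _MainTex_TexelSize; // (1/width, 1/height, width, height)
        float4 _MinXY;
        float4 _MaxXY;
        struct Input {
            float2 uv_MainTex;
        };
        void surf (Input IN, inout SurfaceOutput o) {
            // pixel coordinates of the current texel
            float2 px = IN.uv_MainTex * _MainTex_TexelSize.zw;
            // keep the lower-left and upper-right squares, clip everything else
            bool keepLow  = px.x <= _MinXY.x && px.y <= _MinXY.y;
            bool keepHigh = px.x >= _MaxXY.x && px.y >= _MaxXY.y;
            if (!(keepLow || keepHigh)) {
                clip(-1);
            }
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}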


Unity Surface Shader Texture Moving When Object Moves

In Unity I have a modified cube surface shader that works properly while the associated object is stationary, but when the object moves, the texture moves across the object. I would like the texture not to move when the object moves, or ideally even when the object's vertices are moved around at runtime. This is a gif captured at runtime showing what happens when the object is stationary vs. moving:
GiphyLink
This is the shader code I've been working with (this is the code used in the gif that I linked above):
"CShader" {
Properties{
_CubeMap("Cube Map", Cube) = "white" {}
_CubeMap2("Cube Map 2", Cube) = "white" {}
_Color("Color", Color) = (1,1,1,1)
_Color3("Color 1", Color) = (1,1,1,1)
_Color4("Color 2", Color) = (1,1,1,1)
_Blend("Texture Blend", Range(0,1)) = 0.0
_Glossiness("Smoothness", Range(0,1)) = 0.0
_Metallic("Metallic", Range(0,1)) = 0.0
}
SubShader{
Tags { "RenderType" = "Fade" }
CGPROGRAM
#pragma target 4.5
#pragma surface surf Standard alpha:fade vertex:vert
struct Input {
float2 uv_CubeMap;
float3 customColor;
};
fixed4 _Color3;
fixed4 _Color4;
half _Blend;
half _Glossiness;
half _Metallic;
samplerCUBE _CubeMap;
samplerCUBE _CubeMap2;
void vert(inout appdata_full v, out Input oo) {
UNITY_INITIALIZE_OUTPUT(Input, oo);
oo.customColor = v.vertex.xyz;
}
void surf(Input INN, inout SurfaceOutputStandard oo) {
fixed4 d = texCUBE(_CubeMap2, INN.customColor) * _Color3;
d = lerp(d, texCUBE(_CubeMap, INN.customColor) * _Color4, 1 / (1 + exp(100 * (-(INN.uv_CubeMap.y)))));
oo.Albedo = d.rgb;
oo.Metallic = _Metallic;
oo.Smoothness = _Glossiness;
oo.Alpha = d.a;
}
ENDCG
}
Fallback "Diffuse"
I've tried many things, including setting vertices in a C# script and passing them to the shader, but nothing has worked so far, likely because I've coded it wrong or used the wrong procedure.
Any help would be greatly appreciated. Thank you.

How to tile a texture with the MixedRealityToolkit shader and Round Corners?

I'm trying to make an animated material with the MRTK in Unity. The goal is an effect that starts as a circle and propagates the texture to the rest of the plane.
For now I use the MixedRealityToolkit standard shader with the round-corner option, and with an animation I got that effect.
My problem is that I can't tile the texture to reduce its size and repeat it. Also, on non-square objects the texture is stretched, which doesn't look good.
If I try to change the tiling setting, the texture is not repeated (the texture is in "Repeat" mode; tiling works when I untick the Round Corners option).
(If I display the Unity selection outline, I get the repeated texture, but it's not displayed otherwise...)
Does anyone have a good idea how to do that with the MRTK shaders, or how to write a specific shader for this effect?
I found a solution by writing my own shader:
Shader "Custom/testShader"
{
Properties
{
_Color ("Color", Color) = (1,1,1,1)
_MainTex ("Albedo (RGB)", 2D) = "white" {}
_ForegroundMask("Foreground Mask", 2D) = "white" {}
_RadiusSize("Radius size", Range(0,2)) = 0
_BorderWidth("Smooth Edge width", Range(0,1)) = 0.5
_Glossiness ("Smoothness", Range(0,1)) = 0.5
_Metallic ("Metallic", Range(0,1)) = 0.0
}
SubShader
{
Tags {"Queue"="Transparent" "RenderType"="Transparent" }
LOD 200
ZWrite Off
Blend SrcAlpha OneMinusSrcAlpha
CGPROGRAM
// Physically based Standard lighting model, and enable shadows on all light types
#pragma surface surf Standard fullforwardshadows alpha:fade
// Use shader model 3.0 target, to get nicer looking lighting
#pragma target 3.0
sampler2D _MainTex;
sampler2D _ForegroundMask;
struct Input
{
float2 uv_MainTex;
float2 uv_ForegroundMask;
};
half _Glossiness;
half _Metallic;
fixed4 _Color;
float _RadiusSize;
float _BorderWidth;
// Add instancing support for this shader. You need to check 'Enable Instancing' on materials that use the shader.
// See https://docs.unity3d.com/Manual/GPUInstancing.html for more information about instancing.
// #pragma instancing_options assumeuniformscaling
UNITY_INSTANCING_BUFFER_START(Props)
// put more per-instance properties here
UNITY_INSTANCING_BUFFER_END(Props)
void surf (Input IN, inout SurfaceOutputStandard o)
{
fixed2 dist;
dist.x = IN.uv_ForegroundMask.x - 0.5;
dist.y = IN.uv_ForegroundMask.y - 0.5;
fixed circle= 1.0 - smoothstep(_RadiusSize-(_RadiusSize * _BorderWidth),_RadiusSize+(_RadiusSize * _BorderWidth), dot(dist,dist)*4.0);
fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
o.Albedo = c.rgb;
// Metallic and smoothness come from slider variables
o.Metallic = _Metallic;
o.Smoothness = _Glossiness;
o.Alpha = c.a * circle;
}
ENDCG
}
FallBack "Diffuse"
}
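A note on the tiling itself: in a surface shader, uv_MainTex already includes the material's tiling and offset, so setting the tiling of _MainTex in the material repeats the albedo while the circle keeps using uv_ForegroundMask over the whole quad. If you'd rather control the repetition from the shader, here is a hedged sketch using an illustrative _Tiles property (not a standard name):
// inside Properties:
_Tiles ("Albedo tiles per side", Float) = 4

// alongside the other uniforms:
float _Tiles;

// in surf, replace the albedo lookup; frac() wraps the scaled UVs so the
// texture repeats while the circular mask above stays untouched:
fixed4 c = tex2D(_MainTex, frac(IN.uv_MainTex * _Tiles)) * _Color;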

Custom clipping shader on top of standard shader? (Unity)

I want to draw a horizontal line on an object with shader code (HLSL).
The clipping shader simply takes the distance to a given Y coordinate in the surface shader and checks whether it is greater than a given value.
If so, it discards the pixel. The result is a shader that clips away all pixels that are not on the line.
void surf (Input IN, inout SurfaceOutputStandard o) {
    // Albedo comes from a texture tinted by color
    fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
    float d = abs(_YClip - IN.worldPos.y); // _YClip is in the Properties and can be changed
    if (d > _LineThickness) {
        discard;
    }
    o.Albedo = c.rgb;
}
Can I somehow combine this shader with the standard Unity shader without changing its code?
I plan to have a gizmo shader that renders lines and all kinds of stuff. It would be very practical if I could just tell Unity to render this gizmo shader on top.
I believe you might be able to use or adapt this shader for your purpose.
Image showing the object before the cutoff Y value is reached.
Image showing the transition, where one half is above the cutoff Y value and the other half is below. Note that the pattern it dissolves in depends on a texture pattern you supply yourself, so it should be possible to have a strict cutoff instead of an odd, uneven pattern.
After the object has fully passed the cutoff Y value: what I did in this case is hide a slightly smaller object inside the first object you saw. If you don't have anything inside, the object will just be invisible, or clipped.
Shader "Dissolve/Dissolve"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
_DissolveTexture("Dissolve Texture", 2D) = "white" {}
_DissolveY("Current Y of the dissolve effect", Float) = 0
_DissolveSize("Size of the effect", Float) = 2
_StartingY("Starting point of the effect", Float) = -1 //the number is supposedly in meters. Is compared to the Y coordinate in world space I believe.
}
SubShader
{
Tags { "RenderType"="Opaque" }
LOD 100
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
// make fog work
//#pragma multi_compile_fog
#include "UnityCG.cginc"
struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};
struct v2f
{
float2 uv : TEXCOORD0;
//UNITY_FOG_COORDS(1)
float4 vertex : SV_POSITION;
float3 worldPos : TEXCOORD1;
};
sampler2D _MainTex;
float4 _MainTex_ST;
sampler2D _DissolveTexture;
float _DissolveY;
float _DissolveSize;
float _StartingY;
v2f vert (appdata v) //"The vertex shader"
{
v2f o;
o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = TRANSFORM_TEX(v.uv, _MainTex);
o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
//UNITY_TRANSFER_FOG(o,o.vertex);
return o;
}
fixed4 frag (v2f i) : SV_Target //"For drawing the pixel on top"
{
float transition = _DissolveY - i.worldPos.y; //Cutoff value where world position is taken into account.
clip(_StartingY + (transition + (tex2D(_DissolveTexture, i.uv)) * _DissolveSize)); //Clip = cutoff if above 0.
//My understanding: If StartingY for dissolve effect + transition value and uv mapping of the texture is taken into account, clip off using the _DissolveSize.
//This happens to each individual pixel.
// sample the texture
fixed4 col = tex2D(_MainTex, i.uv);
// apply fog
//UNITY_APPLY_FOG(i.fogCoord, col);
//clip(1 - i.vertex.x % 10); //"A pixel is NOT rendered if clip is below 0."
return col;
}
ENDCG
}
}
}
Here you can see the inspector fields available.
I have a similar shader for the X axis.
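For the strict horizontal line from the question, here is a minimal sketch of a Standard surface shader built around the question's own clip test (the property names _YClip and _LineThickness follow the snippet above); rendering it on top of an existing material, as the gizmo idea suggests, would still need a second material slot or pass:
Shader "Custom/YLine" {
    Properties {
        _Color ("Line Color", Color) = (1,0,0,1)
        _YClip ("Line world Y", Float) = 0
        _LineThickness ("Line thickness", Float) = 0.05
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Standard
        #pragma target 3.0
        fixed4 _Color;
        float _YClip;
        float _LineThickness;
        struct Input {
            float3 worldPos; // world position is provided by the surface shader pipeline
        };
        void surf (Input IN, inout SurfaceOutputStandard o) {
            // keep only pixels within _LineThickness of the target Y value
            clip(_LineThickness - abs(_YClip - IN.worldPos.y));
            o.Albedo = _Color.rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}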

Fluctuate a float value in a Unity shader

I'm trying to fluctuate between two values inside a shader to achieve a glowing effect.
I need it to be done inside the shader itself and not via C# scripting.
I've tried using the _Time value that Unity gives us for shader animation but it isn't working:
Shader "Test shader" {
Properties {
_ColorTint ("Color", Color) = (1,1,1,1)
_MainTex ("Base (RGB)", 2D) = "white" {}
_GlowColor("Glow Color", Color) = (1,0,0,1)
_GlowPower("Glow Power", Float) = 3.0
_UpDown("Shine Emitter Don't Change", Float) = 0
}
SubShader {
Tags {
"RenderType"="Opaque"
}
CGPROGRAM
#pragma surface surf Lambert
struct Input {
float4 color : Color;
float2 uv_MainTex;
float3 viewDir;
float4 _Time;
};
float4 _ColorTint;
sampler2D _MainTex;
float4 _GlowColor;
float _GlowPower;
float _UpDown;
void surf(Input IN, inout SurfaceOutput o) {
if (_UpDown == 0) {
_GlowPower += _Time.y;
}
if (_UpDown == 1) {
_GlowPower -= _Time.y;
}
if (_GlowPower <= 1) {
_UpDown = 0;
}
if (_GlowPower >= 3) {
_UpDown = 1;
}
IN.color = _ColorTint;
o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * IN.color;
half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
o.Emission = _GlowColor.rgb * pow(rim, _GlowPower);
}
ENDCG
}
FallBack "Diffuse"
}
This makes the glow grow to infinity.
What am I doing wrong?
Extending my comment slightly:
You can't use _Time.y on its own here, as it is the elapsed time since the game started, so it only ever increases.
You can use the built-in _SinTime instead, which holds the sine of time at several rates, (sin(t/8), sin(t/4), sin(t/2), sin(t)), so each component oscillates between -1 and 1. Assign it (maybe scaled) to your variable: _GlowPower = C * _SinTime.w
More on built-in shader variables: http://docs.unity3d.com/Manual/SL-UnityShaderVariables.html
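A minimal sketch of the surf function rewritten this way, assuming the glow should keep oscillating between powers 1 and 3 as in the original (the _UpDown property and the branching become unnecessary):
void surf(Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _ColorTint.rgb;
    half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
    // _SinTime.w = sin(t) oscillates in [-1, 1]; remap it to a glow power in [1, 3]
    float glowPower = 2.0 + _SinTime.w;
    o.Emission = _GlowColor.rgb * pow(rim, glowPower);
}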
For a pulsing glow I'd have a script outside the shader and send in a parameter (_GlowPower) calculated in a C# script like this:
glowPow = Mathf.Sin(time);
Then you only need to calculate it once per frame. If you put it in the vertex shader, it runs once per vertex, and in the surface shader once per pixel, which wastes performance.
You can send variables to your shader like this (very handy):
material.SetFloat(propertyName, valueToSend);
So you could send time, strength, glow or whatever you want.
If you really need to do the glow calculation per vertex or per pixel, then use
_glowPow = sin(_Time.y);
inside the shader.

Unity 3D - Shaders with alpha break depth of field

I have a scene where I really need depth of field.
Apparently, Unity's depth of field doesn't work with any of the shaders, built-in or custom, that process alpha.
This happens, for example, with the Transparent/Diffuse shader; Transparent/Cutout works instead.
Here's the simplest custom shader I made that triggers this behaviour:
Shader "Custom/SimpleAlpha" {
Properties {
_MainTex ("Base (RGBA)", 2D) = "white" {}
}
SubShader {
Tags { "RenderType"="Transparent" "Queue"="Transparent" }
//Tags { "RenderType"="Opaque" }
LOD 300
ZWrite Off
CGPROGRAM
#pragma surface surf Lambert alpha
#include "UnityCG.cginc"
sampler2D _MainTex;
struct Input {
float2 uv_MainTex;
};
void surf (Input IN, inout SurfaceOutput o) {
half4 c = tex2D (_MainTex, IN.uv_MainTex);
o.Albedo = c.rgb;
o.Alpha = c.a;
}
ENDCG
}
FallBack "Diffuse"
}
If you try the code in a project you'll notice that every object that uses the shader is blurred by the very same amount, instead of being blurred based on its Z value.
Any help is much appreciated.
Thanks in advance.
I posted the same question on Unity Answers: http://answers.unity3d.com/questions/438556/my-shader-brakes-depth-of-field.html
Since depth of field is a post-processing effect that uses the values stored in the Z-buffer, the following line is the culprit:
ZWrite Off
For transparent objects, Z-buffer writes are usually disabled: transparent surfaces test against the Z-buffer but don't write to it, so they don't occlude what is drawn behind them.
So if you remove that line, you should see depth of field correctly applied to transparent objects. But objects lying behind fully transparent areas will now be blurred using the wrong Z value. As a quick fix, you could try an alpha test such as AlphaTest Greater 0.1.
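Applied to the shader above, a minimal sketch: drop ZWrite Off and discard nearly transparent pixels with clip(), the surface-shader equivalent of an alpha test (the 0.1 threshold is the answer's suggestion, not a required value):
void surf (Input IN, inout SurfaceOutput o) {
    half4 c = tex2D (_MainTex, IN.uv_MainTex);
    // skip nearly fully transparent pixels so they don't write depth values
    clip(c.a - 0.1);
    o.Albedo = c.rgb;
    o.Alpha = c.a;
}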