I am following this tutorial on baking a shader into a texture map (linked here). It works great; I just have some problems applying it to a standard shader.
This is the unwrap shader they made as a vertex/fragment shader (I think these are the parts responsible for rendering the vertices into UV space):
v2f vert (appdata v)
{
v2f o;
v.vertex = float4(v.uv.xy, 0.0, 1.0);
o.vertex = mul(UNITY_MATRIX_P, v.vertex);
//o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = v.uv;
return o;
}
fixed4 frag (v2f IN) : SV_Target
{
//here is where you would put the shader code you want to bake
//for now I will just do a noise
float f = fbm(IN.uv + fbm(5*IN.uv, 4), 4);
fixed4 color = fixed4(f,f,1,1);
return color;
}
I want to do that, but in a surface shader. Can you guys help me?
You have two options, if it's not too late already...
Option 1: Alter the generated code from your surface shader
Option 2: Use vertex modifier and counteract UnityObjectToClipPos(v.vertex);
Unity surface shaders are also just vertex and fragment shaders, like your snippet above. A surface shader is a long collection of different passes and predefined functions that add all kinds of effects like shadows, fog, etc.
When you create a shader as a surface shader, you tell Unity which predefined functions to use with pragmas like "surface surf Standard fullforwardshadows".
You can always look at the generated code: if you click on your compiled shader there is a button for it. There you will see a long shader with vertex and fragment portions. It's a little cluttered with #if/#else blocks, and the code that actually runs on your machine is only a small portion of it. It is possible to declutter it by hand and only change the specific parts that you want to alter, i.e. add a custom vertex shader.
Since this is time-consuming and error-prone, Unity offers a lot of ways to customise the standard shaders.
Option 2:
One of these is the vertex modifier pragma: with vertex:vert you can assign a custom vertex function that is called in the vertex shader to modify the vertex position (see Normal Extrusion here). Unfortunately, Unity inserts a call to UnityObjectToClipPos(v.vertex); directly after your vertex modifier function. So if you do a transformation from object space to UV space, Unity then automatically converts the UV coordinates to clip space as if they were object-space coordinates, resulting in nonsense. What you can do is counteract this with mul(transpose(UNITY_MATRIX_IT_MV), v.vertex);
The resulting multiplication is then: PVM * t(t(inv(MV))) = PVM * inv(MV) = P
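That cancellation can be checked numerically. Here is a quick NumPy sketch (random matrices standing in for the real camera matrices) confirming that transpose(UNITY_MATRIX_IT_MV) is inv(MV) and that the MV part cancels out of the full chain:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the projection, view and model matrices.
P = rng.random((4, 4))
V = rng.random((4, 4))
M = rng.random((4, 4))

MV = V @ M  # Unity's UNITY_MATRIX_MV is view * model

# UNITY_MATRIX_IT_MV is the inverse-transpose of MV;
# transposing it again just gives inv(MV).
IT_MV = np.linalg.inv(MV).T
counteract = IT_MV.T

# UnityObjectToClipPos applies P * V * M, so the full chain is:
full = P @ MV @ counteract

# The MV part cancels, leaving pure P.
assert np.allclose(full, P)
```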
That way you cheat the vertex modifier into acting as an override. Keep in mind though that the position is also used for other operations (shadows etc.). If you want more control and want to skip this unnecessary double multiplication, you can do
Option 1:
You can always go into the generated code of your surface shader and alter the vertex functions there. You can, e.g., remove the call to UnityObjectToClipPos(v.vertex); manually.
Note: One problem I ran into a lot with UV-space manipulation is that the clip space depends on the graphics API in use. Depending on whether it is OpenGL-like or DirectX-like, you might need to set your z coordinate to -1.0 or 0.0 in float4(v.uv.xy, 0.0, 1.0);. This can also change depending on your build target.
Edit:
Here is an example vertex modifier for a surface shader:
#pragma surface surf Standard fullforwardshadows vertex:vert
void vert (inout appdata_full v) {
v.vertex = float4(v.texcoord.xy, 0.0, 1.0);
//or float4(v.texcoord.xy, -1.0, 1.0);
v.vertex = mul(transpose(UNITY_MATRIX_IT_MV), v.vertex);
}
Related
I wish to create an unlit shader for a particle system that emits cube meshes, such that each emitted mesh has a hard black outline around it.
Here is the pass for the outline (in Cg):
struct appdata {
float4 vertex : POSITION;
float3 normal : NORMAL;
};
struct v2f {
float4 pos : POSITION;
float4 color : COLOR;
};
uniform float _Outline;
uniform float4 _OutlineColor;
v2f vert(appdata v) {
v2f o;
v.vertex *= ( 1 + _Outline);
o.pos = UnityObjectToClipPos(v.vertex);
o.color = _OutlineColor;
return o;
}
half4 frag(v2f i) :COLOR { return i.color; }
(And after this is a simple pass to render the unlit geometry of the mesh itself...)
As you can see, we are simply stretching the vertices outward... but from what?
For a single cube mesh, the shader works perfectly:
However, when applied to a particle system emitting cube meshes, the shader breaks down:
My suspicion is that the line v.vertex *= ( 1 + _Outline); stretches the vertices outward from the object center, not the mesh center.
Does anyone have a replacement shader or insight on how to fix this problem?
Thanks,
rbjacob
It turns out that I misconstrued the problem. When accessing the POSITION semantic of the vertices, you are getting the vertices of the emitted particles in world space; therefore, stretching the vertices by multiplying is actually just scaling them away from the world center.
To access the vertices relative to each particle, we must be able to access each particle's mesh center from within the shader. To do this, we enable "Custom Vertex Streams" inside the Renderer module of the particle system and press the + button to add the Center stream.
Now we can access TEXCOORD0 (or whatever is specified to the right of the Center stream in the particle renderer GUI) from the shader to get the mesh center in world space. Then we subtract the mesh center from each vertex's position, scale outward, and add the mesh center back. And voila, each particle has an outline.
Here are the final vert and frag snippets for the outline pass:
struct appdata {
float3 vertex : POSITION;
float4 color : COLOR;
float3 center : TEXCOORD0;
};
struct v2f {
float4 pos : POSITION;
float4 color : COLOR;
float3 center : TEXCOORD0;
};
uniform float _Outline;
uniform float4 _OutlineColor;
v2f vert(appdata v) {
v2f o;
o.center = v.center;
float3 vert = v.vertex - v.center;
vert *= ( 1 + _Outline);
v.vertex = vert + v.center;
o.pos = UnityObjectToClipPos(v.vertex);
o.color = _OutlineColor;
return o;
}
half4 frag(v2f i) :COLOR { return i.color; }
TLDR: Enable vertex streams, add a stream for the particle center, and access this value in the shader to scale individual vertices outward.
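The subtract-scale-add logic in vert can be sanity-checked on the CPU. A minimal plain-Python sketch (the particle center and outline value here are made up for illustration) confirming that distances from the center scale by exactly (1 + _Outline), regardless of where the particle sits in the world:

```python
def outline_vertex(vertex, center, outline):
    """Mirror of the shader's vert logic: scale a vertex away
    from its particle's center by (1 + outline)."""
    v = [vertex[i] - center[i] for i in range(3)]   # to center-relative space
    v = [c * (1 + outline) for c in v]              # stretch outward
    return [v[i] + center[i] for i in range(3)]     # back to world space

def dist(a, b):
    return sum((a[i] - b[i]) ** 2 for i in range(3)) ** 0.5

center = (5.0, 0.0, -2.0)   # hypothetical particle center (world space)
vertex = (5.5, 0.5, -1.5)   # one cube corner of that particle
out = outline_vertex(vertex, center, 0.1)

# The distance from the center grows by exactly the outline factor.
assert abs(dist(out, center) - dist(vertex, center) * 1.1) < 1e-9
```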
My suspicion is that the line v.vertex *= ( 1 + _Outline); stretches the vertices outward from the object center, not the mesh center.
That would be correct. Or mostly correct (particle systems combine all the particles into one runtime mesh and that's what your shader is applied to, not the underlying individual particle mesh, which isn't obvious). Try your outline shader on a non-convex mesh (that is also not a particle): You'll find that the concave part won't have the desired outline, confirming your suspicion.
I wrote this shader a couple of years back because the only shaders I could find that generated outlines were either (a) not free or (b) of the "just scale it bigger" variety. It still has problems (such as getting jagged and weird at large thickness values), but I was never able to resolve them satisfactorily. It uses a geometry pass to turn the source mesh's edges into camera-facing quads, then stencil magic to render only the outline portion.
However I am unsure if that shader will function when applied to particles. I doubt it will without modification, but you're free to give it a shot.
I'm working on a tattoo simulator program. I need to know if there's a way for the decal (tattoo) to wrap around the target mesh, like having a tattoo that goes from one side of, let's say, a leg to the other, or even behind it.
Not at runtime, using a projected decal, no.
What you need here instead is a procedural tattoo map. Think of it as another texture, like a lightmap. You may need a custom shader, but it could possibly be done with the secondary albedo channel of the standard shader.
The tricky part is writing to that texture. I'll outline the basic algorithm, but leave it up to you to implement:
The first thing you need to be able to do is unwrap the mesh's triangles in code. You need to identify which edges are contiguous on the UV map, and which are separate. Next, you need a way to identify the tattoo and the initial transform. First, you'll want to define an origin on the tattoo source texture that it will rotate around. Then you'll want to define a structure that references the source texture, and the UV position (Vector2) / rotation (float) / scale (float) to apply it to in the destination texture.
Once you have the tattoos stored in that format, you can start building the tattoo mask texture for the skin. If your skin UVs have a consistent pixel density this is a lot easier, because you can work primarily in UV space; if not, you'll need to re-project to get the scale for each triangle. Basically, you start with the body triangle that contains the origin and draw onto that triangle normally. From there, you know where each vertex and edge of that triangle lies on the tattoo source texture. So, loop through each neighboring triangle (I recommend a breadth-first recursive method) and continue it from the edge you already know. If all three verts fall outside the source texture's rect, you can stop there. Otherwise, continue with the next triangle's neighbors. Make sure you're using the 3D mesh when calculating neighbors so you don't get stuck at seams.
That algorithm is going to have an edge case you'll need to deal with for when the tattoo wraps all the way around and overlaps itself, but there are a couple different ways you can deal with that.
Once you've written all tattoos to the tattoo texture, just apply it to the skin material and voila! Not only will this move all the calculations out of real-time rendering, but it will let you fully control how your tattoos can be applied.
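A rough Python sketch of that traversal step (the mesh representation and helper names here are hypothetical, purely to illustrate the breadth-first walk over edge-sharing triangles):

```python
from collections import deque

def edge_key(a, b):
    # Undirected edge between two vertex indices.
    return (min(a, b), max(a, b))

def build_adjacency(triangles):
    """Map each triangle index to its edge-sharing neighbours.
    Built from the 3D mesh indices so traversal crosses UV seams."""
    by_edge = {}
    for ti, (a, b, c) in enumerate(triangles):
        for e in (edge_key(a, b), edge_key(b, c), edge_key(c, a)):
            by_edge.setdefault(e, []).append(ti)
    adj = {ti: set() for ti in range(len(triangles))}
    for tris in by_edge.values():
        for t in tris:
            adj[t].update(x for x in tris if x != t)
    return adj

def tattoo_triangles(triangles, seed, tri_in_tattoo_rect):
    """Breadth-first walk outward from the seed triangle, stopping
    along branches whose triangles fall entirely outside the tattoo
    source rect (tri_in_tattoo_rect stands in for your projection test)."""
    adj = build_adjacency(triangles)
    visited, queue = {seed}, deque([seed])
    painted = []
    while queue:
        t = queue.popleft()
        if not tri_in_tattoo_rect(t):
            continue  # all three verts outside the rect: stop this branch
        painted.append(t)
        for n in adj[t]:
            if n not in visited:
                visited.add(n)
                queue.append(n)
    return painted

# Toy strip of four triangles; only the first three overlap the rect.
tris = [(0, 1, 2), (1, 2, 3), (2, 3, 4), (3, 4, 5)]
print(tattoo_triangles(tris, 0, lambda t: t < 3))  # → [0, 1, 2]
```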
You can use a decal projector with Unity's official preview package, Render Pipelines - High Definition.
Here's how I used it to project a "tattoo" onto a bucket. You can apply it to your own model, of course.
(Child the decal projector to the model so that the tattoo follows it.)
The best way to import the Render Pipelines - High Definition package is to use Unity Hub to create a new project, choosing it as a template. If it's an existing project, this official blog post might help you.
Once you successfully set up the package, follow this tutorial and you'll be able to project tattoos onto your models anywhere you want.
I've done something similar with a custom shader; I think it would do what you want. Mine dynamically renders flags based on the rank and type of a unit for an iPad game prototype. Exactly how you'll do it depends a bit on how things are set up in your project, but here's what mine looks like: the first image is the wireframe showing the mesh, and the second is with the shaders turned on, showing them adding the colors and emblem based on rank and unit. I've just included the shader for the top flag, since that has the unit emblem added similar to how you want your tattoo:
Note that you can attach multiple shaders to a particular mesh.
And the emblem is just an image with transparency that is added to the shader and referenced as a texture within the shader:
You can see we also have a picture that has some shadow texture that's used as the background for the banner.
This is my first shader and was written a while ago, so I'm sure it's sub-optimal in all kinds of ways, but it should hopefully be enough to get you started (and it still works in Unity 2018.3.x, though I had to hack in some changes to get it to compile):
Shader "Custom/TroopFlagEmblemShader" {
Properties {
_BackColor ("Background Color", Color) = (0.78, 0.2, 0.2, 1) // scarlet
_MainTex ("Background (RGBA)", 2D) = "" {}
_EmblemTex("Emblem (RGBA)", 2D) = "" {}
_Rank ( "Rank (1-9)", Float ) = 3.0
}
SubShader {
Pass {
CGPROGRAM
#pragma exclude_renderers xbox360 ps3 flash
#pragma target 3.0
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct appdata {
float4 vertex: POSITION;
float4 texcoord: TEXCOORD0;
};
struct v2f {
float4 pos: SV_POSITION;
float2 uv: TEXCOORD0;
};
uniform sampler2D _MainTex;
uniform sampler2D _EmblemTex;
uniform float3 _BackColor;
uniform float _Rank;
v2f vert( appdata v )
{
v2f o;
o.pos = UnityObjectToClipPos( v.vertex );
o.uv = v.texcoord.xy;
return o;
}
float4 frag( v2f IN ) : COLOR
{
float4 outColor;
float4 backTextureColor = tex2D( _MainTex, IN.uv.xy );
float4 emblemTextureColor = tex2D( _EmblemTex, IN.uv.xy );
// not drawing the square at all above rank 5
if ( _Rank >= 6.0 )
discard;
if ( _Rank < 5 ) // 4 and below
{
outColor = float4( (emblemTextureColor.rgb * emblemTextureColor.a) +
(((1.0 - emblemTextureColor.a) * backTextureColor.rgb) * _BackColor.rgb) , 1 );
// float4(_BackColor.rgb, 1 ));
}
else if ( _Rank >= 5.0 ) // but excluded from 6 above
{
// 5 is just solid backcolor combined with background texture
outColor = float4( backTextureColor.rgb * _BackColor.rgb, 1 );
}
return outColor;
}
ENDCG
}}
}
Shaders are a bit maddening to learn how to do, but pretty fun once you get them working - like most programming :)
In my case the overlay texture was the same size/shape as the flag, which makes it a bit easier. I'm thinking you'll need to add some parameters to the shader that indicate where you want the overlay drawn relative to the mesh, and do nothing for vertices/fragments outside your tattoo bounds; just a first thought.
I've programmed a 2D water effect with springs, similar to this one. Now I want to implement it in a vertex shader in Unity. But for wave propagation I need to know the left and right neighbors of the current vertex (to calculate the affecting force) and somehow save the resulting force for the next iteration. I have no idea how to do that.
You should create a texture representation of the offsets you need for vertex manipulation and then use tex2Dlod() in your vertex shader; it is supported in shader model 3 and up.
You then use a fragment shader to generate or update the texture.
Your vertex shader could look something like this:
sampler2D _OffsetMap;
// vertex shader
vert_data vert (appdata_base v) {
vert_data v_out;
float4 vertexPos = v.vertex;
vertexPos += tex2Dlod (_OffsetMap, float4(vertexPos.xz, 0, 0));
v_out.position = UnityObjectToClipPos (vertexPos);
return v_out;
}
You can of course also use the same fetch to manipulate vertex normals.
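The read-one-buffer/write-the-other update described above can be prototyped on the CPU first. A minimal 1-D plain-Python sketch of neighbour-based spring propagation between two ping-pong buffers (the spread and damping constants are made up; the fragment shader version would perform the same per-texel update):

```python
def propagate(heights, velocities, spread=0.2, damping=0.98):
    """One iteration: each sample is pulled toward its left/right
    neighbours, mirroring a fragment shader that reads the previous
    frame's offset texture and writes the next one."""
    n = len(heights)
    new_h = heights[:]          # write buffer (the 'other' texture)
    new_v = velocities[:]
    for i in range(n):
        left = heights[i - 1] if i > 0 else heights[i]
        right = heights[i + 1] if i < n - 1 else heights[i]
        force = spread * (left + right - 2 * heights[i])
        new_v[i] = (velocities[i] + force) * damping
        new_h[i] = heights[i] + new_v[i]
    return new_h, new_v   # becomes the read buffer next frame

h = [0.0] * 9
h[4] = 1.0               # poke the middle of the water surface
v = [0.0] * 9
for _ in range(3):
    h, v = propagate(h, v)

# The disturbance has spread to the neighbours of the poked sample,
# and the propagation is symmetric about it.
assert h[3] != 0.0 and h[5] != 0.0
assert abs(h[3] - h[5]) < 1e-12
```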
I'm currently developing a game where a character can move on a background. The idea is that this character digs into the background. I think it should be done with a shader, but I'm a beginner with them.
I can imagine something like: if the character is close enough to a pixel, the alpha of that pixel becomes 0, and it stays at 0. If the alpha is already 0, you skip the character test.
For now I've tried with the Sprites/Diffuse shader as a base, but I can't find a way to get the position of the pixels. I tried something in the surf function, with no real results. The surf function is supposed to be executed once per pixel, no?
Thanks in advance for your help.
Edit
I've tried some things and finally ended up with a vert/frag shader. As I said, I'm trying to compute the alpha from the distance to the pixel.
For now I can't figure out where my mistake is, but maybe someone else will spot it.
By the way, my sorting layers broke when I added this new shader. I've tried switching off ZTest and ZWrite, but it didn't work, so if you have any ideas (it's not the main problem, though)...
Shader "Unlit/SimpleUnlitTexturedShader"
{
Properties
{
// we have removed support for texture tiling/offset,
// so make them not be displayed in material inspector
[NoScaleOffset] _MainTex("Texture", 2D) = "white" {}
_Position("Position", Vector) = (0,0,0,0)
}
SubShader
{
Pass
{
CGPROGRAM
// use "vert" function as the vertex shader
#pragma vertex vert
// use "frag" function as the pixel (fragment) shader
#pragma fragment frag
// vertex shader inputs
struct appdata
{
float4 vertex : POSITION; // vertex position
float2 uv : TEXCOORD0; // texture coordinate
};
// vertex shader outputs ("vertex to fragment")
struct v2f
{
float2 uv : TEXCOORD0; // texture coordinate
float4 vertex : SV_POSITION; // clip space position
float3 worldpos : TEXCOORD1; // world-space position
};
// vertex shader
v2f vert(appdata v)
{
v2f o;
// transform position to clip space
// (multiply with model*view*projection matrix)
o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
// just pass the texture coordinate
o.uv = v.uv;
o.worldpos = mul(_Object2World, v.vertex).xyz; // world position comes from the object-space vertex, not the clip-space o.vertex
return o;
}
// texture we will sample
sampler2D _MainTex;
float4 _Position; // must match the "_Position" property declared above
// pixel shader; returns low precision ("fixed4" type)
// color ("SV_Target" semantic)
fixed4 frag(v2f i) : SV_Target
{
// sample texture and return it
fixed4 col = tex2D(_MainTex, i.uv);
col.a = step(2, distance(_Position.xyz, i.worldpos));
return col;
}
ENDCG
}
}
}
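For what it's worth, the step/distance alpha rule in the fragment shader above is sound on its own. A quick plain-Python check of that line (Cg's step(edge, x) returns 0 when x < edge, 1 otherwise; the 2-unit radius matches the snippet):

```python
def step(edge, x):
    # Cg/HLSL step(): 0.0 below the edge, 1.0 at or above it.
    return 0.0 if x < edge else 1.0

def dist(a, b):
    return sum((a[i] - b[i]) ** 2 for i in range(3)) ** 0.5

character = (0.0, 0.0, 0.0)   # the digging character's world position
near_pixel = (1.0, 0.5, 0.0)  # inside the 2-unit dig radius
far_pixel = (5.0, 0.0, 0.0)   # outside it

# Pixels near the character become fully transparent (alpha 0);
# everything farther than 2 units stays opaque (alpha 1).
assert step(2.0, dist(character, near_pixel)) == 0.0
assert step(2.0, dist(character, far_pixel)) == 1.0
```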
The effect of "cutting holes" in geometry can be achieved using vertex shaders. This is possible since vertex shaders allow you to determine the 3D world position of a pixel of an object using the _Object2World matrix. If I wasn't on my phone I'd give a full code example, but I'll try to describe it: if you take the vertex position and apply the _Object2World matrix, it will return a 3D world position. Here's a quick example:
float3 worldPos = mul (_Object2World, v.vertex).xyz;
In the fragment portion of the shader, you can then set the transparency based on the distance between worldPos and any other position. I apologize if this is still difficult to follow, I am trying to do this on my phone right now. If anyone wants to elaborate on what I said or give code examples that'd be great. Otherwise, you can study up on vertex shaders here:
http://docs.unity3d.com/Manual/SL-VertexFragmentShaderExamples.html
Hope this helps!
I am trying to write a shader that reads the whole frame, pixel by pixel, and after some calculations re-writes the pixels. I have looked through some code samples but most of them were not relevant. Could you give me some hints on how I can read and write pixels in Unity shader programming?
If you have the Pro version of Unity, you can achieve this with image (postprocessing) effects. All you have to do is to implement the OnRenderImage callback on a component of a camera. Then you call Graphics.Blit with a material which has a shader. The shader receives the screen contents as main texture.
You need texture buffers the size of your frame.
Then you want to render your frame into one of the buffers.
Now you need to write a fragment shader that reads one buffers, and writes to the other.
Then finally you draw the fragment shader output as a flat object that covers the screen.
In shader programming you do not work pixel by pixel; you define a function that is applied to a single pixel at a position, which is a float from 0 to 1 on each axis (although you will only be using two). That fragment shader is then run for lots of pixels in parallel; that's how it does everything so quickly.
I hope that brief explanation is enough to get you started. Unity fragment shaders are written in Cg. Cg is a language which sits halfway between OpenGL's language GLSL and the DirectX language HLSL; since all the high-level languages compile into native instructions on the graphics card, they are all fairly similar. So there are plenty of Cg samples about, and once you can write Cg, you will have no problem reading HLSL and GLSL.
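That per-pixel model can be illustrated in plain Python: a fragment shader is just a pure function of a pixel's normalized coordinate and input sample, mapped independently over every pixel (here inverting a tiny greyscale frame; the GPU simply runs all of these calls in parallel):

```python
def frag(uv, sample):
    """A toy fragment function: invert the sampled brightness.
    uv is the normalized (0..1, 0..1) position of the pixel."""
    u, v = uv
    return 1.0 - sample

WIDTH, HEIGHT = 4, 2
frame = [[0.25 * x for x in range(WIDTH)] for _ in range(HEIGHT)]

# The 'rasterizer': call frag once per pixel with its normalized coords.
output = [
    [frag(((x + 0.5) / WIDTH, (y + 0.5) / HEIGHT), frame[y][x])
     for x in range(WIDTH)]
    for y in range(HEIGHT)
]

assert output[0][0] == 1.0   # brightness 0.0 inverted to 1.0
assert output[0][3] == 0.25  # brightness 0.75 inverted to 0.25
```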
Thank you for your advice; it was really helpful. I finally ended up with the code below for my shader. And now a new problem has come up.
My solution:
To solve my keystone problem, I have adapted the "wearing glasses" idea! That means I have placed a plane in front of the camera, attached the shader below to it, and then parented the plane to the camera. The problem right now is that the shader works very well, but in my VR setup it does not: I have several cameras, and the scene is distorted in one of them (as I want) while the other cameras show normal scenes. Everything is fine until these two scenes intersect; in that case I get a disjoint scene (please forgive me if that is not the correct word). By the way, I thought that instead of using this shader on a plane in front of the camera I should apply it to the camera itself, but my shader does not work when I add it to the camera, although it works perfectly on the plane object. Could you let me know how I can modify this code to be compatible with the camera? I am more than happy to hear your suggestions and ideas besides my solution.
Shader "Custom/she1" {
Properties {
top("Top", Range(0,2)) = 1
bottom("Bottom", Range(0,2)) = 1
}
SubShader {
// Draw ourselves after all opaque geometry
Tags { "Queue" = "Transparent" }
// Grab the screen behind the object into _GrabTexture
GrabPass { }
// Render the object with the texture generated above
Pass {
CGPROGRAM
#pragma debug
#pragma vertex vert
#pragma fragment frag
#pragma target 3.0
sampler2D _GrabTexture : register(s0);
float top;
float bottom;
struct data {
float4 vertex : POSITION;
float3 normal : NORMAL;
};
struct v2f {
float4 position : POSITION;
float4 screenPos : TEXCOORD0;
};
v2f vert(data i){
v2f o;
o.position = mul(UNITY_MATRIX_MVP, i.vertex);
o.screenPos = o.position;
return o;
}
half4 frag( v2f i ) : COLOR
{
float2 screenPos = i.screenPos.xy / i.screenPos.w;
float _half = (top + bottom) * 0.5;
float _diff = (bottom - top) * 0.5;
screenPos.x = screenPos.x * (_half + _diff * screenPos.y);
screenPos.x = (screenPos.x + 1) * 0.5;
screenPos.y = 1-(screenPos.y + 1) * 0.5 ;
half4 sum = half4(0.0h,0.0h,0.0h,0.0h);
sum = tex2D( _GrabTexture, screenPos);
return sum;
}
ENDCG
}
}
Fallback Off
}
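The keystone arithmetic in frag can be checked outside the shader. A small plain-Python mirror of the _half/_diff math shows that top = bottom = 1 reduces to a plain clip-space-to-UV remap, while unequal values make the horizontal sampling scale depend on y (the trapezoid warp):

```python
def keystone(x, y, top, bottom):
    """Mirror of the fragment shader's math: x, y are the
    perspective-divided clip coords in [-1, 1]; returns the UV
    used to sample the grabbed screen texture."""
    half = (top + bottom) * 0.5
    diff = (bottom - top) * 0.5
    x = x * (half + diff * y)     # horizontal scale varies with height
    u = (x + 1) * 0.5             # [-1, 1] -> [0, 1]
    v = 1 - (y + 1) * 0.5         # same remap, flipped vertically
    return u, v

# With top == bottom == 1 there is no distortion, just the remap.
assert keystone(0.5, 0.5, 1.0, 1.0) == (0.75, 0.25)

# With unequal values, the same x samples a different horizontal
# position depending on y, which produces the keystone correction.
u_a, _ = keystone(0.5, 1.0, 1.0, 2.0)
u_b, _ = keystone(0.5, -1.0, 1.0, 2.0)
assert u_a != u_b
```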