How to replace colors with textures via shaders in Unity 3D

I have a problem. I tried to search but I can't find what I want.
I have a texture. This texture has blue, green and black colors. They are masks; this is actually a face texture. I want to replace them, so that, for example, the blue color is replaced with my eye texture in Unity, and the green color with a face texture. How can I write this shader? I searched, but I only found color-changing shaders :( Thanks.

The way I interpret your question is that you want a shader where one texture serves as a mask, blending between three other textures. I'm assuming that this is for character customization, to stitch different pieces of a face together.
In the fragment (or surf) function, sample your 3 textures and the mask:
fixed4 face = tex2D(_FaceTex, i.uv);   // shown where the mask is green
fixed4 eyes = tex2D(_EyeTex, i.uv);    // shown where the mask is blue
fixed4 mouth = tex2D(_MouthTex, i.uv); // shown where the mask is black (no mask value)
fixed4 mask = tex2D(_MaskTex, i.uv);
Then, you need to blend them together using the mask. Let's treat black in the mask as selecting the background texture, and then interpolate in the other textures.
fixed4 col = lerp(mouth, eyes, mask.b);
Then we can interpolate between the resulting color and our third value:
col = lerp(col, face, mask.g);
You could repeat this once again with the red channel, for a fourth texture. Of course, this assumes that you use pure red, green, or blue in the mask. There are ways to use a more specific color as a key too; for instance, you can use the distance between the mask color and some reference color:
fixed4 eyeMaskColor = fixed4(0.5, 0.5, 1, 1);
half t = 1 - saturate(length(mask - eyeMaskColor));
In this case, t is the lerp factor you use to blend in the texture. The saturate function clamps the value to the range [0, 1]. If the mask color is the same as eyeMaskColor, then the length of the vector between them is 0 and the expression evaluates to 1.
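Putting the pieces together, a minimal fragment function could look like this (a sketch; the property names _MaskTex, _FaceTex, _EyeTex and _MouthTex are assumptions carried over from the snippets above):
fixed4 frag (v2f i) : SV_Target
{
    // sample the mask and the three source textures
    fixed4 mask  = tex2D(_MaskTex,  i.uv);
    fixed4 face  = tex2D(_FaceTex,  i.uv);
    fixed4 eyes  = tex2D(_EyeTex,   i.uv);
    fixed4 mouth = tex2D(_MouthTex, i.uv);
    // black regions fall through to the mouth texture; the blue and
    // green channels of the mask blend in the eyes and face textures
    fixed4 col = lerp(mouth, eyes, mask.b);
    col = lerp(col, face, mask.g);
    return col;
}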

Related

Problem with surface shader for clipping with a plane in Augmented Reality

I want to clip a 3D model with a plane in Unity, found this great tutorial for it and got it to work easily in the usual Unity environment. The tutorial uses a surface shader to clip all parts above a plane and only show the parts underneath it, so that you get the impression of cutting open a 3D model (see the GIFs in the linked tutorial). Code of the surface shader is this:
Shader "Clippingplane" {
Properties {
_Color ("Tint", Color) = (0, 0, 0, 1)
_MainTex ("Albedo (RGB)", 2D) = "white" {}
_Smoothness ("Smoothness", Range(0, 1)) = 0
_Metallic ("Metalness", Range(0, 1)) = 0
[HDR] _Emission ("Emission", color) = (0,0,0)
[HDR]_CutoffColor("Cutoff Color", Color) = (1,0,0,0)
}
SubShader {
Tags{ "RenderType"="Opaque" "Queue"="Geometry"}
// render faces regardless if they point towards the camera or away from it
Cull Off
CGPROGRAM
#pragma surface surf Standard fullforwardshadows
#pragma target 3.0
sampler2D _MainTex;
fixed4 _Color;
half _Smoothness;
half _Metallic;
half3 _Emission;
float4 _Plane;
float4 _CutoffColor;
struct Input {
float2 uv_MainTex;
float3 worldPos;
float facing : VFACE;
};
void surf (Input i, inout SurfaceOutputStandard o) {
//calculate signed distance to plane
float distance = dot(i.worldPos, _Plane.xyz);
distance = distance + _Plane.w;
//discard surface above plane
clip(-distance);
float facing = i.facing * 0.5 + 0.5; //convert facing from -1/1 to 0/1 for linear interpolation
//normal color stuff
fixed4 col = tex2D(_MainTex, i.uv_MainTex);
col *= _Color;
o.Albedo = col.rgb * facing;
o.Metallic = _Metallic * facing;
o.Smoothness = _Smoothness * facing;
o.Emission = lerp(_CutoffColor, _Emission, facing); // lerp = linear interpolation
}
ENDCG
}
FallBack "Standard"
}
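For context, the _Plane vector encodes the plane equation dot(p, normal) + w = 0, so it is typically fed to the material from a script. A minimal sketch of such a driver (the class and field names are assumptions, not part of the tutorial):
using UnityEngine;

public class ClippingPlaneDriver : MonoBehaviour
{
    public Material clippedMaterial; // material using the shader above
    public Transform planeTransform; // its up vector serves as the plane normal

    void Update()
    {
        Vector3 n = planeTransform.up;
        // plane through the transform's position: dot(p, n) + w = 0
        float w = -Vector3.Dot(planeTransform.position, n);
        clippedMaterial.SetVector("_Plane", new Vector4(n.x, n.y, n.z, w));
    }
}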
Now I want to bring this whole interaction of moving a clipping plane through a 3D model to Augmented Reality; I use ARFoundation with ARCore for that.
For some reason, the shader no longer works as expected in AR. The "inside color" (red) covers the whole model, and not only the part where the model is cut open. It seems like the shader can't differentiate between outside and inside anymore. The clipping part works, however.
Screenshot of the whole 3D model showing in red
I played around a bit, but only got it to show the correct colors WITHOUT the clipping part working. Especially the part with the facing variable and its conversion seems to be what's messing with the result. I don't really know much about shaders, so I'm wondering if anyone could point me in the right direction as to what is happening with the normals here?
The surface shader from the tutorial works fine in AR when leaving out the "Show the inside" part.
It's super weird that the shader works in the usual Unity environment and not in AR. Any help appreciated!
Playing around a bit more brought me to my solution; I might not have tested enough before... It seems like AR already provides VFACE in a range from 0 to 1, so converting it made things wrong.
I simply removed the conversion part which left me with only:
float facing = i.facing;
That seems to do the job! Hope this helps anyone trying to clip stuff in AR with a surface shader.
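If the same shader has to run both in and outside of AR, one option (an untested sketch based on the observation above) is to derive the 0/1 value with a comparison instead of assuming a fixed input range:
// front faces report a positive VFACE value in both conventions,
// whether the platform delivers -1/1 or 0/1
float facing = i.facing > 0 ? 1.0 : 0.0;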

Metal alpha blending to create a smooth line

I am trying to implement the method of drawing a smooth line through a large number of 3D points shown here: https://ahmetkermen.com/simple-smooth-line-drawing-with-opengl.
I have a small quad at each point, with a billboarded modelview matrix, onto which I render this image:
My blending options are defined like this:
// Allow alpha blending
pipelineDescriptor.colorAttachments[0].isBlendingEnabled = true
pipelineDescriptor.colorAttachments[0].rgbBlendOperation = .add
pipelineDescriptor.colorAttachments[0].alphaBlendOperation = .add
pipelineDescriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
pipelineDescriptor.colorAttachments[0].sourceAlphaBlendFactor = .oneMinusSourceAlpha
pipelineDescriptor.colorAttachments[0].destinationRGBBlendFactor = .sourceAlpha
pipelineDescriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha
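For comparison, a conventional straight-alpha configuration would look like the sketch below; note how it differs from the factors above, particularly in the destination RGB factor (this is a reference point, not necessarily the whole fix):
// out.rgb = src.a * src.rgb + (1 - src.a) * dst.rgb
pipelineDescriptor.colorAttachments[0].isBlendingEnabled = true
pipelineDescriptor.colorAttachments[0].rgbBlendOperation = .add
pipelineDescriptor.colorAttachments[0].alphaBlendOperation = .add
pipelineDescriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
pipelineDescriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
// out.a = src.a + (1 - src.a) * dst.a
pipelineDescriptor.colorAttachments[0].sourceAlphaBlendFactor = .one
pipelineDescriptor.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha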
I am seeing that the alpha blending works with the rest of the scene as shown here:
But the blending does not work between the quads of the line themselves. Some quads show their sharp corners 'over' the next quad in the line:
Perhaps it is a z-order issue; however, each point does have a unique position.
My fragment shader only samples the texture and sets the color:
fragment float4 frag_main(VertexOut fragmentIn [[stage_in]],
                          constant FragUniforms &uniforms [[buffer(0)]],
                          texture2d<float, access::sample> baseColorTexture [[texture(0)]],
                          sampler baseColorSampler [[sampler(0)]])
{
    // Sample the image
    float3 baseColor = baseColorTexture.sample(baseColorSampler, fragmentIn.texCoords).rgb;
    return float4(baseColor, 1.0);
}
What am I doing wrong and how can I get the blending to work so that the quads appear as a continuous and smooth line?

Is there a way to change the color of objects per unit depth as seen through the camera?

I am trying to change the colours of the objects so that they appear red when near the screen and violet for the object/point furthest from the screen.
I have tried to implement the fragment part of the shader in Unity3D, but it is not working.
fixed4 frag(v2f i) : SV_TARGET {
    // get depth from depth texture
    float depth = tex2D(_CameraDepthTexture, i.uv).r;
    // linear depth between camera and far clipping plane
    depth = Linear01Depth(depth);
    // depth as distance from camera in units
    depth = depth * _ProjectionParams.z;
    // get source color
    fixed4 source = tex2D(_MainTex, i.uv);
    if (depth >= _ProjectionParams.z)
        return RGB(violet);
    else
        return RGB_value_of_object_according_to_the_depth;
}
Here I can get the colour of the source, but I can't work out how to change the colour of the object to red if it is nearest to the camera and violet if it is furthest.
To be precise, I can't fix the code for the if/else condition (the two return statements above are placeholders).
Thanks in advance for any help I can get.
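One way to fill in the if/else (a sketch, not a tested fix; the red and violet endpoint colors and the simple linear ramp are assumptions) is to normalize the depth to [0, 1] and interpolate between the two colors:
fixed4 frag(v2f i) : SV_TARGET {
    // linear 0-1 depth between camera and far clipping plane
    float depth = tex2D(_CameraDepthTexture, i.uv).r;
    depth = Linear01Depth(depth);
    // interpolate from red (near) to violet (far)
    fixed4 nearColor = fixed4(1, 0, 0, 1);   // red
    fixed4 farColor  = fixed4(0.5, 0, 1, 1); // violet
    return lerp(nearColor, farColor, saturate(depth));
}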
Code Credits: the code used from Ronja's exercise tutorials

How to apply fresnel effect only on one part of the sphere in SceneKit?

I am applying a fresnel effect to an SCNSphere to achieve a beautiful glow of the Earth.
However, because I am making a model of the Earth, I want the illuminated part of the sphere to have the fresnel effect and the dark side to be without it.
Maybe that effect can be achieved if the dark side of the sphere is transparent? Or maybe I can just use the _surface.fresnel shader modifier property of SceneKit?
For this you can use the emission instance property, which defines the color emitted by each point on a surface.
You can use an emissive map texture to simulate parts of a surface that glow with their own light. SceneKit does not treat the material as a light source—rather, the emission property determines colors for a material independent of lighting. (To create an object that appears to glow, you may wish to combine a geometry with an emissive map and additional SCNLight objects added to the scene.)
var emission: SCNMaterialProperty { get }
You can find the emission property in any of SceneKit's available lighting models: Constant, Lambert, Blinn, Phong or Physically Based.
let earth = SCNNode(geometry: SCNSphere(radius: 5))
earth.geometry?.firstMaterial?.emission.contents = UIImage(named: "emissionMap.png")
And, of course, if you need to create a square emissionMap.png texture (a.k.a. UV map), you can make it in Autodesk Maya, Autodesk 3ds Max, Maxon Cinema 4D, etc. Such maps are typically 512x512, 1024x1024, 2048x2048, etc.
I am no expert in SceneKit and shaders, and maybe this solution is in no way optimal, but one way to do it is to just use shader modifiers:
First, I take the diffuse (= direct, in SceneKit) lighting hitting the sphere and convert the red, green and blue values into luminance, using the standard Rec. 709 coefficients:
vec3 light = _lightingContribution.diffuse;
float lum = 0.2126*light.r + 0.7152*light.g + 0.0722*light.b;
Then I calculate a number based on the angle of the surface from the point of view of the camera, using the normal from the _surface structure:
float factorOpacity = ( _surface.normal.x * _surface.normal.x + _surface.normal.y * _surface.normal.y );
I set up a fresnel exponent which controls the thickness of the edge halo. Lower values mean the fresnel effect is more visible around the center and higher values mean that the fresnel effect appears more around the edges.
float fresnelExponent = 1.0;
I set up the red, green and blue values (0.0-1.0) for my atmosphere color.
float red = 0.1;
float green = 0.2;
float blue = 0.4;
I set up the nadir (= looking straight down) transparency that I want. 0.0 means fully transparent and 1.0 means fully opaque.
float nadirTransparency = 0.2;
I set the edge transparency.
float edgeTransparency = 1.0;
I calculate the final fresnel factor taking into account the fresnel exponent.
factorOpacity = pow(factorOpacity,fresnelExponent);
Finally, I compute the final color of the geometry and multiply it by the desired transparency, depending on the input parameters and the fresnel factor.
_output.color = vec4(red,green,blue,1.0) * min(lum,1.0) * (nadirTransparency + (edgeTransparency - nadirTransparency) * factorOpacity);
The Swift implementation looks like this:
let sphere = SCNSphere(radius: radiusValue)
self.geometry = sphere
let shaderModifier =
"""
#pragma transparent
vec3 light = _lightingContribution.diffuse;
float lum = 0.2126*light.r + 0.7152*light.g + 0.0722*light.b;
float factorOpacity = ( _surface.normal.x * _surface.normal.x + _surface.normal.y * _surface.normal.y );
float fresnelExponent = 1.0;
float red = 0.1;
float green = 0.2;
float blue = 0.4;
float nadirTransparency = 0.2;
float edgeTransparency = 1.0;
factorOpacity = pow(factorOpacity,fresnelExponent);
_output.color = vec4(red,green,blue,1.0) * min(lum,1.0) * (nadirTransparency + (edgeTransparency - nadirTransparency) * factorOpacity);
"""
self.geometry?.firstMaterial?.shaderModifiers = [.fragment: shaderModifier]
Here is how it looks (cloud colors are from another shader):
atmosphere shader scenekit
And here is how it looks when setting the nadir transparency to 0, the edge transparency to 1, a bright blue color and a high fresnel exponent:
amplified fresnel effect

OpenGL ES transparency not working, instead things just blend with the background

So I have a simple simulation set up on my phone. The goal is to have circles of red, white, and blue appear on the screen with various transparencies. I have most of that working, except for one thing: while transparency sort of works, blending only happens with the black background. As a result, the circle in the center appears dark red instead of showing the white circles under it. What am I doing wrong?
Note I am working with an orthographic 2D projection matrix. All of the objects' z positions are the same, and they are rendered in a specific order.
Here is how I set it so transparency works:
glEnable(GLenum(GL_DEPTH_TEST))
glEnable(GLenum(GL_POINT_SIZE));
glEnable(GLenum(GL_BLEND))
glBlendFunc(GLenum(GL_SRC_ALPHA), GLenum(GL_ONE_MINUS_SRC_ALPHA))
glEnable(GLenum(GL_POINT_SMOOTH))
//Note: some of these things aren't compatible with OpenGL ES, but they can't hurt, right?
Here is the fragment shader:
precision mediump float;
varying vec4 outColor;
varying vec3 center;
varying float o_width;
varying float o_height;
varying float o_pointSize;
void main()
{
    vec4 fc = gl_FragCoord;
    vec3 fp = vec3(fc);
    vec2 circCoord = 2.0 * gl_PointCoord - 1.0;
    if (dot(circCoord, circCoord) > 1.0) {
        discard;
    }
    gl_FragColor = outColor;
}
Here is how I pass each circle to the shader:
func drawParticle(part: Particle, color_loc: GLint, size_loc: GLint)
{
    //print("Drawing: ", part)
    let p = part.position
    let c = part.color
    glUniform4f(color_loc, GLfloat(c.h), GLfloat(c.s), GLfloat(c.v), GLfloat(c.a))
    glUniform1f(size_loc, GLfloat(part.size))
    glVertexAttribPointer(0, GLint(3), GLenum(GL_FLOAT), GLboolean(GL_FALSE), 0, [p.x, p.y, p.z])
    glEnableVertexAttribArray(0)
    glDrawArrays(GLenum(GL_POINTS), 0, GLint(1))
}
Here is how I set it so transparency works:
glEnable(GLenum(GL_DEPTH_TEST))
glEnable(GLenum(GL_POINT_SIZE));
glEnable(GLenum(GL_BLEND))
glBlendFunc(GLenum(GL_SRC_ALPHA), GLenum(GL_ONE_MINUS_SRC_ALPHA))
glEnable(GLenum(GL_POINT_SMOOTH))
And that's not how transparency works. OpenGL is not a scene graph; it just draws geometry in the order you specify it. If the first thing you draw is the red circles, they will blend with the background. Once things get drawn that are "behind" the red circles, the "occluded" parts will simply be discarded due to the depth test. There is no way for OpenGL (or any other depth-test-based algorithm) to automatically sort the different depth layers and blend them appropriately.
What you're trying to do there is order-independent transparency, a problem still under research with no generally efficient solution.
For what you want to achieve you'll have to:
sort your geometry far to near and draw it in that order
disable the depth test while rendering (a sketch follows below)
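A minimal sketch of that drawing order, reusing drawParticle from the question (the particles array, cameraZ, and the color_loc/size_loc variables in the caller's scope are assumptions):
// depth test off: earlier (farther) circles stay visible underneath
glDisable(GLenum(GL_DEPTH_TEST))
glEnable(GLenum(GL_BLEND))
glBlendFunc(GLenum(GL_SRC_ALPHA), GLenum(GL_ONE_MINUS_SRC_ALPHA))
// draw far-to-near; with identical z values any consistent order works
let sorted = particles.sorted { abs($0.position.z - cameraZ) > abs($1.position.z - cameraZ) }
for part in sorted {
    drawParticle(part: part, color_loc: color_loc, size_loc: size_loc)
}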