How can I get the lighting information from a skybox? - unity3d

I am new to Unity3D shaders and I am writing a custom unlit shader. I know I can use _WorldSpaceLightPos0.xyz and _LightColor0.rgb to get the direction and color of a directional light source.
However, if I have a skybox instead of a light source, how can I get the lighting information? The radiance (Li) now comes from my skybox. How can I get its value and compute Li * my_custom_brdf?
Thanks

The skybox does not have a light position, since it is omnidirectional, so you can't calculate proper directional lighting from it (only ambient). However, if you just want the skybox color in a given direction, there are unity_SpecCube0, unity_SpecCube1, etc.: the reflection probes, sorted by weight. You can use something like this:
inline half3 SurfaceReflection(half3 viewDir, half3 worldNormal, half roughness) {
    half3 worldRefl = reflect(-viewDir, worldNormal);
    // Unity's remap from perceptual roughness to a mip-selection factor
    half r = roughness * (1.7 - 0.7 * roughness);
    float4 reflData = UNITY_SAMPLE_TEXCUBE_LOD(
        unity_SpecCube0, worldRefl, r * 6 // 6 mip steps, as in UNITY_SPECCUBE_LOD_STEPS
    );
    // Probe data may be HDR-encoded, so decode before use
    return DecodeHDR(reflData, unity_SpecCube0_HDR);
}
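A hypothetical call site, assuming your fragment shader has an interpolated world position and normal and a _Roughness property (none of these names come from the snippet above):
half3 viewDir = normalize(_WorldSpaceCameraPos.xyz - i.worldPos);
half3 skyColor = SurfaceReflection(viewDir, normalize(i.worldNormal), _Roughness);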
If your reflection probe is different from your skybox, just define the skybox as a texCUBE uniform in your shader instead and use that. If you don't want to have to set it for each material using this shader, declare it in the CGPROGRAM block as a uniform, but not as a ShaderLab property; it will then be treated as a shader global. You can then set it from script using Shader.SetGlobalTexture. Check out the documentation for this here:
https://docs.unity3d.com/ScriptReference/Shader.SetGlobalTexture.html
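A minimal sketch of that setup (the name _SkyCubemap is made up for illustration):
// In the CGPROGRAM block, with no matching ShaderLab property, so it acts as a shader global:
samplerCUBE _SkyCubemap;
// ...sample it wherever you computed worldRefl:
half3 sky = texCUBE(_SkyCubemap, worldRefl).rgb;
// From a C# script, assign it once for all materials:
// Shader.SetGlobalTexture("_SkyCubemap", mySkyboxCubemap);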

Related

How to remove anti-aliasing artifacts inside the object's material HLSL shader (Unity)

I am trying to create a shader that blends a background texture with the 3D object's texture. The background texture is a render texture from another camera, which I pass to the shader of the foreground object.
The idea is to achieve an effect of various blend modes on the 3D object "in the front".
The fragment shader looks like this:
fixed4 frag(v2f i) : SV_Target
{
    float4 color_from_background = tex2D(_BackTex, float2(i.pos.x / _ScreenParams.x, i.pos.y / _ScreenParams.y)); //A
    fixed4 surfcol = tex2D(_MainTex, i.uv);
    fixed4 c = BlendMode_Multiply(color_from_background, fixed4(surfcol.rgb, surfcol.a * _Transparency));
    return c;
}
It kind of works...
Unfortunately, due to imprecision in the division on the line marked with "//A", I get visible artifacts along the object's edges. On this line I sample the background texture for the blend's input. How can I solve that? The first thing that comes to mind is anti-aliasing, but I don't know how to implement it here. I tried changing the sampling levels and filtering of the textures, but that does not solve the problem.
I attach an image that shows the problem. The top part shows the artifacts when using the shader; the bottom shows the actual object with a non-opaque material.
The solution to this is disabling anti-aliasing globally in the project settings, OR in both the target render texture settings and the camera settings. I only had it off on the target render texture, which was not enough.

Toon shader shadow

I'm currently trying to change my asset style from realistic to low-poly/cartoonish.
For example, I have a toon surface shader
half4 LightingRamp(SurfaceOutput s, half3 lightDir, half atten) {
    half NdotL = dot(s.Normal, lightDir);
    half diff = NdotL * 0.5 + 0.5; // remap from [-1, 1] to [0, 1] for the ramp lookup
    half3 ramp = tex2D(_LightingTex, float2(diff, 0)).rgb;
    half4 c;
    c.rgb = _LightColor0.rgb * atten * ramp * _Color;
    c.a = s.Alpha;
    return c;
}
where _LightingTex is a 2D ramp texture. This works fine for lighting effects on the objects themselves.
As I have multiple objects with this shader in my scene, some of them are casting a shadow onto my wall.
As you can see, the shadow here is not a ramp but a continuous gradient, as it is (probably) produced by Unity's built-in ambient/shadow handling. My question is now: is there an option to create this color-ramp effect on the global shadows as well? Something like this:
Can I do it in a material shader, or is it a post-processing effect?
Thanks
With surface shaders: No, you can't do it in the shader. Actually, I think the best way to get a unified cartoon effect is to use a color grading LUT as a post effect. The great thing about LUTs is that you can create one easily in Photoshop by first applying some cool effects to a regular image until it looks the way you want (such as "Posterize"), and then copying the effect stack onto a LUT texture, like this one. When you use this LUT in Unity, everything will look as it would with your Photoshop filters applied. One small caveat I've noticed, though, is that some standard LUT textures need to be flipped vertically to work with the Post Processing Stack. Here is a nice tutorial on how to create posterized LUTs.
If you want to get the toon-like shadows directly in the shader, it is not any harder than writing a regular forward-rendered vertex/fragment shader, though this by itself requires a bit of knowledge of how these work; I can recommend looking at the standard shader source code, this, or this (somewhat outdated) tutorial. You can find the details of how to add shadow support in my post here. The only thing you need to change is to add a similar color ramp to the shadow mask:
half shadow = SHADOW_ATTENUATION(IN);
shadow = tex2D(_ShadowRamp, float2(shadow, 0)).r; // remap the shadow term through the ramp
For this, you can set the shadow ramp as a global shader variable from script, so you won't have to assign it for each material, for example:
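A minimal sketch of that global setup, reusing the _ShadowRamp name from the snippet above:
// In the CGPROGRAM block, declared without a ShaderLab property so it acts as a global:
sampler2D _ShadowRamp;
// From a C# script, assigned once at startup:
// Shader.SetGlobalTexture("_ShadowRamp", shadowRampTexture);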

Unity Repeat UV Coordinates On Quadtree Shaderforge

As you can see in the image, on larger tiles (n > 1) the texture should repeat according to the current rect size. I don't know how I can achieve this!
FYI, I'm getting the tile texture ID from the alpha value of the vertex color.
Here is the shader I'm using...
[UPDATE]
Thanks for clarifying the UV coordinates; unfortunately, that doesn't answer my question. Take a look at the following picture...
Your shader is fine; it's actually the vertex UVs that are the problem:
So for all rectangles the UV coordinates are as follows: [0, 0] / [0, rect.height] / [rect.width, 0] / [rect.width, rect.height]. So the UVs go beyond 1.
Your shader is designed to support the standard UV space, in which case you should replace rect.width and rect.height with 1.
By using UV coords greater than one, you're effectively asking for texels outside of the specified texture. When used with a texture atlas, that means you're asking for texels outside of the specified tile -- in this case, those happen to be white, and that's what you're seeing in the rendered output.
Tiling with an atlas texture
Updating because I missed an important detail: you want a tiling material.
Usually, UVs interpolate linearly:
For tiling, you essentially want more of a "sawtooth" output:
For a non-atlas texture, you can adjust material scale/wrap settings and call it done. For an atlas texture, it's possible but you'll end up with a shader and/or geometry that aren't quite standard.
The "most standard" solution would be if your larger quads are on a separate mesh from the smaller ones:
Add a float material param named uv_scale or some such
Add a Multiply node that scales incoming UVs by uv_scale
Pass output from that into a Frac node
Pass output from that into the UV Tile node
Pseudocode is roughly: uv = frac(uv * uv_scale)
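For instance, a short HLSL sketch of that node chain (i.uv and _UVScale are placeholder names, not part of the original setup):
float2 uv = frac(i.uv * _UVScale); // scale, then wrap back into the 0..1 range
// feed this uv into the UV Tile node / atlas lookup as before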
If you need all of your quads to be in the same mesh, you end up needing non-standard geometry:
Change your UVs again (going back to rect.width and rect.height)
Add a Frac node before the UV Tile node
This is a simpler shader change, but has the downside that your geometry will no longer be cleanly supported in other shaders.
Thanks rutter!
I've implemented your solution in my shader and now it works perfectly!
So for everyone looking for this, here is the shader I'm using now.
Cheers, M

Correctly render semi-transparent sphere in Unity5

I'm making a space exploration game in Unity and I'm having two problems with semi-transparency.
Each planet is made up of two spheres: One is the combined surface and cloud layer, the other (with a slightly larger radius) depicts the horizon 'glow' by culling front faces and fading alpha toward the outer edge of the sphere. This is MOSTLY working fine, but with the following two problems:
1) In my custom surface shader, when I use the alpha keyword in the #pragma definition, alpha is factored into the rendered sphere, but the 'glow' sphere disappears at a distance of a few thousand units. If I DON'T include the alpha keyword, the sphere does not fade toward the edge, but it renders at distance.
2) Despite trying all RenderType, Queue, ZWrite and ZDepth options, the surface sphere and 'glow' sphere are z-fighting; the game can't seem to decide which polygons are nearer, despite the fact that near faces on the glow sphere should be culled. I have even tried pushing the glow sphere away from the player camera and expanding its radius by the same proportion, but I'm STILL, inexplicably, getting z-fighting between the spheres!
Is there any setting that I'm missing that will enable the 'glow' sphere to be always drawn BEHIND the surface sphere (given that I have tried ALL combinations of ZWrite, ZDepth as detailed above) and is there a way to have an alpha-enabled object NOT disappear at distance?
I cannot seem to figure this out, so any help will be well appreciated!
EDIT
Here's the shader code for my 'glow sphere'. Front faces are culled. I've even tried the Offset keyword to 'push' any drawn polygons further from the camera, and I've tried all of the Tag, ZWrite and ZTest options I've been able to find. The shader gets passed a tint color, an atmosphere density float, and a sun direction vector...
Shader "Custom/planet glow" {
Properties {
_glowTint ("Glow Tint", Color) = (0.5,0.8,1,1)
_atmosphereMix ("Atmosphere Mix", float) = 0
_sunDirection ("Sun Direction", Vector) = (0, 0, 0, 0)
}
SubShader {
Tags { "RenderType" = "Opaque" "Queue" = "Geometry" }
Cull Front // I want only the far faces to render (behind the actual planet surface)
Offset 10000, 10000
ZWrite On // Off also tried
ZTest LEqual // I have tried various other options here, incombination with changing this setting in the planet surface shader
CGPROGRAM
#pragma surface surf Lambert alpha
#pragma target 4.0
struct Input {
float3 viewDir;
};
fixed4 _glowTint;
float _atmosphereMix;
float4 _sunDirection;
void surf (Input IN, inout SurfaceOutput o) {
_sunDirection = normalize(_sunDirection);
o.Albedo = _glowTint;
float cameraNormalDP = saturate(dot( normalize(IN.viewDir), -o.Normal ) * 4.5);
float sunNormalDP = saturate(dot( normalize(-_sunDirection), -o.Normal ) * 2);
o.Alpha = _atmosphereMix * sunNormalDP * (cameraNormalDP * cameraNormalDP * cameraNormalDP); // makes the edge fade 'faster'
o.Emission = _glowTint;
}
ENDCG
}
FallBack "Diffuse"
}
Have you thought about rendering large objects at a different scale with another camera to create a dynamic skybox? That would certainly resolve the z-fighting issues.
You can have, for example, two cameras: one that renders objects in the range 0.1-1000, and another that ranges from 1000 to 100000.
An additional optimization could be rendering the faraway environment to a cube skybox, and doing so not every frame (except perhaps on special occasions, such as when you destroy a planet from afar).
There is also another optimization concern: you could render a flat ring around the planet, rotated to face the camera, to avoid overdraw on the actual planet surface entirely. But that will require more complex lighting calculations, apparently.
Also, have you tried your transparent shader on a camera that does not have Skybox as its clear flags? Check this answer on how to use a custom skybox.
I seem to have stumbled upon a solution to this problem while trying to resolve world-space 'jittering'.
The game I'm developing uses big distances. I changed the way long-distance objects are positioned (distances are contracted the further an object is from the camera), and that has resolved both the z-fighting and the alpha vanishing.
A whole solar system fits into something like 300,000 kilometres by multiplying both the camera-relative position (Vector3) and the scale of each object by sqrt(distance) / distance, i.e. 1 / sqrt(distance). Hopefully that info might be useful to someone.
If you are using alpha, then you need to change the Tags to render the object in the transparency pass. Also turn off ZWrite, and remove the Offset.
I tested your shader in an empty project with both Forward and Deferred rendering, and it worked fine with these adjustments.
All opaque geometry is drawn first; then the alpha pass runs and renders all objects with transparency on TOP of the opaque objects in the scene. It needs to do this, otherwise it won't be able to blend the colors for the alpha.
Tags { "RenderType" = "Transparent" "Queue" = "Transparent" }
ZWrite Off
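Put together, the top of the SubShader from the question would then look something like this (a sketch of just the suggested changes, not the full shader):
SubShader {
    Tags { "RenderType" = "Transparent" "Queue" = "Transparent" }
    Cull Front
    ZWrite Off
    // Offset removed
    CGPROGRAM
    // ...surface shader code unchanged from the question...
    ENDCG
}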

OpenGL ES shader multi textures

I'm working on creating a 3D landscape. So far I've got a mesh created with vertices and faces, and it all looks decent. I've applied a single texture to it, but I want to have multiple textures, perhaps based on height. I'm thinking I need a shader to do this.
I'm new to shaders, but so far I've followed this tutorial and have two textures blended together. However, I want to fade one texture out completely at certain heights (or positions) and have the other show completely.
I'm really not sure how to approach this with a shader. Could someone give some advice on how to start?
For argument's sake, suppose your source geometry runs from y=0 to y=1 and you want the second texture to be completely transparent at y=0 and completely opaque at y=1. Then you could add an additional varying, named secondTextureAlpha or something like that. Load it with your pre-transformation y values in the vertex shader, then combine the incoming values from the source textures so that the second is multiplied by secondTextureAlpha if you want to stick with additive blending, or via the mix function for multiplicative.
So, e.g., the fragment shader might end up looking like this:
varying vec3 lightDir, normal;
varying lowp float secondTextureAlpha;
uniform sampler2D tex, l3d;

void main()
{
    vec3 ct, cf;
    vec4 texel;
    float intensity, at, af;

    intensity = max(dot(lightDir, normalize(normal)), 0.0);
    cf = intensity * gl_FrontMaterial.diffuse.rgb +
         gl_FrontMaterial.ambient.rgb;
    af = gl_FrontMaterial.diffuse.a;

    texel = mix(texture2D(tex, gl_TexCoord[0].st),
                texture2D(l3d, gl_TexCoord[0].st), secondTextureAlpha);

    ct = texel.rgb;
    at = texel.a;
    gl_FragColor = vec4(ct * cf, at * af);
}
To achieve a more complicated mapping, just adjust how you load secondTextureAlpha in your vertex shader, or take it as an input attribute, maybe; a matching vertex shader is sketched below.
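For completeness, a minimal vertex-shader sketch under the same assumptions (fixed-function matrices, a single directional light, and y running from 0 to 1; none of this comes from the original post):
varying vec3 lightDir, normal;
varying lowp float secondTextureAlpha;

void main()
{
    normal = gl_NormalMatrix * gl_Normal;
    lightDir = normalize(vec3(gl_LightSource[0].position)); // directional light assumed
    secondTextureAlpha = clamp(gl_Vertex.y, 0.0, 1.0); // pre-transform height drives the blend
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}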