SCNProgram not affecting SCNFloor - swift

In my experiments with shader modifiers, I found that array data could not be passed to the shader:
Scenekit giving buffer size error while passing array data to uniform array in openGL shader
For this reason I decided to try SCNProgram, but now I realize that shaders attached with SCNProgram have no effect on SCNFloor.
Is there a particular reason for this?
Here are the super simple shaders I use for testing.
Vertex shader:
precision highp float;

attribute vec3 vertex;
uniform mat4 ModelViewProjectionMatrix;

void main()
{
    gl_Position = ModelViewProjectionMatrix * vec4(vertex, 1.0);
}
Fragment shader:
precision mediump float;

void main( void )
{
    gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}

You can write your own vertex and fragment shaders to do roughly the same thing; I've done something similar with WebGL / GLSL ES 2. You can either paint the floor solid, tint the existing floor, or reconstruct it from the view ray as below. Vertex shader:
uniform mat4 uCameraWM;  //camera world matrix
uniform vec2 uCamAspFov; //x: aspect ratio, y: tan(fov/2)

varying vec3 vVDw; //view direction

void main(){
    //construct the frustum: scale and transform a unit quad
    vec4 viewDirWorld = vec4(
        position.x * uCamAspFov.x,
        position.y,
        -uCamAspFov.y, //move it to where 1:aspect fits
        0.
    );
    vVDw = ( uCameraWM * viewDirWorld ).xyz; //transform to world
    gl_Position = vec4( position.xy, 0., 1. ); //draw a full-screen quad
}
Fragment shader (ignore the cruft; the point is to intersect the view-direction ray with a plane and map it, and you can look up a cubemap in the sky portion instead of discarding it):
uniform float uHeight;
uniform sampler2D uTexDiff;
uniform sampler2D uTexNorm;
uniform sampler2D uTexMask;

varying vec2 vUv;
varying vec3 vVDw;

struct Plane
{
    vec3 point;
    vec3 normal;
    float d;
};

bool rpi( in Plane p, in vec3 p0, in vec3 vd, out vec3 wp )
{
    float t;
    t = -( dot( p0, p.normal ) + p.d ) / dot( vd, p.normal );
    wp = p0 + t * vd;
    return t > 0. ? true : false;
}

void main(){
    Plane plane;
    plane.point = vec3( 0., uHeight, 0. );
    plane.normal = vec3( 0., 1., 0. );
    plane.d = -dot( plane.point, plane.normal );

    vec3 ld = normalize( vec3( 1., 1., 1. ) );
    vec3 norm = plane.normal;
    float ndl = dot( norm, ld );

    vec2 uv;
    vec3 wp;
    vec3 viewDir = normalize( vVDw );
    vec3 h = normalize( -viewDir + ld );
    float spec = clamp( dot( h, norm ), 0., 1. );
    // spec = pow( spec, 5. );

    if( dot( plane.normal, cameraPosition ) < 0. ) discard;
    if( !rpi( plane, cameraPosition, viewDir, wp ) ) discard;

    uv = wp.xz;
    vec2 uvm = uv * .0105;
    vec2 uvt = uv * .2;

    vec4 tmask = texture2D( uTexMask, uvm );
    vec2 ch2Scale = vec2( 1.8, 1.8 );
    vec2 ch2Scale2 = vec2( 1.6, 1.6 );
    vec2 t1 = uvt * ch2Scale2 - vec2( tmask.z, -tmask.z );
    // vec2 t2 = uvt * ch2Scale + tmask.z;
    // vec2 t3 = uvt + vec2( 0., mask.z - .5 ) * 1.52;

    vec3 diffLevels = ( texture2D( uTexDiff, t1 ) ).xyz;
    // vec3 diffuse2 = ( texture2D( uTexDiff, fract(t1) * vec2(.5,1.) + vec2(.5,.0) ) ).xyz;
    // vec3 diffuse1 = ( texture2D( uTexDiff, fract(t2) * vec2(.5,1.) ) ).xyz;
    // vec4 normalMap2 = texture2D( uTexNorm, fract(t1) * vec2(.5,1.) + vec2(.5,.0) );
    // vec4 normalMap1 = texture2D( uTexNorm, fract(t2) * vec2(.5,1.) );

    float diffLevel = mix( diffLevels.y, diffLevels.x, tmask.x );
    diffLevel = mix( diffLevel, diffLevels.z, tmask.y );
    // vec3 normalMix = mix( normalMap1.xyz, normalMap2.xyz, tmask.x );

    // vec2 g = fract( uv * .1 ) - .5;
    // float e = .1;
    // g = -abs( g ) + e;
    float fog = distance( wp.xz, cameraPosition.xz );
    // float r = max( smoothstep( 0., e, g.x ), smoothstep( 0., e, g.y ) );

    gl_FragColor.w = 1.;
    gl_FragColor.xyz = vec3( tmask.xxx );
    gl_FragColor.xyz = vec3( diffLevel ) * ndl + spec * .5;
}
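Note that the fog distance above is computed but never used; if you want the floor to fade out, a minimal sketch, placed just before the end of main() (fogColor and the fade distances here are made-up values, not from the original), would be:
vec3 fogColor = vec3(0.6, 0.7, 0.8);            // made-up fog tint
float fogAmount = smoothstep(20.0, 200.0, fog); // made-up near/far fade distances
gl_FragColor.xyz = mix(gl_FragColor.xyz, fogColor, fogAmount);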
But overall, the better advice would be to just give up on SceneKit and save yourself a TON of frustration.

Apple Developer Technical Support finally answered the question I asked about this issue. Here is the answer they gave:
Unfortunately, there is not a way to shade the floor as such. The SceneKit team admits that SCNFloor is a different kind of object that is not intended for use with SCNProgram. Furthermore, using the .fragment shader modifier entry point does not work either (such as:)
func setFragmentEntryPoint( _ node: SCNNode ) {
    print( #function + " setting fragment entry point for \(node)" )
    DispatchQueue.main.asyncAfter( deadline: DispatchTime.now() + DispatchTimeInterval.milliseconds( 2500 ) ) {
        let geometry = node.geometry!
        let dict: [SCNShaderModifierEntryPoint: String] = [.fragment:
            "_output.color = vec4( 0.0, 1.0, 0.0, 1.0 );"]
        geometry.shaderModifiers = dict
    }
}
Though the SceneKit team considered this behavior to be expected, you may still file a bug report, which in this case will be interpreted as an API enhancement request.

Related

gl_FragDepth calculated to camera space

The depth fragment shader is A.frag:
#version 430 core

uniform float pointSize;
uniform mat4 projectMatrix;

in vec3 eyeSpacePos;

void main(){
    // reconstruct the sphere surface normal from the point-sprite coordinate
    vec3 normal;
    normal.xy = gl_PointCoord.xy * vec2(2.0, -2.0) + vec2(-1.0, 1.0);
    float mag = dot(normal.xy, normal.xy);
    if (mag > 1.0) discard;
    normal.z = sqrt(1.0 - mag);

    vec4 pixelEyePos = vec4(eyeSpacePos + normal * pointSize, 1.0f);
    vec4 pixelClipPos = projectMatrix * pixelEyePos;
    float ndcZ = pixelClipPos.z / pixelClipPos.w;
    gl_FragDepth = ndcZ;
}
The shader that accepts the depth map is B.frag:
#version 430 core

uniform sampler2D u_DepthTex;
in vec2 Texcoord;

void main(){
    float pixelDepth = texture(u_DepthTex, Texcoord).r;
    gl_FragDepth = pixelDepth;
}
How can I convert pixelDepth into camera space in B.frag? I have tried many times without success.
The NDC z coordinate is in the range [-1.0, 1.0]; the depth has to be in the depth range, which is [0.0, 1.0] by default:
gl_FragDepth = ndcZ * 0.5 + 0.5;
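If what you actually want in B.frag is the camera-space (eye-space) depth, you also have to undo the projection. A minimal sketch, assuming a standard perspective projection and hypothetical u_Near/u_Far uniforms matching the camera that rendered the depth map:
uniform sampler2D u_DepthTex;
uniform float u_Near; // camera near plane (assumed, not in the original shader)
uniform float u_Far;  // camera far plane (assumed, not in the original shader)
in vec2 Texcoord;

void main(){
    float pixelDepth = texture(u_DepthTex, Texcoord).r; // window-space depth in [0,1]
    float ndcZ = pixelDepth * 2.0 - 1.0;                // back to NDC [-1,1]
    // positive eye-space distance in front of the camera
    float eyeZ = 2.0 * u_Near * u_Far / (u_Far + u_Near - ndcZ * (u_Far - u_Near));
    gl_FragDepth = pixelDepth; // keep writing depth in [0,1]; use eyeZ as needed
}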

Unity using shader for water effect but I need the normal and location of the shader triangle not the mesh

I am working on a game and I bought a water shader; I am just too time-strapped to learn it right now. I am making a game with water and ships, and I need the ships to respond to the water shader's vertex normal at the raycast hit position. Frankly, I just don't know what I am doing and I would appreciate any help.
Shader
Shader "StylizedWater/Mobile"
{
Properties
{
[HDR]_WaterColor("Water Color", Color) = (0.1176471,0.6348885,1,0)
[HDR]_WaterShallowColor("WaterShallowColor", Color) = (0.4191176,0.7596349,1,0)
_Wavetint("Wave tint", Range( -1 , 1)) = 0
[HDR]_RimColor("Rim Color", Color) = (1,1,1,0.5019608)
_NormalStrength("NormalStrength", Range( 0 , 1)) = 0.25
_Transparency("Transparency", Range( 0 , 1)) = 0.75
_Glossiness("Glossiness", Range( 0 , 1)) = 0.85
[Toggle]_Worldspacetiling("Worldspace tiling", Float) = 1
_NormalTiling("NormalTiling", Range( 0 , 1)) = 0.9
_EdgeFade("EdgeFade", Range( 0.01 , 3)) = 0.2448298
_RimSize("Rim Size", Range( 0 , 20)) = 5
_Rimfalloff("Rim falloff", Range( 0.1 , 50)) = 3
_Rimtiling("Rim tiling", Float) = 0.5
_FoamOpacity("FoamOpacity", Range( -1 , 1)) = 0.05
_FoamSpeed("FoamSpeed", Range( 0 , 1)) = 0.1
_FoamSize("FoamSize", Float) = 0
_FoamTiling("FoamTiling", Float) = 0.05
_Depth("Depth", Range( 0 , 100)) = 30
_Wavesspeed("Waves speed", Range( 0 , 10)) = 0.75
_WaveHeight("Wave Height", Range( 0 , 1)) = 0.5366272
_WaveFoam("Wave Foam", Range( 0 , 10)) = 0
_WaveSize("Wave Size", Range( 0 , 10)) = 0.1
_WaveDirection("WaveDirection", Vector) = (1,0,0,0)
[NoScaleOffset][Normal]_Normals("Normals", 2D) = "bump" {}
[NoScaleOffset]_Shadermap("Shadermap", 2D) = "black" {}
[Toggle(_USEINTERSECTIONFOAM_ON)] _UseIntersectionFoam("UseIntersectionFoam", Float) = 0
[Toggle]_ENABLE_VC("ENABLE_VC", Float) = 0
[Toggle(_LIGHTING_ON)] _LIGHTING("LIGHTING", Float) = 0
[Toggle]_Unlit("Unlit", Float) = 0
_Metallicness("Metallicness", Range( 0 , 1)) = 0
[Toggle(_NORMAL_MAP_ON)] _NORMAL_MAP("NORMAL_MAP", Float) = 0
[Toggle]_USE_VC_INTERSECTION("USE_VC_INTERSECTION", Float) = 0
[Toggle]_EnableDepthTexture("EnableDepthTexture", Float) = 1
[HideInInspector] __dirty( "", Int ) = 1
}
SubShader
{
Tags{ "RenderType" = "Transparent" "Queue" = "Transparent+0" "IgnoreProjector" = "True" "ForceNoShadowCasting" = "True" }
LOD 200
Cull Back
CGPROGRAM
#include "UnityPBSLighting.cginc"
#include "UnityShaderVariables.cginc"
#include "UnityCG.cginc"
#pragma target 3.0
#pragma multi_compile __ _LIGHTING_ON
#pragma multi_compile __ _NORMAL_MAP_ON
#pragma shader_feature _USEINTERSECTIONFOAM_ON
#pragma fragmentoption ARB_precision_hint_fastest
#pragma exclude_renderers xbox360 psp2 n3ds wiiu
#pragma surface surf StandardCustomLighting alpha:fade keepalpha noshadow nolightmap nodynlightmap nodirlightmap nometa noforwardadd vertex:vertexDataFunc
struct Input
{
float3 worldPos;
float4 screenPos;
float4 vertexColor : COLOR;
float2 vertexToFrag713;
float2 vertexToFrag714;
float3 worldRefl;
INTERNAL_DATA
float3 vertexToFrag746;
float3 worldNormal;
};
struct SurfaceOutputCustomLightingCustom
{
half3 Albedo;
half3 Normal;
half3 Emission;
half Metallic;
half Smoothness;
half Occlusion;
half Alpha;
Input SurfInput;
UnityGIInput GIData;
};
uniform sampler2D _Normals;
uniform sampler2D _Shadermap;
uniform half _WaveHeight;
uniform float _ENABLE_VC;
uniform float _Worldspacetiling;
uniform float _WaveSize;
uniform float _Wavesspeed;
uniform float4 _WaveDirection;
uniform float _EnableDepthTexture;
uniform sampler2D_float _CameraDepthTexture;
uniform half _EdgeFade;
uniform half _Transparency;
uniform float _Depth;
uniform half4 _WaterShallowColor;
uniform float4 _RimColor;
uniform float _USE_VC_INTERSECTION;
uniform half _Rimfalloff;
uniform float _Rimtiling;
uniform half _RimSize;
uniform float _NormalTiling;
uniform half _NormalStrength;
uniform half _Glossiness;
uniform float _Unlit;
uniform half4 _WaterColor;
uniform half _Wavetint;
uniform half _FoamOpacity;
uniform float _FoamTiling;
uniform float _FoamSpeed;
uniform half _FoamSize;
uniform float _WaveFoam;
uniform float _Metallicness;
void vertexDataFunc( inout appdata_full v, out Input o )
{
UNITY_INITIALIZE_OUTPUT( Input, o );
float3 ase_vertexNormal = v.normal.xyz;
float4 VertexColors729 = lerp(float4( 0,0,0,0 ),v.color,_ENABLE_VC);
float3 ase_worldPos = mul( unity_ObjectToWorld, v.vertex );
float2 Tiling21 = lerp(( -20.0 * v.texcoord.xy ),( (ase_worldPos).xz * float2( 0.1,0.1 ) ),_Worldspacetiling);
float2 appendResult500 = (float2(_WaveDirection.x , _WaveDirection.z));
float2 WaveSpeed40 = ( ( _Wavesspeed * _Time.x ) * appendResult500 );
float2 HeightmapUV581 = ( ( ( Tiling21 * _WaveSize ) * float2( 0.1,0.1 ) ) + ( WaveSpeed40 * float2( 0.5,0.5 ) ) );
float4 tex2DNode94 = tex2Dlod( _Shadermap, float4( HeightmapUV581, 0, 1.0) );
float temp_output_95_0 = ( saturate( ( _WaveHeight - (VertexColors729).b ) ) * tex2DNode94.g );
float3 Displacement100 = ( ase_vertexNormal * temp_output_95_0 );
v.vertex.xyz += Displacement100;
o.vertexToFrag713 = lerp(( -20.0 * v.texcoord.xy ),( (ase_worldPos).xz * float2( 0.1,0.1 ) ),_Worldspacetiling);
o.vertexToFrag714 = ( ( _Wavesspeed * _Time.x ) * appendResult500 );
#if defined(LIGHTMAP_ON) && ( UNITY_VERSION < 560 || ( defined(LIGHTMAP_SHADOW_MIXING) && !defined(SHADOWS_SHADOWMASK) && defined(SHADOWS_SCREEN) ) )//aselc
float4 ase_lightColor = 0;
#else //aselc
float4 ase_lightColor = _LightColor0;
#endif //aselc
o.vertexToFrag746 = ase_lightColor.rgb;
}
inline half4 LightingStandardCustomLighting( inout SurfaceOutputCustomLightingCustom s, half3 viewDir, UnityGI gi )
{
UnityGIInput data = s.GIData;
Input i = s.SurfInput;
half4 c = 0;
//Start - Stylized Water custom depth
float4 ase_screenPos = float4( i.screenPos.xyz , i.screenPos.w + 0.00000000001 );
float4 ase_screenPosNorm = ase_screenPos / ase_screenPos.w;
ase_screenPosNorm.z = ( UNITY_NEAR_CLIP_VALUE >= 0 ) ? ase_screenPosNorm.z : ase_screenPosNorm.z * 0.5 + 0.5;
float screenDepth795 = LinearEyeDepth(UNITY_SAMPLE_DEPTH(tex2Dproj(_CameraDepthTexture,UNITY_PROJ_COORD(ase_screenPos))));
float distanceDepth795 = ( screenDepth795 - LinearEyeDepth( ase_screenPosNorm.z ) ) / ( lerp( 1.0 , ( 1.0 / _ProjectionParams.z ) , unity_OrthoParams.w) );
#if SHADER_API_MOBILE && UNITY_VERSION >= 20183 //Build only, abs() function causes offset in depth on mobile in 2018.3
#else
distanceDepth795 = abs(distanceDepth795);
#endif
//End - Stylized Water custom depth
float DepthTexture494 = distanceDepth795;
float ColorDepth479 = lerp(1.0,saturate( ( DepthTexture494 / _Depth ) ),_EnableDepthTexture);
float4 VertexColors729 = lerp(float4( 0,0,0,0 ),i.vertexColor,_ENABLE_VC);
float2 Tiling21 = i.vertexToFrag713;
float2 temp_output_24_0 = ( Tiling21 * _Rimtiling );
float2 WaveSpeed40 = i.vertexToFrag714;
float temp_output_30_0 = ( tex2D( _Shadermap, ( ( 0.5 * temp_output_24_0 ) + WaveSpeed40 ) ).b * tex2D( _Shadermap, ( temp_output_24_0 + ( 1.0 - WaveSpeed40 ) ) ).b );
float Intersection42 = saturate( ( _RimColor.a * ( 1.0 - ( ( ( lerp(lerp(1.0,DepthTexture494,_EnableDepthTexture),( 1.0 - (VertexColors729).r ),_USE_VC_INTERSECTION) / _Rimfalloff ) * temp_output_30_0 ) + ( lerp(lerp(1.0,DepthTexture494,_EnableDepthTexture),( 1.0 - (VertexColors729).r ),_USE_VC_INTERSECTION) / _RimSize ) ) ) ) );
float Opacity121 = saturate( ( ( lerp(1.0,saturate( ( DepthTexture494 / _EdgeFade ) ),_EnableDepthTexture) * saturate( ( ( _Transparency * saturate( ( ColorDepth479 + _WaterShallowColor.a ) ) ) + Intersection42 ) ) ) - (VertexColors729).g ) );
float3 ase_worldPos = i.worldPos;
#if defined(LIGHTMAP_ON) && UNITY_VERSION < 560 //aseld
float3 ase_worldlightDir = 0;
#else //aseld
float3 ase_worldlightDir = normalize( UnityWorldSpaceLightDir( ase_worldPos ) );
#endif //aseld
half3 _BlankNormal = half3(0,0,1);
float2 temp_output_705_0 = ( _NormalTiling * Tiling21 );
#ifdef _NORMAL_MAP_ON
float2 staticSwitch760 = ( ( float2( 0.25,0.25 ) * temp_output_705_0 ) + WaveSpeed40 );
#else
float2 staticSwitch760 = float2( 0,0 );
#endif
#ifdef _NORMAL_MAP_ON
float2 staticSwitch761 = ( temp_output_705_0 + ( 1.0 - WaveSpeed40 ) );
#else
float2 staticSwitch761 = float2( 0,0 );
#endif
#ifdef _NORMAL_MAP_ON
float3 staticSwitch763 = ( ( UnpackNormal( tex2D( _Normals, staticSwitch760 ) ) + UnpackNormal( tex2D( _Normals, staticSwitch761 ) ) ) / float3( 2,2,2 ) );
#else
float3 staticSwitch763 = _BlankNormal;
#endif
float3 lerpResult621 = lerp( _BlankNormal , staticSwitch763 , _NormalStrength);
float3 NormalMap52 = lerpResult621;
float dotResult741 = dot( ase_worldlightDir , normalize( WorldReflectionVector( i , NormalMap52 ) ) );
float GlossParam754 = _Glossiness;
float3 lerpResult478 = lerp( (_WaterShallowColor).rgb , (_WaterColor).rgb , ColorDepth479);
float3 WaterColor350 = lerpResult478;
float2 HeightmapUV581 = ( ( ( Tiling21 * _WaveSize ) * float2( 0.1,0.1 ) ) + ( WaveSpeed40 * float2( 0.5,0.5 ) ) );
float4 tex2DNode94 = tex2D( _Shadermap, HeightmapUV581 );
float Heightmap99 = tex2DNode94.g;
float3 temp_cast_0 = (( Heightmap99 * _Wavetint )).xxx;
float3 RimColor102 = (_RimColor).rgb;
float3 lerpResult61 = lerp( ( WaterColor350 - temp_cast_0 ) , ( RimColor102 * 3.0 ) , Intersection42);
float2 temp_output_634_0 = ( WaveSpeed40 * _FoamSpeed );
float4 tex2DNode67 = tex2D( _Shadermap, ( ( _FoamTiling * Tiling21 ) + temp_output_634_0 + ( Heightmap99 * 0.1 ) ) );
#ifdef _USEINTERSECTIONFOAM_ON
float staticSwitch725 = ( 1.0 - tex2DNode67.b );
#else
float staticSwitch725 = saturate( ( 1000.0 * ( ( tex2D( _Shadermap, ( ( _FoamTiling * ( Tiling21 * float2( 0.5,0.5 ) ) ) + temp_output_634_0 ) ).r - tex2DNode67.r ) - _FoamSize ) ) );
#endif
float Foam73 = ( _FoamOpacity * staticSwitch725 );
float3 temp_cast_1 = (2.0).xxx;
float FoamTex244 = staticSwitch725;
float WaveFoam221 = saturate( ( pow( ( tex2DNode94.g * _WaveFoam ) , 2.0 ) * FoamTex244 ) );
float3 lerpResult223 = lerp( ( lerpResult61 + Foam73 ) , temp_cast_1 , WaveFoam221);
float3 FinalColor114 = lerpResult223;
#ifdef _LIGHTING_ON
float3 staticSwitch769 = float3( 0,0,0 );
#else
float3 staticSwitch769 = ( saturate( ( pow( max( 0.0 , dotResult741 ) , ( GlossParam754 * 128.0 ) ) * GlossParam754 ) ) + lerp(( i.vertexToFrag746 * FinalColor114 ),FinalColor114,_Unlit) );
#endif
float3 CustomLighting753 = staticSwitch769;
SurfaceOutputStandard s733 = (SurfaceOutputStandard ) 0;
s733.Albedo = FinalColor114;
s733.Normal = WorldNormalVector( i , NormalMap52 );
s733.Emission = float3( 0,0,0 );
s733.Metallic = _Metallicness;
s733.Smoothness = GlossParam754;
s733.Occlusion = 1.0;
data.light = gi.light;
UnityGI gi733 = gi;
#ifdef UNITY_PASS_FORWARDBASE
Unity_GlossyEnvironmentData g733 = UnityGlossyEnvironmentSetup( s733.Smoothness, data.worldViewDir, s733.Normal, float3(0,0,0));
gi733 = UnityGlobalIllumination( data, s733.Occlusion, s733.Normal, g733 );
#endif
float3 surfResult733 = LightingStandard ( s733, viewDir, gi733 ).rgb;
surfResult733 += s733.Emission;
#ifdef UNITY_PASS_FORWARDADD//733
surfResult733 -= s733.Emission;
#endif//733
#ifdef _LIGHTING_ON
float3 staticSwitch734 = surfResult733;
#else
float3 staticSwitch734 = CustomLighting753;
#endif
c.rgb = staticSwitch734;
c.a = Opacity121;
return c;
}
inline void LightingStandardCustomLighting_GI( inout SurfaceOutputCustomLightingCustom s, UnityGIInput data, inout UnityGI gi )
{
s.GIData = data;
}
void surf( Input i , inout SurfaceOutputCustomLightingCustom o )
{
o.SurfInput = i;
o.Normal = float3(0,0,1);
}
ENDCG
}
}
My Code
private void FixedUpdate()
{
    RaycastHit hit;
    LayerMask mask = LayerMask.GetMask("ignore");
    Physics.Raycast(transform.position, Vector3.down, out hit, Mathf.Infinity, mask);
    MeshCollider meshCollider = hit.collider as MeshCollider;
    if (!(meshCollider == null || meshCollider.sharedMesh == null))
    {
        Mesh mesh = meshCollider.sharedMesh;
        mesh.RecalculateNormals();
        Vector3[] normals = mesh.normals;
        int[] triangles = mesh.triangles;
        // Extract local space normals of the triangle we hit
        Vector3 n0 = normals[triangles[hit.triangleIndex * 3 + 0]];
        Vector3 n1 = normals[triangles[hit.triangleIndex * 3 + 1]];
        Vector3 n2 = normals[triangles[hit.triangleIndex * 3 + 2]];
        // Use the barycentric coordinate of the hit point to interpolate the normal
        Vector3 baryCenter = hit.barycentricCoordinate;
        Vector3 interpolatedNormal = n0 * baryCenter.x + n1 * baryCenter.y + n2 * baryCenter.z;
        interpolatedNormal = interpolatedNormal.normalized;
        // Transform the local space normal to world space
        Transform hitTransform = hit.collider.transform;
        interpolatedNormal = hitTransform.TransformDirection(interpolatedNormal);
        // Display with Debug.DrawRay
        Debug.DrawRay(gameObject.transform.position, interpolatedNormal * 10f, Color.black);
    }
    // 0-1 at max speed
    currentShipScalar = ((shipSpeed - (shipSpeed - Body.velocity.magnitude)) / (shipSpeed));
}
This only returns the mesh normal, not the shader's displaced normal. I based this code on two other answers. I feel dumb; I am just exhausted. I don't like that I have not yet understood what's going on, but I have a deadline of one week. Any help would be appreciated.
The asset you are using is just a shader. Changes to the vertex positions inside a shader happen only on the GPU and cannot be accessed by the CPU. The asset also states that it does not support buoyancy, so it really is just a visual effect.
To be able to access the wave distortion, you either need a compute shader or some other preprocessing step that calculates the changes for you, which are then used inside your wave shader; a rough sketch follows.
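For instance, a rough compute-shader sketch (all names below are placeholders, and it only mirrors the heightmap lookup, ignoring the vertex-color mask) could evaluate the same displacement the vertex shader applies, so the CPU can read the heights back with ComputeBuffer.GetData:
#pragma kernel SampleWaveHeight

Texture2D<float4> _Shadermap;       // the same heightmap the water shader samples
SamplerState sampler_Shadermap;
StructuredBuffer<float2> _QueryUVs; // heightmap UVs of the points you care about
RWStructuredBuffer<float> _Heights; // results, read back on the CPU
float _WaveHeight;

[numthreads(64,1,1)]
void SampleWaveHeight (uint3 id : SV_DispatchThreadID)
{
    // mirrors: saturate(_WaveHeight) * tex2Dlod(_Shadermap, float4(HeightmapUV, 0, 1)).g
    float h = _Shadermap.SampleLevel(sampler_Shadermap, _QueryUVs[id.x], 1).g;
    _Heights[id.x] = saturate(_WaveHeight) * h;
}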
More information on the same problem
Might be worth taking a look at another solution (maybe the boat attack repo would be a good fit), but I don't know your target device or the scope of the project (I guess you're on the built-in renderer right now).

Problems porting a GLSL shadertoy shader to unity

I'm currently trying to port a shadertoy.com shader (Atmospheric Scattering Sample, interactive demo with code) to Unity. The shader is written in GLSL, and I have to start the editor with C:\Program Files\Unity\Editor>Unity.exe -force-opengl to make it render the shader (otherwise a "This shader cannot be run on this GPU" error comes up), but that's not a problem right now. The problem is porting the shader to Unity.
The functions for the scattering etc. are all identical and "runnable" in my ported shader; the only thing is that the mainImage() function manages the camera, light directions and ray direction itself. This of course has to be changed so that Unity's camera position, view direction, and light sources and directions are used.
The main function of the original looks like this:
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // default ray dir
    vec3 dir = ray_dir( 45.0, iResolution.xy, fragCoord.xy );

    // default ray origin
    vec3 eye = vec3( 0.0, 0.0, 2.4 );

    // rotate camera
    mat3 rot = rot3xy( vec2( 0.0, iGlobalTime * 0.5 ) );
    dir = rot * dir;
    eye = rot * eye;

    // sun light dir
    vec3 l = vec3( 0, 0, 1 );

    vec2 e = ray_vs_sphere( eye, dir, R );
    if ( e.x > e.y ) {
        discard;
    }
    vec2 f = ray_vs_sphere( eye, dir, R_INNER );
    e.y = min( e.y, f.x );

    vec3 I = in_scatter( eye, dir, e, l );
    fragColor = vec4( I, 1.0 );
}
I've read through the documentation of that function and how it's supposed to work at https://www.shadertoy.com/howto :
Image shaders implement the mainImage() function in order to generate the procedural images by computing a color for each pixel. This function is expected to be called once per pixel, and it is the responsibility of the host application to provide the right inputs to it, get the output color from it, and assign it to the screen pixel. The prototype is:
void mainImage( out vec4 fragColor, in vec2 fragCoord );
where fragCoord contains the pixel coordinates for which the shader needs to compute a color. The coordinates are in pixel units, ranging from 0.5 to resolution-0.5 over the rendering surface, where the resolution is passed to the shader through the iResolution uniform (see below).
The resulting color is gathered in fragColor as a four-component vector, the last of which is ignored by the client. The result is gathered as an "out" variable in anticipation of the future addition of multiple render targets.
So in that function there are references to iGlobalTime to make the camera rotate with time, and references to iResolution for the resolution. I've embedded the shader in a Unity shader and tried to fix and wire up dir, eye and l so that it works with Unity, but I'm completely stuck. I get some sort of picture which looks "related" to the original shader (top is the original, bottom the current Unity state):
I'm not a shader professional; I only know some basics of OpenGL. For the most part I write game logic in C#, so all I could really do was look at other shader examples and at how I could get the data about the camera, light sources etc. into this code, but as you can see, nothing really works out.
I've copied the skeleton code for the shader from https://en.wikibooks.org/wiki/GLSL_Programming/Unity/Specular_Highlights and some vectors from http://forum.unity3d.com/threads/glsl-shader.39629/ .
I hope someone can point me in some direction on how to fix this shader / correctly port it to Unity. Below is the current shader code; all you have to do to reproduce it is create a new shader in a blank project, copy the code in, make a new material, assign the shader to that material, then add a sphere with that material and add a directional light.
Shader "Unlit/AtmoFragShader" {
Properties{
_MainTex("Base (RGB)", 2D) = "white" {}
_LC("LC", Color) = (1,0,0,0) /* stuff from the testing shader, now really used */
_LP("LP", Vector) = (1,1,1,1)
}
SubShader{
Tags{ "Queue" = "Geometry" } //Is this even the right queue?
Pass{
//Tags{ "LightMode" = "ForwardBase" }
GLSLPROGRAM
/* begin port by copying in the constants */
// math const
const float PI = 3.14159265359;
const float DEG_TO_RAD = PI / 180.0;
const float MAX = 10000.0;
// scatter const
const float K_R = 0.166;
const float K_M = 0.0025;
const float E = 14.3; // light intensity
const vec3 C_R = vec3(0.3, 0.7, 1.0); // 1 / wavelength ^ 4
const float G_M = -0.85; // Mie g
const float R = 1.0; /* this is the radius of the sphere? this should be set from the geometry or something.. */
const float R_INNER = 0.7;
const float SCALE_H = 4.0 / (R - R_INNER);
const float SCALE_L = 1.0 / (R - R_INNER);
const int NUM_OUT_SCATTER = 10;
const float FNUM_OUT_SCATTER = 10.0;
const int NUM_IN_SCATTER = 10;
const float FNUM_IN_SCATTER = 10.0;
/* begin functions. These are out of the defines because they should be accessible to anyone. */
// angle : pitch, yaw
mat3 rot3xy(vec2 angle) {
vec2 c = cos(angle);
vec2 s = sin(angle);
return mat3(
c.y, 0.0, -s.y,
s.y * s.x, c.x, c.y * s.x,
s.y * c.x, -s.x, c.y * c.x
);
}
// ray direction
vec3 ray_dir(float fov, vec2 size, vec2 pos) {
vec2 xy = pos - size * 0.5;
float cot_half_fov = tan((90.0 - fov * 0.5) * DEG_TO_RAD);
float z = size.y * 0.5 * cot_half_fov;
return normalize(vec3(xy, -z));
}
// ray intersects sphere
// e = -b +/- sqrt( b^2 - c )
vec2 ray_vs_sphere(vec3 p, vec3 dir, float r) {
float b = dot(p, dir);
float c = dot(p, p) - r * r;
float d = b * b - c;
if (d < 0.0) {
return vec2(MAX, -MAX);
}
d = sqrt(d);
return vec2(-b - d, -b + d);
}
// Mie
// g : ( -0.75, -0.999 )
// F = ( 3 * (1 - g^2) / ( 2 * (2 + g^2) ) ) * ( (1 + c^2) / ( 1 + g^2 - 2*g*c )^(3/2) )
float phase_mie(float g, float c, float cc) {
float gg = g * g;
float a = (1.0 - gg) * (1.0 + cc);
float b = 1.0 + gg - 2.0 * g * c;
b *= sqrt(b);
b *= 2.0 + gg;
return 1.5 * a / b;
}
// Rayleigh
// g : 0
// F = 3/4 * ( 1 + c^2 )
float phase_reyleigh(float cc) {
return 0.75 * (1.0 + cc);
}
float density(vec3 p) {
return exp(-(length(p) - R_INNER) * SCALE_H);
}
float optic(vec3 p, vec3 q) {
vec3 step = (q - p) / FNUM_OUT_SCATTER;
vec3 v = p + step * 0.5;
float sum = 0.0;
for (int i = 0; i < NUM_OUT_SCATTER; i++) {
sum += density(v);
v += step;
}
sum *= length(step) * SCALE_L;
return sum;
}
vec3 in_scatter(vec3 o, vec3 dir, vec2 e, vec3 l) {
float len = (e.y - e.x) / FNUM_IN_SCATTER;
vec3 step = dir * len;
vec3 p = o + dir * e.x;
vec3 v = p + dir * (len * 0.5);
vec3 sum = vec3(0.0);
for (int i = 0; i < NUM_IN_SCATTER; i++) {
vec2 f = ray_vs_sphere(v, l, R);
vec3 u = v + l * f.y;
float n = (optic(p, v) + optic(v, u)) * (PI * 4.0);
sum += density(v) * exp(-n * (K_R * C_R + K_M));
v += step;
}
sum *= len * SCALE_L;
float c = dot(dir, -l);
float cc = c * c;
return sum * (K_R * C_R * phase_reyleigh(cc) + K_M * phase_mie(G_M, c, cc)) * E;
}
/* end functions */
/* vertex shader begins here*/
#ifdef VERTEX
const float SpecularContribution = 0.3;
const float DiffuseContribution = 1.0 - SpecularContribution;
uniform vec4 _LP;
varying vec2 TextureCoordinate;
varying float LightIntensity;
varying vec4 someOutput;
/* transient stuff */
varying vec3 eyeOutput;
varying vec3 dirOutput;
varying vec3 lOutput;
varying vec2 eOutput;
/* lighting stuff */
// i.e. one could #include "UnityCG.glslinc"
uniform vec3 _WorldSpaceCameraPos;
// camera position in world space
uniform mat4 _Object2World; // model matrix
uniform mat4 _World2Object; // inverse model matrix
uniform vec4 _WorldSpaceLightPos0;
// direction to or position of light source
uniform vec4 _LightColor0;
// color of light source (from "Lighting.cginc")
void main()
{
/* code from that example shader */
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
vec3 ecPosition = vec3(gl_ModelViewMatrix * gl_Vertex);
vec3 tnorm = normalize(gl_NormalMatrix * gl_Normal);
vec3 lightVec = normalize(_LP.xyz - ecPosition);
vec3 reflectVec = reflect(-lightVec, tnorm);
vec3 viewVec = normalize(-ecPosition);
/* copied from https://en.wikibooks.org/wiki/GLSL_Programming/Unity/Specular_Highlights for testing stuff */
//I have no idea what I'm doing, but hopefully this computes some vectors which I need
mat4 modelMatrix = _Object2World;
mat4 modelMatrixInverse = _World2Object; // unity_Scale.w
// is unnecessary because we normalize vectors
vec3 normalDirection = normalize(vec3(
vec4(gl_Normal, 0.0) * modelMatrixInverse));
vec3 viewDirection = normalize(vec3(
vec4(_WorldSpaceCameraPos, 1.0)
- modelMatrix * gl_Vertex));
vec3 lightDirection;
float attenuation;
if (0.0 == _WorldSpaceLightPos0.w) // directional light?
{
attenuation = 1.0; // no attenuation
lightDirection = normalize(vec3(_WorldSpaceLightPos0));
}
else // point or spot light
{
vec3 vertexToLightSource = vec3(_WorldSpaceLightPos0
- modelMatrix * gl_Vertex);
float distance = length(vertexToLightSource);
attenuation = 1.0 / distance; // linear attenuation
lightDirection = normalize(vertexToLightSource);
}
/* test port */
// default ray dir
//That's the direction of the camera here?
vec3 dir = viewDirection; //normalDirection;//viewDirection;// tnorm;//lightVec;//lightDirection;//normalDirection; //lightVec;//tnorm;//ray_dir(45.0, iResolution.xy, fragCoord.xy);
// default ray origin
//I think they mean the position of the camera here?
vec3 eye = vec3(_WorldSpaceCameraPos); //vec3(_WorldSpaceLightPos0); //// vec3(0.0, 0.0, 0.0); //_WorldSpaceCameraPos;//ecPosition; //vec3(0.0, 0.0, 2.4);
// rotate camera not needed, remove it
// sun light dir
//I think they mean the direciton of our directional light?
vec3 l = lightDirection;//_LightColor0.xyz; //lightDirection; //normalDirection;//normalize(vec3(_WorldSpaceLightPos0));//lightVec;// vec3(0, 0, 1);
/* this computes the intersection of the ray and the sphere.. is this really needed?*/
vec2 e = ray_vs_sphere(eye, dir, R);
/* copy stuff sothat we can use it on the fragment shader, "discard" is only allowed in fragment shader,
so the rest has to be computed in fragment shader */
eOutput = e;
eyeOutput = eye;
dirOutput = dir;
lOutput = dir;
}
#endif
#ifdef FRAGMENT
uniform sampler2D _MainTex;
varying vec2 TextureCoordinate;
uniform vec4 _LC;
varying float LightIntensity;
/* transient port */
varying vec3 eyeOutput;
varying vec3 dirOutput;
varying vec3 lOutput;
varying vec2 eOutput;
void main()
{
/* real fragment */
if (eOutput.x > eOutput.y) {
//discard;
}
vec2 f = ray_vs_sphere(eyeOutput, dirOutput, R_INNER);
vec2 e = eOutput;
e.y = min(e.y, f.x);
vec3 I = in_scatter(eyeOutput, dirOutput, eOutput, lOutput);
gl_FragColor = vec4(I, 1.0);
/*vec4 c2;
c2.x = 1.0;
c2.y = 1.0;
c2.z = 0.0;
c2.w = 1.0f;
gl_FragColor = c2;*/
//gl_FragColor = c;
}
#endif
ENDGLSL
}
}
}
Any help is appreciated, sorry for the long post and explanations.
Edit: I just found out that the radius of the sphere does have an influence; a sphere with scale 2.0 in every direction gives a much better result. However, the picture is still completely independent of the camera's viewing angle and of any lights, so this is nowhere near the Shadertoy version.
It looks like you are trying to render a 2D texture over a sphere, which needs a different approach. For what you are trying to do, I would apply the shader to a plane that crosses the sphere.
For general purposes, look at this article showing how to convert a ShaderToy shader to Unity3D.
Here are some of the steps it includes, with a small before/after sketch after the list:
Replace iGlobalTime shader input (“shader playback time in seconds”) with _Time.y
Replace iResolution.xy (“viewport resolution in pixels”) with _ScreenParams.xy
Replace vec2 types with float2, mat2 with float2x2 etc.
Replace vec3(1) shortcut constructors in which all elements have same value with explicit float3(1,1,1)
Replace texture2D() with tex2D()
Replace atan(x,y) with atan2(y,x) <- Note parameter ordering!
Replace mix() with lerp()
Replace matrix multiplication (m * v) with mul(m, v)
Remove third (bias) parameter from Texture2D lookups
mainImage(out vec4 fragColor, in vec2 fragCoord) is the fragment shader function, equivalent to float4 mainImage(float2 fragCoord : SV_POSITION) : SV_Target
UV coordinates in GLSL have 0 at the top and increase downwards; in HLSL 0 is at the bottom and increases upwards, so you may need to use uv.y = 1 - uv.y at some point.
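As a small before/after sketch applying a few of these rules (_MainTex is a placeholder texture name, not from the original shader):
// Shadertoy GLSL:
// vec2 uv = fragCoord.xy / iResolution.xy;
// vec3 col = mix(vec3(0.0), texture2D(iChannel0, uv).rgb, abs(sin(iGlobalTime)));

// Unity HLSL/Cg equivalent:
float2 uv = fragCoord.xy / _ScreenParams.xy;
float3 col = lerp(float3(0,0,0), tex2D(_MainTex, uv).rgb, abs(sin(_Time.y)));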
About this question:
Tags{ "Queue" = "Geometry" } //Is this even the right queue?
Queue determines the order in which objects are rendered; Geometry is one of the first queues. If you want your shader to render over everything, you could use Overlay, for example. This topic is covered here.
Background - this render queue is rendered before any others. It is used for skyboxes and the like.
Geometry (default) - this is used for most objects. Opaque geometry uses this queue.
AlphaTest - alpha tested geometry uses this queue. It's a separate queue from the Geometry one since it's more efficient to render alpha-tested objects after all solid ones are drawn.
Transparent - this render queue is rendered after Geometry and AlphaTest, in back-to-front order. Anything alpha-blended (i.e. shaders that don’t write to depth buffer) should go here (glass, particle effects).
Overlay - this render queue is meant for overlay effects. Anything rendered last should go here (e.g. lens flares).

iPhone GLSL dynamic branching issue

I am trying to pass an array of vec3 as a uniform and then iterate through it for each pixel. The size of the array varies with the situation, so I can't write the loop with a constant number of iterations.
Here is the code:
precision highp float;
precision highp int;
varying vec4 v_fragmentColor;
varying vec4 v_pos;
uniform int u_numberOfParticles;
const int numberOfAccumsToCapture = 3;
const float threshold = 0.15;
const float gooCoeff = 1.19;
uniform mat4 u_MVPMatrix;
uniform vec3 u_waterVertices[100];
void main()
{
    vec4 finalColor = vec4(0.0, 0.0, 0.0, 0.0);
    vec2 currPos = v_pos.xy;
    float accum = 0.0;
    vec3 normal = vec3(0, 0, 0);

    for ( int i = 0; i < u_numberOfParticles; ++i )
    {
        vec2 dir2 = u_waterVertices[i].xy - currPos.xy;
        vec3 dir3 = vec3(dir2, 0.1);
        float q = dot(dir2, dir2);
        accum += u_waterVertices[i].z / q;
    }

    float normalizeToEdge = 1.0 - (accum - threshold) / 2.0;
    if (normalizeToEdge < 0.4)
        finalColor = vec4( 0.1, normalizeToEdge + 0.5, 0.9 - normalizeToEdge * 0.4, 1.0);
    if ( normalizeToEdge < 0.2 )
    {
        finalColor = vec4( 120.0/255.0, 245.0/255.0, 245.0/255.0, 1.0);
        float shade = mix( 0.7, 1.0, normal.x);
        finalColor *= shade;
    }
    gl_FragColor = vec4(finalColor);
}
The problem is here:
for ( int i = 0; i < u_numberOfParticles; ++i )
{
    vec2 dir2 = u_waterVertices[i].xy - currPos.xy;
    vec3 dir3 = vec3(dir2, 0.1);
    float q = dot(dir2, dir2);
    accum += u_waterVertices[i].z / q;
}
When I make the for-loop like this
for ( int i = 0; i < 2; ++i )
{
    //...
}
I get double the framerate, even though u_numberOfParticles is also 2.
Making it like this
for ( int i = 0; i < 100; ++i )
{
    if (i == u_numberOfParticles)
        break;
    //...
}
gives no improvement.
The only way I know to cope with this situation is to create multiple shaders, but the size of the array may vary from 1 to 40, and making 40 different shaders just because of the for-loop is stupid. Any help or ideas on how to deal with this situation?
I agree with @badweasel that your approach is not really suited for shaders.
From what I understand, you are calculating the distance from the current pixel to each particle, summing something up, and determining the color from the result.
Maybe you could instead render a point sprite for each particle and determine the color by smart blending.
You can set the size of the point sprite in the vertex shader using gl_PointSize. In the fragment shader you can determine the location of the current pixel within the point sprite using gl_PointCoord.xy (which is in texture coordinates, i.e. [0..1]). Knowing the size of your point sprite, you can then calculate the distance of the current pixel from the particle's center and set the color accordingly. By additionally enabling blending you may be able to achieve the summing you do inside your loop, but with much higher frame rates.
Here are the vertex and fragment shaders I use for rendering "fake" spheres via point sprites, as an example of how to use them.
VS:
#version 150

in vec3 InPosition;

uniform mat4 ModelViewProjectionMatrix;
uniform int Radius = 10;

void main()
{
    vec4 Vertex = vec4(InPosition, 1.0);
    gl_Position = ModelViewProjectionMatrix * Vertex;
    gl_PointSize = Radius;
}
FS:
#version 150

out vec4 FragColor;

void main()
{
    // calculate normal, i.e. vector pointing from point sprite center to current fragment
    vec3 normal;
    normal.xy = gl_PointCoord * 2 - vec2(1);
    float r2 = dot(normal.xy, normal.xy);

    // skip pixels outside the sphere
    if (r2 > 1) discard;

    // set "fake" z normal to simulate spheres
    normal.z = sqrt(1 - r2);

    // visualize per pixel eye-space normal
    FragColor = vec4(gl_PointCoord, normal.z, 1.0);
}
Note that you need to enable GL_POINT_SPRITE and GL_PROGRAM_POINT_SIZE to use point sprites.

OpenGL ES rotate texture

I have the following fragment shader:
varying highp vec2 coordinate;
precision mediump float;
uniform vec4 maskC;
uniform float threshold;
uniform sampler2D videoframe;
uniform sampler2D videosprite;
uniform vec4 mask;
uniform vec4 maskB;
uniform int recording;
vec3 normalize(vec3 color, float meanr)
{
    return color * vec3(0.75 + meanr, 1., 1. - meanr);
}
void main() {
    float d;
    float dB;
    float dC;
    float meanr;
    float meanrB;
    float meanrC;
    float minD;
    vec4 pixelColor;
    vec4 spriteColor;

    pixelColor = texture2D(videoframe, coordinate);
    spriteColor = texture2D(videosprite, coordinate);

    meanr = (pixelColor.r + mask.r)/8.;
    meanrB = (pixelColor.r + maskB.r)/8.;
    meanrC = (pixelColor.r + maskC.r)/8.;

    d = distance(normalize(pixelColor.rgb, meanr), normalize(mask.rgb, meanr));
    dB = distance(normalize(pixelColor.rgb, meanrB), normalize(maskB.rgb, meanrB));
    dC = distance(normalize(pixelColor.rgb, meanrC), normalize(maskC.rgb, meanrC));

    minD = min(d, dB);
    minD = min(minD, dC);

    gl_FragColor = spriteColor;
    if (minD > threshold) {
        gl_FragColor = pixelColor;
    }
}
Now, depending on whether recording is 0 or 1, I want to rotate the uniform sampler2D videosprite 180 degrees (reflect it in the x-axis, i.e. flip it vertically). How can I do that?
I found the function glRotatef(), but how do I specify that I want to rotate the videosprite and not the videoframe?
Err, can't you just modify the way videosprite is accessed in the fragment shader?
vec2 c2;
if (recording == 0) {
    c2 = coordinate;
} else {
    c2 = vec2(coordinate.x, 1.0 - coordinate.y);
}