How can I modify the shader in the iPhone OpenGL ES template to produce this effect? - iphone

I'm trying to modify the fragment shader that is part of the standard iPhone/Xcode OpenGL ES template. I want to make it so that every other row of pixels is transparent. I have this code so far:
varying lowp vec4 colorVarying;

void main()
{
    gl_FragColor = vec4(colorVarying.x, colorVarying.y, colorVarying.z, floor(mod(gl_FragCoord.y, 2.0)));
}
But when I compile and run I still get the same square moving up and down with no other effects.
Here is my vertex shader (my keyboard just broke, so no return key! DOH!):
attribute vec4 position;
attribute vec4 color;

varying vec4 colorVarying;

uniform float translate;

void main()
{
    gl_Position = position;
    gl_Position.y += sin(translate) / 2.0;
    colorVarying = color;
}
Using the vertex and fragment shaders above, I don't get the 'scanline effect' I was hoping for. I'm testing on the iPad simulator and also the 3.1.3 iPhone simulator.
What am I doing wrong here? I'm a complete n00b at GLSL; I'm trying to teach myself the very basics (starting with this tutorial).

Can you post your vertex shader as well? Assuming it's passing along the vec4 colorVarying, there's no reason it shouldn't work squashed into a single line as opposed to the two-line version in the sample (posted below):
float odd = floor(mod(gl_FragCoord.y, 2.0));
gl_FragColor = vec4(colorVarying.x, colorVarying.y, colorVarying.z, odd);
The only other difference I see is that you specified lowp; try it without that.
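As a sanity check of the alpha expression itself, here is a quick sketch in Python mirroring floor(mod(gl_FragCoord.y, 2.0)) (not iPhone code, just the math):

```python
import math

def scanline_alpha(y):
    # mirrors floor(mod(gl_FragCoord.y, 2.0)) from the fragment shader
    return math.floor(y % 2.0)

# rows alternate between alpha 0 (transparent) and alpha 1 (opaque)
print([scanline_alpha(y) for y in range(4)])  # [0, 1, 0, 1]
```

The expression does alternate between 0 and 1, so if the square still renders fully opaque, also check that blending is enabled on the GL side (glEnable(GL_BLEND) plus a blend function), since a fragment alpha below 1.0 has no visible effect without blending.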

Related

CG: Specify a variable not to be interpolated between vertex and fragment shader

I'm using Cg for writing shaders inside Unity3D.
I'm using vertex color attributes to pass some parameters to the shader. They won't be used for defining colors, and should be forwarded from the vertex shader to the pixel shader without modifying them.
This is the structure I'm taking as input from Unity3D to the vertex shader:
struct appdata_full {
    float4 vertex : POSITION;
    float4 tangent : TANGENT;
    float3 normal : NORMAL;
    float4 texcoord : TEXCOORD0;
    float4 texcoord1 : TEXCOORD1;
    fixed4 color : COLOR;
#if defined(SHADER_API_XBOX360)
    half4 texcoord2 : TEXCOORD2;
    half4 texcoord3 : TEXCOORD3;
    half4 texcoord4 : TEXCOORD4;
    half4 texcoord5 : TEXCOORD5;
#endif
};
This is the structure returned by vertex shader as input to the fragment:
struct v2f {
    float4 pos : SV_POSITION;
    float2 uv : TEXCOORD0;
    fixed4 col : COLOR;
};
If I simply forward the parameter to the fragment shader, of course it will be interpolated:
v2f vert (appdata_full v)
{
    v2f output;
    //....
    output.col = v.color;
    return output;
}
I'd like to pass the v.color parameter to the fragment shader without interpolation.
Is this possible? If yes, how?
EDIT
As Tim pointed out, this is the expected behavior: the shader can't do anything other than interpolate colors that are passed from the vertex shader to the fragment shader.
I'll try to explain better what I'm trying to achieve. I'm using per-vertex colors to store information other than color. Without going into all the details of what I'm doing with it, you can consider each vertex color as an id (each vertex of the same triangle will have the same color; actually, each vertex of the same mesh will).
So I used the color trick to pack some parameters, because I have no other way to do this. Now this piece of information must be available to the fragment shader in some way.
If I pass it as an out parameter of the vertex shader, this information encoded into a color will arrive interpolated at the fragment shader, which can no longer use it.
I'm looking for a way to propagate this information unchanged to the fragment shader (maybe it's possible to use a global variable or something like that? If yes, how?).
I'm not sure this counts as an answer, but it's a little much for a comment. As Bjorke points out, the fragment shader will always receive an interpolated value. If/when Unity supports OpenGL 4.0, you might have access to interpolation qualifiers, namely 'flat', which disables interpolation and derives all values from a provoking vertex.
That said, the problem with trying to assign the same "color" value to all vertices of a triangle is that the vertex shader iterates over the vertices once each, not per triangle. There will always be a "boundary" region where some vertex shares edges with vertices of a different "color" or "id"; see my dumb example below. When applied to a box at (0,0,0), the top will be red, the bottom green, and the middle blue.
Shader "custom/colorbyheight" {
    Properties {
        _Unique_ID ("Unique Identifier", float) = 1.0
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f {
                float4 pos : SV_POSITION;
                fixed4 color : COLOR;
            };

            uniform float _Unique_ID;

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                float3 worldpos = mul(_Object2World, v.vertex).xyz;
                if(worldpos[1] >= 0.0)
                    o.color.xyz = 0.35; // unique_id = 0.35
                else
                    o.color.xyz = 0.1;  // unique_id = 0.1
                o.color.w = 1.0;
                return o;
            }

            fixed4 frag (v2f i) : COLOR0 {
                // local unique_id's set by the vertex shader and stored in the color
                if(i.color.x >= 0.349 && i.color.x <= 0.351)
                    return float4(1.0,0.0,0.0,1.0); // red
                else if(i.color.x >= 0.099 && i.color.x <= 0.11)
                    return float4(0.0,1.0,0.0,1.0); // green

                // global unique_id set by a Unity script
                if(_Unique_ID == 42.0)
                    return float4(1.0,1.0,1.0,1.0); // white

                // Fallback color = blue
                return float4(0.0,0.0,1.0,1.0);
            }
            ENDCG
        }
    }
}
In your addendum note you say "Actually each vertex of the same mesh." If that's the case, why not use a modifiable property, like the one I included above? Each mesh then just needs a script to change the unique_id.
public class ModifyShader : MonoBehaviour {
    public float unique_id = 1;

    // Use this for initialization
    void Start () {
    }

    // Update is called once per frame
    void Update () {
        renderer.material.SetFloat( "_Unique_ID", unique_id );
    }
}
I know this is an old thread, but it's worth answering anyway since this is one of the top google results.
You can now use the nointerpolation qualifier on your variables in regular Cg shaders, i.e.
nointerpolation fixed3 diff : COLOR0;
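In a v2f struct that would look something like this (a sketch; the struct and field names are placeholders):

```
struct v2f {
    float4 pos : SV_POSITION;
    nointerpolation fixed4 col : COLOR0;
};
```

The fragment shader then receives col exactly as written by the provoking vertex, with no interpolation across the triangle.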
This is a pretty old thread, but I recently had a similar issue and I found a super simple answer. OS X Mavericks now supports OpenGL 4.1, so soon it won't be an issue at all, but it may still take a while before Unity3D picks it up.
Anyway, there is a neat way to enable flat shading in Unity even on earlier OS X (e.g. Mountain Lion)!
The shader below will do the job (the crucial part is the line with #extension; otherwise you'd get a compilation error for using the keyword flat):
Shader "GLSL flat shader" {
    SubShader {
        Pass {
            GLSLPROGRAM
            #extension GL_EXT_gpu_shader4 : require

            flat varying vec4 color;

            #ifdef VERTEX
            void main()
            {
                color = gl_Color;
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
            }
            #endif

            #ifdef FRAGMENT
            void main()
            {
                gl_FragColor = color; // set the output fragment color
            }
            #endif
            ENDGLSL
        }
    }
}
Got to it by combining things from:
http://poniesandlight.co.uk/notes/flat_shading_on_osx/
http://en.wikibooks.org/wiki/GLSL_Programming/Unity/Debugging_of_Shaders
The GPU will always interpolate between values. If you want a constant value for a triangle, you need to set the same value for all vertices of that triangle. This can at times be inefficient, but it's how OpenGL (and DirectX) works. There is no inherent notion of a "face" value.
You might do this: glShadeModel(GL_FLAT). This turns off interpolation for all fragment shader inputs, and is available in older OpenGL also (pre 4.0).
If you have some inputs you want to interpolate and some you don't, render once with GL_FLAT to a texture of the same resolution as your output, and then render again with GL_SMOOTH and sample the texture to read the flat values for each pixel (while also getting interpolated values in the usual way).
If you could use DirectX instead, you can use the nointerpolation modifier on individual fragment shader inputs (shader model 4 or later).
The following works for me.
Unfortunately, DX uses vertex 0 as the provoking vertex, while GL by default uses vertex 2.
You can change this in GL, but glProvokingVertex does not seem to be exposed.
We are doing flat shading, and this reduces our vertex count significantly.
We have to reorder the triangles and compute normals in a special way (if anyone is interested, I can post example source).
The problem is that we have to have different meshes on GL vs DX, as the triangle indices need to be rotated for the triangles to use the appropriate provoking vertex.
Maybe there is some way to execute a GL command via a plugin.

Send a BOOL value to a Fragment Shader OpenGL ES 2.0 on iOS / iPhone

I'm new to OpenGL ES 2.0, so please bear with me... I'd like to pass a BOOL flag into my fragment shader so that after a certain touch event has occurred in my app, it renders gl_FragColor differently. I tried using a vec2 attribute for this and just "faking" the .x value as my "BOOL", but it looks like OpenGL is normalizing the value from 0.0 to 1.0 before the shader gets hold of it. So even though in my app I've set it to 0.0, by the time the shader is doing its thing, the value will eventually reach 1.0. Any suggestions would be hugely appreciated.
VertexAttrib Code:
// set up context, shaders, use program etc.
[filterProgram addAttribute:@"inputBrushMode"];
inputBrushModeAttribute = [filterProgram attributeIndex:@"inputBrushMode"];
bMode[0] = 0.0;
bMode[1] = 0.0;
glVertexAttribPointer(inputBrushModeAttribute, 2, GL_FLOAT, 0, 0, bMode);
Current Vertex Shader Code:
...
attribute vec2 inputBrushMode;
varying highp float brushMode;
void main()
{
    gl_Position = position;
    ...
    brushMode = inputBrushMode.x;
}
Current Fragment Shader Code:
...
varying highp float brushMode;
void main()
{
    if(brushMode < 0.5) {
        // render the texture
        gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
    } else {
        // cover things in yellow funk
        gl_FragColor = vec4(1,1,0,1);
    }
}
Thanks in advance.
Create the flag as a uniform (1.0 or 0.0) instead. Set its value with glUniform1f(GLint location, GLfloat v0). In the shader, check its value like so:
if (my_uniform < 0.5) {
    // FALSE
} else {
    // TRUE
}
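On the app side, the setup might look like this (a sketch; program and the uniform name brushMode are placeholder names, and the location must be queried after the program has been linked):

```
GLint brushModeLoc = glGetUniformLocation(program, "brushMode");

glUseProgram(program);            // uniforms can only be set on the active program
glUniform1f(brushModeLoc, 0.0f);  // FALSE; pass 1.0f after the touch event
```

Unlike a vertex attribute, a uniform keeps exactly the value you set until you change it, which is what a flag needs.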

OpenGL ES 2.0 / MonoTouch: Rendering GUI Textures shows nothing

I'm building a simple framework for OpenGL UIs for MonoTouch. I set up everything and also succeeded in rendering 3D models, but a simple 2D textured object fails. The texture has a size of 256x256, so it's not too large, and it's a power of two.
Here is some rendering code (note: I removed the existing, working code):
// Render the gui objects ( flat )
Projection = Matrix4x4.Orthographic(0, WindowProperties.Width, WindowProperties.Height, 0);
View = new Matrix4x4();
GL.Disable(All.CullFace);
GL.Disable(All.DepthTest);
_Stage.RenderGui();
Stage:
public void RenderGui ()
{
    Draw(this);
    // Renders every child control; all of them call "DrawImage" when rendering something
}

public void DrawImage (Control caller, ITexture2D texture, PointF position, SizeF size)
{
    PointF gposition = caller.GlobalPosition; // Resulting position is 0,0 in my tests
    gposition.X += position.X;
    gposition.Y += position.Y;

    // Renders the UI model; this is done using an existing (and working) vertex buffer.
    // The shader gets some parameters (this works in 3D space too)
    _UIModel.Render(new RenderParameters() {
        Model = Matrix4x4.Scale(size.Width, size.Height, 1) * Matrix4x4.Translation(gposition.X, gposition.Y, 0),
        TextureParameters = new TextureParameter[] {
            new TextureParameter("texture", texture)
        }
    });
}
The model is using a vector2 for positions, no other attributes are given to the shader.
The shader below should render the texture.
Vertex:
attribute vec2 position;
uniform mat4 modelViewMatrix;
varying mediump vec2 textureCoordinates;

void main()
{
    gl_Position = modelViewMatrix * vec4(position.xy, -3.0, 1.0);
    textureCoordinates = position;
}
Fragment:
varying mediump vec2 textureCoordinates;
uniform sampler2D texture;

void main()
{
    gl_FragColor = texture2D(texture, textureCoordinates) + vec4(0.5, 0.5, 0.5, 0.5);
}
I found out that the drawing issue is caused by the shader. This line produces a GL_INVALID_OPERATION (it works with other shaders):
GL.UniformMatrix4(uni.Location, 1, false, (parameters.Model * _Device.View * _Device.Projection).ToArray());
EDIT:
It turns out that the shader uniform locations changed (yes, I'm wondering about this too, because I only query them after the shader is completely initialized). I changed the code, and now everything works.
As mentioned in the other thread the texture is wrong, but this is another issue ( OpenGL ES 2.0 / MonoTouch: Texture is colorized red )
The shader initialization with the GL.GetUniformLocation problem mentioned above:
[... Compile shaders ...]

// Attach vertex shader to program.
GL.AttachShader (_Program, vertexShader);
// Attach fragment shader to program.
GL.AttachShader (_Program, pixelShader);

// Bind attribute locations
for (int i = 0; i < _VertexAttributeList.Length; i++) {
    ShaderAttribute attribute = _VertexAttributeList [i];
    GL.BindAttribLocation (_Program, i, attribute.Name);
}

// Link program
if (!LinkProgram (_Program)) {
    GL.DeleteShader (vertexShader);
    GL.DeleteShader (pixelShader);
    GL.DeleteProgram (_Program);
    throw new Exception ("Shader could not be linked");
}

// Get uniform locations
for (int i = 0; i < _UniformList.Length; i++) {
    ShaderUniform uniform = _UniformList [i];
    uniform.Location = GL.GetUniformLocation (_Program, uniform.Name);
    Console.WriteLine ("Uniform: {0} Location: {1}", uniform.Name, uniform.Location);
}

// Detach shaders
GL.DetachShader (_Program, vertexShader);
GL.DetachShader (_Program, pixelShader);
GL.DeleteShader (vertexShader);
GL.DeleteShader (pixelShader);

// Shader is initialized; add it to the device
_Device.AddResource (this);
I don't know what Matrix4x4.Orthographic uses as near-far range, but if it's something simple like [-1,1], the object may just be out of the near-far-interval, since you set its z value explicitly to -3.0 in the vertex shader (and neither the scale nor the translation of the model matrix will change that). Try to use a z of 0.0 instead. Why is it -3, anyway?
EDIT: So if the GL.UniformMatrix4 call throws a GL_INVALID_OPERATION, it seems you didn't retrieve the corresponding uniform location successfully. So the code where you do this might also help to find the issue.
Or it may also be that you call GL.UniformMatrix4 before the corresponding shader program is used. Keep in mind that uniforms can only be set once the program is active (GL.UseProgram or something similar was called with the shader program).
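In MonoTouch terms, a safe ordering would be something like this (a sketch reusing the names from the question):

```
// after LinkProgram (_Program) succeeded:
int location = GL.GetUniformLocation (_Program, "modelViewMatrix");

GL.UseProgram (_Program);  // make the program active before setting uniforms
GL.UniformMatrix4 (location, 1, false, (parameters.Model * _Device.View * _Device.Projection).ToArray());
```

Querying the location after linking and setting the uniform only while the program is active rules out both failure modes described above.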
And by the way, you're multiplying the matrices in the wrong order (given your shader and matrix-setting code). If it really works this way for other renderings, then you either were just lucky or you have some severe conceptual and mathematical inconsistency in your matrix library.
It turns out that the shader uniform locations change at an unknown time. Everything is created and initialized when I ask OpenGL ES for the uniform location, so it must be a bug in OpenGL.
Calling GL.GetUniformLocation(..) each time I set the shader uniforms solves the problem.

OpenGL ES 2.0 with iPhone - Vertex Shader uniform cannot be located

I have the following vertex shader:
uniform mediump mat4 projMx;

attribute vec2 a_position;
attribute vec4 a_color;
attribute float a_radius;

varying vec4 v_color;

void main()
{
    vec4 position = vec4(100.0, 600.0, 1.0, 1.0);
    gl_Position = projMx * position;
    gl_PointSize = a_radius * 2.0;
    v_color = a_color;
}
..and the following fragment shader:
#ifdef GL_FRAGMENT_PRECISION_HIGH
precision highp float;
#else
precision mediump float;
#endif

varying vec4 v_color;

void main()
{
    gl_FragColor = v_color;
}
..and the following Obj-C code:
//..shaders have been created..
program = glCreateProgram();
glAttachShader(program, shaders[0]);
glAttachShader(program, shaders[1]);

GLfloat projMx[16] = {2/screenWidth,0,0,-1, 0,2/-screenHeight,0,1, 0,0,-2,-1, 0,0,0,1};

projMxId = glGetUniformLocation(program, "projMx");
NSLog(@"uniform location: %i", projMxId);
The uniform location of 'projMx' is -1 (i.e. 'projMxId == -1' is true). Could someone please explain why this is the case?
You can only retrieve uniform locations after linking the program (which you don't do), as these are per-program state, shared by every shader in the program. The GLSL compiler can optimize unused uniforms away during linking, so their locations can only be known after linking, which is also the moment when all attributes not explicitly bound get their locations.
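Concretely, linking before querying should give a valid location (a sketch based on the code in the question):

```
program = glCreateProgram();
glAttachShader(program, shaders[0]);
glAttachShader(program, shaders[1]);
glLinkProgram(program);  // locations only exist after a successful link

projMxId = glGetUniformLocation(program, "projMx");  // should now be >= 0
```

If it still returns -1 after linking, check glGetProgramiv with GL_LINK_STATUS to make sure the link actually succeeded.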

OpenGL ES 2.0 - Can't find Attribute in Vertex Shader

I've looked for a while for an answer for this - but I'm not having much luck.
All I'm trying to do is pass my normal data into my vertex shader. Positions are passed in correctly; however, I'm receiving a "normal attribute not found" error when trying to load my shaders.
My ATTRIB values are enums.
I've created a cube in OpenGL ES 2.0 for iPhone development.
My Shader.vsh looks like this:
attribute vec4 normal;
attribute vec4 position;

varying vec4 colorVarying;

uniform mat4 mvp_matrix;

void main()
{
    // Transform the vertex
    gl_Position = mvp_matrix * position;
    colorVarying = vec4(1.0, 1.0, 0.0, 0.0);
}
The part where I update attribute values in the drawframe looks like this:
// Update attribute values.
glVertexAttribPointer(ATTRIB_VERTEX, 3, GL_FLOAT, 0, 0, cubeVerticesStrip);
glEnableVertexAttribArray(ATTRIB_VERTEX);
glVertexAttribPointer(ATTRIB_NORMAL, 3, GL_FLOAT, 0, 0, cubeNormalsStrip);
glEnableVertexAttribArray(ATTRIB_NORMAL);
And the part where I bind these in my LoadShader function is like this:
glBindAttribLocation(program, ATTRIB_VERTEX, "position");
glBindAttribLocation(program, ATTRIB_NORMAL, "normal");
Again, the position works. But "normal" cannot be found. Any ideas?
normal isn't found because your GLSL compiler was smart. It saw that you didn't actually do anything with normal, so it pretends it doesn't exist to save resources.
Also, your normal and position should be vec3's since you're only passing 3 values. This isn't strictly required, but it is better form to make your inputs and attributes match.
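One way to confirm this is to actually use normal in the vertex shader; once it contributes to an output, the compiler can no longer strip the attribute (a sketch with a made-up light direction):

```
attribute vec3 normal;
attribute vec4 position;
varying vec4 colorVarying;
uniform mat4 mvp_matrix;

void main()
{
    gl_Position = mvp_matrix * position;
    // any real use of 'normal' keeps the attribute active
    float diffuse = max(dot(normalize(normal), vec3(0.0, 0.0, 1.0)), 0.0);
    colorVarying = vec4(diffuse, diffuse, 0.0, 1.0);
}
```

With normal contributing to colorVarying, glBindAttribLocation(program, ATTRIB_NORMAL, "normal") should find the attribute again.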