Multi-textured Point Sprites in OpenGL ES 2.0 on iOS?

I am trying to make a multi-textured point sprite for an iPhone application using OpenGL ES 2.0. I can't find any examples of this on the web, and it doesn't seem to be working. Is there some built-in limitation where gl_PointCoord can't be used with multiple textures when using GL_POINTS mode for point sprites?
uniform sampler2D tex;
uniform sampler2D blur_tex;
vec4 texPixel = texture2D( tex, gl_PointCoord );
vec4 blurPixel = texture2D( blur_tex, gl_PointCoord );
I'm sure I am passing in the textures properly, as I can do multi-texturing just fine in TRIANGLE_STRIP mode, but I am hoping to speed things up using point sprites.
If it is possible, a link to an example of working code would be super helpful. Thanks!
EDIT:
Here's how I'm passing in the textures to my shader. This lets me do multi-texturing when I am in TRIANGLE or TRIANGLE_STRIP mode.
//pass in position and tex_coord attributes...
//normal tex
glActiveTexture(0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex0);
glUniform1i(SAMPLER_0_UNIFORM, 0);
//blur tex
glActiveTexture(1);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex1);
glUniform1i(SAMPLER_1_UNIFORM, 1);
//draw arrays...
However, if I am using POINTS mode, then I never see the second texture. That is, referring to the shader code above, whether I do
gl_FragColor = texPixel;
OR
gl_FragColor = blurPixel;
I see the same texture. Which seems strange. My guess is that you CAN'T do multi-texturing on a point sprite and somehow having two active textures or two calls to gl_PointCoord causes a problem. But I'm hoping I'm wrong. So if someone has a simple example of multi-texturing working with point sprites in OpenGL ES 2.0 I would be happy to look at that code!
EDIT 2:
vertex shader:
attribute vec4 position;
void main() {
    gl_PointSize = 15.0;
    gl_Position = position;
}
fragment shader:
precision mediump float;
uniform sampler2D tex;
uniform sampler2D blur_tex;
void main() {
    vec4 texPixel = texture2D( tex, gl_PointCoord );
    vec4 blurPixel = texture2D( blur_tex, gl_PointCoord );
    //these both do the same thing even though I am passing in two different textures?!?!?!?
    //gl_FragColor = texPixel;
    gl_FragColor = blurPixel;
}

There is a typo in your main program.
The right parameter to pass to glActiveTexture is GL_TEXTURE0, GL_TEXTURE1, and so on.
Note that GL_TEXTURE0 and GL_TEXTURE1 do not have the values 0 and 1.
Since you are passing an invalid value to glActiveTexture, the call fails, so the active texture unit always stays at its default (unit 0) and all your changes are applied to the texture bound at unit 0.
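A corrected version of the binding code might look like the sketch below (using the tex0/tex1 handles and sampler uniform names from the question):

glActiveTexture(GL_TEXTURE0);       // the enum GL_TEXTURE0 (0x84C0), not the integer 0
glBindTexture(GL_TEXTURE_2D, tex0);
glUniform1i(SAMPLER_0_UNIFORM, 0);  // sampler uniforms take the unit index, so plain 0 is correct here
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, tex1);
glUniform1i(SAMPLER_1_UNIFORM, 1);

The glEnable(GL_TEXTURE_2D) calls can also be dropped: that enable does not exist in ES 2.0, where the shader alone decides which textures are sampled.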

In my case the points are blended. The possible problem was missing texture parameters:
glTexParameteri ( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
glTexParameteri ( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );

This may be too late to post, though.
There are two problems in your code. One is the one that Satyakam has pointed out. The other problem is that you should NOT use glUniform1f; the right one is glUniform1i. The difference is the f or i at the end, which means float or integer.
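For sampler uniforms the integer variant is the only one allowed; a quick contrast, using the uniform name from the question:

glUniform1i(SAMPLER_1_UNIFORM, 1);      // correct: samplers are set as integers (the texture unit index)
// glUniform1f(SAMPLER_1_UNIFORM, 1.0); // wrong: loading a sampler with the float variant raises GL_INVALID_OPERATION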

Related

Decal wrap around mesh

I'm working on a tattoo simulator program. I need to know if there's a way for the decal (tattoo) to wrap around the target mesh, like having a tattoo that goes from one side of, let's say, a leg to the other, or even behind it.
Not at runtime, using a projected decal, no.
What you need here instead is a procedural tattoo map. Think of it as another texture, like a lightmap. You may need a custom shader, but it could possibly be done with the secondary albedo channel of the standard shader.
The tricky part is writing to that texture. I'll outline the basic algorithm, but leave it up to you to implement:
The first thing you need to be able to do is unwrap the mesh's triangles in code. You need to identify which edges are contiguous on the UV map, and which are separate. Next, you need a way to identify the tattoo and the initial transform. First, you'll want to define an origin on the tattoo source texture that it will rotate around. Then you'll want to define a structure that references the source texture, and the UV position (Vector2) / rotation (float) / scale (float) to apply it to in the destination texture.
Once you have the tattoos stored in that format, then you can start building the tattoo mask texture for the skin. If your skin UVs have a consistent pixel density, this is a lot easier because you can work primarily in UV space, but if not, you'll need to re-project to get the scale for each tri. But, basically, you start with the body triangle that contains the origin, and draw onto that triangle normally. From there, you know where each vertex and edge of that triangle lies on the tattoo source texture. So, loop through each neighboring triangle (I recommend a breadth-first recursive method) and continue it from the edge you already know. If all three verts fall outside the source texture's rect, you can stop there. Otherwise, continue with the next triangle's neighbors. Make sure you're using the 3D mesh when calculating neighbors so you don't get stuck at seams.
That algorithm is going to have an edge case you'll need to deal with for when the tattoo wraps all the way around and overlaps itself, but there are a couple different ways you can deal with that.
Once you've written all tattoos to the tattoo texture, just apply it to the skin material and voila! Not only will this move all the calculations out of real-time rendering, but it will let you fully control how your tattoos can be applied.
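Here is a minimal sketch of that breadth-first traversal, in C. The mesh representation and the two callbacks are assumptions: triangleTouchesTattoo() stands in for the all-three-verts-outside test, and drawTriangle() for the actual rasterization into the mask texture.

#include <stdbool.h>
#include <string.h>

#define MAX_TRIS 4096

typedef struct {
    int neighbor[3];  /* adjacent triangle across each edge, -1 if none; taken from the 3D mesh, not the UV islands */
} Tri;

/* Hypothetical callbacks supplied elsewhere. */
bool triangleTouchesTattoo(int tri);  /* does any vertex still map inside the tattoo source rect? */
void drawTriangle(int tri);           /* rasterize this triangle into the tattoo mask texture */

void paintTattoo(const Tri *mesh, int triCount, int originTri)
{
    static bool visited[MAX_TRIS];
    static int queue[MAX_TRIS];
    if (triCount > MAX_TRIS)
        return;
    memset(visited, 0, sizeof visited);

    int head = 0, tail = 0;
    queue[tail++] = originTri;        /* start at the triangle containing the tattoo origin */
    visited[originTri] = true;

    while (head < tail) {             /* breadth-first flood over mesh neighbors */
        int tri = queue[head++];
        if (!triangleTouchesTattoo(tri))
            continue;                 /* all three verts outside the source rect: stop this branch */
        drawTriangle(tri);
        for (int e = 0; e < 3; e++) {
            int n = mesh[tri].neighbor[e];
            if (n >= 0 && !visited[n]) {
                visited[n] = true;
                queue[tail++] = n;
            }
        }
    }
}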
You can use a decal projector with Unity's official preview package, Render Pipelines - High Definition.
Here's how I used it to project a "tattoo" onto a bucket. You can apply it to your model, of course.
(Child the decal projector to the model so that the tattoo follows it.)
The best way to import the Render Pipelines - High Definition package is to use Unity Hub to create a new project, choosing it as a template. If it's an existing project, this official blog might help you.
Once you successfully set up the package, follow this tutorial and you'll be able to project tattoos onto your models anywhere you want.
I've done something similar with a custom shader. I think it would do what you want. Mine dynamically renders flags based on the rank and type of a unit for an iPad game prototype. Exactly how you'll do it depends a bit on how things are set up in your project, but here's what mine looks like: the first image is the wireframe showing the mesh, and the second is with the shaders turned on, showing them adding the colors and emblem based on rank and unit. I've just included the shader for the top flag, since that has the unit emblem added in a way similar to how you want your tattoo applied:
Note that you can attach multiple shaders to a particular mesh.
And the emblem is just an image with transparency that is added to the shader and referenced as a texture within the shader:
You can see we also have a picture that has some shadow texture that's used as the background for the banner.
This is my first shader and was written a while ago, so I'm sure it's sub-optimal in all kinds of ways, but it should hopefully be enough to get you started (and it still works in Unity 2018.3.x, though I had to hack in some changes to get it to compile):
Shader "Custom/TroopFlagEmblemShader" {
Properties {
_BackColor ("Background Color", Color) = (0.78, 0.2, 0.2) // scarlet
_MainTex ("Background (RGBA)", 2D) = "" {}
_EmblemTex("Emblem (RGBA)", 2D) = "" {}
_Rank ( "Rank (1-9)", Float ) = 3.0
}
SubShader {
Pass {
CGPROGRAM
#pragma exclude_renderers xbox360 ps3 flash
#pragma target 3.0
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct appdata {
float4 vertex: POSITION;
float4 texcoord: TEXCOORD0;
};
struct v2f {
float4 pos: SV_POSITION;
float2 uv: TEXCOORD0;
};
uniform sampler2D _MainTex;
uniform sampler2D _EmblemTex;
uniform float3 _BackColor;
uniform float _Rank;
v2f vert( appdata v )
{
v2f o;
o.pos = UnityObjectToClipPos( v.vertex );
o.uv = v.texcoord.xy;
return o;
}
float4 frag( v2f IN ) : COLOR
{
float4 outColor;
float4 backTextureColor = tex2D( _MainTex, IN.uv.xy );
float4 emblemTextureColor = tex2D( _EmblemTex, IN.uv.xy );
// not drawing the square at all above rank 5
if ( _Rank >= 6.0 )
discard;
if ( _Rank < 5 ) // 4 and below
{
outColor = float4( (emblemTextureColor.rgb * emblemTextureColor.a) +
(((1.0 - emblemTextureColor.a) * backTextureColor.rgb) * _BackColor.rgb) , 1 );
// float4(_BackColor.rgb, 1 ));
}
else if ( _Rank >= 5.0 ) // but excluded from 6 above
{
// 5 is just solid backcolor combined with background texture
outColor = float4( backTextureColor.rgb * _BackColor.rgb, 1 );
}
return outColor;
}
ENDCG
}}
}
Shaders are a bit maddening to learn how to do, but pretty fun once you get them working - like most programming :)
In my case the overlay texture was the same size/shape as the flag, which makes it a bit easier. As a first thought, you'll probably need to add some parameters to the shader that indicate where you want the overlay to be drawn relative to the mesh, and do nothing for vertices/fragments outside your tattoo bounds; a sketch of that idea follows.
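In the style of the shader above, that could look something like this (_OverlayRect is an invented parameter packing the overlay's UV origin and size; everything outside it falls back to the plain background):

// Additional property: _OverlayRect ("Overlay Rect (xy = UV origin, zw = size)", Vector) = (0.25, 0.25, 0.5, 0.5)
// Matching uniform, declared next to the others:
float4 _OverlayRect;

// Inside frag(), replacing the emblem lookup:
float2 overlayUV = (IN.uv - _OverlayRect.xy) / _OverlayRect.zw;
if (overlayUV.x < 0.0 || overlayUV.x > 1.0 || overlayUV.y < 0.0 || overlayUV.y > 1.0)
    outColor = float4( backTextureColor.rgb * _BackColor.rgb, 1 ); // outside the overlay: background only
else
    outColor = tex2D( _EmblemTex, overlayUV );                     // inside: sample the emblem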

Shader programming in Unity

I am trying to write a shader that reads the whole frame, pixel by pixel, and after some calculations re-writes the pixels. I have looked through some code samples, but most of them were not relevant. Could you give me some hints on how I can read and write pixels in Unity shader programming?
If you have the Pro version of Unity, you can achieve this with image (postprocessing) effects. All you have to do is to implement the OnRenderImage callback on a component of a camera. Then you call Graphics.Blit with a material which has a shader. The shader receives the screen contents as main texture.
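For the shader side, here is a minimal sketch of such an image-effect shader (the shader name and the color inversion are placeholders for your own calculation; Graphics.Blit delivers the screen contents in _MainTex, and vert_img/v2f_img are the stock fullscreen helpers from UnityCG.cginc):

Shader "Hidden/ExamplePostEffect" {
    Properties {
        _MainTex ("Screen Contents", 2D) = "white" {}
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            float4 frag(v2f_img i) : COLOR
            {
                float4 c = tex2D(_MainTex, i.uv); // read the incoming pixel
                return float4(1.0 - c.rgb, c.a);  // placeholder calculation: invert it
            }
            ENDCG
        }
    }
}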
You need texture buffers the size of your frame.
Then you want to render your frame into one of the buffers.
Now you need to write a fragment shader that reads one buffer and writes to the other.
Then finally you draw the fragment shader output as a flat object that covers the screen.
In shader programming, you do not work pixel by pixel; you define a function that will be applied to a single pixel at a position, which is a float from 0 to 1 on each axis (although you will only be using 2 of the 3). That fragment shader is then run for lots of pixels, all in parallel; that's how it does everything so quickly.
I hope that brief explanation is enough to get you started. Unity fragment shaders are written in Cg. Cg is a language halfway between OpenGL's language GLSL and DirectX's language HLSL; since all the high-level languages compile into native instructions on the graphics card, they are all fairly similar. So there are plenty of Cg samples about, and once you can write Cg, you will have no problem reading HLSL and GLSL.
Thank you for your advice; it was really helpful. I finally ended up with this code for my shader. And now a new problem has come up.
My solution:
To solve my keystone problem, I have adapted the "wearing glasses" idea! That means I have placed a plane in front of the camera, attached the shader below to it, and then parented the plane to the camera. The problem right now is that the shader works very well, but in my VR setting it does not, because I have several cameras and the scene is distorted in one of them (as I want) while the other cameras show a normal scene. Everything is fine until these two scenes intersect; in that case I get a disjoint scene (please forgive me if that is not the correct word). By the way, I thought that instead of using this shader on a plane in front of the camera I should apply it to the camera itself, but the shader does not work when I add it to the camera, although it works perfectly with the plane object. Could you let me know how I can modify this code to be compatible with the camera? I am more than happy to hear your suggestions and ideas besides my solution.
Shader "Custom/she1" {
Properties {
top("Top", Range(0,2)) = 1
bottom("Bottom", Range(0,2)) = 1
}
SubShader {
// Draw ourselves after all opaque geometry
Tags { "Queue" = "Transparent" }
// Grab the screen behind the object into _GrabTexture
GrabPass { }
// Render the object with the texture generated above
Pass {
CGPROGRAM
#pragma debug
#pragma vertex vert
#pragma fragment frag
#pragma target 3.0
sampler2D _GrabTexture : register(s0);
float top;
float bottom;
struct data {
float4 vertex : POSITION;
float3 normal : NORMAL;
};
struct v2f {
float4 position : POSITION;
float4 screenPos : TEXCOORD0;
};
v2f vert(data i){
v2f o;
o.position = mul(UNITY_MATRIX_MVP, i.vertex);
o.screenPos = o.position;
return o;
}
half4 frag( v2f i ) : COLOR
{
float2 screenPos = i.screenPos.xy / i.screenPos.w;
float _half = (top + bottom) * 0.5;
float _diff = (bottom - top) * 0.5;
screenPos.x = screenPos.x * (_half + _diff * screenPos.y);
screenPos.x = (screenPos.x + 1) * 0.5;
screenPos.y = 1-(screenPos.y + 1) * 0.5 ;
half4 sum = half4(0.0h,0.0h,0.0h,0.0h);
sum = tex2D( _GrabTexture, screenPos);
return sum;
}
ENDCG
}
}
Fallback Off
}

How to flex a 3D texture with OpenGL ES?

I have tried to draw some 3D squares (with OpenGL on iPhone) and make them rotate around, so that together they now look like a sphere.
http://i618.photobucket.com/albums/tt265/LoyalMoral/Post/ScreenShot2013-05-15at23249PM.png
But each square is flat (the first one in the image below), and I want to flex it:
http://i618.photobucket.com/albums/tt265/LoyalMoral/Post/Untitled-1.jpg
Someone told me that I have to use GLSL, but I don't know the shading language.
These are my vertex and fragment shaders (following Ray Wenderlich's tutorial):
// Vertex.glsl
attribute vec4 Position;
attribute vec4 SourceColor;
varying vec4 DestinationColor;
uniform mat4 Projection;
uniform mat4 Modelview;
attribute vec2 TexCoordIn;
varying vec2 TexCoordOut;
void main(void) {
    DestinationColor = SourceColor;
    gl_Position = Projection * Modelview * Position;
    TexCoordOut = TexCoordIn;
}
// Fragment.glsl
varying lowp vec4 DestinationColor;
varying lowp vec2 TexCoordOut;
uniform sampler2D Texture;
void main(void) {
    gl_FragColor = DestinationColor * texture2D(Texture, TexCoordOut);
}
Could somebody help me? :)
Instead of using a quad (a pair of triangles) for the square, use a grid of vertices. That way you can place the grid's vertices manually, producing the curved shape you want; see the sketch below.
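A minimal sketch of building such a grid in C, bending the square's x coordinate along a circular arc so it hugs a sphere-like surface (GRID_N, RADIUS, and the cylinder-only bend are arbitrary choices for illustration):

#include <math.h>

#define GRID_N 10        /* quads per side */
#define RADIUS 2.0f      /* radius of the surface the square should hug */

/* Fills verts with (GRID_N+1)*(GRID_N+1) xyz triples for a unit square
   spanning [-0.5, 0.5] in x and y, bent around a cylinder of radius RADIUS. */
void buildGrid(float *verts)
{
    for (int j = 0; j <= GRID_N; j++) {
        for (int i = 0; i <= GRID_N; i++) {
            float x = -0.5f + (float)i / GRID_N;
            float y = -0.5f + (float)j / GRID_N;
            float angle = x / RADIUS;                  /* treat x as arc length on the circle */
            *verts++ = RADIUS * sinf(angle);           /* bend x around the cylinder */
            *verts++ = y;
            *verts++ = RADIUS * cosf(angle) - RADIUS;  /* shift z so the center stays at the origin */
        }
    }
}

Each row of the grid can then be drawn as a GL_TRIANGLE_STRIP; bending y the same way would give a full spherical bulge.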

Earth Day/Night side shaders in OpenGL ES 2.0

I have an iPhone application that models the planet Earth! I would like to make it realistic: there is a sphere object, with a night-side and a day-side texture and shader, but it doesn't work!
My Sphere object's draw method:
-(bool)execute:(GLuint)texture
{
    glBindTexture(GL_TEXTURE_2D, texture);
    glBindVertexArrayOES(m_VertexArrayName);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, m_NumVertices);
    glBindTexture(GL_TEXTURE_2D, 0);
    return true;
}
My ViewController call method:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.3f, 0.3f, 0.3f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glUniformMatrix4fv(uniforms[UNIFORM_MODELVIEWPROJECTION_MATRIX], 1, 0, _modelViewProjectionMatrix.m);
    glUniformMatrix3fv(uniforms[UNIFORM_NORMAL_MATRIX], 1, 0, _normalMatrix.m);

    glUseProgram(m_NightsideProgram);
    [m_Sphere setBlendMode:0];
    [m_Sphere execute:m_EarthNightTexture.name];

    glUseProgram(m_DaysideProgram);
    [m_Sphere setBlendMode:1];
    [m_Sphere execute:m_EarthDayTexture.name];

    glCullFace(GL_FRONT);
    glEnable(GL_CULL_FACE);
    glFrontFace(GL_CW);
}
Blend modes:
0: glBlendFunc(GL_SRC_COLOR, GL_DST_COLOR);
1: glBlendFunc(GL_ONE, GL_CONSTANT_COLOR); //The constant is a mid blue, to make it a bit lighter
Nightside fragment shader:
precision mediump float;
varying lowp vec4 colorVarying;
varying vec2 v_texCoord;
uniform sampler2D s_texture;
void main() {
    vec4 newColor;
    newColor = 1.0 - colorVarying;
    gl_FragColor = texture2D(s_texture, v_texCoord) * newColor;
}
Dayside fragment shader:
precision mediump float;
varying lowp vec4 colorVarying;
varying lowp vec4 specularColorVarying;
varying vec2 v_texCoord;
uniform sampler2D s_texture;
void main() {
    vec4 finalSpecular = vec4(0, 0, 0, 1);
    vec4 surfaceColor;
    float halfBlue;
    surfaceColor = texture2D(s_texture, v_texCoord);
    halfBlue = 0.5 * surfaceColor[2];
    if (halfBlue > 1.0)
        halfBlue = 1.0;
    if ((surfaceColor[0] < halfBlue) && (surfaceColor[1] < halfBlue))
        finalSpecular = specularColorVarying;
    gl_FragColor = surfaceColor * colorVarying + colorVarying * finalSpecular;
}
If I use only one of the shaders, it seems to be fine, but they won't work together!
For a glUniform... call to take effect, there has to be a valid program bound/used, and even then it only changes the uniform value for that specific program's uniform (identified by the uniform location). So you have to call your glUniform... functions for each program, after the respective glUseProgram.
This is why it works with only one shader program: you don't ever bind any other program. But it is still conceptually wrong, because you are relying on a specific program already being bound, which is always a source of errors (like when adding a second program), since OpenGL is a state machine.
On the other hand, a uniform variable keeps its value even when its corresponding program gets unbound (by glUseProgram with 0 or any other program).
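Applied to the drawInRect: method above, that just means setting the matrices after each glUseProgram call. A sketch, assuming the locations in uniforms[] are queried from whichever program is current (each program has its own locations and its own uniform storage):

glUseProgram(m_NightsideProgram);
// set this program's uniforms while it is bound
glUniformMatrix4fv(uniforms[UNIFORM_MODELVIEWPROJECTION_MATRIX], 1, 0, _modelViewProjectionMatrix.m);
glUniformMatrix3fv(uniforms[UNIFORM_NORMAL_MATRIX], 1, 0, _normalMatrix.m);
[m_Sphere setBlendMode:0];
[m_Sphere execute:m_EarthNightTexture.name];

glUseProgram(m_DaysideProgram);
// the dayside program keeps separate uniform storage, so set the matrices again
glUniformMatrix4fv(uniforms[UNIFORM_MODELVIEWPROJECTION_MATRIX], 1, 0, _modelViewProjectionMatrix.m);
glUniformMatrix3fv(uniforms[UNIFORM_NORMAL_MATRIX], 1, 0, _normalMatrix.m);
[m_Sphere setBlendMode:1];
[m_Sphere execute:m_EarthDayTexture.name];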

Translating square with shaders on iPhone

I'm trying to update my (little) knowledge of OpenGL ES 1.1 to 2.0 on the iPhone. The default OpenGL ES Application template for the iPhone draws a square, makes it translate up and down, and works fine. Their implementation does the math for the Y-value changes in the shader itself, which is pretty much useless. So I've changed the vertex shader to:
uniform mat4 mvpMatrix;
attribute vec4 position;
attribute vec4 color;
varying vec4 colorVarying;
void main()
{
    gl_Position = position * mvpMatrix;
    colorVarying = color;
}
That seems to be correct and common (from what I've seen in my research). Obviously, I made the necessary changes to the code, like binding the uniform, and, to help with the math, I got the sources for the esUtil.h code. In my drawing method, the code looks like this:
transY += 0.075f;
ESMatrix mvp, model, view;
esMatrixLoadIdentity(&view);
esPerspective(&view, 60.0, 320.0/480.0, 1.0, -1.0);
esMatrixLoadIdentity(&model);
esTranslate(&model, sinf(transY), 0.0f, 0.0f);
esMatrixLoadIdentity(&mvp);
esMatrixMultiply(&mvp, &model, &view);
glUniformMatrix4fv(uniforms[UNIFORM_MVPMATRIX], 1, GL_FALSE, (GLfloat *)&mvp);
And that should be working but, unfortunately, what I get is quite different from a simple translation.
I've restarted the template a few times but I can't figure out what I'm doing wrong here... Rotating seems to be working as expected, I believe...
Any help would be appreciated.
I think you want to reverse the order of your position transform, as your matrix library is probably working in column-major order.
gl_Position = position * mvpMatrix;
=>
gl_Position = mvpMatrix * position;
Unknowingly, you have made a camera position change. In OpenGL ES, camera (global) and object (local) transforms are just inverses of each other.