I'm trying to render a simple textured quad in OpenGL ES 2.0 on an iPhone. The geometry is fine and I get the expected quad if I use a solid color in my shader:
gl_FragColor = vec4 (1.0, 0.0, 0.0, 1.0);
And I get the expected gradients if I render the texture coordinates directly:
gl_FragColor = vec4 (texCoord.x, texCoord.y, 0.0, 1.0);
The image data is loaded from a UIImage, scaled to fit within 1024x1024, and loaded into a texture like so:
glGenTextures (1, &_texture);
glBindTexture (GL_TEXTURE_2D, _texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA,
GL_UNSIGNED_BYTE, data);
width, height, and the contents of data are all correct, as examined in the debugger.
When I change my fragment shader to use the texture:
gl_FragColor = texture2D (tex, texCoord);
... and bind the texture and render like so:
glActiveTexture (GL_TEXTURE0);
glBindTexture (GL_TEXTURE_2D, _texture);
// this is unnecessary, as it defaults to 0, but for completeness...
GLuint texLoc = glGetUniformLocation(_program, "tex");
glUniform1i(texLoc, 0);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
... I get nothing. A black quad. glGetError() doesn't return an error and glIsTexture(_texture) returns true.
What am I doing wrong here? I've been over and over every example I could find online, but everybody is doing it exactly as I am, and the debugger shows my parameters to the various GL functions are what I expect them to be.
After glTexImage2D, set the MIN/MAG filters with glTexParameteri. The default minification filter uses mipmaps, so with that code (which never generates mipmaps) the texture is incomplete and samples as black.
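A minimal sketch of the fix (assuming no mipmaps are generated for this texture), placed right after the glTexImage2D call:

// Without these, the default GL_NEAREST_MIPMAP_LINEAR minification filter
// leaves a texture with no mipmap levels incomplete, and texture2D() returns black.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// If the scaled image is not a power of two, ES 2.0 additionally requires clamping:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);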
I was experiencing the same issue (black quad) and could not find an answer until a response by jfcalvo from this question led me to the cause. Basically make sure you are not loading the texture in a different thread.
Make sure you set the texture wrap parameters to GL_CLAMP_TO_EDGE in both the S and T directions. Without this, the texture is incomplete and will appear black.
Make sure you are calling glTexImage2D with the right format constants.
Make sure you free the image resources after glTexImage2D.
Here is how I do it on Android:
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
mTextureID = textures[0];
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureID);

GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);

InputStream is = mContext.getResources().openRawResource(R.drawable.ywemmo2);
Bitmap bitmap;
try {
    bitmap = BitmapFactory.decodeStream(is);
} finally {
    try {
        is.close();
    } catch (IOException e) {
        // Ignore.
    }
}

GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();
Maybe you forgot to call
glEnable(GL_TEXTURE_2D);
In such a case, texture2D in the shader would return black, as the OP seems to be experiencing.
I want to manipulate a texture that has been created in Unity directly with OpenGL.
I create the texture in Unity with these parameters:
_renderTexture = new RenderTexture(_sizeTexture, _sizeTexture, 0,
    RenderTextureFormat.ARGB32, RenderTextureReadWrite.Linear)
{
    useMipMap = false,
    autoGenerateMips = false,
    anisoLevel = 6,
    filterMode = FilterMode.Trilinear,
    wrapMode = TextureWrapMode.Clamp,
    enableRandomWrite = true
};
Then I send the texture pointer to a native rendering plugin with the GetNativeTexturePtr() method. In the native rendering plugin I bind the texture with glBindTexture(GL_TEXTURE_2D, gltex); where gltex is the pointer to my Unity texture.
Finally, I check the internal format of my texture with:
GLint format;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &format);
I have format = GL_RGBA8 even though I defined the texture in Unity with the format RenderTextureFormat.ARGB32. You can reproduce this by using the native rendering plugin example of Unity and just replacing the function RenderAPI_OpenGLCoreES::EndModifyTexture of the file RenderAPI_OpenGLCoreES.cpp by :
void RenderAPI_OpenGLCoreES::EndModifyTexture(void* textureHandle, int textureWidth, int textureHeight, int rowPitch, void* dataPtr)
{
    GLuint gltex = (GLuint)(size_t)(textureHandle);
    // Update texture data, and free the memory buffer
    glBindTexture(GL_TEXTURE_2D, gltex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F, textureWidth, textureHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, NULL);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, textureWidth, textureHeight, GL_RGBA, GL_UNSIGNED_BYTE, dataPtr);

    GLint format;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &format);

    delete[] (unsigned char*)dataPtr;
}
Why did the internal format change after binding the texture in OpenGL? And is it possible to "impose" the GL_RGBA32F format on my OpenGL texture?
I tried using glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, textureWidth, textureHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL); after binding, but I get the following error: OpenGL error 0x0502 (GL_INVALID_OPERATION).
Sorry if the question is simple; I am new to OpenGL!
I just misread the documentation; moreover, Unity and OpenGL do not use the same naming convention. I thought RenderTextureFormat.ARGB32 was a texture with 32 bits per channel, but it is 32 bits in total, i.e. 8 bits per channel, which corresponds to GL_RGBA8.
There is a summary table of Unity formats here: https://docs.unity3d.com/Manual/class-ComputeShader.html.
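If 32 bits per channel is actually what is needed, the texture would have to be created as a float format on the Unity side (e.g. RenderTextureFormat.ARGBFloat) rather than respecified from the plugin, and the native upload matched to a float type. A hypothetical sketch, assuming a tightly packed RGBA float buffer in dataPtr:

// Assumes the Unity RenderTexture was created with a 32-bit-per-channel
// float format, so the existing storage is already a float internal format.
glBindTexture(GL_TEXTURE_2D, gltex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, textureWidth, textureHeight,
                GL_RGBA, GL_FLOAT, dataPtr);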
So I have a Unity3D plugin written in C++ and compiled for Android.
When I started off I used OpenGL ES 2 to maximize device reach, but recently I decided I wanted to try moving up to OpenGL ES 3, so I included the gl3 headers instead of the gl2 headers, built, and switched the Graphics API in Unity to OpenGLES3.
Unfortunately, it is not working correctly anymore.
The code is the following:
In one plugin I have this:
glBindTexture(GL_TEXTURE_2D, textureID);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, videoWidth, videoHeight, glPixFormat, glPixType, frameBuffer->buff);
where textureID is a pointer to a texture passed by Unity, and frameBuffer->buff is the byte array of the image I want to put into it.
In a second plugin I have this:
static std::vector<unsigned char> emptyPixelsAlpha(height * width, 0);
glBindTexture(GL_TEXTURE_2D, alphaID);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_ALPHA, GL_UNSIGNED_BYTE, emptyPixelsAlpha.data());
where alphaID is a pointer to a texture passed by Unity. (I simplified this part a bit; in this case it simply fills another texture, the same size as the previous one, with black on a single channel.)
These two textures are fed to the following shader on the Unity side:
Shader "alphaMaskShader"
{
Properties{
_MainTex("Base (RGB)", 2D) = "white" {}
_Alpha("Alpha (A)", 2D) = "white" {}
}
SubShader{
Tags{ "RenderType" = "Transparent" "Queue" = "Overlay" }
ZWrite Off
ZTest Off
Blend SrcAlpha OneMinusSrcAlpha
ColorMask RGB
Pass{
SetTexture[_MainTex]{
Combine texture
}
SetTexture[_Alpha]{
Combine previous, texture
}
}
}
}
Before, this code would simply make the displayed texture completely invisible, since the "alphaID" texture becomes the alpha channel.
Now it instead displays the "textureID" texture as if the alpha channel isn't there, or as if it is somehow set to complete opaqueness.
I read over the OpenGL ES 3 spec, which clearly states that it is backward compatible with OpenGL ES 2, and I haven't found much about porting issues.
I found what the problem was. Apparently something has changed in how GL_ALPHA works between ES 2.0 and 3.0 (or 3.2, since that's what my device is on).
I got it working again by swapping GL_ALPHA with GL_RED.
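Applied to the upload in the second plugin above, the change is just the format argument (my reading is that under ES 3 the texture Unity hands over is a single-channel one, so GL_RED is the matching client format; the docs don't spell this out):

glBindTexture(GL_TEXTURE_2D, alphaID);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RED, GL_UNSIGNED_BYTE, emptyPixelsAlpha.data());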
EDIT: This apparently only works on Oreo devices; the problem still persists on Nougat. I've created a dedicated question for this issue: Simple OpenGL ES 3 function works on Oreo but not on Nougat
I am in OpenGL ES 2.0 with GLKit trying to render on iOS devices.
Basically, my goal is to draw to a texture instead of the main buffer, then render that texture to the screen. I have been trying to follow another topic on SO. Unfortunately they mention something about power of two (I'm assuming with regard to resolution), but I don't know how to fix it. Anyway, here is my Swift interpretation of the code from that topic.
import Foundation
import GLKit
import OpenGLES
class RenderTexture {
    var framebuffer: GLuint = 0
    var tex: GLuint = 0
    var old_fbo: GLint = 0

    init(width: GLsizei, height: GLsizei)
    {
        glGetIntegerv(GLenum(GL_FRAMEBUFFER_BINDING), &old_fbo)

        glGenFramebuffers(1, &framebuffer)
        glGenTextures(1, &tex)

        glBindFramebuffer(GLenum(GL_FRAMEBUFFER), framebuffer)
        glBindTexture(GLenum(GL_TEXTURE_2D), tex)
        glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, GLsizei(width), GLsizei(height), 0, GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), nil)
        glFramebufferTexture2D(GLenum(GL_FRAMEBUFFER), GLenum(GL_COLOR_ATTACHMENT0), GLenum(GL_TEXTURE_2D), tex, 0)

        glClearColor(0, 0.1, 0, 1)
        glClear(GLenum(GL_COLOR_BUFFER_BIT))

        let status = glCheckFramebufferStatus(GLenum(GL_FRAMEBUFFER))
        if (status != GLenum(GL_FRAMEBUFFER_COMPLETE))
        {
            print("DIDNT GO WELL WITH", width, " ", height)
            print(status)
        }

        glBindFramebuffer(GLenum(GL_FRAMEBUFFER), GLenum(old_fbo))
    }

    func begin()
    {
        glGetIntegerv(GLenum(GL_FRAMEBUFFER_BINDING), &old_fbo)
        glBindFramebuffer(GLenum(GL_FRAMEBUFFER), framebuffer)
    }

    func end()
    {
        glBindFramebuffer(GLenum(GL_FRAMEBUFFER), GLenum(old_fbo))
    }
}
Then, as far as rendering, I have a few things going on.
First, code that theoretically renders any texture full screen. This has been tested with two manually loaded PNGs (using no buffer changes) and works great.
func drawTriangle(texture: GLuint)
{
    loadBuffers()

    //glViewport(0, 0, width, height)
    //glClearColor(0, 0.0, 0, 1.0)
    //glClear(GLbitfield(GL_COLOR_BUFFER_BIT) | GLbitfield(GL_DEPTH_BUFFER_BIT))

    glEnable(GLenum(GL_TEXTURE_2D))
    glActiveTexture(GLenum(GL_TEXTURE0))

    glUseProgram(texShader)

    let loc1 = glGetUniformLocation(texShader, "s_texture")
    glUniform1i(loc1, 0)

    let loc3 = glGetUniformLocation(texShader, "matrix")
    if (loc3 != -1)
    {
        glUniformMatrix4fv(loc3, 1, GLboolean(GL_FALSE), &matrix)
    }

    glBindTexture(GLenum(GL_TEXTURE_2D), texture)
    glDrawArrays(GLenum(GL_TRIANGLE_STRIP), 0, 6)

    glDisable(GLenum(GL_TEXTURE_2D))

    destroyBuffers()
}
I also have a function that draws a couple of dots on the screen. You don't really need to see its internals, but it works. This is how I will know that OpenGL is drawing from the buffer texture and NOT a preloaded texture.
Finally, here is the gist of the code I am trying to run:
func initialize()
{
    nfbo = RenderTexture(width: width, height: height)
}

func draw()
{
    glViewport(0, 0, GLsizei(width * 2), GLsizei(height * 2)) // why do I have to multiply by 2 to get it to work?????
    nfbo.begin()
    drawDots() // Draws the dots
    nfbo.end()
    reset()
    drawTriangle(nfbo.tex)
}
At the end of all this, all that is drawn is a blank screen. If there is any more code that would help you figure things out, let me know. I tried to trim it down to make it less annoying for you.
Note: Considering the whole power-of-two thing, I have tried passing the FBO class 512 x 512, just in case being a power of two would make things work. Unfortunately it didn't.
Another note: Everything I am doing is going to be 2D, so I don't need depth buffers, right?
Yesterday I saw exactly the same issue.
After struggling for hours, I found out why.
The trick is configuring your texture map with the following:
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_CLAMP_TO_EDGE);
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GL_CLAMP_TO_EDGE);
Otherwise, you won't draw anything on the texture map.
The reason seems to be that while iOS supports texture maps that are not a power of two, it requires GL_CLAMP_TO_EDGE; otherwise it won't work.
It should really report an incomplete framebuffer; it took me quite a long time to debug this problem!
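For completeness, a sketch of the full parameter set an NPOT render-target texture needs on ES 2.0, expressed with the C API (the same calls work from Swift as in the snippet above); the filters must avoid mipmaps as well, since NPOT textures cannot be mipmapped in ES 2.0:

glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);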
Here is a related discussion:
Rendering to non-power-of-two texture on iPhone
I need to write an OpenGL ES two-dimensional renderer on iOS. It should draw some primitives such as lines and polygons into a 2D image (it will be rendering a vector map). What is the best way to get an image out of the OpenGL context for that task? I mean, should I render these primitives into a texture and then get the image from it, or what? Also, it would be great if someone could give examples or tutorials that look like the thing I need (2D GL rendering into an image). Thanks in advance!
If you need to render an OpenGL ES 2-D scene, then extract an image of that scene to use outside of OpenGL ES, you have two main options.
The first is to simply render your scene and use glReadPixels() to grab RGBA data for the scene and place it in a byte array, like in the following:
// totalBytesForImage = width * height * 4 bytes (RGBA)
GLubyte *rawImagePixels = (GLubyte *)malloc(totalBytesForImage);
glReadPixels(0, 0, (int)currentFBOSize.width, (int)currentFBOSize.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
// Do something with the image
free(rawImagePixels);
The second, and much faster, way of doing this is to render your scene to a texture-backed framebuffer object (FBO), where the texture has been provided by iOS 5.0's texture caches. I describe this approach in this answer, although I don't show the code for raw data access there.
You do the following to set up the texture cache and bind the FBO texture:
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[GPUImageOpenGLESContext sharedImageProcessingOpenGLESContext] context], NULL, &rawDataTextureCache);
if (err)
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}

// Code originally sourced from http://allmybrain.com/2011/12/08/rendering-to-a-texture-with-ios-5-texture-cache-api/
CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs;
empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                           NULL,
                           NULL,
                           0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                  1,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs,
                     kCVPixelBufferIOSurfacePropertiesKey,
                     empty);

//CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);
CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)imageSize.width,
                    (int)imageSize.height,
                    kCVPixelFormatType_32BGRA,
                    attrs,
                    &renderTarget);

CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             rawDataTextureCache, renderTarget,
                                             NULL, // texture attributes
                                             GL_TEXTURE_2D,
                                             GL_RGBA, // opengl format
                                             (int)imageSize.width,
                                             (int)imageSize.height,
                                             GL_BGRA, // native iOS format
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &renderTexture);
CFRelease(attrs);
CFRelease(empty);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);
and then you can just read directly from the bytes that back this texture (in BGRA format, not the RGBA of glReadPixels()) using something like:
CVPixelBufferLockBaseAddress(renderTarget, 0);
_rawBytesForImage = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
// Do something with the bytes
CVPixelBufferUnlockBaseAddress(renderTarget, 0);
However, if you just want to reuse your image within OpenGL ES, you just need to render your scene to a texture-backed FBO and then use that texture in your second level of rendering.
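As a point of reference, a bare texture-backed FBO (without the texture cache machinery) can be set up with something like the following sketch; the 1024x768 size and the offscreenTexture/offscreenFBO names are illustrative:

GLuint offscreenTexture, offscreenFBO;

// Create the texture that will receive the rendering.
glGenTextures(1, &offscreenTexture);
glBindTexture(GL_TEXTURE_2D, offscreenTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 768, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// Attach it as the color buffer of a framebuffer object.
glGenFramebuffers(1, &offscreenFBO);
glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, offscreenTexture, 0);

// Render the first pass into offscreenFBO, then rebind the default framebuffer
// and sample offscreenTexture in the second pass.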
I show an example of rendering to a texture, and then performing some processing on it, within the CubeExample sample application within my open source GPUImage framework, if you want to see this in action.
I'm trying to load an image that has background transparency and will be layered over another texture. When I try to load it, all I get is a white screen. The texture is 512 by 512, and it's saved from Photoshop as a 24-bit PNG (standard PNG settings in the Photoshop Save for Web and Devices config window). Any idea why it's not showing? The texture without transparency shows without a problem. Here is my loadTextures method:
public void loadGLTexture(GL10 gl, Context context) {
    //Get the texture from the Android resource directory
    Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.m1);
    Bitmap normalScheduleLines = BitmapFactory.decodeResource(context.getResources(), R.drawable.m1n);

    //Generate texture pointers...
    gl.glGenTextures(3, textures, 0);

    //...and bind it to our array
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[1]);
    //Create Nearest Filtered Texture
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    gl.glTexParameterf(GL11.GL_TEXTURE_2D, GL11.GL_GENERATE_MIPMAP, GL11.GL_TRUE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
    bitmap.recycle();

    //Bind our normal schedule bus map lines
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
    //Create Nearest Filtered Texture
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    gl.glTexParameterf(GL11.GL_TEXTURE_2D, GL11.GL_GENERATE_MIPMAP, GL11.GL_TRUE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, normalScheduleLines, 0);
    normalScheduleLines.recycle();
}
It was actually the automatically generated mipmaps that were preventing the PNG from being displayed. I changed
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
to
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
and sure enough, it worked. Not sure why it doesn't like the PNG with alpha, but when I find out I will post here.