Problem loading texture with transparency with OpenGL ES and Android

I'm trying to load an image with background transparency that will be layered over another texture. When I try to load it, all I get is a white screen. The texture is 512 by 512, and it's saved from Photoshop as a 24-bit PNG (standard PNG settings in the Save for Web & Devices dialog). Any idea why it's not showing? A texture without transparency shows without a problem. Here is my loadGLTexture method:
public void loadGLTexture(GL10 gl, Context context) {
    // Get the textures from the Android resource directory
    Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.m1);
    Bitmap normalScheduleLines = BitmapFactory.decodeResource(context.getResources(), R.drawable.m1n);

    // Generate texture names (three are generated; two are used here)...
    gl.glGenTextures(3, textures, 0);

    // ...and bind the first one
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[1]);
    // Create a mipmapped, edge-clamped texture
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    gl.glTexParameterf(GL11.GL_TEXTURE_2D, GL11.GL_GENERATE_MIPMAP, GL11.GL_TRUE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
    bitmap.recycle();

    // Bind our normal schedule bus map lines
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    gl.glTexParameterf(GL11.GL_TEXTURE_2D, GL11.GL_GENERATE_MIPMAP, GL11.GL_TRUE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, normalScheduleLines, 0);
    normalScheduleLines.recycle();
}

It was actually the automatically generated mipmaps that were preventing the PNG from being displayed. I changed
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
to
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
and sure enough, it worked. Not sure why it doesn't like the PNG with alpha, but when I do find out I will post here.
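One more thing worth checking for the layering itself: blending has to be enabled at draw time or the alpha channel is ignored. A minimal sketch, assuming the transparent overlay is drawn after the opaque base texture (note that GLUtils.texImage2D uploads Android Bitmaps with premultiplied alpha, in which case GL_ONE is the more correct source factor):

    gl.glEnable(GL10.GL_BLEND);
    // Straight-alpha blending; swap GL_SRC_ALPHA for GL_ONE if the
    // bitmap was uploaded premultiplied.
    gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
    // ... draw the base texture, then the transparent overlay ...
    gl.glDisable(GL10.GL_BLEND);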

Related

Loading image from Bytes into Texture2D generates low quality results

I'm using Unity and using an Image object to display an image by loading its bytes (coming from a JSON request) into a Texture2D object, but the resulting image is blurry and pixelated, at very low quality. This is the code:
Texture2D myTexture = new Texture2D(2, 2, TextureFormat.ARGB32, false);
myTexture.filterMode = FilterMode.Point;
myTexture.LoadImage(Bytes);
myImage.GetComponent<RawImage>().texture = myTexture;
The output is blurry and pixelated. Any idea how to improve the quality? There should be a way to make it look better: if I import the image into Unity as an asset and set its filter mode to Point, it actually looks pretty good, but in this case it just makes it worse, even though the original image is quite detailed.
Try creating the texture with the image's exact width and height:
Texture2D myTexture = new Texture2D(width, height, TextureFormat.ARGB32, false);
or
Texture2D myTexture = new Texture2D(1, 1, TextureFormat.ARGB32, false);

Unity texture from disk has low resolution

I am trying to load a texture from disk (and eventually create a sprite from it), but the sprite renders as a low-resolution image.
What I am doing:
-> Download the image from a URL. Once the image is downloaded, I save the texture to disk as a PNG so that the next time it doesn't require a download.
WWW www = new WWW(url);
yield return www;
if (www.isDone)
{
    if (string.IsNullOrEmpty(www.error))
    {
        Sprite img = Sprite.Create(www.texture, new Rect(0, 0, www.texture.width, www.texture.height), new Vector2(0, 0));
        reward.RewardSprite = img;
        byte[] bytes = www.texture.EncodeToPNG();
        FileManager.SaveRewardImage(reward.rewardId, bytes);
    }
    else
    {
        Debug.Log(www.error);
    }
}
-> Load from disk
string path = string.Format("Cache\\Venue\\{0}", nameWithoutExtension);
return Resources.Load<Texture2D>(path);
The first time, when the texture loads from the URL, its resolution seems fine (because it's the original one). When it loads from the cache, it degrades to a lower one.
Can someone tell me what I am missing, or whether there is a way around it?
Thanks in advance.
You can construct the texture explicitly in your Sprite.Create call, the same way you do with the rectangle, and specify the TextureFormat you need:
Sprite img = Sprite.Create(
    new Texture2D(www.texture.width, www.texture.height, TextureFormat.RGBA32, false), // choose the format and mipmap flag you need
    new Rect(0, 0, www.texture.width, www.texture.height),
    new Vector2(0, 0));

renderWithShader texture passing

I wish to create a night-vision effect with a shader for my camera. I have written the shader for a normal material, in which I pass a noise mask and a texture (in my camera example, the texture should be the image I get from the camera itself).
I have some questions. First, I see that I can apply a shader to the camera using Camera.RenderWithShader. The thing is that I don't know how to link the image of what I see through my camera to my shader. I would also like to pass the noise mask to my shader and don't know how to do that. This is different from having a material to which you can link the textures.
I found some code on the net that links the shader and the camera. The thing is, I don't know if it's good, because I can't see the final night-vision effect without knowing how to pass textures to the camera. I can see the view being altered, but I don't know if it's right.
void Start () {
    nightVisionShader = Shader.Find("Custom/nightvisionShader");
    Camera.mainCamera.RenderWithShader(nightVisionShader, "");
}

void OnRenderImage (RenderTexture source, RenderTexture destination)
{
    RenderTexture sceneNormals = RenderTexture.GetTemporary (source.width, source.height, 24, RenderTextureFormat.ARGB32);
    transform.camera.targetTexture = sceneNormals;
    transform.camera.RenderWithShader(nightVisionShader, "");
    transform.camera.targetTexture = null;
    // display contents in game view
    Graphics.Blit (sceneNormals, destination);
    RenderTexture.ReleaseTemporary (sceneNormals);
}
Found out how to do it!
void OnRenderImage (RenderTexture source, RenderTexture destination) {
    overlayMaterial.SetTexture ("_MainTex", Resources.Load("nightvision/") as Texture2D);
    overlayMaterial.SetTexture ("_noiseTex", Resources.Load("nightvision/noise_tex6") as Texture2D);
    overlayMaterial.SetTexture ("_maskTex", Resources.Load("nightvision/binoculars_mask") as Texture2D);
    overlayMaterial.SetFloat ("_elapsedTime", Time.time);
    Graphics.Blit (source, destination, overlayMaterial, 0);
}

OpenGL ES 2d rendering into image

I need to write an OpenGL ES two-dimensional renderer on iOS. It should draw primitives such as lines and polygons into a 2-D image (it will be rendering a vector map). Which way is best for getting an image out of the OpenGL context for this task? I mean, should I render the primitives into a texture and then get the image from it, or what? It would also be great if someone could point me to examples or tutorials of this kind of thing (2-D GL rendering into an image). Thanks in advance!
If you need to render an OpenGL ES 2-D scene, then extract an image of that scene to use outside of OpenGL ES, you have two main options.
The first is to simply render your scene and use glReadPixels() to grab RGBA data for the scene and place it in a byte array, like in the following:
GLubyte *rawImagePixels = (GLubyte *)malloc(totalBytesForImage); // totalBytesForImage = width * height * 4 for RGBA
glReadPixels(0, 0, (int)currentFBOSize.width, (int)currentFBOSize.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
// Do something with the image
free(rawImagePixels);
The second, and much faster, way of doing this is to render your scene to a texture-backed framebuffer object (FBO), where the texture has been provided by iOS 5.0's texture caches. I describe this approach in this answer, although I don't show the code for raw data access there.
You do the following to set up the texture cache and bind the FBO texture:
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[GPUImageOpenGLESContext sharedImageProcessingOpenGLESContext] context], NULL, &rawDataTextureCache);
if (err)
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}
// Code originally sourced from http://allmybrain.com/2011/12/08/rendering-to-a-texture-with-ios-5-texture-cache-api/
CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs;
// empty IOSurface properties dictionary
empty = CFDictionaryCreate(kCFAllocatorDefault,
                           NULL,
                           NULL,
                           0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                  1,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs,
                     kCVPixelBufferIOSurfacePropertiesKey,
                     empty);

//CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);
CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)imageSize.width,
                    (int)imageSize.height,
                    kCVPixelFormatType_32BGRA,
                    attrs,
                    &renderTarget);

CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             rawDataTextureCache, renderTarget,
                                             NULL, // texture attributes
                                             GL_TEXTURE_2D,
                                             GL_RGBA, // opengl format
                                             (int)imageSize.width,
                                             (int)imageSize.height,
                                             GL_BGRA, // native iOS format
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &renderTexture);
CFRelease(attrs);
CFRelease(empty);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);
and then you can just read directly from the bytes that back this texture (in BGRA format, not the RGBA of glReadPixels()) using something like:
CVPixelBufferLockBaseAddress(renderTarget, 0);
_rawBytesForImage = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
// Do something with the bytes
CVPixelBufferUnlockBaseAddress(renderTarget, 0);
However, if you only want to reuse your image within OpenGL ES, you just need to render your scene to a texture-backed FBO and then use that texture in your second pass of rendering.
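A minimal sketch of that two-pass structure, with hypothetical names (framebufferId, fboTexture, drawScene, drawSecondPass), written in the ES 2.0 Java bindings for consistency with the Android snippets in this thread; the C calls are identical apart from the prefix:

    // Pass 1: render the scene into the texture attached to framebufferId.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebufferId);
    drawScene();
    // Pass 2: switch back to the default framebuffer and sample the result.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fboTexture);
    drawSecondPass();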
I show an example of rendering to a texture, and then performing some processing on it, within the CubeExample sample application within my open source GPUImage framework, if you want to see this in action.

OpenGL ES 2.0 texturing

I'm trying to render a simple textured quad in OpenGL ES 2.0 on an iPhone. The geometry is fine and I get the expected quad if I use a solid color in my shader:
gl_FragColor = vec4 (1.0, 0.0, 0.0, 1.0);
And I get the expected gradients if I render the texture coordinates directly:
gl_FragColor = vec4 (texCoord.x, texCoord.y, 0.0, 1.0);
The image data is loaded from a UIImage, scaled to fit within 1024x1024, and loaded into a texture like so:
glGenTextures (1, &_texture);
glBindTexture (GL_TEXTURE_2D, _texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
width, height, and the contents of data are all correct, as examined in the debugger.
When I change my fragment shader to use the texture:
gl_FragColor = texture2D (tex, texCoord);
... and bind the texture and render like so:
glActiveTexture (GL_TEXTURE0);
glBindTexture (GL_TEXTURE_2D, _texture);
// this is unnecessary, as it defaults to 0, but for completeness...
GLuint texLoc = glGetUniformLocation(_program, "tex");
glUniform1i(texLoc, 0);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
... I get nothing. A black quad. glGetError() doesn't return an error and glIsTexture(_texture) returns true.
What am I doing wrong here? I've been over and over every example I could find online, but everybody is doing it exactly as I am, and the debugger shows my parameters to the various GL functions are what I expect them to be.
After glTexImage2D, set the MIN/MAG filters with glTexParameteri; the default minification filter uses mipmaps, so with that code the texture is incomplete and renders black.
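A minimal sketch of that fix, assuming a bound texture with no mipmap chain (shown in the Android GLES20 bindings used later in this thread; the iOS C calls are the same without the prefix):

    // With no mipmaps uploaded, the MIN filter must not be a *_MIPMAP_* mode,
    // otherwise the texture is incomplete and samples as black.
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);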
I was experiencing the same issue (black quad) and could not find an answer until a response by jfcalvo from this question led me to the cause. Basically make sure you are not loading the texture in a different thread.
Make sure you set the texture wrap parameters to GL_CLAMP_TO_EDGE in both the S and T directions. Without this, the texture is incomplete and will appear black (for non-power-of-two textures, ES 2.0 requires CLAMP_TO_EDGE wrapping and no mipmaps).
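A sketch of that wrap setup, under the same assumptions as the filter snippet above:

    // ES 2.0 treats non-power-of-two textures as incomplete unless
    // wrapping is CLAMP_TO_EDGE and no mipmapping is used.
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);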
Make sure that you are calling glTexImage2D with the right format constants.
Make sure that you are freeing the image's resources after glTexImage2D.
That's how I do it on Android:
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
mTextureID = textures[0];
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureID);

GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);

InputStream is = mContext.getResources().openRawResource(R.drawable.ywemmo2);
Bitmap bitmap;
try {
    bitmap = BitmapFactory.decodeStream(is);
} finally {
    try {
        is.close();
    } catch (IOException e) {
        // Ignore.
    }
}

GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();
Maybe you forgot to
    glEnable(GL_TEXTURE_2D);
In such a case, texture2D in the shader would return black, as the OP seems to experience. (Note, though, that this enable belongs to the fixed-function ES 1.x pipeline; in ES 2.0 it isn't required and generates GL_INVALID_ENUM.)