How do I fix the colours for this DirectX TK sprite batch render? - png

This is a png from Aseprite, rendered with DirectX TK.
This is the png, as seen in Aseprite.
The code to render the scene:
void Graphics::BeginScene()
{
    m_d3dDeviceContext->ClearRenderTargetView(m_renderTargetView.Get(), BackColor);
    m_d3dDeviceContext->OMSetRenderTargets(1, m_renderTargetView.GetAddressOf(), nullptr);
    CD3D11_VIEWPORT viewport(0.0f, 0.0f, static_cast<float>(m_backBufferWidth), static_cast<float>(m_backBufferHeight));
    m_d3dDeviceContext->RSSetViewports(1, &viewport);
    // No blend state is passed here, so SpriteBatch uses its default
    // (premultiplied alpha) blending.
    m_spriteBatch->Begin(SpriteSortMode_FrontToBack, nullptr, m_samplerState.Get());
}

Pro tip: use Pyxel Edit instead of Aseprite to export the PNG.
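For context on why the colours can shift in the first place: DirectX TK's SpriteBatch defaults to premultiplied-alpha blending, while Aseprite exports PNGs with straight (non-premultiplied) alpha. If that mismatch is the cause here, passing a non-premultiplied blend state to Begin (DirectXTK provides CommonStates::NonPremultiplied()) should fix the render without switching editors. A small Python sketch of the two blend equations shows how the same stored pixel comes out differently:

```python
def blend_straight(src_rgb, src_a, dst_rgb):
    # Straight alpha: out = src * a + dst * (1 - a)
    return tuple(s * src_a + d * (1 - src_a) for s, d in zip(src_rgb, dst_rgb))

def blend_premultiplied(src_rgb, src_a, dst_rgb):
    # Premultiplied alpha: out = src + dst * (1 - a); src is assumed to be
    # pre-scaled by a, which a straight-alpha PNG is not.
    return tuple(s + d * (1 - src_a) for s, d in zip(src_rgb, dst_rgb))

# A half-transparent red pixel, stored straight, over a blue background:
src, a, dst = (1.0, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0)
print(blend_straight(src, a, dst))       # (0.5, 0.0, 0.5) - correct
print(blend_premultiplied(src, a, dst))  # (1.0, 0.0, 0.5) - red too strong
```

Feeding straight-alpha data through the premultiplied equation over-weights the source colour wherever alpha is below 1, which typically shows up as fringing or washed-out edges around sprites.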

Related

Is there a way to import a texture in a specific format?

I'm trying to load a rendered image from Blender into Unity. But I saved the image with a color depth of 16 bits per channel, and now I want to have the same accuracy inside Unity. However, when I put the texture on a material and zoom in a bit, it looks like this:
As far as I can tell, this is only 8 bits per channel. What fixed it for me was overriding the format in the Texture Import Settings (from the default RGBA Compressed DXT5 to RGBA64, because my image also has an alpha channel):
And now the image looks nice again:
However, I would like to be able to import the image at runtime. So far I've been doing it like this:
Texture2D tex = new Texture2D(0, 0);
byte[] bytes = File.ReadAllBytes(path);
tex.LoadImage(bytes);
The problem is that, according to the documentation for Texture2D.LoadImage, "PNG files are loaded into ARGB32 format" by default. And even if I set the format when creating the Texture2D, it seems to get overwritten when I call LoadImage ("After LoadImage, texture size and format might change").
Is there a way to import an image in a specific format (at runtime)? Thanks in advance
Subclass AssetPostprocessor:
public class MyPostProcessor : AssetPostprocessor {
    public void OnPreprocessTexture() {
        if (assetPath.Contains("SomeFileName")) {
            TextureImporter textureImporter = (TextureImporter)assetImporter;
            TextureImporterPlatformSettings settings = new TextureImporterPlatformSettings();
            settings.textureCompression = TextureImporterCompression.Uncompressed;
            settings.format = TextureImporterFormat.RGBA64;
            textureImporter.SetPlatformTextureSettings(settings);
        }
    }
}
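As a sanity check on the runtime side, it can help to confirm that the PNG on disk really stores 16 bits per channel before debugging Unity's import path. Below is a stdlib-only Python sketch that reads the bit depth straight from the PNG's IHDR chunk; the hand-built header stands in for the bytes you would get from File.ReadAllBytes:

```python
import struct

def png_bit_depth(data: bytes) -> int:
    # PNG layout: 8-byte signature, then the IHDR chunk:
    # 4-byte length, b"IHDR", width (4), height (4), bit depth (1), ...
    assert data[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG file"
    assert data[12:16] == b"IHDR", "IHDR chunk not first"
    return data[24]

# Hand-built signature + IHDR for a 2x2 RGBA image, 16 bits per channel:
header = (b"\x89PNG\r\n\x1a\n"
          + struct.pack(">I", 13) + b"IHDR"
          + struct.pack(">II", 2, 2)
          + bytes([16, 6, 0, 0, 0]))  # depth 16, colour type 6 (RGBA)
print(png_bit_depth(header))  # 16
```

If this prints 8 for your exported file, the precision was already lost in Blender's export settings rather than in Unity's importer.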

Unity 5.1 Distorted image after download from web

When I load my PNGs after compressing them with TinyPNG, they get distorted (all purple and transparent):
http://s22.postimg.org/b39g0bhn5/Screen_Shot_2015_06_28_at_10_39_50_AM.png
The background, for example, should be blue:
http://postimg.org/image/fez234o6d/
This only happens with pictures that were compressed by tinypng.com, and only after I updated to Unity 5.1. I'm downloading the image with the WWW class and loading the texture using Texture2D. Is this problem known to anyone?
I had exactly the same issue. I was able to solve it using the following code:
mat.mainTexture = new Texture2D(32, 32, TextureFormat.DXT5, false);
Texture2D newTexture = new Texture2D(32, 32, TextureFormat.DXT5, false);
WWW stringWWW = new WWW(texture1URL);
yield return stringWWW;
if (stringWWW.error == null)
{
    stringWWW.LoadImageIntoTexture(newTexture);
    mat.mainTexture = newTexture;
}
The key seemed to be using DXT5 as the texture format, together with the method LoadImageIntoTexture(...).
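For background on why the texture format matters here: DXT5 (BC3) is a block-compressed format that packs each 4x4 block of pixels into 16 bytes, which is why it only applies cleanly to images whose dimensions are multiples of 4. A quick sketch of the size arithmetic (the function name is illustrative, not Unity API):

```python
def dxt5_size(width: int, height: int) -> int:
    # DXT5/BC3 stores each 4x4 pixel block in 16 bytes;
    # dimensions are rounded up to whole blocks.
    blocks_w = (width + 3) // 4
    blocks_h = (height + 3) // 4
    return blocks_w * blocks_h * 16

print(dxt5_size(32, 32))    # 1024 bytes, i.e. 1 byte per pixel
print(dxt5_size(512, 512))  # 262144
```

At one byte per pixel, DXT5 is a quarter of the size of uncompressed RGBA32, which is presumably why the DXT5 path behaves differently from a plain ARGB32 texture here.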

Gtk3 loading PixbufAnimation inside DrawingArea?

For my GStreamer application I thought about showing a simple loader before handing the DrawingArea widget over to the sink element. The basic idea was to load an animated .gif inside a Gtk.DrawingArea, but I ran into a problem with the documentation. I found PixbufAnimation and used it with a Gtk.Image widget, but the same logic doesn't work for Gtk.DrawingArea, and since it has no add method I don't know what to do, so as a last resort I came here for help.
This is what I did with Gtk.Image:
from gi.repository import Gdk, Gtk, GdkPixbuf

class animatedWin(Gtk.Window):
    def __init__(self):
        Gtk.Window.__init__(self, width_request=640, height_request=480)
        self.canvas = Gtk.Image()
        self.add(self.canvas)
        self.load_file()
        self.connect("delete-event", Gtk.main_quit)

    def load_file(self):
        self.loader = GdkPixbuf.PixbufAnimation.new_from_file("loader.gif")
        self.canvas.set_from_animation(self.loader)

app = animatedWin()
app.show_all()
Gtk.main()
Is it possible to achieve the same thing with a DrawingArea?
DrawingArea, like most widgets in GTK 3, uses cairo for drawing. Cairo draws on surfaces using a context. You can convert a pixbuf into a surface with
public Surface Gdk.cairo_surface_create_from_pixbuf (Pixbuf pixbuf, int scale, Window? for_window)
and back with
public Pixbuf? Gdk.pixbuf_get_from_surface (Surface surface, int src_x, int src_y, int width, int height)
(taken from valadoc.org)
Example code snippet from my drawing app (I'm learning Vala while writing it, so it may not be the best implementation):
private void on_scale (Gtk.Button button) { // on button press
    var my_pixbuf = Gdk.pixbuf_get_from_surface (this.buf_surface, 0, 0, CANVAS_WIDTH, CANVAS_HEIGHT);
    var tmp_surface = Gdk.cairo_surface_create_from_pixbuf (my_pixbuf, 2, null);
    var ctx = this.ccc; // this.ccc is the context of the drawing surface
    ctx.set_source_surface (tmp_surface, 0, 0);
    ctx.paint ();
    drawing_area.queue_draw (); // ask drawing_area to redraw; on redraw a method paints the widget's surface with the drawing surface
}
PS: see http://valadoc.org/#!api=cairo/Cairo for more info on cairo. As I see it, cairo is used for vector graphics and pixbuf for raster.
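To answer the original question directly: yes. In GTK 3 the DrawingArea's "draw" signal already hands you a cairo context, so you can paint the animation's current frame with Gdk.cairo_set_source_pixbuf and advance frames on a timer; no pixbuf-to-surface conversion is needed. An untested PyGObject sketch along the lines of the asker's Gtk.Image version (loader.gif assumed to exist):

```python
from gi.repository import Gdk, Gtk, GdkPixbuf, GLib

class AnimatedArea(Gtk.Window):
    def __init__(self):
        Gtk.Window.__init__(self, width_request=640, height_request=480)
        self.area = Gtk.DrawingArea()
        self.add(self.area)
        anim = GdkPixbuf.PixbufAnimation.new_from_file("loader.gif")
        self.it = anim.get_iter(None)  # frame iterator, starts at "now"
        self.area.connect("draw", self.on_draw)
        GLib.timeout_add(self.it.get_delay_time(), self.on_tick)
        self.connect("delete-event", Gtk.main_quit)

    def on_draw(self, widget, cr):
        # Paint the current frame's pixbuf through the cairo context.
        Gdk.cairo_set_source_pixbuf(cr, self.it.get_pixbuf(), 0, 0)
        cr.paint()

    def on_tick(self):
        self.it.advance(None)    # move to the next frame
        self.area.queue_draw()   # schedule a redraw with the new frame
        GLib.timeout_add(self.it.get_delay_time(), self.on_tick)
        return False             # each timeout reschedules itself once

AnimatedArea().show_all()
Gtk.main()
```

The per-frame delay comes from the GIF itself via get_delay_time(), so variable frame timings are respected.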

RenderWithShader texture passing

I wish to create a nightvision effect with a shader for my camera. I have written the shader for a normal material, in which I pass a noise mask and a texture (in my camera example, the texture should be the image I get from the camera itself).
I have some questions. First, I see that I can pass a shader to the camera using Camera.RenderWithShader. The thing is that I don't know how to link the image of what I see through my camera to my shader. I would also like to pass the noise mask to my shader and don't know how. This is different from having a material to which you can link the textures.
I found some code on the net for linking the shader and the camera. The thing is that I don't know if it's good, because I can't see the final nightvision effect without knowing how to pass textures to the camera. I can see the view changing, but I don't know if it's right.
void Start () {
    nightVisionShader = Shader.Find("Custom/nightvisionShader");
    Camera.mainCamera.RenderWithShader(nightVisionShader, "");
}

void OnRenderImage (RenderTexture source, RenderTexture destination)
{
    RenderTexture sceneNormals = RenderTexture.GetTemporary (source.width, source.height, 24, RenderTextureFormat.ARGB32);
    transform.camera.targetTexture = sceneNormals;
    transform.camera.RenderWithShader(nightVisionShader, "");
    transform.camera.targetTexture = null;
    // display contents in game view
    Graphics.Blit (sceneNormals, destination);
    RenderTexture.ReleaseTemporary (sceneNormals);
}
Found how to do it! OnRenderImage already receives the camera image as its source argument, so the textures can be set on an overlay material and blitted:
void OnRenderImage (RenderTexture source, RenderTexture destination) {
    overlayMaterial.SetTexture ("_MainTex", Resources.Load("nightvision/") as Texture2D);
    overlayMaterial.SetTexture ("_noiseTex", Resources.Load("nightvision/noise_tex6") as Texture2D);
    overlayMaterial.SetTexture ("_maskTex", Resources.Load("nightvision/binoculars_mask") as Texture2D);
    overlayMaterial.SetFloat ("_elapsedTime", Time.time);
    Graphics.Blit (source, destination, overlayMaterial, 0);
}

Problem loading texture with transparency with OpenGL ES and Android

I'm trying to load an image with background transparency that will be layered over another texture. When I try to load it, all I get is a white screen. The texture is 512 by 512, and it's saved in Photoshop as a 24-bit PNG (standard PNG specs in the Photoshop Save for Web and Devices config window). Any idea why it's not showing? The texture without transparency shows without a problem. Here is my loadTextures method:
public void loadGLTexture(GL10 gl, Context context) {
    // Get the textures from the Android resource directory
    Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.m1);
    Bitmap normalScheduleLines = BitmapFactory.decodeResource(context.getResources(), R.drawable.m1n);

    // Generate texture pointers...
    gl.glGenTextures(3, textures, 0);

    // ...and bind the first one
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[1]);
    // Create nearest filtered texture
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    gl.glTexParameterf(GL11.GL_TEXTURE_2D, GL11.GL_GENERATE_MIPMAP, GL11.GL_TRUE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
    bitmap.recycle();

    // Bind our normal schedule bus map lines
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
    // Create nearest filtered texture
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    gl.glTexParameterf(GL11.GL_TEXTURE_2D, GL11.GL_GENERATE_MIPMAP, GL11.GL_TRUE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, normalScheduleLines, 0);
    normalScheduleLines.recycle();
}
It was actually the automatically generated mipmaps that were preventing the PNG from being displayed. I changed
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR_MIPMAP_NEAREST);
to
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
and sure enough, it worked. I'm not sure why it doesn't like the PNG with alpha, but when I find out I will post here.
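That fix is consistent with how texture completeness works: a mipmapped minification filter such as GL_LINEAR_MIPMAP_NEAREST makes the texture "incomplete" unless every level of the chain down to 1x1 exists, and an incomplete texture typically renders as solid white in fixed-function GL, which matches the white screen described above. A quick sketch of what a complete chain for a 512x512 texture has to contain (plain Python, illustrative only):

```python
def mipmap_chain(width: int, height: int):
    # A complete chain halves each dimension (minimum 1) down to 1x1.
    levels = [(width, height)]
    while width > 1 or height > 1:
        width, height = max(width // 2, 1), max(height // 2, 1)
        levels.append((width, height))
    return levels

chain = mipmap_chain(512, 512)
print(len(chain))  # 10 levels: 512x512 down to 1x1
print(chain[:3])   # [(512, 512), (256, 256), (128, 128)]
```

Switching the min filter to GL_NEAREST means only level 0 is ever sampled, so any missing levels no longer matter.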