I imported my artwork into the simulator and this is how it looks:
The alpha channel is not honored in the simulator.
The PNG was made in Photoshop on a Windows XP machine and imported
on a Mac.
Is this only a simulator issue? I don't have a device to test on.
In case it's not simulator-specific, what should I do to improve the PNG quality?
This will happen on devices too. The phenomenon is called 'colour banding'; it is caused by the limited number of bits used to represent each colour channel, and it affects all images, but especially images with slow gradients. You can lessen the effect by applying a dithering algorithm (I believe the TexturePacker tool has some post-processing dithering capabilities). See this article for an example of the effect of some dithering algorithms.
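To make the mechanism concrete, here is a minimal Swift sketch (illustrative values of my own, not the answerer's code) of what quantising an 8-bit channel down to 5 bits does to a slow gradient, and how adding a little noise before quantising, the simplest form of dithering, trades the visible bands for fine grain:

    import Foundation

    // Quantize an 8-bit channel value down to `bits` bits and expand it
    // back to 8 bits. Neighbouring input values collapse onto the same
    // step, which is what shows up as bands in slow gradients.
    func quantize(_ value: UInt8, toBits bits: Int) -> UInt8 {
        let levels = (1 << bits) - 1
        let q = Int((Double(value) / 255.0 * Double(levels)).rounded())
        return UInt8(q * 255 / levels)
    }

    // The same quantization, but with random noise of up to half a step
    // added first; the banding is broken up into less visible grain.
    func quantizeDithered(_ value: UInt8, toBits bits: Int) -> UInt8 {
        let step = 255.0 / Double((1 << bits) - 1)
        let noise = Double.random(in: -step / 2 ... step / 2)
        let noisy = min(max(Double(value) + noise, 0), 255)
        return quantize(UInt8(noisy), toBits: bits)
    }

    // A slow gradient: without dithering there are long runs of identical
    // output values (the bands); with dithering the transitions scatter.
    let gradient = (0...255).map { UInt8($0) }
    let banded   = gradient.map { quantize($0, toBits: 5) }
    let dithered = gradient.map { quantizeDithered($0, toBits: 5) }

Real dithering algorithms (ordered or error-diffusion ones such as Floyd-Steinberg) are smarter about where the noise goes, but the principle is the same.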
I'm using some animated GIFs on our mobile site. It's a clock animation, and since the iOS 5 update the clock sometimes turns blue instead of red as intended. This happens on the iPhone 4 and iPhone 5 with the new OS.
Any ideas what could cause the problem? It's hard to reproduce this failure, but it happens from time to time.
Any help would be appreciated.
Sometimes iOS devices may not be able to process all the images because of their relatively low graphics capability compared to that of a computer. Instead of using a GIF, I would suggest using an animated PNG. These have become more popular on iOS devices as GIFs have fallen out of favour there. I am not sure how fast this would be, but I would say it puts less stress on the device than a GIF. Another idea, since it is a clock animation, is to analyze the GIF in a program and look for any problems. Also, use ImageOptim (for Mac) or pngcrush (for Windows) to reduce the file size and so reduce the stress on the processor.
Use GIF 128 Dithered, and make sure your image sizes are based on the target resolution; this can happen when image sizes don't match the retina or normal resolutions. If I'm not wrong, you are facing this problem only on retina devices, so maybe the details below (and the quick check sketched after them) will help:
iPhone Retina Display
~~~~~~~~~~~~~~~~~~~~~~~
Width - 640px
Height - 960px (including the 40px status bar)
DPI - 326
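As a quick sanity check that you are really serving retina-sized artwork, here is a small Swift sketch (illustrative only, not from the original answer) that reads the screen scale at runtime:

    import UIKit

    // Report the screen size in pixels so you can confirm which artwork
    // resolution (e.g. 640x960 retina vs 320x480 normal) is needed.
    let scale = UIScreen.main.scale        // 2.0 on retina iPhones, 1.0 otherwise
    let bounds = UIScreen.main.bounds      // in points, e.g. 320x480
    let pixelWidth  = bounds.width * scale
    let pixelHeight = bounds.height * scale
    print("Screen is \(pixelWidth)x\(pixelHeight) pixels (scale \(scale))")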
You can use the Cocos2D framework; Cocos2D and UIKit both work great without any graphics-related errors. Sprite sheets reduce memory usage and support transparent images, and you can run, stop and repeat animations. Good luck.
Images that I add to the layer in Cocos2d look pixelated around the edges (e.g. a hillside, where the rounded part of the hill meets the sky). I don't know if it's the image quality, or just that the graphics processor on my 'older' MacBook Pro is not as advanced as the one in the iPhone 4, iPod touch 4 or iPad 2. Is it because of that?
I have run into an instance where a large-resolution image was being used in a locally packaged HTML file in a UIWebView. The image looked fine in the simulator, but when run on a hardware device, a bug was exposed in the rendering engine where it would invert the colors. Here's a bug report as an example of this. The solution was to scale the image down a bit in a photo editing application.
While an extreme corner case, this is an example of the simulator not quite living up to how things will work on a hardware device.
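If resizing in an editor isn't convenient, a programmatic downscale is also an option; this is only a sketch of the idea (the target size is arbitrary), not the fix used in the bug report above:

    import UIKit

    // Scale a large UIImage down before it is used, as an alternative to
    // resizing it in a photo editing application. Pick a target size that
    // stays below whatever triggers the rendering bug on the device.
    func downscaled(_ image: UIImage, to targetSize: CGSize) -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(targetSize, false, image.scale)
        defer { UIGraphicsEndImageContext() }
        image.draw(in: CGRect(origin: .zero, size: targetSize))
        return UIGraphicsGetImageFromCurrentImageContext()
    }

The result could then be written out again with pngData() and referenced from the packaged HTML instead of the original file.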
The simulator usually does a pretty good job of representing what the final image will look like. For the image quality on a normal computer to be so much worse than on an iPhone that it makes a noticeable difference, your MacBook Pro would have to be really bad, so I doubt it.
However, if you really want to make sure, the best way to check would be to transfer the image you are using to another machine and see if it still looks pixelated. If it does, it's a problem with your image.
Hope this helps and good luck!
I am writing an application which contains some graphs drawn in OpenGL ES. Each of these graphs is in a table cell, and when I tap one of the graphs it opens in full-screen mode.
Everything worked perfectly until I upgraded to iOS 4.2. Now the problem is that in the simulator I can't see the drawn graph in the cells, but in full-screen mode I do see the chart. Nothing has changed on the device; it happens only in the simulator and only in this one case.
The behavior is the same on other Macs here.
Does anyone have a clue?
As explained in this answer to a similar question, there has been a change in the way that 4.2 handles renderbuffers in Core Animation layers. From the OpenGL ES Programming Guide:
In iOS 4.2 and later, the performance of Core Animation rotations of renderbuffers have been significantly improved, and are now the preferred way to rotate content between landscape and portrait mode. For best performance, ensure the renderbuffer's height and width are each a multiple of 32 pixels.
It appears that if your renderbuffer isn't an even multiple of 32 pixels, it doesn't display in the Simulator. This is a bug in the Simulator, but you should probably make your renderbuffer a multiple of 32 in either dimension in any case to improve performance.
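For illustration, here is a small Swift sketch (the names are mine, not from the guide) of rounding a view's backing size up to the next multiple of 32 pixels before allocating the renderbuffer storage:

    import UIKit

    // Round a dimension up to the next multiple of 32.
    func roundedUp(_ value: CGFloat, toMultipleOf multiple: CGFloat) -> CGFloat {
        return (value / multiple).rounded(.up) * multiple
    }

    // Compute a renderbuffer size, in pixels, that is a multiple of 32
    // in each dimension for the given view.
    func renderbufferSize(for view: UIView) -> CGSize {
        let scale = view.contentScaleFactor            // 2.0 on retina
        return CGSize(width:  roundedUp(view.bounds.width  * scale, toMultipleOf: 32),
                      height: roundedUp(view.bounds.height * scale, toMultipleOf: 32))
    }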
Same thing with my app. My textures are sporadically broken in the simulator (4.2). On the hardware everything looks fine.
I don't know if this helps much, but I've seen all manner of strange behaviour on the simulator implementation of OpenGL ES: spurious images appearing; strange lighting on the first render pass; broken rendering to bitmaps, depending on when I call it. I'm no expert in OpenGL programming, so I could just be writing crap code, but there is definitely a noticeable difference in the behaviour of the simulator vs the real hardware.
Your experience suggests that maybe my problems aren't entirely my fault. :-)
A question for the seasoned iPhone developers: what is your preference for graphics in an iPhone app? I have turned to PNGs because I read that it is the preferred image format and the most efficient for the OS in terms of performance. However, I had read that you should try to use SVG graphics so they scale up on the iPad. I started reading up on SVG for my next app and thought the format was natively supported by UIImageView, but it seems you can only render SVGs in UIWebViews or programmatically. My belief was that a lot of the latest graphically rich apps used SVG graphics; is that an incorrect assumption?
Thanks for any advice/comments.
I think using SVG graphics would cause a ton of problems, and I'm not aware of any apps that use them. It's probably a much better idea to use PNGs and just have different sizes for the iPhone, iPhone 4 and iPad.
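For what it's worth, shipping size-specific PNGs is mostly a naming convention; UIKit resolves the right variant itself. A small sketch (the file names are just examples):

    import UIKit

    // With the files below in the bundle, UIKit picks the right one
    // automatically based on the device and its screen scale:
    //
    //   button.png       - non-retina iPhone
    //   button@2x.png    - retina iPhone (iPhone 4)
    //   button~ipad.png  - iPad-specific artwork
    let buttonImage = UIImage(named: "button")   // resolves to the best match
    let imageView = UIImageView(image: buttonImage)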
I wrote a wave animation to explore the features of the SGX chip, which uses a tile-based rendering (TBR) architecture, by comparing the performance on an iPhone and a laptop.
An advantage of the TBR architecture is that it allows the GPU to perform hidden surface removal before fragments are processed, so I draw many overlapping layers of animated waves where only the wave in the top layer is visible.
I ran this program on both an iPhone 3GS (using GLES 2.0) and my laptop, a MacBook Pro (using OpenGL 2.0). I recorded the FPS numbers for different layer counts, and I assumed the FPS trends on the iPhone and the laptop would be different. I expected the iPhone's performance to decrease more slowly than the laptop's as the number of layers increases, but they show very similar trends.
I have 2 questions.
1. Why doesn't it show the advantage of the TBR architecture, when there are a lot of overlapping triangles?
2. Why is the performance of the iPhone simulator so much slower than just running on the laptop (without the simulator)? The documentation says the simulator does not enforce the memory limitations of the MBX and SGX and takes advantage of the laptop's CPU, so I guessed its performance should keep up with the laptop.
Can anyone help?
Thanks a lot.
The OpenGL ES implementation in the iPhone Simulator is a software rasterizer and does not use the GPU in your MacBook.
What kind of framerate trends are you seeing, and in what way is only the wave on the top layer visible? Your primitives typically need to have framebuffer blending disabled and not issue discards in the fragment shader in order for hidden surface removal to skip fragment processing for what’s underneath.
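For reference, here is a hedged sketch (not the asker's code) of the per-frame GL ES 2.0 state that lets the SGX's hidden surface removal cull the covered layers: opaque geometry, depth test on, blending off, and no discard in the fragment shader:

    import OpenGLES

    // State that allows the SGX's hidden surface removal to skip fragment
    // processing for covered geometry: depth testing on, blending off.
    // The fragment shader must also avoid `discard`.
    func configureOpaquePass() {
        glEnable(GLenum(GL_DEPTH_TEST))     // covered layers can be rejected
        glDepthFunc(GLenum(GL_LESS))
        glDisable(GLenum(GL_BLEND))         // blending forces covered fragments to be shaded
        glClear(GLbitfield(GL_COLOR_BUFFER_BIT) | GLbitfield(GL_DEPTH_BUFFER_BIT))
    }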