Unity - native texture formats explained - unity3d

I have read the following here:
When you use a Texture compression format that is not supported on the
target platform, the Textures are decompressed to RGBA 32 and stored
in memory alongside the compressed Textures. When this happens, time
is lost decompressing Textures, and memory is lost because you are
storing them twice. In addition, all platforms have different
hardware, and are optimised to work most efficiently with specific
compression formats; choosing non-compatible formats can impact your
game’s performance. The table below shows supported platforms for each
compression format.
Let's discuss a specific case. Say I store a .png file on disk and package my game for Android. Now I play that game on an Android device whose GPU requires ETC2 as its native texture format. Am I correct that when I launch the game the following happens:
Read the PNG file from disk into RAM (RAM holds the PNG file data)
Decompress the PNG to RGBA32 (RAM holds both the PNG and the decompressed data)
Compress the RGBA32 data to ETC2 and upload it to the GPU (in RAM, if I have a texture cache, I might deallocate the PNG file data, but I still need to keep the RGBA32 data for future reuse, or at least the ETC2 data)
This means I am doing a lot of conversions (PNG → RGBA32 → ETC2), and during those conversions I not only use CPU time but also a significant amount of RAM. My question: did I correctly understand what happens when one does not package textures in the native formats of the target platform?
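To make the steps concrete, here is a minimal sketch of what I believe the runtime path looks like in Unity, assuming the PNG sits in StreamingAssets (the file name is hypothetical). LoadImage decodes the PNG on the CPU into an uncompressed 32-bit texture, and the optional Compress call is the extra CPU-side conversion described in the steps above.

using System.IO;
using UnityEngine;

public class RuntimePngLoad : MonoBehaviour
{
    void Start()
    {
        // Hypothetical file; in practice the PNG could also come from a download.
        byte[] pngBytes = File.ReadAllBytes(
            Path.Combine(Application.streamingAssetsPath, "example.png"));

        // LoadImage decodes the PNG on the CPU into an uncompressed 32-bit texture,
        // so at this point both the PNG bytes and the RGBA32 pixels sit in RAM.
        var tex = new Texture2D(2, 2); // size and format are replaced by LoadImage
        tex.LoadImage(pngBytes);

        // Optional: re-compress on the CPU into a GPU block format
        // (DXT/BCn on desktop, ETC on mobile) - this is the extra conversion step
        // the question is asking about.
        tex.Compress(false);

        // Upload to the GPU and release the readable CPU-side copy.
        tex.Apply(false, true);
    }
}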

Yes, you have understood it roughly correctly. However, you misunderstood one thing: how PNG relates to all of this. The compression methods GPUs implement for textures are very different from the filter+deflate method of PNG, so a PNG is never used directly by the GPU and this decode step happens on every kind of GPU.
What the Unity devs are trying to tell you is that textures can be stored in the very format the GPU works with, and that for optimal performance you should identify which compression formats are supported on your target platform and bundle your asset files in those formats.
So for a game for platform X, identify the compression formats supported by X's GPUs, pack your assets with those, and ship the X version that way. Rinse and repeat for other platforms.
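In Unity specifically, that per-platform choice is made through the texture import settings. Here is a sketch of an editor script that applies an Android override using the standard TextureImporter API; the asset path, menu name, and size limit are hypothetical.

#if UNITY_EDITOR
using UnityEditor;

public static class AndroidTextureOverride
{
    [MenuItem("Tools/Set ETC2 Override For Example Texture")]
    static void SetOverride()
    {
        // Hypothetical asset path.
        var importer = (TextureImporter)AssetImporter.GetAtPath("Assets/Textures/example.png");

        var android = new TextureImporterPlatformSettings
        {
            name = "Android",                          // platform this override applies to
            overridden = true,
            format = TextureImporterFormat.ETC2_RGBA8, // GPU-native format for the target devices
            maxTextureSize = 2048
        };
        importer.SetPlatformTextureSettings(android);
        importer.SaveAndReimport();
    }
}
#endif

The same settings can also be set by hand in the Inspector's platform override tabs; the point is that the format decision is made per platform at import/build time rather than at runtime.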

Related

How to reduce memory usage of dynamically loaded Textures in Unity?

Overview:
We are currently working on a Unity project for iOS and Android where, at runtime, we download images in PNG format and store them for later use. When we need them, we load these textures, as well as some from a web server, using UnityWebRequestTexture with the file:// and https:// schemes respectively.
Problem:
The problem we are facing is that when we use the UnityWebRequestTexture method, the memory usage of the textures increases significantly.
Question:
Is there a way to pre-compress the locally downloaded images into device-compatible compressed formats and then load them directly to the GPU without increasing the memory required?
If this is possible, would using Texture2D.CreateExternalTexture allow us to use these textures without further increasing memory usage?
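To make the numbers concrete: a 2048x2048 RGBA32 texture takes 2048*2048*4 = 16 MB, while the same image as ETC2_RGBA8 (8 bits per pixel) is about 4 MB. Here is a sketch of what I have in mind for the first idea, assuming I already have raw ETC2 block data produced offline and know the texture's dimensions; the file name and method are hypothetical.

using System.IO;
using UnityEngine;

public static class CompressedTextureLoader
{
    // width/height must match the data, and the byte[] must contain raw ETC2
    // blocks with no container header.
    public static Texture2D LoadEtc2(string path, int width, int height)
    {
        byte[] raw = File.ReadAllBytes(path); // hypothetical pre-compressed file

        // Create the texture directly in the GPU-native format; no RGBA32 copy is made.
        var tex = new Texture2D(width, height, TextureFormat.ETC2_RGBA8, false);
        tex.LoadRawTextureData(raw);

        // Upload to the GPU and discard the readable CPU-side copy.
        tex.Apply(false, true);
        return tex;
    }
}

As I understand it, Texture2D.CreateExternalTexture is meant for wrapping a texture that was already created by native (non-Unity) code, so with the approach above it should not be needed.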

What formats of music/sound are good for Unity3D games?

I am new to Unity 3D, and we are developing a mobile game with it. Some of our *.wav sound files are relatively large, say 25 MB for a level's background music, and we are going to have different music for different levels. The size could become a problem, considering that most mobile games are under 200 MB.
So what formats are best for Unity 3D games? Which offers a nice balance of size and sound quality? Are there any general guidelines for how to compress the music?
Thanks!
I personally use OGG, which I feel is a good compromise between small file size and good quality.
As far as I know and understand, Unity re-encodes your source files anyway, so your assets' original format matters less than you might expect for the data format in the published game binaries. See also the manual on Audio.
You may influence what is actually stored and distributed by changing the Import Settings for each audio asset file.
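To make that concrete, here is a sketch of an editor script that changes what Unity stores for an audio asset, assuming the standard AudioImporter API; the asset path, quality value, and menu name are hypothetical.

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class MusicImportSettings
{
    [MenuItem("Tools/Set Music Import Settings")]
    static void ApplySettings()
    {
        // Hypothetical asset path; the source can be .wav, .ogg, .aiff, etc.
        var importer = (AudioImporter)AssetImporter.GetAtPath("Assets/Audio/Level1Music.wav");

        AudioImporterSampleSettings settings = importer.defaultSampleSettings;
        settings.compressionFormat = AudioCompressionFormat.Vorbis; // what the build actually stores
        settings.quality = 0.4f;                                    // 0..1, lower means smaller files
        settings.loadType = AudioClipLoadType.Streaming;            // stream long music instead of loading it whole
        importer.defaultSampleSettings = settings;
        importer.SaveAndReimport();
    }
}
#endif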
This is an outdated question, yes, but Unity supports a wide range of audio formats, including:
.mp3
.ogg
.wav
.aiff
and more. I prefer either .mp3 or .ogg because of their small file size.
I think some devices can have issues with MP3 files because they lack a hardware MP3 decoder, so OGG files are the best option; OGG also usually compresses better.

GPU usage of webp on iphone, compared to png

I am currently working on an iOS game and the image resources are taking up a little too much space. I heard of WebP and wanted to know more about it.
I did some research on WebP and know that this new format requires much less space than PNG and that its encoding/decoding speed is fast. But I found no articles discussing the GPU burden of using WebP images compared to PNG ones.
Is there any article out there on this topic?
Or can I do the experiment myself? I am coding in Visual Studio using cocos2d-x, and I don't know what to do if I want to simulate an iOS GPU and monitor its memory usage.
Many thanks!
You can assume that the generated textures remain the same, i.e. they render at the same speed and use the same amount of memory; both formats decode to the same uncompressed texture before it is uploaded to the GPU.
If you want faster loading and rendering and less memory usage, use the .pvr.ccz format.

External JPG compression library for iPhone?

Does anyone know of a JPEG compression library with decent image quality that has been ported to the iPhone? The built-in algorithm behind UIImageJPEGRepresentation produces huge files (compared to the quality), which makes uploading images from the phone over the network much slower than necessary. I can recompress a JPEG produced on the iPhone to one tenth of the file size using the GD library built into PHP, without significant loss of quality...
Well, the GD library uses the IJG JPEG library (libjpeg) for compression, so if you want the same quality you should use the same library:
http://www.ijg.org/
It's the most commonly used JPEG compression/decompression library, by the way.
I would like to know if there is a better library as well. I'm using .NET to compress JPEG images on the server and it does a much better job than the iPhone's UIImageJPEGRepresentation. I'd like to get them as small as possible on the iPhone before uploading, as it is a dreadfully slow process.

Lossy compressed format to raw PCM on iPhone

I want to start with an audio file of a modest filesize, and finish with an array of unsigned chars that can be loaded into OpenAL with alBufferData. My trouble is the steps that happen in the middle.
I thought AAC would be the way to go, but according to Apple representative Rincewind (circa 12/08):
Currently hardware assisted compression formats are not supported for decode on iPhone OS. These formats are AAC, MP3 and ALAC.
Using ExtAudioFile with a client format set generates PERM errors, so he's not making things up.
So, brave knowledge-havers, what are my options here? Package the app with .wav's and just suck up having a massive download? Write my own decoder?
Any links to resources or advice you might have would be greatly appreciated.
Offline rendering of compressed audio is now possible, see QA1562.
While Vorbis and the others suggested are good, they can be fairly slow on the iPhone as there is no hardware acceleration.
One codec that is natively supported (but has only a 4:1 compression ratio) is ADPCM, aka ima4. It's handled through the ExtAudioFile interface and is only the tiniest bit slower than loading .wav's directly.
There are some good open source audio decoding libraries that you could use:
mpg123
FAAC
Both are licensed under the LGPL, meaning you can use them in closed-source applications provided that any modifications to the library are open sourced.
You could always make your wave files mono and cut their size in half, but that might not be the best option for you.
Another option for doing your own decoding would be Ogg Vorbis. There's even a low-memory version of their library for integer processors called "Tremor".