Unity3D: how does the AssetBundle cache work?

I am kinda confused about:
After downloading an asset bundle for the first time, how does Unity know I have already downloaded it, so that it can be loaded directly from the cache (disk) the second time?
Does it map the URL to local storage? If so, and I update my asset bundle on the server using the same name, will it still be loaded from the cache the second time, since the URL doesn't change?
Sample code:
UnityEngine.Networking.UnityWebRequest request = UnityEngine.Networking.UnityWebRequest.GetAssetBundle(uri, 0);
yield return request.Send();
// Downloads only the first time; the second time it can be loaded from the cache
AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(request);
GameObject cube = bundle.LoadAsset<GameObject>("Cube");

To enable caching:
Instead of calling:
UnityEngine.Networking.UnityWebRequest request = UnityEngine.Networking.UnityWebRequest.GetAssetBundle(uri, 0);
You must call GetAssetBundle with a version number:
UnityEngine.Networking.UnityWebRequest request = UnityEngine.Networking.UnityWebRequest.GetAssetBundle(uri, 1, 0);
See documentation here:
https://docs.unity3d.com/ScriptReference/Networking.UnityWebRequest.GetAssetBundle.html
You can also implement this with a call to new DownloadHandlerAssetBundle(string url, uint version, uint crc);. See the constructor documentation and sample code here:
https://docs.unity3d.com/ScriptReference/Networking.DownloadHandlerAssetBundle-ctor.html
Note that when the data is retrieved from the cache, request.responseCode will no longer be 200 (success) but 0!
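Putting that together, a minimal sketch of a cached request (uri is assumed to be a field holding the bundle URL, as in the question's snippet; bump the version number whenever the bundle on the server changes, otherwise the cached copy keeps being served even though the URL is the same):
IEnumerator DownloadWithCache()
{
    // Version 1 marks this revision of the bundle; increase it to force a re-download.
    UnityEngine.Networking.UnityWebRequest request =
        UnityEngine.Networking.UnityWebRequest.GetAssetBundle(uri, 1, 0);
    yield return request.Send();

    // 200 on a fresh download, 0 when the bundle came from the local cache.
    Debug.Log("responseCode: " + request.responseCode);

    AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(request);
    GameObject cube = bundle.LoadAsset<GameObject>("Cube");
}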

Related

How can I make assets accessible to players for modding?

My game includes image files and json configuration files that I would like to make accessible in the deployed game's folder structure so that players can easily edit or swap them out.
I have considered/tried the following approaches:
My initial approach was to use the Resources folder and code such as Resources.Load<TextAsset>("Rules.json"). Of course, this did not work, as the Resources folder is compiled during builds.
I investigated the Addressables and AssetBundle features, but they do not seem aimed at solving this problem.
After asking around, I went with .NET's own file methods, using code like File.ReadAllText(Application.dataPath + "/Rules.json"). This seems like it will work, but such files are still not deployed automatically and would have to be copied over manually.
It seems that the StreamingAssets folder exists for this, since the manual advertises that its contents are copied verbatim to the target machine. I assume its contents should be read as in the previous point, with non-Unity IO calls like File.ReadAllText(Application.streamingAssetsPath + "/Rules.json")?
So yeah, what is the 'canonical' approach for this? And with that approach, is it still possible to get the affected files as assets (e.g. something similar to Resources.Load<Sprite>(path)), or is it necessary to use .NET IO methods to read the files and then manually turn them into Unity objects?
After asking the same question on the Unity forums, I was advised to use the StreamingAssets folder and told that it is necessary to use .NET IO methods with it.
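For the JSON/config case, a minimal sketch of that approach (Rules.json comes from the question; the RulesData class and its fields are placeholders made up for illustration):
using System.IO;
using UnityEngine;

public class RulesLoader : MonoBehaviour
{
    [System.Serializable]
    public class RulesData          // hypothetical shape of Rules.json
    {
        public int maxPlayers;
    }

    void Start()
    {
        // StreamingAssets is copied into the build as plain files, so standard IO works
        // on desktop platforms (on Android it lives inside the APK and needs UnityWebRequest).
        string path = Path.Combine(Application.streamingAssetsPath, "Rules.json");
        RulesData rules = JsonUtility.FromJson<RulesData>(File.ReadAllText(path));
        Debug.Log("maxPlayers = " + rules.maxPlayers);
    }
}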
An example of how to load sprites from image files using standard IO can be seen here: https://forum.unity.com/threads/generating-sprites-dynamically-from-png-or-jpeg-files-in-c.343735/
static public Sprite LoadSpriteFromFile(
    string filename,
    float PixelsPerUnit = 100.0f,
    SpriteMeshType type = SpriteMeshType.FullRect)
{
    // Load a PNG or JPG image from disk to a Texture2D, assign this texture to a new sprite and return its reference
    Texture2D SpriteTexture = LoadTexture(filename);
    Sprite NewSprite = Sprite.Create(
        SpriteTexture,
        new Rect(0, 0, SpriteTexture.width, SpriteTexture.height),
        new Vector2(0, 0),
        PixelsPerUnit,
        0,
        type);
    return NewSprite;
}
static private Texture2D LoadTexture(string FilePath)
{
    // Load a PNG or JPG file from disk to a Texture2D
    // Returns null if load fails
    Texture2D Tex2D;
    byte[] FileData;
    if (File.Exists(FilePath))
    {
        FileData = File.ReadAllBytes(FilePath);
        Tex2D = new Texture2D(2, 2);
        // If the image is blurrier than what you get with a manual Unity import, try tweaking these two lines:
        Tex2D.wrapMode = TextureWrapMode.Clamp;
        Tex2D.filterMode = FilterMode.Bilinear;
        // Load the image data into the texture (size is set automatically)
        if (Tex2D.LoadImage(FileData))
        {
            return Tex2D; // If the data was readable, return the texture
        }
    }
    return null;
}

Azure Data Lake HDFS upload file size limit

Does anyone know what the maximum file size is when uploading via the Azure HDFS REST API? (https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-data-operations-rest-api)
I have seen 256MB mentioned in some places and 32MB in others, so I'm wondering which is correct.
Are there similar limits for the other SDKs?
I was wrestling with the same problem some months ago, and it turned out that the IIS instance in front of ADLS sets maxAllowedContentLength to its default value of 30000000 bytes (about 28.6MB). This essentially means that whenever we try to push anything bigger than ~30MB, the request never reaches ADL, because IIS throws 404.13 before that. Reference.
As already suggested in the links, ADLS has a driver with a 4MB buffer. I'm using the .NET SDK myself, and the following code has served me well:
public async Task AddFile(byte[] content, string path)
{
    const int fourMb = 4 * 1024 * 1024;
    var buffer = new byte[fourMb];
    using (var stream = new MemoryStream(content))
    {
        if (!_adlsFileSystemClient.FileSystem.PathExists(_account, path))
        {
            _adlsFileSystemClient.FileSystem.Create(_account, path);
        }
        int bytesToRead;
        while ((bytesToRead = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            if (bytesToRead < fourMb)
            {
                Array.Resize(ref buffer, bytesToRead);
            }
            using (var s = new MemoryStream(buffer))
            {
                await _adlsFileSystemClient.FileSystem.AppendAsync(_account, path, s);
            }
            //skipped for brevity
In my tests, I am finding a maximum file size limit somewhere between 28MB and 30MB.
Using the Azure Data Lake Storage REST API, I have had no issues creating files as large as 28MB. However, when I try to create a file that is 30MB, I receive a 404 Not Found error.
The following references align with the file size limit and 404 error I am observing. The references are about the SDK, but it could be that the SDK is also calling the REST API under the covers. My tests are calling the REST API directly.
NotFound error on call to Data Lake Store Create
https://stackoverflow.com/a/41469724/10363

Xuggler write and read video via H.264 to/from Sockets

I want to be able to send BufferedImages generated by my Java program over the local network in real time, so my second application can show them.
I have been looking through a lot of websites over the last two days, but I wasn't able to find anything. The only thing I found was this:
Can I use Xuggler to encode video/audio to a byte array?
I tried implementing the URLHandler, but the problem is that MediaWriter still wants a URL, and as soon as I add a VideoStream it opens the container a second time with the URL, and then it crashes.
I hope you can help me. Thanks in advance.
Code I have right now:
val clientSocket = serverSocket.accept()
connectedClients.add(clientSocket)
val container = IContainer.make()
val writer = ToolFactory.makeWriter("localhost", container)
container.open(VTURLProtocolHandler(clientSocket.getOutputStream()), IContainer.Type.WRITE, IContainerFormat.make())
writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, width, height)

AssetBundles on iOS: Memory always increases, causing crash

We have our asset bundles stored on an Amazon S3 bucket. When the game starts, it determines which bundles it needs to download new versions of, using WWW.LoadFromCacheOrDownload.
The problem we're running into is that the memory iOS reports as allocated for our app keeps increasing, while the memory Unity reports it is using (through the profiler) stays the same. We have enough bundles that by the time everything we need has finished downloading, the app has invariably received a memory warning from iOS, and we are shut down due to memory pressure shortly after.
Common solutions we already have in place: unloading the asset bundle after the WWW is finished using assetBundle.Unload(), calling Resources.UnloadUnusedAssets(), and calling Dispose() on the WWW. None of it is solving the problem.
Code follows:
private IEnumerator DownloadBundle(DownloadQueueEntry entry, DownloadFinishedCallback callback)
{
    while (!entry.finished)
    {
        // grab bundle off S3
        string url = string.Format(BUNDLE_URL_FORMAT, entry.directory, entry.assetName);
        WWW www = WWW.LoadFromCacheOrDownload(url, entry.version);
        yield return www;
        if (string.IsNullOrEmpty(www.error))
        {
            Debug.Log("[BundleDownloader] Download Completed " + entry.assetName);
            entry.finished = true;
            entry.downloading = false;
            www.assetBundle.Unload(true);
            Resources.UnloadUnusedAssets();
        }
        else
        {
            // usually timed out resolving host, just try again for now
            Debug.LogError("[BundleDownloader] Download failed: " + url + " Error: " + www.error);
        }
        www.Dispose();
        www = null;
    }
    if (callback != null)
    {
        callback();
    }
}
--edit--
A screenshot showing the increasing memory usage is at the link below. Memory usage proceeds like that until it has chewed up around 150MB. This is all in a scene that contains only a GameObject for init scripts (no art or anything).
https://www.dropbox.com/s/3b6skexz6xhug5g/Screenshot%202014-03-28%2014.54.26.png
As the Unity docs suggest, you should really be encapsulating your usage of the WWW object/caching routines in a "using" statement block.
using System;
using UnityEngine;
using System.Collections;

public class CachingLoadExample : MonoBehaviour
{
    public string BundleURL;
    public string AssetName;
    public int version;

    void Start()
    {
        StartCoroutine(DownloadAndCache());
    }

    IEnumerator DownloadAndCache()
    {
        // Wait for the Caching system to be ready
        while (!Caching.ready)
            yield return null;

        // Load the AssetBundle from cache if it exists with the same version, or download and store it in the cache
        using (WWW www = WWW.LoadFromCacheOrDownload(BundleURL, version))
        {
            yield return www;
            if (www.error != null)
                throw new Exception("WWW download had an error:" + www.error);
            AssetBundle bundle = www.assetBundle;
            if (AssetName == "")
                Instantiate(bundle.mainAsset);
            else
                Instantiate(bundle.Load(AssetName));
            // Unload the AssetBundle's compressed contents to conserve memory
            bundle.Unload(false);
        } // memory is freed from the web stream (www.Dispose() gets called implicitly)
    }
}
"using" statements are a C# feature to ensure that "Dispose()" methods work correctly (and in many cases, are called automagically for you).
As stated in MS's docs: http://msdn.microsoft.com/en-us/library/yh598w02.aspx
"[using] Provides a convenient syntax that ensures the correct use of IDisposable objects."
I'm guessing your memory leak is due to these functions not performing as intended (possibly due to improper configuration, I'm not sure).
I had similar issues with AssetBundle downloading on iPad. However, in my case:
I was downloading the asset bundle from a server (without using the cache) and loading it later. Memory ballooned when a single asset bundle contained more than 100, or let's say 200, images.
This was not a UnityDestroyWWWConnection or "using" issue. I was downloading a 60MB asset bundle (built with best compression) and it used about 450MB while downloading.
I checked the result in Instruments and saw tons of small mallocs while the asset bundle was downloading. You can check the Instruments screenshot from here.
My guess (not sure): Unity extracts information from the asset bundle while the download continues, which blows up memory. It reads the header info first, understands that it is a unity3d object, and because the download goes through the WWW class it prepares a WWW asset bundle.
My solution was:
Compressing the asset bundle and putting it on the server as a zip.
Downloading the asset bundle zip and extracting it on the iPad (using SharpZipLib), as sketched below.
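A rough sketch of that workaround, assuming the zip sits at some URL and SharpZipLib's FastZip is used to unpack it (the URL, file names, and the final load step are illustrative, not the original poster's exact code):
private IEnumerator DownloadAndUnzipBundle(string zipUrl)
{
    // Plain WWW download of the zipped bundle, bypassing Unity's asset bundle handling
    WWW www = new WWW(zipUrl);
    yield return www;

    // Write the archive to disk (System.IO), then extract it with SharpZipLib
    string zipPath = System.IO.Path.Combine(Application.persistentDataPath, "bundle.zip");
    System.IO.File.WriteAllBytes(zipPath, www.bytes);
    www.Dispose();

    var fastZip = new ICSharpCode.SharpZipLib.Zip.FastZip();
    fastZip.ExtractZip(zipPath, Application.persistentDataPath, null);

    // The extracted bundle can now be loaded from disk, e.g. via AssetBundle.CreateFromFile
}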
We ran into a similar issue in our app. We were loading lots of textures via WWW and noticed iOS was the only platform with a memory leak from it. We eventually found our solution here: http://forum.unity3d.com/threads/www-memory-leak-ios.227753/. Basically, there is a known issue in Unity 4.3 that leaks the data from WWW calls. This SHOULD be fixed in Unity 4.5. In the meantime, you can follow Alexey's suggestion to modify the code in the generated Xcode project, or update to 4.5:
4.5 will have the fix.
Essentially you need to search for
extern "C" void UnityDestroyWWWConnection(void* connection)
in WWWConnection.mm and add the marked line:
[delegate.connection cancel];
delegate.connection = nil;
[delegate.data release]; // <-- ADD THIS
[delegate release];

JPEG encoder super slow, how to Optimize it?

I'm building an app with ActionScript 3.0 in Flash Builder. This is a follow-up to this question.
I need to upload the ByteArray to my server, but the function I use to convert the BitmapData to a ByteArray is super slow, so slow it freezes up my mobile device. My code is as follows:
var jpgenc:JPEGEncoder = new JPEGEncoder(50);
trace('encode');
// encode the BitmapData object and keep the encoded ByteArray
var imgByteArray:ByteArray = jpgenc.encode(bitmap);
temp2 = File.applicationStorageDirectory.resolvePath("snapshot.jpg");
var fs:FileStream = new FileStream();
trace('fs');
try {
    // open file in write mode
    fs.open(temp2, FileMode.WRITE);
    // write bytes from the byte array
    fs.writeBytes(imgByteArray);
    // close the file
    fs.close();
} catch (e:Error) {
    // (error handling omitted in the question)
}
Is there a different way to convert it to a byteArray? Is there a better way?
Try the blooddy library: http://www.blooddy.by. I didn't test it on mobile devices, though. Comment if you have success with it.
Use BitmapData.encode(); it's faster by orders of magnitude on mobile: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/display/BitmapData.html#encode%28%29
You should try to find a JPEG encoder that is capable of encoding asynchronously. That way the app can still be used while the image is being compressed. I haven't tried any of the libraries, but this one looks promising:
http://segfaultlabs.com/devlogs/alchemy-asynchronous-jpeg-encoding-2
It uses Alchemy, which should make it faster than the JPEGEncoder from as3corelib (which I guess is the one you're using at the moment.)
A native JPEG encoder is ideal, asynchronous would be good, but possibly still slow (just not blocking). Another option:
var pixels:ByteArray = bitmapData.getPixels(bitmapData.rect);
pixels.compress();
I'm not sure of native performance, and performance definitely depends on what kind of images you have.
The answer from Ilya was what did it for me. I downloaded the library, and there is an example of how to use it inside. I have been working on getting the CameraUI in Flash Builder to take a picture, encode/compress it, then send it via a web service to my server (the data was sent as a compressed byte array). I did this:
by.blooddy.crypto.image.JPEGEncoder.encode( bmp, 30 );
Where bmp is my BitmapData. The encode took under 3 seconds and fit easily into my synchronous flow of control. I tried async methods, but they ultimately took a really long time and were difficult to track for things like a user moving from cell service to WiFi, or from tower to tower, while an upload was going on.
Comment here if you need more details.