Pyglet: how to handle heavy resource loading

I'm learning pyglet in order to write a game, and I've been trying to create a loading screen/scene that loads all game resources (static images and image sprites) so the window doesn't freeze while loading. I wonder what the best practices are for handling heavy resource loading with pyglet, on window/game startup or between scenes.
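So far the rough pattern I've been experimenting with looks something like the sketch below (the asset names are placeholders, and I'm not sure this is the right approach): load a couple of images per clock tick so the event loop keeps running and a progress label stays responsive.

    import pyglet

    window = pyglet.window.Window(800, 600)

    # Placeholder asset names -- substitute the real files.
    ASSET_NAMES = ['player.png', 'enemy.png', 'background.png', 'tiles.png']
    pending = list(ASSET_NAMES)
    loaded = {}

    status = pyglet.text.Label('Loading...', x=window.width // 2,
                               y=window.height // 2, anchor_x='center')

    def load_some(dt):
        # Load a couple of images per tick so the window never blocks
        # for the whole load.
        for _ in range(2):
            if not pending:
                pyglet.clock.unschedule(load_some)
                return
            name = pending.pop()
            loaded[name] = pyglet.resource.image(name)

    @window.event
    def on_draw():
        window.clear()
        status.text = 'Loading... %d/%d' % (len(loaded), len(ASSET_NAMES))
        status.draw()

    pyglet.clock.schedule_interval(load_some, 1 / 30.0)
    pyglet.app.run()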

Related

Efficient way to load the scene and scene render optimization

Most of the scenes in our Unity3D project are too heavy. What is the right and efficient way to load a scene? Please don't refer me to LoadLevelAsync, I have already seen that! What is the best practice for loading a heavy scene without hitches or stutter?
Which platform are you building for?
The heaviest things to load in Unity are textures. If you are using too many of them in the scene, make sure they are compressed and try to reduce their max size in the inspector. If you can't see any improvement, consider creating a loading screen transition after calling Application.LoadLevel("YourBigLevel").
You can add an Activity indicator if you are running the app on iOS or Android devices.

Is there any other way to load sprites?

Hi there, I was wondering if in Unity there is any other way of loading sprites during the game that doesn't involve using Resources.Load. I ask because I have some large images and the load seems to be quite costly.
Any ideas?
Stream classes can be used to control the loading process instead of Resources.Load(). That way you can partially load a resource, stop loading, and resume later. The link below should be helpful.
http://forum.unity3d.com/threads/160787-Texture-Stream-Loading
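The same partial/resumable idea, sketched generically in Python rather than with Unity's Stream classes (the chunk size and pump rate are only illustrative): read the file a chunk at a time, and let the game loop decide when to pull the next chunk.

    CHUNK = 64 * 1024   # illustrative chunk size

    def stream_file(path):
        # Yield the file's bytes a chunk at a time; the caller decides when
        # to pull the next chunk, so loading can be paused and resumed.
        with open(path, 'rb') as f:
            while True:
                chunk = f.read(CHUNK)
                if not chunk:
                    return
                yield chunk

    def pump(reader, buf, chunks_per_tick=4):
        # Copy a few chunks into `buf`; return False once the stream is
        # exhausted. Call this once per frame so the rest of the game
        # stays responsive.
        for _ in range(chunks_per_tick):
            chunk = next(reader, None)
            if chunk is None:
                return False
            buf.write(chunk)
        return True

Once pump reports the stream is done, the accumulated bytes can be handed to whatever decoder builds the sprite or texture.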

Preloading assets in cocos2D

I have some assets for my game that are being loaded when the gameplay layer is loaded. I'd prefer to load them when the application launches in a loading screen, which is obviously a pretty standard thing to do.
My problem is the way in which new scenes are launched within cocos2D. Consider the following code which occurs (with some variations) in several places throughout the project:
[[CCDirector sharedDirector] replaceScene:[CCTransitionShrinkGrow transitionWithDuration:0.5f scene:[GameplayLayer scene]]];
This is the standard way of replacing the current scene with a new one. My question is, given that format, how would I pass a preloaded asset to the new GameplayLayer? Is there an accepted way of doing this in cocos2D? I have a feeling that I'm missing something incredibly simple, but as of now it's a mystery to me.
Cocos2d uses a texture cache that persists between scenes. You can preload assets into this cache in a loading scene and they will still be available from your game scene. How you load them is up to you. For images you can opt to do it asynchronously, to allow your loading scene to maintain a decent framerate and render a progress bar.
This post gets the basic idea across.
This thread may also be of use to you in that regard: http://www.cocos2d-iphone.org/forum/topic/2242
If you have your own class of assets (i.e. not a supported cocos2d image, or something), you could create your own cache singleton class (look at how the sharedManager instances of various cocos2d classes are implemented) and load your assets into that. As far as memory management goes you'd have to release those assets yourself whenever you deem necessary, but that's rather beyond the scope of this question.
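For the pyglet question at the top, the same cache-that-outlives-the-scene idea can be sketched as a plain Python singleton (the class and method names here are invented for illustration):

    class AssetCache:
        # A minimal shared cache, analogous in spirit to cocos2d's texture
        # cache: preload in a loading scene, reuse from any later scene.
        _shared = None

        @classmethod
        def shared(cls):
            if cls._shared is None:
                cls._shared = cls()
            return cls._shared

        def __init__(self):
            self._assets = {}

        def preload(self, name, loader):
            if name not in self._assets:
                self._assets[name] = loader(name)
            return self._assets[name]

        def get(self, name):
            return self._assets[name]

        def evict(self, name):
            # Memory management is up to you, as noted above.
            self._assets.pop(name, None)

A loading scene would call something like AssetCache.shared().preload('hero.png', pyglet.resource.image), and a later game scene would simply do AssetCache.shared().get('hero.png').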

Load an OpenGl view in the background. iPhone

I have an OpenGL view that renders a 3D model. It is a basic modification of Apple's EAGLView. This view is added to a controller's .view and displayed with presentModalViewController:. I would like to do all of the model loading and OpenGL state configuration in a background thread at app launch, before the user chooses to display the view. Is this possible? Can I load textures, set up lighting, and generally get everything ready to render in a background thread? My fear is that the Cocoa Touch portions of the app on the main thread are going to manipulate the OpenGL state while I am setting up my renderer in the background. The controller will be displayed from the main thread, of course. This level of OpenGL ES detail is not something I deal with often, so please be gentle if my question is strange in any way :)
You absolutely can do background loading on a thread. Some of the key points:
- There is probably not much of a win to moving OGL state setup to a background thread - the total amount of change you'd induce in a context before the start of the first draw doesn't add up to a ton of time. Background loading is useful for textures and VBOs, as well as the load time that has to happen first to get the data to feed to the GL.
- You'll need to detach the context from the main thread and move it to the worker thread. We do this using pthreads to "send" the context to the worker.
- In our use, we hide the GL view to ensure that it doesn't need to be drawn while in a load state. (Frankly during load it may not contain anything useful.) So during async load the visible UI is all non-GL Cocoa.
This approach is more difficult than what you would do on the desktop: simply share the objects between two contexts (so that you can load and draw at the same time). When we looked at that approach more than a year ago, it wasn't possible on iOS; it may be possible now, I do not know.
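A comparable split in Python/pyglet terms — decode on a worker thread, touch the GL context only on the main thread — might look roughly like this (the file names are placeholders, and whether a particular pyglet image codec is safe to call off the main thread is something to verify for your setup):

    import queue
    import threading
    import pyglet

    window = pyglet.window.Window()
    filenames = ['a.png', 'b.png', 'c.png']   # placeholder names
    decoded = queue.Queue()                   # worker -> main thread hand-off
    textures = {}

    def decode_worker():
        # CPU-only work: read and decode image data off the main thread.
        # Nothing here touches the GL context.
        for name in filenames:
            img = pyglet.image.load(name)     # ImageData, no GL upload yet
            decoded.put((name, img))
        decoded.put(None)                     # sentinel: all done

    def upload_some(dt):
        # GL work stays on the thread that owns the context: turn decoded
        # ImageData into textures a few at a time.
        for _ in range(4):
            try:
                item = decoded.get_nowait()
            except queue.Empty:
                return
            if item is None:
                pyglet.clock.unschedule(upload_some)
                return
            name, img = item
            textures[name] = img.get_texture()    # upload happens here

    threading.Thread(target=decode_worker, daemon=True).start()
    pyglet.clock.schedule_interval(upload_some, 1 / 30.0)
    pyglet.app.run()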

Lots of png files being loaded into textures for use with OpenGLES is causing slow load times

I have thousands of png files that I am loading using libpng and then creating OpenGLES textures out of for use in the application. Doing this is causing a huge load time lag on the iPhone. Is there any way to speed up load time?
What I've typically done is build an object that handles lazy loading of my textures through a manager. At start-up I register my known textures and resolve the file system attributes I will need later, to save on simple IO at load time; then, as they are needed, I pull them in.
To speed things up I have a batch load mechanism too, where I say "load this array of images and return them to me." This is simply to remove the overhead of repeated method calls. Indeed, my single load solution is just a simple wrapper around my batch load.
This way, at start-up I cache my bookkeeping (object creation, file system attribute discovery, etc.), but defer the heavy work until necessary. As I load textures into my app at run time, it triggers faults which fill in the textures from storage to texture memory. If I'm loading a scene with many textures known beforehand, I load the set of very common textures in a prefetch, but defer relatively uncommon textures to runtime.
In practice this tends to work because of the probabilities involved: forcing the load at start time guarantees you pay for every texture at once, whereas loading them sparsely means the expected user-visible latency is roughly the per-texture load latency weighted by the probability of that texture being needed within some window of time from start. If you optimize your start-time prefetch so it skips the uncommon textures, you decrease your expected UI latencies dramatically.
Additionally, you may want to consider using NSURLConnection:connectionWithRequest:delegate: for loading your textures from storage. It is asynchronous, so you can ask it to load your largest textures asynchronously and your smaller ones synchronously, to take advantage of IO/CPU idle time during file system fetches and texture decompression (large files take a while to load, while small files load fast and can be deserialized at the same time). You should test this, though, since the iPhone may not handle asynchronous file system access well.
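Translated to the pyglet case, the register-cheaply/load-lazily manager with a batch path could be sketched like this (the class name and the commented-out registrations are illustrative):

    import os
    import pyglet

    class TextureManager:
        # Register known textures cheaply at start-up; defer the heavy
        # decode/upload until a texture is actually requested.

        def __init__(self):
            self._registry = {}   # name -> (path, size): cheap bookkeeping
            self._loaded = {}     # name -> texture, filled in lazily

        def register(self, name, path):
            # Start-up cost is just the file-system attribute lookup.
            self._registry[name] = (path, os.path.getsize(path))

        def load(self, name):
            # The single-texture load is a thin wrapper around the batch path.
            return self.load_batch([name])[name]

        def load_batch(self, names):
            out = {}
            for name in names:
                if name not in self._loaded:
                    path, _size = self._registry[name]
                    self._loaded[name] = pyglet.image.load(path).get_texture()
                out[name] = self._loaded[name]
            return out

    manager = TextureManager()
    # manager.register('hero', 'assets/hero.png')   # illustrative
    # manager.load_batch(['hero', 'tiles'])          # prefetch the common set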
You can place several textures in one big texture (1024x1024). This requires some recalculation of the texcoords, though. Also try to keep the texture parts close to the actual resolution used when drawing, i.e. if a texture will fill half the screen height (240px), it's good enough to use a 256x256 texture.
A more advanced method (not tested) could be to concatenate all files (registering their lengths and positions) and then use mmap(..) to access this png-db file. Then use NSData to feed the texture loading function if it is based on UIImage. The good thing about using mmap(..) like this is that you don't open and close a lot of files, and access to the data is handled with the help of the OS VM manager.
[Note: the iPhone's native PNG loader requires some PNG mangling... you might need a non-native reader, or concatenate the mangled files instead.]
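A rough Python sketch of the concatenated-file idea (the index offsets, lengths, and file names are invented for illustration):

    import io
    import mmap
    import pyglet

    # Hypothetical index recorded when the files were concatenated:
    # name -> (offset, length) inside assets.bin
    INDEX = {
        'hero.png': (0, 48213),
        'tiles.png': (48213, 131072),
    }

    pack = open('assets.bin', 'rb')
    mapped = mmap.mmap(pack.fileno(), 0, access=mmap.ACCESS_READ)

    def load_from_pack(name):
        # No per-file open/close; the OS VM manager pages the data in
        # on demand as the slice is read.
        offset, length = INDEX[name]
        data = io.BytesIO(mapped[offset:offset + length])
        return pyglet.image.load(name, file=data)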
Do you know what order you're going to need your thousands of images in? Perhaps you can load only a few hundred of them when your app starts, and then load more on a background thread as you go.
Combining small images into fewer larger textures will also be a good idea, again only if there's some pattern about what images are used together.
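In pyglet terms, combining many small images into one big texture can be sketched with the built-in atlas (the sizes and file names below are illustrative, and a GL context/window must already exist):

    import pyglet
    from pyglet.image import atlas

    window = pyglet.window.Window(visible=False)   # ensures a GL context

    # One 1024x1024 texture holding many small images; each add() returns a
    # TextureRegion with its own texcoords, so drawing code changes very little.
    packed = atlas.TextureAtlas(1024, 1024)

    regions = {}
    for name in ['icon_a.png', 'icon_b.png', 'icon_c.png']:   # illustrative
        img = pyglet.image.load(name)
        regions[name] = packed.add(img)   # raises AllocatorException when full

    # regions['icon_a.png'] can now be drawn anywhere an image is expected.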