Downloading resources before they are needed - gwt

I am developing an online multiplayer game using GWT and I want to reduce traffic by downloading images and sound files before they are utilized. I am using the method Image.prefetch for the image files. Is this the right way?
Concerning the sound files, I do not know exactly how to ensure that they are downloaded before the game starts. I am using the gwt-voices library and currently invoking the method play() on all needed sound files through a sound controller, with the volume set to 0.
In both cases, it seems like the files are fetched once and then cached. This is fine, but I think it might be better to have the client download them and access them locally - if that is possible.
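For reference, a minimal sketch of the preloading described above (the URLs are placeholders, and the gwt-voices calls assume its SoundController and Sound API):

```java
import com.allen_sauer.gwt.voices.client.Sound;
import com.allen_sauer.gwt.voices.client.SoundController;
import com.google.gwt.user.client.ui.Image;

public class ResourcePreloader {
    private final SoundController soundController = new SoundController();

    // Ask the browser to fetch and cache the image before it is shown.
    public void preloadImage(String url) {
        Image.prefetch(url);
    }

    // Create the sound up front and play it silently so the file is fetched and cached.
    public Sound preloadSound(String url) {
        Sound sound = soundController.createSound(Sound.MIME_TYPE_AUDIO_MPEG_MP3, url);
        sound.setVolume(0);
        sound.play();
        return sound;
    }
}
```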

I'm not sure whether it will work with sound files, but you may wish to use a ClientBundle. At compile time this combines your images into a single large composite image, so when you first ask for one of the images the client retrieves the entire composite, and subsequent requests for individual images are served from it. This cuts down on the multiple HTTP requests otherwise needed to retrieve each image individually.
https://developers.google.com/web-toolkit/doc/latest/DevGuideClientBundle
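As a rough sketch, a bundle for game images might look like this (the interface name and file names are placeholders):

```java
import com.google.gwt.core.client.GWT;
import com.google.gwt.resources.client.ClientBundle;
import com.google.gwt.resources.client.ImageResource;

// All images declared here are bundled together at compile time.
public interface GameResources extends ClientBundle {
    GameResources INSTANCE = GWT.create(GameResources.class);

    @Source("player.png")
    ImageResource player();

    @Source("enemy.png")
    ImageResource enemy();
}
```

You would then build widgets from the bundled resources, for example new Image(GameResources.INSTANCE.player()).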
Also, you should look into code splitting which would probably be useful as well.
https://developers.google.com/web-toolkit/doc/latest/DevGuideCodeSplitting
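A minimal sketch of a split point (class and method names are placeholders): code reachable only through onSuccess is compiled into a separate fragment and downloaded on demand.

```java
import com.google.gwt.core.client.GWT;
import com.google.gwt.core.client.RunAsyncCallback;

public class GameEntry {
    public void startMatch() {
        GWT.runAsync(new RunAsyncCallback() {
            @Override
            public void onSuccess() {
                // Code reachable only from here goes into its own fragment,
                // fetched the first time this split point is hit.
                showMatchScreen();
            }

            @Override
            public void onFailure(Throwable reason) {
                // The fragment download failed (e.g. network error).
            }
        });
    }

    private void showMatchScreen() {
        // ... build and display the match UI ...
    }
}
```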

Related

Better way of updating images bundled with the app

In our iOS app we have close to a hundred image files in the resources bundle. Now we want to make them network-based, since the images may change (updated/no longer needed/additions) at any time. We are debating which approach would be optimal. From what I have read, I understand that the resource bundle will not be editable on the device. So, when I start the app, I will check with the server whether there are any image updates. If so, I will download the changed images and save them to the documents directory. Then in the app, for every image, I will basically have to check whether it is in the resources bundle and, if so, grab it from there; otherwise pick it up from the documents directory and display it.
Another approach is to have nothing in the resources folder: I download all images from the server on app launch and store them in the documents directory, and from then on download only the changed files at subsequent app launches. Here I eliminate the check on the resources folder for whether an image is present, and my app bundle size would be reduced.
The third approach would be to copy the files from my resources directory to the documents directory on first launch and thereafter continue from documents directory.
Any suggestions on which would be the better approach, or would all of them be similar from a performance point of view?
IMO, option three offers the best balance between eliminating needless code and preloading as much data as possible. You don't want to make the user wait for 100 image downloads when the app starts the first time, so pre-load as many as possible. The copy code is simple and will only be used once. So that eliminates the runtime checks you'd have to do with the other options.
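The copy itself is only a few lines; here is a language-neutral sketch in Java (on iOS you would do the same thing with NSFileManager; the directory arguments and the marker file are placeholders):

```java
import java.io.IOException;
import java.nio.file.*;

public class FirstLaunchCopier {
    // Copy every bundled image into the writable directory exactly once.
    public static void copyBundledImages(Path bundleDir, Path documentsDir) throws IOException {
        Path marker = documentsDir.resolve(".images-copied");
        if (Files.exists(marker)) {
            return; // already done on a previous launch
        }
        Files.createDirectories(documentsDir);
        try (DirectoryStream<Path> images = Files.newDirectoryStream(bundleDir, "*.png")) {
            for (Path image : images) {
                Files.copy(image, documentsDir.resolve(image.getFileName()),
                        StandardCopyOption.REPLACE_EXISTING);
            }
        }
        Files.createFile(marker); // record that the one-time copy has happened
    }
}
```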
No worries, performance will not be an issue, unless you use a particularly unwise image lookup algorithm.
Filesystem traversal should be pretty fast for such a small amount of files.
Before implementing something yourself, I would recommend looking at something off the shelf for image caching, namely EGOImageView from EnormEgo.
I have used it in several applications that depend on grabbing images from URLs. It handles everything for you: you just set up a 'background' image for it to show while it goes about its business of grabbing the URL-based image in the background. The second time you use it, it's available immediately. Definitely gets my vote for ease of use...
P.S. It's free to use.

How to store a very large image on an iPhone/iPad

What's the best way to store a very large image for an iOS app? I want an app to be able to view images that might be hundreds of megabytes, perhaps as much as a gigabyte, as JPEG. I need to be able to store the image and retrieve selected areas for display.
Currently the images are cut into 512x512-pixel tiles and stored as JPEG files in a directory tree with tens of thousands of tiles (actually an image pyramid, including downsampled levels).
Ignoring the question of displaying the image, I'm interested in the most efficient, manageable way to store this data on the device: files, as they currently are, an SQLite database, or something else?
Second part to the question: is there a limit to the amount of data an app can store, or can an app keep importing data up to the storage limit of the device? I'm asking here about data that an app imports after it's installed.
The solution to this is to pre-tile the enormous image so the tiles can be quickly retrieved from the file system on an as-needed basis. One problem with very large images is that most solutions require the whole image to be rendered into a context, consuming vast amounts of memory. On a system like iOS, where memory is limited, the way to solve this is to use a library like libjpeg or libjpeg-turbo to decode the image a line at a time, then save the pixels into a raw file. The downside to doing this directly is that when you need one tile, you have to jump all over the file finding each row of that tile. Thus a better solution is to not only scan incrementally, but tile incrementally too. You can use mmap to map just the area of the file you need, which really minimizes memory consumption. That said, you can thrash the unified buffer cache on iOS so badly that the app crashes, or even the whole system!
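To make the mapping idea concrete, here is a sketch in Java (the project mentioned below does this in C with mmap directly); it assumes the incremental tiler wrote the tiles contiguously into one raw RGBA file, which is an assumption of this sketch, not a description of PhotoScrollerNetwork:

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class TileReader {
    private static final int TILE_SIZE = 512;          // 512x512 tiles, as above
    private static final int BYTES_PER_PIXEL = 4;      // assume RGBA
    private static final long TILE_BYTES = (long) TILE_SIZE * TILE_SIZE * BYTES_PER_PIXEL;

    // Map only the bytes of one pre-built tile into memory, never the whole image.
    public static MappedByteBuffer mapTile(Path tiledFile, long tileIndex) throws IOException {
        try (FileChannel channel = FileChannel.open(tiledFile, StandardOpenOption.READ)) {
            return channel.map(FileChannel.MapMode.READ_ONLY,
                               tileIndex * TILE_BYTES, TILE_BYTES);
        }
    }
}
```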
If you are curious about how to implement the above solution, there is a freely available project on github - PhotoScrollerNetwork - that does all the above.
A sample from Apple: PhotoScroller
What about splitting it into parts? Then it can be gathered by your application as needed.

Downloading a large number of images, making it faster

I need help with downloading from a web server...
What I currently do is get an XML file from the web server that contains image locations, parse the XML, download each image, store the image on the iPhone, and store the image name in an SQL database.
This takes a lot of time because there is a large number of images to download and I am downloading them one by one.
My app update just got rejected because the reviewer decided that the download takes too long... What is funny is that the last two updates passed without problems.
I was thinking about zipping those images on the server and sending the zip file to the iPhone and unzipping it there, or packing the images together with the binary and submitting that to Apple.
Any advice on how to make the download faster would be appreciated. Thanks.
BTW, zip won't help with images. They are already compressed, so it will just add overhead. Make sure your images are not any larger than you need for display, and I'd do what Mario suggested above and download them in multiple async calls (at least make the one big call asynchronous).
A key principle of UI design is to display partial results (unless they are invalid or misleading) so that the user understands that progress is being made.
If you really need all the images to make it valid, you can download a few and display them grayed out (alpha = 0.4) or something so that it's clear that this is a partial result, but that progress is being made. The reviewer probably felt that it was taking too long to start up.
Do you change those images often? Or only once per release, if at all? If they change only with each release, I'd package them. If they're almost never changed, go with the one huge download (so people don't have to re-download when updating), and if they change often, download them file by file, but try to do 2-3 files at once using asynchronous download (if supported).
1) I would use something like an NSOperationQueue to download around three images at a time in the background (see the sketch after this list). Much more than that and the UI starts getting choppy.
2) Also display some kind of loading indicator while this is going on.
3) What format are your images in? If you are transferring over the network you should use JPG, and consider setting the quality level to something smaller (say 6 or even 5). To offset the loss of quality you could send down larger images; even with the larger number of pixels you can easily be better off with lower-quality compression.
4) If you have to use PNG to preserve transparency, consider using PNGCrush on the images before sending. As noted, zip will do pretty much nothing.
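To illustrate point 1, here is a sketch of the bounded-concurrency pattern in plain Java using a fixed-size thread pool; on iOS the equivalent would be an NSOperationQueue with maxConcurrentOperationCount set to about three. The URL list, output directory, and error handling are placeholders:

```java
import java.io.InputStream;
import java.net.URI;
import java.nio.file.*;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ImageDownloader {
    // Download images with at most three transfers in flight at once.
    public static void downloadAll(List<String> urls, Path outputDir) throws Exception {
        Files.createDirectories(outputDir);
        ExecutorService pool = Executors.newFixedThreadPool(3);
        for (String url : urls) {
            pool.execute(() -> {
                String fileName = url.substring(url.lastIndexOf('/') + 1);
                try (InputStream in = URI.create(url).toURL().openStream()) {
                    Files.copy(in, outputDir.resolve(fileName),
                            StandardCopyOption.REPLACE_EXISTING);
                } catch (Exception e) {
                    System.err.println("Failed to download " + url + ": " + e.getMessage());
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
    }
}
```

The pool size is the whole trick: all downloads are queued up front, but only three are ever in flight at once, which keeps the UI responsive.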
One way to speed up download of those images is to put them on a CDN. Some CDNs, like Limelight have special network optimizations for sending data to mobile devices. They also just do a better job of routing content, and have higher capacity for transmitting content. What's nice about this approach is that you might not have to change your app. However CDNs can be pricy.
Likely, your images are just way too large. You said you're worried about the 20 MB app limit, but I think at that point your images are simply too large for the phone.
Rather than zipping the files, I'm pretty sure you need to downsample the images. Not only that, but you should only download the ones that you need, when you need them.
If you still want to have bulk downloads, why not have it as a side option rather than the default implementation?

How to decrease the application size

My problem is that my application size is very large.
Is there any way to reduce the size of the application?
If I build the application without the content, and the content is uploaded to my server, how do I sync the application with the content on my server?
What I want is that once the user downloads the application and then uses it, we stream the content and save it to his documents folder.
Once the user has streamed the content, streaming should never be required again.
Is this possible?
Thanks,
Reducing the size of your application depends on the TYPE of content in your application. I highly doubt that the application code is the cause, and since you did not mention what the contents are, I am assuming they are some kind of resource.
If your resources are images, try to use image compression programs, convert them to smaller images, or otherwise optimize them.
If your resources are documents, text files, or other files that compress well when zipped, then you can try zipping your resources and accessing them inside the compressed file (this will mean additional coding, and probably slower performance).
These are just examples.
It is not advisable to stream large contents because it uses the network bandwidth which, depending on the user's plan, can cause a big spike in phone bills.
Yes, it is possible to download your content and save it to the application's documents folder when the user runs your application for the first time. Though it may affect your user's first impression, as it will take time to download.

Downloading multiple items in a package

I need to allow users to download multiple images in a single download. This download will include an SQL file and images. Once the download completes, the SQL will execute, inserting text into an SQLite database. This text will include references to the downloaded images. The text and images are rendered in a UIWebView.
What is the best way to download everything in a single download? I was thinking of using a bundle, since it can be loaded at runtime, but I'm not sure of any limitations/restrictions in this scenario. I have tested putting the bundle into the Documents folder and then accessing resources inside of it. That seems to work fine in a simple test.
You're downloading everything through a socket, which only knows about bytes, so a bundle, or even a file, doesn't "naturally" transfer through: the server side opens files, encodes them, and sends them into the connection, and the client reads from the socket and reconstructs the original file structure.
Assuming the application has a UI for picking which items need to be transferred, it could then request all the items from the server, and the server could send all the items through the single connection with some delimitation you invent, so that the iPhone app can split the stream back into the individual files.
Another option is that the client could just perform individual HTTP requests for the different files, through pretty straightforward use of NSURLConnection.
The former sounds like an attempt to optimize the latter. Have you already tested and verified that the latter is too slow/inefficient? It definitely is more complex to implement.
There is a latency issue with multiple HTTP connections run in sequence; however, you can perhaps mitigate it by running multiple download connections in parallel, for example through an NSOperationQueue with a limit of 2 to 5 concurrent download operations.