Performance issue downloading images from the server on iPhone

We have an application on iPhone. This application displays 25 products per page/screen. The text items, such as product name, price, discount, URL of the product image, etc., of all 25 products are downloaded from the server first.
After that we make 25 synchronous requests to download the 25 product images, one after another. Each image is about 25 KB in size and approximately 300 by 400 pixels, while we only need 72 by 72 pixel images for display on the iPhone. We notice that it takes about 40 seconds to display one screen/page, and this sort of performance is not good. So we are investigating how to increase the performance.
Will the performance improve if we scale down the images on the server to 72 by 72 pixels?
Also, is it possible to download all 25 images from the server to the iPhone in one batch? If so, can you please share your approaches on how to do that? We want to do this only if it can improve the performance.

I suggest you ask this on Stack Overflow as already mentioned.
From a programmer's perspective, if you only need 72x72 images, you should definitely fetch them at that size. You'll be saving bandwidth, battery, and processing power.
Also, 25 synchronous requests seems like a bad idea. Why not fetch an entire page (or two pages) at the same time?
A URL request is "slow" by nature, so the fewer you make, the faster it will work.
I'd modify the server to allow batch fetching, as in "give me the first 25", and then process them locally. Then you can fetch the next 25 asynchronously (and preemptively), so that when the user presses Next you already have it (and always have 1 or 2 pages in advance).
Use a cache: save the images locally if you can, and always check whether a page is available locally, so you don't have to re-fetch the records if the user presses Back and then Next again. What's downloaded stays downloaded :) Product pages shouldn't change that often. A sketch of this approach follows.
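Here is a minimal sketch of the idea in Swift, assuming a hypothetical paged endpoint that returns the 25 thumbnail URLs as a JSON array; the endpoint, JSON shape, and cache policy are all assumptions to adapt to your API:

    import Foundation

    /// Page-at-a-time fetching with a local cache and preemptive loading.
    /// Not thread-safe as written; a real app would wrap this in an actor.
    final class ProductPageLoader {
        private var pageCache: [Int: [Data]] = [:]   // page index -> image data

        func images(forPage page: Int) async throws -> [Data] {
            if let cached = pageCache[page] { return cached }   // what's downloaded stays downloaded

            // One metadata request for the whole page instead of 25 tiny ones.
            let metaURL = URL(string: "https://example.com/products?page=\(page)&count=25")!
            let (metaData, _) = try await URLSession.shared.data(from: metaURL)
            let imageURLs = try JSONDecoder().decode([URL].self, from: metaData)

            // Download the page's thumbnails concurrently, not one after another.
            let images = try await withThrowingTaskGroup(of: (Int, Data).self) { group -> [Data] in
                for (i, url) in imageURLs.enumerated() {
                    group.addTask {
                        let (data, _) = try await URLSession.shared.data(from: url)
                        return (i, data)
                    }
                }
                var result = [Data?](repeating: nil, count: imageURLs.count)
                for try await (i, data) in group { result[i] = data }
                return result.compactMap { $0 }
            }

            pageCache[page] = images
            // Preemptively warm the next page so pressing Next feels instant.
            Task { try? await self.images(forPage: page + 1) }
            return images
        }
    }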
For more specific implementations, I suggest you jump over to Stack Overflow.

Related

How to archive live streaming in Azure Media Services

I am trying to use Azure Media Services for 24x7 live streaming and also trying to persist the streamed video. The documentation says a Live Output can set archiveWindowLength up to 25 hours for VOD, but I am not able to persist the whole streamed video.
Any idea about how to achieve this? I am quite new to this area. Any help is appreciated.
The DVR window length for a single LiveOutput is 25 hours. The reason for the 25 hours is to provide a 1-hour overlap for you to switch to a second LiveOutput with a new Asset underneath.
Typically how I set this up is to have an Azure Function and Logic App running with a timer to ping-pong between two LiveOutputs. You have to create a new LiveOutput with a new Asset.
Think of LiveOutputs as "tape recorders" and the Asset as the "tape". You have to swap between tape recorders and switch tapes every xx hours.
You do not necessarily have to wait a full 25 hours, though. I actually recommend not doing that, because the manifest gets really huge. Loading such a large HLS or DASH manifest on a client can really mess with memory and cause some bad things to happen. So you could consider doing the ping-pong between your "tape recorders" every 1 hour.
If you wish to "publish" the live event to your audience with a smaller DVR window (say 10 minutes or 30 minutes), you could additionally create a third LiveOutput and Asset, leave that one set to a DVR window of 30 minutes, and leave it running forever.
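As a rough illustration of the ping-pong logic (the kind of thing your timer-triggered Azure Function would do each hour), here is a sketch; MediaClient and its methods are hypothetical stand-ins, not real SDK calls:

    import Foundation

    // Hypothetical wrapper around the Azure Media Services REST API;
    // createAsset / createLiveOutput / deleteLiveOutput are illustrative
    // stand-ins, not real SDK calls.
    protocol MediaClient {
        func createAsset(named name: String) throws -> String
        func createLiveOutput(named name: String, assetID: String) throws
        func deleteLiveOutput(named name: String) throws
    }

    /// Rotate between two LiveOutputs each hour so every archive (and its
    /// HLS/DASH manifest) stays small while recording never stops.
    func rotateLiveOutputs(client: MediaClient, hour: Int) throws {
        let current = "liveOutput-\(hour % 2)"         // ping...
        let previous = "liveOutput-\((hour + 1) % 2)"  // ...pong

        // Start the new "tape recorder" with a fresh "tape" first, then stop
        // the old one; the overlap means no part of the stream is lost.
        let assetID = try client.createAsset(named: "archive-\(hour)")
        try client.createLiveOutput(named: current, assetID: assetID)
        try client.deleteLiveOutput(named: previous)
    }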

Import multiple photos to parse.com database

I'm trying to build a navigation app with place locations and their photos.
I have 200 spot location names (String), their locations (GeoPoints), and their images (JPG).
Is it possible to upload the database, including the images, in one go?
I only managed to upload the String and GeoPoint data using JSON, but I still can't do it for the image files.
Anyway, clicking one by one is definitely not an option. I have 200 images and still counting; it might reach 500 or more in several weeks.
Thank you in advance.
How large are the images?
If you can scale the photos down a little and use multiple threads on the HTTP client being used with parse.com, then you should be able to saturate the Wi-Fi / ISP bandwidth available to your device.
I.e., if you've got 10 Mb/s available upstream to the ISP, then you ought to be able to use multiple async connections so that you are pushing close to 10 Mb/s of photos to parse.com.
It probably won't help much (parse - android example), but this was precisely the target of this question: 63 photos (each 70 KB in size) uploaded in 3 seconds total.
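As a minimal sketch of the multi-connection idea, shown in Swift against Parse's REST file endpoint; "YOUR_APP_ID" and "YOUR_REST_KEY" are placeholders, and for hundreds of files you may want to cap how many tasks run at once:

    import Foundation

    /// Upload photos over several concurrent connections instead of one at a time.
    func uploadPhotos(_ files: [URL]) async throws {
        try await withThrowingTaskGroup(of: Void.self) { group in
            for file in files {
                group.addTask {
                    let endpoint = URL(string: "https://api.parse.com/1/files/\(file.lastPathComponent)")!
                    var request = URLRequest(url: endpoint)
                    request.httpMethod = "POST"
                    request.setValue("YOUR_APP_ID", forHTTPHeaderField: "X-Parse-Application-Id")
                    request.setValue("YOUR_REST_KEY", forHTTPHeaderField: "X-Parse-REST-API-Key")
                    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
                    let data = try Data(contentsOf: file)
                    // Each upload runs as its own task, so several are in flight at once.
                    _ = try await URLSession.shared.upload(for: request, from: data)
                }
            }
            try await group.waitForAll()
        }
    }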

Facebook Graph Latency

The following code fragment
    for ($i = 0; $i < 60; $i++) {
        $u[$i] = $_REQUEST["u" . $i];
        // imagecreatefromjpeg() opens a separate blocking HTTP request per user
        $pic[$i] = imagecreatefromjpeg("http://graph.facebook.com/" . $u[$i] . "/picture");
    }
is taking more than 90 seconds to execute on my new server. It was taking less than 15 seconds on my shared hosting server; however, on the dedicated server it is taking more than 90 seconds.
The data center of my new server is Asia Pacific.
Please advise on how I can reduce this time of fetching images from the Graph.
Thanks and regards.
Why not just request all the pictures' URLs in a single call?
https://graph.facebook.com/?fields=picture&ids=[CSV LIST OF IDS]&access_token=ACCESS_TOKEN
You'll then have a list of all the images and can fetch them all however you so wish
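Here is a sketch of that two-step pattern: one metadata call, then the images fetched in parallel. The example is in Swift, but the same flow works from PHP with a single cURL call; the response shape varies by Graph API version (picture may be a plain URL string or an object with data.url), so treat the decoding as an assumption:

    import Foundation

    // Assumes the older response shape { "<id>": { "picture": "<url>" }, ... };
    // newer Graph versions nest the URL as picture.data.url instead.
    struct PictureEntry: Decodable { let picture: String }

    /// One Graph call returns every picture URL; the images can then be
    /// fetched in parallel instead of 60 serial requests.
    func fetchPictureURLs(ids: [String], accessToken: String) async throws -> [String] {
        let csv = ids.joined(separator: ",")
        let url = URL(string: "https://graph.facebook.com/?fields=picture&ids=\(csv)&access_token=\(accessToken)")!
        let (data, _) = try await URLSession.shared.data(from: url)
        let entries = try JSONDecoder().decode([String: PictureEntry].self, from: data)
        return entries.values.map { $0.picture }
    }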
> is taking more than 90 seconds to execute on my new server.
Well, for 60 HTTP requests that's not too bad, I'd say.
> It was taking less than 15 seconds on my shared hosting server; however, on the dedicated server it is taking more than 90 seconds.
Maybe the connection of your old server was just faster?
> The data center of my new server is Asia Pacific.
Do you know by any chance which one it was before?
> Please advise on how I can reduce this time of fetching images from the Graph.
Do you have to request all these images in one go?
Maybe your app's workflow (which we don't know anything about yet) would allow for other approaches, like fetching user images at an earlier time (e.g. when a user starts using your app) and caching them locally, so that you don't have to do 60+ HTTP requests in one go.

Performance issue downloading images from the server on iPhone

1. If you resize them to 72x72, you will have a smaller total download size, so it's faster.
2. For batching I don't have a solution, but you could make an asynchronous request for each file. While a file is downloading, show a temporary placeholder image (a logo or something); when the image has downloaded, replace the placeholder with it (see the sketch below).
You can keep the images in a cache so that you don't download them every time.
For asynchronous downloads you can use ASIHTTPRequest (it also has a cache class).
If you make synchronous requests, your GUI will freeze until they are finished.
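ASIHTTPRequest has long been unmaintained, so here is the same placeholder-then-replace pattern as a minimal sketch using the built-in URLSession; the placeholder asset name is an assumption:

    import UIKit

    /// Show a placeholder immediately, then swap in the real image once the
    /// asynchronous download finishes, so the GUI never freezes.
    func setProductImage(on imageView: UIImageView, from url: URL) {
        imageView.image = UIImage(named: "placeholder")   // temporary logo
        URLSession.shared.dataTask(with: url) { data, _, _ in
            guard let data = data, let image = UIImage(data: data) else { return }
            DispatchQueue.main.async {                    // UI updates on the main thread
                imageView.image = image
            }
        }.resume()
    }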
First off, scaling the images on the server is a complete no-brainer: there's no need to download any more data than you absolutely have to.
Once you've done that you'll see a marked performance improvement, which you can further increase by using placeholder images and downloading the real images in the background asynchronously. (The ASIHTTPRequest library is a nice wrapper for such functionality.)
Finally, if appropriate, you should use an image cache and store the images locally (perhaps with references in an SQLite database). However, you'll need to perform maintenance on this occasionally to keep it within a sensible file-size limit; a sketch of such a cache follows.
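A minimal sketch of a disk cache with occasional size maintenance, assuming the Caches directory and an arbitrary 10 MB budget (the SQLite bookkeeping is omitted):

    import Foundation

    /// Simple disk cache with occasional size maintenance.
    enum ImageDiskCache {
        static let directory = FileManager.default.urls(for: .cachesDirectory,
                                                        in: .userDomainMask)[0]

        static func store(_ data: Data, key: String) {
            try? data.write(to: directory.appendingPathComponent(key))
        }

        static func load(key: String) -> Data? {
            try? Data(contentsOf: directory.appendingPathComponent(key))
        }

        /// Delete the oldest files until the cache fits the budget.
        static func trim(toBytes budget: Int = 10_000_000) {
            let fm = FileManager.default
            guard let files = try? fm.contentsOfDirectory(
                at: directory,
                includingPropertiesForKeys: [.contentModificationDateKey, .fileSizeKey]
            ) else { return }

            var entries: [(url: URL, date: Date, size: Int)] = []
            for url in files {
                guard let values = try? url.resourceValues(
                          forKeys: [.contentModificationDateKey, .fileSizeKey]),
                      let date = values.contentModificationDate,
                      let size = values.fileSize else { continue }
                entries.append((url, date, size))
            }
            entries.sort { $0.date < $1.date }          // oldest first
            var total = entries.reduce(0) { $0 + $1.size }
            for entry in entries where total > budget {
                try? fm.removeItem(at: entry.url)
                total -= entry.size
            }
        }
    }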
You can use the SDWebImage framework for downloading images from a server on iOS.
See the SDWebImage project page for documentation on how to use it.
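Typical usage is a one-liner. A minimal sketch, assuming the SDWebImage library is added to the project (the URL and placeholder asset are placeholders); SDWebImage downloads asynchronously and handles memory and disk caching for you:

    import UIKit
    import SDWebImage

    func showRemoteImage(in imageView: UIImageView, from urlString: String) {
        // Downloads asynchronously, shows the placeholder meanwhile,
        // and caches the result to memory and disk.
        imageView.sd_setImage(with: URL(string: urlString),
                              placeholderImage: UIImage(named: "placeholder"))
    }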

Downloading a large number of images, making it faster

I need help with downloading from a web server...
What I currently do is fetch an XML file from the web server that contains image locations, parse the XML, download each image, store the image on the iPhone, and store the image name in a SQL database.
This takes a lot of time because there is a large number of images to download and I am downloading them one by one.
My app update just got rejected because the reviewer decided that downloading takes too long... What is funny, the last two updates passed without problems.
I was thinking about zipping those images on the server and sending the zip file to the iPhone, unzipping it there, or packing the images together with the binary and sending that to Apple.
Any advice on how to make the download faster would be appreciated. Thanks.
BTW, zip won't help with images. They are already compressed, so it will just add overhead. Make sure your images are not any larger than you need for display, and I'd do what Mario suggested above and download them in multiple async calls (at least make the one big call asynchronous).
A key principle of UI design is to display partial results (unless they are invalid or misleading) so that the user understands that progress is being made.
If you really need all the images to make the screen valid, you can download a few and display them grayed out (alpha = 0.4) or something, so that it's clear this is a partial result but progress is being made (see the sketch below). The reviewer probably felt that it was taking too long to start up.
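A minimal sketch of that dimmed-partial-result idea; the alpha values and animation duration are arbitrary choices:

    import UIKit

    /// Show each image dimmed as soon as it arrives, so the user sees progress.
    func show(_ image: UIImage, in imageView: UIImageView) {
        imageView.image = image
        imageView.alpha = 0.4                        // clearly a partial result
    }

    /// Once the whole set has loaded, bring everything to full opacity.
    func markPageComplete(_ imageViews: [UIImageView]) {
        UIView.animate(withDuration: 0.3) {
            imageViews.forEach { $0.alpha = 1.0 }
        }
    }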
Do you change those images often, or only once per release, if at all? If they change with each release only, I'd package them. If they almost never change, go with the one huge download (so people don't have to re-download when updating), and if they change often, download them file by file, but try to do 2-3 files at once using asynchronous downloads (if supported).
1) I would use something like an NSOperationQueue to download around three images at a time in the background. Much more than that and the UI starts getting choppy.
2) Also display some kind of loading indicator while this is going on.
3) What format are your images in? If you are transferring over the network you should use JPG, and consider setting the quality level to something smaller (say 6 or even 5). To offset the loss of quality you could send down larger images; even with the larger number of pixels you can easily be better off with a lower quality compression.
4) If you have to use PNG to preserve transparency, consider using PNGCrush on the images before sending. As noted, zip will do pretty much nothing.
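A minimal sketch of point 1, using OperationQueue (the Swift name for NSOperationQueue); the URL list and the save callback are assumptions:

    import Foundation

    let downloadQueue = OperationQueue()

    /// Download images about three at a time in the background;
    /// much more than that and the UI starts getting choppy.
    func enqueueDownloads(_ imageURLs: [URL], save: @escaping (Data, URL) -> Void) {
        downloadQueue.maxConcurrentOperationCount = 3
        for url in imageURLs {
            downloadQueue.addOperation {
                // A blocking fetch is acceptable here because we are off the main thread.
                if let data = try? Data(contentsOf: url) {
                    save(data, url)
                }
            }
        }
    }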
One way to speed up the download of those images is to put them on a CDN. Some CDNs, like Limelight, have special network optimizations for sending data to mobile devices. They also just do a better job of routing content and have higher capacity for transmitting content. What's nice about this approach is that you might not have to change your app. However, CDNs can be pricey.
Likely, your images are just way too large. You said you're worried about the 20MB app limit, but at that point your images are too large for the phone anyway.
Rather than zipping the files, I'm pretty sure you need to downsample the images. Not only that, but you should only download the ones that you need, when you need them.
If you still want to have bulk downloads, why not have it as a side option rather than the default implementation?