iPhone network performance

I have a question and I am very open to suggestions (even very odd ones!)
I am writing an iPhone app, which does a request (URL with parameters) to a server. As a response, the iPhone receives XML. All is well.
Right now, I am looking to improve my application's speed by measuring the time it takes to perform certain tasks. I've found that, of all the tasks performed (downloading, XML parsing, sending the request, handling the request, parsing objects from XML), downloading the actual XML takes the longest.
Now, my XML files are very, very simple and very, very small. I use them primarily to read RSS-like data and show the entries in a UITableView.
My app works very well, and nothing feels really slow, but there is one application in the App Store right now that does something very similar to mine, yet is way faster and feels more 'snappy', if you know what I mean. It also has the great feature of loading the headlines one by one from the RSS feed.
Currently I'm experimenting with gzip compression of my data, but the compression only halves the size of the data and doesn't seem to do any real good for performance. The main issue is that the data has to be downloaded before it gets parsed. It would be very cool to have a 'stream' of data that is parsed as it comes in. That way, I could do the two jobs almost simultaneously and load headlines one by one (which would make the app feel more interactive).
Does anyone have an idea of how to improve my performance? Either great compression tips or entirely different ways to communicate with the server. All suggestions are welcome!
UPDATE: putting the latency and responsiveness of the server aside, how could I get a source of XML 'streamed' to my iPhone (downloaded byte by byte) and parsed at the same time? Right now it is a linear process of downloading -> parsing -> showing, but it could become semi-parallel by downloading and parsing at the same time (and showing each item as soon as it is done downloading, instead of loading everything into the UITableView at once).

Assuming you're using NSXMLParser's initWithContentsOfURL:, that's probably part of the problem. It seems like that downloads the entire contents of the URL, then hands it off to the parser to parse all at once.
Despite the fact that the NSXMLParser is an event-driven parser, it doesn't seem to support streaming data to the parser in an incremental manner. You could, of course, replace NSXMLParser with some other parsing library that handles incremental data in a more sensible way.
An alternative would be to use NSURLConnection, and create a new NSXMLParser and re-parse the data each time some data comes in, in the connection:didReceiveData: method of your NSURLConnection's delegate. You'd have to write some extra code to ignore the extra events from re-parsing the beginning of the file more than once.
This seems like it'd be more work than just grabbing some other library and adapting it, but maybe not, depending on how you're handling the downstream creation of your table data.
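A minimal sketch of that re-parse approach, assuming the delegate class keeps receivedData and connection properties (names chosen here for illustration) and implements the NSXMLParserDelegate callbacks elsewhere; the feed URL is a placeholder:

    // Kick off an asynchronous download; the delegate callbacks below do the work.
    - (void)startDownload {
        NSURL *url = [NSURL URLWithString:@"https://example.com/feed.xml"]; // placeholder URL
        NSURLRequest *request = [NSURLRequest requestWithURL:url];
        self.receivedData = [NSMutableData data];
        self.connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
    }

    - (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
        [self.receivedData appendData:data];

        // Re-parse everything received so far. The NSXMLParserDelegate has to
        // ignore events for items it already emitted on earlier passes.
        NSXMLParser *parser = [[NSXMLParser alloc] initWithData:self.receivedData];
        [parser setDelegate:self];
        [parser parse];
    }

    - (void)connectionDidFinishLoading:(NSURLConnection *)connection {
        // One final pass over the complete document.
        NSXMLParser *parser = [[NSXMLParser alloc] initWithData:self.receivedData];
        [parser setDelegate:self];
        [parser parse];
    }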

If NSMutableArray is the underlying data structure of your UITableView, you should try using initWithContentsOfURL:. The format on the server needs to be Apple's "plist" XML, which is easy to generate. I'm guessing that if Cocoa already has resources for processing XML, it would be quicker to use them instead of creating your own XML parser instance.
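For example (a hedged sketch; the URL and property names are placeholders, and note that this call blocks until the download finishes, so it should not run on the main thread):

    // Load a plist-formatted array straight from the server.
    // Blocks the calling thread, so run it off the main thread.
    NSURL *url = [NSURL URLWithString:@"https://example.com/headlines.plist"]; // placeholder URL
    NSMutableArray *headlines = [[NSMutableArray alloc] initWithContentsOfURL:url];
    if (headlines != nil) {
        self.tableData = headlines;     // assumed property backing the UITableView
        [self.tableView reloadData];
    }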

How are you parsing the XML data? The need to parse XML as you receive it is the whole reason pull parsing was invented, and Cocoa has NSXMLParser, an event-driven parser that operates on a stream of data...
That also means compression of the data is counterproductive.
For very small amounts of XML, the issue of speed is probably mostly about connection latency.
Thus, the other big factor could be the DNS lookup. You could do that lookup yourself once beforehand and cache the IP, perhaps rechecking only on a failure to connect to your server...
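A small sketch of pre-resolving the hostname once at launch (the hostname and queue choice are illustrative; gethostbyname here simply warms the system's DNS cache so the first real request doesn't pay the lookup cost):

    #include <netdb.h>

    // Resolve the server's hostname once in the background. Placeholder hostname.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        struct hostent *host = gethostbyname("api.example.com");
        if (host == NULL) {
            NSLog(@"DNS prewarm failed (h_errno %d)", h_errno);
        }
    });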

Since your XML files are very small (just 8 KB or so, I imagine?), I don't think you'd see a big performance boost from parsing them as they come in. You can use an asynchronous NSURLConnection to do that, but I can't see it helping very much with a small file. Have you considered the time it takes to generate the XML file on the server? Are you using PHP to create the XML, or accessing a static file? For testing purposes, it might be interesting to access static files and see how that compares.
I ran into a problem like this a while ago, and it turned out the MySQL database on my server was being slow; it really had nothing to do with my app.
Also, use Instruments to look at the amount of memory that is allocated while you're downloading and parsing an XML document. Some XML parsers load the entire XML document into memory and then allow you to index randomly into the document's elements, while others provide step by step parsing only. The step-by-step method is much better for limited devices like the iPhone and it's probably what you're using, but a quick look at memory consumption should tell you.

Related

How do I start and stop the flow of a connection to an internet server?

I'm using an ESP8266 with ESP8266WiFi and ESP8266HTTPClient libraries. My app doesn't have enough memory to download the entire JSON file that I need, but all I really need is a few fields from it, so I can discard most of it as I read it in.
What I don't understand is how to start, stop, or otherwise slow down the incoming data so that I can process it and pick out what I need as it comes in from the server. I have to use a fairly small buffer when I make the connection due to memory limitations caused by the rest of the program.
Is there a way to fill the buffer from the server, pause the transmission, process and clear the data in the buffer, and then resume the transmission until the whole JSON file is processed?
Sounds like you will want to use a streaming JSON parser. There are a couple of forks of such a library on GitHub. https://github.com/mrfaptastic/json-streaming-parser2 seems to be the one still maintained.

What are the best practices to cache the data?

What are the best practices for caching data in iOS apps connected to a data source via a web service?
You should look at NSCache:
http://developer.apple.com/library/mac/#documentation/Cocoa/Reference/NSCache_Class/Reference/Reference.html
An NSCache object is a collection-like container, or cache, that stores key-value pairs, similar to the NSDictionary class. Developers often incorporate caches to temporarily store objects with transient data that are expensive to create. Reusing these objects can provide performance benefits, because their values do not have to be recalculated. However, the objects are not critical to the application and can be discarded if memory is tight. If discarded, their values will have to be recomputed again when needed.
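A short NSCache sketch (the key, object, and limit values are illustrative):

    // Cache parsed feed items keyed by their URL string.
    NSCache *feedCache = [[NSCache alloc] init];
    feedCache.countLimit = 50;            // optional cap; NSCache may evict sooner under memory pressure

    [feedCache setObject:parsedItems forKey:feedURLString];      // store

    NSArray *items = [feedCache objectForKey:feedURLString];     // read back
    if (items == nil) {
        // Not cached (or already evicted): fetch from the web service and cache again.
    }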
It depends on the type of data:
- For binary data (files): cache your files in the Caches folder using NSFileManager and NSData's writeToFile: (a sketch of this follows below).
- For small amounts of data (ASCII/UTF-8 text): use NSUserDefaults.
- For large amounts of data (ASCII/UTF-8 text): use a sqlite3 database.
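A sketch of the file-based option (the path and file name are illustrative):

    // Write an NSData blob into the app's Caches directory...
    NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                               NSUserDomainMask, YES) lastObject];
    NSString *path = [cachesDir stringByAppendingPathComponent:@"feed.xml"];
    [xmlData writeToFile:path atomically:YES];          // xmlData is an NSData you already have

    // ...and read it back later.
    NSData *cachedData = [NSData dataWithContentsOfFile:path];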
It depends on how much data you want to cache and how you'll be accessing it once you have it cached, and a bunch of other cache management issues.
If you have a small amount of data, you could store that in a dictionary or array, and simply write it out and read it in. But this kind of solution can become slow if you have a lot of data; those reads and writes can take a long time. And flushing a dirty cache to disk means writing the whole object.
You could write individual files, but again, if you have a lot of files that might become a performance issue as well.
Another alternative is to use CoreData. If you have a lot of data (say, many objects) it may make sense to define what those look like as CoreData entities. Then you just store and fetch objects as you need them, falling back to fetching from your web service (and then caching) if the data is not local. You can also optimize other cache management tasks (like expiring unused entries) easily and efficiently using CoreData.
I actually went down this road, with a couple different apps. I started with an NSDictionary, and that became quite slow. I switched to CoreData, which not only simplified a lot of my code for cache initialization and management, but gave the apps quite a performance boost in the process.
If you're using NSURLConnection, or anything that uses NSURLRequest, caching is already taken care of for you:
http://developer.apple.com/library/ios/#documentation/Cocoa/Conceptual/URLLoadingSystem/Tasks/UsingNSURLConnection.html#//apple_ref/doc/uid/20001836-169425
http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/URLLoadingSystem/Concepts/CachePolicies.html#//apple_ref/doc/uid/20001843-BAJEAIEE
By default these use the cache policies of the protocol, which for a web service would be the HTTP headers it returns. This is also true, IIRC, of ASIHttpRequest.
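For example, a request that simply honours whatever caching headers the web service sends back (the URL is a placeholder):

    // NSURLRequestUseProtocolCachePolicy is the default; it follows the
    // HTTP caching headers returned by the server.
    NSURL *url = [NSURL URLWithString:@"https://example.com/api/items"];
    NSURLRequest *request = [NSURLRequest requestWithURL:url
                                             cachePolicy:NSURLRequestUseProtocolCachePolicy
                                         timeoutInterval:30.0];
    [NSURLConnection connectionWithRequest:request delegate:self];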
Core Data also implements its own row and object caching, which works pretty well. So the reality here is that you really don't need to worry about caching when it comes to these things - it's optimizing your use of things like NSDateFormatter that starts to become important (they're expensive to create, not thread safe, etc...)
And when in doubt, use Instruments to find bottlenecks and latency.
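As an illustration of the NSDateFormatter point above, a common pattern is to create the formatter once and reuse it (sketch only; the format string and model object are made up):

    // Reuse one formatter instead of creating one per table row.
    // NSDateFormatter is expensive to create and not thread safe, so only
    // touch this static instance from the main thread.
    static NSDateFormatter *formatter = nil;
    if (formatter == nil) {
        formatter = [[NSDateFormatter alloc] init];
        [formatter setDateFormat:@"yyyy-MM-dd HH:mm"];
    }
    cell.textLabel.text = [formatter stringFromDate:item.publishedDate]; // hypothetical model object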

Opening large file (200mb) on iPhone?

I want to open a large text file and then search its content.
I loaded the file into an NSString with stringWithContentsOfFile.
Everything works with a 30 MB file, but I am concerned about what happens when I load a 200 MB file, which is what I want to do.
Is the complete NSString held in memory? If so, it wouldn't work on the iPhone. Is there a solution for such large files on the iPhone?
A good way to read a large file would be to buffer small chunks of it at a time.
I'm not sure of the exact API methods you could use to do this, but it is fairly standard practice for audio, video, etc. to read a small amount of the file into memory, process it, and remove it from memory as you continue through the file.
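A sketch of that chunked approach using NSFileHandle (the path, chunk size, and search handling are illustrative):

    // Read the file 64 KB at a time so only one small buffer is ever in memory.
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:path];
    NSData *chunk;
    while ((chunk = [handle readDataOfLength:64 * 1024]).length > 0) {
        NSString *piece = [[NSString alloc] initWithData:chunk
                                                encoding:NSUTF8StringEncoding];
        // Search this piece here. A match (or a multi-byte character) that spans
        // a chunk boundary needs extra handling, e.g. keep the tail of the
        // previous chunk around.
        [piece release]; // omit under ARC
    }
    [handle closeFile];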
Since the limit isn't documented, the way to check would be to actually profile it on the target device. People may be able to give their limits here, but that is totally relative. Also, a good memory management scheme and design would help you avoid running out of memory.

iPhone: Load Queue on Startup

I've not found an answer to this question anywhere, but it seems like a typical problem:
I would like to send some POST requests (with ASIHTTPRequest, which I already do), but if something goes wrong, the user can decide to "Try Later". That means the task should be put on a queue, and this queue should be read the next time the application starts. So, that's my question: how do I "save" the queue so that the app can read it the next time it starts? Is it possible to "read" the queue and try sending the POST request again, say, 10 minutes later, even if the application is not running?
What kind of documentation should I read in order to be able to do this?
I would be very glad to hear any answers. Thanks in advance.
P.S.: Another idea I have: since I just have to upload photos, I could have a folder with all the photos that still need to be uploaded, and when the app starts, it looks at this folder and tries to send all the photos in it. Does that make sense?
My approach for this issue would be like this:
- Whenever you fail to send, write the content of the array to a file using [NSArray writeToFile:atomically:]. You can use serialization if the array contains custom-defined objects (standard Cocoa objects such as NSString and NSData already implement it).
- When the app launches, load the content of the file directly into an array object ([NSArray arrayWithContentsOfFile:]).
- Then construct the HTTP request and try sending again. Only the data (in your case, the array) is stored/serialized, not the request; you need to reconstruct the HTTP request each time you retry (don't try serializing ASIHTTPRequest itself, you have to rebuild it).
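A sketch of that flow (the file name and array contents are illustrative; the array holds plain NSString photo paths so writeToFile: works directly):

    // On failure: persist the paths of the photos that still need uploading.
    NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                             NSUserDomainMask, YES) lastObject];
    NSString *queuePath = [docsDir stringByAppendingPathComponent:@"pendingUploads.plist"];
    [pendingUploadPaths writeToFile:queuePath atomically:YES];

    // On the next launch: reload the list and rebuild a fresh request per photo.
    NSArray *saved = [NSArray arrayWithContentsOfFile:queuePath];
    for (NSString *photoPath in saved) {
        // Build a new ASIHTTPRequest POST for photoPath and start it here.
    }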
I'm going to assume you've already looked at NSOperationQueue and NSOperation. AFAIK there is no built-in support for serializing NSOperation, but you could very easily write your own serialization mechanism for an NSOperation subclass that you use for posting data, and write an NSOperationQueue's operations to disk if something goes wrong.
Without knowing too many details it's hard to give a precise answer. There are many ways to write data to disk and load it again later, the direction you take will be largely dependent on your situation.

Accessing XML data online?

I am just testing an app to get data off our web server, previously I had been using:
NSURL, NSURLRequest, NSURLConnection etc. to get the data that I wanted.
But I have just noticed that if I swap to using XML I can simply do the following and pass the results to NSXMLParser:
NSURL *url = [NSURL URLWithString:@"https://www.fuzzygoat.com/turbine?nbytes=1&fmt=xml"];
Am I right in thinking that if you're just after XML, this is an acceptable method? It just seems strangely short compared to what I was doing before.
gary
That code only creates a URL object that represents a URL. It doesn't make any request or download any data. You still need to use NSURLRequest and NSURLConnection in order to actually download any data from the server.
Also, stay away from methods like initWithContentsOfURL: and friends, unless you understand that they will block the thread they are called on until they complete. For network requests these methods shouldn't be used, because they'll block your UI for an indeterminate time: you can't predict how fast the internet connection will be wherever the app is used.
NSURLConnection's asynchronous request system is exactly what you need. It won't block the UI, and provides a nice encapsulated interface to downloading data from a remote location.
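A minimal sketch of the asynchronous version (the connection property and the delegate methods are assumed to exist in your class):

    // The NSURL alone downloads nothing; the request plus connection do the work,
    // calling the delegate as data arrives.
    NSURL *url = [NSURL URLWithString:@"https://www.fuzzygoat.com/turbine?nbytes=1&fmt=xml"];
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    self.connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
    // Implement connection:didReceiveData:, connectionDidFinishLoading: and
    // connection:didFailWithError: to collect the data, then hand it to NSXMLParser.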
This is definitely the right way to go. There do exist many different connection methods (including my favorite, ASIHTTPRequest) and many, many different xml parsers (including my favorite, KissXML) that are faster or more memory efficient than the Apple built in methods.
But to answer your question, yes, your logic and design pattern is correct.
UPDATE: Because Jasarien seems to think the question is about asynchronous actions, I will discuss that here. ASIHTTPRequest handles async very, very easily. Just check out the quick samples.
Depending on how much XML you get back from the web service, NSXMLParser may not be ideal because the entire XML document has to be read into memory.
Memory is pretty scarce on an iPhone, so using a SAX parser like that in libxml2 is probably better for larger XML files. Instead of reading the entire document into memory, the XML is streamed through and parsed for specific nodes of interest. The memory overhead is lower because less data is stored at once.
Once a node of interest is parsed, an event handler is called to do something useful, like store the node data somewhere.
In this case, take a look at Apple's XMLPerformance sample project for example code.
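For reference, a rough sketch of the libxml2 push-parser pattern that sample follows, fed from an NSURLConnection delegate (the _xmlContext ivar and the callback body are placeholders, not the sample's actual code):

    #import <libxml/parser.h>

    // File-scope, in the connection delegate's .m file.
    // SAX callback invoked for every opening element as data streams in.
    static void startElementSAX(void *ctx, const xmlChar *name, const xmlChar **attributes) {
        // Pick out the nodes of interest here, e.g. <item> or <title>.
    }

    static xmlSAXHandler saxHandler = {0};   // zero it, then set only the callbacks needed

    // Inside the @implementation of the connection delegate.
    // _xmlContext is an assumed ivar of type xmlParserCtxtPtr.
    - (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
        if (_xmlContext == NULL) {
            saxHandler.startElement = startElementSAX;
            _xmlContext = xmlCreatePushParserCtxt(&saxHandler, (__bridge void *)self, // drop __bridge if not using ARC
                                                  NULL, 0, NULL);
        }
        // Parse just this chunk; only the data the callbacks keep stays in memory.
        xmlParseChunk(_xmlContext, (const char *)[data bytes], (int)[data length], 0);
    }

    - (void)connectionDidFinishLoading:(NSURLConnection *)connection {
        xmlParseChunk(_xmlContext, NULL, 0, 1);   // 1 = terminate
        xmlFreeParserCtxt(_xmlContext);
        _xmlContext = NULL;
    }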