XML and SQLite memory utilization and performance on the iPhone

How do the memory utilization and performance for XML or SQLite compare on the iPhone?
The initial data set for our application is 500 records with no more than 750 characters each.
How well would XML compare with SQLite for accessing, say, record 397 without going through the first 396? I know SQLite3 has better methods for that, but how does the memory utilization compare?

When dealing with XML, you'll probably need to read the entire file into memory to parse it, as well as write out the entire file when you want to save. With SQLite and Core Data, you can query the database to extract only certain records, and can write only the records that have been changed or added. Additionally, Core Data makes it easy to do batched fetching.
These limited reads and writes can make your application much faster if it is using SQLite or Core Data for its data store, particularly if you take advantage of Core Data's batched fetching. As Graham says, specific numbers on performance can only be obtained by testing under your specific circumstances, but in general XML is significantly slower for all but the smallest data sets. Memory usage can also be much greater, due to the need to load and parse records you do not need at that instant.
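As an illustration of the direct-access point, here is a minimal sqlite3 sketch; the table name, column names, and database path are hypothetical:

    #import <Foundation/Foundation.h>
    #import <sqlite3.h>

    // Hypothetical schema: a "records" table with an integer primary key "id"
    // and a "content" text column of up to 750 characters.
    NSString *dbPath = [[NSBundle mainBundle] pathForResource:@"records" ofType:@"sqlite"];
    sqlite3 *db = NULL;
    if (sqlite3_open([dbPath UTF8String], &db) == SQLITE_OK) {
        sqlite3_stmt *stmt = NULL;
        // The WHERE clause uses the primary-key index, so SQLite jumps straight
        // to row 397 instead of reading and parsing rows 1-396 first.
        const char *sql = "SELECT content FROM records WHERE id = ?";
        if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) == SQLITE_OK) {
            sqlite3_bind_int(stmt, 1, 397);
            if (sqlite3_step(stmt) == SQLITE_ROW) {
                NSString *content = [NSString stringWithUTF8String:
                    (const char *)sqlite3_column_text(stmt, 0)];
                NSLog(@"record 397: %@", content);
            }
            sqlite3_finalize(stmt);
        }
        sqlite3_close(db);
    }

Only that one row is ever brought into memory; an XML parser would have had to touch everything before it in the file.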

To find out how your application's memory usage fares, you need to measure it :). The Instruments tool will help you.


Efficient storage of large amounts of data in iOS

I'm building an application which has a "record" feature which records user interaction over time. As time progresses, I fill an array in memory with "state" objects representing the current state of the user input. A typical recording will result in about 5k of these objects.
I then archive this data using NSKeyedArchiver's archiveRootObject:toFile:. This works fine; however, the file size is very large (3.5 MB or so). My question is this:
Is there any inherent file-size overhead involved in archiving files? Would I be able to save this data using much less disk space if I were to use SQLite, or even roll my own file format? Or is the only way to reduce the disk size of the data going to be to reduce the bit depth of the numbers I'm storing?
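For concreteness, the archiving step in question looks roughly like this; the real state objects are stood in for here by dictionaries (which already conform to NSCoding), so the names and fields are hypothetical:

    #import <Foundation/Foundation.h>

    NSMutableArray *states = [NSMutableArray arrayWithCapacity:5000];
    for (NSUInteger i = 0; i < 5000; i++) {
        [states addObject:[NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithUnsignedInteger:i], @"tick",
            [NSNumber numberWithDouble:0.0], @"x",
            [NSNumber numberWithDouble:0.0], @"y", nil]];
    }
    NSString *path = [NSTemporaryDirectory()
        stringByAppendingPathComponent:@"recording.archive"];
    BOOL ok = [NSKeyedArchiver archiveRootObject:states toFile:path];
    unsigned long long bytes = [[[NSFileManager defaultManager]
        attributesOfItemAtPath:path error:NULL] fileSize];
    NSLog(@"archived: %d, size on disk: %llu bytes", ok, bytes);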
If your concern is performance, Core Data gives you more granularity. You can lazy-load and save in parts during app execution, versus loading and saving the whole 3.5 MB object graph at once.
If your concern is file size, compare the binary plist format (which is what NSKeyedArchiver writes) with the SQLite file format. But more important than the overhead is how complex the translation is between your object graph and the Core Data model.
You may also be interested in this comparison of speed and performance for several serialization formats: https://github.com/eishay/jvm-serializers/wiki/ I'm not sure whether everything there has a C, C++, or Objective-C implementation.
3.5 MB isn't a very large file. However, if your app has to load or save a 3.5 MB file all the time, then using Core Data is a lot smarter as this allows you to save only the data that has changed and retrieve only the parts that you're interested in -- not the whole thing every time.
If storage is the main concern, there would be little difference between SQLite and Core Data.
In one app I had to store UIViewControllers along with their state. I ended up not saving the serialized objects, but saving only the essential properties and creating a class which read that data back and re-created those objects.
The property map was then stored in a CSV file (admittedly very difficult to manage, but tiny) and then compressed.

Speed issue with SQLite and Core Data on the iPhone

I have 40,000+ records in an SQLite database table and am using Core Data to model the data.
When deployed to a device (iPhone 3G), the data is very slow to load (it takes 5 seconds to appear in the table view). I was wondering if anyone has any tips on how to improve this. I've heard about indexing the data, but am not sure how this is done.
Thanks for any help.
...the 40K records are broken up into 70+ categories; the most any table view would show is 2,000 records. The categories are in a plist which then points to the SQLite db using NSFetchedResultsController.
That sounds like a bottleneck. Firstly, the categories have to all be loaded into memory at once as the plist is read in. Depending on how big the category objects/data are, that could eat quite a bit of memory.
More importantly, though, it suggests your data model is not well configured. There should be no need for any significant data external to the Core Data model; the category data should be part of the model itself. If you are using a lot of external data to configure the fetched results controller, you probably end up with complex, slow predicates for the fetch request. That will bog everything down.
Well configured, Core Data can handle very large and complex data sets without any apparent effort because the data is read only in smallish chunks.
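To make that concrete, here is a hedged sketch of what the fetch looks like once the categories live inside the model; the entity names, attribute names, and the context and categoryName variables are all hypothetical (attributes used in predicates can also be marked as indexed in the Core Data model editor):

    // Hypothetical model: a Record entity with a to-one "category"
    // relationship to a Category entity that has a "name" attribute.
    NSFetchRequest *request = [[NSFetchRequest alloc] init];
    [request setEntity:[NSEntityDescription entityForName:@"Record"
                                   inManagedObjectContext:context]];
    // A simple predicate on modeled data replaces matching against plist contents.
    [request setPredicate:[NSPredicate predicateWithFormat:
        @"category.name == %@", categoryName]];
    [request setSortDescriptors:[NSArray arrayWithObject:
        [[[NSSortDescriptor alloc] initWithKey:@"title" ascending:YES] autorelease]]];
    [request setFetchBatchSize:20]; // fault rows in small chunks as the table scrolls

    NSFetchedResultsController *controller =
        [[NSFetchedResultsController alloc] initWithFetchRequest:request
                                            managedObjectContext:context
                                              sectionNameKeyPath:nil
                                                       cacheName:@"Records"];
    NSError *error = nil;
    if (![controller performFetch:&error]) {
        NSLog(@"fetch failed: %@", error);
    }
    [request release];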

SQLite versus CSV file for iPhone

Our app downloads about 10 SQLite files, each containing about 4,000 rows. We process that data and display it in a table view, and we are running into speed and memory issues when scrolling.
We were wondering whether CSV files or some other format in place of SQLite would give us better performance. I have read that XML or JSON won't help, since the number of records is huge and parsing time would go up.
Please suggest.
First, don't assume that SQLite is your bottleneck. I made that same assumption in my own application and spent days trying to optimize the database access, only to run Instruments against it and find that I had a slow string-processing routine in my interface that was bogging things down.
Use Time Profiler and Object Allocations first to verify where your hotspots are in code. SQLite is ridiculously fast.
That said, with 4000 rows, you will probably run into memory issues at the least if you try to load all of them into an array for display to the screen. My recommendation would be to import that data into a Core Data SQLite database and use an NSFetchedResultsController with a batch size set for its fetch request to be slightly larger than the number of rows displayed onscreen.
Core Data will handle the loading / unloading of batched data this way, meaning that only a small part of the database is loaded into memory at once. This can lead to a tremendous speedup (particularly on the initial load) and will significantly reduce memory usage. It also does it using a trivial amount of code.
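A minimal sketch of that setup follows; the entity and attribute names are hypothetical, and with roughly 10 rows visible a batch size of about 15 keeps only a screenful of objects realized at a time:

    NSFetchRequest *request = [[NSFetchRequest alloc] init];
    [request setEntity:[NSEntityDescription entityForName:@"Row"
                                   inManagedObjectContext:context]];
    [request setSortDescriptors:[NSArray arrayWithObject:
        [[[NSSortDescriptor alloc] initWithKey:@"name" ascending:YES] autorelease]]];
    // Slightly larger than the number of rows displayed onscreen.
    [request setFetchBatchSize:15];

    NSFetchedResultsController *frc =
        [[NSFetchedResultsController alloc] initWithFetchRequest:request
                                            managedObjectContext:context
                                              sectionNameKeyPath:nil
                                                       cacheName:nil];
    NSError *error = nil;
    [frc performFetch:&error];
    [request release];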
A properly indexed SQLite database will run circles around any flat file, especially if you have a lot of records. Also try consolidating those 10 files into 1 database, so you can perform joins on indexed columns and use clever tricks such as views. Right now it seems like you're pulling data from 10 different databases and manually comparing/processing them, which would of course take a lot of time and memory.
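For illustration, with hypothetical table names and a pre-merged single database file, the index-plus-join approach might look like this:

    #import <sqlite3.h>

    sqlite3 *db = NULL;
    sqlite3_open([dbPath UTF8String], &db); // dbPath: the consolidated database

    // Index the column you join and filter on (a one-time, cheap operation).
    sqlite3_exec(db,
        "CREATE INDEX IF NOT EXISTS idx_items_category ON items(category_id);",
        NULL, NULL, NULL);

    // An indexed join replaces manual cross-file comparison in app code.
    sqlite3_stmt *stmt = NULL;
    const char *sql =
        "SELECT items.name, categories.title "
        "FROM items JOIN categories ON categories.id = items.category_id "
        "ORDER BY items.name;";
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) == SQLITE_OK) {
        while (sqlite3_step(stmt) == SQLITE_ROW) {
            // Process one joined row at a time; nothing else is in memory.
        }
        sqlite3_finalize(stmt);
    }
    sqlite3_close(db);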
It is going to depend on the application and on how you are using and querying the data. Profile it and confirm whether SQLite is or isn't the problem, then attack whatever the profiling turns up.
Profilers: Shark, or some other profiling solution.

Core Data NSFetchedResultsController Performance Advantages Over NSArray?

Does using an NSFetchedResultsController provide any performance advantages on the iPhone over an NSArray?
I have between 4,000 and 8,000 records stored in Core Data and wanted to know if I should choose one over the other. Is the NSFetchedResultsController just used to make code 'prettier'?
My concern is searching, and lag on keyboard button presses (as well as the issue of loading that many records into memory). Thanks!
Given your parameters, Core Data will be faster than an array, especially if you make any changes to the data.
The disadvantage of an array in this case is that you have to load the entire array into memory in one go.
It might seem obvious that Core Data would be slower than more primitive methods, but owing to its fine-tuned optimizations, plus its ease of integration with the rest of the API, it is actually fairly hard to beat in real-world apps with significant amounts of data.
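On the searching and keyboard-lag concern specifically, here is a hedged sketch (the entity, attribute, and properties such as self.context are hypothetical): re-running a small batched fetch per keystroke avoids filtering an 8,000-element array in memory:

    - (void)filterForQuery:(NSString *)query {
        NSFetchRequest *request = [[NSFetchRequest alloc] init];
        [request setEntity:[NSEntityDescription entityForName:@"Record"
                                       inManagedObjectContext:self.context]];
        // Case- and diacritic-insensitive prefix match on an attribute.
        [request setPredicate:[NSPredicate predicateWithFormat:
            @"name BEGINSWITH[cd] %@", query]];
        [request setSortDescriptors:[NSArray arrayWithObject:
            [[[NSSortDescriptor alloc] initWithKey:@"name" ascending:YES] autorelease]]];
        [request setFetchBatchSize:25]; // only a screenful realized at once

        NSError *error = nil;
        self.searchResults = [self.context executeFetchRequest:request error:&error];
        [request release];
        [self.tableView reloadData];
    }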

Decide which caching strategy to use?

I want to cache my loaded data so that I can reduce my application's start time.
I know several strategies for storing application data: Core Data, NSUserDefaults, and archiving.
My scenario: suppose I have an array of at most 10 objects, each having 5 fields.
I cannot decide which strategy to use for storing this array and later retrieving it.
Thanks.
Never store cache data in NSUserDefaults; that is not what it is for.
Archiving is expensive and should not be used. It is also far more difficult to manage.
Core Data is almost always the right answer unless the data storage is trivial.
Update
Archiving, also known as serializing, is one of the most expensive ways to write data to disk. The exact details are difficult to explain in an answer here, but it boils down to an old design that does not perform nearly as well as more modern persistence systems such as Core Data. Putting the two side by side, you will see significant performance increases with Core Data (due to internal threading, caching, database support on the back end, etc.).
The fact that Core Data also handles your data model life cycle and structure is just gravy on top of that performance increase.
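For a data set this small, the Core Data side is only a few lines; a sketch assuming a hypothetical CacheItem entity, a cachedItems array of dictionaries, and an existing managed object context:

    // Insert the ten cached objects, then save; on later launches Core Data
    // writes only what changed rather than rewriting a whole archive file.
    for (NSDictionary *fields in cachedItems) {
        NSManagedObject *item =
            [NSEntityDescription insertNewObjectForEntityForName:@"CacheItem"
                                          inManagedObjectContext:context];
        [item setValuesForKeysWithDictionary:fields];
    }
    NSError *error = nil;
    if (![context save:&error]) {
        NSLog(@"save failed: %@", error);
    }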