I have a data set that is around 700 rows with eight columns of string data. Is it worth setting up Core Data? Will I see a huge performance difference between Core Data and an array?
The app is a simple UITableView and a couple of drill-down style detail views. I will be searching the data set on one of the detail views.
It is worth using Core Data if your data set's state can change and if the data set is relational: in other words, if columns of data relate to other columns, if filtering on those columns gives you subsets of data that you want to show in your UI, and if you are updating or changing data in your array, then Core Data has some nice "free" functionality for managing these transactions.
Managing access to a static array will undoubtedly be faster than Core Data, but if it is easier to think about your data in terms of dynamic sets and subsets, then Core Data will be easier to code and update for.
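For a data set of that size, searching a plain in-memory array with NSPredicate is usually simple and fast enough. A minimal sketch, assuming each row is an NSDictionary with a hypothetical "name" key:

```objc
#import <Foundation/Foundation.h>

// ~700 rows loaded from a plist, JSON, or similar; each row is an NSDictionary.
NSArray *rows = /* your loaded data */ nil;

// Case- and diacritic-insensitive search on one "column"; for 700 rows this
// is effectively instant on any device.
NSPredicate *search = [NSPredicate predicateWithFormat:@"name CONTAINS[cd] %@",
                                                       @"searchText"];
NSArray *matches = [rows filteredArrayUsingPredicate:search];
```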
I am working on an application for the iPhone (iOS 5). What I have to do is create a map by using binary data that I receive from a server. Some parts already work quite well:
I can connect to a server, send requests and receive binary data from it
I can interpret this data, create objects (polygons and paths) from it and draw them within a view
But now it comes to the hard part. The map that I create should be zoomable and movable, so I have to send new requests to the server and redraw the map. This also works nicely, but the data I have already received now needs to be stored, because I should not request the same data from the server twice (e.g. if I zoom out and then back in).
Finally, here is my question: what would be the best way to store my data? So far I have thought about using Core Data or SQLite. Are there even better solutions? And what data should I save - the binary data or my created objects?
I hope this was understandable and you can help me with at least one of my issues...
Core Data is the only way to go.
Core Data is not a storage system; it is an object graph and persistence framework, which can use SQLite to store data.
If you use Core Data you can refactor your project and use NSManagedObject subclasses as your models.
Take a look at the Core Data Programming Guide and "The Differences Between Core Data and a Database".
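As a rough illustration of what such a model class looks like - the entity name and attributes here are hypothetical, not something from your project:

```objc
#import <CoreData/CoreData.h>

// Hypothetical entity representing one feature (polygon/path) received
// from the server.
@interface MapFeature : NSManagedObject
@property (nonatomic, retain) NSString *featureID;
@property (nonatomic, retain) NSData   *geometry;   // encoded polygon/path data
@property (nonatomic, retain) NSNumber *zoomLevel;
@end

@implementation MapFeature
// @dynamic tells the compiler Core Data provides the accessors at runtime.
@dynamic featureID;
@dynamic geometry;
@dynamic zoomLevel;
@end
```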
Edit:
From Core Data Performance
Core Data is a rich and sophisticated object graph management framework capable of dealing with large volumes of data. The SQLite store can scale to terabyte sized databases with billions of rows/tables/columns. Unless your entities themselves have very large attributes (although see "Large Data Objects (BLOBs)") or large numbers of properties, 10,000 objects is considered to be a fairly small size for a data set.
It really depends on the size of your data objects and how you access them. If your objects are small, you could store them in Core Data. But, if your map data is coming as images from a bunch of URLs, I would use Core Data to store the mappings to the map image URLs and use NSURLConnection to manage the caching of your objects.
I recommend reading the "Large Data Objects (BLOBs)" section of the Apple Core Data Programming Guide; it discusses the size and number of objects. Some excerpts are below:
The exact definition of "small", "modest", and "large" is fluid and depends on an application's usage. A loose rule of thumb is that objects in the order of kilobytes in size are of a "modest" size and those in the order of megabytes in size are of a "large" size.
For small to modest sized BLOBs (and CLOBs), you should create a separate entity for the data and create a to-one relationship in place of the attribute.
It is better, however, if you are able to store BLOBs as resources on the filesystem, and to maintain links (such as URLs or paths) to those resources. You can then load a BLOB as and when necessary.
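A minimal sketch of that last approach, storing the blob on disk and keeping only a path in Core Data; the "Tile" entity, its attributes, and the file name are assumptions for illustration:

```objc
#import <CoreData/CoreData.h>

NSManagedObjectContext *context = /* your managed object context */ nil;
NSData *binaryData = /* data received from the server */ nil;

// Write the raw data to the Caches directory...
NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                           NSUserDomainMask, YES) lastObject];
NSString *filePath = [cachesDir stringByAppendingPathComponent:@"tile_12_2048_1360.bin"];
[binaryData writeToFile:filePath atomically:YES];

// ...and keep only the key and the path in the store, so fetches stay cheap.
NSManagedObject *tile = [NSEntityDescription insertNewObjectForEntityForName:@"Tile"
                                                       inManagedObjectContext:context];
[tile setValue:@"12/2048/1360" forKey:@"key"];
[tile setValue:filePath forKey:@"path"];

NSError *error = nil;
if (![context save:&error]) {
    NSLog(@"Save failed: %@", error);
}
```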
I have 40,000+ records in an SQLite database table and am using Core Data to model the data.
When deployed to a device (iPhone 3G) the data is very slow to load (it takes 5 seconds for the data to load into the table view). I was wondering if anyone had any tips on how to improve this. I've heard about indexing the data, but am not sure how this is done.
Thanks for any help.
...the 40K records are broken up into 70+ categories, the most any tableview would show is 2000 records. the categories are in a plist which then points to the sqlite db using NSFetchedResultsController.
That sounds like a bottleneck. Firstly, the categories have to all be loaded into memory at once as the plist is read in. Depending on how big the category objects/data are, that could eat quite a bit of memory.
More importantly, though, it suggests your data model is not well configured. There should be no need for any significant data external to the Core Data model. The category data should be part of the data model. If you are using a lot of external data to configure the fetched results controller, then you probably end up with complex, slow predicates for the fetch request. That will bog everything down.
Well configured, Core Data can handle very large and complex data sets without any apparent effort because the data is read only in smallish chunks.
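For instance, if the categories were modeled as a Category entity with a to-many "records" relationship (the entity and attribute names here are assumptions), the fetched results controller only needs a trivial relationship predicate:

```objc
#import <CoreData/CoreData.h>

NSManagedObjectContext *context = /* your managed object context */ nil;
NSManagedObject *category = /* the Category the user tapped */ nil;

NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Record"];
// A simple to-one comparison that SQLite can satisfy quickly, instead of
// matching against values pulled out of a plist.
[request setPredicate:[NSPredicate predicateWithFormat:@"category == %@", category]];
[request setSortDescriptors:[NSArray arrayWithObject:
    [NSSortDescriptor sortDescriptorWithKey:@"name" ascending:YES]]];
// Only a screenful or so of the ~2000 matching rows is materialized at once.
[request setFetchBatchSize:20];
```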
I have populated a Core Data database and need to query it based upon my user's location. We use similar code at the backend of a web service as a UDF and return the distance as a column, but we now have a requirement to cache some of this data for offline use.
I know CLLocation has a distanceFromLocation: method, but is this going to be efficient when parsing a few thousand rows of data?
A few thousand rows isn't a huge amount of data. Make sure you design your Core Data schema such that it can load the objects with the location information without also loading tons of other data (put any large blocks of data into their own objects and let it load them lazily).
When you start getting beyond a few thousand, though, you might need to go to a different local data structure, such as some library that implements an R-Tree or range searches.
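One common pattern is to prefilter with a bounding-box predicate on stored latitude/longitude attributes and only compute exact distances for the survivors. A sketch under those assumptions (the "Place" entity and its attribute names are made up):

```objc
#import <CoreData/CoreData.h>
#import <CoreLocation/CoreLocation.h>

NSManagedObjectContext *context = /* your managed object context */ nil;
CLLocation *userLocation = /* latest fix from CLLocationManager */ nil;

// Rough box of about +/- 0.1 degrees around the user, so only a small
// subset of rows reaches the exact distance calculation.
double lat = userLocation.coordinate.latitude;
double lon = userLocation.coordinate.longitude;
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Place"];
[request setPredicate:[NSPredicate predicateWithFormat:
    @"latitude BETWEEN {%f, %f} AND longitude BETWEEN {%f, %f}",
    lat - 0.1, lat + 0.1, lon - 0.1, lon + 0.1]];

NSArray *nearby = [context executeFetchRequest:request error:NULL];
for (NSManagedObject *place in nearby) {
    CLLocation *placeLocation =
        [[CLLocation alloc] initWithLatitude:[[place valueForKey:@"latitude"] doubleValue]
                                   longitude:[[place valueForKey:@"longitude"] doubleValue]];
    CLLocationDistance meters = [userLocation distanceFromLocation:placeLocation];
    // ...keep or sort by meters as needed
}
```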
My plan is to display a list of items alphabetically in a table view that has about 100 items. Each item has an image, a list of times, and a description that the table view will drill down to. What I am struggling with is the correct way to store and load this data. Some have told me that a plist will be too data heavy and that Core Data is too new. Should I just create arrays?
You're not clear about what you intend to do with this data. Plists and Core Data are both persistence formats (on disk). Arrays are an in-memory format (and can also be slapped onto disk, I suppose, if that's what you want to do, but inventing your own binary disk format is only something you should consider very rarely, and certainly not in the case you probably have).
In memory, you can probably just use an array (NSArray) and have each element perhaps be an NSDictionary of the other properties relative to that entry. That sounds like the model of your MVC design, which you can then hook up to the table view.
As far as persisting this to disk, it depends on whether 100 items is a fixed amount, a ballpark, or a minimum, etc. Plists (see NSKeyedArchiver) are great for all the data except possibly the raw image data-- you might want to keep those "to the side" as separate image files with filenames in the plist.
I don't know much about Core Data, but it's not that new, and it's not untested, so if it does what you want without much hassle, go for it.
Serialize it into an archive using the NSCoding protocol. See the guide.
I'd use an NSArray of business objects implementing NSCoding and then just archive them.
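A minimal sketch of that approach, with a hypothetical Item class (assumes ARC; the property names are made up):

```objc
#import <Foundation/Foundation.h>

// Hypothetical business object; it archives cleanly because it adopts NSCoding.
@interface Item : NSObject <NSCoding>
@property (nonatomic, copy) NSString *title;
@property (nonatomic, copy) NSString *detail;
@end

@implementation Item
- (void)encodeWithCoder:(NSCoder *)coder {
    [coder encodeObject:self.title  forKey:@"title"];
    [coder encodeObject:self.detail forKey:@"detail"];
}
- (id)initWithCoder:(NSCoder *)coder {
    if ((self = [super init])) {
        self.title  = [coder decodeObjectForKey:@"title"];
        self.detail = [coder decodeObjectForKey:@"detail"];
    }
    return self;
}
@end

// NSArray encodes its contents, so one call archives all ~100 items:
//   [NSKeyedArchiver archiveRootObject:items toFile:path];
//   NSArray *items = [NSKeyedUnarchiver unarchiveObjectWithFile:path];
```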
I usually default to Core Data unless I have a compelling reason not to. (Of course, I have learned Core Data, so that makes it easy for me to do this.)
Core Data has the following advantages:
It has an editor in which you can create complex object graphs easily
It can generate custom classes for your data model objects.
The generated classes are easily modified.
Core Data manages the complexity of adding, deleting and saving objects.
Core Data makes persisting an object graph almost invisible.
NSFetchedResultsController is custom designed to provide data for tables.
100 objects is a small graph that Core Data can handle easily. It's a lot easier to use Core Data than it is to write custom coders to archive custom objects. For example, at present, I have an app with over a dozen major entities, each with two or three relationships to other entities. Hand coding all that would be a nightmare.
Core Data has something of a steep learning curve especially if you've never worked with object graphs before but if you're planning on writing a lot of Apple platform software, learning it is well worth the time.
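For instance, the table-view side mentioned above usually boils down to a few lines; the entity and key names here are assumptions:

```objc
#import <CoreData/CoreData.h>

NSManagedObjectContext *context = /* your managed object context */ nil;

NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Entry"];
[request setSortDescriptors:[NSArray arrayWithObject:
    [NSSortDescriptor sortDescriptorWithKey:@"name" ascending:YES]]];

NSFetchedResultsController *frc =
    [[NSFetchedResultsController alloc] initWithFetchRequest:request
                                        managedObjectContext:context
                                          sectionNameKeyPath:nil
                                                   cacheName:nil];
NSError *error = nil;
[frc performFetch:&error];

// The table view data source then reads from frc.sections and
// -objectAtIndexPath:, and the FRC delegate callbacks keep the table in sync
// as objects are added, changed, or deleted.
```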
I've got about 5000-7000 objects in my Core Data store that I want to display in a table view. I'm using a fetched results controller and I haven't got any predicates on the fetch, just a sort on an integer field. The objects consist of a few ints and a few strings that hold about 10 to 50 characters. My problem is that it's taking a good 10 seconds to load the view. Is this normal?
I believe that the FRC is designed to handle large data sets and does batching and such things to support them. Are there any common pitfalls or something like that? Because I'm really stumped. I've stripped my app down to a single table view, yet it still takes around 10 seconds to load. And I'm leaving the table view as the default style and just displaying a string in the cell.
Any advice would be greatly appreciated!
Did you check the index checkbox for the integer you are sorting on in your Core Data model?
On your fetch request, have you used -setFetchBatchSize: to minimize the number of items fetched at once (generally, the number of items onscreen, plus a few for a buffer)? Without that, you won't see as much of a performance benefit from using an NSFetchedResultsController for your table view.
You could also limit the properties being fetched by using -setPropertiesToFetch: on your fetch request. It might be best to limit your fetch to only those properties of your objects that will influence their display in the table view. The remainder can be lazy-loaded later when you need them.
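A sketch of the batch-size suggestion applied to the fetch request behind the fetched results controller (entity and key names are assumptions):

```objc
#import <CoreData/CoreData.h>

NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Item"];
[request setSortDescriptors:[NSArray arrayWithObject:
    [NSSortDescriptor sortDescriptorWithKey:@"position" ascending:YES]]];

// Roughly the number of visible rows plus a small buffer; rows outside the
// current batch stay as faults until the table scrolls to them, so the
// initial load no longer pulls all 5000-7000 objects into memory.
[request setFetchBatchSize:25];
```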