SPA apps and server synchronisation best practices - REST

I am developing a single-page application (SPA) in AngularJS to load data into my database.
I have a Django back end with tastypie providing a REST interface.
I am trying to populate a flowcell object, which is made up of 8 lane elements. Each lane may contain multiple libraries (say 5 or 6 on average).
Flowcell
  Lane 1
    library 1
    library 2
    library 3
  Lane 2
    library 5
  Lane 3
    library 6
    library 7
  etc.
So at the moment, when I add a new library to a lane, I POST the details to the server and then refresh that lane's library list with a GET request, which updates the display.
This ensures that the server data and the display data are synchronised. However, it does add a delay while each Lane contacts the server and refreshes itself.
So is it considered better to add a number of libraries to each lane on the client side and then update the server with them all together? This would give a smoother user experience, but the display may not reflect exactly what is in the database (and I can imagine this causing errors if the two get too far out of sync).
Or is it better to do what I am currently doing: send many small updates, which means more requests to the server, but keeps the client and server data consistent?

I think you're on the right track. Taking the more granular approach to updates is the better option.
Primarily, it will feel more responsive to the client. If the changes are small and fired off as each element is changed, then unless the network is really slow, each request should complete quickly and be almost undetectable by the user.
The only thing I would say is that you might not need the extra GET after your POST of changed data. Unless this is a shared data model across many users and sessions, if the client makes an update and the POST goes through, then the client already holds the correct set of data, so the additional GET isn't necessary.
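For what it's worth, here's a minimal sketch of that idea (shown in Swift with URLSession for brevity; the same pattern applies in an AngularJS $http.post success handler). The /api/v1/library/ endpoint and the Library fields are assumptions, and note that tastypie only includes the created object in the POST response if the resource sets always_return_data = True:

```swift
import Foundation

// Hypothetical shape of a library record; adjust to your tastypie resource.
struct Library: Codable {
    let id: Int?       // nil before the server assigns one
    let name: String
    let lane: String   // resource URI of the parent lane, e.g. "/api/v1/lane/3/"
}

final class LaneModel {
    private(set) var libraries: [Library] = []

    // POST the new library and update local state from the response body,
    // avoiding the extra GET that re-fetches the whole list.
    func add(_ newLibrary: Library, to baseURL: URL) async throws {
        var request = URLRequest(url: baseURL.appendingPathComponent("api/v1/library/"))
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(newLibrary)

        let (data, response) = try await URLSession.shared.data(for: request)
        guard let http = response as? HTTPURLResponse, http.statusCode == 201 else {
            throw URLError(.badServerResponse)
        }
        // The server remains the source of truth for the created record (id, defaults).
        let created = try JSONDecoder().decode(Library.self, from: data)
        libraries.append(created)
    }
}
```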

Related

Best way to keep data in sync between two different applications

I have two closed-source applications that must share the same data at some point. Both expose REST APIs.
A concrete example is helpdesk tickets: they can be created in both applications, and I need to update one application when the user adds or closes a ticket in the other, and vice versa.
Since they are closed source, I can't really modify the code.
I was thinking I could create a third application that, every 5 minutes or so, lists both applications' tickets, compares them with the previous call, and, if the data has changed since that previous call, updates the other application accordingly.
Is there a better way of doing this?
With closed-source applications it's nearly impossible to get something out of them, unless they have some plugin-based setup that you can hook into.
The most efficient way in terms of costs would be to have the first application publish a message on a queue, or call a web-hook that you set, whenever the event is triggered. But as I mentioned, the application needs to support that.
So yeah, your polling solution is pretty much all you can do for now (there's a minimal sketch of it after the list below), but keep in mind the challenges you may encounter over time:
What if the results of both APIs are too large to be compared directly? Maybe you need to think about paging the results.
What if your app crashes and you lose the previous state? You need to back it up somehow in an external store.
How often should you poll the APIs to make sure you're getting the updates you need, while maintaining good performance for the existing traffic?
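Here's a rough sketch of that polling loop (in Swift; the /tickets endpoints, the Ticket shape and the PUT push are all hypothetical stand-ins for whatever the two REST APIs actually expose):

```swift
import Foundation

// Hypothetical ticket shape common to both systems.
struct Ticket: Codable, Equatable {
    let id: String
    let status: String
    let updatedAt: String   // ISO 8601 timestamp as returned by the API (assumption)
}

final class TicketSyncer {
    private var previousA: [String: Ticket] = [:]
    private var previousB: [String: Ticket] = [:]

    func runOnce(apiA: URL, apiB: URL) async throws {
        let currentA = try await fetchTickets(from: apiA)
        let currentB = try await fetchTickets(from: apiB)

        // Anything new or changed in A since the previous poll goes to B, and vice versa.
        for ticket in diff(current: currentA, previous: previousA) {
            try await push(ticket, to: apiB)
        }
        for ticket in diff(current: currentB, previous: previousB) {
            try await push(ticket, to: apiA)
        }
        previousA = currentA
        previousB = currentB
    }

    private func diff(current: [String: Ticket], previous: [String: Ticket]) -> [Ticket] {
        current.values.filter { previous[$0.id] != $0 }
    }

    private func fetchTickets(from api: URL) async throws -> [String: Ticket] {
        let (data, _) = try await URLSession.shared.data(from: api.appendingPathComponent("tickets"))
        let tickets = try JSONDecoder().decode([Ticket].self, from: data)
        return Dictionary(tickets.map { ($0.id, $0) }, uniquingKeysWith: { _, newest in newest })
    }

    private func push(_ ticket: Ticket, to api: URL) async throws {
        var request = URLRequest(url: api.appendingPathComponent("tickets/\(ticket.id)"))
        request.httpMethod = "PUT"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(ticket)
        _ = try await URLSession.shared.data(for: request)
    }
}
```

A real implementation would also have to guard against echo loops (a change pushed to B showing up as a "new" change on the next poll of B), persist the previous snapshots as mentioned above, and page through large result sets.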

What is better: one large REST API call or many small ones for a Cordova/Backbone app?

This is my first Cordova/Backbone application.
I have somewhat grasped the whole deal with Models, Views, etc., and now I have gotten to actually building a proper view structure for my app.
It is a user-centered app, which means that the views are dynamic depending on who the user is and their status in the app.
Could you please help me understand which is the better choice: making one (large-ish) API call to the server to get the data for all user-related views (all the user info, the various menus for the current user, etc.) and putting it in one User model, or making several smaller API calls that each fetch a fragment of the information (say profile information, newsfeed information, and the options for two menus, so 4 AJAX calls in total) and keeping the models separate? All the relevant views (UserProfile, SideMenu, UserProfileMenu and ActivityFeed) are rendered on user login. Some of them are available to the user at all times (the SideBar menu, for example); some get switched out as the user navigates elsewhere.
I design the server-side API myself, so I can freely choose what data is returned and when.
"it depends". If you need all the info (from 4 ajax calls) from start, it would be better to create one big api call, because callig server 4 times will last longer than one big call - 4x server ping time. you could use the big call on app start and still create the 4smaller ones to refresh data when needed.

Is there any value in using core data for iPhone apps?

Can people give me examples of why they would use Core Data in an application?
I ask this because most apps are just clients to a central server where an API of some sort gives you the information you need.
In my case I'm writing a timesheet application for a web app which has an API, and I'm debating whether there is any value in replicating my server's data structure in Core Data (SQLite),
e.g.
Project has many timesheets
Employee has many timesheets
It seems to me that I can just hit the API every time I need a list of projects or existing timesheets, for example.
I realize that for some kind of offline mode you could store the data locally in Core Data, but this creates far more problems, because you now have the big problem of syncing that data back to the web server when you get a connection again (e.g. the project selected for a timesheet no longer exists).
Can any experienced developers shed some light on their experiences of when Core Data is the best-practice approach?
EDIT
I realise of course that there is value in local persistence, but the key-value storage of user defaults seems to cover most applications I can think of.
You shouldn't think of Core Data simply as an SQLite database. It's not JUST an SQLite database. Sure, SQLite is an option, but there are other options as well, such as in-memory stores and, as of iOS 5, a whole slew of custom data stores. The biggest benefit of Core Data is persistence, obviously. But even if you are using an in-memory data store, you get the benefits of a very well structured object graph, and all of the heavy lifting of pulling information out of, or putting information into, the data store is handled by Core Data for you, without you necessarily needing to concern yourself with what is backing that data store.

Sure, today you don't care too much about persistence, so you could use an in-memory data store. What happens if tomorrow, or in a month, or a year, you decide to add a feature that would really benefit from persistence? With Core Data, you simply change or add a persistent data store, and all of your methods for getting information in or out remain unchanged. The overhead of that sort of addition is minimal compared to accessing SQLite or some other data store directly. IMHO, that's the biggest benefit: abstraction. And, in essence, abstraction is one of the most powerful things behind OOP.

Granted, building the data model just for in-memory storage could be overkill for your app, depending on how involved the app is. But, just as a side note, you may want to consider what is faster: requesting information from your web service every time you want to perform some action, or requesting the information once, storing it in memory, and acting on that stored value for the remainder of the session. An in-memory data store won't persist beyond that particular session.
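To make the abstraction point concrete, here is a small Swift sketch using the modern NSPersistentContainer API (the "Timesheets" model name is an assumption): switching between an in-memory store and the default SQLite store is a one-line configuration change, and none of the fetch or save code has to know which store is backing it.

```swift
import CoreData

// Build a Core Data stack; the fetch/save code is identical either way.
func makeContainer(inMemory: Bool) -> NSPersistentContainer {
    let container = NSPersistentContainer(name: "Timesheets") // assumed model name
    if inMemory {
        // Same object graph, no persistence: nothing else in the app changes.
        container.persistentStoreDescriptions.first?.type = NSInMemoryStoreType
    }
    container.loadPersistentStores { _, error in
        if let error = error { fatalError("Store failed to load: \(error)") }
    }
    return container
}
```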
Additionally, with CoreData you get a lot of other great features like saving, fetching, and undo-redo.
There are basically two kinds of apps. Those that provide you with local functionality (games, professional applications, navigation systems...) and those that grant access to a remote service.
Your app seems to be in the second category. If you access remote services, your users will want to access new or real-time data (you don't want to read 2 week old Facebook posts) but in some cases, local caching makes sense (e.g. reading your mails when you're on the train with unstable network).
I assume that the value of accessing cached entries when not connected to a network is pretty low for your customers (internal or external) compared to the importance of accessing real-time data. So local storage might not be necessary at all.
If you don't have hundreds of entries in your timesheet, "normal" serialization (the NSCoding protocol) might be enough. If you only access some "dashboard" data, you will be able to get along with simple request/response caching (NSURLCache can do a lot of things...).
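As a sketch of that lightweight route (in Swift, using Codable as the modern counterpart of the NSCoding approach mentioned above; the TimesheetEntry fields and file name are assumptions): write the small record set to a file in the documents directory, and lean on URLCache for plain request/response caching.

```swift
import Foundation

struct TimesheetEntry: Codable {
    let projectID: Int
    let hours: Double
    let date: Date
}

enum TimesheetStore {
    static var fileURL: URL {
        FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("timesheets.json")
    }

    // Good enough for a few hundred records; no Core Data stack required.
    static func save(_ entries: [TimesheetEntry]) throws {
        try JSONEncoder().encode(entries).write(to: fileURL, options: .atomic)
    }

    static func load() -> [TimesheetEntry] {
        guard let data = try? Data(contentsOf: fileURL) else { return [] }
        return (try? JSONDecoder().decode([TimesheetEntry].self, from: data)) ?? []
    }
}

// Simple response caching for "dashboard"-style GETs; call once at app launch.
func configureResponseCache() {
    URLCache.shared = URLCache(memoryCapacity: 4 * 1024 * 1024,
                               diskCapacity: 20 * 1024 * 1024)
}
```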
Core Data does make more sense if you have complex data structures which should be synchronized with a server. This adds a lot of synchronization logic to your project as well as complexity from Core Data integration (concurrency, thread-safety, in-app-conflicts...).
If you want to create a "client" app with a server-driven user experience, local storage is not necessary at all, so my suggestion is: keep it as simple as possible unless there is a real need for offline storage.
It's ideal if you want to store data locally on the phone.
Seriously though, if you can't see a need for it for your timesheet app, then don't worry about it and don't use it.
Solving the sync problems that you would have with an "offline" mode comes down to the design of your app. For example: don't allow projects to be deleted. Why would you? Wouldn't you want to go back in time and look at previous data for particular projects? Instead, just have a marker on the project to show it as inactive, plus a date/time that it was made inactive. If the data being synced from the device is for that project and predates the date/time it was marked inactive, then it's fine to sync. Otherwise, display a message and the user will have to sort it out.
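A small sketch of that rule (Swift; the field names are assumptions): rather than deleting a project, the server marks it inactive with a timestamp, and the sync code only accepts timesheet data that predates that point.

```swift
import Foundation

struct Project {
    let id: Int
    let inactiveSince: Date?   // nil while the project is still active
}

struct TimesheetEntry {
    let projectID: Int
    let workedOn: Date
}

// Accept the entry if the project is active, or if the work predates deactivation.
func canSync(_ entry: TimesheetEntry, to project: Project) -> Bool {
    guard let inactiveSince = project.inactiveSince else { return true }
    return entry.workedOn < inactiveSince
}
```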
Whether you need to store some data locally, or whether your app is just a thin GUI client around your web service, depends purely on your application's design. Apart from an "offline" mode, the other reason to cache server data on the client side might be to take traffic load off your server. Just think about what it means for your server to send the whole timesheet data set to the client every time, versus only the changes. Yes, it means more implementation work on both sides, but in some cases it has serious advantages.
EDIT: example added
Say you have 1000 records per user in your timesheet application and one record is circa 1 KB. In that case, every time a user starts your application, it has to fetch ~1 MB of data from your server. If you cache the data locally, the server can tell you that, say, two records were updated since your last sync, so you only have to download 2 KB. Now scale this up to several tens of thousands of users and you will immediately notice the difference in server bandwidth and CPU usage.
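A sketch of that delta request in Swift (the updated_since parameter and the record shape are assumptions about the API): the client remembers when it last synced successfully and only asks for records changed after that.

```swift
import Foundation

struct TimesheetRecord: Codable {
    let id: Int
    let hours: Double
    let updatedAt: String      // ISO 8601 timestamp from the server (assumption)
}

final class DeltaSyncClient {
    private var lastSync: Date?

    // Ask only for what changed since the last successful sync instead of
    // re-downloading the full ~1 MB data set on every launch.
    func fetchChanges(from baseURL: URL) async throws -> [TimesheetRecord] {
        var components = URLComponents(url: baseURL.appendingPathComponent("timesheets"),
                                       resolvingAgainstBaseURL: false)!
        if let lastSync = lastSync {
            components.queryItems = [URLQueryItem(name: "updated_since",
                                                  value: ISO8601DateFormatter().string(from: lastSync))]
        }
        let (data, _) = try await URLSession.shared.data(from: components.url!)
        let changed = try JSONDecoder().decode([TimesheetRecord].self, from: data)
        lastSync = Date()
        return changed
    }
}
```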

Core Data with Web Services recommended pattern?

I am writing an app for iOS that uses data provided by a web service. I am using core data for local storage and persistence of the data, so that some core set of the data is available to the user if the web is not reachable.
In building this app, I've been reading lots of posts about core data. While there seems to be lots out there on the mechanics of doing this, I've seen less on the general principles/patterns for this.
I am wondering if there are some good references out there for a recommended interaction model.
For example, the user will be able to create new objects in the app. Let's say the user creates a new employee object; the user will typically create it, update it, and then save it. I've seen recommendations to push each of these steps to the server: when the user creates it, and when the user changes its fields; and if the user cancels at the end, a delete is sent to the server. A different recommendation for the same operation is to keep everything local, and only send the complete update to the server when the user saves.
This example aside, I am curious whether there are some general recommendations/patterns on how to handle CRUD operations and ensure they stay in sync between the web server and Core Data.
Thanks much.
I think the best approach in the case you mention is to store data only locally until the point the user commits the adding of the new record. Sending every field edit to the server is somewhat excessive.
A general idiom of iPhone apps is that there isn't such a thing as "Save". The user generally will expect things to be committed at some sensible point, but it isn't presented to the user as saving per se.
So, for example, imagine you have a UI that lets the user edit some sort of record that will be saved to local Core Data and also sent to the server. At the point the user exits the UI for creating a new record, they will perhaps hit a button called "Done" (N.B. not usually called "Save"). At the point they hit "Done", you'll want to kick off a Core Data write and also start a push to the remote server. The server push won't necessarily hog the UI or make them wait till it completes (it's nicer to allow them to continue using the app) but it is happening. If the push to the server fails, you might want to signal it to the user or do something appropriate.
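A sketch of that "Done" handler in Swift (the payload type and endpoint are assumptions): commit to Core Data first, then fire off the server push in the background without blocking the UI, and surface a failure if one occurs.

```swift
import CoreData
import Foundation

// Hypothetical DTO for the record being sent; built from the edited values.
struct EmployeePayload: Codable {
    let name: String
    let role: String
}

func doneTapped(payload: EmployeePayload, context: NSManagedObjectContext, endpoint: URL) {
    // 1. Commit locally first, so the data survives a crash or loss of power.
    do {
        try context.save()
    } catch {
        print("Local save failed: \(error)")   // surface this to the user
        return
    }

    // 2. Push to the server in the background; the user keeps using the app.
    Task {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONEncoder().encode(payload)
        do {
            _ = try await URLSession.shared.data(for: request)
        } catch {
            // Signal the failed push (banner, retry queue, ...) without blocking the UI.
            print("Server push failed: \(error)")
        }
    }
}
```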
A good question to ask yourself when planning the granularity of writes to Core Data and/or a remote server is: what would happen if the app crashed, or the phone ran out of power, at any particular spot in the app? How much data could possibly be lost? Good apps lower the risk of data loss and can re-launch in a state very similar to the one they were in before being exited, for whatever reason.
Be prepared to tear your hair out quite a bit. I've been working on this, and the problem is that the Core Data samples are quite simple. The minute you move to a complex model and you try to use the NSFetchedResultsController and its delegate, you bump into all sorts of problems with using multiple contexts.
I use one context to populate data from the web service in a background block, and a second one for the table view to use; you'll most likely end up using a table view for a master list and a detail view.
Brush up on using blocks in Cocoa if you want to keep your app responsive whilst receiving or sending data to/from a server.
You might want to read about 'transactions', which basically means grouping multiple actions/changes into a single atomic action/change. This helps avoid partial saves that might result in inconsistent data on the server.
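On the Core Data side, a child managed object context gives you exactly that kind of scratch-pad transaction: make all the related edits there and either commit them to the parent in one go or throw the whole context away. A minimal Swift sketch:

```swift
import CoreData

// Group several related edits into one atomic commit (or discard them all).
func performTransaction(on mainContext: NSManagedObjectContext,
                        edits: (NSManagedObjectContext) throws -> Void) throws {
    let scratch = NSManagedObjectContext(concurrencyType: .mainQueueConcurrencyType)
    scratch.parent = mainContext

    try edits(scratch)      // all changes accumulate in the child context only

    try scratch.save()      // pushes the whole batch into the parent at once
    try mainContext.save()  // persists it; nothing partial ever hits the store
}
```

The same idea applies on the wire: send the grouped changes as one request so the server applies all of them or none.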
Ultimately, this is a very big topic - especially if server data is shared across multiple clients. At the simplest, you would want to decide on basic policies. Does last save win? Is there some notion of remotely held locks on objects in server data store? How is conflict resolved, when two clients are, say, editing the same property of the same object?
With respect to how things are done on the iPhone, I would agree with occulus that "Done" provides a natural point for persisting changes to server (in a separate thread).

CoreData and ASP.NET webservices

I'm developing an iPhone app that relies heavily on calling ASP.NET web services to transfer data back and forth between a database on one of our servers and the phone.
There are multiple items that the user works on, and to work on each item, a typical use case requires 5-6 web service calls. Depending on 3G signal strength, the amount of data being downloaded, and the number of items each day, the user can be left looking at the spinning wheel for anywhere between 5 and 6 minutes.
This is simply not acceptable, so basically I want to keep the app and the server-side database in sync asynchronously.
How would I go about doing this? (I'm currently not using CoreData at all, but I'm guessing I probably need to use it now).
Thanks,
Teja
You will need to code the glue between Core Data and your RESTful web service. Try to design your data model to match the data coming back from the web service; that will make the glue easier to manage. If you can get the data as JSON, the glue will be even thinner.
Core Data is an object hierarchy and you will need to approach it like that when you design your caching and syncing code.
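A sketch of that glue in Swift (the "Timesheet" entity, its serverID attribute and the JSON fields are assumptions): decode the service's JSON and upsert into Core Data keyed on the server's id, so the local model mirrors what the API returns.

```swift
import CoreData
import Foundation

// Shape of one record as the web service returns it (assumed).
struct TimesheetJSON: Codable {
    let id: Int
    let projectID: Int
    let hours: Double
}

// Insert or update managed objects so the local graph mirrors the server data.
func upsert(_ records: [TimesheetJSON], into context: NSManagedObjectContext) throws {
    for record in records {
        let request = NSFetchRequest<NSManagedObject>(entityName: "Timesheet")
        request.predicate = NSPredicate(format: "serverID == %d", record.id)
        request.fetchLimit = 1

        let object = try context.fetch(request).first
            ?? NSEntityDescription.insertNewObject(forEntityName: "Timesheet", into: context)

        object.setValue(record.id, forKey: "serverID")
        object.setValue(record.projectID, forKey: "projectID")
        object.setValue(record.hours, forKey: "hours")
    }
    try context.save()
}
```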
Update
The table view has nothing to do with the data. When you are working with Objective-C and iOS, you need to think in terms of MVC. So you need to be thinking about watching for delta changes in your Model (the UI or view portion of MVC is irrelevant here). Core Data easily lets you do this during a save operation, and you can take those deltas and push them back up to the server. The tricky issue is how to get notified by the server of server-side changes; that is dependent on the server design.
Processing changes from the server should be done on a background thread (with a separate NSManagedObjectContext connected to the same NSPersistentStoreCoordinator) and the main thread should be watching for save notifications from that background thread so that it can update the UI as needed.
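A minimal Swift sketch of that two-context setup (pre-NSPersistentContainer style, to match the description above): a private-queue context bound to the same coordinator does the server import, and the main-queue context merges each save notification so the UI can update.

```swift
import CoreData

final class SyncStack {
    let mainContext: NSManagedObjectContext
    let backgroundContext: NSManagedObjectContext
    private var observerToken: NSObjectProtocol?

    init(coordinator: NSPersistentStoreCoordinator) {
        mainContext = NSManagedObjectContext(concurrencyType: .mainQueueConcurrencyType)
        mainContext.persistentStoreCoordinator = coordinator

        backgroundContext = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
        backgroundContext.persistentStoreCoordinator = coordinator

        // When the background import saves, fold its changes into the UI context.
        observerToken = NotificationCenter.default.addObserver(
            forName: .NSManagedObjectContextDidSave,
            object: backgroundContext,
            queue: .main
        ) { [weak self] notification in
            self?.mainContext.mergeChanges(fromContextDidSave: notification)
        }
    }

    // Run server-import work off the main thread, on the background context's queue.
    func importChanges(_ work: @escaping (NSManagedObjectContext) -> Void) {
        let context = backgroundContext
        context.perform {
            work(context)
            try? context.save()   // triggers the merge above
        }
    }
}
```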
This is a non-trivial design and the complexity means that you can and will run into issues but those issues are dependent on your application and server design. There is no silver bullet other than the fact that using Core Data makes all of this a lot easier.