Pulling game mechanic updates in an iOS strategy game?

I'm planning on developing a multiplayer strategy game for the iOS platform. However, being a strategy game with multiple "units", there will likely be imbalances in gameplay that will need to be addressed with constant mechanic tweaking (upgrading / nerfing certain units).
The easiest way to accomplish this would simply be to change the mechanics within the app itself and repeatedly submit updates to Apple. However, updates take time to propagate through Apple's review process (so the changes wouldn't be instantaneous), and I would need to check whether all the players in a game are running the same version, force users to update to the latest version, etc.
What I'm thinking of doing instead is something similar to what the game Uniwar does. Every time the game is launched, it appears to check if there are any gameplay tweaks available, and if there are, it downloads the updates (and shows an update message to the user detailing what has changed).
However, as a relatively new programmer, I don't really know what would be the easiest way of accomplishing this. Would I host a text file online containing the unit statistics, and get the game to check that file for changes? Or is there some better, more efficient way? And if I were to do it this way, how would I do it?

First, ensure that your rules are some sort of resource you can easily change (be it binary or text-based). The most convenient way of updating these would be to periodically poll a server over HTTP, fetching updates as needed. The way I see it, there are two ways of doing this.
The first method uses the excellent caching abilities of the HTTP protocol, and as such requires a server (and client library) that understands these. The basic idea would be to have a copy of the latest version of the game mechanics published on a server (say, at http://example.org/mechanics.gz), and then have the client issue conditional GET requests with the If-Modified-Since header set to the time the last update check was performed. The HTTP protocol will then effectively do the rest for you, returning a 304 Not Modified if there is no update, and sending the new mechanics if there is one. This method has the disadvantage that the whole mechanics file has to be downloaded on every update (no diffs can be used), and that old versions won't be available, but it has an appealing simplicity.
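For illustration only, here is a rough sketch of that conditional GET on the client in Swift, keeping the hypothetical http://example.org/mechanics.gz URL from above; the UserDefaults key and the function name are made up for the example:

```swift
import Foundation

/// Checks for a newer mechanics file using a conditional GET.
/// The URL and the UserDefaults key are placeholders for illustration.
func checkForMechanicsUpdate(completion: @escaping (Data?) -> Void) {
    let url = URL(string: "http://example.org/mechanics.gz")!
    var request = URLRequest(url: url)
    // Bypass the local URL cache so we handle the 304 ourselves.
    request.cachePolicy = .reloadIgnoringLocalCacheData

    // Send the timestamp of the last successful check, if we have one.
    if let lastCheck = UserDefaults.standard.string(forKey: "mechanicsLastModified") {
        request.setValue(lastCheck, forHTTPHeaderField: "If-Modified-Since")
    }

    URLSession.shared.dataTask(with: request) { data, response, error in
        guard let http = response as? HTTPURLResponse, error == nil else {
            completion(nil)                      // network error: keep current rules
            return
        }
        switch http.statusCode {
        case 304:
            completion(nil)                      // Not Modified: nothing to do
        case 200:
            // Remember the server's Last-Modified for the next conditional GET.
            if let lastModified = http.value(forHTTPHeaderField: "Last-Modified") {
                UserDefaults.standard.set(lastModified, forKey: "mechanicsLastModified")
            }
            completion(data)                     // new mechanics payload
        default:
            completion(nil)
        }
    }.resume()
}
```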
The second method would consist of having a list of updates (well, their URI, ID and release date), say http://example.org/updates.xml, which the client pulls on every poll. The client then checks if there are any updates it doesn't have, and downloads and applies these in chronological order. Using this method, old updates can be made available (and will have permanent links), and diffs may be used. This is useful if history is important or if game mechanic files are large.
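A hedged sketch of that second approach, using a hypothetical JSON manifest rather than the XML file mentioned above (the URL, the field names, and the `apply` callback are all invented for the example):

```swift
import Foundation

// One entry in the (hypothetical) update manifest at http://example.org/updates.json.
struct UpdateEntry: Codable {
    let id: Int
    let uri: URL
    let releaseDate: Date
}

/// Fetches the manifest, then downloads and applies any updates newer than
/// the last one we applied, in chronological order.
func applyPendingUpdates(lastAppliedID: Int, apply: @escaping (Data) -> Void) {
    let manifestURL = URL(string: "http://example.org/updates.json")!
    URLSession.shared.dataTask(with: manifestURL) { data, _, _ in
        guard let data = data else { return }
        let decoder = JSONDecoder()
        decoder.dateDecodingStrategy = .iso8601
        guard let entries = try? decoder.decode([UpdateEntry].self, from: data) else { return }

        // Only updates we haven't seen yet, oldest first.
        let pending = entries
            .filter { $0.id > lastAppliedID }
            .sorted { $0.releaseDate < $1.releaseDate }

        for entry in pending {
            // A real client would persist the new "last applied" ID only
            // after each download and apply step succeeds.
            if let patch = try? Data(contentsOf: entry.uri) {
                apply(patch)
            }
        }
    }.resume()
}
```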

The format of the file doesn't really matter -- use whatever works for you. The key to being able to tune the gameplay, I think, is to turn the rules of the game into objects that you can configure. If the rules are all objects, you can do things like changing the order in which they're applied, the weight given to each one, the conditions under which a rule becomes effective, etc. You might have an object that's responsible for managing and properly applying the rules, basically a "rule model." Once you have that, all you need to do is to implement NSCoding in your rule and rule model classes and you can easily write and read rule configurations.
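For example, a minimal rule object conforming to NSCoding might look something like this in Swift; the unit properties are purely illustrative, not taken from the question:

```swift
import Foundation

/// A single tunable rule, e.g. the combat strength of a unit type.
/// The property names here are invented for illustration.
class GameRule: NSObject, NSCoding {
    let unitType: String
    var attack: Int
    var defense: Int

    init(unitType: String, attack: Int, defense: Int) {
        self.unitType = unitType
        self.attack = attack
        self.defense = defense
    }

    required init?(coder: NSCoder) {
        guard let unitType = coder.decodeObject(forKey: "unitType") as? String else { return nil }
        self.unitType = unitType
        self.attack = coder.decodeInteger(forKey: "attack")
        self.defense = coder.decodeInteger(forKey: "defense")
    }

    func encode(with coder: NSCoder) {
        coder.encode(unitType, forKey: "unitType")
        coder.encode(attack, forKey: "attack")
        coder.encode(defense, forKey: "defense")
    }
}
```

Archiving a rule model full of such objects with NSKeyedArchiver then gives you a single blob you can write out, read back, or replace wholesale when a tweak arrives from the server.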

Related

Best way to keep data in sync between two different applications

I have 2 closed-source applications that must share the same data at some point. Both use REST APIs.
An actual example is helpdesk tickets: they can be created in both applications, and I need to update the data in one application when the user adds a new ticket or closes a ticket in the other application, and vice versa.
Since they're closed-source, I can't really modify the code.
I was thinking I could create a third application that, every 5 minutes or so, lists both applications' tickets, compares them against the previous call, and, if the data is different from the previous call, updates the other application too.
Is there a better way of doing this?
With closed-source applications it's nearly impossible to get something out of them, unless they have some plugin-based setup that you can hook into.
The most efficient way in terms of costs would be to have the first application publish a message on a queue, or call a web-hook that you set, whenever the event is triggered. But as I mentioned, the application needs to support that.
So yeah, your solution is pretty much everything you can do for now, but keep in mind the challenges that you may encounter over time:
What if the results of both APIs are too large to be compared directly? Maybe you need to think about paging the results.
What if your app crashes and you lose the previous state? You need to somehow back it up to an external source.
How often should you poll the APIs to make sure you're getting the updates you need, while keeping good performance for the existing traffic?
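Just to illustrate the polling-and-diff idea (shown in Swift only to match the rest of this page), here is a minimal sketch of the comparison step; the `Ticket` shape is invented and ticket IDs are assumed to be unique:

```swift
import Foundation

// Hypothetical minimal ticket representation shared by both helpdesk APIs.
struct Ticket: Codable, Equatable {
    let id: String
    let status: String
}

/// One polling pass: compare the tickets fetched now against the snapshot
/// from the previous pass, and return what changed.
func diffTickets(current: [Ticket], previous: [Ticket]) -> (added: [Ticket], changed: [Ticket], removedIDs: [String]) {
    let previousByID = Dictionary(uniqueKeysWithValues: previous.map { ($0.id, $0) })
    let currentIDs = Set(current.map { $0.id })

    let added = current.filter { previousByID[$0.id] == nil }
    let changed = current.filter { previousByID[$0.id] != nil && previousByID[$0.id] != $0 }
    let removedIDs = previous.map { $0.id }.filter { !currentIDs.contains($0) }
    return (added, changed, removedIDs)
}

// The third application would run this every few minutes for each side,
// push the differences to the other API, and persist `current` as the new
// snapshot so a crash doesn't lose the previous state (see the caveats above).
```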

Synchronize Directory of Files Between Server and iOS Application

I am building an internal iOS application (so - it won't ever be in the app store), and I need to keep a directory of content synchronized between a server and each of the instances of the iOS application. This would be easy enough if I just wanted to delete and re-download this content each time, but I would rather use something similar to rsync to only download the elements that have changed.
I haven't found any good way to utilize rsync. I considered looking at Objective-Git as a possibility here, but at a quick glance it looked like there is still a lot of the support for remote repositories that isn't supported yet.
As a final note, while this won't be in the app store, I will not be jailbreaking these devices and I would prefer not to rely on any private APIs (although if there were an elegant solution that utilized private APIs, I might consider it).
Thoughts?
ADDITIONAL NOTE: This needs to be an isolated solution. I won't be relying on outside services (like Dropbox, Box.net, etc...). This needs to work solely between the device and the server (which is on a local network with the device).
Use HTTP to list the contents of each folder on the server.
Compare last modification time of each file with those on the device, and identify added/removed files.
Get added and modified files, remove deleted files.
It sounds like you're maybe asking for a library that already does this, but if you don't find one it's moderately easy to write from the ground up using stat(2) on the server and the same (or a higher-level equivalent) on the iOS devices. Have the iPhone send a tree of files with their modification dates to the server and get back a list of insert/delete/update operations to perform, with the URL (or whatever) for each one, so you can do them incrementally on a background thread. Have the information from the server for new/updated files include the server's modification date, so you can set it to be the same on the iOS device and send it back when asking the server for the status of each file (kind of a hack using the file system to store that, but it works).
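As a sketch only (the endpoint, payload shapes, and field names are all made up), the client side of that exchange could look roughly like this in Swift:

```swift
import Foundation

// What the device reports for each local file, and what the (hypothetical)
// server answers with.
struct LocalFileState: Codable {
    let path: String
    let modified: Date
}

struct SyncOperation: Codable {
    enum Kind: String, Codable { case add, update, delete }
    let kind: Kind
    let path: String
    let url: URL?        // where to download the file for add/update
    let modified: Date?  // server-side mod date, re-applied locally after download
}

/// Builds the local file list under `root` and posts it to the server,
/// which replies with the operations the device should perform.
func requestSyncPlan(root: URL, endpoint: URL,
                     completion: @escaping ([SyncOperation]) -> Void) throws {
    let fm = FileManager.default
    let files = try fm.subpathsOfDirectory(atPath: root.path).compactMap { subpath -> LocalFileState? in
        let fullPath = root.appendingPathComponent(subpath).path
        var isDirectory: ObjCBool = false
        guard fm.fileExists(atPath: fullPath, isDirectory: &isDirectory), !isDirectory.boolValue,
              let date = (try? fm.attributesOfItem(atPath: fullPath))?[.modificationDate] as? Date
        else { return nil }
        return LocalFileState(path: subpath, modified: date)
    }

    var request = URLRequest(url: endpoint)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    request.httpBody = try encoder.encode(files)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        let decoder = JSONDecoder()
        decoder.dateDecodingStrategy = .iso8601
        let plan = data.flatMap { try? decoder.decode([SyncOperation].self, from: $0) } ?? []
        completion(plan)  // perform adds/updates/deletes incrementally on a background queue
    }.resume()
}
```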
Why not just set up a RESTful interface and do it across HTTP; that way you could query the modification times easily enough to determine whether client or server files need to be updated. You might also want to keep track of what files on the client have been synced, so you can easily know which files to add or delete. This can be done with a simple .sync file or using a plist / sqlite / etc.
If you'll consider FTP, there are some pretty advanced client libraries available.
For example, the iOS Chilkat bundle includes an FTP client library that supports synchronization in both directions. It's not free, but it's pretty cheap -- and you get a ton of other stuff that will likely prove useful someday. Here's an example of iOS pulling down all additions and changes (mode 2):
http://www.example-code.com/ios/ftp_syncLocalTree.asp
One caveat -- judging solely from the example, it doesn't appear to synchronize deletions. If this is a requirement, you could do it yourself without too much effort immediately following a sync.
acrosync (see https://acrosync.com/library.html) seems like a good fit given the initial question, however I haven't used it myself yet.

Core Data with Web Services recommended pattern?

I am writing an app for iOS that uses data provided by a web service. I am using core data for local storage and persistence of the data, so that some core set of the data is available to the user if the web is not reachable.
In building this app, I've been reading lots of posts about core data. While there seems to be lots out there on the mechanics of doing this, I've seen less on the general principles/patterns for this.
I am wondering if there are some good references out there for a recommended interaction model.
For example, the user will be able to create new objects in the app. Let's say the user creates a new employee object; the user will typically create it, update it, and then save it. I've seen recommendations that push each of these steps to the server: when the user creates it, and when the user makes changes to the fields. And if the user cancels at the end, a delete is sent to the server. A different recommendation for the same operation is to keep everything local, and only send the complete update to the server when the user saves.
This example aside, I am curious if there are some general recommendations/patterns on how to handle CRUD operations and ensure they stay in sync between the web server and Core Data.
Thanks much.
I think the best approach in the case you mention is to store data only locally until the point the user commits the adding of the new record. Sending every field edit to the server is somewhat excessive.
A general idiom of iPhone apps is that there isn't such a thing as "Save". The user generally will expect things to be committed at some sensible point, but it isn't presented to the user as saving per se.
So, for example, imagine you have a UI that lets the user edit some sort of record that will be saved to local Core Data and also be sent to the server. At the point the user exits the UI for creating a new record, they will perhaps hit a button called "Done" (N.B. not usually called "Save"). At the point they hit "Done", you'll want to kick off a Core Data write and also start a push to the remote server. The server push won't necessarily hog the UI or make them wait till it completes -- it's nicer to allow them to continue using the app -- but it is happening. If the update push to the server fails, you might want to signal it to the user or do something appropriate.
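As a rough sketch of that "Done" flow (the push endpoint and payload are placeholders, and a real app would serialize the record's actual attributes):

```swift
import CoreData
import Foundation

/// Called when the user taps "Done" on the editing screen.
/// The context and push endpoint are placeholders for illustration.
func doneTapped(record: NSManagedObject, context: NSManagedObjectContext, pushURL: URL) {
    // 1. Commit locally first, so the data survives a crash or power loss.
    do {
        try context.save()
    } catch {
        print("Local save failed: \(error)")   // surface this to the user in a real app
        return
    }

    // 2. Push to the server in the background; don't block the UI on this.
    var request = URLRequest(url: pushURL)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // Building the payload from the managed object is app-specific; this
    // just sends an identifier as a stand-in.
    request.httpBody = try? JSONSerialization.data(
        withJSONObject: ["id": record.objectID.uriRepresentation().absoluteString])

    URLSession.shared.dataTask(with: request) { _, response, error in
        let ok = (response as? HTTPURLResponse).map { (200..<300).contains($0.statusCode) } ?? false
        if error != nil || !ok {
            // Signal the failure to the user, or queue the record for a retry.
            print("Server push failed; will retry later")
        }
    }.resume()
}
```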
A good question to ask yourself when planning the granularity of writes to core data and/or a remote server is: what would happen if the app crashed out, or the phone ran out of power, at any particular spots in the app? How much loss of data could possibly occur? Good apps lower the risk of data loss and can re-launch in a very similar state to what they were previously in after being exited for whatever reason.
Be prepared to tear your hair out quite a bit. I've been working on this, and the problem is that the Core Data samples are quite simple. The minute you move to a complex model and you try to use the NSFetchedResultsController and its delegate, you bump into all sorts of problems with using multiple contexts.
Use one context to populate data from your web service in a background block, and a second one for the table view to use -- you'll most likely end up using a table view for a master list and a detail view.
Brush up on using blocks in Cocoa if you want to keep your app responsive whilst receiving or sending data to/from a server.
You might want to read about 'transactions' - which is basically the grouping of multiple actions/changes as a single atomic action/change. This helps avoid partial saves that might result in inconsistent data on server.
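A small client-side illustration of that idea, with an invented endpoint and field names: batch the pending edits into one request so the server can apply them in a single transaction rather than receiving them piecemeal.

```swift
import Foundation

// Instead of sending one request per field edit, accumulate the edits and
// send them as a single payload. Names here are made up for the example.
struct EmployeeEdit: Codable {
    let field: String
    let value: String
}

func commitEdits(_ edits: [EmployeeEdit], employeeID: String, endpoint: URL) {
    var request = URLRequest(url: endpoint.appendingPathComponent(employeeID))
    request.httpMethod = "PUT"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONEncoder().encode(edits)

    // The server should wrap the whole batch in one database transaction,
    // so either every edit is stored or none of them is.
    URLSession.shared.dataTask(with: request).resume()
}
```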
Ultimately, this is a very big topic - especially if server data is shared across multiple clients. At the simplest, you would want to decide on basic policies. Does last save win? Is there some notion of remotely held locks on objects in server data store? How is conflict resolved, when two clients are, say, editing the same property of the same object?
With respect to how things are done on the iPhone, I would agree with occulus that "Done" provides a natural point for persisting changes to server (in a separate thread).

Best strategy for synching data in iPhone app

I am working on a regular iPhone app which pulls data from a server (XML, JSON, etc...), and I'm wondering what is the best way to implement synching data. Criteria are speed (less network data exchange), robustness (data recovery in case update fails), offline access and flexibility (adaptable when the structure of the database changes slightly, like a new column). I know it varies from app to app, but can you guys share some of your strategy/experience?
For me, I'm thinking of something like this:
1) Store Last Modified Date in iPhone
2) Upon launching, send a message like getNewData.php?lastModifiedDate=...
3) Server will process and send back only modified data from last time.
4) This data is formatted as so:
<+><data id="..."></data></+> // add this to SQLite/CoreData
<-><data id="..."></data></-> // remove this
<%><data id="..."><attribute>newValue</attribute></data></%> // new modified value
I don't want to make <+>, <->, <%>... for each attribute as well, because it would be too complicated, so probably when I receive a <%> field, I would just remove the data with the specified id and then add it again (assuming the id here is not an auto-incremented field).
5) Once everything is downloaded and updated, I will update the Last Modified Date field.
The main problem with this strategy is: If the network goes down when I am updating something => the Last Modified Date is not yet updated => next time I relaunch the app, I will have to go through the same thing again. Not to mention potential inconsistent data. If I use a temporary table for update and make the whole thing atomic, it would work, but then again, if the update is too long (lots of data change), the user has to wait a long time until new data is available. Should I use Last-Modified-Date for each of the data field and update data gradually?
I would start by making the update routine atomic, since you'll have enough on your hands figuring out how to get the client-server communication working properly.
After that is a good time to consider tweaking it to be incremental, but only after you do some testing to figure out if it's really necessary. If you're tuning your update protocol to be as low bandwidth as possible, you might discover that even a "big" update is downloaded fast enough.
Another way to look at it is to ask yourself, how often is there going to be network trouble when an average user is doing a sync? You probably don't want to tune for unlikely scenarios.
If you are trying to optimize (minimize) the data transfer you may want to consider a different format than XML, since XML is fairly verbose. Or at least you may want to trade in XML readability for space by making each element name and attribute as small as possible, and eliminate all unnecessary whitespace.
Your basic scheme is good. The thing you need to do is to somehow make your updates idempotent so that you can restart a partially-completed transfer without risk. This is a better way to go than to try to implement some sort of true atomic commit (though you could do that too, using, eg, the SQLite database).
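To make that concrete, here is a small sketch of applying the <+> / <-> / <%> deltas idempotently, with a plain dictionary standing in for the SQLite/Core Data store and the delta shape expressed as structs (all names invented):

```swift
import Foundation

// A delta entry corresponding to the <+> / <-> / <%> markers above.
// Field names are invented for the example.
struct DeltaEntry: Codable {
    enum Op: String, Codable { case add, remove, modify }
    let op: Op
    let id: String
    let attributes: [String: String]?
}

/// Applies a batch of deltas so that re-running the same batch is harmless:
/// adds and modifications are upserts, removals ignore missing rows.
func apply(_ deltas: [DeltaEntry], to store: inout [String: [String: String]]) {
    for delta in deltas {
        switch delta.op {
        case .add, .modify:
            // Upsert: replace whatever is there, whether or not it already exists.
            store[delta.id] = delta.attributes ?? [:]
        case .remove:
            // Deleting a row that is already gone is a no-op, not an error.
            store[delta.id] = nil
        }
    }
    // Only after the whole batch succeeds would you persist the new
    // Last Modified Date, so an interrupted sync simply replays next launch.
}
```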
In our experience fairly large updates (10s of KB) can be downloaded quite rapidly, if the server is fast enough. No great need to break updates up into tiny bits. But certainly it won't hurt to try to minimize the amount of data transferred by keeping more granular info on "last update".
(And definitely you should use JSON rather than XML as your transmitted data representation.)
I wonder if you have considered using a sync framework to manage the synchronization. If that interests you, take a look at the open-source project OpenMobster's Sync service. You can do the following sync operations:
two-way
one-way client
one-way device
bootup
Besides that, all modifications are automatically tracked and synced with the cloud. Your app can work offline when the network connection is down; it will track any changes and automatically synchronize them with the cloud in the background when the connection returns. It also provides iCloud-like synchronization across multiple devices.
Also, modifications in the cloud are synced using push notifications, so the data is always current even though it is stored locally.
In your case,
Criteria are speed (less network data exchange), robustness (data recovery in case update fails), offline access
Speed: Only the changes are sent across the network in both directions
Robustness: It stores data in a transactional store like SQLite, and any failed updates are communicated in the SyncML payload. Only the successful operations are processed, while the failed operations are retried during the next sync.
Here is a link to the open source project: http://openmobster.googlecode.com
Here is a link to iPhone App Sync: http://code.google.com/p/openmobster/wiki/iPhoneSyncApp

Keeping iPhone application in sync with GWT application

I'm working on an iPhone application that should work in offline and online modes.
In its online mode it's supposed to feed all the information the user enters to a web service backed by GWT/GAE.
In its offline mode it's supposed to store the information locally, and when a connection is available, sync it up to the web service.
Currently my plan is as follows:
Provide a connection between the app and the web service using Protocol Buffers for efficient over-the-wire communication
Work with local DB using Core Data
Poll the network status, and when available sync the database and keep some sort of local-db-to-remote-db key synchronization.
The question is: am I headed in the right direction? Are there standard patterns for implementing this? Maybe someone can point me to an open-source application that works in a similar fashion?
I am really new to iPhone coding, and would be very glad to hear any suggestions.
Thanks
I think you're blurring two questions together.
If you've got a question about making a GWT web interface, that's one question.
Questions about how to sync an iPhone to a web service are a different question. For that, you don't want to use GWT's RPCs for syncing, as you'd have to fake out the 'browser-side' of the serialization system in your iPhone code, which GWT normally provides for you.
About system design direction:
First, if there is no REAL need, do not create 2 different apps (one GWT and the other iPhone).
Create one well-written GWT app. It will work offline with no problem and will manage your data using the HTML offline application cache feature.
If you must create 2 separate apps,
then at least save yourself the effort and do not write the server twice. If you go with the standard GWT approach, you will almost certainly fail to talk to the server from a standalone app (it is zipped JSON over HTTP with some tricky headers...), or you will end up writing things twice. So look into the RestLet library; it is well supported by GAE.
About the way to keep things in sync with offline/online switching:
There are several approaches to consider, and none of them is perfect. So when you consider yours, think of what the user expects... Do not be Microsoft Word; do not try to outsmart the user.
If there is at least one scenario in the use cases that demands user intervention to merge changes (and there will be, take it to the bank), then you will have to implement a UI for it, and then there is a good reason to use it often: the user will get used to it. That is better than the user first seeing it a long while after starting to use the app, because the need for it is rare, because you implemented super-duper merging logic that asks the user only in very special cases... Don't do that.
Balance the effort, because the mess that a bug in such code will introduce to the user is much more painful than the benefit altogether.
So, the HOW:
One way is the Do-UnDo way.
While offline, keep a log of the actions the user performed on the data, in the order they performed them.
As soon as you are connected, send them to the server and execute them. Same from server to client.
This will work fine in most cases, as long as you are not writing Photoshop-like software with huge amounts of data per operation. This is also referred to as the Command pattern by the Gang of Four.
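A bare-bones sketch of such an action log in Swift, with invented names and no persistence or error handling:

```swift
import Foundation

// One logged user action, recorded while offline. The shape is illustrative.
struct LoggedAction: Codable {
    let timestamp: Date
    let name: String              // e.g. "createTicket", "closeTicket"
    let payload: [String: String] // the data the action needs
}

final class ActionLog {
    private var pending: [LoggedAction] = []

    /// Record an action performed while offline.
    func record(_ name: String, payload: [String: String]) {
        pending.append(LoggedAction(timestamp: Date(), name: name, payload: payload))
        // A real implementation would also persist `pending` to disk here.
    }

    /// Replay the log against the server once the connection comes back,
    /// in the order the user performed the actions. A real implementation
    /// would send these one at a time and only clear the log after the
    /// server acknowledges each one.
    func replay(to endpoint: URL) {
        for action in pending.sorted(by: { $0.timestamp < $1.timestamp }) {
            var request = URLRequest(url: endpoint)
            request.httpMethod = "POST"
            request.setValue("application/json", forHTTPHeaderField: "Content-Type")
            request.httpBody = try? JSONEncoder().encode(action)
            URLSession.shared.dataTask(with: request).resume()
        }
        pending.removeAll()
        // Server-to-client replay would work the same way in the other direction.
    }
}
```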
Another way is the source-control way: versions and maybe even locks. It is very application dependent. DBMSs sometimes use it internally for their transaction implementations.
And there is always the option to be Read Only when Offline :-)