Framework Recommendation for developing distributed iPhone / iPad applications

Is there a distributed application framework (commercial is okay as well) that supports iPhone / iPad?
What I'm looking for in the framework:
Allows me to focus on the application logic
I don't have to code "low-level" network programming (I've done it so many times that I don't want to do it again =p)
Should be actively maintained (popular would be nice)
Basically, I can then develop faster.
We plan to develop a soft real-time TCP/IP client/server application where there are many iPhone/iPad clients (30+) connected to a single server over a LAN. The server will most likely run Windows (unless the framework does not support it).
I've been looking around and I see:
MonoTouch WCF (still looks quite raw?)
RemObjects (Mono + Objective-C)
Cocoa Distributed Objects
ZeroC Ice Touch (Objective-C)
RakNet (? included because it mentions iPhone, but we would need to use C++)
Of course, there's also the option of using the plain old MonoTouch System.Net.Sockets
Or, CFNetwork (I don't plan to use this one)
I'm still deciding whether to use Objective-C or MonoTouch, but leaning towards MonoTouch since we will get the .NET framework, and not be tied into just the Mac world.
Please feel free to comment if I've added anything that's not related to my question; I'm new to the iPhone/iPad world.

We've used WCF/MonoTouch with great success - there are some areas of the framework that aren't 100%, but for most cases you should find working with WCF on MonoTouch a breeze.
The ability to share all of our data sync, model, tests, etc. between MonoDroid, MonoTouch, and Windows Phone 7 is seriously cool (with some work this is easily possible - you'll need to manage multiple project files).
Be careful to manage calls to WCF services correctly: keep them to a minimum and keep the architecture simple. We ended up with a fairly complex DTO to minimise the number of calls to the WCF services needed to sync the data - this was well worth it, as the time needed to sync a device from scratch is now a fraction of what it was (a rough sketch of the idea follows).
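To illustrate the coarse-grained DTO idea: the answer above used WCF and C#, but the shape of the pattern is the same anywhere, so here is a minimal sketch in Java with invented type and field names. One round trip returns everything the device needs, instead of one call per entity type.

```java
import java.util.List;

// Invented placeholder entities; in a real app these are your model types.
class Customer { }
class Order { }

// Coarse-grained DTO: a single sync call returns everything the device needs.
class DeviceSyncResult {
    long serverVersion;          // the client sends this back next time to request only newer changes
    List<Customer> customers;    // full or delta lists, depending on the client's last version
    List<Order> orders;
    List<Long> deletedOrderIds;  // tombstones so the client can delete local rows
}

// One service operation instead of many fine-grained calls per entity.
interface DeviceSyncService {
    DeviceSyncResult sync(long clientVersion);
}
```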
Using SSL to communicate with the server is a PITA, but I think that's more a case of the way Apple has handled it.

You need to be a bit more explicit about your requirements. If you only need object serialization (dehydration/hydration) over a REST API, then anything that supports POX or JSON will work just fine for you (a minimal sketch of that option follows). However, if you need RPC-style method invocation, with authentication, encryption/digital signatures, transactions, etc., then you need one of the frameworks you listed above.
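For the simple end of that spectrum, here is a minimal sketch of the "plain JSON over HTTP" option in Java (the endpoint URL is invented): no framework involved, just an HTTP GET and whatever JSON mapping you choose.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PlainJsonSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; the response body is plain JSON text.
        URL url = new URL("http://192.168.1.10:8080/devices/7/state");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        connection.setRequestProperty("Accept", "application/json");

        StringBuilder body = new StringBuilder();
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), "UTF-8"));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
        } finally {
            reader.close();
        }

        // Map the JSON with any library you like (or by hand for simple payloads).
        System.out.println(body);
    }
}
```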
If you need a framework, I personally would lean towards MonoTouch WCF, as it gives you the ability to move your client to other platforms later as well (Windows Phone 7, for example). Then again, as you said, it's a bit rough right now, and if the Mono team decides in the future that they don't have the resources to keep maintaining it, you might end up having to move to another framework. Of course, there's also the drawback that you need to use MonoTouch for your application and can't use Objective-C. Granted, with the recent changes in the iOS Developer Agreement, that's not that much of an issue, but it is still something to keep in mind.
(Disclaimer: I used to work on Microsoft's WCF team, so I am biased towards the product itself)
The other option I would go for would be Cocoa Distributed Objects. However, that would be my choice only if the server were also running on OS X. I know there's Bonjour for Windows, but I doubt it's optimized for server scenarios, and I also don't know how rich Apple's RPC implementation on top of it is on the Windows platform. So I would stick with Apple's technology only if I were building exclusively for Apple's platforms.
Note that WCF and Distributed Objects give you RPC-style functionality, but they won't help you with higher-level scenarios. If you need or want an even higher level of abstraction, for example presence information or multi-user chat, you will still need to implement that yourself. At that point it might be worth looking at frameworks that provide those features for you. An example would be RakNet (which you listed above), which abstracts away the remoting layer and builds additional features on top of it.

You can use JSON Touch + the Vitche PHP Emission Framework, which provides all the server-side functionality you need. You can also use that framework to access existing SOAP (WCF, Axis, etc.) services.

You can use Google Protocol Buffers to implement RPC, though you will still need to do some network programming to transport your messages. It supports code generation for C++, Java, and Python (with third-party support for Objective-C and .NET), so you can create a single set of RPC messages and get code for working with them on almost any mobile platform. You will have to implement the transport layer on your mobile platform yourself; a rough sketch follows the links below.
http://code.google.com/apis/protocolbuffers/ - main Protobuf page (C++, Java, Python)
http://code.google.com/p/protobuf-net/ - Protobuf for .NET, mentioned in one of the comments
http://code.google.com/p/metasyntactic/wiki/ProtocolBuffers - Protobuf for Objective-C
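To give a feel for the transport part you write yourself, here is a rough sketch in Java of a length-prefixed exchange over a plain TCP socket. The SyncRequest/SyncResponse message classes and the server address are assumptions for the example; they would come from a .proto file you define and compile with protoc.

```java
// Assumed .proto (compiled with `protoc --java_out=...`):
//   message SyncRequest  { required string client_id = 1; }
//   message SyncResponse { repeated string items = 1; }
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;

public class ProtobufTransportSketch {
    public static void main(String[] args) throws Exception {
        Socket socket = new Socket("192.168.1.10", 9000);   // hypothetical server on the LAN
        try {
            SyncRequest request = SyncRequest.newBuilder()   // generated class (assumed)
                    .setClientId("ipad-07")
                    .build();
            byte[] payload = request.toByteArray();

            // Protobuf does not frame messages, so length-prefix each one.
            DataOutputStream out = new DataOutputStream(socket.getOutputStream());
            out.writeInt(payload.length);
            out.write(payload);
            out.flush();

            // Read the reply the same way.
            DataInputStream in = new DataInputStream(socket.getInputStream());
            byte[] reply = new byte[in.readInt()];
            in.readFully(reply);
            SyncResponse response = SyncResponse.parseFrom(reply);  // generated class (assumed)
            System.out.println(response.getItemsCount() + " items received");
        } finally {
            socket.close();
        }
    }
}
```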

Related

Enterprise Web Application Architecture Question

I am wondering if it would be possible to develop an enterprise-level web application without the use of a standard MVC structure and application server, by carrying the business/flow logic and session data to the client-side JavaScript and making it talk to REST data services directly... maybe we could make use of an authorization/authentication layer and a second validation layer sitting on top of the data services. All these services would operate on standard HTTP methods and support configurable logging and monitoring, and content or query parameters would all be contained in the HTTP request/response body. Static HTML and JavaScript would be served to the browser, and the rest would be carried out by JavaScript functions talking to the HTTP-based authorization/authentication, validation and then data services. Do you think this kind of architecture could satisfy enterprise-level web application requirements?
It's possible but unlikely; what are the drivers that suggest this architecture to you? Is it just to be different or are there some specific aspects that this best addresses?
"by carrying the business/flow logic and session data to the client-side JavaScript and making it talk to REST data services directly"
In theory you'd still be able to have an appropriately layered solution (Business Logic (BL) script vs UI focused script) but practically speaking it'd be messy, and you'd lose the ability to physically separate it into different tiers. This could "bite" you at any number of places in the life of the system.
"Enterprise" grade systems are seldom small, I hate to think how much logic you'd be having to send over the wire to support a given action/process.
Putting all the BL into a scripting language ties you to that platform, and platforms change over time. The bad thing about scripts is that, whilst they are stable to a degree, I'd suggest they are more exposed to change than server-based platforms like Java or .NET. In an enterprise scenario, the servers will have very tight change control and upgrade paths mapped out for them, whereas browsers are much more open to regular change.
There's the issue of compatibility - unless you're tied to a specific browser (down to the version level), guaranteeing consistent behavior is going to be harder and will likely require more development effort. Let's say you deliver the solution successfully; what do you do when the business wants to take advantage of mobile computing - say, iPads? Your only option is going to be a browser - you won't be able to take advantage of any of the native advantages of the platform. "The web and browsers" might seem like they'll be around forever, but then I'm guessing that's what the mainframe folks said at the time. A server-centered solution is going to give you more life for less expense.
Staffing will be an issue - you'll need very strong JavaScript and server-side developers.
Security: having your core BL out on the client where it's much more exposed sounds very dangerous.
EDIT:
Web apps can be slow for many reasons - not many of which are reason enough to put all your BL in JavaScript on the client. Building apps for performance is a whole field of endeavor on its own - I suggest you get more familiar with architecting and implementing for performance before you write off n-tier web apps altogether :)
Regarding keeping your layers separated: there are different ways of doing this, but it boils down to abstraction - and, more correctly, to keeping good design principles in mind; if you haven't heard of SOLID, that would be a good place to start. In terms of implementation, start reading up on Dependency Inversion (FYI - self-promotion: the article's mine and it's .NET-focused, but you should have no problem tracking down Java-based ones too). A minimal sketch of the idea follows.
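Purely as an illustration of the Dependency Inversion point (all names invented, sketched in Java; the linked article is .NET-focused but the idea is language-agnostic): the high-level business logic depends on an abstraction it owns, and the concrete data service is plugged in from the outside.

```java
// Abstraction owned by the business layer.
interface OrderRepository {
    Order findById(String id);
}

// High-level policy: knows nothing about REST, databases, or browsers.
class OrderService {
    private final OrderRepository repository;

    OrderService(OrderRepository repository) {   // dependency supplied from outside
        this.repository = repository;
    }

    boolean canShip(String orderId) {
        Order order = repository.findById(orderId);
        return order != null && order.isPaid();
    }
}

// Low-level detail: one possible implementation, e.g. a client for a REST data service.
class RestOrderRepository implements OrderRepository {
    public Order findById(String id) {
        // ... call the data service and map the response to an Order ...
        return new Order(id, true);
    }
}

class Order {
    private final String id;
    private final boolean paid;
    Order(String id, boolean paid) { this.id = id; this.paid = paid; }
    boolean isPaid() { return paid; }
}
```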

Should I learn how web frameworks work before I use them?

I'm interested in creating a basic web application (for learning, but I want to finish within a few months), and I've read that using a web framework can make that task much easier.
After reading about different frameworks online, it seems to me that using a framework would hide a lot of the detail of how it works. I fear that if I use a framework, I won't really know how my website is running.
Is it important to understand how frameworks do what they do, or am I worrying too much? (e.g. I don't know how the Linux kernel works, or the C compiler, etc.)
Even if you don't have a particular interest in web frameworks, I would say it's good to play with a few and then crack them open if only for the exposure to new design patterns and solutions that can be applied anywhere in development. (MVC in particular when talking about most web frameworks)
It is (to some extent) important to understand how frameworks work, but you'll never learn that without using them.
So, start using some framework and you'll get basic understanding of it. And then, if you have interest, you can always dig deeper into it (maybe even submit patches and participate in its development). But not in the opposite order.
Using your analogy: you don't become a Linux kernel developer without having been a Linux user for some time.

Any good client-server data sync frameworks available for iPhone?

I'm just getting into the client-server data sync stage of my iPhone app project, and have managed to get my CoreData data model loading on both the iPhone client and my TurboGears server (which is good). I'm now beginning to tackle the problem of sync'ing data between the server and multiple clients, and while I could roll my own, this seems like one of those problems that is quite general and therefore there should be frameworks or libraries out there that provide a good deal of the functionality.
Does anyone know of one that might be applicable to this environment (e.g. Objective-C on the iPhone, pyobjc / Python on the server)? If not, does anyone know of a design pattern or generally agreed-upon approach for this that would be a good road to take for a self-implementation? I couldn't find a generally accepted term for this problem beyond "data synchronization" or "remote object persistence", neither of which turns up much that is useful on Google.
I did come across the Funambol framework which looks like it provides this exact type of functionality, however, it is C++ / Java based and therefore seems like it might not be a good fit for the specific languages in my project.
Any help much appreciated.
Since you are using TurboGears already, take a look at the RestController documentation. Using RESTful services has become a widely adopted architecture with many implementations for both clients and servers. Matt Gemmell's MGTwitterEngine is a good example of the client implementation of a specific API, Twitter.

Shared Library for iPhone and BlackBerry

I have a set of functionality (classes) that I would like to share with an application I'm building for the iPhone and for the Blackberry (Java). Does anyone have any best practices on doing this?
As far as I understand your question, this is not going to be possible - the binary formats for the iPhone and for Java are not compatible, and the same goes for a native library on a BlackBerry device.
This is not like building for OS X, where you can use Java; unfortunately the iPhone doesn't support Java.
The best idea is probably to build your library in Objective-C and then port it to Java, which is an easier transition than going the other way. If you write your Objective-C carefully and make sure your code has no memory leaks, then the changes are not so complex.
If you keep the structure of your classes the same then you should find maintenance much simpler - fix a bug in the Java and you should find it easy to check for the same bug in the ObjC methods etc.
Hope this helps - sorry that it is not all good news.
As Grouchal mentioned - you are not going to be able to share any physical components of your application between the two platforms. However you should be able to share the logical design of your application if you carefully separate it into highly decoupled layers. This is still a big win because the logical application design probably accounts for a large part of your development effort.
You could aim to wrap the sections of the platform specific APIs (iPhone SDK etc.) that you use with your own interfaces. In doing so you are effectively hiding the platform specific libraries and making your design and code easier to manage when dealing with differences in the platforms.
With this in place you can write your core application code so that it appears very similar on either platform, even though the platforms use different languages. I find Java and Objective-C to be very similar conceptually (at least at the level at which I use them) and would expect to be able to achieve parity with at least the following:
An almost identical set of Java and Objective-C classes with the same names and responsibilities
Java/Objective-C classes with similarly named methods
Java/Objective-C methods with the same responsibilities and logical implementations
This alone will make the application easier to understand across platforms. Of course the code will always look very different at the edges, i.e. when you start dealing with the view, threading, networking, etc. However, these concerns will be handled by your API wrappers, which, once developed, should have fairly static interfaces (a small sketch of such a wrapper follows).
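To make the wrapper idea concrete, here is a minimal sketch of one such interface on the Java/BlackBerry side. All names are invented; the point is that the shared design owns the interface, each platform supplies its own implementation behind it, and the Objective-C twin would mirror the same class and method names while wrapping the iPhone SDK instead.

```java
import java.io.IOException;

// Platform-neutral interface owned by your shared design.
interface HttpGateway {
    byte[] get(String url) throws IOException;
}

// BlackBerry implementation: would wrap the platform's HTTP connection classes.
// The Objective-C twin (same name, same method) would wrap NSURLConnection.
class BlackBerryHttpGateway implements HttpGateway {
    public byte[] get(String url) throws IOException {
        // ... open the connection, read the stream, return the bytes ...
        throw new IOException("platform-specific implementation goes here");
    }
}

// Core application code is written only against the interface, so its shape
// stays the same on both platforms.
class FeedLoader {
    private final HttpGateway http;

    FeedLoader(HttpGateway http) {
        this.http = http;
    }

    byte[] loadLatestFeed() throws IOException {
        return http.get("http://example.com/feed");
    }
}
```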
You might also stand to benefit if you later develop further applications that need to be delivered to both platforms, as you might find that you can reuse or extend your API wrappers.
If you are writing a client-server type application, you should also try to keep as much logic on your server as possible. Keep the amount of extra business logic on the device to a minimum. The more you can treat the device as just a view layer, the less porting you'll have to do overall.
Aside from that, following the same naming conventions and package structure across all the projects helps greatly, especially for your framework code.
The UI APIs and usability paradigms for BlackBerry and iPhone are so different that in most cases it won't be possible to directly port this kind of logic between apps. The biggest mistake one could make (in my opinion) is to try to transplant a user experience designed for one mobile platform onto another. The way people interact with BlackBerrys vs iPhones is very different, so be prepared to revamp your user experience for each mobile platform you want to deploy on.
Hope this is helpful.
It is possible to write C++ code that works in both a BB10 Native app and an iOS app.
Xcode would need to see the C++ files as Objective-C++ code.
I am currently working on such a task in my spare time. I have not yet completed it enough to either show it or know whether it is truly possible, but I haven't run into any roadblocks yet.
You will need to be disciplined about writing good cross-platform code designed with abstractions for platform-specific features.
My general pattern is that I have "class Foo" to do cross platform stuff, and a "class FooPlatform" to do platform specific stuff.
Class "Foo" can call class "FooPlatform" which abstracts out anything platform specific.
The raw cross-platform code is itself not compile-able on its own.
Separate BB10 and XCode projects are created in their respective IDEs.
Each project implements a thin (few [dozen] line) "class FooPlatform" and references the raw cross-platform code.
When I get something working that I can show I will post again here...

Any success using Apache Thrift on iPhone?

Has anybody done or seen a deployment of Apache Thrift in an iPhone app?
I am wondering if it is a reasonable solution for a high-volume, low(er)-latency network service for iPhones compared to HTTP.
One noteworthy thing I found is a bug report about running Thrift on the iPhone, which seems to have been fixed. But that doesn't necessarily indicate that it's a done deal.
Thrift and HTTP aren't mutually exclusive. In fact, Thrift now ships with an HTTP transport implementation you can use. It's also a really nice way to auto-generate server/client code that avoids a lot of marshalling/unmarshalling boilerplate while still being really fast. Its internal representation is basically binary JSON, so it's very similar to a RESTful web service (except easier to code against and much, much faster).
So... is anyone able to answer the original question? If not, I'll dive in myself with Thrift's included Cocoa support and see how it works on the iPhone.
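For what it's worth, here is a rough sketch of what a call over Thrift's HTTP transport looks like, shown in Java for brevity (the generated Cocoa client follows the same transport/protocol/client shape). The SyncService service and the endpoint URL are assumptions for the example, generated from a hypothetical IDL.

```java
// Assumed IDL (compiled with `thrift --gen java`):
//   service SyncService { list<string> pull(1: i64 sinceVersion) }
import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.transport.THttpClient;

public class ThriftHttpSketch {
    public static void main(String[] args) throws Exception {
        // Thrift's HTTP transport: the RPC rides over a normal HTTP(S) POST.
        THttpClient transport = new THttpClient("https://example.com/thrift"); // hypothetical endpoint
        transport.open();
        try {
            TBinaryProtocol protocol = new TBinaryProtocol(transport);
            SyncService.Client client = new SyncService.Client(protocol); // generated class (assumed)
            client.pull(42L);  // a single RPC call; marshalling is handled by the generated code
        } finally {
            transport.close();
        }
    }
}
```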
Just my two cents..
The accepted answer to this question is an opinion not to use a technology, not an answer as to whether it is possible.
Thrift is an interface definition language (IDL), like Protobuf and Cap'n Proto. IDLs permit the definition of a platform-agnostic client/server protocol. JSON and plist don't provide the same level of type conformance.
Having previously led an iOS team with tens of millions of MAU using Google Protobuf v2.5 across iOS, Android, Windows, and server teams, I can attest that IDLs are great on mobile. Apple uses them for syncing iWork content.
My current team uses Thrift for iOS and Android clients, with a mostly Scala backend. I much prefer it to Protobuf.
We send Thrift payloads over HTTPS and WebSockets. Once you have defined your wire communication protocol (i.e. frame structure) in Thrift, it's very easy to evolve your APIs.
However, on iOS in particular there are some implementation issues. The current version of the library is quite poorly packaged, and if you hope to build an Objective-C framework (e.g. for iOS 8+), then you will not be able to do so out of the box with v0.9.2. This is because the library headers use local imports (#import "TProtocol.h" instead of #import <Thrift/TProtocol.h>) with no umbrella header. Worst of all, the Thrift compiler generates very messy Objective-C classes, which also use local imports from the Thrift library.
Some of these issues are pretty damning. It indicates to me that while use of an IDL is very much a good engineering decision, not many iOS teams are using Thrift, unless they're huge with the resources to write their own library.
I've always disliked frameworks that use a common interface definition to build out both server and client code. It keeps both sides too much in lockstep, when in reality a server API must stay very flexible about the versions of the clients communicating with it.
There are helpful libraries that make JSON or plist communication over HTTP pretty easy, and there are decades of experience in debugging and understanding the HTTP protocol and how to use it well. Ignore that at your peril.
I have used Thrift's Objective-C bindings for a large iPhone app with a few million users. As one of the posters mentioned, we can use HTTP, which gets the best of both worlds. However, there is no asynchronous HTTP client for Thrift, so we had to build an event-based wrapper to allow non-blocking I/O calls (roughly the shape sketched below). The underlying layer still issues one call at a time, which hit us in a big way, because we have one server call that takes a long time but does not block UI flow, and another really fast one that does block UI flow. If the underlying layer is busy with the slow command, our fast command just has to wait. I am trying to build async HTTP in C++ which can then be used on the iPhone, but that is some way off from being ready.
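To illustrate the event-based wrapper idea (the poster's wrapper was Objective-C; this sketch is in Java for consistency with the other examples here, and every name is invented): the blocking call is pushed onto a background worker and the result is delivered through a callback. A single worker reproduces the head-of-line blocking described above, so calls that must not queue behind a slow request need their own worker and underlying connection.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncSyncClient {
    // One worker means calls are serialized: a slow call delays a fast one,
    // exactly the problem described above. Use separate workers (and separate
    // underlying client connections) for calls with different latency needs.
    private final ExecutorService worker = Executors.newSingleThreadExecutor();

    public interface Callback {
        void onResult(List<String> items);
        void onError(Exception error);
    }

    public void pullAsync(final long sinceVersion, final Callback callback) {
        worker.submit(new Runnable() {
            public void run() {
                try {
                    callback.onResult(blockingPull(sinceVersion));
                } catch (Exception e) {
                    callback.onError(e);
                }
            }
        });
    }

    // Stands in for the synchronous call on the generated Thrift client.
    private List<String> blockingPull(long sinceVersion) throws Exception {
        throw new UnsupportedOperationException("blocking Thrift call goes here");
    }
}
```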
Thrift as an external API doesn't make sense. Use it internally and rock and roll.