I have a DB and a web server on the same host. I have an iPhone app that sends XML to the web server; the web server queries the database and returns data to the app.
I'm not sure whether this is the fastest approach.
Are there faster alternatives?
In client-server programs, the biggest bottleneck is usually network latency, unless you are doing very complicated, time-consuming processing on the server side or you have an enormous amount of data in your RDBMS that you need to search through.
There are a couple of things (broader in scope than just XML) that you can try to speed up loading:
If you feel the database is the bottleneck, you can try caching objects in front of the database (look at memcached and the like). This reduces database hits, so retrieval will be faster.
Use compression for the XML, or use a more compact notation like JSON or YAML. In general, for web apps:
Reuse and optimize (compress) images; use more CSS and fewer images wherever you can.
Minify components like CSS and JS.
Don't load everything at once, if not needed.
You can look into XML compressors or, if you can, switch to a more compact notation like JSON or YAML. But I guess it will be hard to change the data format if you have already built the app.
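If switching the payload to JSON is an option, parsing it on the iPhone side only takes a few lines. A minimal sketch, assuming the web server can return a JSON array of records; the URL and key names here are made up, and a real app would fetch the data asynchronously:

#import <Foundation/Foundation.h>

// Hypothetical endpoint returning e.g. [{"id": 1, "name": "..."}, ...]
NSURL *url = [NSURL URLWithString:@"http://example.com/api/items"];
NSData *data = [NSData dataWithContentsOfURL:url]; // synchronous, for brevity only

NSError *error = nil;
NSArray *items = [NSJSONSerialization JSONObjectWithData:data
                                                 options:0
                                                   error:&error];
if (items == nil) {
    NSLog(@"JSON parse failed: %@", error);
} else {
    for (NSDictionary *item in items) {
        NSLog(@"id=%@ name=%@", item[@"id"], item[@"name"]);
    }
}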
XML isn't the world's most compact format, so you could obviously speed things up by reducing the amount of data that you send and receive. If you can talk to the database directly, you can cut out the web server, which would surely speed things up too.
The thing is, though, that XML and HTTP are standards, and there's a lot of value in that. Is the small increase in speed that you get from a purpose-built custom protocol really worth the loss of flexibility and extra development time?
Related
I am starting work on an application that consists of multiple components. Some of them reside on the server side and some on the client side. I have most of it figured out, but I can't decide whether to use a human-readable protocol for communication between the server and the client, or whether a non-human-readable format would be better.
So far, every scenario could be handled by transferring only text between the server and the client. But if I take future expansion into account, I may need to transfer non-text data as well (hypothetical example: images).
For the actual communication channel I will use standard sockets (no REST, WebServices or anything like this), so I will have a lot of flexibility.
Both human-readable and non-human-readable have advantages and disadvantages, so I am unsure which path to take.
I'm a web developer and I have a strong preference for starting with human-readable protocols, for example JSON. They have the advantage of being easy to debug and easy to prototype. They also let you go after the low-hanging fruit, such as the rough system architecture and the major bottlenecks, without having to do a mental translation at every step.
If you later discover that the human-readable format is a major bottleneck, you can address it then. I suspect you'll very often have other things to optimize first.
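As an illustration of how cheap a human-readable protocol can be to produce, here is a minimal sketch (in Objective-C, to match the other snippets in this thread) that frames a JSON message with a 4-byte length prefix before writing it to a socket stream. The outputStream and the message contents are placeholders for whatever your components actually exchange:

// Sketch: send one length-prefixed JSON frame over an already-open NSOutputStream.
NSDictionary *message = @{ @"type": @"hello", @"version": @1 };

NSError *error = nil;
NSData *payload = [NSJSONSerialization dataWithJSONObject:message options:0 error:&error];

if (payload != nil) {
    // 4-byte big-endian length prefix so the receiver knows where each frame ends.
    uint32_t length = CFSwapInt32HostToBig((uint32_t)payload.length);
    NSMutableData *frame = [NSMutableData dataWithBytes:&length length:sizeof(length)];
    [frame appendData:payload];
    [outputStream write:(const uint8_t *)frame.bytes maxLength:frame.length];
} else {
    NSLog(@"Could not encode message: %@", error);
}

The payload stays human-readable (you can dump it straight to a log while debugging), while the tiny binary prefix keeps framing unambiguous even if you later mix in non-text payloads.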
I'd like to make an application that stores data on the device, so I need some kind of storage. There are two ways I could store my data:
the first is saving it to the file system in an XML structure, and the other is using the SQLite library. I can't decide, because I don't know the strengths and weaknesses of each, for example where the stored data lives and which one is faster at retrieving the data I want. I'm only storing text data, so please recommend which one is better.
Sorry for my English. Thank you!
SQLite !!! No doubt about that.
For easy maintenance, querying capability and a lot more.
SQLite has the basic features of a database engine and is lightweight, which makes it a good fit for mobile platforms; some features, such as regular expressions in queries, are not included.
Between a file and a database, the preference almost always goes to the database.
DATABASE ALWAYS...
As you said, SQLite is one option, and you can use SQLite directly on iOS. You may also have heard of Core Data, a higher-level wrapper (backed by SQLite by default) that makes working with the database easier.
Either way, there is really no comparison between a database and flat files.
To answer your question directly: for iOS I would suggest SQLite as the best option, since it keeps everything in one file,
and the performance loss as the cache gets bigger is lower than with XML.
It depends on the amount of data (and its structure and complexity) that you are going to store. For rather small amounts, no more than a handful of records or two, I would use an NSDictionary and the load and save methods that come with it. That persistent storage will be in XML (a property list), by the way.
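For the small-data case, a minimal sketch of that NSDictionary approach (the file name and contents are just examples):

// Build a path in the app's Documents directory.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"notes.plist"];

// Save: NSDictionary serializes itself as an XML property list.
NSDictionary *data = @{ @"title": @"Shopping", @"body": @"Milk, eggs" };
[data writeToFile:path atomically:YES];

// Load it back later.
NSDictionary *loaded = [NSDictionary dictionaryWithContentsOfFile:path];
NSLog(@"Loaded: %@", loaded);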
For more data, or rather complex data, especially when you do not need to load all of it at the same time, you should go for a database solution. That could be SQLite, or you may opt for Core Data, which (by default) is based on SQLite.
I would suggest Core Data (unless you are aiming for portability, such as to Android). You will have to invest in learning Core Data, which may seem confusing at first, but once you have started you will find it quite convenient. It takes over a lot of the work that you would otherwise have to do manually with native SQLite.
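For comparison, this is roughly what the manual work with native SQLite looks like (the table and column names are made up, and path points somewhere in the app's sandbox); Core Data or a wrapper hides most of this:

#import <sqlite3.h>

// Open (or create) the database file.
sqlite3 *db = NULL;
if (sqlite3_open([path UTF8String], &db) == SQLITE_OK) {
    const char *sql = "SELECT id, title FROM notes WHERE title LIKE ?";
    sqlite3_stmt *stmt = NULL;
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) == SQLITE_OK) {
        // Bind the search pattern, then walk the result rows.
        sqlite3_bind_text(stmt, 1, "%milk%", -1, SQLITE_TRANSIENT);
        while (sqlite3_step(stmt) == SQLITE_ROW) {
            int noteId = sqlite3_column_int(stmt, 0);
            const unsigned char *title = sqlite3_column_text(stmt, 1);
            NSLog(@"%d: %s", noteId, title);
        }
        sqlite3_finalize(stmt);
    }
    sqlite3_close(db);
}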
Hi, thank you in advance for the help.
I have looked at some of the posts and I am a bit confused about multithreading. It seems that it may be pretty easy, but I am very new to programming, so I am still trying to grasp it.
These are two calls that pull data from a database, and they take forever as it is, so I'm thinking about multithreading them until I can learn how to set up Core Data for this. Right now I am using SQLite and the database holds 10,000+ recipes, so it's not as lightning fast as I would like.
Please let me know what you think, and how I can make these run simultaneously (if that's even possible).
Thank you in advance.
// requestCount tracks how many DataPuller requests are still outstanding.
requestCount++;
// Kick off the first fetch via the shared DataPuller singleton (my own class),
// then point its callback target at this controller.
[[DataPuller sharedDataPuller] getAllDeletedRecipeList];
[DataPuller sharedDataPuller].target = self;

requestCount++;
// Second fetch: the full recipe list.
[[DataPuller sharedDataPuller] getAllRecipesList];
[DataPuller sharedDataPuller].target = self;
If you're using SQLite, you might want to contemplate FMDB, which (a) gets you out of the weeds of raw sqlite3 calls; and (b) offers FMDatabaseQueue, which lets you coordinate queries from multiple queues so that the data operations don't stumble over each other.
Having said that, you suggest that you're having significant performance issues which you're hoping to solve with a shift to Core Data or by going multi-threaded with SQLite. That's unlikely. Poor performance of local database operations is generally more a matter of application design. For example, it's unlikely to be wise to retrieve the full details for all 10,000 recipes; you probably want to retrieve just the unique identifiers, perhaps only those required for the current screen, and only fetch the particulars for a given recipe later, as you need them.

For local database interaction you rarely have to contemplate a multithreaded implementation; rather, design the system so that at any given point it retrieves the least information needed for the presentation. I personally find that my database-driven apps generally only need to go multithreaded when I'm doing extensive interaction with some remote web service (in which case it's the retrieval of server data and the parsing of it that goes on a separate thread, not necessarily the database operations themselves).
If you're having performance issues with your recipe app, I'd suggest you post a far more detailed question, with code samples, that articulates your particular performance problem. I wouldn't be surprised if multi-threading were not part of the solution; appropriate use of indexes and more judicious retrieval of information at any given point are likely to matter more.
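A minimal sketch of the FMDatabaseQueue idea; dbPath, the table and the column names are placeholders, not your actual schema:

#import "FMDB.h"

// One shared queue serializes access to the database from any thread.
FMDatabaseQueue *queue = [FMDatabaseQueue databaseQueueWithPath:dbPath];

[queue inDatabase:^(FMDatabase *db) {
    // Fetch only the lightweight fields needed for the list screen.
    FMResultSet *rs = [db executeQuery:@"SELECT id, name FROM recipes ORDER BY name"];
    while ([rs next]) {
        NSLog(@"%d %@", [rs intForColumn:@"id"], [rs stringForColumn:@"name"]);
    }
    [rs close];
}];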
Get records from the database in pages, i.e. 20 or 50 recipes per page. Have a look at the YouTube app on the iPhone, or at my app, HCCHelper.
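A paged fetch is just a LIMIT/OFFSET on the query. Sketched here with FMDB, reusing the queue from the previous snippet; the table name, page size and pageIndex variable are assumptions:

// Fetch one page of recipes (pageIndex starts at 0).
NSInteger pageSize = 50;
NSInteger offset = pageIndex * pageSize;

[queue inDatabase:^(FMDatabase *db) {
    FMResultSet *rs = [db executeQuery:@"SELECT id, name FROM recipes ORDER BY name LIMIT ? OFFSET ?",
                       @(pageSize), @(offset)];
    while ([rs next]) {
        NSLog(@"%@", [rs stringForColumn:@"name"]);
    }
    [rs close];
}];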
How can I check with Perl whether a proxy is online and working properly? I was considering running a GET request and comparing the output, but I'll be running this check so frequently that the overhead would be huge. Is there a more lightweight alternative?
No, this is exactly how you do it. If you use a light-weight method such as HEAD, TRACE or OPTIONS instead, you cannot know whether the proxy is actually useful or censoring or even subtly subverting the unencrypted data.
You can keep the overhead small by testing against a minimal useful HTML document.
Like daxim said, I think that testing against a very small HTML document is going to be lightweight enough for most scenarios.
The most lightweight solution would be to use a web service that responds with minimal data about your proxy's IP address: whether it is online, whether it is working fast enough, and so on. This of course brings in a third party (which would be doing the not-so-light work of making the requests to all the proxies), and, like everything, it has its pros and cons.
I use this proxy checker from Google Code to do exactly what you need, and I also get some extra information about each IP address, such as the country and a couple of proxy speed measurements. It is a very simple script that consumes a web service from http://proxyipchecker.com/ .
PS: The example is in PHP, but it is trivial to do the same in Perl.
I'm developing an iPhone application that will access XML files (or something similar) from a server. I want to convert that data into a slick, native UI on the iPhone. With my current knowledge I could already do this by loading the files, parsing them, writing custom code to fill in data structures and convert the data into user interface elements. However, since I know this is a common problem in iPhone development, I'm inclined to think that there is a simpler method that could abstract some of the process.
What's the best and most appropriate way to write a hybrid app without reinventing the wheel?
There are a few ready-made abstractions, but most of them focus on XMLRPC, which tends to get a bit clunky.
Your best bet is probably to write an NSXMLParserDelegate, which is straightforward enough; and then simply create your parser using -initWithContentsOfURL:. With this method, loading the XML files and parsing them becomes one step; and you create your data structures as you go. Conversion to UI elements happen with the usual abstraction mechanisms (dataSource and delegate).
This frees you from the constraint of an externally imposed XML schema, but it presumes that your XML files are relatively lightweight, or there may be (significant) interface lag. It can be sensible to load the XML on a separate thread and reload your interface as more data becomes available (though not too often), especially if the files are more than a couple of KiB each.
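A minimal sketch of that approach, assuming a feed of <item title="..."> elements; the URL and element names are made up:

@interface FeedParser : NSObject <NSXMLParserDelegate>
@property (nonatomic, strong) NSMutableArray *titles;
@end

@implementation FeedParser

- (void)load {
    self.titles = [NSMutableArray array];
    NSURL *url = [NSURL URLWithString:@"http://example.com/feed.xml"];
    // Loading and parsing happen in one step.
    NSXMLParser *parser = [[NSXMLParser alloc] initWithContentsOfURL:url];
    parser.delegate = self;
    [parser parse];
}

// Called once per element as the parser walks the document;
// build your data structures here as you go.
- (void)parser:(NSXMLParser *)parser didStartElement:(NSString *)elementName
  namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName
    attributes:(NSDictionary *)attributeDict {
    if ([elementName isEqualToString:@"item"]) {
        [self.titles addObject:attributeDict[@"title"] ?: @""];
    }
}

@end

The titles array can then back a UITableView's dataSource directly.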
Edit: A few notes: on the whole you want to avoid UIWebView if you are doing anything even remotely complex. It's not as fast as native controls, and the look and feel of the resulting application is usually just a little off.
Also, it sounds to me like what you want is more or less an XML-file => UITableView type application, or at least something conceptually similar. This is really easy to build; the trickiest part is figuring out an XML format that contains the information you want without getting bloated. In fact, I'd recommend you start there; just an XML consumer and a Navigation Controller; using that as a starting point should let you check that your structure is sane and that the files aren't too big; which brings us to another problem with using UIWebView:
You have no control over the caching, especially if you process or get the files using JS. This is fine for most web browsing, where WebKit usually does the right thing, especially when faced with sane web server configurations (well, not the actual configs, but the practical results of the configuration: sane headers).
When you use a custom-built parser and caching system, you have a greater degree of control and a lot of tricks you can use to ensure that you never download more than you strictly need.
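For instance, one simple trick is to keep the downloaded XML in the Caches directory and only hit the network when there is no local copy (or when it is older than some cutoff you choose). A rough sketch, with the URL and file name made up:

// Cache the feed on disk; re-download only when nothing is cached.
NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
NSString *cachePath = [[dirs objectAtIndex:0] stringByAppendingPathComponent:@"feed.xml"];

NSData *xmlData = nil;
if ([[NSFileManager defaultManager] fileExistsAtPath:cachePath]) {
    xmlData = [NSData dataWithContentsOfFile:cachePath];
} else {
    xmlData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://example.com/feed.xml"]];
    [xmlData writeToFile:cachePath atomically:YES];
}

// Feed the cached or freshly downloaded data to the parser as before.
NSXMLParser *parser = [[NSXMLParser alloc] initWithData:xmlData];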