I'm looking for an efficient means of copying an online file to the local iPhone. Something like rsync would be an ideal technology to use, since it transfers only the bytes that differ; does anyone have experience with anything that fits this description?
What I'm specifically trying to accomplish is the transfer of an SQLite database, hosted via HTTP, to the iPhone. I expect the database to grow over time, hence my search for an efficient means of transferring the data.
I would have the app query the server for the changes, then download and store them in the local database. Treating the database as an opaque file results in a partially downloaded database that is useless, and partial downloads can happen quite frequently on mobile devices.
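One common way to get rsync-like behavior at the application level is to have the server expose a change feed keyed by a monotonically increasing revision number, and have the app apply only rows newer than the last revision it synced. The sketch below is a minimal, hypothetical illustration of the client side: the `items`/`sync_state` schema and the `(rev, id, name)` change tuples stand in for whatever your real endpoint would return.

```python
import sqlite3

def open_db(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("CREATE TABLE IF NOT EXISTS sync_state (last_rev INTEGER)")
    return conn

def last_synced_rev(conn):
    # sync_state holds the highest revision already applied locally.
    row = conn.execute("SELECT last_rev FROM sync_state").fetchone()
    return row[0] if row else 0

def apply_changes(conn, changes):
    """Apply server-side changes newer than our last synced revision.

    `changes` stands in for what a hypothetical GET /changes?since=<rev>
    endpoint would return; each entry is (rev, item_id, name). Rows with
    rev <= last_rev are skipped, so an interrupted batch can simply be
    retried from the top without corrupting anything.
    """
    last = last_synced_rev(conn)
    max_rev = last
    for rev, item_id, name in changes:
        if rev <= last:
            continue  # already applied on a previous sync
        conn.execute(
            "INSERT OR REPLACE INTO items (id, name) VALUES (?, ?)",
            (item_id, name),
        )
        max_rev = max(max_rev, rev)
    conn.execute("DELETE FROM sync_state")
    conn.execute("INSERT INTO sync_state (last_rev) VALUES (?)", (max_rev,))
    conn.commit()
```

Because each batch is applied inside a transaction and the high-water mark is only advanced on commit, a dropped connection mid-download leaves the local database consistent, which is exactly the failure mode the opaque-file approach cannot handle.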
Related
I've made an enterprise universal (iPhone/iPad) app that uses the OData SDK to talk to a SQL Server database. All great, works fine. The issue (well, not really an issue, but somewhat of a limitation) is that it is a connected solution: if you want to retrieve or update information, you have to be connected to the internet.
I'm now trying to add disconnected functionality. My question is: is there a way of saving large (relatively large) amounts of serialized data to the actual device? I don't want to keep it in the application itself because
it will build up memory in the app really quickly
if the app crashes, the data will be lost
Any ideas on how I can go about this?
Use Core Data... Apple has very good documentation, check it out!
It sounds like you want to synchronize data between an online system and the device. Synchronization is a very difficult problem to get working perfectly. If your web service is REST compatible I would look into RestKit as a solution for communicating with your online system and storing that data in Core Data locally on the iPad.
Ultimately you want to replicate the online data in a local database; I would recommend Core Data for this if it is a large amount of data. Core Data is a complex framework in its own right and will require a good amount of understanding.
There are a lot of things to consider:
How much data is needed offline?
Can you perform delta syncs?
How is data moved around, and what steps are necessary in the local database when moves occur remotely?
How do you detect that a record was deleted online and therefore needs to be deleted locally?
Can your users edit existing data while offline?
What do you do about merge conflicts (the same record edited both online and offline)?
You will want to think through all of these scenarios.
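The deletion question in particular trips people up: if the server just removes a row, the client cannot tell a deleted record from one it never downloaded. The usual fix is tombstones, where the server keeps a stub marked deleted. The sketch below is a toy, schema-agnostic illustration of applying a remote snapshot with tombstones and a last-writer-wins conflict policy (the simplest and lossiest option); the record shape is invented for the example.

```python
def merge_remote_state(local, remote):
    """Merge remote records into a local dict keyed by record id.

    Each remote record is a dict like {"id": ..., "updated_at": ...,
    "deleted": bool, ...} -- a hypothetical wire format. A tombstone
    ({"deleted": True}) removes the local copy; otherwise the newer
    `updated_at` wins (last-writer-wins, which silently drops the
    loser's edit -- fine for some apps, unacceptable for others).
    """
    for rec in remote:
        rid = rec["id"]
        cur = local.get(rid)
        if cur is not None and cur["updated_at"] >= rec["updated_at"]:
            continue  # local copy is same age or newer: keep it
        if rec.get("deleted"):
            local.pop(rid, None)  # tombstone: delete locally too
        else:
            local[rid] = rec
    return local
```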
There are some systems out there that can handle some of this for you. If your online system is also in development, you may consider Couchbase, which has a mobile counterpart that handles this tricky synchronization problem for you.
Good luck!
Core Data is the way to go.
http://developer.apple.com/library/mac/#referencelibrary/GettingStarted/GettingStartedWithCoreData/_index.html#//apple_ref/doc/uid/TP40005316
I keep a content revision history for a certain content type. It's stored in MongoDB, but since the data is not frequently accessed I don't really need it there taking up memory; I'd rather put it in a slower, disk-based database.
Which database should I put it in? I'm looking for something really cheap, with cloud hosting available, and I don't need speed. I'm looking at SimpleDB, but it doesn't seem very popular, and an RDBMS doesn't seem easy enough to work with since my data is structured as documents. What are my options?
Thanks
Depends on how often you want to look at this old data:
Why don't you mongodump it to your local disk and mongorestore it when you want it back?
Documentation here
OR
Set up a local mongo instance and clone the database using the information here
Based on your questions and comments, you might not find the perfect solution. You want free or dirt cheap storage, and you want to have your data available online.
There is only one solution I can see feasible:
Stick with MongoDB. SimpleDB does not allow you to store documents, only key-value pairs.
You could create a separate collection for your history and use a cloud service that gives you a free tier. For example, http://MongoLab.com gives you a 240 MB free tier.
If you exceed the free tier, you can look at discarding the oldest data, moving it to offline storage, or start paying for what you are using.
If your data grows a lot, you will have to decide whether to pay for it, keep it available online or offline, or discard it.
If you are dealing with a lot of large objects (BLOBs or CLOBs), you can also store the non-indexed data separately from the database. This keeps the database both cheap and fast; the large objects can be retrieved from any cheap storage when needed.
Cloudant.com is pretty cool for hosting your DB in the cloud; it uses BigCouch, which is a NoSQL store. I'm using it for a social site I have in the works, since CouchDB (and BigCouch) has an open-ended structure and you talk to it via JSON. It's pretty awesome stuff, though it's weird at first to move from SQL to Map-Reduce; once you do, it's worth it. I did some research because I've been a .NET guy for a long time but am moving to Linux and Node.js, partly out of boredom and the love of JavaScript. These things just fit together: Node.js is all JavaScript on the back end and talks seamlessly to CouchDB, and the whole thing scales like crazy.
I've tried to google this for a couple of days and I am still pretty confused, so I thought I would try here.
I have an iPhone app that uses Core Data with a SQLite store. I am trying to implement a simple backup/restore of the database with Dropbox.
I downloaded the Dropbox SDK, and I have everything running fine as far as linking, uploading and downloading my .sqlite file.
However, I don't want users to have access to the actual .sqlite file for security purposes. I have been seeing JSON on these boards for some time now so I decided to look into it. I think it is exactly what I need.
Easier said than done. I have never worked with Java and have never implemented anything like JSON before, so I have had to try to figure out where to start.
I understand basically what is going on, but I'm having a heck of a time figuring out how to do it. I think I found a way to get the Core Data model into JSON format (and I do use the term 'think' loosely here). But then what? What exactly do I upload to Dropbox? Do I somehow combine the model (in JSON format) and the database? I'm sorry if this seems obvious to most; it really is not obvious to me, and I have looked.
I am willing to do the work, but it just seems like I could go in 90 directions without some basic guidance and a start. I am not trying to do anything fancy as far as determining data that has been changed, etc. - just want to backup/restore the whole database. I just need some basic explanation and to be pointed in the right direction. A simple core-data sample project would be tremendous.
I'm not an experienced programmer, but I am a fast learner. Just break it down easy...
Thanks in advance.
JPK
Thanks, Andrew. I didn't want to 'give away' the database structure of my app, but I can now see that the JSON string wouldn't be much better than the SQLite file in that area. I am a self-taught programmer (stay-at-home mom), so this is all pretty new to me. Maybe I want encryption? But is that allowed for iPhone apps anyway? I recall being asked about encryption when I have uploaded binaries. I know that iCloud is coming out soon, and I do plan to implement that as well, but with the limited amount of data that can be synced for free, I want to be able to do a simple backup too. Many of my users have asked for it: a backup in addition to that of iTunes, which really is not a great backup, since you can't restore data for just one app (you would have to restore all apps on the iDevice). Any suggestions as to how to upload the file in such a way that it is not easily readable? Is encryption the way to go in this situation?
JPK,
I think you're crossing multiple streams here.
JSON is a data transfer format. IOW, it has almost nothing to do with the architecture of what you are attempting. You will almost certainly use JSON to communicate with Dropbox.
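To make the "transfer format" point concrete: converting a store to JSON is just serializing its rows; the schema, engine, and architecture are untouched. Here is a minimal, hypothetical sketch of dumping one SQLite table to a JSON string (the table name and columns are illustrative). In the Dropbox scenario, this string is what you would write to a file and upload instead of the raw .sqlite store.

```python
import json
import sqlite3

def export_rows_as_json(conn, table):
    """Serialize one table's rows as a JSON array of objects.

    Column names are read from SQLite's table metadata so each row
    becomes a self-describing object. Note the output is no more
    secret than the .sqlite file was -- anyone can read JSON.
    """
    cols = [c[1] for c in conn.execute(f"PRAGMA table_info({table})")]
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    return json.dumps([dict(zip(cols, r)) for r in rows])
```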
I have a question: why do you think that the user won't have access to any data you send to Dropbox? I suspect that you are probably wrong. The user will have access to everything.
You are doing an awful lot of work. You already have the .sqlite file being persisted to Dropbox. If you are doing this to make a backup, that data is being backed up in many other venues; in other words, your task is likely moot and not worth your time.
I use JSON, REST networks and Core Data daily. If you have a specific question, I am happy to answer it.
Andrew
Instead of saving the whole database, save out a plist file to Dropbox that you can rebuild the database from. That's assuming you have a lot of extra stuff in your database you do not want the user to see; otherwise, just back up the DB as-is if it's all user-generated data.
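The plist idea is just export/import: pull only the user-visible records out of the store, serialize them, and rebuild on restore. A minimal sketch of the round trip, using Python's plistlib to keep it self-contained (on iOS you would use NSPropertyListSerialization); the record shape and the version field are invented for the example.

```python
import plistlib

def rows_to_plist_bytes(rows):
    """Serialize user-visible records as a plist for backup.

    `rows` stands in for whatever user-generated data you extract from
    the store; proprietary bits (internal IDs, derived columns, schema
    details) are simply never exported, which is the whole point of
    backing up an export rather than the .sqlite file itself.
    """
    return plistlib.dumps({"version": 1, "records": rows})

def rows_from_plist_bytes(data):
    """Recover the records on restore; the app then rebuilds its
    database from them through its normal insert path."""
    return plistlib.loads(data)["records"]
```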
You could also encrypt it, but why? Adding encryption only means you have to answer "yes" when asked whether you use encryption, and you may not be able to sell in certain countries. Should you choose to encrypt, though, it's not forbidden.
I would like to get feedback from all you seasoned developers as to what methodology would be the more "correct" or "efficient" way of implementing my solution.
I have a 4.5 MB flat file of some 16,000 rows with 13 columns. I know I can import this into SQLite and create my data model, but would it be more efficient to use this file locally on the iPhone or to have the application read the data from a web service?
Thanks.
If you are not going to update the data (or will only update it when you update the app), a local SQLite DB is going to be simpler and more responsive. You would probably be even better off importing the data into Core Data; that way you won't need to manipulate SQLite directly or deal with things like synchronous read APIs.
If you want the app to be able to download updated data, the choice becomes a lot more difficult, depending on the quantity of data, the frequency of updates, how large the changes tend to be, etc.
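The one-time import itself is trivial; a sketch of loading a headered, comma-delimited flat file into a local SQLite table (column names and the `records` table name are taken from the file / invented for the example; everything is stored as TEXT for simplicity, so tighten the types for real use):

```python
import csv
import io
import sqlite3

def import_flat_file(conn, text, table="records"):
    """Load a delimited flat file into a SQLite table in one pass.

    Done once at build time or first launch, so the app never has to
    re-parse 16,000 rows; after this, lookups are indexed SQL queries.
    """
    reader = csv.reader(io.StringIO(text))
    header = next(reader)  # first row names the columns
    cols = ", ".join(f'"{h}" TEXT' for h in header)
    conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
    marks = ", ".join("?" for _ in header)
    conn.executemany(f'INSERT INTO "{table}" VALUES ({marks})', reader)
    conn.commit()
```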
A local database should always be more efficient, in terms of user experience, than a web service.
I'd use both.
A remote source allowing for a dynamic datastore, plus a local datastore with local caching, seems like a pretty safe bet.
As for the web service: unless there is server-side-only business logic, maybe give a cloud solution a try. Something like Amazon's SimpleDB comes to mind.
It of course really depends on how static your data is. As everyone has already mentioned, if you don't need many updates, the most effective solution is a local datastore alone.
Cheers
I guess it depends a bit on how much of the data you need at any one time. If your users need to download a lot of data just to use your application, that would make your app potentially very slow and also unusable without a network connection.
How often do you need to update the data? Frequent updates would favour a web service solution. Otherwise you'd need to update your app and resubmit it every time a bit of your data changes.
Another thing to think about: how much do you pay for web traffic on your website? It could become quite expensive if a lot of users constantly download data. Unless you use some kind of subscription, you only get paid once, when you sell the app.
Personally, I'd probably lean towards putting the data on the phone and not using a web service.
I am working on the development of an application that performs online backup of files and folders on a PC, automatically or manually. Previously, I kept only the latest version of each file on the server. Now I have to implement versioning, so that only the changes are transferred to the online server, and the user must be able to download any available version of a file from the backup server.
I need to perform deduplication for this. I am able to do it using a fixed block size, but I am facing the overhead of transferring a file of CRC information with each version's backup.
I have never worked with this technology, so I lack experience. I am eager to know whether there is any feasible method to embed this functionality in the application without much pain. Would any third-party tool help do the same thing? Please let me know.
Note: I am using the FTP protocol to transfer the data.
There's a program called dump that does something similar, but it operates on filesystem blocks rather than files. rsync may also be of interest.
You will need to keep track of a large number of blocks with multiple versions and how they fit into the various versions of the original files, so you will need some kind of database to track this information, and an efficient way to query it to determine which blocks in a given file need to be transferred. Also note that adding something to the beginning of a file will cause all your blocks to be "new" if you use a naive blocking and diff scheme.
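The shift-after-prepend problem is solved by content-defined chunking: choose block boundaries from the data itself rather than from fixed offsets, so an insertion only disturbs the chunk it lands in. The sketch below uses the simplest possible content-defined rule, splitting at an anchor byte (real systems such as rsync-style tools use a rolling fingerprint over a sliding window with a boundary test like hash(window) % target == 0, but the key property is the same); the anchor value and record shapes are invented for the example.

```python
import hashlib

def chunk_at_anchors(data, anchor=0x0A):
    """Split data into chunks ending at each occurrence of `anchor`.

    Boundaries depend on content, not offsets: prepending bytes only
    changes the first chunk, whereas a fixed-block scheme would shift
    and invalidate every block after the insertion point.
    """
    chunks, start = [], 0
    for i, b in enumerate(data):
        if b == anchor:
            chunks.append(data[start:i + 1])
            start = i + 1
    if start < len(data):
        chunks.append(data[start:])
    return chunks

def changed_chunks(known_keys, data):
    """Return (keys_to_upload, full_key_list) for one new version.

    `known_keys` is the set of chunk hashes the server already holds
    (the database mentioned above); only unseen chunks are transferred,
    and the full key list is stored as this version's recipe.
    """
    keys = [hashlib.sha256(c).hexdigest() for c in chunk_at_anchors(data)]
    return [k for k in keys if k not in known_keys], keys
```

Restoring any version is then just fetching its recipe's chunks in order and concatenating them, which gives per-version download without storing each version whole.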
To do this well will be very complex. I highly recommend you thoroughly research already-available solutions, and if you decide you need to write your own, consider the benefits of their designs carefully.