What's the proper file format for local data to implement a sync feature with a SQL database server? - postgresql

This question might be quite dumb, but I couldn't find anything related.
Let's say a user can create and modify a shopping list in an application; that data would be synchronized to the SQL server whenever the network is available, and vice versa.
What file format should the local data be stored in? Running a local SQL database server won't make much sense on a mobile platform.
Should I store everything in a JSON file locally, or is there a better approach?
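For example, I imagine the local file could be plain JSON where each item carries the metadata a sync pass would need (a stable id, a last-modified timestamp, and a dirty flag); these field names are just invented for illustration:
{
  "lastSyncedAt": "2019-05-04T10:15:00Z",
  "items": [
    { "id": "a1", "name": "Milk", "updatedAt": "2019-05-04T10:20:00Z", "dirty": true },
    { "id": "b2", "name": "Eggs", "updatedAt": "2019-05-03T18:02:00Z", "dirty": false }
  ]
}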

Related

What is the most lightweight way to get last modified datetime from an Aqueduct application?

I'm writing a very small REST API using Dart/Aqueduct hosted on Heroku utilizing PostgreSQL.
When communicating with this API, I need to fetch all the data and store it locally in the application. On reboot, the application will ask the API whether any data has changed in any of the databases, which will only be modified through this API.
My question is: how do I check whether data has been modified? Storing a timestamp in the Aqueduct channel is not viable, as Heroku will boot servers up as needed (and multiple at the same time), changing this modified datetime each time.
PostgreSQL cannot supply this modification datetime (https://dba.stackexchange.com/questions/58214/getting-last-modification-date-of-a-postgresql-database-table), so what do I do? Is the only way to do this to have a separate table storing this information? Can this be done in a more lightweight way, so I wouldn't have to query the databases when e.g. calling https://my-api.com/lastModified? Should I serve a static file that is written to on each data modification?
Maybe there exists a smart, lightweight solution!
Cheers :)
Yes, use a separate table to store the date; the ORM is already doing something like this to track migrations. I'm not sure what lightweight means in this context, but you won't run into any problems with this approach, whereas a file or in-memory storage won't work at all.
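For instance, the separate table can be maintained by a statement-level trigger, so your /lastModified endpoint becomes a single cheap lookup. A minimal sketch, shown here as plain SQL issued from Node's pg client purely for illustration (your stack is Dart/Aqueduct, but only the SQL matters, and every table and trigger name here is made up):

const { Client } = require('pg');

const setupSql = `
  CREATE TABLE IF NOT EXISTS last_modified (
    table_name  text PRIMARY KEY,
    modified_at timestamptz NOT NULL DEFAULT now()
  );
  CREATE OR REPLACE FUNCTION touch_last_modified() RETURNS trigger AS $$
  BEGIN
    INSERT INTO last_modified (table_name, modified_at)
    VALUES (TG_TABLE_NAME, now())
    ON CONFLICT (table_name) DO UPDATE SET modified_at = now();
    RETURN NULL;
  END;
  $$ LANGUAGE plpgsql;
  -- attach one trigger per table the API writes to, e.g. a hypothetical "items" table:
  DROP TRIGGER IF EXISTS items_touch ON items;
  CREATE TRIGGER items_touch AFTER INSERT OR UPDATE OR DELETE ON items
    FOR EACH STATEMENT EXECUTE PROCEDURE touch_last_modified();
`;

async function main() {
  const client = new Client(); // connection settings come from the PG* environment variables
  await client.connect();
  await client.query(setupSql);
  // the check endpoint never has to scan the data tables:
  const { rows } = await client.query('SELECT max(modified_at) AS last FROM last_modified');
  console.log(rows[0].last);
  await client.end();
}
main().catch(console.error);

Because the trigger fires once per statement, the bookkeeping costs one upsert per write, and it works no matter how many Heroku dynos are running, since the state lives in the database rather than in the process.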

Which kind of Google Cloud Platform mobile backend client is appropriate?

THE PROBLEM
I'm writing a mobile app which will allow a user to log in, save some preferences that must be stored in a database, and display congressional bills to the user.
I've only written simple RESTful services with PHP and MySQL in the past. I'd like to take advantage of newer technologies, and am a little lost on general direction.
The bill data (formatted as JSON) can be gathered by running the scrapers found here. Using Docker, I managed to set a working directory and download the files to my local machine.
I've designed a MySQL database for holding the relevant bill and user data.
I started to mess around in Google Cloud Platform and read the doc that describes the different computing models. I have a few different ideas, but I'm not familiar enough with GCP to know what I can actually accomplish.
QUESTIONS
1) What are App Engine, Compute Engine, and Container Engine each for? I get the gist that Container Engine holds different instances of stuff you load up with Docker, and that Compute Engine sets up a VM, but I don't really understand the relationships. How should I think of them?
2) When I run those scrapers from the shell, where are the files being stored, and how can I check on them? On my computer, I set a working directory, but how do directories work in GCP? Is it just a directory in the currently selected VM, or is this what Buckets are for?
IDEAS
1) Since my bill data already comes as JSON, should I skip the entire process of building a database for the bills and insert them into Firebase somehow? Is this even possible? If so, am I stuck using Firebase's NoSQL, or can I still set up a relational database?
2) I could schedule the scrapers to run periodically, detect new files, and run a script to parse the JSON and insert new bill data into a database (PostgreSQL? MySQL?). Then I would write an API.
3) Download the JSON files to a bucket, and write an API that reads from them. Not sure how the performance would compare to using a DB.
I'm open to other suggestions as well.
For your use case (a stateless web application), App Engine is probably your best choice. The Google documentation has several comparisons of your computing options.
You can use App Engine with PHP and cloud-hosted MySQL if you want, which could be a good way to get your toes wet without going in over your head.
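If you later go with idea (2) from your question, the import script is not much work either. Here is a rough sketch in Node.js using the mysql2 package; the file layout, table, and column names below are assumptions, so adapt them to the scrapers' actual output:

const fs = require('fs');
const path = require('path');
const mysql = require('mysql2/promise');

async function importBills(dir) {
  const conn = await mysql.createConnection({
    host: 'localhost', user: 'app', password: 'secret', database: 'bills',
  });
  // each scraper run drops one JSON file per bill into the directory
  for (const file of fs.readdirSync(dir).filter((f) => f.endsWith('.json'))) {
    const bill = JSON.parse(fs.readFileSync(path.join(dir, file), 'utf8'));
    // assumes a bills(bill_id, title, raw) table; the upsert makes reruns safe
    await conn.execute(
      `INSERT INTO bills (bill_id, title, raw) VALUES (?, ?, ?)
       ON DUPLICATE KEY UPDATE title = VALUES(title), raw = VALUES(raw)`,
      [bill.bill_id, bill.official_title, JSON.stringify(bill)]
    );
  }
  await conn.end();
}

importBills('./data/bills').catch(console.error);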

Local, file-based database for an Electron application

We are working on an application that will be offered both as a web-based and as a cross-platform desktop solution by means of Electron.
Due to customer requirements, the desktop client cannot make use of "the cloud" to store data; all data should be stored on the local machine or, even better, the user should have the option to keep the database/data file on an external HDD so that another user on the same local network can use the same data file.
We've been looking at NeDB, PouchDB, etc., but all of these use either Web SQL or IndexedDB in the browser itself to store the data.
NeDB can theoretically use the file system, but that seems only possible for Node Webkit apps.
Another option is of course MongoDB, but it requires setting up a server. Seeing as our users will set that up on their own machines, that will work for one user only but would make it very hard for them to share the data (note: assume users with little technical know-how).
Is there a way to force NeDB to persist data in a file instead of the in-browser database?
Alternatively, does any one know of a file-based, compact database that plays well with electron/node?
We'd preferably like to use a NoSQL database, but options of file-based SQL databases will be considered as well.
I have some experience with NeDB in an Electron app and I can say it will definitely work on the filesystem.
How are you initializing NeDB (or whatever your database choice is)? Also, are you initializing it in the main process or the renderer process? If you can share that, I think we can trace this to a configuration issue.
This is how you start NeDB with a persistent datastore that saves to disk:
// load NeDB and point it at a file on disk;
// autoload reads the datafile into memory as soon as the datastore is created
const Datastore = require('nedb');
const db = new Datastore({ filename: 'path/to/datafile', autoload: true });
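Once created that way, every write goes through the same datastore object and NeDB appends it to the datafile automatically. A short usage sketch with made-up documents:

// insert a document; NeDB persists it to 'path/to/datafile'
db.insert({ list: 'groceries', items: ['milk', 'eggs'] }, (err, newDoc) => {
  if (err) throw err;
  // query it back; results are served from the file-backed in-memory index
  db.find({ list: 'groceries' }, (err, docs) => {
    if (err) throw err;
    console.log(docs);
  });
});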
I think MongoDB is going to be overkill for an Electron app (it's really meant to be a high-performance, distributed database running in the cloud).
Another option you could consider is LevelDB (a key/value store that can persist to the filesystem), which is popular in the Node community. (EDIT 4/17/17: IndexedDB uses LevelDB under the hood, so if you go that route, you may as well just use that.)
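For comparison, here is the same kind of file-backed persistence with the level package; the path is made up, and this is the classic callback-style API:

// create (or open) a LevelDB directory on disk, storing values as JSON
const level = require('level');
const ldb = level('path/to/leveldb-data', { valueEncoding: 'json' });

ldb.put('list:groceries', { items: ['milk', 'eggs'] }, (err) => {
  if (err) throw err;
  ldb.get('list:groceries', (err, value) => {
    if (err) throw err;
    console.log(value); // { items: [ 'milk', 'eggs' ] }
  });
});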
One aspect I would definitely evaluate carefully is: how difficult is this database going to be to package and distribute? How do I integrate it into my build system? Level and NeDB can be included simply via npm install, and any native-code compilation is handled seamlessly with node-gyp, which is as simple as it gets. Bundling Mongo, by contrast, will require some work to get a working build for each platform.

Stream coredata to remote REST end-point

There are several apps that I use on my Mac that store their data in Core Data. I can see the data I want in CoreDataPro. I want that data; specifically, I want to send changes in it to some remote endpoints (such as Zapier, or some other REST service).
I was thinking of piggybacking on something like RestKit, such that I provide a configuration file saying where the app's data is and which endpoints the data needs sending to. I only need to scrape the data and send it to REST; I don't need a two-way sync.
I'd prefer a utility that I could configure rather than having to code a Mac application.
I noted http://nshipster.com/core-data-libraries-and-utilities/ - RestKit still seemed the most capable, but in https://github.com/RestKit/RestKit/issues/1748 I was advised that Core Data projects should only be opened by a single application at a time, and that RestKit is really designed for baking into the source app (rather than for database scraping and sending).
What approach would you take?
I also noted:
http://www.raywenderlich.com/15916/how-to-synchronize-core-data-with-a-web-service-part-1
Thanks, Martin.
First, Core Data is an object store in memory. What Core Data writes to disk can be in one of several formats, and one of those formats happens to be SQLite. If the application is writing to SQLite, then it is possible to sample that same file and push it somewhere else.
However, each application will have its own data structure so you would need to be flexible in the structure you are handling.
RestKit has no value in this situation, as you are just translating objects into JSON and pushing them to a server. The built-in frameworks do that just fine.
There is no utility to do this at this time. You would need to write it yourself or hire someone to write it.
If I were going to do something like this, I would write it using Core Data itself: interrogate the model from the application that wrote the data in the first place, then translate the database into JSON and push it. You won't be able to tell what is new vs. old, so the server will need to sort that out.
Another option, since you can't diff anyway, is to just push the SQLite file to the server and let the server parse through it.
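The translate-and-push variant is a small script. As a sketch in Node.js with better-sqlite3: the store path, the ZITEM table, and the endpoint URL are all assumptions (Core Data prefixes entity tables with Z, but the actual names depend on the app's model, so inspect the file first):

// open the app's store read-only so we cannot corrupt it
const Database = require('better-sqlite3');
const db = new Database('/path/to/Store.sqlite', { readonly: true });

// dump one entity table; column names are Core Data's Z-prefixed ones
const rows = db.prepare('SELECT * FROM ZITEM').all();

// push the whole snapshot; as noted above, the server must sort new from old
// (fetch is global in Node 18+)
fetch('https://hooks.zapier.com/hooks/catch/EXAMPLE', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(rows),
}).then((res) => console.log(res.status))
  .catch(console.error);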
Other answers might include:
use a middleware platform, e.g. rssbus.com (only the SQLite connections are free), to send the events
as my target system (http://easy-insight.com) actually has a transmitter that sends new records it sees from MySQL and PostgreSQL, I could migrate from SQLite to PostgreSQL (https://dba.stackexchange.com/questions/2510/tools-to-migrate-from-sqlite-to-postgresql) or use an ETL tool such as http://www.easyfrom.net (I did ask the vendor for SQLite support a long time ago, but SQLite is just not a priority for them).
I'm wondering whether a good answer (where "good" excludes Objective-C and includes languages that I do know, such as, to a limited extent, Ruby) is to use MacRuby and its Core Data libraries.
Core Data seemingly can be exposed as an Active Record. https://www.google.com/search?q=macruby+coredata , notably http://www.spacevatican.org/2012/1/26/seeding-coredata-databases-with-ruby/
However, MacRuby seems to have faded - https://github.com/MacRuby/MacRuby/issues/231 - it won't even compile on Mavericks.

Can Core Data use a Web Service as a persistence store?

I'm working on an app that uses Core Data, and I'd like to code it so that it can use either a local SQLite store or a web-based store (with an XML or JSON response schema).
Is it possible to use the exact same code for the Core Data stuff and just select the appropriate persistence store based on a user's preference?
Look at the WWDC video "Building a server-driven user experience".
You can connect to a remote store via a URL, but that doesn't sound like what you want, as it would mean a single store shared by every remote user.
Really, all you need to do is set up a regular SQLite store and then add a little code to send changes to the server via your chosen method. Then you can turn the server connection on and off as needed.
That would be simplest as long as you don't have a requirement that no data be persisted on the device itself.
In theory, yes. However, you would probably want to cache the data locally in case of network issues, etc.
Take a look at this project: https://github.com/AFNetworking/AFIncrementalStore. It doesn't exactly implement a web-service-backed NSPersistentStore, but it does try to achieve what you have in mind.