Database and application design to guarantee application availability when the internet connection is temporarily lost - PostgreSQL

I want to create a web application for restaurants, and for business-model reasons it should be an online web application (in the cloud). A restaurant can have an account on the app, create its own menu, and add its waiters and cooks.
The waiters should be able to access the menu and place orders at all times. The main issue I have to decide on is:
"How can I guarantee full-time availability to the waiters and the cook, even when the internet connection is lost for several seconds, several minutes, or even hours during the day?"
I was thinking of installing the app on a server in the restaurant's local network, which takes over from the cloud server when there is no internet connection, so that all orders are saved in the local server's DB. As soon as the connection is back, the local DB is synced to the cloud DB (I was told PostgreSQL might have extensions supporting this, via on-premise replication or something similar). That means local DB records should be pushed to the cloud DB.
Can someone give me a hint on what tech (open source, no enterprise solutions please) to use to accomplish the DB syncing when the internet goes on and off?
Am I on the right track, or completely off with what I suggested above?
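One common way to structure the "local server takes over, then pushes to the cloud" idea is an outbox: orders are always written locally first, queued, and flushed to the cloud whenever connectivity returns. Below is a minimal sketch of that queue in JavaScript; the `transport` function is a hypothetical placeholder for whatever actually uploads one order to the cloud DB (an HTTP call, a PostgreSQL foreign-data write, etc.), and is injected so the retry logic is independent of the transport.

```javascript
// Minimal outbox: orders are accepted locally even while offline and
// flushed to the cloud when the connection is back. `transport` is any
// async function that uploads one order and throws on failure.
class Outbox {
  constructor(transport) {
    this.transport = transport;
    this.pending = []; // orders not yet confirmed by the cloud
  }

  // Always accept the order locally, even while offline.
  enqueue(order) {
    this.pending.push(order);
  }

  // Try to push everything; stop at the first failure so the order
  // of operations is preserved for the next attempt.
  async flush() {
    while (this.pending.length > 0) {
      try {
        await this.transport(this.pending[0]);
        this.pending.shift(); // confirmed by the cloud, drop it
      } catch (err) {
        return false; // still offline; retry on the next connectivity event
      }
    }
    return true;
  }
}
```

In a real deployment the queue would live in the local PostgreSQL instance itself (e.g. an `outbox` table) rather than in memory, so queued orders survive a restart of the local server.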

Related

How do you sync many writable MongoDB databases?

What I mean by writable is that you can CRUD on each database, and it automatically syncs with the others so that all of them are in sync at all times (as much as possible).
I want to start a project for a company with some tricky requirements.
The company is present in many locations (at least 5) and wants the app to run locally (with a local database), but when there's a change (create, update, or delete), the change is propagated to the other databases.
The goal is to have them all synced at every moment, but if the internet connection is lost at one site, that site can continue to use the app properly, since it is actually connected to the local database. That's why they don't want a fully online database.
They use MongoDB.
I have looked at replica sets, but since they have a single master, that seems complicated for this use case.
Please can you share solutions to such a situation?
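Whatever database ends up doing the replication, it helps to be explicit about the merge rule when two sites have edited the same document while disconnected. The simplest rule is last-write-wins on a per-document modification timestamp. The sketch below is a toy illustration of that rule only; `_id` and `updatedAt` are assumed fields in the documents, not MongoDB features, and a real system would also need tombstones to propagate deletes.

```javascript
// Last-write-wins merge of two replicas of the same collection.
// Each doc carries `_id` and `updatedAt` (assumed fields). Deletes
// would need tombstone docs in a real multi-master setup.
function mergeReplicas(a, b) {
  const byId = new Map();
  for (const doc of [...a, ...b]) {
    const existing = byId.get(doc._id);
    if (!existing || doc.updatedAt > existing.updatedAt) {
      byId.set(doc._id, doc); // the newer write wins
    }
  }
  return [...byId.values()];
}
```

Last-write-wins silently discards the older of two concurrent edits, which is acceptable for some data (menu prices) and not for others (counters, inventories); that trade-off is worth deciding per collection before picking a sync technology.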

Dropbox app with tiered users

Preface:
I'm hoping to upgrade an existing application by adding cloud backup and syncing of the customers' data. We want this to be as seamless as possible, but also for the customer's only interface to the data to be the application's front-end interface.
Our application can be connected to the oil pipe of a machine and collects data on the oil condition. When a test has completed, we want to push this data to the cloud. Because of the distinct-test nature of the data (as opposed to one big trend), most IoT platforms don't suit it very well, so we're aiming to release a slightly modified version of the application which doesn't have the connection to the sensors; this will be our remote front-end.
Since the existing application uses a relatively simple file structure to store its data, if we simply replicate these files in the cloud, the remote front-end version can just download them to the same location and it'll work fine. This has led us to Dropbox (or any more appropriate cloud storage system you can recommend).
We hope to use the Dropbox API directly in our application to push and pull the files as necessary. All of this so far we believe is perfectly achievable.
Question: Is it possible - and if so, how would we go about it - to set up a user system with the requirements below?
The user's personal Dropbox is not used
Dropbox is completely hidden from the user
The application vendor has a top-level user who has access to all data (for analytics; we do not want to store confidential or sensitive data)
When a user logs in, they only have access to their folder, and an attacker could not disrupt the overall structure. (We understand that if an attacker got the master account then all is lost, but that is an internal issue to keep it secure. As long as the user accounts are isolated, this is okay.)
Alternative question: Is anyone aware of a storage or IoT system which would better suit this use case? We will still require backups/loss prevention as part of the service.
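For the push/pull part that is believed to be achievable, the upload is an ordinary HTTP POST against Dropbox's content endpoint. The sketch below builds such a request; the endpoint, headers, and `Dropbox-API-Arg` format follow the Dropbox v2 HTTP API, while the per-customer folder layout (`/customers/<id>/...`) is purely our own assumed convention for isolating accounts, not a Dropbox feature.

```javascript
// Build a Dropbox v2 file-upload request. The endpoint and headers are
// from the Dropbox HTTP API; the per-customer folder layout is our own
// assumed convention for keeping customers' files apart.
function buildUploadRequest(accessToken, customerId, fileName, bytes) {
  return {
    url: 'https://content.dropboxapi.com/2/files/upload',
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Dropbox-API-Arg': JSON.stringify({
          path: `/customers/${customerId}/${fileName}`,
          mode: 'overwrite', // replace the file on re-upload
        }),
        'Content-Type': 'application/octet-stream',
      },
      body: bytes,
    },
  };
}

// Usage (not executed here):
// const { url, options } = buildUploadRequest(token, 'c42', 'test1.dat', data);
// fetch(url, options);
```

Note that folder-per-customer isolation under one master token only enforces isolation in your own application code; for isolation enforced by Dropbox itself you would need separate scoped accounts or a team arrangement.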

PouchDB / Ionic 1 / CouchDB - architecture recommendation

I have a multi-user single-page mobile app developed with Ionic 1, PouchDB and CouchDB. User management is achieved with SuperLogin.
I would like to add a feature that computes a score (similar to the score in the Waze app) for each user based on his current data, keeping track of the score's former values for each past day.
I am wondering about the best way to implement this.
About my app:
it should be able to work offline, and then sync with the server when online (this is why I am using PouchDB and CouchDB, which are working great so far). So on the server I have one CouchDB database per user, storing his own data. The PouchDB database in the app syncs with the user's database on the server.
I am considering various options for the score, but none of them really satisfies me, so your advice would be welcome (possibly for yet another option).
Option 1: The score is computed in the app by the Ionic code. The result is stored as a local database object with a date and a score value. This happens whenever the user changes his data. As the DB is synced with the server, these scores are updated on the server too. However, if the user does not use the application on some days, the score won't be computed for those days. Moreover, if the user runs the app on 2 different devices and updates some data on one of them, the score will be recomputed locally, then propagated to the server. When the changed data propagates to the server and to the other device, this will trigger a new score computation on that other device, and might lead to conflicts on the score object on the server. Finally, if at some point I want to change the way the score is computed, the value given to each user will depend on whether he has upgraded to the latest app version.
Option 2: Have a server-side process that triggers every day and computes each user's score by connecting to each user's DB on the server, reading its data, computing the corresponding score, and storing it (date + value) back in the server DB. This option looks cleaner to me, but it would require further development, plus an additional process to maintain and keep alive on the server. And if the user inputs data into the application while not connected to the internet, the score will not be updated in the app until he gets connected again (which would cause the server process to recompute the score and propagate it back to the app through CouchDB sync).
Option 3: Have some kind of "stored procedure" in the CouchDB server, triggering every time related data changes, in charge of computing the score of each user. But I don't think this is doable with CouchDB.
So how would you do this score computation?
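Option 2 stays small and testable if the CouchDB access is injected rather than hard-coded; the daily job itself is then just a loop over the per-user databases. A sketch under those assumptions; `fetchDocs` and `saveScore` are hypothetical wrappers around CouchDB's HTTP API (e.g. `GET /{db}/_all_docs?include_docs=true` and a `PUT` of the score doc), and the `score:<date>` id scheme is likewise an assumption, chosen so one score document exists per day.

```javascript
// Daily server-side score job (Option 2 sketch). `fetchDocs(userDb)`
// returns the user's docs; `saveScore(userDb, scoreDoc)` writes the
// result back. Both are hypothetical wrappers around CouchDB HTTP calls.
async function runDailyScores(userDbs, fetchDocs, computeScore, saveScore, today) {
  for (const userDb of userDbs) {
    const docs = await fetchDocs(userDb);
    const value = computeScore(docs);
    // One score doc per day keeps the whole history queryable later,
    // and syncs down to the app like any other document.
    await saveScore(userDb, { _id: `score:${today}`, date: today, value });
  }
}
```

Because the score lands in the user's own database, the existing PouchDB sync delivers it to the app (and to every device) with no extra client code, which addresses the multi-device conflict concern from Option 1.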
Many thanks!

Recommendations for multi-user Ionic/CouchDB app

I need to add multi-user capability to my single-page mobile app developed with Ionic 1, PouchDB and CouchDB. After reading many docs, I am getting confused about what the best choice would be.
About my app:
it should be able to work offline, and then sync with the server when online (this is why I am using PouchDB and CouchDB, which are working great so far)
it should let the user create an account with a username and password, which would then be stored within the app so that he does not have to log in again whenever he launches it. This account makes sure his data is synced to a secure place on the server that other users cannot access.
currently there is no need to have shared information between users
Based on what I have read I am considering the following:
on the server, have one database per user, storing his own data
on the server, have a master database storing all the data of all users, plus the design docs. This makes it easy to change the design docs in a single place and have them replicated to each user database (and then to the PouchDB database in the app). Synchronization between the master and the user DBs is done through a filter, so that only the docs belonging to a given user (through some userId field) are replicated to that user's database.
use another module/plugin (SuperLogin? nolanlawson/pouchdb-authentication?) to manage the users from the app (user creation, login, logout, password reset, email notification for password lost, ...)
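The filtered replication in the setup above largely comes down to the options object passed to PouchDB's sync call. A sketch of building those options; `live`, `retry`, `filter`, and `query_params` are real PouchDB replication options, while the `app/byUser` filter name and the `userId` field are assumptions from the question (the filter would be a function in a `_design/app` document on the server comparing `doc.userId` against `query_params.userId`).

```javascript
// Build PouchDB replication options that only let one user's docs
// through. `app/byUser` is an assumed design-doc filter on the server.
function buildSyncOptions(userId) {
  return {
    live: true,            // keep the replication running continuously
    retry: true,           // back off and reconnect when offline
    filter: 'app/byUser',  // hypothetical filter in _design/app
    query_params: { userId },
  };
}

// Usage (not executed here):
// localDb.sync(remoteMasterDb, buildSyncOptions('alice'));
```

Note that with one CouchDB database per user (rather than a shared master), the filter becomes unnecessary for the app-to-server sync, since the user database already contains only that user's docs.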
My questions:
do you think this architecture is appropriate, or do you have something better to recommend?
which software would you recommend for user management? SuperLogin looks great but needs to run on a separate HTTP server, making the architecture more complex. Does it automatically create a new database for each new user (I don't think so)? nolanlawson/pouchdb-authentication is client-only, but does it fit well with Ionic 1? Isn't there a LOT to develop around it that comes out of the box with SuperLogin? Do you have any other modules in mind?
Many thanks in advance for your help!
This is an appropriate approach. The local PouchDBs will provide the data on the client side even if a client goes offline, and the combination with a central CouchDB server is a great way to keep data synchronized between server and clients.
You want to store the user's credentials, so you will have to save this data somehow on the client side, which could be done in a separate PouchDB.
If you keep all your user data in a local PouchDB database and have one CouchDB database per user on the server, you can even omit the filter you mentioned, because the synchronization will only happen between these two user databases.
I recommend SuperLogin. Yes, you have to install NodeJS and some extra libraries (namely morgan, express, http, body-parser and cors), and you will have to open your server to at least one new port to provide this service. But SuperLogin is really powerful to manage user accounts and user databases on a CouchDB server.
For example, if a user registers, you just make a call to SuperLogin via http://server_address:port/auth/register, passing the user name, password, etc., and SuperLogin not only adds this new user to the user database, it also automatically creates a new database just for this user. Each user can have multiple databases (private or shared), and SuperLogin manages the access rights to all of them. Moreover, SuperLogin can also send confirmation emails or resend forgotten passwords (an access token, respectively).
Sure, you will have to configure a lot (but hey, at least you have all these options), and maybe you will even have to write some additional API endpoints for functionality not covered by SuperLogin. But in general, SuperLogin saves a lot of pain in developing custom user management.
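The registration call mentioned above is an ordinary HTTP POST from the app. A sketch of building that request; the field names follow SuperLogin's default register payload, but treat them as assumptions and check them against your installed SuperLogin version, and the server address is of course a placeholder.

```javascript
// Build the SuperLogin /auth/register request. Field names are taken
// from SuperLogin's defaults; verify them against your version.
function buildRegisterRequest(baseUrl, user) {
  return {
    url: `${baseUrl}/auth/register`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        username: user.username,
        email: user.email,
        password: user.password,
        confirmPassword: user.password, // SuperLogin expects a confirmation field
      }),
    },
  };
}

// Usage (not executed here):
// const { url, options } = buildRegisterRequest('http://server_address:port', u);
// fetch(url, options).then((res) => res.json());
```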
But if you are unsure about the server configuration, maybe a service such as Couchbase or Firebase is a better solution. These services also have some user management capabilities, and you have to worry less about server security.

Syncing iOS Core Data with remote server which sends XML

My app parses XML from a remote server and stores the objects in Core Data (SQLite storage), so that the user can browse the material when offline by reading from local storage.
A user may make changes to objects while browsing offline; these get stored locally in the Core Data SQLite store. Another user makes some changes to an object on the remote server, and they are stored there. Now when my app detects an internet connection, it should sync my local storage with the remote server: the remote server is updated with the changes I made to Core Data while offline, and my local Core Data storage is updated with whatever changes other users made on the remote server.
For example, there is a forum, and it is stored in my local storage so that I can read and reply while travelling. When the internet is accessible again, my app should automatically push all my replies stored in Core Data to the remote server and also bring other users' posts from the remote server into my local storage.
The remote server is sending XML which I'm parsing and storing in Core Data. My problem is how to sync it.
How does two-way communication happen when there is a change?
How do I sync only the data which has changed, and not import the whole remote server DB (and vice versa)?
I know one way to do it:
Add one more field to your local and server database, i.e. a timestamp.
When the user changes data in the local database, set the timestamp to the current time.
Do the same on the server, i.e. when someone edits data on the server, set the timestamp to the current time.
When the user connects to the internet, compare the local timestamp to the server timestamp:
Case 1: both are the same. Nothing to do.
Case 2: local time > server time. Use SQL to get all the records with a timestamp greater than the server timestamp, and upload them to the server.
Case 3: local time < server time. Get all the records with a timestamp greater than the local timestamp and add them to the local database.
I am not sure if there is a better way, but this surely works.
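The three cases above can be sketched directly as a pure decision function. The row shape (`{ id, ts }`) and the idea of precomputed "latest timestamp on each side" values are assumptions for illustration; in the real app the filters would be SQL `WHERE ts > ?` queries.

```javascript
// Decide what to transfer, given the three timestamp cases described
// above. Rows are { id, ts } objects; localTs and serverTs are the
// latest modification timestamps on each side.
function planSync(localTs, serverTs, localRows, serverRows) {
  if (localTs === serverTs) {
    // Case 1: both sides are at the same point; nothing to do.
    return { upload: [], download: [] };
  }
  if (localTs > serverTs) {
    // Case 2: push local rows newer than the server's latest.
    return { upload: localRows.filter((r) => r.ts > serverTs), download: [] };
  }
  // Case 3: pull server rows newer than our latest.
  return { upload: [], download: serverRows.filter((r) => r.ts > localTs) };
}
```

Note one limitation of this single-timestamp scheme: if both sides changed while offline, neither case 2 nor case 3 alone is enough, and you need per-record comparison (or the merge/transaction approaches described in the other answers).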
One solution could be:
the iPhone syncs its changes to the server
the server merges the new and old data
the iPhone gets the new changes (resulting from the merge) from the server
So let the server be the master, which knows how to merge the data, and have the clients only download the data incrementally after changes.
Transactions. You want to follow the ACID rules for transactions. Essentially, before applying your local write as an update, you have to make sure that no data has been updated on the server that you have not yet refreshed locally.
So the easiest way is to have the user fetch the most recent update from the server, then overwrite it, using timestamps to make sure no other update happens during that process. Better yet, use locking to ensure that nothing else can happen in between.
If you Google transactions or ACID, there will be a lot of info out there. It's a big area in RDBMS environments, where many users can corrupt the data and locks must be held between writes and updates.
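The "make sure nothing changed under you" idea in this answer is usually implemented as an optimistic check: read a version (or timestamp) along with the data, and only apply the write if the version is still the same. A toy in-memory sketch of that check, assuming a single versioned value; real databases express the same thing as `UPDATE ... WHERE version = ?` on a version column, or with row locks.

```javascript
// Optimistic concurrency control: a write succeeds only if the caller
// read the latest version. The caller must re-read and retry on failure.
class VersionedStore {
  constructor(value) {
    this.value = value;
    this.version = 1;
  }

  read() {
    return { value: this.value, version: this.version };
  }

  // Returns true if the write was applied, false if someone else
  // wrote in between (the read was stale).
  write(newValue, expectedVersion) {
    if (expectedVersion !== this.version) return false; // stale read
    this.value = newValue;
    this.version += 1;
    return true;
  }
}
```

The offline-sync case is exactly the "stale read" branch: the device read the data hours ago, so on reconnect its writes must be checked against the current server version rather than applied blindly.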