I need to add multi-user capability to my single-page mobile app developed with Ionic 1, PouchDB and CouchDB. After reading many docs I am getting confused about what the best choice would be.
About my app:
it should be able to work offline, and then sync with the server when online (this is why I am using PouchDB and CouchDB, which are working great so far)
it should let the user create an account with a username and password, which would then be stored within the app so that he does not have to log in again whenever he launches the app. This account will ensure his data is synced to the server in a secure place so that other users cannot access it.
currently there is no need to have shared information between users
Based on what I have read I am considering the following:
on the server, have one database per user, storing his own data
on the server, have a master database, storing all the data of all users, plus the design docs. This makes it easy to change the design docs in a single place and have them replicated to each user database (and from there into the PouchDB database in the app). The synchronization of data between the master and the user DBs is done through a filter, so that only the docs belonging to one user (identified by some userId field) are replicated to that user's database (see the sketch after this list)
use another module/plugin (SuperLogin? nolanlawson/pouchdb-authentication?) to manage the users from the app (user creation, login, logout, password reset, email notification for lost passwords, ...)
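For reference, here is a minimal sketch of the filtered replication I have in mind, assuming every doc carries a userId field and the filter lives in a design doc named _design/app (both names are just examples):

var masterDB = new PouchDB('https://server/master');

// design doc stored once in the master database; the filter function is stored as a string
masterDB.put({
  _id: '_design/app',
  filters: {
    by_user: function (doc, req) {
      return doc.userId === req.query.userId;
    }.toString()
  }
});

// replicate master -> one user's database, keeping only that user's docs
PouchDB.replicate('https://server/master', 'https://server/userdb-jane', {
  filter: 'app/by_user',
  query_params: { userId: 'jane' }
});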
My questions:
do you think this architecture is appropriate, or do you have something better to recommend?
which software would you recommend for the user management? SuperLogin looks great but needs to run on a separate HTTP server, making the architecture more complex. Does it automatically create a new database for each new user (I don't think so)? nolanlawson/pouchdb-authentication is client-only, but does it fit well with Ionic 1? Aren't there a LOT of things to develop around it that come out of the box with SuperLogin? Do you have any other module in mind?
Many thanks in advance for your help!
This is an appropriate approach. The local PouchDBs will provide the data on the client side even when a client goes offline. And the combination with a central CouchDB server is a great way to keep data synchronized between server and clients.
You want to store the user's credentials, so you will have to save this data somehow on the client side, which could be done in a separate PouchDB.
If you keep all of a user's data in a local PouchDB database and have one CouchDB database per user on the server, you can even omit the filter you mentioned, because the synchronization will only happen between these two user databases.
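In that per-user setup, the client-side sync is essentially a one-liner with no filter at all; a minimal sketch (database names are illustrative):

var localDB = new PouchDB('mydata');
// the per-user database that SuperLogin (or your own code) created on the server
var remoteDB = new PouchDB('https://server/userdb-jane', { skip_setup: true });

localDB.sync(remoteDB, { live: true, retry: true })
  .on('error', function (err) {
    console.log('sync error', err);
  });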
I recommend SuperLogin. Yes, you have to install NodeJS and some extra libraries (namely morgan, express, http, body-parser and cors), and you will have to open at least one new port on your server to provide this service. But SuperLogin is really powerful for managing user accounts and user databases on a CouchDB server.
For example, if a user registers, you just make a call to SuperLogin via http://server_address:port/auth/register, passing the user name, password etc., and SuperLogin not only adds this new user to the user database, it also automatically creates a new database just for this user. Each user can have multiple databases (private or shared) and SuperLogin manages the access rights to all these databases. Moreover, SuperLogin can also send confirmation emails and handle forgotten passwords (by sending an access token).
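A registration call is then just an HTTP POST; a sketch, with field names following SuperLogin's default /auth/register payload (all values illustrative):

fetch('http://server_address:port/auth/register', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    name: 'Jane Doe',
    username: 'jane',
    email: 'jane@example.com',
    password: 'secret123',
    confirmPassword: 'secret123'
  })
})
  .then(function (res) { return res.json(); })
  .then(function (result) {
    // the response depends on configuration (e.g. whether email confirmation is required)
    console.log(result);
  });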
Sure, you will have to configure a lot (but, hey, at least you have all these options), and maybe you will even have to write some additional API endpoints for functionality not covered by SuperLogin. But in general, SuperLogin saves a lot of pain in developing custom user management.
But if you are unsure about the server configuration, maybe a service such as Couchbase, Firebase etc. is a better solution. These services also have some user management capabilities, and you have to worry less about server security.
Related
I would like to develop an app that accesses a database developed in PostgreSQL. The app performs calculations and requests the required data from the database server.
The user can download the app from a website if he has registered. After starting the app, the user has to log in to be able to use it.
Now the question:
What would be the most sensible solution in this example?
To be honest, I don't want to create a separate role for each user.
My idea is that the app only accesses the database via a general role, for example one named "usership". With this role, a user only has well-defined read access. Users should possibly also be able to save their own settings or measured values under their user name in certain tables; access to those would then only be possible with the correct user name and password, supplied with each operation on the relevant tables (this effort would not be necessary for read-only access to the other tables holding general data).
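A sketch of what I have in mind, using node-postgres (host, database, table and column names are only examples):

const { Pool } = require('pg');

// every installation of the app shares this one role
const pool = new Pool({
  host: 'db.example.com',          // assumed host
  database: 'appdb',               // assumed database name
  user: 'usership',
  password: process.env.DB_PASSWORD
});

async function demo() {
  // read-only access to general data needs no per-user information
  const general = await pool.query('SELECT * FROM general_data');

  // per-user rows carry the application-level user name in the data itself
  await pool.query(
    'INSERT INTO user_settings (app_user, setting, value) VALUES ($1, $2, $3)',
    ['jane', 'unit', 'metric']
  );
}

demo().catch(console.error);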
The question is whether there are any limits to how many apps can communicate with the database at the same time via the same database credentials / username "usership".
I don't want to have to create a separate DB role for each customer. Somehow that doesn't seem right to me, if only because adding new employees or removing them would mean interventions in the database (CREATE ROLE / DROP ROLE). Basically, the app should do nothing more than a website where several users are logged in at the same time, the only difference being that the app does not run in the browser, and everything happens either on the client side at the application level or on the database server.
I'm not aware of any limits on sharing of usernames + passwords in postgres. You can have hundreds or thousands of concurrent connections using the same username + password.
There can be issues with many hundreds or thousands of concurrent connections, depending on your database hardware, especially RAM.
While Postgres supports thousands of concurrent connections in theory, in practice I've run into memory issues as the # of open connections increases. If this is a problem and a large % of your connections are idle at any one moment, you can add a layer of connection pooling with something like pgbouncer, but keep in mind that adds another process to monitor.
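For illustration, a minimal pgbouncer.ini along those lines (all names, ports and sizes are assumptions to tune for your setup):

[databases]
appdb = host=127.0.0.1 port=5432 dbname=appdb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
; transaction pooling lets many mostly-idle clients share a few server connections
pool_mode = transaction
max_client_conn = 1000
default_pool_size = 20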
In general, however, I wouldn't recommend this approach. You'd be providing direct, essentially anonymous access to your shared database. I expect it would be difficult to secure your database credentials in the client, and with direct access it would be fairly easy to construct SQL queries that could take down your database server. This would be difficult to monitor or protect against, since all users would look the same and you'd have no way to revoke access in case of abuse (without changing the password for everyone who has access).
From a security standpoint I'd definitely recommend being able to identify your users, monitor their usage separately and revoke access individually. I don't know of any performance issues with having many thousands of separate postgres users/credentials.
-- Scalability --
Using a postgres cluster with read replicas and load balancing (e.g. https://aws.amazon.com/premiumsupport/knowledge-center/requests-rds-read-replicas/) you should be able to scale this horizontally fairly easily if the need arises.
As with many other founders and their start-ups, I'm low on cash and aiming to launch without funding. The app will be dealing with users' health data, so setting up a server with the correct encryption may be costly. I am also only familiar with JSON, SwiftUI, Swift 5 and API programming, so setting up a server is outside my area of expertise.
Therefore, I aim to launch the app with all user data stored locally with Core Data, so as to avoid these issues. With enough users, i.e. traction, I will then begin to seek funding, at which point I would hope to set up an encrypted server and transfer user data there.
I am worried that if I launch with local storage, I will not be able to transfer each individual user's data to an external server without them having to re-enter all of their information.
I was just wondering whether this is possible or not? If you could provide details, that would be very helpful.
I’m building a new web application which needs to work seamlessly even when there is no internet connection. I’ve selected Angular and am building a PWA, as it comes with built-in functionality to make the application work offline. So far I have the service worker working perfectly: driven by the manifest file, it very nicely caches the static content, and I’ve set it to cache a bunch of API requests which I want to use whilst the application is offline.
In addition to this, I’ve used localStorage to store attempts to invoke put, post and delete API requests when the user is offline. Once the internet connection is re-established, the requests stored in localStorage are sent to the server.
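For context, my outbox logic is roughly the following (simplified; names are illustrative):

function queueRequest(req) {
  // req is a plain object: { url, method, body }
  var outbox = JSON.parse(localStorage.getItem('outbox') || '[]');
  outbox.push(req);
  localStorage.setItem('outbox', JSON.stringify(outbox));
}

window.addEventListener('online', function () {
  // replay every stored request once the connection returns
  var outbox = JSON.parse(localStorage.getItem('outbox') || '[]');
  localStorage.setItem('outbox', '[]');
  outbox.forEach(function (req) {
    fetch(req.url, {
      method: req.method,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(req.body)
    });
  });
});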
Thus far in my proof of concept, the user can access content whilst offline, edit data, and have the data synced with the server once the user’s internet connection is re-established. This is where my quandary begins though. There is API request data cached automatically by the service worker as defined in the manifest file, and there is a separate store of data for edits made whilst offline. This leads to a situation where the user edits some data, saves the data, refreshes the page, and the stale data is served from the service worker’s cached API response.
Is there a built-in mechanism to update API data cached automatically by the service worker? I don’t fancy trying to unpick this manually as it seems hacky and I can’t imagine it’ll be future-proof as service workers evolve.
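The manual unpicking I want to avoid would presumably look something like this, overwriting the cached response via the Cache API (the cache name is an assumption, which is exactly why it feels fragile, since the service worker manages its own cache names):

var updatedItem = { id: 42, name: 'edited offline' };

caches.open('api-cache').then(function (cache) {
  return cache.put('/api/items/42',
    new Response(JSON.stringify(updatedItem), {
      headers: { 'Content-Type': 'application/json' }
    }));
});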
If there isn’t a standard way to achieve what I need, is it common for developers to take full control of offline data by storing it all in IndexedDB/localStorage manually? i.e. I could invoke API requests and write some code which caches the results in a structured format in IndexedDB to form an offline database, then write back to the offline database whenever the user edits some data, and upload any data edits when the user is back online. I don’t envisage any technical problems with doing this, it just seems like a lot of effort to achieve something which I was hoping would be standard functionality.
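Sketched out, that manual approach would be something like this (database and store names are illustrative):

var open = indexedDB.open('offline-db', 1);

open.onupgradeneeded = function () {
  // one object store keyed by API URL forms the offline database
  open.result.createObjectStore('apiCache', { keyPath: 'url' });
};

open.onsuccess = function () {
  var db = open.result;

  // write an API result (or a local edit) into the offline database
  function cacheResponse(url, data) {
    db.transaction('apiCache', 'readwrite')
      .objectStore('apiCache')
      .put({ url: url, data: data, updatedAt: Date.now() });
  }

  // read from the offline database when the network is unavailable
  function readCached(url, callback) {
    var req = db.transaction('apiCache')
      .objectStore('apiCache')
      .get(url);
    req.onsuccess = function () {
      callback(req.result && req.result.data);
    };
  }
};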
I’m fairly new to developing with Angular, but have many years of other development experience. So please forgive me if I’m asking obvious questions, I just can’t seem to find a good article on best practices for working with data storage with service workers.
Thanks
I have a project where my users can edit local data when they are offline, and I use Cloud Firestore to have a locally cached database available. If I understood you correctly, this is exactly your requirement.
The benefit of this solution is that, with just one line of code, you get not only a local DB, but also automatic synchronisation of all changes made offline with the server once the client comes online again.
firebase.firestore().enablePersistence()
  .catch(function (err) {
    // err.code is 'failed-precondition' (e.g. multiple tabs open) or
    // 'unimplemented' (the browser lacks the required features)
    console.error('Persistence disabled:', err.code);
  });
// Subsequent queries will use persistence, if it was enabled successfully
If using this NoSQL database is an option for you, I would go with it; otherwise you will need to implement the local updates yourself, as there is no built-in solution for that.
My whole interest in PostgreSQL is driven by its ACL system which is really powerful.
To access the data from a Scala application I have two options in mind: EBeans/JDBC or Slick FRM.
Our application is an enterprise one and has more than 1000 users who will be accessing it simultaneously, with different roles and access permissions. The connectors I am aware of ask for a database username/password when the connection is established, and I haven't found any that provide a facility to change the username/password on the fly, given that we will be getting the user reference from the session object of the user accessing our server.
I am not sure how much sense the title of the question makes, but I don't see recreating (or separately creating) a database connection for every user as an efficient solution. What I am looking for is a library or toolkit which lets us supply the interacting sub-user/ROLE as an option, so that PostgreSQL can do its ACL enforcement/checks on the requested data/manipulation.
You can "impersonate" a user in Postgres for the duration of a transaction, and reset before the transaction completes, using the SET ROLE or SET SESSION AUTHORIZATION commands issued after establishing the database connection.
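A sketch of that pattern with node-postgres; SET LOCAL ROLE scopes the impersonation to the transaction, so the pooled connection reverts automatically at COMMIT/ROLLBACK (connection details and role names are assumptions):

const { Pool } = require('pg');
const pool = new Pool({ user: 'app_service', database: 'appdb' });

async function runAs(role, sql, params) {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    // role names cannot be bound as parameters, so validate `role`
    // against a whitelist before interpolating it here
    await client.query('SET LOCAL ROLE ' + role);
    const result = await client.query(sql, params);
    await client.query('COMMIT');
    return result.rows;
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  } finally {
    client.release();
  }
}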
Preface:
I'm hoping to upgrade an existing application by adding cloud backup and syncing of the customer's data. We want this to be as seamless as possible, and we want the customer's only interface to the data to be the application's front-end.
Our application can be connected to the oil pipe of a machine and collects data on the oil condition. When a test has completed, we want to push this to the cloud. Because of the distinct per-test nature of the data (as opposed to one big trend), most IoT platforms don't suit it very well, so we're aiming to release a slightly modified version of the application which doesn't have the connection to the sensors; this will be our remote front-end.
Since the existing application uses a relatively simple file structure to store its data, if we simply replicate these files in the cloud, the remote front-end version can just download them to the same location and it'll work fine. This has led us to Dropbox (or any more appropriate cloud storage system you could recommend).
We hope to use the Dropbox API directly in our application to push and pull the files as necessary. All of this so far we believe is perfectly achievable.
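For example, with the official dropbox JavaScript SDK, the push/pull would look roughly like this, using a token for the app-owned account and one folder per application-level user (token handling and paths are assumptions on our part):

const { Dropbox } = require('dropbox');
const dbx = new Dropbox({ accessToken: process.env.DROPBOX_TOKEN });

// push a completed test, namespaced under the user's folder
async function pushTest(userId, fileName, contents) {
  await dbx.filesUpload({
    path: '/' + userId + '/' + fileName,
    contents: contents,
    mode: { '.tag': 'overwrite' }
  });
}

// pull it back down in the remote front-end version
async function pullTest(userId, fileName) {
  const res = await dbx.filesDownload({ path: '/' + userId + '/' + fileName });
  return res.result.fileBinary; // Node; browser builds expose fileBlob instead
}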
Question: Is it possible - and if so, how would we go about it - to set up a user system with the below requirements:
The user's personal Dropbox is not used
Dropbox is completely hidden from the user
The application vendor has a top-level user who has access to all data (for analytics; we do not want to store confidential or sensitive data).
When a user logs in, they only have access to their own folder, and an attacker could not disrupt the overall structure. (We understand that if an attacker got the master account then all is lost, but keeping that account secure is an internal issue. As long as the user accounts are isolated, this is okay.)
Alternative question: Is anyone aware of a storage system or IoT system which would better suit this use case? We will still require backups/loss prevention as part of the service.