MAMP: One computer, two users, shared database

Two developers often share the same system; both have local copies of the project and try to connect to a local database. Both users can see the database, but its tables and data are visible only to the database's original author.
We've tried giving all permissions to both users, but it seems the only thing that works is to duplicate the database.
Is there a way around this?
Thanks in advance!

You would probably be better off hosting a separate MySQL instance on its own machine, and then configuring your code to connect to that database instead of the MAMP-hosted one. That being said, you will need to open the MAMP MySQL port (usually 8889) in the firewall on MAMP(0). Then the script on MAMP(1) needs to be configured to connect to the MAMP(0) database on the newly opened port.
You will also need to GRANT privileges to user(1) on the MAMP(0) database.
A connect string from MAMP(1) would look like:
$db_url = 'mysqli://user:password@mamp0.local:8889/es_forms_drupal';
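For the GRANT step, a rough sketch (assuming MAMP's bundled MySQL 5.x, and using placeholder user, host, and password values) would be:
GRANT ALL PRIVILEGES ON es_forms_drupal.* TO 'user1'@'%' IDENTIFIED BY 'some_password';
FLUSH PRIVILEGES;
Tighten the '%' host wildcard to the actual client host if you want to limit where user1 can connect from.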
Hopefully that makes some sense.

Related

SQL Live Backup Over Intermittent Connection

I have a few PCs that have local PostgreSQL databases running, just logging data. Data is only ever inserted, never removed or updated. The remote PCs are connected to the internet by cellular modem and depending on their location, often do not have internet access. When they do have an internet connection I would like them to push a copy of their databases to a central location and keep the remote database up to date with any new data. Essentially, I need an 'rsync' for databases.
At first it seemed like what I need is PostgreSQL Hot-Standby, but I'm unsure whether that is actually what I need, because my situation seems to differ from the examples I've seen.
Each remote PC has a Postgres server with a single database that has a unique name, the tables within the DBs have generic names. I would like to synchronize these databases to a single remote Postgres server. I think this should be okay due to the unique DB names.
My connectivity is very intermittent, days to weeks without a connection. I've seen pgAdmin be very reliable despite a terrible (cellular) internet connection; if Postgres Hot-Standby is similarly robust, I may be all right.
As far as I can see, my options are either to set up PostgreSQL Hot-Standby or to roll my own solution. I don't want to roll my own, though it would be simple enough if I can't find anything better: a Python daemon run by systemd that diffs the local and remote DBs and pushes the new rows from local to remote. But I'm sure someone has solved this problem; I just haven't found the solution yet.
You don't need hot standby (which is the PostgreSQL term for being able to run queries on the replicated database); you need streaming replication, with a standby on the central server for each intermittently connected remote primary. If you use replication slots, the primary retains the WAL the standby still needs, so replication can catch up cleanly after a long disconnection instead of breaking.
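A rough sketch of the slot-based setup (host, user, and slot names are placeholders; the standby must also be initialized as a replica from a base backup, with standby.signal on v12+): create a physical replication slot on each remote primary, and point its standby on the central server at that slot.
-- on the remote PC (the primary)
SELECT pg_create_physical_replication_slot('central_standby_pc1');
# on the central standby for that PC (postgresql.conf, or recovery.conf before v12)
primary_conninfo = 'host=remote-pc-1.example.com port=5432 user=replicator password=secret'
primary_slot_name = 'central_standby_pc1'
The primary then keeps the WAL reserved for that slot until the standby reconnects and catches up.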

In PostgreSQL, how to accept *any* password for the user "postgres"?

I'm working on many projects simultaneously, and some have default passwords defined, which vary from project to project. I've got PostgreSQL installed on my (Ubuntu) laptop, and of course I'm only using it locally for development.
I know it's horribly insecure, but I don't expose Postgres remotely. So to make things easier, I would like the PostgreSQL server to accept ANY password it is given for the postgres user. Is there any way I could do this?
Set trust for all your local connections in pg_hba.conf, e.g.:
local all all trust
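If your tools connect over TCP to localhost rather than the Unix socket, you may also need a matching host line, for example:
host    all    all    127.0.0.1/32    trust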
After editing, restart the postgresql service.

How to get rid of extra Postgres databases shown in pgAdmin?

I have a Postgres database deployed. When I connect to it using pgAdmin, I see many databases that I don't have access to and never created.
The picture shows some of them. My actual database is one of them.
What are these databases and why are they here? How can I get rid of them? Can I just delete them without any problem?
If this is your database, then you had better know what databases you have and why you have them.
One possibility is that you have lost control of your database, probably to cryptomining hackers (they do create databases with gibberish names).
You can delete the extras, but the hackers will just keep getting back in if you don't fix the underlying problem. You need to give good passwords to all your superuser accounts (and all non-superuser accounts too), block access to your database from all but white-listed hosts in pg_hba.conf, perhaps block superuser access from anywhere but localhost, and block access to port 5432 on your firewall from all but trusted hosts. Any one of these might be sufficient, but you will be better off doing all four.
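To illustrate the pg_hba.conf side of that (the network range is a placeholder for your trusted hosts, and scram-sha-256 needs PostgreSQL 10+; use md5 on older versions). Entries are matched first to last, so the reject line blocks remote superuser logins before the general rule applies, and anything matching no line is refused:
# superuser only from the local machine
host    all    postgres    127.0.0.1/32      scram-sha-256
host    all    postgres    0.0.0.0/0         reject
# everyone else only from the white-listed network
host    all    all         203.0.113.0/24    scram-sha-256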
I faced the same issue using the Heroku Postgres add-on.
The solution was to set a DB Restriction under Advanced options.
Once you specify your database there, you will only see your own DB and not the others.

Informix dbserver connections in sqlhosts via perl

I want to add a new Informix server entry to sqlhosts, but I'm not quite sure how it will impact the existing connection.
Currently sqlhosts contains only one server entry...
dbserver onsoctcp 111.111.111.20 7101
The database handle is created within an existing Perl module (db is a database on the server)...
my $dsn = "DBI:Informix:db";
my $dbh = DBI->connect($dsn,"user","password");
Notice that "dbserver" is never referenced.
I want to add a test server to sqlhosts. Something like this...
dbserver onsoctcp 111.111.111.20 7101
dbserver_test onsoctcp 111.111.111.21 7101
With only one entry in sqlhosts, everything has been working fine. But my connection never references the server name in sqlhosts.
So, my question(s)...
Does Informix just try to use the only one available?
Will adding a second server entry in sqlhosts force me to include the server name in the connection string?
Thanks!
The Informix client uses environment variables to resolve server names and other configuration; check that INFORMIXDIR is set to the path where the Informix CSDK is installed (I assume it is), and set INFORMIXSERVER to point to the new entry in sqlhosts. See this article in the IBM knowledge base.
Alternatively, use the db@server data source format:
my $dbh = DBI->connect('DBI:Informix:db@server', 'user', 'password');
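So with the dbserver_test entry from the question (and assuming the database db also exists on that server), the connection might look like this; single quotes keep Perl from interpolating @dbserver_test as an array:
my $dbh_test = DBI->connect('DBI:Informix:db@dbserver_test', 'user', 'password');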
Maybe it is a permissions issue? From the documentation:
Note that you might also be able to connect to other databases not listed by DBI->data_sources using other notations to identify the database. For example, you can connect to "dbase@server" if "server" appears in the sqlhosts file and the database "dbase" exists on the server and the server is up and you have permission to use both the server and the database on the server and so on. Also, you might not be able to connect to every one of the databases listed if you have not been given at least connect permission on the database. However, the databases listed by the DBI->data_sources method certainly exist, and it is legitimate to try connecting to those sources.
http://search.cpan.org/~johnl/DBD-Informix-2013.0521/Informix.pm

How can I disable a superuser in Postgres?

On a server I have two databases (say db1 and db2). I have a superuser called user1.
My requirement is to disable user1(super user) for database db1.
So that using user1 I can only connect to db2 and not to db1.
How can this be done?
Note: the Postgres version is 8.0 and both databases are in the same database cluster.
Remove their superuser rights entirely. Make them the owner of db2 (ALTER DATABASE db2 OWNER TO whatever_user), so they can do anything to db2 except limited superuser-only operations like loading C extensions.
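A minimal sketch of that, using the names from the question (ALTER ROLE ... NOSUPERUSER is the modern syntax, available from 8.1; on 8.0 the closest equivalent is ALTER USER user1 NOCREATEUSER):
ALTER ROLE user1 NOSUPERUSER;      -- strip superuser rights
ALTER DATABASE db2 OWNER TO user1; -- they can still fully manage db2 as its owner
Access to db1 can then be limited with ordinary GRANT/REVOKE or pg_hba.conf rules, since user1 is no longer exempt from them.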
You cannot restrict superusers. That's the point. Superuser-only operations are ones that break through the usual access control rules. For example, loading a user-defined C function allows you to write and load a function that opens pg_hba.conf and rewrites it, or just manipulates the system catalogs directly. Similarly, the adminpack functions let you do direct file system access, so they're superuser-only.
If they're a superuser, they can just read pg_hba.conf, see that your user ID has the right to log in to db1, change your password, and then log in as you.
Asking to limit a superuser to one DB is like asking if you can make a user root, but only for one subdirectory. (OK, so with SELinux you can kind-of do that, but it's complicated).
If you truly must do this, the only way is to split db1 and db2 into different PostgreSQL servers running under different unprivileged system user IDs. Each has its own separate shared_buffers, data directory, listening (IP address, port) combination, WAL, user IDs, database list, etc. Since they run under different system users, they don't have the right to read or write each other's data directories, so they are isolated. They must listen on different ports and/or different IP addresses, though you can use PgBouncer to make them appear to be the same server to external clients.
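For illustration, the two instances' postgresql.conf files might differ roughly like this (paths, ports, and system users are placeholders):
# instance for db1, running as system user pg_db1
data_directory = '/var/lib/postgresql/inst1'
port = 5432
# instance for db2, running as system user pg_db2
data_directory = '/var/lib/postgresql/inst2'
port = 5433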