Postgres run statement against multiple schemas - postgresql

I have a multi-tenant application where tenants are set up on different schemas within the same database. The reason is that they share some data that lives on one of the schemas.
Up until now I've been using a bash script with a hard-coded list of the schemas, which has to be updated whenever a new schema is added, so that I can apply things like table schema changes (for instance adding a new column to a table) across all the accounts.
Is there a way in Postgres, psql, etc. to run, for instance,
ALTER TABLE some_table ADD COLUMN some_column TEXT NOT NULL DEFAULT '';
without having to do string replacement in another script such as bash?
In other words, is there an easy way to get the schemas and write a loop in psql that iterates through them and runs the statement against each one, for instance by setting search_path?
The number of tenants is growing, and new tenants can be added by admin users who aren't devs, so I'm constantly updating my shell scripts, and this will only get worse. Is there a standard way of handling this kind of problem?

You can do that with a little PL/pgSQL block:
do
$$
declare
    s_rec record;
begin
    for s_rec in select schema_name
                 from information_schema.schemata
                 where schema_name not in ('pg_catalog', 'information_schema')
    loop
        execute format($sql$ALTER TABLE IF EXISTS %I.some_table ADD COLUMN some_column TEXT NOT NULL DEFAULT ''$sql$,
                       s_rec.schema_name);
    end loop;
end;
$$;
The IF EXISTS makes sure the statement doesn't fail if the table doesn't exist in a given schema.
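If you specifically want the search_path approach from your question, a similar block works; this is a minimal sketch, assuming the same statement should run in every tenant schema:
do
$$
declare
    s_rec record;
begin
    for s_rec in select schema_name
                 from information_schema.schemata
                 where schema_name not in ('pg_catalog', 'information_schema')
    loop
        -- note: search_path stays changed for the session after the block ends
        execute format('set search_path = %I', s_rec.schema_name);
        execute $q$ALTER TABLE IF EXISTS some_table ADD COLUMN some_column TEXT NOT NULL DEFAULT ''$q$;
    end loop;
end;
$$;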
If you over-simplified your question and in fact want to run complete scripts once for each schema, generating a wrapper script that includes the actual script once per schema is probably easier:
select concat(format('set search_path = %I;', schema_name),
              chr(10),
              '\i complete_migration_script.sql')
from information_schema.schemata
where schema_name not in ('pg_catalog', 'information_schema');
You can spool the output of that statement into a file and then run that file using psql (of course you need to replace complete_migration_script.sql with the actual name of your script).
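For example, a psql session doing the spooling could look like this (a sketch; the output file name is arbitrary):
-- tuples only, unaligned output, spool to a file
\t on
\a
\o run_all_schemas.sql
select concat(format('set search_path = %I;', schema_name),
              chr(10),
              '\i complete_migration_script.sql')
from information_schema.schemata
where schema_name not in ('pg_catalog', 'information_schema');
\o
-- then run the generated file
\i run_all_schemas.sql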

Related

How to add ONE column to ALL tables in postgresql schema

The question is pretty simple, but I can't seem to find a concrete answer anywhere.
I need to update all tables inside my postgresql schema to include a timestamp column with default NOW(). I'm wondering how I can do this via a query instead of having to go to each individual table. There are several hundred tables in the schema and they all just need to have the one column added with the default value.
Any help would be greatly appreciated!
The easy way with psql: run a query to generate the commands, then save and run the results.
-- Turn off headers:
\t
-- Use SQL to build SQL:
SELECT 'ALTER TABLE public.' || table_name || ' add fecha timestamp not null default now();'
FROM information_schema.tables
WHERE table_type = 'BASE TABLE' AND table_schema='public';
-- If the output looks good, write it to a file and run it:
\g out.tmp
\i out.tmp
-- Or, skipping the temporary file, use \gexec to execute the generated statements directly:
\gexec
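If any table names are mixed case or contain special characters, the plain concatenation above produces invalid statements; a variant using quote_ident (same column, just safer quoting) avoids that:
SELECT 'ALTER TABLE ' || quote_ident(table_schema) || '.' || quote_ident(table_name)
       || ' add fecha timestamp not null default now();'
FROM information_schema.tables
WHERE table_type = 'BASE TABLE' AND table_schema = 'public';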

How to join tables in two Firebird databases?

Currently I'm working on a simple library project using Embarcadero C++Builder 10.3 Community Edition, and Firebird and FlameRobin to create databases.
So far I have only needed simple queries against a single database, so I used TFDConnection and TFDPhysFbDriverLink to connect to a .fdb file, plus TFDQuery for the SQL commands and a TDataSource. It works great.
Unfortunately, now I must join two tables. How do I write this command? I tried this:
SELECT * FROM users_books
join books on
users_books.id_book = books.id
where users_books and books are databases.
I got an error:
SQL error code = -204
Table unknown
BOOKS.
So I think I need to somehow connect to these two databases simultaneously. How do I do that?
Firebird databases are isolated and don't know about other databases. As a result, it is not possible to join tables across databases with a normal select statement.
What you can do, is use PSQL (Procedural SQL), for example in an EXECUTE BLOCK. You can then use FOR EXECUTE STATEMENT ... ON EXTERNAL to loop over the table in the other database, and then 'manually' join the local table using FOR SELECT (or vice versa).
For example (assuming a table user_books in the remote database, and a table books in the current database):
execute block
returns (book_id integer, book_title varchar(100), username varchar(50))
as
begin
  for execute statement 'select book_id, username from user_books'
      on external 'users_books' /* may need AS USER and PASSWORD clause as well */
      into book_id, username do
  begin
    for select book_title from books where id = :book_id
        into book_title do
    begin
      suspend;
    end
  end
end
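If the other database needs explicit credentials or a full path rather than an alias, the ON EXTERNAL clause can carry them; a sketch with placeholder connection details:
for execute statement 'select book_id, username from user_books'
    on external 'C:\data\users_books.fdb'
    as user 'SYSDBA' password 'masterkey'
    into book_id, username do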

How can I delete all tables from a Firebird 3.0 database using single query?

I'm working on a JSF application that uses a Firebird 3.0 database containing hundreds of tables. I need to delete all the tables from time to time.
I have checked this query:
DROP TABLE TABLE_NAME
but this only deletes one table at a time, which is very time-consuming for the program. Is there another approach that gets it done in one go?
You can create a procedure that drops the tables:
create or alter procedure PRC_DROP_TABLES
as
declare variable TBL varchar(50);
begin
  for select r.rdb$relation_name
      from rdb$relation_fields r
      where r.rdb$system_flag = 0 and r.rdb$view_context is null
      -- and r.rdb$relation_name not containing '$' -- uncomment and modify this if you want to filter tables by some condition
      group by r.rdb$relation_name
      into :tbl do
    execute statement 'drop table ' || :tbl;
end
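Once the procedure is created, run it whenever you need the clean-up, for example from isql or your application. Note that if a table is referenced by another table's foreign key, the drop order matters and a run may fail partway, so you may need to adjust or re-run:
execute procedure PRC_DROP_TABLES;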

How do I drop all tables in psql (PostgreSQL interactive terminal) that starts with a common word?

How do I drop all tables whose names start with, say, doors_? Can I do some sort of regex using the drop table command?
I prefer not to write a custom script, but all solutions are welcome. Thanks!
This script will generate the DDL commands to drop them all:
SELECT 'DROP TABLE ' || t.oid::regclass || ';'
FROM pg_class t
-- JOIN pg_namespace n ON n.oid = t.relnamespace -- to select by schema
WHERE t.relkind = 'r'
AND t.relname ~~ 'doors\_%' -- enter search term for table here; \_ matches a literal underscore
-- AND n.nspname ~~ '%myschema%' -- optionally select by schema(s), too
ORDER BY 1;
The cast t.oid::regclass makes the syntax work for mixed-case identifiers, reserved words or special characters in table names, too. It also prevents SQL injection and prepends the schema name where necessary. See the manual for more about object identifier types and the schema search path.
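For illustration, with hypothetical table names the generated commands could look like this (quoting and schema qualification are added only where needed):
DROP TABLE doors_back;
DROP TABLE "doors_Front";
DROP TABLE garage.doors_side;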
You could automate the dropping, too, but it's unwise not to check what you actually delete before you do.
You could append CASCADE to every statement to DROP depending objects (views and referencing foreign keys). But, again, that's unwise unless you know very well what you are doing. Foreign key constraints are no big loss, but this will also drop all dependent views entirely. Without CASCADE you get error messages informing you which objects prevent you from dropping the table. And you can then deal with it.
I normally use one query to generate the DDL commands for me based on some of the metadata tables and then run those commands manually. For example:
SELECT 'DROP TABLE ' || tablename || ';' FROM pg_tables
WHERE tablename LIKE 'prefix%' AND schemaname = 'public';
This will return a bunch of DROP TABLE xxx; queries, which I simply copy&paste to the console. While you could add some code to execute them automatically, I prefer to run them on my own.
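If you do want psql to execute the generated statements for you, the \gexec approach shown earlier for adding columns works here too; a sketch, to be used only after reviewing the generated output:
SELECT 'DROP TABLE ' || tablename || ';' FROM pg_tables
WHERE tablename LIKE 'prefix%' AND schemaname = 'public'
\gexec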

How do I find all code, triggers from an oracle database that relate to specific tables?

I have a problem where I need to remove all code and triggers from a database that relate to certain tables in order for a Solaris package to install. It's a long, complicated story, but I need to start with a clean slate.
I've managed to remove all the existing tables/synonyms, but how do I locate the related code/triggers from sqlplus?
Unfortunately, it's not feasible to drop the database and recreate it.
Well, it turns out all the table names are prefixed with my module name DAP.
So, to find all the table names and public synonyms with sqlplus:
select table_name from all_tables where table_name like 'DAP%';
select synonym_name from all_synonyms where table_name like 'DAP%';
To get a list of triggers and sequences
select trigger_name from all_triggers where table_name like 'DAP%';
select sequence_name from all_sequences where sequence_name like 'DAP%';
To get a list of all the constraints
select table_name, constraint_name from all_constraints where table_name like 'DAP%';
To get the DAP related code:
select text from dba_source where name like 'DAP%';
I can now write a script that drops everything.
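For reference, the generator-script pattern from the Postgres answers above works from sqlplus as well; a sketch assuming you own the DAP% objects (synonyms and public synonyms would need their own DROP statements, and the generated file should be reviewed before running it):
set heading off feedback off pagesize 0
spool drop_dap.sql
select 'drop table ' || table_name || ' cascade constraints;' from user_tables where table_name like 'DAP%';
select 'drop sequence ' || sequence_name || ';' from user_sequences where sequence_name like 'DAP%';
spool off
@drop_dap.sql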
You should be able to query the data dictionary view ALL_TRIGGERS to find the triggers; it has a table_name column. You can probably find the other related objects with different dictionary views (it's been a while since I've messed with Oracle).
http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/statviews_2107.htm
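For the stored code, the ALL_DEPENDENCIES view is useful: it lists which procedures, packages, views and triggers reference a given table. A sketch using the DAP% prefix from the question:
select owner, name, type
from all_dependencies
where referenced_type = 'TABLE'
  and referenced_name like 'DAP%';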