I want to extract the names of the tables I have.
The query below returns both tables AND views.
SELECT quote_ident(table_name) as tab_name
FROM information_schema.tables
WHERE table_schema='public'
Question
How can I obtain just the table names and exclude the views?
From the documentation (emphasis mine):
The view tables contains all tables and views defined in the current database.
You can use the table_type column to exclude views:
SELECT quote_ident(table_name) as tab_name
FROM information_schema.tables
WHERE table_schema = 'public'
AND table_type != 'VIEW'
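Note that information_schema.tables can also list foreign tables (and the session's temporary tables), so the stricter filter table_type = 'BASE TABLE' is an option if you only want ordinary tables. For completeness, a roughly equivalent sketch against the system catalogs (my addition, not part of the original answer) would be:
-- Sketch: ordinary tables in the public schema via pg_catalog.
-- relkind = 'r' means ordinary table; add 'p' if you also want partitioned tables.
SELECT c.relname AS tab_name
FROM pg_catalog.pg_class c
JOIN pg_catalog.pg_namespace n ON n.oid = c.relnamespace
WHERE n.nspname = 'public'
  AND c.relkind = 'r';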
Related
I need to compare two PostgreSQL databases with exactly one SQL query. I tried the following query:
SELECT *
FROM information_schema.columns
WHERE table_name IN (SELECT table_name
FROM information_schema.tables
WHERE table_schema NOT IN ('information_schema', 'pg_catalog')
AND table_type = 'BASE TABLE'
ORDER BY table_schema, table_name);
It works for my problem, but the foreign and primary keys are missing in this table. Is there a way to include them in my SQL query?
It is OK if the result table is not normalized and displays data multiple times. It is purely a runtime comparison, which is why the result tables are deleted again once the program has finished.
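One way to pull the primary- and foreign-key information alongside the column listing is to join information_schema.table_constraints with information_schema.key_column_usage. A hedged sketch (the aliases are mine; you could LEFT JOIN this onto the columns query above on table and column name):
-- Sketch: PRIMARY KEY / FOREIGN KEY columns per table,
-- using the standard information_schema views.
SELECT tc.table_schema,
       tc.table_name,
       tc.constraint_type,     -- 'PRIMARY KEY' or 'FOREIGN KEY'
       kcu.column_name
FROM information_schema.table_constraints tc
JOIN information_schema.key_column_usage kcu
  ON kcu.constraint_name   = tc.constraint_name
 AND kcu.constraint_schema = tc.constraint_schema
WHERE tc.constraint_type IN ('PRIMARY KEY', 'FOREIGN KEY')
  AND tc.table_schema NOT IN ('information_schema', 'pg_catalog')
ORDER BY tc.table_schema, tc.table_name, kcu.column_name;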
I have two tables, one a test table and one a production table, both with 200+ columns and a couple thousand lines of code to create the table. I periodically make changes and am trying to automate QA. I would like to:
1. Compare all rows between the two tables to detect differences.
2. Exclude certain columns, either because they are new (added to test, do not exist in prod) or because they will be different on purpose (table_creation, created_by_user_id, etc.).
3. Use a variable to generate the SELECT list of column names so I do not have to manually update the column names I need to compare between the two tables.
#3 is the issue. I know how to do this in Python, but I am currently limited to doing this only in PostgreSQL and have never done anything with variables in SQL.
Code So Far
So far, I know I can get all column names from
SELECT *
FROM information_schema.columns
WHERE table_schema = 'my_test_schema'
AND table_name = 'my_test_table'
From there, I can do a FULL JOIN with the prod columns and a WHERE clause to get a single-column result containing only the subset of column names I want to compare.
After that, I'm using an EXCEPT/UNION ALL script to compare the tables. The issue below is the *: I instead need some sort of variable or list to select only those column names.
(SELECT * FROM my_test_table
EXCEPT
SELECT * FROM my_prod_table)
UNION ALL
(SELECT * FROM my_prod_table
EXCEPT
SELECT * FROM my_test_table)
I am open to alternate suggestions.
This will give you the columns that are present in the prod table and not in the test table, or the other way around:
(SELECT column_name
FROM information_schema.columns
WHERE table_schema = 'my_test_schema'
AND table_name = 'my_test_table'
EXCEPT
SELECT column_name
FROM information_schema.columns
WHERE table_schema = 'my_prod_schema'
AND table_name = 'my_prod_table')
UNION
(SELECT column_name
FROM information_schema.columns
WHERE table_schema = 'my_prod_schema'
AND table_name = 'my_prod_table'
EXCEPT
SELECT column_name
FROM information_schema.columns
WHERE table_schema = 'my_test_schema'
AND table_name = 'my_test_table');
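To cover point 3 of the question (not typing the column names by hand), one hedged sketch is to take the columns the two tables have in common with INTERSECT, aggregate them with string_agg(), and splice the list into the EXCEPT/UNION ALL comparison with format() and EXECUTE inside a DO block. The schema names and the diff_result table are placeholders:
DO $$
DECLARE
    col_list text;
BEGIN
    -- Columns present in BOTH tables. To skip intentionally different columns,
    -- add AND column_name NOT IN (...) to the WHERE clauses below.
    SELECT string_agg(quote_ident(column_name), ', ' ORDER BY column_name)
      INTO col_list
    FROM (
        SELECT column_name
        FROM information_schema.columns
        WHERE table_schema = 'my_test_schema'
          AND table_name   = 'my_test_table'
        INTERSECT
        SELECT column_name
        FROM information_schema.columns
        WHERE table_schema = 'my_prod_schema'
          AND table_name   = 'my_prod_table'
    ) common_cols;

    -- Materialize the symmetric difference for inspection.
    EXECUTE format(
        'CREATE TEMP TABLE diff_result AS
         (SELECT %1$s FROM my_test_schema.my_test_table
          EXCEPT
          SELECT %1$s FROM my_prod_schema.my_prod_table)
         UNION ALL
         (SELECT %1$s FROM my_prod_schema.my_prod_table
          EXCEPT
          SELECT %1$s FROM my_test_schema.my_test_table)',
        col_list);
END $$;

SELECT * FROM diff_result;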
I have a table with 100 columns and I need to get distinct records across all the columns of the table.
I used the query below to get distinct records from the table:
select distinct col1, col2, col3, ... from test_table
but is there a better query to fetch distinct records across all the columns
without listing the column names in the query?
Since you want DISTINCT on all columns, and you want to select all columns, it couldn't be simpler:
SELECT DISTINCT * FROM test_table
I am not sure if there is a simpler way, but you can use information_schema to get your columns and then use them.
SELECT string_agg(column_name::character varying, ',') as columns
FROM information_schema.columns
WHERE table_schema = 'schema_name'
AND table_name = 'table_name'
This will return the list of columns in your table as a single comma-separated string.
SELECT string_agg(column_name::character varying, ',') as columns
FROM information_schema.columns
WHERE table_schema = 'schema_name'
AND table_name = 'table_name' \gset
You can read more about \gset in the psql documentation.
For example, if your table has two columns 'a' and 'b', \gset will store 'a,b' in the variable :columns.
\echo can be used to check what \gset has stored:
\echo :columns
The following query might help you:
select distinct :columns from table_name;
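One caveat, as an addition to this answer: if any column names need quoting (mixed case, reserved words), wrapping them in quote_ident() when building the list keeps the interpolated query valid:
SELECT string_agg(quote_ident(column_name::text), ',') as columns
FROM information_schema.columns
WHERE table_schema = 'schema_name'
AND table_name = 'table_name' \gset
The rest works as above: \echo :columns to inspect the list, then select distinct :columns from table_name;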
There is a use case in my project where I want to show which users have access to a given database/schema/table in PostgreSQL. Suppose I have created a database employee. I want to list the users who can access this database, and the same for schemas and tables. I tried this:
SELECT *
FROM information_schema.tables
WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
  AND table_schema NOT LIKE 'pg_toast%'
But it only shows what the current user has access to. I want the list of users who have access to a given database/schema/table/column.
You can use the function has_table_privilege() to achieve what you want.
select has_table_privilege('user_name', 'schema.table', 'select');
More info in the PostgreSQL documentation on access privilege inquiry functions.
I suppose you need to be a superuser to get complete results. The query below shows all the users who have the privilege, not necessarily whether they are actively accessing the tables.
Now you can tweak the query to join all users with the list of tables:
SET search_path TO public, schema1;
SELECT *,
usename,
has_table_privilege(usename, table_schema||'.'||table_name, 'select') as has_privilege
FROM SVV_TABLES
JOIN PG_USER
ON 1=1
WHERE table_schema = 'schema1';
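Note that SVV_TABLES is an Amazon Redshift system view; on plain PostgreSQL a roughly equivalent sketch (my adaptation, not part of the original answer) cross joins information_schema.tables with pg_user:
-- Sketch: for every user and every non-system table, does the user have SELECT?
SELECT t.table_schema,
       t.table_name,
       u.usename,
       has_table_privilege(u.usename,
                           quote_ident(t.table_schema::text) || '.' || quote_ident(t.table_name::text),
                           'select') AS has_select
FROM information_schema.tables t
CROSS JOIN pg_catalog.pg_user u
WHERE t.table_schema NOT IN ('pg_catalog', 'information_schema')
ORDER BY t.table_schema, t.table_name, u.usename;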
I have to update all columns of type "uuid" to "varchar(38)". I created all the necessary queries with:
SELECT format(
'ALTER TABLE %I.%I.%I ALTER COLUMN %I SET DATA TYPE varchar(38);',
table_catalog,
table_schema,
table_name,
column_name
)
FROM information_schema.columns
WHERE data_type = 'uuid'
AND table_schema NOT LIKE 'pg_%'
AND lower(table_schema) <> 'information_schema'
AND is_updatable = 'YES';
Obviously, I can't execute the resulting queries because of all the existing PK and FK constraints involving the uuid columns.
Is there a way to temporarily disable the constraints, then execute all the queries and re-enable the constraints afterwards, without dropping them?
Or, if I have to drop all the constraints first, is there a way to set them all up again after the updates? I am not the creator of the database, so I don't have all the queries needed to create the constraints again.
I found a way to generate the queries for dropping and recreating all constraints of the database.
So first I have to save the output of the first query
SELECT 'ALTER TABLE "'||nspname||'"."'||relname||'" DROP CONSTRAINT "'||conname||'";'
FROM pg_constraint
INNER JOIN pg_class ON conrelid=pg_class.oid
INNER JOIN pg_namespace ON pg_namespace.oid=pg_class.relnamespace
ORDER BY CASE WHEN contype='f' THEN 0 ELSE 1 END, contype, nspname, relname, conname;
and of the second query
SELECT 'ALTER TABLE "'||nspname||'"."'||relname||'" ADD CONSTRAINT "'||conname||'" '||
pg_get_constraintdef(pg_constraint.oid)||';'
FROM pg_constraint
INNER JOIN pg_class ON conrelid=pg_class.oid
INNER JOIN pg_namespace ON pg_namespace.oid=pg_class.relnamespace
ORDER BY CASE WHEN contype='f' THEN 0 ELSE 1 END DESC, contype DESC, nspname DESC, relname DESC, conname DESC;
Once I had all the queries, I first dropped every constraint, updated the tables, and then executed the queries to add the constraints again. Worked perfectly!
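A hedged sketch of how the whole workflow could be scripted in psql: \gexec runs each row of a query result as a statement, so the generated DROP CONSTRAINT, ALTER COLUMN, and ADD CONSTRAINT statements can be executed back to back inside one transaction. Treat this as a sketch rather than a drop-in script (in particular, the uuid-to-varchar conversion may need a USING clause depending on your setup):
BEGIN;

-- 1. Drop all constraints (foreign keys first, per the ORDER BY).
SELECT 'ALTER TABLE "'||nspname||'"."'||relname||'" DROP CONSTRAINT "'||conname||'";'
FROM pg_constraint
INNER JOIN pg_class ON conrelid=pg_class.oid
INNER JOIN pg_namespace ON pg_namespace.oid=pg_class.relnamespace
ORDER BY CASE WHEN contype='f' THEN 0 ELSE 1 END, contype, nspname, relname, conname \gexec

-- 2. Change the column types (schema-qualified names are enough within one database).
SELECT format('ALTER TABLE %I.%I ALTER COLUMN %I SET DATA TYPE varchar(38);',
              table_schema, table_name, column_name)
FROM information_schema.columns
WHERE data_type = 'uuid'
  AND table_schema NOT LIKE 'pg_%'
  AND lower(table_schema) <> 'information_schema'
  AND is_updatable = 'YES' \gexec

-- 3. Recreate the constraints (foreign keys last).
SELECT 'ALTER TABLE "'||nspname||'"."'||relname||'" ADD CONSTRAINT "'||conname||'" '||
       pg_get_constraintdef(pg_constraint.oid)||';'
FROM pg_constraint
INNER JOIN pg_class ON conrelid=pg_class.oid
INNER JOIN pg_namespace ON pg_namespace.oid=pg_class.relnamespace
ORDER BY CASE WHEN contype='f' THEN 0 ELSE 1 END DESC, contype DESC, nspname DESC, relname DESC, conname DESC \gexec

COMMIT;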