PostgreSQL information schema query for tables without a specific data type

I'm trying to write a PostgreSQL query that returns all the tables (from a specified schema) which don't have any columns of a specific data type, for example, all tables without any integer columns. So far I can only get the table names, the data types of their columns, and a count of each, but I suspect that's the wrong direction for what I want. Any help appreciated, thanks
SELECT Table_Name, Data_Type, COUNT(Data_Type)
FROM Information_schema.Columns
WHERE Table_Schema = 'project'
GROUP BY Table_Name, Data_Type

You'll want to start with the tables view and then use a NOT EXISTS subquery against columns:
SELECT table_name
FROM information_schema.tables t
WHERE table_schema = 'project'
AND NOT EXISTS (
    SELECT *
    FROM information_schema.columns c
    WHERE c.table_schema = t.table_schema
      AND c.table_name = t.table_name
      AND c.data_type = 'integer'
);
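If you'd rather stay close to your original GROUP BY attempt, here is a sketch of the same idea using an aggregate FILTER clause (PostgreSQL 9.4+; note it only sees tables that have at least one column):
SELECT table_name
FROM information_schema.columns
WHERE table_schema = 'project'
GROUP BY table_name
HAVING count(*) FILTER (WHERE data_type = 'integer') = 0;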

Related

PostgreSQL : How to select rows from multiple tables inside different schemas?

I am trying to select all rows from multiple tables inside various schemas.
These are the tables inside different schemas
schema_1-->ABC_table_1,XYZ_table_2
schema_2-->ABC_table_1,JLK_table_2
.
.
schema_N-->ABC_table_1,LMN_table_2
I am trying to select all rows from table_2 from all schemas:
This query is giving me all the tables:
SELECT
table_schema || '.' || table_name
FROM
information_schema.tables
WHERE
table_type = 'BASE TABLE'
AND
table_schema NOT IN ('pg_catalog', 'information_schema');
What I need to do is something like:
select * from schema_1.XYZ_table_2
Union all
select * from schema_2.JLK_table_2
.
.
schema_N.LMN_table_2
See if this option works for you (wrapping each row as JSON lets the UNION ALL succeed even when the tables have different column sets):
select row_to_json(row) from (select * from table1 ) row
UNION ALL
select row_to_json(row) from (select * from table2 ) row
;
If you don't want to work with JSON/hstore data manipulation, you can also generate the SELECT dynamically over all the relevant tables in the schema, and then a plain UNION ALL still works, as sketched below.
JSON makes this task easier, though, and is well supported for any further processing.
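A minimal sketch of that dynamic approach, assuming every *_table_2 table shares the same column layout; the query only generates the statement text, which you then execute yourself (for example with \gexec in psql):
-- build the UNION ALL statement over every *_table_2 table outside the system schemas
SELECT string_agg(
         format('SELECT * FROM %I.%I', table_schema, table_name),
         E'\nUNION ALL\n'
       ) AS union_query
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
  AND table_schema NOT IN ('pg_catalog', 'information_schema')
  AND table_name LIKE '%\_table\_2';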

How can I find point geometry typed tables from postgis?

I want to get all tables with point geometry columns from PostGIS. Can I use an SQL select for this?
I can list all tables with select * from information_schema.tables.
And I can get all geometry columns like this:
SELECT type FROM geometry_columns;
This query returns "GEOMETRY"
But I want to select all the tables that have the POINT geometry type.
If I get your question right, you can just query it from information_schema with:
select distinct table_schema, table_name
from information_schema.columns
where data_type = 'point';
e.g.:
t=# create table p(i point);
CREATE TABLE
t=# select distinct table_schema,table_name from information_schema.columns where data_type = 'point';
table_schema | table_name
--------------+------------
postgres | p
(1 row)
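If your columns are actual PostGIS geometry columns rather than the built-in point type, the geometry_columns view exposes the declared subtype, so a sketch like this finds the tables declared with a POINT subtype:
SELECT f_table_schema, f_table_name, f_geometry_column
FROM geometry_columns
WHERE type = 'POINT';
Columns declared as plain GEOMETRY (which is what your "GEOMETRY" output suggests) won't show up there; for those you'd have to inspect the stored values, e.g. with ST_GeometryType, instead.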

Select columns of a particular data type in PostgreSQL

I am looking for a piece of information, but the table I'm looking at has dozens of columns, and I can't remember the exact name of the column. I only know that it is of a date type. Is there a way to select only date columns so that it will be easier to find the name of the column?
e.g.
SELECT * FROM "MySchema"."MyTable"
WHERE {column.data_type} = 'date'
You can use the information_schema.columns view:
select column_name
from information_schema.columns
where table_schema = 'MySchema' and table_name = 'MyTable' and data_type = 'date';
Now that you have the names of the columns of type date, you can use that information to create a view that selects only the values of those columns, along the lines of the sketch below.
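A rough sketch of that last step, where date_cols_view is just a placeholder name; the query only produces the CREATE VIEW text, which you would then run:
SELECT format(
         'CREATE VIEW "MySchema".date_cols_view AS SELECT %s FROM "MySchema"."MyTable"',
         string_agg(quote_ident(column_name), ', ')
       ) AS create_view_sql
FROM information_schema.columns
WHERE table_schema = 'MySchema'
  AND table_name = 'MyTable'
  AND data_type = 'date';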

PostgreSQL - Query tables' columns' size

I want to know the sizes of a table's columns, so I can limit input length in the form (I'm also checking it in the backend).
How can I find this out?
Try this query; it returns the column name, data type, and maximum length for every column of a given table 'table_name':
SELECT column_name, data_type, character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'table_name';
You could modify it to return a single column by replacing 'column_name' with yours, like so:
SELECT column_name, data_type, character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'table_name'
AND column_name = 'column_name';
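Keep in mind that character_maximum_length is only populated for character types (for text and non-character columns it is NULL); numeric columns report their size through numeric_precision and numeric_scale instead, so a broader sketch could be:
SELECT column_name, data_type,
       character_maximum_length, numeric_precision, numeric_scale
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'table_name';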

SELECT ALL column_names in postgresql

I'm using PostgreSQL and I want to create a query that will display all column_names in a specific table.
Schema: codes
Table Name: watch_list
Here are the column_names in my table:
watch_list_id, watch_name, watch_description
I tried what I found on the web:
SELECT *
FROM information_schema.columns
WHERE table_schema = 'codes'
AND table_name = 'watch_list'
Its output is not what I wanted. It should be:
watch_list_id, watch_name, watch_description
How to do this?
If you want all column names in a single row, you need to aggregate those names:
SELECT table_name, string_agg(column_name, ', ' order by ordinal_position) as columns
FROM information_schema.columns
WHERE table_schema = 'codes'
AND table_name = 'watch_list'
GROUP BY table_name;
If you remove the condition on the table name, you get this for all tables in that schema.
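For example, dropping that condition yields one aggregated row per table in the schema:
SELECT table_name,
       string_agg(column_name, ', ' ORDER BY ordinal_position) AS columns
FROM information_schema.columns
WHERE table_schema = 'codes'
GROUP BY table_name;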
SELECT table_name FROM information_schema.tables WHERE table_schema='public'