SELECT all column names in PostgreSQL

I'm using PostgreSQL and I want to write a query that lists all column names of a specific table.
Schema: codes
Table Name: watch_list
Here are the column_names in my table:
watch_list_id, watch_name, watch_description
I tried what I found on the web:
SELECT *
FROM information_schema.columns
WHERE table_schema = 'codes'
AND table_name = 'watch_list'
Its output is not what I wanted. It should be:
watch_list_id, watch_name, watch_description
How to do this?

If you want all column names in a single row, you need to aggregate those names:
SELECT table_name, string_agg(column_name, ', ' order by ordinal_position) as columns
FROM information_schema.columns
WHERE table_schema = 'codes'
AND table_name = 'watch_list'
GROUP BY table_name;
If you remove the condition on the table name, you get this for all tables in that schema.
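If you prefer a real array over a comma-separated string, array_agg works the same way. A sketch under the same schema/table names as above; the cast to text is there because column_name has the type information_schema.sql_identifier:

```sql
-- Same aggregation, but returning a PostgreSQL text[] array
SELECT table_name,
       array_agg(column_name::text ORDER BY ordinal_position) AS columns
FROM information_schema.columns
WHERE table_schema = 'codes'
  AND table_name = 'watch_list'
GROUP BY table_name;
```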

SELECT table_name FROM information_schema.tables WHERE table_schema='public'

Related

Show all non-empty tables with a specific column, PostgreSQL

The problem is the following: how, through a query in PostgreSQL/pgAdmin, can I get all the tables that have a specific column, where that column is not empty?
The solution I came to is the following.
SELECT relname, n_tup_ins
FROM pg_stat_all_tables
WHERE relname IN (
SELECT table_name
FROM information_schema.columns
WHERE column_name = 'company_id'
AND table_schema = 'public'
)
ORDER BY n_tup_ins DESC
Where company_id is the column I was looking for.
I hope this helps someone.
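A caveat on the answer above: pg_stat_all_tables.n_tup_ins counts rows inserted since the statistics were last reset, so it can be zero for a table that holds data, or nonzero for one that is now empty. As a sketch (my_table is a placeholder for one of the candidate tables), you can check the column directly:

```sql
-- my_table is a placeholder: check whether the column actually
-- holds any non-NULL values in a given candidate table
SELECT EXISTS (
    SELECT 1 FROM my_table WHERE company_id IS NOT NULL
) AS has_values;
```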

PostgreSQL information schema query for tables without a specific data type

I'm trying to write a PostgreSQL query to get all the tables (from a specified schema) that don't have any columns of a specific data type; for example, show all tables without any integer columns. So far I can only get the table names, the data types of the columns they have, and their count, but I feel like that's the wrong direction for what I want. Any help appreciated, thanks.
SELECT Table_Name, Data_Type, COUNT(Data_Type)
FROM Information_schema.Columns
WHERE Table_Schema = 'project'
GROUP BY Table_Name, Data_Type
You'll want to start with the tables table and then use an EXISTS subquery:
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'project'
AND NOT EXISTS (
SELECT *
FROM information_schema.columns
WHERE columns.table_schema = tables.table_schema
AND columns.table_name = tables.table_name
AND data_type = 'integer'
)
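An alternative sketch using aggregation instead of NOT EXISTS, relying on the FILTER clause (PostgreSQL 9.4+). Note this reads only information_schema.columns, so it also lists views; join against information_schema.tables if you need base tables only:

```sql
-- One row per table; keep only tables whose count of
-- 'integer' columns is zero
SELECT table_name
FROM information_schema.columns
WHERE table_schema = 'project'
GROUP BY table_name
HAVING count(*) FILTER (WHERE data_type = 'integer') = 0;
```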

How do I check whether a column is non-nullable?

As in the title, I would like to know how I can check whether a column is non-nullable.
For Oracle I have:
SELECT Nullable
FROM user_tab_columns
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
but how to transform it for postgresql?
I tried something like this, but I'm getting ERROR: column "is_nullable" does not exist:
SELECT is_nullable
FROM information_schema.tables
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
EDIT:
After modification:
SELECT is_nullable
FROM information_schema.columns
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
I get:
Use pg_attribute to check whether the column is nullable or not:
SELECT attnotnull FROM pg_attribute WHERE attname = 'ERROR_LEVEL'
Note: The above query will return more than one row if the ERROR_LEVEL column appears in more than one table in the database.
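To restrict that query to a single table (and skip dropped columns), you can qualify attrelid via a regclass cast. A sketch assuming the table lives in schema public; the names are lowercased because pg_catalog stores unquoted identifiers in lower case:

```sql
-- Scope pg_attribute to one table; attisdropped excludes dropped columns
SELECT attnotnull
FROM pg_attribute
WHERE attrelid = 'public.top_validation_rule'::regclass
  AND attname = 'error_level'
  AND NOT attisdropped;
```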

PostgreSQL - Query tables' columns' size

I want to know the sizes of my tables' columns, so I can limit input length in the form (I'm also checking it in the backend).
How can I know this?
Try this query; it returns the column name, data type, and maximum length for every column of a given table 'table_name':
SELECT column_name, data_type, character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'table_name';
You could modify it to return a single column by replacing 'column_name' with yours, like so:
SELECT column_name, data_type, character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'table_name'
AND column_name = 'column_name';
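Keep in mind that character_maximum_length is NULL for types without a declared length (text, unconstrained varchar, integer, and so on), so the form-side limit needs to handle that case. A sketch, where mapping NULL to -1 is just a convention chosen here:

```sql
-- NULL means "no declared length"; -1 is an arbitrary
-- sentinel for "unlimited" chosen for this example
SELECT column_name,
       data_type,
       COALESCE(character_maximum_length, -1) AS max_len
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'table_name';
```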

PostgreSQL: Search all tables of a database for fields named like *active*

In my public schema I have 1200 tables.
Somewhere in one or more of these tables there are some fields with names like "active",
for example:
- status_active
- hr_active
- who_knows_what_active_could_be
I want to find them all using pgAdmin or the normal console client.
How could I do this quickly and with few resources?
Try:
SELECT *
FROM information_schema.columns
WHERE TRUE
AND table_schema = 'public'
AND column_name ~* 'active'
You can try:
SELECT table_name, tables.table_catalog, tables.table_schema, column_name
FROM information_schema.columns
INNER JOIN information_schema.tables
USING (table_name, table_schema)
WHERE table_type = 'BASE TABLE'
AND column_name ILIKE '%active%'
SELECT * FROM information_schema.columns
WHERE table_schema = 'your schema name'
--AND table_name ILIKE '%your keyword%' --when you need to search for a TABLE
AND column_name ILIKE '%your keyword%' --when you need to search for a COLUMN