PostgreSQL - Query tables' columns' sizes

I want to know the sizes of my tables' columns so that I can limit input length in the form (I'm also checking it in the backend).
How can I find this out?

Try this query; it returns the column name, data type, and maximum length for every column of a given table 'table_name':
SELECT column_name, data_type, character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'table_name';
You can modify it to return a single column by replacing 'column_name' with yours, like so:
SELECT column_name, data_type, character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
AND table_name = 'table_name'
AND column_name = 'column_name';
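Note that character_maximum_length is only populated for length-constrained character types such as varchar(n) or char(n); for text, integer, numeric and similar types it is NULL. If you also care about numeric columns, a variant along these lines (still using only standard information_schema columns) shows their precision and scale as well:
SELECT column_name,
       data_type,
       character_maximum_length,
       numeric_precision,
       numeric_scale
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'table_name'
ORDER BY ordinal_position;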

Related

Show all non-empty tables with a specific column in PostgreSQL

The problem is the following: how, with a query in PostgreSQL/pgAdmin, can I get all the tables that have a specific column and in which that column is not empty?
The solution I came up with is the following.
SELECT relname, n_tup_ins
FROM pg_stat_all_tables
WHERE relname IN (
SELECT table_name
FROM information_schema.columns
WHERE column_name = 'company_id'
AND table_schema = 'public'
)
ORDER BY n_tup_ins DESC
Where company_id is the column I was looking for.
I hope this helps someone.
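Keep in mind that n_tup_ins counts rows inserted since the statistics were last reset, not the rows currently in the table. If "not empty" should mean "currently contains live rows", a sketch based on n_live_tup (also from pg_stat_all_tables; it is an estimate maintained by the statistics collector) may be closer to what you want:
SELECT relname, n_live_tup
FROM pg_stat_all_tables
WHERE relname IN (
    SELECT table_name
    FROM information_schema.columns
    WHERE column_name = 'company_id'
      AND table_schema = 'public'
)
AND n_live_tup > 0
ORDER BY n_live_tup DESC;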

How do I check whether a column is non-nullable?

As in the title, I would like to know how I can check whether a column is non-nullable.
For Oracle I have:
SELECT Nullable
FROM user_tab_columns
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
but how do I transform it for PostgreSQL?
I tried something like this, but I get ERROR: column "is_nullable" does not exist:
SELECT is_nullable
FROM information_schema.tables
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
///EDIT
After modification:
SELECT is_nullable
FROM information_schema.columns
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
Use pg_attribute to check whether the column is nullable or not:
SELECT attnotnull FROM pg_attribute WHERE attname = 'ERROR_LEVEL'
Note: The above query will return more than one row if the ERROR_LEVEL column appears in more than one table in the database.
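To restrict the lookup to a single table and skip dropped columns, you can join pg_attribute against pg_class and pg_namespace. A sketch, assuming the table is in the public schema and was created without quoted identifiers (unquoted names are folded to lower case, which is also why filtering on 'ERROR_LEVEL' may return nothing):
SELECT a.attname,
       a.attnotnull   -- true means the column is NOT NULL
FROM pg_attribute a
JOIN pg_class c ON c.oid = a.attrelid
JOIN pg_namespace n ON n.oid = c.relnamespace
WHERE n.nspname = 'public'
  AND c.relname = 'top_validation_rule'
  AND a.attname = 'error_level'
  AND NOT a.attisdropped;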

PostgreSQL: how to check table existence?

This is how we can check table existence in MSSQL:
IF OBJECT_ID(N'public."TABLE_NAME"', N'U') IS NOT NULL
select 1 as 'column'
else
select 0 as 'column';
which returns the outcome in a column named 'column'.
How can I do the same thing in PostgreSQL? I want to return 1 or 0 for the respective outcome.
Use a SELECT with an EXISTS operator checking e.g. information_schema.tables:
select exists (select *
from information_schema.tables
where table_name = 'table_name'
and table_schema = 'public') as table_exists;
If you can't (or won't) deal with proper boolean values, then simply cast the result to a number (although I have no idea why that would be better):
select exists (select *
from information_schema.tables
where table_name = 'table_name'
and table_schema = 'public')::int as "column";
Note that column is a reserved keyword and thus you need to quote it using double quotes.
To check whether a column exists in a table, use the pg_attribute catalog:
IF EXISTS ( SELECT attname
FROM pg_attribute
WHERE attrelid = (SELECT oid FROM pg_class WHERE relname = 'YOURTABLENAME')
AND attname = 'YOURCOLUMNNAME')
THEN
-- do something
END IF;
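Note that IF ... THEN is PL/pgSQL rather than plain SQL, so a block like the one above has to run inside a function or a DO block. A minimal sketch with placeholder table and column names (the ::regclass cast will raise an error if the table itself does not exist):
DO $$
BEGIN
    IF EXISTS (SELECT 1
               FROM pg_attribute
               WHERE attrelid = 'public.yourtablename'::regclass
                 AND attname = 'yourcolumnname'
                 AND NOT attisdropped)
    THEN
        RAISE NOTICE 'column exists';
    END IF;
END
$$;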
For MySQL, use INFORMATION_SCHEMA.COLUMNS:
SELECT 1
FROM INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'tbl_name'
[AND table_schema = 'db_name']
[AND column_name LIKE 'wild']

PostgreSQL: get table columns from different schemas

I have a query and I'm using a few schemas. I want to get the column names of tables that live in different schemas.
select column_name
from information_schema.columns
where table_name='public.combine'
or table_name='kds.2014_new'
or table_name='public.point'
or table_name='spt.point'
When I run this query I get 0 results. How can I solve this problem?
You have to separate table_name and table_schema:
SELECT column_name
FROM information_schema.columns
WHERE (table_name = 'combine' AND table_schema = 'public')
OR (table_name = '2014_new' AND table_schema = 'kds')
OR (table_name = 'point' AND table_schema = 'public')
OR (table_name = 'point' AND table_schema = 'spt')
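An equivalent, slightly more compact variant compares (table_schema, table_name) pairs with a row-value IN list:
SELECT table_schema, table_name, column_name
FROM information_schema.columns
WHERE (table_schema, table_name) IN (('public', 'combine'),
                                     ('kds', '2014_new'),
                                     ('public', 'point'),
                                     ('spt', 'point'))
ORDER BY table_schema, table_name, ordinal_position;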

SELECT all column names in PostgreSQL

I'm using PostgreSQL and I want to create a query that will display all column_names in a specific table.
Schema: codes
Table Name: watch_list
Here are the column_names in my table:
watch_list_id, watch_name, watch_description
I tried what I found on the web:
SELECT *
FROM information_schema.columns
WHERE table_schema = 'codes'
AND table_name = 'watch_list'
Its output is not what I wanted. It should be:
watch_list_id, watch_name, watch_description
How to do this?
If you want all column names in a single row, you need to aggregate those names:
SELECT table_name, string_agg(column_name, ', ' order by ordinal_position) as columns
FROM information_schema.columns
WHERE table_schema = 'codes'
AND table_name = 'watch_list'
GROUP BY table_name;
If you remove the condition on the table name, you get this for all tables in that schema.
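That variant looks like this:
SELECT table_name,
       string_agg(column_name, ', ' ORDER BY ordinal_position) AS columns
FROM information_schema.columns
WHERE table_schema = 'codes'
GROUP BY table_name
ORDER BY table_name;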
A related query that lists all tables (rather than columns) in the public schema:
SELECT table_name FROM information_schema.tables WHERE table_schema = 'public';