I have a query in DB2 to get a table's definition/structure:
SELECT c.column_name,
c.column_default,
c.data_type,
t.table_name,
c.character_maximum_length AS LENGTH
FROM sysibm.tables t
JOIN sysibm.columns c
ON t.table_schema = c.table_schema
AND t.table_name = c.table_name
WHERE t.table_schema = 'YOUR_SCHEMA'
AND t.table_name = 'YOUR_TABLE'
In this query I also want to get whether each column is nullable (true or false).
Can anyone please help?
There should be a column, IS_NULLABLE , in the sysibm.columns table.
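For example, here is a minimal sketch of the same query with that column added (an assumption based on DB2's sysibm catalog views, where is_nullable reports 'YES' or 'NO'; 'YOUR_SCHEMA' and 'YOUR_TABLE' are placeholders):
SELECT c.column_name,
c.column_default,
c.data_type,
t.table_name,
c.character_maximum_length AS LENGTH,
c.is_nullable -- 'YES' if the column accepts NULLs, 'NO' otherwise
FROM sysibm.tables t
JOIN sysibm.columns c
ON t.table_schema = c.table_schema
AND t.table_name = c.table_name
WHERE t.table_schema = 'YOUR_SCHEMA'
AND t.table_name = 'YOUR_TABLE'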
I'm working on PostgreSQL and I have this code to select all the tables and their columns from my database:
select t.table_name
, array_agg(c.column_name::text) as columns
from information_schema.tables t
join information_schema.columns c
on t.table_name = c.table_name
where t.table_schema = 'public'
and t.table_type = 'BASE TABLE'
and c.table_schema = 'public'
group
by t.table_name;
I'm trying to modify it to give me only the tables that contain a specific column name, e.g. 'email'.
The problem is that when I add another "and" condition, it returns only that one column for each table instead of all of them:
where t.table_schema = 'public' and t.table_type= 'BASE TABLE' and c.table_schema = 'public' and c.column_name = 'email'
Use a having clause:
select t.table_name
, array_agg(c.column_name::text) as columns
from information_schema.tables t
join information_schema.columns c
on t.table_name = c.table_name
and t.table_schema = c.table_schema
where t.table_schema = 'public'
and t.table_type = 'BASE TABLE'
group by t.table_name
having 'email' = any(array_agg(c.column_name::text))
If you want to check for multiple columns, you could do that like this:
select t.table_name
, array_agg(c.column_name::text) as columns
from information_schema.tables t
join information_schema.columns c
on t.table_name = c.table_name
and t.table_schema = c.table_schema
where t.table_schema = 'public'
and t.table_type = 'BASE TABLE'
group by t.table_name
having array_agg(c.column_name::text) @> array['email', 'phone']
As in the title, I would like to know how I can check whether a column is non-nullable.
For Oracle I have:
SELECT Nullable
FROM user_tab_columns
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
but how do I transform it for PostgreSQL?
I tried something like this, but I am getting ERROR: column "is_nullable" does not exist:
SELECT is_nullable
FROM information_schema.tables
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
///EDIT
After modification:
SELECT is_nullable
FROM information_schema.columns
WHERE table_name = 'TOP_VALIDATION_RULE'
AND column_name = 'ERROR_LEVEL'
I get the expected result ('YES' or 'NO').
Use pg_attribute to check whether the column is nullable or not:
SELECT attnotnull FROM pg_attribute WHERE attname = 'ERROR_LEVEL'
Note: The above query will return more than one row if the ERROR_LEVEL column appears in more than one table in the database.
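If you need to restrict the check to one specific table, here is a sketch along the same lines (using the names from the question; note that unquoted identifiers are stored in lower case in the catalogs, so adjust the literals to how the objects were actually created):
SELECT attnotnull -- true means the column has a NOT NULL constraint, i.e. it is non-nullable
FROM pg_attribute
WHERE attrelid = 'top_validation_rule'::regclass -- resolves the table name to its OID; errors if the table does not exist
AND attname = 'error_level'
AND attnum > 0; -- skip system columns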
This is how we can check table existence in MSSQL:
IF OBJECT_ID(N'public."TABLE_NAME"', N'U') IS NOT NULL
select 1 as 'column'
else
select 0 as 'column';
which returns the outcome in a column named 'column'.
How can I do the same thing in PostgreSQL? I want it to return 1 or 0 for the respective outcome.
Use a SELECT with an EXISTS operator checking e.g. information_schema.tables:
select exists (select *
from information_schema.tables
where table_name = 'table_name'
and table_schema = 'public') as table_exists;
If you can't (or won't) deal with proper boolean values, then simply cast the result to a number (but I have no idea why that should be better):
select exists (select *
from information_schema.tables
where table_name = 'table_name'
and table_schema = 'public')::int as "column";
Note that column is a reserved keyword and thus you need to quote it using double quotes.
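An alternative sketch, assuming PostgreSQL 9.4 or later: to_regclass() returns NULL when the named relation does not exist, so the check can be written without information_schema:
select (to_regclass('public.table_name') is not null)::int as "column";
-- caveat: to_regclass matches any relation (table, view, sequence, ...), not only base tables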
To check whether a column exists in a table, use the pg_attribute and pg_class catalogs:
IF EXISTS ( SELECT attname
FROM pg_attribute
WHERE attrelid = (SELECT oid FROM pg_class WHERE relname = 'YOURTABLENAME')
AND attname = 'YOURCOLUMNNAME')
THEN
-- do something
END IF;
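Note that IF ... END IF only works inside PL/pgSQL, so as a standalone statement this will not run. A minimal runnable sketch wraps it in an anonymous DO block (the table name, column name and RAISE NOTICE actions are placeholders):
DO $$
BEGIN
    IF EXISTS ( SELECT attname
                FROM pg_attribute
                WHERE attrelid = (SELECT oid FROM pg_class WHERE relname = 'yourtablename') -- fails if the name exists in more than one schema
                AND attname = 'yourcolumnname')
    THEN
        RAISE NOTICE 'column exists';
    ELSE
        RAISE NOTICE 'column does not exist';
    END IF;
END
$$;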
For MySQL, use INFORMATION_SCHEMA.COLUMNS (the square brackets mark optional clauses):
SELECT 1
FROM INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'tbl_name'
[AND table_schema = 'db_name']
[AND column_name LIKE 'wild']
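For instance, a concrete sketch that checks whether a column named 'email' exists in table 'tbl_name' of schema 'db_name' (all three names are placeholders):
SELECT COUNT(*) AS column_exists -- 1 if the column exists, 0 otherwise
FROM INFORMATION_SCHEMA.COLUMNS
WHERE table_schema = 'db_name'
AND table_name = 'tbl_name'
AND column_name = 'email';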
I have a database which contains multiple schemas with tables in them.
I want to get every schema name and at the same time check whether the schema has a table named 'status'.
I got two queries for that:
This query returns all schemas of the database:
select schema_name from information_schema.schemata
With the returned schema names I then check, for every schema, whether the table 'status' exists:
select exists(select * from information_schema.tables where table_schema = 'the_schema_name' and table_name = 'status')
My question is now: can I combine these two queries into one?
Thanks in advance
Doobie
Use a correlated sub-query:
select s.schema_name,
exists (select *
from information_schema.tables t
where t.table_schema = s.schema_name
and t.table_name = 'status') as status_exists
from information_schema.schemata s;
If you just want to find the schemas where the table does not exist, you can do that with the following query:
select s.schema_name
from information_schema.schemata s
where not exists (select *
from information_schema.tables t
where t.table_schema = s.schema_name
and t.table_name = 'status');
I am using pgAdmin to execute queries. If there are 30 tables in the database and one column XYZ is used in at least 10 to 15 of them, how can I get the tables in which that particular column is used?
Kindly help me out
You can use the information_schema views:
select table_schema,
table_name
from information_schema.columns
where column_name = 'xyz';
Just right-click on your schema and you will find an option "Search Objects".
Click that option and select the object type you want to search for in the newly opened window.
Then fill in the pattern as the name (or as per the option you selected), with "%" as prefix and suffix.
select t.table_schema,
t.table_name
from information_schema.tables t
inner join information_schema.columns c on c.table_name = t.table_name
and c.table_schema = t.table_schema
where c.column_name = 'last_name'
and t.table_schema not in ('information_schema', 'pg_catalog')
and t.table_type = 'BASE TABLE'
order by t.table_schema;
Replace that 'last_name' with your column XYZ
Thanks
Just replace "column_x" with your desired column:
select t.table_schema,
t.table_name
from information_schema.tables t
inner join information_schema.columns c on c.table_name = t.table_name
and c.table_schema = t.table_schema
where c.column_name = 'column_x'
and t.table_schema not in ('information_schema', 'pg_catalog')
and t.table_type = 'BASE TABLE'
order by t.table_schema;
Original from https://dataedo.com/kb/query/postgresql/find-tables-with-specific-column-name
Just right-click on the databases list, then go to Refresh, Schemas, public, Tables.
When I was learning pgAdmin and PostgreSQL, I found this tool helpful, as it allows you to play with the UI:
http://choose.tools/tool