oracle_fdw - Is it possible in PostgreSQL to import a foreign schema with LIMIT TO a synonym, as in the attached image?
For example, this does not work (LIMIT TO a synonym):
IMPORT FOREIGN SCHEMA "UserA" LIMIT TO (A_SYNONYM) FROM SERVER oracledb INTO public;
but it works when I import the foreign schema with LIMIT TO a view or a table:
IMPORT FOREIGN SCHEMA "UserB" LIMIT TO (B_VIEW) FROM SERVER oracledb INTO public;
So, is it possible to use synonyms in IMPORT FOREIGN SCHEMA's LIMIT TO (...) clause?
When processing IMPORT FOREIGN SCHEMA, oracle_fdw searches the Oracle dictionary view ALL_TAB_COLUMNS. According to the Oracle documentation:
ALL_TAB_COLUMNS describes the columns of the tables, views, and clusters accessible to the current user.
That sounds like synonyms won't be listed. You can easily test that by omitting the LIMIT TO clause: if no foreign tables are created for the synonyms, you have your proof.
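A minimal sketch of that test, reusing the server and schema names from the question and a hypothetical scratch schema:
CREATE SCHEMA scratch;
IMPORT FOREIGN SCHEMA "UserA" FROM SERVER oracledb INTO scratch;
-- list the foreign tables that were actually created
SELECT foreign_table_name FROM information_schema.foreign_tables
WHERE foreign_table_schema = 'scratch';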
As a workaround, you could import those tables from the schemas where they really reside.
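For example, if A_SYNONYM points at a table that actually resides in schema "UserC" (both the schema and table name here are hypothetical), you would import the underlying object directly:
IMPORT FOREIGN SCHEMA "UserC" LIMIT TO (UNDERLYING_TABLE) FROM SERVER oracledb INTO public;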
Related
When we use IMPORT FOREIGN SCHEMA in oracle_fdw, there is no IF NOT EXISTS option.
Because of this, if we re-execute the IMPORT FOREIGN SCHEMA command to import newly added tables/views, we get an error that the relation already exists.
Since we don't know which tables/views were missing from the previous execution, it becomes difficult to specify them in a LIMIT TO or EXCEPT clause.
Is there any workaround available to achieve the IF NOT EXISTS functionality?
There is no direct way to do that, but you can first find the foreign tables that already exist in the target schema and list them in the EXCEPT clause of IMPORT FOREIGN SCHEMA.
To find all foreign tables in a schema:
SELECT relname
FROM pg_class
WHERE relkind = 'f'
AND relnamespace = 'schemaname'::regnamespace;
Then
IMPORT FOREIGN SCHEMA "XYZ"
EXCEPT (/* list from above */)
FROM SERVER oraserver
INTO schemaname;
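If you want to automate both steps, a sketch along these lines should work (untested; it reuses the "XYZ", oraserver and schemaname placeholders from above and omits the EXCEPT clause when no foreign tables exist yet, since an empty EXCEPT list is a syntax error):
DO $$
DECLARE
    existing text;
BEGIN
    -- collect the foreign tables that already exist in the target schema
    SELECT string_agg(quote_ident(relname), ', ')
    INTO existing
    FROM pg_class
    WHERE relkind = 'f'
      AND relnamespace = 'schemaname'::regnamespace;

    IF existing IS NULL THEN
        EXECUTE 'IMPORT FOREIGN SCHEMA "XYZ" FROM SERVER oraserver INTO schemaname';
    ELSE
        EXECUTE format(
            'IMPORT FOREIGN SCHEMA "XYZ" EXCEPT (%s) FROM SERVER oraserver INTO schemaname',
            existing
        );
    END IF;
END;
$$;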
IMPORT FOREIGN SCHEMA myforeignschema LIMIT TO (tables_that_match_a_specific_condition)
FROM SERVER myserver INTO myschema;
I want to import a limited number of tables from a foreign schema, but I don't want to list them separately; rather, I want to import all tables in the foreign schema that match specific conditions (in my case, only those tables which are foreign tables themselves).
I have a PostgreSQL database that has had all of its objects in the public schema.
I used SQLAlchemy to successfully connect to it and reflect objects from it.
Now I needed to create a separate schema schema2 in the same database. I granted a new user all rights in that schema and checked that I can connect from the command line with psql and do things in it.
But SQLAlchemy doesn't see any tables in the schema when I try to use the same method to reflect its tables -- despite trying various ways to specify the schema!
This is what worked for initial connection to public schema:
from sqlalchemy.ext.automap import automap_base
from sqlalchemy import create_engine
from sqlalchemy import Table, Integer, String, Text, Column, ForeignKey

Base = automap_base()
sql_conn = 'postgres://my_user@/foo'
engine = create_engine(sql_conn)
Base.prepare(engine, reflect=True)  # reflect all tables and generate mapped classes
Then I could use Base.classes.* for tables and I didn't need to create Table classes on my own.
Now, this same approach works for the new user with the public schema as well.
But whenever I try to pass schema2 so that its tables get reflected, Base.classes.* always comes up empty.
I tried all the solutions in SQLAlchemy support of Postgres Schemas but I don't get anything at all reflected!
I tried:
setting the user's default schema to schema2 via SQL:
ALTER ROLE new_user SET search_path = schema2;
passing the schema to engine.connect via engine options
setting the schema in MetaData and using that, as per the SQLAlchemy docs:
Doing:
meta = MetaData(bind=engine, schema='schema2')
meta.reflect()
does work, as I can see all the tables correctly in meta.tables afterwards.
However, when I try to get Base.classes working as per the SQLAlchemy automapper docs, Base.classes doesn't get populated:
from sqlalchemy.ext.automap import automap_base
from sqlalchemy import create_engine
from sqlalchemy import Table, Integer, String, Text, Column, ForeignKey, MetaData

sql_conn = 'postgres://new_user@/foo'
engine = create_engine(sql_conn)
meta = MetaData(bind=engine, schema='schema2')
Base = automap_base(metadata=meta)
Base.prepare(engine, reflect=True)
Base.classes is empty...
I am now stumped. Any ideas?
PS: I am working with the newest SQLAlchemy available via pip (1.3), on Python 2.7, Ubuntu 18.04 LTS.
If I follow https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.PostgreSQL.CommonDBATasks.html#postgresql-commondbatasks-fdw, how can I prefix the tables with the name of the schema I am retrieving them from, e.g.
IMPORT FOREIGN SCHEMA lands
LIMIT TO (land, land2)
FROM SERVER foreign_server INTO public;
The created tables are named land and land2. Is it possible to prefix land and land2 with 'lands', e.g. 'lands_land' and 'lands_land2'?
With psql and recent PostgreSQL versions, you could simply run (after the IMPORT FOREIGN SCHEMA):
SELECT format(
'ALTER FOREIGN TABLE public.%I RENAME TO %I;',
relname,
'lands_' || relname
)
FROM pg_class
WHERE relkind = 'f' -- foreign table
AND relnamespace = 'public'::regnamespace \gexec
The \gexec will interpret each result row as an SQL statement and execute it.
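If \gexec is not available (it requires psql 9.6 or newer), a rough equivalent as a PL/pgSQL block would be:
DO $$
DECLARE
    r record;
BEGIN
    FOR r IN SELECT relname
             FROM pg_class
             WHERE relkind = 'f'
               AND relnamespace = 'public'::regnamespace
    LOOP
        -- prepend the source schema name to each foreign table
        EXECUTE format('ALTER FOREIGN TABLE public.%I RENAME TO %I',
                       r.relname, 'lands_' || r.relname);
    END LOOP;
END;
$$;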
Another option that I'd like better is to keep the original names, but use a different schema for the foreign tables:
IMPORT FOREIGN SCHEMA lands
LIMIT TO (land, land2)
FROM SERVER foreign_server INTO lands;
Then all foreign tables will be in the schema lands, and you get the same effect in a more natural fashion. You can adjust search_path to include the lands schema.
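A short sketch of the supporting statements for that variant (the schema has to exist before the IMPORT, and SET only affects the current session; use ALTER ROLE ... SET search_path to make it permanent):
CREATE SCHEMA lands;                 -- run this before the IMPORT FOREIGN SCHEMA above
SET search_path = public, lands;     -- foreign tables can now be referenced without qualification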
Is it possible to query the pg_catalog schema of a remote Postgres server? I'm trying to access some simple statistics of a remote server. I've tried importing the foreign schema, but it fails on an anyarray column.
psql> IMPORT FOREIGN SCHEMA pg_catalog LIMIT TO (pg_stats) FROM SERVER myserver into myschema;
ERROR: column "most_common_vals" has pseudo-type anyarray
CONTEXT: importing foreign table "pg_stats"
I'm able to individually import tables that don't have anyarray columns.
You could define a view (on the foreign side) which casts those columns to text, then create a foreign table for that view instead of the original. Not very elegant, but it works. But you do have to have create privs on the foreign side, or have the cooperation of someone who does.
I don't know what will happen for stats over bytea columns.
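A sketch of that workaround, assuming the foreign server is another PostgreSQL instance reached through postgres_fdw; the view name is hypothetical and the column list is trimmed to a few of pg_stats' columns:
-- on the remote server: expose pg_stats with the anyarray columns cast to text
CREATE VIEW public.pg_stats_txt AS
SELECT schemaname,
       tablename,
       attname,
       null_frac,
       n_distinct,
       most_common_vals::text AS most_common_vals,
       most_common_freqs,
       histogram_bounds::text AS histogram_bounds,
       correlation
FROM pg_catalog.pg_stats;

-- on the local server: import the view instead of pg_stats itself
IMPORT FOREIGN SCHEMA public LIMIT TO (pg_stats_txt) FROM SERVER myserver INTO myschema;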