I am trying to convert an SQL inner join query into a PostgreSQL inner join query. The tables used in this join are not all in one database; we split them across two databases, i.e. an application DB and a security DB:
The users and permission tables are in the security DB.
The userrolemapping and department tables are in the application DB.
I tried the query below, but I am getting the following error.
Error
ERROR: cross-database references are not implemented: "Rockefeller_ApplicationDb.public.userrolemapping"
LINE 4: INNER JOIN "Rockefeller_ApplicationDb".public.userro..
SQL Stored Function
SELECT Department.nDeptID
FROM Users
INNER JOIN Permission ON Users.nUserID = Permission.nUserID
INNER JOIN UserRoleMapping ON Users.nUserID = UserRoleMapping.nUserID
INNER JOIN Department ON Permission.nDeptInst = Department.nInstID
    AND Department.nInstID = 60
WHERE Users.nUserID = 3;
PostgreSQL Stored Function
SELECT dep.ndept_id
FROM "Rockefeller_SecurityDb".public.users as u
INNER JOIN "Rockefeller_SecurityDb".public.permissions p ON u.nuser_id = p.nuser_id
INNER JOIN "Rockefeller_ApplicationDb".public.userrolemapping as urm ON u.nuser_id = urm.nuser_id
INNER JOIN "Rockefeller_ApplicationDb".public.department dep ON p.ndept_inst = dep.ninst_id
AND dep.ninst_id = 60
WHERE
u.nuser_id = 3;
You cannot join tables from different databases.
Databases are logically separated in PostgreSQL by design.
If you want to join the tables, you should put them into different schemas in one database rather than into different databases.
Note that what is called “database” in MySQL is called a “schema” in standard SQL.
If you really need to join tables from different databases, you need to use a foreign data wrapper.
For future searchers: you can use dblink to connect to another database.
For example, run the following commands:
create extension dblink;
SELECT dblink_connect('otherdb','host=localhost port=5432 dbname=otherdb user=postgres password=???? options=-csearch_path=');
SELECT * FROM dblink('otherdb', 'select field1, field2 from public.tablex')
AS t(field1 text, field2 text);
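Applied to the query from the question, a rough sketch could look like this, run while connected to "Rockefeller_SecurityDb"; the connection parameters and the column types in the AS clauses are assumptions that have to match the real tables:

SELECT dblink_connect('appdb', 'host=localhost port=5432 dbname=Rockefeller_ApplicationDb user=postgres password=????');

-- sketch only: the application-db tables are pulled in through dblink
SELECT dep.ndept_id
FROM users u
INNER JOIN permissions p ON u.nuser_id = p.nuser_id
INNER JOIN dblink('appdb', 'SELECT nuser_id FROM public.userrolemapping')
           AS urm(nuser_id integer) ON u.nuser_id = urm.nuser_id
INNER JOIN dblink('appdb', 'SELECT ninst_id, ndept_id FROM public.department')
           AS dep(ninst_id integer, ndept_id integer)
           ON p.ndept_inst = dep.ninst_id AND dep.ninst_id = 60
WHERE u.nuser_id = 3;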
I am new to PostgreSQL and I had the same requirement. FOREIGN DATA WRAPPER did the job.
IMPORT FOREIGN SCHEMA — import table definitions from a foreign server
But first I had to:
enable the fdw extension
define the foreign server (which was localhost in this case!)
create a mapping between the local user and the foreign user.
CREATE EXTENSION postgres_fdw;
CREATE SERVER localsrv
FOREIGN DATA WRAPPER postgres_fdw
OPTIONS (host 'localhost', dbname 'otherdb', port '5432');
CREATE USER MAPPING FOR <local_user>
SERVER localsrv
OPTIONS (user 'otherdb_user', password 'otherdb_user_password');
IMPORT FOREIGN SCHEMA public
FROM SERVER localsrv
INTO public;
After that I could use the foreign tables as if they were local. I did not notice any performance cost.
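For the scenario in the question (with the foreign server pointing at "Rockefeller_ApplicationDb" and its public schema imported into the security database's public schema), the query can then drop the database qualification entirely; a sketch:

-- assumes userrolemapping and department were imported as foreign tables into public
SELECT dep.ndept_id
FROM users u
INNER JOIN permissions p ON u.nuser_id = p.nuser_id
INNER JOIN userrolemapping urm ON u.nuser_id = urm.nuser_id
INNER JOIN department dep ON p.ndept_inst = dep.ninst_id AND dep.ninst_id = 60
WHERE u.nuser_id = 3;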
In my case, I changed my query from:
SELECT * FROM myDB.public.person
to this:
SELECT * FROM "myDB".public.cats
and it worked.
If I follow https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.PostgreSQL.CommonDBATasks.html#postgresql-commondbatasks-fdw, how can I prefix the imported tables with the name of the schema I am retrieving them from? E.g.
IMPORT FOREIGN SCHEMA lands
LIMIT TO (land, land2)
FROM SERVER foreign_server INTO public;
The created tables are named land and land2. Is it possible to prefix land and land2 with 'lands', e.g. 'lands_land' and 'lands_land2'?
With psql and recent PostgreSQL versions, you could simply run (after the IMPORT FOREIGN SCHEMA):
SELECT format(
'ALTER FOREIGN TABLE public.%I RENAME TO %I;',
relname,
'lands_' || relname
)
FROM pg_class
WHERE relkind = 'f' -- foreign table
AND relnamespace = 'public'::regnamespace \gexec
The \gexec will interpret each result row as an SQL statement and execute it.
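For example, with the two tables from the question imported, the query generates and \gexec runs statements like these:

ALTER FOREIGN TABLE public.land RENAME TO lands_land;
ALTER FOREIGN TABLE public.land2 RENAME TO lands_land2;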
Another option that I'd like better is to keep the original names, but use a different schema for the foreign tables:
IMPORT FOREIGN SCHEMA lands
LIMIT TO (land, land2)
FROM SERVER foreign_server INTO lands;
Then all foreign tables will be in a schema lands, and you have the same effect in a more natural fashion. You can adjust search_path to include the lands schema.
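A minimal sketch of that variant (the target schema has to exist before the import):

CREATE SCHEMA lands;

IMPORT FOREIGN SCHEMA lands
    LIMIT TO (land, land2)
    FROM SERVER foreign_server INTO lands;

SET search_path = public, lands;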
I'm working on a Postgres database project that was not documented at all, and one of the major issues is accounting for dependencies on foreign data wrappers. I am able to query for all foreign data wrappers in Postgres, but I don't know how to associate them with the views that use them.
I did a schema dump using pg_dump and tried a Ctrl-F search for where the wrappers were used, but there are too many of them. Is there a tool that can use the schema dump and make sense of it, or is there another way to get this dependency information?
I think this should do it:
SELECT DISTINCT
       pg_rewrite.ev_class::regclass AS view,          -- the dependent view
       pg_class.oid::regclass        AS foreign_table  -- the foreign table it references
FROM pg_depend
JOIN pg_rewrite ON pg_rewrite.oid = pg_depend.objid
JOIN pg_class   ON pg_class.oid   = pg_depend.refobjid
WHERE pg_class.relkind = 'f';   -- 'f' = foreign table
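If you also want to see which foreign server and wrapper each of those foreign tables belongs to, a sketch extending the same idea through pg_foreign_table and pg_foreign_server:

SELECT DISTINCT
       r.ev_class::regclass AS view,
       c.oid::regclass      AS foreign_table,
       s.srvname            AS server,
       w.fdwname            AS wrapper
FROM pg_depend d
JOIN pg_rewrite r              ON r.oid = d.objid
JOIN pg_class c                ON c.oid = d.refobjid
JOIN pg_foreign_table ft       ON ft.ftrelid = c.oid
JOIN pg_foreign_server s       ON s.oid = ft.ftserver
JOIN pg_foreign_data_wrapper w ON w.oid = s.srvfdw
WHERE c.relkind = 'f';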
I have many tables in different databases and want to bring them all into one database.
It seems like I have to create a foreign table in the target database (where I want to merge them all) with the schema of each table.
I am sure there is a way to automate this (by the way, I am going to use the psql command line), but I do not know where to start.
What I have found so far is that I can use
select * from information_schema.columns
where table_schema = 'public' and table_name = 'mytable'
I added a more detailed explanation.
I want to copy tables from other databases.
The tables have the same column names and data types.
Using postgres_fdw, I would need to define the field names and data types for each table (the table names are also the same).
Then, I want to union the tables that have the same name into one single table.
For that, I am going to add a prefix to each table name.
For instance, mytable in db1, mytable in db2, mytable in db3 become
db1_mytable, db2_mytable, db3_mytable in my local database.
Thanks to Albe's comment, I managed it, and now I need to figure out how to do the 4th step using a psql command.
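A sketch of how the whole thing could be scripted, assuming one foreign server per source database (names like db1_srv are placeholders) and one local schema per source; the final view is the union from the 4th step:

CREATE SCHEMA db1;
CREATE SCHEMA db2;
CREATE SCHEMA db3;

IMPORT FOREIGN SCHEMA public LIMIT TO (mytable) FROM SERVER db1_srv INTO db1;
IMPORT FOREIGN SCHEMA public LIMIT TO (mytable) FROM SERVER db2_srv INTO db2;
IMPORT FOREIGN SCHEMA public LIMIT TO (mytable) FROM SERVER db3_srv INTO db3;

-- optional: the prefixed names described above (the schemas alone already disambiguate)
ALTER FOREIGN TABLE db1.mytable RENAME TO db1_mytable;
ALTER FOREIGN TABLE db2.mytable RENAME TO db2_mytable;
ALTER FOREIGN TABLE db3.mytable RENAME TO db3_mytable;

-- the 4th step: one single table over all sources
CREATE VIEW public.mytable_all AS
SELECT * FROM db1.db1_mytable
UNION ALL
SELECT * FROM db2.db2_mytable
UNION ALL
SELECT * FROM db3.db3_mytable;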
I have one database with table t1 on a local server and another database with tables t2 and t3 on a remote server. I would like to create a function in the local database that can insert data on the remote server using local data.
Example:
local table - t1 (xid, newxid)
remote table - t2 (id, xid, iname)
remote table to be populated:
t3 (t2.id, t1.newxid, t2.iname)
criteria: t1.xid = t2.xid
I know about dblink, but I am not sure how to use it specifically for the above requirements.
Note: I know how to do local insert with remote select.
Any help would be appreciated.
You have two options:
dblink
writable Foreign Data Wrappers
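With a writable foreign data wrapper (postgres_fdw), the insert becomes a plain SQL statement once t2 and t3 are mapped as foreign tables in the local database. A sketch, with server name, credentials, and column types assumed from the question:

CREATE EXTENSION postgres_fdw;

CREATE SERVER remote_srv FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'remotehost', dbname 'remotedb', port '5432');

CREATE USER MAPPING FOR CURRENT_USER SERVER remote_srv
    OPTIONS (user 'remote_user', password 'remote_password');

-- foreign tables pointing at the remote t2 and t3
CREATE FOREIGN TABLE t2 (id integer, xid integer, iname text)
    SERVER remote_srv OPTIONS (schema_name 'public', table_name 't2');
CREATE FOREIGN TABLE t3 (id integer, xid integer, iname text)
    SERVER remote_srv OPTIONS (schema_name 'public', table_name 't3');

-- populate the remote t3 from local t1 joined with remote t2 (criteria: t1.xid = t2.xid)
INSERT INTO t3 (id, xid, iname)
SELECT t2.id, t1.newxid, t2.iname
FROM t1
JOIN t2 ON t1.xid = t2.xid;

This statement can then be wrapped in a local function if needed.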
I am new to PostgreSQL. I have 2 databases in PostgreSQL 9.0, db1 and db2, and to db2 I have read-only access. I want to create a stored function that would otherwise be easily accomplished with a JOIN or a nested query, something PostgreSQL can't do across databases.
In db1, I have table1, where I can query for a set of foreign keys that I can use to search for records in table2 in db2, something like:
SELECT * FROM db2.table2 WHERE db2.table2.primary_key IN (
    SELECT db1.table1.foreign_key FROM db1.table1
    WHERE db1.table1.primary_key = 'whatever');
What is the best practice for doing this in Postgres? I can't use temporary tables in db2, and passing the foreign keys as a parameter to a stored function running in db2 doesn't seem like a good solution.
Note: the keys are all VARCHAR(11)
You'll want to look into the dblink contrib module.
As an aside, if you're familiar with C, there is also a neat feature called foreign data wrappers. It lets you manipulate pretty much any data source using plain SQL. Example with Twitter:
SELECT from_user, created_at, text FROM twitter WHERE q = '#postgresql';
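A rough dblink sketch for the pattern in the question, assuming dblink is installed (on 9.0 it ships as a contrib module); the connection parameters and the declared column list are placeholders that have to match the real db2.table2:

SELECT t2.*
FROM dblink('dbname=db2 host=localhost user=readonly_user password=secret',
            'SELECT primary_key FROM table2')
     AS t2(primary_key varchar(11))   -- add further columns here as needed
WHERE t2.primary_key IN (
    SELECT t1.foreign_key
    FROM table1 t1
    WHERE t1.primary_key = 'whatever'
);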