Copy data between two tables in PostgreSQL using dblink

I am using PostgreSQL 9.1. I need to transfer selected columns from a table in one database into a table in another database (not just another schema).
I found that a dblink.sql file has to be present in share/contrib, but my contrib folder is empty. Where can I download dblink.sql so that I can execute my query?
When I execute the query now, it shows an error message:
cross database reference is not possible ...
Can anyone tell me how to transfer data between two databases?

After you have installed the package on your system as detailed in the related question, install the extension dblink in your database (the one you are running this code in; the foreign db does not need it):
CREATE EXTENSION dblink;
You can find code examples in the manual.
Here is a simple version of what I use to copy data between dbs:
First, create a FOREIGN SERVER:
CREATE SERVER mydb
   FOREIGN DATA WRAPPER postgresql
   OPTIONS (hostaddr '111.111.111.111', port '5432', dbname 'mydb');
FOREIGN DATA WRAPPER postgresql was pre-installed in my case.
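The function below connects through a USER MAPPING (referenced in the comment). A minimal sketch of what that could look like; the role name and password are placeholders, and a superuser can leave the password out and keep it in .pgpass instead:
CREATE USER MAPPING FOR postgres
   SERVER mydb
   OPTIONS (user 'postgres', password 'secret');  -- placeholder credentials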
Then create a function that opens a connection, removes old data (optional), fetches new data, runs ANALYZE and closes the connection:
CREATE OR REPLACE FUNCTION f_tbl_sync()
  RETURNS text AS
$BODY$
SELECT dblink_connect('mydb');  -- USER MAPPING for postgres, PW in .pgpass
TRUNCATE tbl;                   -- optional
INSERT INTO tbl
SELECT *
FROM   dblink(
          'SELECT tbl_id, x, y
           FROM   tbl
           ORDER  BY tbl_id')
       AS b(
          tbl_id int
        , x int
        , y int);               -- terminate the INSERT before ANALYZE
ANALYZE tbl;
SELECT dblink_disconnect();
$BODY$
LANGUAGE sql VOLATILE;
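Call it like any other function to run the whole sync in one go:
SELECT f_tbl_sync();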

Related

Trigger execution for foreign table in postgresql

I have two databases, db1 and db2. db2 has a foreign table, say tableA, whose schema has been imported from db1. A trigger runs on tableA in db1 for every row update.
Now, using postgres_fdw, I can fetch the records from db2, but I am unable to update any record on tableA because of that trigger function. The update works fine if I disable the trigger. I need the trigger for an audit log.
Please suggest a suitable way to resolve the issue. I am using Postgres 9.6.
Make sure the user establishing the link has access to the audit tables.
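For example (a sketch; the audit table and role names are placeholders for your setup, the xyz schema is the one referenced below):
GRANT USAGE  ON SCHEMA xyz TO app_user;       -- app_user: the user mapped for the FDW connection
GRANT INSERT ON xyz.audit_log TO app_user;    -- audit_log: the audit table written by the trigger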
You could also add the required schema to the trigger function's search_path:
CREATE OR REPLACE FUNCTION abc.mytrigger()
  RETURNS trigger AS
$BODY$
BEGIN
   [...] -- do something in the xyz schema
   RETURN NEW;
END;
$BODY$
LANGUAGE plpgsql
SET search_path = xyz;

Foreign table inserts don't use the remote sequence

I have a set of applications accessing two different PostgreSQL 9.6 DBs on the same server. Due to some application limitations, one application accesses a handful of tables via FDW in one DB to the other.
Something like this:
DB1.fdw_table_a -> DB2.table_a
fdw_table_a is only used for inserts of log data. This table has an id column, which is a bigint sequence. The sequence exists in DB1 (on the foreign table) and in DB2 (the "real" table). This works as it should and all is well.
Now there's a need to have another application (again with limited access capabilities) perform inserts into the "real" table, DB2.table_a. In testing, I can see some inconsistencies in the id column, but no obvious issues have appeared.
I can see in the customer-facing environments that the DB1 FDW sequence is used as expected, but when inserts start directly on the DB2 'real' table, DB2's own sequence will start at 1 (as it has never been used).
Are there other things we should be considering in this environment?
Are there some issues that could arise from overlap in these two sequences inserting into the table?
The sequence only gets used if you omit the id column in the INSERT statement. But postgres_fdw never omits a column, as an EXPLAIN of an insert through the foreign table will show.
One way to solve the problem is to use a foreign table that does not contain the id column. Then any insert into that foreign table will use the sequence to populate that column.
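A minimal sketch of that approach (the server name db2_server and the payload columns x and y are assumptions; the point is that the id column of table_a is simply left out of the foreign table definition):
-- In DB1: a foreign table that deliberately omits the id column.
CREATE FOREIGN TABLE fdw_table_a_noid (
   x int,
   y int
) SERVER db2_server
  OPTIONS (schema_name 'public', table_name 'table_a');

-- id is never sent, so DB2's column DEFAULT (its sequence) fills it in.
INSERT INTO fdw_table_a_noid (x, y) VALUES (1, 2);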
Another option is a "global sequence" pulled from the remote database through the FDW. The following recipe from 2014 is still valid today:
=# CREATE SEQUENCE seq;
CREATE SEQUENCE
=# CREATE VIEW seq_view AS SELECT nextval('seq') as a;
CREATE VIEW
=# CREATE EXTENSION postgres_fdw;
CREATE EXTENSION
=# CREATE SERVER postgres_server
-# FOREIGN DATA WRAPPER postgres_fdw
-# OPTIONS (host 'localhost', port '5433', dbname 'postgres');
CREATE SERVER
=# CREATE USER MAPPING FOR PUBLIC SERVER postgres_server OPTIONS (password '');
CREATE USER MAPPING
=# CREATE FOREIGN TABLE foreign_seq_table (a bigint)
-# SERVER postgres_server OPTIONS (table_name 'seq_view');
CREATE FOREIGN TABLE
=# CREATE FUNCTION foreign_seq_nextval() RETURNS bigint AS
-# 'SELECT a FROM foreign_seq_table;' LANGUAGE SQL;
CREATE FUNCTION
=# CREATE TABLE tab (a int DEFAULT foreign_seq_nextval());
CREATE TABLE
=# INSERT INTO tab VALUES (DEFAULT), (DEFAULT), (DEFAULT);
INSERT 0 3
=# SELECT * FROM tab;
a
----
9
10
11
(3 rows)
https://paquier.xyz/postgresql-2/global-sequences-with-postgres_fdw-and-postgres-core/

temp tables in postgresql

I'm coming from a background in SQL Server, where I would create temp tables using:
select id
into #test
from table A
I've just moved into a PostgreSQL environment and I was hoping I could do the same, but I'm getting a syntax error. I did a search and it seems like you have to use a CREATE TABLE statement.
Is it not possible to easily create temp tables in Postgres?
Postgres supports SELECT INTO, so this should work fine:
SELECT id
INTO TEMP TABLE test
FROM a
You can also use CREATE TABLE AS:
CREATE TEMP TABLE test AS
SELECT id FROM a
This version is generally preferred, as the CREATE statement provides additional options, and can also be used in PL/pgSQL functions (where the SELECT INTO syntax has been hijacked for variable assignment).
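For instance, inside a PL/pgSQL function the two forms do very different things; a small illustration with made-up names:
CREATE OR REPLACE FUNCTION demo() RETURNS void AS $$
DECLARE
   v_id int;
BEGIN
   SELECT id INTO v_id FROM a LIMIT 1;  -- assigns to the variable v_id
   CREATE TEMP TABLE test AS            -- creates an actual temp table
   SELECT id FROM a;
END;
$$ LANGUAGE plpgsql;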

Selecting column name from other database table through function in PostgreSQL

I need to select a column name from a table in another database, using a function (stored procedure) in PostgreSQL.
I have a SQL Server query as shown below.
Example:
create procedure sp_testing
as
if not exists ( select ssn from testdb..testtable) /*ssn is the column-name of testtable which exists in testdb database */
...
Can I do the same in PostgreSQL?
Your question is not very clear, but if you want to know whether a column by a certain name exists in a table by a certain name in a remote PostgreSQL database, then you should first set up a foreign data wrapper, which is a multi-stage process. Then, to test the existence of a certain column in a table, you need to formulate a query that conforms to the standards of the particular DBMS you are connecting to. Use the remote information_schema.columns view for optimal compatibility (here exposed locally as remote_tables, which you must have defined with a prior CREATE FOREIGN TABLE command):
CREATE FUNCTION sp_testing() RETURNS void AS $$
BEGIN
   PERFORM *
   FROM   remote_tables
   WHERE  table_name = 'testtable'
   AND    column_name = 'ssn';
   IF NOT FOUND THEN
      ...
   END IF;
END;
$$ LANGUAGE plpgsql;
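For completeness, remote_tables could be declared along these lines (a sketch, assuming the remote database is also PostgreSQL and a postgres_fdw server named remote_server already exists):
CREATE FOREIGN TABLE remote_tables (
   table_name  text,
   column_name text
) SERVER remote_server
  OPTIONS (schema_name 'information_schema', table_name 'columns');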
If you want to connect to another type of DBMS, you need to write a custom function in, for instance, C or Perl, and then call that from within a PostgreSQL function on your local machine. The test on the column is then best done inside that function, which should therefore take connection parameters, table name and column name as parameters, and return a boolean with the result.
Before you start testing this, make sure that you read the documentation on connecting to remote servers; learning PL/pgSQL first would also be a nice gesture to demonstrate your own efforts before you ask for help.

Is there any shortcut for using dblink in Postgres?

In Postgres, you can link to your other databases using dblink, but the syntax is very verbose. For example, you can do:
SELECT *
FROM dblink (
'dbname=name port=1234 host=host user=user password=password',
'select * from table'
) AS users([insert each column name and its type here]);
Is there any way to do this faster? Maybe pre-define the connections?
I noticed that Postgres has a new CREATE FOREIGN TABLE feature for connecting to a MySQL database. It has a simpler syntax than dblink. Could I use that?
In PostgreSQL 8.4 and later, you can use the foreign data wrapper functionality to define the connection parameters. Then you'd simply refer to the foreign server name in your dblink commands. See the example in the documentation.
In PostgreSQL 9.1 and later, you can use the foreign data wrapper functionality to define foreign tables and thus access remote databases transparently, without using dblink at all. For that, you'd need the postgresql_fdw wrapper, which isn't included in any production release yet, but you can find experimental code on the internet. In practice, this route is more of a future option.
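A sketch of the pre-defined connection approach (server name, host and credentials are placeholders; dblink_fdw is the wrapper newer versions of the dblink extension provide, while older releases use a wrapper named postgresql as above):
CREATE SERVER fdtest
   FOREIGN DATA WRAPPER dblink_fdw
   OPTIONS (host 'host', port '1234', dbname 'name');

CREATE USER MAPPING FOR CURRENT_USER
   SERVER fdtest
   OPTIONS (user 'user', password 'password');

-- The long connection string shrinks to the server name:
SELECT *
FROM   dblink('fdtest', 'select * from some_table')
       AS users(id int, name text);   -- the column definition list is still required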
You can wrap the connection parameters in a FOREIGN SERVER object as Peter explains, but you'd still have to spell out the rest.
You can encapsulate everything in a view or function, so you only type it once. Example with a function; run as superuser:
CREATE OR REPLACE FUNCTION f_lnk_tbl()
  RETURNS TABLE(tbl_id int, col1 text, log_ts timestamp) AS
$BODY$
SELECT *
FROM   dblink(
          'SELECT tbl_id, col1, log_ts
           FROM   tbl
           ORDER  BY tbl_id'::text) AS b(
          tbl_id int
        , col1 text
        , log_ts timestamp);
$BODY$ LANGUAGE sql STABLE SECURITY DEFINER;
REVOKE ALL ON FUNCTION f_lnk_tbl() FROM public;
CREATE OR REPLACE FUNCTION f_sync()
RETURNS text AS
$BODY$
SELECT dblink_connect('hostaddr=123.123.123.123 port=5432 dbname=mydb
user=postgres password=*secret*');
INSERT INTO my_local_tbl SELECT * FROM f_lnk_tbl();
-- more tables?
SELECT dblink_disconnect();
$BODY$
LANGUAGE sql VOLATILE SECURITY DEFINER;
REVOKE ALL ON FUNCTION f_sync() FROM public;
-- GRANT ....;
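After granting EXECUTE to the role that should run the sync, calling it is a one-liner for that role:
SELECT f_sync();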