PostgreSQL PL/PerlU trigger issue

I'm trying to create a PostgreSQL trigger on Linux, written in Perl, that executes code from external libraries. The SQL script containing the trigger looks like this:
CREATE OR REPLACE FUNCTION notify_mytable_update() RETURNS trigger AS $$
use lib "full_path_to_lib_dir";
use MyModule;
return;
$$ LANGUAGE plperlu
SECURITY DEFINER
SET search_path = myschema, public, pg_temp;
DROP TRIGGER IF EXISTS notify_mytable_update ON mytable;
CREATE TRIGGER notify_mytable_update AFTER UPDATE ON mytable
FOR EACH ROW
EXECUTE PROCEDURE notify_mytable_update();
The issue with this is that whenever I try to run this script with psql, I get a "permission denied" error in the Perl code when accessing MyModule. Giving the postgres user full access to my home directory didn't help.
Thank you in advance!

Don't forget that to have access to a file, you not only need permissions to the file and the directory where it resides, but also to all directories in the path.
So if your module is /home/george/MyModule.pm, you need access to / and /home in addition to /home/george and the file itself.
You'll have to give these permissions to the operating system user running the PostgreSQL server process, commonly postgres.
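For instance, a minimal shell sketch of granting those permissions (the /tmp/permdemo layout stands in for /home/george; the paths are made-up examples):

```shell
# Recreate a nested path like /home/george/perllib/MyModule.pm under /tmp.
mkdir -p /tmp/permdemo/home/george/perllib
touch /tmp/permdemo/home/george/perllib/MyModule.pm

# "Other" users (which is how the postgres OS user will typically be
# classified) need the execute (search) bit on EVERY directory in the
# path, and read permission on the module file itself.
chmod o+x /tmp/permdemo /tmp/permdemo/home /tmp/permdemo/home/george
chmod o+rx /tmp/permdemo/home/george/perllib
chmod o+r  /tmp/permdemo/home/george/perllib/MyModule.pm

# Show the resulting modes for the top directory and the file.
ls -ld /tmp/permdemo /tmp/permdemo/home/george/perllib/MyModule.pm
```

With real paths, replace /tmp/permdemo/home/george with your actual home directory and verify each component with `ls -ld`.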

Related

postgresql copy from csv file into table in windows through procedure

Below is my function, which is executed to upload a file into a table, do joins, etc.
CREATE OR REPLACE FUNCTION sp_product_price()
RETURNS void
LANGUAGE 'plpgsql'
COST 100
AS $BODY$
BEGIN
truncate table prd_product_data;
truncate table price_import;
COPY price_import FROM 'C:\Users\Ram\Documents\prices.csv' CSV HEADER;
truncate table active_product_price;
insert into active_product_price(key,name,price)
SELECT prd.key,prd.name,prd.price FROM prd_product_data prd JOIN price_import import ON prd.name = import.name;
raise notice 'success';
END
$BODY$;
The above function gives the error: could not open file "C:\Users\Ram\Documents\prices.csv" for reading: No such file or directory. HINT: COPY FROM instructs the PostgreSQL server process to read a file. You may want a client-side facility such as psql's \copy.
I have given access to the file for everyone in the file properties.
I tried \copy in the function, but that gives the error: syntax error at or near "\".
\copy works when I run it as a command in psql, but not in the function above.
Is there a way to import the file into the table in the above function?
The procedure and the COPY statement are running on the database server, so the file C:\Users\Ram\Documents\prices.csv must be on the database server as well (and your database user must either be a superuser or a member of pg_read_server_files).
The only way you can use COPY to load data from the client is COPY ... FROM STDIN, and you won't be able to use that in a procedure.
\copy is a psql command, not an SQL statement, so it cannot be used in a procedure either.
My suggestion is to use \copy, but to forget about the procedure.
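For example, a minimal client-side sketch (the connection options are placeholders, and on Windows you would run this from a command prompt where psql is on the PATH):

```shell
# \copy runs inside psql, which reads the local file and streams it to the
# server as COPY ... FROM STDIN. Host, user and database are placeholders.
psql -h dbserver -U myuser -d mydb \
     -c "\copy price_import FROM 'C:/Users/Ram/Documents/prices.csv' CSV HEADER"
```

If you keep the rest of the function (the truncates and the join insert), remove the COPY statement from it and truncate price_import before running the \copy, not inside the function afterwards.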

Creating Batch Files with PostgreSQL \copy Command in Jetbrains Datagrip

I'm familiarizing myself with the standalone version of Datagrip and having a bit of trouble understanding the different approaches to composing SQL via console, external files, scratch files, etc.
I'm managing by referencing the documentation, and I'm happy to figure things out that way.
However, I'm trying to ingest CSV data into tables via batch files using the Postgres \copy command. Datagrip will execute this command without error but no data is being populated.
This is my syntax, composed and ran in the console view:
\copy tablename from 'C:\Users\username\data_file.txt' WITH DELIMITER E'\t' csv;
Note that the data is tab-separated and stored in a .txt file.
I'm able to use the import functions of Datagrip (via context menu) just fine but I'd like to understand how to issue commands to do similarly.
\copy is a command of the command-line PostgreSQL client psql.
I doubt that Datagrip invokes psql, so it won't be able to use \copy or any other “backslash command”.
You probably have to use Datagrip's import facilities. Or you start using psql.
OK, but what about the SQL COPY command (https://www.postgresql.org/docs/12/sql-copy.html)?
How can I run something like that with DataGrip?
BEGIN;
-- "values" is a reserved word in PostgreSQL and must be quoted as a column name
CREATE TEMPORARY TABLE temp_json("values" text) ON COMMIT DROP;
COPY temp_json FROM 'MY_FILE.JSON';
SELECT "values"->>'aJsonField' AS f
FROM (SELECT "values"::json AS "values" FROM temp_json) AS a;
COMMIT;
I tried replacing 'MY_FILE.JSON' with the full path, with a parameter (?), putting it in the sql directory, etc.
The DataGrip answer is:
[2021-05-05 10:30:45] [58P01] ERROR: could not open file '...' for reading : No such file or directory
EDIT :
I know why. RTFM! -_-
COPY with a file name instructs the PostgreSQL server to directly read from or write to a file. The file must be accessible by the PostgreSQL user (the user ID the server runs as) and the name must be specified from the viewpoint of the server.
Sorry.....

How to import a CSV file into PostgreSQL using Mac

I am trying to import data from a CSV file into PostgreSQL using pgAdmin. However, I am getting an error message when attempting to perform a COPY command.
ERROR: could not open file "/Users/brandoncruz/Desktop/Test File.csv" for reading: Permission denied
HINT: COPY FROM instructs the PostgreSQL server process to read a file. You may want a client-side facility such as psql's \copy.
SQL state: 42501
Below is the code I have attempted to use for the import.
CREATE TABLE roles(
role_id serial PRIMARY KEY,
role_name VARCHAR (255)
);
COPY roles FROM '/Users/brandoncruz/Desktop/Test File.csv' DELIMITER ',' CSV HEADER;
It looks like a permissions problem: the postgres user cannot read your file. After you change the permissions on 'Test File.csv' (and on the directories leading to it), the copy should succeed.
You might need to check following path if you are using pgAdmin:
pgAdmin > File > Preferences
Paths > Binary Paths
For the PostgreSQL binary path, find the location of your PostgreSQL installation and save that,
e.g. C:\Program Files\PostgreSQL\12\bin on Windows.
On a Mac, check under Applications; the exact location varies from Mac to Mac.

Using triggers with different users (MySQL error 1449)

Question: How can I export the database contents (including triggers) from my dev system and import the data to the live server without running into error 1449 when a trigger gets triggered?
In a recent php project I am making extensive use of mysql triggers but ran into a problem when deploying my database from dev to live system.
E.g. one of my triggers is defined as follows (output generated by using mysqldump)
DELIMITER ;;
/*!50003 CREATE*/ /*!50017 DEFINER=`root`@`localhost`*/ /*!50003 TRIGGER update_template BEFORE UPDATE ON template
FOR EACH ROW BEGIN
SET new.mod_date := now();
END */;;
DELIMITER ;
That trigger was defined on my dev system using the user root@localhost, which creates the DEFINER=root@localhost clause in the above statement.
root@localhost does not exist as a user on the live server, which causes the following error whenever the trigger gets triggered (e.g. by update templates set ...) by the live system's user:
1449: The user specified as a definer ('root'@'localhost') does not exist
Currently I use mysqldump --add-drop-table --user=root -p my_project > export.sql for export and mysql -u devuser -p my_project < export.sql for importing data.
Export/import works flawless. The error occurs only in cases when I manipulate the data via sql and a trigger gets involved.
Edit:
MySQL version is 5.5.47 (live and dev)
Once you've exported to export.sql, next you'll need to sanitize your trigger lines using regex via sed, like this:
sed -i -- 's/^..!50003\sCREATE.....!50017\sDEFINER=.root...[^]*.....!50003\s\([^;]*\)/CREATE DEFINER=CURRENT_USER \1/g;s/^\s*\([^\*]*;\{0,1\}\)\s\{0,1\}\*\/;;$/\1;;/g' export.sql
This works on Linux, but if you're on Mac OS X, the built-in sed command won't work. You'll need to first brew install gnu-sed, then use gsed instead of sed in the above command.
In my case, the trigger that caused the problem didn't have the BEGIN and END statements. So I applied the corresponding DROP TRIGGER and CREATE TRIGGER; after that I made a new backup, which later restored without problems, i.e.:
DROP TRIGGER `incorrect_trg1`;
DELIMITER ;;
CREATE DEFINER = `root`@`localhost` TRIGGER `incorrect_trg1` BEFORE INSERT ON `table1` FOR EACH ROW
BEGIN
SET NEW.col = DATE_FORMAT(NEW.col,'%Y%m');
END;;
DELIMITER ;
Use the following sed command to remove the DEFINER part from the dump file and then import it to your live server.
sed 's/\sDEFINER=`[^`]*`@`[^`]*`//' -i dumpfile.sql
The triggers will then be created by the user importing the dump file by default.
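A quick demonstration of the DEFINER-stripping sed on a hypothetical one-line dump excerpt (note that mysqldump writes the definer as `user`@`host`, with an @):

```shell
# Write a fake dump line containing a DEFINER clause.
cat > /tmp/dumpdemo.sql <<'EOF'
/*!50017 DEFINER=`root`@`localhost`*/ /*!50003 TRIGGER update_template BEFORE UPDATE ON template
EOF

# Remove the DEFINER clause in place; the importing user becomes the definer.
sed -i 's/\sDEFINER=`[^`]*`@`[^`]*`//' /tmp/dumpdemo.sql

# The trigger definition survives, the DEFINER clause is gone.
cat /tmp/dumpdemo.sql
```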

MySQL to PostgreSQL query conversion

Please help me create a PostgreSQL query equivalent to this MySQL query:
LOAD DATA LOCAL INFILE 'file.txt' REPLACE INTO TABLE newtable FIELDS TERMINATED BY ',' IGNORE 1 LINES;
There is no equivalent feature in PostgreSQL - at least in the current 9.3 or any prior version.
You must do this in a few steps:
CREATE TEMPORARY TABLE ...
COPY into the temp table
Do an UPDATE ... FROM followed by an INSERT INTO ... WHERE NOT EXISTS (...) to merge data
DROP the temp table
Search for "postgresql bulk upsert" or "postgresql copy upsert".
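The steps above can be sketched roughly as follows, run through psql so that \copy can read the client-side file. The table name newtable comes from the question; its key column "id" and data column "val" are hypothetical stand-ins, and CSV HEADER approximates IGNORE 1 LINES while the UPDATE-then-INSERT approximates REPLACE:

```shell
psql -d mydb <<'SQL'
BEGIN;
-- Stage the file into a temp table with the same shape as the target.
CREATE TEMPORARY TABLE newtable_stage (LIKE newtable) ON COMMIT DROP;
\copy newtable_stage FROM 'file.txt' CSV HEADER
-- Merge: update existing rows, then insert the rows that are new.
UPDATE newtable n SET val = s.val FROM newtable_stage s WHERE n.id = s.id;
INSERT INTO newtable
SELECT s.* FROM newtable_stage s
WHERE NOT EXISTS (SELECT 1 FROM newtable n WHERE n.id = s.id);
COMMIT;
SQL
```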
You might be looking for COPY.
COPY will be run by the PostgreSQL backend (user "postgres"). The backend user requires permission to read and write the data file in order to copy from/to it, and you need to use an absolute pathname with COPY. \copy, on the other hand, runs as the current $USER and with that user's environment, and it can handle relative pathnames. The psql \copy is accordingly much easier to use if it does what you need.