using psql, can I override variables set in .psqlrc from the command line?

I would like to have a .psqlrc file with default values, and be able to override these values from psql's command line.
For example:
have some values set in .psqlrc:
-- .psqlrc :
-- "user#database # " in bold green
\set PROMPT1 '%[%033[1;32;40m%]%n#%/%[%033[0m%]% > '
-- store command history in home directory, with a per database file :
\set HISTFILE ~/.psql-history- :DBNAME
in some wrapper script master-psql.sh, which connects as user postgres, be able to override these values:
# master-psql.sh :
# when using this script, change color to red, change history file location :
psql -U postgres \
-v PROMPT1='%[%033[1;31;40m%]%n#%/%[%033[0m%]% # ' \
-v HISTFILE='/some/other/place/.psql_history_postgres'
The above does not work, because the -v ... arguments are processed before the .psqlrc file is loaded, and the instructions in .psqlrc overwrite the existing values.
Question
Is there a way to instruct psql to run a set of commands after loading its .psqlrc file(s),
or to have .psqlrc execute some \set or \pset command only if the value is not already set?

You could write the instructions into the .psqlrc file itself, so that it does not overwrite those variables if they are already set:
\if :{?HISTFILE}
\else
\set HISTFILE ~/.psql-history- :DBNAME
\endif
If you can't get your system psqlrc to cooperate with you, then you might need to copy and modify it and then bypass the original. You need at least v11 for the :{? construct to work.
The problem is that PROMPT1 has a compiled-in default even in the absence of RC file processing, so you might need to test it against the compiled-in string rather than test whether it is defined. I think that would end up with something like this:
select :'PROMPT1'='%/%R%x%# ' as default_prompt \gset
\if :default_prompt
\set PROMPT1 '%[%033[1;32;40m%]%n#%/%[%033[0m%]% > '
\endif
Note that the compiled-in default changed in v13, so if you want to work with older versions as well, you would need to do something more complicated.
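For instance, a sketch that accepts either default could compare against both strings (assuming the pre-v13 compiled-in prompt was '%/%R%# '; verify against the psql documentation for your version):
select :'PROMPT1' in ('%/%R%x%# ', '%/%R%# ') as default_prompt \gset
\if :default_prompt
\set PROMPT1 '%[%033[1;32;40m%]%n#%/%[%033[0m%]% > '
\endif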

From here:
https://www.postgresql.org/docs/current/app-psql.html
Environment
[...]
PSQLRC
Alternative location of the user's .psqlrc file. Tilde (~) expansion is performed.
So create an alternate .psqlrc file and set the PSQLRC environment variable in the script to override your default.
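For example, the wrapper from the question could become something like this (a sketch; ~/.psqlrc-postgres is a hypothetical alternate file holding the red prompt and the other overrides):
# master-psql.sh :
# use an alternate startup file instead of ~/.psqlrc
export PSQLRC=~/.psqlrc-postgres
psql -U postgres "$@"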

Related

how to pass a variable to the copy command in PostgreSQL

I tried to use a variable in a SQL statement in PostgreSQL, but it did not work.
There are many CSV files stored under a path. I want to set a path in PostgreSQL that tells the COPY command where to find the CSV files.
SQL statement sample:
\set outpath '/home/clients/ats-dev/'
\COPY licenses (_id, name,number_seats ) FROM :outpath + 'licenses.csv' CSV HEADER DELIMITER ',';
\COPY uploaded_files (_id, added_date ) FROM :outpath + 'files.csv' CSV HEADER DELIMITER ',';
It did not work. I got the error: no such file. The two files licenses.csv and files.csv are stored under /home/clients/ats-dev on Ubuntu. I found a solution that uses "\set file 'license.csv'". It did not work for me because I have many CSV files. I also tried to use "from : outpath || 'licenses.csv'". It did not work either. I'd appreciate any help.
Using 9.3.
It looks like psql does not support :variable substitution within psql backslash commands.
test=> \set somevar fred
test=> \copy z from :somevar
:somevar: No such file or directory
So you will need to do this via an external tool like the Unix shell, e.g.:
for f in *.csv; do
  psql -c "\\copy $(basename "$f" .csv) FROM '$f' CSV HEADER"
done
You can try the server-side COPY command, embedding the opening quote in the variable so that the interpolated path ends up quoted:
\set outpath '\'/home/clients/ats-dev/'
COPY licenses (_id, name,number_seats ) FROM :outpath/licenses.csv' WITH CSV HEADER DELIMITER ',';
COPY uploaded_files (_id, added_date ) FROM :outpath/files.csv' WITH CSV HEADER DELIMITER ',';
Note: Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. Similarly, the command specified with PROGRAM is executed directly by the server, not by the client application, must be executable by the PostgreSQL user. COPY naming a file or command is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
Documentation: PostgreSQL 9.3 COPY
It may have been true when this was originally asked that psql backslash commands didn't support variable interpolation, but in my PostgreSQL 14 instance that's no longer the case. However, the psql manpage is clear that \copy specifically does not support variable interpolation.
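A quick illustrative session (the variable and file names are just placeholders) shows interpolation working in \echo while the same reference in \copy is taken literally:
test=> \set somefile /tmp/data.csv
test=> \echo :somefile
/tmp/data.csv
test=> \copy z from :somefile
:somefile: No such file or directory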

How to disallow psql command history duplicates

How to disallow psql command history duplicates?
Is it possible to make psql delete all duplicate history commands when new commands are entered?
You can try setting it at the psql prompt:
\set HISTCONTROL ignoredups
You can also set it in a file called .psqlrc in the user's home directory.
An example from my .psqlrc file:
\set HISTCONTROL ignoredups
\set COMP_KEYWORD_CASE upper
Set the internal variable HISTCONTROL.
This is from the PostgreSQL 9.4 manual:
If this variable is set to ignorespace, lines which begin with a space are not entered into the history list. If set to a value of ignoredups, lines matching the previous history line are not entered. A value of ignoreboth combines the two options. If unset, or if set to any other value than those above, all lines read in interactive mode are saved on the history list.
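So if you want to skip both space-prefixed lines and duplicates, the combined setting is:
\set HISTCONTROL ignoreboth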

psql: Init files?

In the psql documentation, I read about variables (section Advanced Features); e.g., one of these variables is:
HISTSIZE
The number of commands to store in the command history. The default value is 500.
Is there a file in the home directory or somewhere else where I can configure these variables?
What syntax would I use in that file?
If you look at the Files section, you'll see this:
Files
Unless it is passed an -X or -c option, psql attempts to read and execute commands from the system-wide psqlrc file and the user's ~/.psqlrc file before starting up. (On Windows, the user's startup file is named %APPDATA%\postgresql\psqlrc.conf.) See PREFIX/share/psqlrc.sample for information on setting up the system-wide file. It could be used to set up the client or the server to taste (using the \set and SET commands).
The location of the user's ~/.psqlrc file can also be set explicitly via the PSQLRC environment setting.
So, like most Unix commands, there is an RC ("run commands") file that you can use for configuration; the name also follows the usual Unix convention of ~/.cmdrc, so you want ~/.psqlrc.
The format matches the \set commands you'd use within psql itself:
\set HISTSIZE 11
for example.
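Putting it together, a minimal ~/.psqlrc might look like this (the values are only illustrative); you can confirm a setting from within psql with \echo :HISTSIZE:
-- ~/.psqlrc :
\set HISTSIZE 2000
\set HISTCONTROL ignoredups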

Passing null string value via environment variable to TSQL script

I have a DOS batch file I want to use to invoke a TSQL program.
I want to pass the names of the databases to use. This seems to work.
I want to pass the PREFIXES for the names of the tables I want to work with.
So for test tables I want to pass the name of a prefix to use the test table.
set svr=myserver
rem set db=myTESTdatabasename
set db=mydatabasename
rem set tp=TEST
set tp=
sqlcmd -S %svr% -d somename -i test01.sql
test01.sql looks like this:
use $(db)
go
select top 10 * into $(db).dbo.$(tp)dsttbl from $(db).dbo.$(tp)srctbl
It works fine for the test stuff, but for the real stuff, I just want to set the value of tp to null so that it will use the real table name and not the bogus table name.
The reason I'm doing this is because I don't know the names of everything that will be used on the actual databases. I'm trying to make it generic so I don't have to do a bunch of search replaces on what will be a very large sql program (the real sql program is already hundreds of lines).
In the test case, this would resolve to
select top 10 * into myTESTdatabasename.dbo.TESTdsttbl from myTESTdatabasename.dbo.TESTsrctbl
For the production runs, it should resolve to
select top 10 * into mydatabasename.dbo.dsttbl from mydatabasename.dbo.srctbl
The problem seems to be that it doesn't like empty values for $(tp), or perhaps that the variable is undefined.
I experimented a bit with the syntax, and as Preet Sangha pointed out, you should use the -v command line option.
The reason is that setting a variable to the empty string in a batch script undefines it.
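You can see that behavior in isolation at a cmd.exe prompt (purely illustrative):
rem an empty assignment removes the variable entirely
set tp=
if defined tp (echo tp is defined) else (echo tp is undefined)
rem prints: tp is undefined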
If you want to set the database name in the top of the batch file you can still use set, like this:
set db_to_use=
Then you can use this (undefined) variable in the sqlcmd call using the -v option:
sqlcmd -S %svr% -d somename -v db="%db_to_use%" -i test01.sql
...or you can just set the value directly in the sqlcmd line:
sqlcmd -S %svr% -d somename -v db="" -i test01.sql

PostgreSQL - batch + script + variable

I am not a programmer, and I am struggling a bit with this.
I have a batch file that connects to my PostgreSQL server and then runs a SQL script. Everything works as expected. My question is how to pass a variable (if possible) from one to the other.
Here is my batch file:
set PGPASSWORD=xxxx
cls
@echo off
C:\Progra~1\PostgreSQL\8.3\bin\psql -d Total -h localhost -p 5432 -U postgres -f C:\TotalProteinImport.sql
And here's the script:
copy totalprotein from 'c:/TP.csv' DELIMITERS ',' CSV HEADER;
update anagrafica
set pt=(select totalprotein.resultvalue from totalprotein where totalprotein.accessionnbr=anagrafica.id)
where data_analisi = '12/23/2011';
delete from totalprotein;
This is working great; now the question is how I could pass a variable that would carry the date for data_analisi.
Something like "Please enter date" in the batch file, with the value then passed to the SQL script.
You could create a function out of your SQL script like this:
CREATE OR REPLACE FUNCTION f_myfunc(date)
RETURNS void AS
$BODY$
CREATE TEMP TABLE t_tmp ON COMMIT DROP AS
SELECT * FROM totalprotein LIMIT 0; -- copy table-structure from table
COPY t_tmp FROM 'c:/TP.csv' DELIMITERS ',' CSV HEADER;
UPDATE anagrafica a
SET pt = t.resultvalue
FROM t_tmp t
WHERE a.data_analisi = $1
AND t.accessionnbr = a.id;
-- Temp table is dropped automatically at end of session
-- In this case (ON COMMIT DROP) after the transaction
$BODY$
LANGUAGE sql;
You can use language SQL for this kind of simple SQL batch.
As you can see I have made a couple of modifications to your script that should make it faster, cleaner and safer.
Major points
For reading data into an empty table temporarily, use a temporary table. Saves a lot of disc writes and is much faster.
To simplify the process I use your existing table totalprotein as template for the creation of the (empty) temp table.
If you want to delete all rows of a table use TRUNCATE instead of DELETE FROM. Much faster. In this particular case, you need neither. The temporary table is dropped automatically. See comments in function.
The way you updated anagrafica.pt, the column would be set to NULL if anything went wrong in the process (date not found, wrong date, id not found ...). The way I rewrote the UPDATE, the column is only updated if matching data is found. I assume that is what you actually want.
Then ask for user input in your shell script and call the function with the date as parameter. That's how it could work in a Linux shell (as user postgres, with password-less access using the IDENT method in pg_hba.conf):
#! /bin/sh
# Ask for date. 'YYYY-MM-DD' = ISO date-format, valid with any postgres locale.
echo -n "Enter date in the form YYYY-MM-DD and press [ENTER]: "
read date
# check validity of $date ...
psql db -p5432 -c "SELECT f_myfunc('$date')"
-c makes psql execute a single SQL command and then exit. I wrote a lot more on psql and its command line options yesterday in a somewhat related answer.
Creating the corresponding Windows batch file remains as an exercise for you.
Call under Windows
The error message tells you:
Function tpimport(unknown) does not exist
Note the lower case letters: tpimport. I suspect you used mixed case letters to create the function. So now you have to enclose the function name in double quotes every time you use it.
Try this one (edited quotes!):
C:\Progra~1\PostgreSQL\8.3\bin\psql -d Total -h localhost -p 5432 -U postgres
-c "SELECT ""TPImport""('%dateimport%')"
Note how I use single and double quotes here. I guess this could work under Windows.
You made it hard for yourself when you chose to use mixed case identifiers in PostgreSQL - a folly which I never tire of warning against. Now you have to double quote the function name "TPImport" every time you use it. While perfectly legit, I would never do that. I use lower case letters for identifiers. Always. This way I never mix up lower / upper case and I never have to use double quotes.
The ultimate fix would be to recreate the function with a lower case name (just leave away the double quotes and it will be folded to lower case automatically). Then the function name will just work without any quoting.
Read the basics about identifiers here.
Also, consider upgrading to a more recent version of PostgreSQL; 8.3 is a bit rusty by now.
psql supports textual replacement variables. Within psql they can be set using \set and referenced as :varname.
\set xyz 'abcdef'
select :'xyz';
?column?
----------
abcdef
These variables can be set using command line arguments also:
psql -v xyz=value
The only problem is that these textual replacements always need some fiddling with quoting, as shown by the \set and SELECT above.
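For reference, in reasonably recent psql versions there are three interpolation forms; a small illustration (the table and values are hypothetical):
\set tbl my_table
\set val abcdef
select :'val';        -- interpolated as the quoted literal 'abcdef'
select * from :"tbl"; -- interpolated as the quoted identifier "my_table"
select ':val';        -- no interpolation inside quoted SQL literals: returns the text :val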
After creating the function in Postgres, you must create a .bat file in the bin directory of your Postgres version, for example C:\Program Files\PostgreSQL\9.3\bin. Here you write:
@echo off
cd C:\Program Files\PostgreSQL\9.3\bin
psql -p 5432 -h localhost -d myDataBase -U postgres -c "select * from myFunction()"