I have a somewhat detailed query in a script that uses ? placeholders. I wanted to test the same query directly from the psql command line (outside the script). I want to avoid going in and replacing all the ? with actual values; instead, I'd like to pass the arguments after the query.
Example:
SELECT *
FROM foobar
WHERE foo = ?
AND bar = ?
OR baz = ? ;
Looking for something like:
%> {select * from foobar where foo=? and bar=? or baz=? , 'foo','bar','baz' };
You can use the -v option, e.g.:
$ psql -v v1=12 -v v2="'Hello World'" -v v3="'2010-11-12'"
and then refer to the variables in SQL as :v1, :v2, etc.:
select * from table_1 where id = :v1;
Note how string/date values are passed wrapped in two layers of quotes: " '...' ". But this way of interpolation is prone to SQL injection, because you are the one responsible for the quoting. E.g. need to include a single quote? You have to double it yourself: -v v2="'don''t do this'".
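For instance, a minimal end-to-end sketch of the doubled-quote case (it assumes a connection to your default database and feeds the query on standard input, the same way as in the example below):
$ echo "select :v2 as v;" | psql -v v2="'don''t do this'"
This should print don't do this as the single result value.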
A better/safer way is to let PostgreSQL handle it:
$ psql -c 'create table t (a int, b varchar, c date)'
$ echo "insert into t (a, b, c) values (:'v1', :'v2', :'v3')" \
| psql -v v1=1 -v v2="don't do this" -v v3=2022-01-01
It turns out that in PostgreSQL you can PREPARE statements, just as you can in a scripting language. Unfortunately, you still can't use ?, but you can use the $n notation.
Using the above example:
PREPARE foo(text,text,text) AS
SELECT *
FROM foobar
WHERE foo = $1
AND bar = $2
OR baz = $3 ;
EXECUTE foo('foo','bar','baz');
DEALLOCATE foo;
In psql there is a mechanism via the
\set name val
command, which is supposed to be tied to the -v name=val command-line option. Quoting is painful, though; in most cases it is easier to put the whole body of the query inside a shell here-document.
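For example, a rough sketch of the here-document approach, using the table and columns from the question (mydb is a placeholder database name; the shell substitutes the values before psql ever sees the query, so the same injection caveat applies):
foo=abc bar=def baz=ghi
psql mydb <<EOF
SELECT *
FROM foobar
WHERE foo = '$foo'
  AND bar = '$bar'
   OR baz = '$baz';
EOF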
Edit
Oops, I should have said -v instead of -P (which is for formatting options); the previous reply got it right.
You can also pass in the parameters at the psql command line, or from a batch file. The first statements gather the details needed to connect to your database.
The final prompt asks for the constraint values, which will be used in the WHERE column IN() clause. Remember to single-quote string values and separate them with commas:
@echo off
echo "Test for Passing Params to PGSQL"
SET server=localhost
SET /P server="Server [%server%]: "
SET database=amedatamodel
SET /P database="Database [%database%]: "
SET port=5432
SET /P port="Port [%port%]: "
SET username=postgres
SET /P username="Username [%username%]: "
SET /P constraints="Enter multiple constraint values for IN clause [%constraints%]: "
ECHO you typed %constraints%
PAUSE
REM pause
"C:\Program Files\PostgreSQL\9.0\bin\psql.exe" -h %server% -U %username% -d %database% -p %port% -e -v v1=%constraints% -f test.sql
Now, in your SQL file, add the :v1 token within your WHERE clause, or anywhere else in the SQL. Note that the tokens can also be used in a SQL statement given directly on the command line, not just in a file. Save this as test.sql:
SELECT * FROM myTable
WHERE NOT someColumn IN (:v1);
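For example, if you were to enter 'A','B','C' at the constraints prompt (made-up values), psql would effectively run:
SELECT * FROM myTable
WHERE NOT someColumn IN ('A','B','C');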
In Windows, save the whole file as a DOS BATch file (.bat), save the test.sql in the same directory, and launch the batch file.
Thanks to Dave Page, of EnterpriseDB, for the original prompted script.
I would like to offer another answer inspired by @malcook's comment (using bash).
This option may work for you if you need to use shell variables within your query when using the -c flag. Specifically, I wanted to get the row count of a table whose name was in a shell variable (which you can't pass directly when using -c).
Assume you have your shell variable
TABLE_NAME='users'
Then you can get the results of that by using
psql -q -A -t -d databasename -c <<< echo "select count(*) from $TABLE_NAME;"
(the -q -A -t is just to print out the resulting number without additional formatting)
I will note that the echo in the here-string (the <<< operator) may not be necessary; I originally thought the quotes by themselves would be fine. Maybe someone can clarify the reason for this.
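As a usage sketch, you can also capture the result in a shell variable; this variant feeds the query on standard input instead of via -c, which sidesteps the echo question entirely (databasename is still a placeholder):
row_count=$(psql -q -A -t -d databasename <<< "select count(*) from $TABLE_NAME;")
echo "$TABLE_NAME has $row_count rows"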
It would appear that what you ask can't be done directly from the command line. You'll either have to use a user-defined function in plpgsql or call the query from a scripting language (and the latter approach makes it a bit easier to avoid SQL injection).
I've ended up using an improved version of @vol7ron's answer:
DO $$
BEGIN
IF NOT EXISTS(SELECT 1 FROM pg_prepared_statements WHERE name = 'foo') THEN
PREPARE foo(text,text,text) AS
SELECT *
FROM foobar
WHERE foo = $1
AND bar = $2
OR baz = $3;
END IF;
END$$;
EXECUTE foo('foo','bar','baz');
This way you can always execute it in this order (the query is prepared only if it has not been prepared yet), repeat the execution, and get the result from the last query.
Related
I'm trying to export some data from a Firebird database to a CSV file with FBExport.
The problem is that I get two different errors. I spent a few hours trying different combinations:
Unknown switch -
Switches must begin with -
The command I tried:
fbexport -Sc -Q -F h:\AABBCC\export.csv -B -D h:\AABBCC\XXYYZZ.FDB -U "MMNNOO" -P "PPQQRR" -X "select PATIENTS.IPP, PATIENTS.NOM, PATIENTS.NOM_MARITAL, PATIENTS.NOM_USUEL, PATIENTS.PRENOMS, dmc.NUM_DOSSIER_PAPIER, dmc.NUM_DOSSIER_PAPIER_2, dmc.NUM_DOSSIER_PAPIER_3 from PATIENTS join dmc on PATIENTS.ipp = dmc.CODE where dmc.NUM_DOSSIER_PAPIER is not null or dmc.NUM_DOSSIER_PAPIER_2 is not null or dmc.NUM_DOSSIER_PAPIER_3 is not null;"
I absolutely don't understand what FBExport needs.
How can I export the data?
For me, the following command line works:
fbexport -Sc -D employee -U sysdba -P masterkey -F C:\Temp\export.csv -Q "select * from employee"
The problem in your original command was the bare -Q, which caused the following -F to be interpreted as the argument of -Q; that in turn led to h:\AABBCC\export.csv being interpreted as an option, which produced an error because it doesn't start with a -.
In addition, your command also had the following problems:
-B defines an alternative separator character for the produced CSV. It expects a separator character, or TAB or \t for a tab. So, in a similar vein to the previous problem, this would cause -D to be interpreted as the argument of -B, which then leads to h:\AABBCC\XXYYZZ.FDB being interpreted as an option (without a -).
-X is a primary option (like -S): it executes the query specified by -Q instead of exporting (saving, -S). It doesn't accept a query text as an argument, so the query text is also interpreted as an option (without a -). This occurrence of -X should have been -Q.
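Putting those corrections together, the original command would presumably look something like this (same paths, credentials, and query as in the question, with the trailing semicolon dropped to match the working example above, and -B omitted so the default separator is used):
fbexport -Sc -D h:\AABBCC\XXYYZZ.FDB -U "MMNNOO" -P "PPQQRR" -F h:\AABBCC\export.csv -Q "select PATIENTS.IPP, PATIENTS.NOM, PATIENTS.NOM_MARITAL, PATIENTS.NOM_USUEL, PATIENTS.PRENOMS, dmc.NUM_DOSSIER_PAPIER, dmc.NUM_DOSSIER_PAPIER_2, dmc.NUM_DOSSIER_PAPIER_3 from PATIENTS join dmc on PATIENTS.ipp = dmc.CODE where dmc.NUM_DOSSIER_PAPIER is not null or dmc.NUM_DOSSIER_PAPIER_2 is not null or dmc.NUM_DOSSIER_PAPIER_3 is not null"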
I am attempting to automate the following series of commands (which work correctly when run interactively) in a Bash script:
kubectl exec -it mongo-pod -- bash
mongo DBNAME
db.auth("theUser", "thePw")
db.theCollection.find()
The script I am using is as follows:
#!/bin/bash
kubectl exec -it mongo-pod -- bash -c "mongo DBNAME && /
db.auth("theUser", "thePw") && /
db.theCollection.find()"
I have tried the following:
Executing multiple commands (or from a shell script) in a Kubernetes pod
but any commands added after the first using & or && are not executed. For example, just using "mongo DBNAME" correctly opens the prompt and sets it to the correct db, but adding any other command with && causes all commands to fail with the following:
bash: -c line 0: syntax error near unexpected token 'theUser'
All of the comments are spot on, but there are at least two things I'm highly confident I have the answer to:
First, you have the line continuation character wrong; it should be \ and not /. It actually wouldn't even be required if you switched bash into "exit on error" mode, with
kubectl exec -it mongo-pod -- bash -ec "mongo DBNAME
echo 'this command only runs if mongo exits a-ok'
exit 1
and this never will run
"
However, the other mistake is around the quoting characters used: if you have bash -c " then you must either use single quotes for the interior string literals or escape them with \". You can actually see what I'm talking about by looking at the syntax highlighting of the shell snippet in your question. Observe that the string literal is red, but the text theUser as well as thePw are both black -- that's because they are outside the string literal, since the string stopped at the first " it encountered: the one in db.auth("
It is almost always the case that you'll want to use single quotes when invoking bash remotely like that, for several reasons, but the most relevant here is that you can then use db.auth("something") without having to escape the double quotes.
Since mongo (like many interpreters, such as node and python) wants you to either type into it interactively, provide the input on its standard input, or give it a local file containing commands, you will want to change the invocation to one of those strategies depending on your needs.
A very convenient way of redirecting standard input without having to use echo or printf and its associated quoting hell is to use what are called "here documents" (abbreviated "heredocs") in bash:
kubectl exec -it mongo-pod -- bash -ec 'mongo DBNAME<<"FOO"
db.auth("theUser", "thePw")
printjson(db.theCollection.find())
FOO
'
That causes bash to transmit almost all characters between the two "heredoc delimiters" to the standard input of the command. If you quote the delimiter, as I have with the [arbitrary] word FOO, then the contents are not subject to variable expansion, command interpolation, etc., which can be one more mechanism for avoiding backtick and dollar-sign weirdness.
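A quick local illustration of that delimiter-quoting difference, independent of kubectl and mongo:
NAME=world
# Unquoted delimiter: the shell expands $NAME before cat sees the text
cat <<EOF
hello $NAME
EOF
# Quoted delimiter: the text is passed through literally, $NAME included
cat <<"EOF"
hello $NAME
EOF
The first prints hello world, the second prints hello $NAME verbatim.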
I'm using PostgreSQL on Windows 7 through the command line. I want to import the content of different CSV files into a newly created table.
After executing the command below, the prompt changed from:
database=#
to:
database*#
The command I executed:
type directory/*.csv | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER';
What does *# mean?
Thanks
This answer is for Linux and as such doesn't answer OP's question for Windows. I'll leave it up anyway for anyone that comes across this in the future.
You accidentally started a block comment with your type directory/*.csv. type doesn't do what you think it does. From the bash built-ins:
With no options, indicate how each name would be interpreted if used as a command name.
Try doing cat instead:
cat directory/*.csv | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER';
If this gives you issues because each CSV has its own header, you can also do:
for file in directory/*.csv; do cat "$file" | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER'; done
Type Command
The type built-in command in Bash shows how a given name would be interpreted if used as a command. For example, using it with ssh:
$ type ssh
ssh is /usr/bin/ssh
This indicates how ssh would be interpreted when you run it as a command in the current Bash environment. This is useful for things like aliases. As an example, ll is usually an alias for ls -l. Here's what my Bash environment had for ll:
$ type ll
ll is aliased to `ls -l --color=auto'
For you, when you pipe the result of this command to psql, it encounters the /* in the input and assumes it's a block comment, which is what the database*# prompt means (the * indicates it's waiting for the comment close pattern, */).
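You can reproduce the same prompt change in a plain interactive session; a minimal illustration:
database=# /* this starts a block comment
database*# still inside the comment, hence the * in the prompt
database*# */ select 1;
Nothing runs until the comment is closed on the last line, at which point select 1 executes normally.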
Cat Command
cat is for concatenating multiple files together. By default, it writes to standard out, so cat directory/*.csv will write each CSV file to standard out one after another. However, piping this means that each CSV's header will also be piped mid-stream of the copy. This may not be desirable, so:
For Loop
We can use for to loop over each file and individually import it. The version I have above, for file in directory/*.csv, will properly handle files with spaces. Properly formatted:
for file in directory/*.csv; do
cat "$file" | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER'
done
References
PostgreSQL 10 Comments Documentation (postgresql.org)
type built-in Manual page (mankier.com)
cat Manual page (mankier.com)
Bash looping tutorial (tldp.org)
I'm trying to use the copy command to copy the content of a file into a database.
One of the lines have this:
CCc1ccc(cc1)C(=O)/N=c\1/n(ccs1)C
and when I insert this line into the database normally, there are no errors.
But when I try to use the following command, this line is not inserted correctly.
cat smile_test.txt | psql -c "copy testzincsmile(smile) from stdout" teste
This is what I get (it is wrong):
CCc1ccc(cc1)C(=O)/N=c/n(ccs1)C
What's wrong here?
Thank you :)
copy expects a specific input format and cannot just be used to read random text from a file into a field.
See the manual.
The specific issue you're hitting is probably a backslash being interpreted as an escape by the default copy in/out format.
I figured out how to do this.
This is my answer:
cat smile_test.txt | sed '1d; s/\\/\\\\/g' | psql -c "copy testzincsmile(smile) from stdout" teste
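For reference, the 1d in the sed expression drops the first line of the file (presumably a header), and s/\\/\\\\/g doubles every backslash so that COPY's text format decodes it back to a single backslash. A quick check with the sample line from the question:
$ echo 'CCc1ccc(cc1)C(=O)/N=c\1/n(ccs1)C' | sed 's/\\/\\\\/g'
CCc1ccc(cc1)C(=O)/N=c\\1/n(ccs1)C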
I want to dump a select query to a tab-delimited text file using psql -F. However, this doesn't work:
psql -Umyuser mydb -F '\t' --no-align -c "select * from mytable" -o /tmp/dumpfile.txt
That makes the delimiter a literal \t. How do I get it to use real tabs instead?
I think you just need to use a literal tab. How this works depends on your shell. Have you seen this post?
In the bash shell you can do this with $'\t'.
Using the example in your question:
psql -Umyuser mydb -AF $'\t' --no-align -c "select * from mytable" -o /tmp/dumpfile.txt
From man bash:
Words of the form $'string' are treated specially. The word expands to string, with backslash-escaped characters replaced as specified by the ANSI C standard. [...] The expanded result is single-quoted, as if the dollar sign had not been present.
In Unix, you can also type
ctrl-V tab
ctrl-V tells the terminal not to interpret the next key.
This also works with carriage returns (^M) and many other special keys, like the arrow keys.
In case somebody is looking for how to do it in the interactive psql shell:
\f '\t'
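For completeness, a sketch of the full interactive sequence, reusing the table and output file from the earlier example (\a switches to unaligned output, \f sets the field separator, \o redirects query results to a file, and a bare \o sends output back to the terminal):
\a
\f '\t'
\o /tmp/dumpfile.txt
SELECT * FROM mytable;
\o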