I am trying to build a simple PowerShell script that outputs a query to a CSV file. However, it keeps returning the following error:
psql: fe_sendauth: no password supplied
I tried exposing the password, since this will be a safe environment, but I have seen suggestions to use a pgpass file with no real explanation of how to implement it. My script:
Set-Location 'E:\Program Files\PostgreSQL\9.1\bin\';
SET 'PGPASSWORD = myPwd';
.\psql -U postgres -w MyDatabase
copy {'SELECT * FROM table';} TO 'C:\Users\e\Desktop\test1.csv' CSV DELIMITER ',';
SET is an alias for Set-Variable, but PowerShell variables are not environment variables. To set an environment variable, you need to use the $env: scope. Try:
$env:PGPASSWORD = 'myPwd';
See also here for more on environment variables.
Also, I don't think you can get away with putting raw input on the command line like that in PowerShell. I think it will treat things as separate commands, but I could be wrong.
You may also want to use the command switch (-c) and the PowerShell stop parsing symbol (--%) when you call psql to prevent PowerShell from parsing your command string:
.\psql --% -U postgres -w MyDatabase -c "copy {'SELECT * FROM table';} TO 'C:\Users\e\Desktop\test1.csv' CSV DELIMITER ',';"
Or set the commands to a variable with here-strings and pipe that to psql:
$query = @'
copy {'SELECT * FROM table';} TO 'C:\Users\e\Desktop\test1.csv' CSV DELIMITER ',';
'@
$query | .\psql -U postgres -w MyDatabase
Or about a dozen other ways to call an executable.
Related
I am trying to find a way to pass a file to psql while using '\copy'. There is a similar question posted here: use sql file in "\copy" psql command line. However, the accepted solution doesn't actually pass a file to psql; it passes the contents of the file, so it fails when the contents of the SQL file exceed the maximum allowed length of a psql command.
e.g. something like this psql -c data_base "\copy \file_path <file.sql> To './test.csv' With CSV"
Using copy rather than \copy, you can do it like this:
(echo "copy ("; cat file.sql; echo ") to STDOUT with CSV") | psql -X > ./test.csv
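A here-document variant of the same construction, in case you find it more readable (a sketch assuming file.sql contains a single SELECT with no trailing semicolon):
psql -X > ./test.csv <<SQL
copy (
$(cat file.sql)
) to STDOUT with CSV
SQL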
I am a newbie to writing shell scripts. Please help me in parameterizing a variable value in my shell script.
I am taking command-line arguments for database name, server, user, and password in the following way:
database_name=$1
server=$2
user=$3
password=$4
I want to understand how I can pass these values to a variable called sqlcmd. I pass them in the following way and then echo to see the value of the sqlcmd variable:
sqlcmd=sqlcmd -S $server -U $user -P $password
echo $sqlcmd
After making the shell script executable using chmod a+x on Ubuntu, I run the script and get the following error:
line 37: -S: command not found
Line 37 in my shell script is the line on which the sqlcmd variable is initialized.
P.S. I am using WSL on a remote Windows machine. I am not sure whether that could be causing the error.
You need quotes:
sqlcmd="sqlcmd -S $server -U $user -P $password"
Note that you may run into difficulties later trying to execute the contents of sqlcmd.
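If you do intend to execute it later, a safer pattern is to store the command in an array rather than a flat string; a sketch assuming the same four positional parameters (and sqlcmd's -Q switch, which runs a query and exits):
# store the command and its arguments as separate array elements
sqlcmd=(sqlcmd -S "$server" -U "$user" -P "$password")
# expand every element as its own word when running it
"${sqlcmd[@]}" -Q 'SELECT 1'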
I am writing a batch job for Postgres for the first time. I have written a ".sh" file containing a command that produces no output in the log or on the console.
Code
export PGPASSWORD=<password>
psql -h <host> -p <port> -U <user> -d <database> --file cleardata.sql > log\cleardata.log 2>&1
What I did at the command line:
su postgres
and then ran ./cleardatasetup.sh
Nothing is happening.
Please note: when I run the psql command directly at the Unix command line, I get some SQL exception, which is expected.
Can anyone please help me in this regard?
You probably wanted to create log/cleardata.log but you have a backslash where you need a slash. You will find that the result is a file named log\cleardata.log instead.
The backslash is just a regular character in the file's name, but it's special to the shell, so you'll need to quote or escape it to (unambiguously) manipulate it from the shell:
ls -l log\\cleardata.log # escaped
mv 'log\cleardata.log' log/cleardata.log # quoted
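The fix in the script itself is simply to use a forward slash, and to make sure the log directory exists first, e.g.:
mkdir -p log
psql -h <host> -p <port> -U <user> -d <database> --file cleardata.sql > log/cleardata.log 2>&1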
In PostgreSQL 9.3.1, when interactively developing a query with the psql command, I sometimes end up wanting to write the query results to a file:
boron.production=> \o /tmp/output
boron.production=> select 1;
boron.production=> \o
boron.production=> \q
$ cat /tmp/output
?column?
----------
1
(1 row)
This works fine. But how can I get the query itself to be written to the file along with the query results?
I've tried giving psql the --echo-queries switch:
-e, --echo-queries
Copy all SQL commands sent to the server to standard output as well.
This is equivalent to setting the variable ECHO to queries.
But this always echoes to stdout, not to the file I gave with the \o command.
I've tried the --echo-all switch as well, but it does not appear to echo interactive input.
Using command editing, I can repeat the query with \qecho in front of it. That works, but is tedious.
Is there any way to direct an interactive psql session to write both the query and the query output to a file?
You can try redirecting stdout to a file directly from your shell (Windows or Linux should work):
psql -U postgres -c "select 1 as result" -e nomedb >> hello.txt
This has the drawback of not letting you see the output interactively. If that's a problem, you can either tail the output file in a separate terminal or, if on *nix, use the tee utility:
psql -U postgres -c "select 1 as result" -e nomedb | tee hello.txt
Hope this helps!
Luca
I know this is an old question, but at least in 9.3 and current versions this is possible using the query buffer meta-commands shown in the documentation (or via \? from the psql console): https://www.postgresql.org/docs/9.3/static/app-psql.html
\w or \write filename
\w or \write |command
Outputs the current query buffer to the file filename or pipes it to the shell command command.
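For example, typing the query without a terminating semicolon leaves it in the buffer, so you can save it before sending it with \g (a hypothetical session; /tmp/query.sql is an illustrative path):
boron.production=> select 1
boron.production-> \w /tmp/query.sql
boron.production-> \g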
Please try this format; it's how I got query output into a CSV file:
psql -h $host -p $port -q -U $user -d $Dbname -c "SELECT \"Employee-id\",\"Employee-name\" FROM Employee_table" >> Employee_Date.csv
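Note that plain redirection like this captures psql's aligned table output rather than true CSV. On PostgreSQL 12 and later, psql's --csv switch produces proper CSV (quoted fields plus a header row); a sketch using the same hypothetical query:
# --csv requires psql 12 or newer
psql -h $host -p $port -U $user -d $Dbname --csv -c "SELECT \"Employee-id\",\"Employee-name\" FROM Employee_table" > Employee_Date.csv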
I run scripts against my database like this...
psql -d myDataBase -a -f myInsertFile.sql
The only problem is I want to be able to specify in this command what schema to run the script against. I could call set search_path='my_schema_01' but the files are supposed to be portable. How can I do this?
You can create one file that contains the set schema ... statement and then include the actual file you want to run:
Create a file run_insert.sql:
set schema 'my_schema_01';
\i myInsertFile.sql
Then call this using:
psql -d myDataBase -a -f run_insert.sql
A more universal way is to set search_path (this should work in PostgreSQL 7.x and above):
SET search_path TO myschema;
Note that set schema 'myschema' is an alias for the above command that is not available in 8.x.
See also: http://www.postgresql.org/docs/9.3/static/ddl-schemas.html
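On psql 9.6 or newer, which accepts multiple -c/-f switches and executes them in order in a single session, you can even skip the wrapper file; a sketch:
# the SET persists for the -f file because both run in the same session (psql 9.6+)
psql -d myDataBase -a -c "SET search_path TO my_schema_01" -f myInsertFile.sql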
Main Example
The example below will run myfile.sql on database mydatabase using schema myschema.
psql "dbname=mydatabase options=--search_path=myschema" -a -f myfile.sql
The way this works is that the first argument to the psql command is the dbname argument, and the docs mention that a connection string can be provided:
If this parameter contains an = sign or starts with a valid URI prefix
(postgresql:// or postgres://), it is treated as a conninfo string
The dbname keyword specifies the database to connect to and the options keyword lets you specify command-line options to send to the server at connection startup. Those options are detailed in the server configuration chapter. The option we are using to select the schema is search_path.
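A quick way to confirm the option took effect (same hypothetical database and schema names as above):
# SHOW search_path echoes the active schema search path back
psql "dbname=mydatabase options=--search_path=myschema" -c 'SHOW search_path;'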
Another Example
The example below will connect to host myhost on database mydatabase using schema myschema. The = special character must be URL-escaped with the escape sequence %3D.
psql postgres://myuser@myhost?options=--search_path%3Dmyschema
The PGOPTIONS environment variable may be used to achieve this in a flexible way.
In a Unix shell:
PGOPTIONS="--search_path=my_schema_01" psql -d myDataBase -a -f myInsertFile.sql
If there are several invocations in the script, or sub-shells that need the same options, it's simpler to set PGOPTIONS only once and export it:
PGOPTIONS="--search_path=my_schema_01"
export PGOPTIONS
psql -d somebase
psql -d someotherbase
...
or invoke the top-level shell script with PGOPTIONS set from the outside:
PGOPTIONS="--search_path=my_schema_01" ./my-upgrade-script.sh
In a Windows CMD environment, set PGOPTIONS=value should work the same way.
I'm using something like this and it works very well: :-)
(echo "set schema 'acme';" ; \
cat ~/git/soluvas-framework/schedule/src/main/resources/org/soluvas/schedule/tables_postgres.sql) \
| psql -Upostgres -hlocalhost quikdo_app_dev
Note: Linux/Mac/Bash only, though probably there's a way to do that in Windows/PowerShell too.
This works for me:
psql postgresql://myuser:password@myhost/my_db -f myInsertFile.sql
In my case, I wanted to add the schema to a file dynamically, so that whatever schema name the user provides on the CLI, the SQL file runs with that schema name.
For this, I replaced some text in the SQL file. First I added {{schema}} in the file like this:
CREATE OR REPLACE FUNCTION {{schema}}.usp_dailygaintablereportdata(
then replaced {{schema}} dynamically with the user-provided schema name with the help of the sed command:
sed -i "s/{{schema}}/$pgSchemaName/" $filename
result=$(psql -U $user -h $host -p $port -d $dbName -f "$filename" 2>&1)
sed -i "s/$pgSchemaName/{{schema}}/" $filename
The first replacement is made, then the target file is run, and then the replacement is reverted.
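A variation that avoids mutating the original file at all; a sketch assuming the same variables ($pgSchemaName, $filename, and the connection parameters) are already set:
# substitute into a temporary copy so the original file never changes
tmpfile=$(mktemp)
sed "s/{{schema}}/$pgSchemaName/g" "$filename" > "$tmpfile"
psql -U "$user" -h "$host" -p "$port" -d "$dbName" -f "$tmpfile"
rm -f "$tmpfile"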
I was facing similar problems trying to do a data import into an intermediate schema (which we later move to the final one). As we rely on things like extensions (for example PostGIS), the "run_insert" SQL file did not fully solve the problem.
After a while, we found that, at least with Postgres 9.3, the solution is far easier... just create your SQL script always specifying the schema when referring to the table:
CREATE TABLE "my_schema"."my_table" (...);
COPY "my_schema"."my_table" (...) FROM stdin;
This way, using psql -f xxxxx works perfectly, and you don't need to change search_path or use intermediate files (and you won't hit extension schema problems).