psql command problem with // (double-slash) - postgresql

/COPY MondayLotto FROM 'https://thelottoproject.blob.core.windows.net/data/MondayLotto.csv' DELIMITER ',' CSV HEADER
The command returns these error messages
(In Azure Cloud Shell terminal)
https:/thelottoproject.blob.core.windows.net/data/MondayLotto.csv: No such file or directory
(In SQL Shell (psql) on Windows10)
https:/thelottoproject.blob.core.windows.net/data/MondayLotto.csv: Invalid argument
I guess the // caused the error, because the error message shows only a single / after https:
The PostgreSQL server is in Azure.
The CSV file in Azure blob storage is accessible.
Is there any solution for this problem?

psql meta-commands start with \, so the syntax should be
\copy MondayLotto FROM ...
There will probably be a second issue too. I don't know how the table MondayLotto was created. This looks like a case-sensitive identifier, and if it really is case sensitive, it has to be enclosed in double quotes, like "MondayLotto".
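Putting both corrections together, a minimal sketch (assuming the CSV has already been downloaded into the directory psql runs from, since \copy reads from the client filesystem and cannot fetch a URL, and assuming the table really was created with a quoted mixed-case name):
\copy "MondayLotto" FROM 'MondayLotto.csv' DELIMITER ',' CSV HEADER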

To load data from a web site, you could run something like
wget -O - https://thelottoproject.blob.core.windows.net/data/MondayLotto.csv | psql -c "COPY mondaylotto FROM STDIN (FORMAT 'csv', HEADER)"
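If psql is 9.3 or later, another option worth trying is to let \copy run the download itself via its program source, which is executed on the client machine (the wget flags here are just an assumption about what is installed in the client shell):
\copy mondaylotto FROM PROGRAM 'wget -q -O - https://thelottoproject.blob.core.windows.net/data/MondayLotto.csv' WITH (FORMAT 'csv', HEADER)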


Can I pass a .sql file to psql?

I'm trying to pass a SQL file to psql. After reading the docs, I tried:
psql_args=(
"password='$INPUT_PASSWORD'"
dbname=analytics
"host='$INPUT_HOST'"
user=analytics
port=32648
file=query.sql
)
psql "${psql_args[*]}"
psql: error: invalid connection option "file"
root@380773cb4e26:/#
If I remove the file=query.sql arg, this results in a connection to psql. I just don't know how to pass it a query file.
On the docs, two arguments look like ones of interest here:
-f filename
--file=filename
Read commands from the file filename, rather than standard input
and also:
-c command
--command=command
Specifies that psql is to execute the given command string, command
I tried the file=query.sql one, but that failed with the error message above. The --command one wants a string, whereas I want to pass a .sql file. I tried anyway:
psql_args=(
"password='$INPUT_PASSWORD'"
dbname=analytics
"host='$INPUT_HOST'"
user=analytics
port=32648
command=query.sql
)
psql "${psql_args[*]}"
psql: error: invalid connection option "command"
Is there a way that I can pass query.sql to psql in order to run a query?
You seem to be packaging options up into a connection string. But --file must be given directly as an option to psql, not as part of a connection string.
psql "${psql_args[*]}" --file=query.sql
Since other answers seem to overlook this, here is how to store dynamic options in an array and pass them as arguments to the command:
#!/usr/bin/env bash
psql_args=(
  "--dbname=analytics"
  "--host=$INPUT_HOST"
  "--username=analytics"
  "--port=32648"
  "--file=query.sql"
)
psql "${psql_args[@]}"

PostgreSQL unable to read CSV files on my desktop

I am trying to import a CSV file into PostgreSQL; however, I keep getting an error that no such file or directory exists.
This is the line of code I execute:
copy mu_data from 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER;
Can anyone suggest how to fix this?
copy is a command run on the server side. So unless your Postgres server happens to be on your localhost, the file very likely doesn't exist from the view of the server.
So one solution is for you to transfer the file to the server's filesystem somehow. Or, if you're using the psql command-line tool (or at least can use it for this task), you can use the \copy command there.
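For example, a minimal sketch of the \copy form (assuming psql runs on the same desktop machine; the absolute macOS-style path is a guess, so adjust it to wherever the file actually lives):
\copy mu_data FROM '/Users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER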

How to pass a variable to the copy command in PostgreSQL

I tried to use a variable in a SQL statement in PostgreSQL, but it did not work.
There are many CSV files stored under one path. I want to set a path variable in PostgreSQL so the copy command knows where to find the CSV files.
SQL statement sample:
\set outpath '/home/clients/ats-dev/'
\COPY licenses (_id, name,number_seats ) FROM :outpath + 'licenses.csv' CSV HEADER DELIMITER ',';
\COPY uploaded_files (_id, added_date ) FROM :outpath + 'files.csv' CSV HEADER DELIMITER ',';
It did not work. I got the error: no such file. The two files licenses.csv and files.csv are stored under /home/clients/ats-dev on Ubuntu. I found some solutions that use "\set file 'license.csv'", but that did not work for me because I have many CSV files. I also tried "from :outpath || 'licenses.csv'"; it did not work either. I appreciate any help.
Using 9.3.
It looks like psql does not support :variable substitution within psql backslash commands.
test=> \set somevar fred
test=> \copy z from :somevar
:somevar: No such file or directory
so you will need to do this via an external tool like the Unix shell, e.g.
for f in *.csv; do
  psql -c "\\copy $(basename "$f" .csv) FROM '$f' CSV HEADER"
done
You can try the server-side COPY command, where psql variable interpolation does work:
\set outpath '\'/home/clients/ats-dev'
COPY licenses (_id, name, number_seats) FROM :outpath/licenses.csv' WITH CSV HEADER DELIMITER ',';
COPY uploaded_files (_id, added_date) FROM :outpath/files.csv' WITH CSV HEADER DELIMITER ',';
Note: Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. Similarly, the command specified with PROGRAM is executed directly by the server, not by the client application, must be executable by the PostgreSQL user. COPY naming a file or command is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
Documentation: PostgreSQL 9.3 COPY
It may have been true when this was originally asked that psql backslash commands didn't support variable interpolation, but in my PostgreSQL 14 instance that's no longer the case. However, the psql man page is clear that \copy specifically does not support variable interpolation.
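For illustration, a small sketch against a hypothetical table t: inside an ordinary SQL command the :'variable' form is interpolated and quoted by psql, while \copy takes the rest of its line literally (note that plain COPY runs server-side and needs the corresponding file-reading privilege):
\set datafile '/tmp/t.csv'
-- interpolated: psql rewrites the next line to  COPY t FROM '/tmp/t.csv' CSV HEADER;
COPY t FROM :'datafile' CSV HEADER;
-- not interpolated: \copy would look for a file literally named :'datafile' and fail
\copy t from :'datafile' csv header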

PSQL COPY from a Shell Script

I am writing a shell script that fetches data (a .csv file) from AWS S3, downloads it locally onto an EC2 Linux AMI instance, and then copies the data to an RDS PostgreSQL database.
My Shell code is the following:
FILE="$(ls DB)"
PARAMETERFORDB= "'\\COPY table(x,y) FROM ''$FILE'' CSV HEADER'"
$(psql --host=XXXXX --port=XXXXX --username=XXXXX --password --dbname=XXXXX -c ${PARAMETERFORDB})
So when the data from S3 is downloaded, I store the file's name in the FILE variable (it is the only file in the folder; the folder will be deleted after the database query).
I get following error message:
./shellTest.sh: line 21: '\COPY table(x,y) FROM ''14.9.2016.csv'' CSV HEADER': command not found
psql: option requires an argument -- 'c'
Try "psql --help" for more information.
What am I doing wrong?
In the line
PARAMETERFORDB= "'\\COPY table(x,y) FROM ''$FILE'' CSV HEADER'"
remove the space after the = and remove one level of single quotes:
PARAMETERFORDB="\\COPY table(x,y) FROM '$FILE' CSV HEADER"
In the line where psql is invoked, enclose ${PARAMETERFORDB} in double quotes since it contains spaces.
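Putting both fixes together, a sketch of the corrected lines (the XXXXX connection values are the question's own placeholders, and the $( ... ) wrapper around the psql call has been dropped since command substitution isn't needed there):
FILE="$(ls DB)"
PARAMETERFORDB="\\COPY table(x,y) FROM '$FILE' CSV HEADER"
psql --host=XXXXX --port=XXXXX --username=XXXXX --password --dbname=XXXXX -c "${PARAMETERFORDB}"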

PostgreSQL Copy to FTP Server

we can use
copy (select * from mytbl) to 'D:/products.csv' with csv header
to export data from mytbl to local disk D,
so is it possible to use the same method to upload the file directly to an FTP server?
I tried like this:
copy (select * from mytbl) to 'ftp://usrname:mypasswrd@ftp.drivehq.com/masters/3/product/products.csv' with csv header
but got this error
ERROR: relative path not allowed for COPY to file
SQL state: 42602
using PostgreSQL 9.2
PostgreSQL does not support any source/destination for COPY other than a file or stdin/stdout.
What you can do is COPY to stdout and pipe that to a program that writes the data to the FTP directory. psql's \copy is useful for this:
psql -c "\copy mytable to stdout with (format csv, header)" | ncftpput -c my.ftp.host /path/on/host
You can use any tool that accepts the input data on a pipe to write to the remote ftp file; ncftpput is just one option.
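For instance, if ncftpput isn't installed, a hypothetical curl-based variant (curl uploads standard input when given -T -, and the FTP URL here is simply the one from the question) might look like:
psql -c "\copy mytable to stdout with (format csv, header)" | curl -T - 'ftp://usrname:mypasswrd@ftp.drivehq.com/masters/3/product/products.csv'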
A future PostgreSQL version may add support for invoking COPY with a pipe, e.g. COPY ... TO '|/some/command', but there are serious security concerns with running programs under the PostgreSQL user that would make this a superuser-only operation and of questionable safety even then. It's much safer to run the program client-side, and psql is ideal for that.