Containerized database file path and root directory unknown - postgresql

I recently tried copying my database contents to a csv file with the following command inside my containerized Postgres database:
\copy ${TABLE} TO ${FILE} DELIMITER ',' CSV HEADER;
I got a response indicating the file was copied successfully; however, I can't find where it was copied to. When I try specifying a different path for the output file, I get the response: directory/file.csv: No such file or directory
Does anyone know where containerized databases output files and how I can direct them to accessible locations?
I am on macOS, and this is the relevant part of the docker-compose file with which the database was initiated:
db:
  image: kartoza/postgis:12.0
  volumes:
    - postgis:/var/lib/postgresql

Docker containers store their data internally, in what are called Docker volumes. You can read more about them in the Docker documentation, under Use volumes.
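If you just want to see where that named volume lives, Docker can tell you. A quick sketch; "postgis" is the volume name from the compose file above, and Compose may prefix it with the project name:
docker volume ls
docker volume inspect postgis
Note that on Docker Desktop for Mac the reported Mountpoint is a path inside Docker's Linux VM, not a directory you can browse directly in Finder.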
Regarding your particular issue, you've got some options:
Copy to a temporary file and pull it from the container:
\copy ${TABLE} TO /tmp/file.csv DELIMITER ',' CSV HEADER;
Then run docker ps, find your container ID and run:
docker cp container_id:/tmp/file.csv file.csv
And you will have file.csv with the data in your current folder.
Another, simpler way, if the output is going to be short, is to export to stdout:
\copy ${TABLE} TO STDOUT DELIMITER ',' CSV HEADER;
This will dump all the data to the terminal. Only use it if there are few enough rows that the output doesn't scroll past your terminal's scrollback.
Third option, because two are never enough... you could temporarily publish port 5432 and connect from your local machine using psql; running the \copy command there will write the file to your local machine (see the sketch below). Or use third-party tools like pgAdmin or DataGrip to dump the data.
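A minimal sketch of that third option, assuming you publish the port in docker-compose.yml under the db service (the database, user and table names below are placeholders for your own):
ports:
  - "5432:5432"
Then, from a terminal on the Mac:
psql -h localhost -p 5432 -U postgres -d mydb -c "\copy mytable TO 'mytable.csv' DELIMITER ',' CSV HEADER"
mytable.csv is written on your Mac, in the directory you ran psql from.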

Related

Postgresql copy command not finding file

When running:
~/fidelity/releases/20220907033831$ ls -a
.
..
.browserslistrc
221005_users_all.csv
_private
the presence of a file is confirmed.
However, when launching a PostgreSQL command:
psql fidelity_development
COPY users (id,migrated_id,[...]) FROM '~/fidelity/releases/20220907033831/221005_users_all.csv' DELIMITER ';' CSV HEADER;
The response is unexpected:
ERROR: could not open file "~/fidelity/releases/20220907033831/221005_users_all.csv" for reading: No such file or directory
What am I missing to determine why postgresql cannot see this file?
Note: this directory was also symlinked as fidelity/current, and the same result was obtained when referring to that directory for the file, whereas bash sees it.
Use the \copy command: it is client-based, so the file is opened by psql on your machine and the local path is handled correctly.
COPY is server-based, so the path is resolved by the server process, which can cause issues finding your file; note that the error message shows the literal ~, which the server does not expand. A minimal sketch follows.
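A minimal sketch, spelling out the absolute path instead of relying on ~ (the /home/deploy prefix is a placeholder for your actual home directory; the column list is elided as in the question):
psql fidelity_development
\copy users (id,migrated_id,[...]) FROM '/home/deploy/fidelity/releases/20220907033831/221005_users_all.csv' DELIMITER ';' CSV HEADER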

PostgreSQL unable to read CSV files on my desktop

I am trying to import a CSV file into PostgreSQL; however, I keep getting an error that no such file or directory exists.
This is the line of code I execute:
copy mu_data from 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER;
Can anyone suggest how to fix this?
copy is a command run on the server side. So unless your Postgres server happens to be on your localhost, the file very likely doesn't exist from the server's point of view.
So one solution is to transfer the file to the server's filesystem somehow. Or, if you're using the psql command-line tool (or can at least use it for this task), you can use its \copy command, which reads the file on the client side; see the sketch below.
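A minimal sketch of the \copy route, run with psql on the machine where the CSV lives (mydb is a placeholder for your database name, and the path assumes a macOS-style /Users/... desktop; adjust both to your setup):
psql -d mydb -c "\copy mu_data FROM '/Users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER"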

PostgreSQL Copy to FTP Server

We can use:
copy (select * from mytbl) to 'D:/products.csv' with csv header
to export data from mytbl to local disk D.
So is it possible to use the same method to upload the file directly to an FTP server?
I tried it like this:
copy (select * from mytbl) to 'ftp://usrname:mypasswrd@ftp.drivehq.com/masters/3/product/products.csv' with csv header
but got this error
ERROR: relative path not allowed for COPY to file
SQL state: 42602
using PostgreSQL 9.2
As of PostgreSQL 9.2, COPY supports no source or destination other than a file or stdin/stdout.
What you can do is COPY to stdout and pipe that to a program that writes the data to the ftp dir. psql's \copy is useful for this:
psql -c "\copy mytable to stdout with (format csv, header)" | ncftpput -c my.ftp.host /path/on/host
You can use any tool that accepts the input data on a pipe to write to the remote ftp file; ncftpput is just one option.
A future PostgreSQL version may add support for invoking COPY with a pipe, e.g. COPY ... TO '|/some/command', but there are serious security concerns with running programs under the PostgreSQL user that would make this a superuser-only operation and of questionable safety even then. It's much safer to run the program client-side, and psql is ideal for that.
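For what it's worth, later PostgreSQL releases (9.3 and up) did add something along these lines as COPY ... TO PROGRAM. It runs the command as the server's operating-system user and requires superuser rights (or, from version 11, membership in pg_execute_server_program), so the client-side psql pipe above remains the safer default. A sketch reusing the ncftpput example, with the host and remote path as placeholders:
COPY mytbl TO PROGRAM 'ncftpput -c my.ftp.host /path/on/host/products.csv' WITH (FORMAT csv, HEADER);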

How to import Zipped file into Postgres Table

I would like to import a file into my PostgreSQL system (specifically Redshift). I have found an argument for copy that allows importing a gzip file, but the provider of the data I am trying to include in my system only produces it as a .zip. Are there any built-in Postgres commands for opening a .zip?
From within Postgres:
COPY table_name FROM PROGRAM 'unzip -p input.csv.zip' DELIMITER ',';
From the man page for unzip -p:
-p    extract files to pipe (stdout). Nothing but the file data is sent to stdout, and the files are always extracted in binary format, just as they are stored (no conversions).
Can you just do something like:
unzip -p myfile.zip | gzip > myfile.gz
Easy enough to automate if you have enough files.
This might only work when loading Redshift from S3, but you can actually just include a GZIP flag when copying data into Redshift tables.
This is the format that works for me if my S3 bucket contains a gzipped .csv:
copy <table> from 's3://mybucket/<foldername>' '<aws-auth-args>' delimiter ',' gzip;
unzip -c /path/to/.zip | psql -U user
The 'user' must have superuser rights, or else you will get a
ERROR: must be superuser to COPY to or from a file
To learn more about this, see
https://www.postgresql.org/docs/8.0/static/backup.html
Basically, this approach is used when handling large databases.

Using COPY FROM in postgres - absolute filename of local file

I'm trying to import a csv file using the COPY FROM command with postgres.
The DB is stored on a Linux server, and my data is stored locally, i.e. C:\test.csv.
I keep getting the error:
ERROR: could not open file "C:\test.csv" for reading: No such file or directory
SQL state: 58P01
I know that I need to use an absolute path for the filename as seen by the server, but everything I try brings up the same error.
Can anyone help please?
Thanks
Quote from the PostgreSQL manual:
The file must be accessible to the server and the name must be specified from the viewpoint of the server
So you need to copy the file to the server before you can use COPY FROM.
If you don't have access to the server, you can use psql's \copy command, which is very similar to COPY FROM but works with local files. See the manual for details; a quick sketch follows.
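A quick sketch, run from psql on the Windows machine (mytable is a placeholder for your table name; forward slashes in the path sidestep psql's backslash handling inside quoted \copy arguments):
\copy mytable FROM 'C:/test.csv' DELIMITER ',' CSV HEADER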