psql error: could not stat file - unknown error - copy

I'm using psql's COPY command to copy a CSV file from a directory into a target table in a Postgres database. It works if the file is less than 1 GB, but if the file is bigger than 1 GB I get the error: could not stat file "mydata.csv": Unknown error
I tried a smaller file, less than 1 GB, and the psql COPY command works when executed from a .bat file.
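One workaround worth trying (this "could not stat file ... Unknown error" message has been reported on Windows with large files, where the server-side stat call can fail) is to stream the file through the client with psql's \copy instead of a server-side COPY. The table and database names below are placeholders:

psql -d mydb -c "\copy mytable FROM 'mydata.csv' WITH (FORMAT csv, HEADER)"

With \copy, psql itself opens and reads the file and feeds it to the server over the connection, so the server never has to stat the file.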

Related

PostgreSQL COPY command not finding file

When running:
~/fidelity/releases/20220907033831$ ls -a
.
..
.browserslistrc
221005_users_all.csv
_private
the presence of the file is confirmed.
However, when launching the following PostgreSQL command:
psql fidelity_development
COPY users (id,migrated_id,[...]) FROM '~/fidelity/releases/20220907033831/221005_users_all.csv' DELIMITER ';' CSV HEADER;
The response is unexpected:
ERROR: could not open file "~/fidelity/releases/20220907033831/221005_users_all.csv" for reading: No such file or directory
What am I missing to determine why postgresql cannot see this file?
Note: this directory was also symlinked as fidelity/current, and the same result was obtained when referring to the file through that path, even though bash can see it.
Use the \copy command instead: it is client-side, so the path is resolved on your machine (including the ~ in it).
COPY, on the other hand, is executed by the server process, which cannot find a file under your client's home directory.
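For example (keeping the question's elided column list as-is), the client-side form would be:

\copy users (id,migrated_id,[...]) FROM '~/fidelity/releases/20220907033831/221005_users_all.csv' DELIMITER ';' CSV HEADER

\copy is a psql meta-command, so the whole command has to stay on a single line; psql expands the leading ~ itself before opening the file.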

Import csv file into table in pgAdmin using COPY command

I'm trying to import a CSV file into a table in pgAdmin using the COPY command and am getting the error:
ERROR: could not open file "R:\myfile.csv" for reading: No such file or directory
HINT: COPY FROM instructs the PostgreSQL server process to read a file. You may want a client-side facility such as psql's \copy.
The statement is:
copy tst_copy from 'R:\myfile.csv'
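Following the hint, the client-side counterpart, run from psql rather than the pgAdmin query tool (pgAdmin sends plain SQL, which the server executes, so the server itself would need to see R:), would be:

\copy tst_copy from 'R:\myfile.csv'

Here psql opens the file on your machine and streams it to the server, so R: only needs to be visible to the client.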

Containerized database file path and root directory unknown

I recently tried copying my database contents to a csv file with the following command inside my containerized Postgres database:
\copy ${TABLE} TO ${FILE} DELIMITER ',' CSV HEADER;
I got a response indicating the file was successfully copied; however, I can't find where it was copied to. When I try specifying a different output path, I get the response: directory/file.csv: No such file or directory
Does anyone know where containerized databases write output files, and how I can direct them to accessible locations?
I am on macOS; here is the relevant part of the docker-compose file the database was started with:
db:
  image: kartoza/postgis:12.0
  volumes:
    - postgis:/var/lib/postgresql
Docker containers store their data internally in what are called Docker volumes. You can read more about them in the Docker documentation under Use volumes.
Regarding your particular issue, you've got some options:
Copy to a temporary file and pull it from the container:
\copy ${TABLE} TO /tmp/file.csv DELIMITER ',' CSV HEADER;
Then run docker ps, find your container ID and run:
docker cp container_id:/tmp/file.csv file.csv
And you will have file.csv with the data in your current folder.
Another, simpler way is to export to STDOUT, if the output is going to be short:
\copy ${TABLE} TO STDOUT DELIMITER ',' CSV HEADER;
This will dump all the data through the terminal. Only use it if there are few enough rows that they don't scroll past your terminal's scrollback buffer.
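If the output is too long for the terminal, a variation on the same idea is to run psql via docker exec and redirect its stdout to a local file; the container name, user, and database here are assumptions:

docker exec container_id psql -U postgres -d mydb -c "\copy mytable TO STDOUT DELIMITER ',' CSV HEADER" > file.csv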
Third option, because two are never enough... you could temporarily publish port 5432 and connect from your local machine using psql; then running the copy command will dump to your local machine. (Or use third-party tools like pgAdmin or DataGrip to dump the information.)
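A minimal sketch of that third option, assuming the compose service from the question (the user and database depend on how the kartoza/postgis image was configured): add a ports mapping to the db service,

ports:
  - "5432:5432"

then from the host run:

psql -h localhost -p 5432 -U postgres -d mydb -c "\copy mytable TO 'file.csv' DELIMITER ',' CSV HEADER"

and file.csv lands in your current directory on the host.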

psql COPY from Shell Script

I am writing a shell script that fetches data (a .csv file) from AWS S3, downloads it locally onto an EC2 Linux AMI instance, and then copies the data into an RDS PostgreSQL database.
My Shell code is the following:
FILE="$(ls DB)"
PARAMETERFORDB= "'\\COPY table(x,y) FROM ''$FILE'' CSV HEADER'"
$(psql --host=XXXXX --port=XXXXX --username=XXXXX --password --dbname=XXXXX -c ${PARAMETERFORDB})
So when the data from S3 is downloaded, I store the file's name in the FILE variable (it is the only file in the folder; the folder will be deleted after the database query).
I get the following error message:
./shellTest.sh: line 21: '\COPY table(x,y) FROM ''14.9.2016.csv'' CSV HEADER': command not found
psql: option requires an argument -- 'c'
Try "psql --help" for more information.
What am I doing wrong?
In the line
PARAMETERFORDB= "'\\COPY table(x,y) FROM ''$FILE'' CSV HEADER'"
remove the space after the = and remove one level of single quotes:
PARAMETERFORDB="\\COPY table(x,y) FROM '$FILE' CSV HEADER"
In the line where psql is invoked, enclose ${PARAMETERFORDB} in double quotes since it contains spaces.
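Putting both fixes together, a corrected sketch of that part of the script (connection details kept as the question's placeholders):

FILE="$(ls DB)"
PARAMETERFORDB="\\COPY table(x,y) FROM '$FILE' CSV HEADER"
psql --host=XXXXX --port=XXXXX --username=XXXXX --password --dbname=XXXXX -c "${PARAMETERFORDB}"

Note that the $( ) wrapped around the psql call in the original script should also go: it would try to execute psql's output as another command. And since \copy resolves relative paths against the client's working directory, the script may need FROM 'DB/$FILE' if it is not run from inside the DB folder.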

postgres - save output to server hard drive

When I execute the following script:
copy (
select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4
from patient_agk_p_expr
where status_agk_p_k4='Preferentially')
to 'g:\boom.csv'
With CSV HEADER;
It works just beautifully, and creates the boom.csv file on my G: drive.
I get:
Query returned successfully: 8486 rows affected, 631 ms execution time.
I should note that my G: drive is an external hard drive that is connected to my computer.
And my cygwin refers to my G: drive like this:
blumr04@SRB524YBZ1 /cygdrive/g/
$ pwd
/cygdrive/g
Now, my computer also has access to one of my organization's server drives.
In Windows Explorer it is referred to as (Z:).
My cygwin refers to the Z: drive accordingly (just as it does to my C: drive):
blumr04@SRB524YBZ1 /cygdrive/z/
$ pwd
/cygdrive/z
But I run into trouble when it comes to having Postgres recognize this drive. When I attempt to run the following script in order to save my table to the Z: drive:
copy (
select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4
from patient_agk_p_expr
where status_agk_p_k4='Preferentially')
to 'z:\boom.csv'
With CSV HEADER;
I get the following error message:
ERROR: could not open file "z:\boom.csv" for writing: No such file or directory
********** Error **********
ERROR: could not open file "z:\boom.csv" for writing: No such file or directory
SQL state: 58P01
Does anyone know how I can save (COPY TO) my files to a hard drive that is not physically connected to my computer but is a server drive?
Also, is there a command or script in Postgres that can show me which drives are accessible to it? It looks like the Z: drive is for some reason not accessible to Postgres for reading or writing, at least not the way I'm attempting it, while G:, J:, K: and other external hard drives are. I'd be glad to know if I can somehow expand what Postgres can access.
Thanks!
@Mike Sherrill 'Cat Recall':
My Z: drive is also referred to in Windows Explorer as:
(\\shares.nyumc.org\research) (Z:),
therefore I also tried the following:
copy (
select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4
from patient_agk_p_expr
where status_agk_p_k4='Preferentially')
to '\\shares.nyumc.org\research\boom.csv'
With CSV HEADER;
This script gives me the following error, indeed all about permissions:
ERROR: could not open file "\\shares.nyumc.org\research\boom.csv" for writing: Permission denied
********** Error **********
ERROR: could not open file "\\shares.nyumc.org\research\boom.csv" for writing: Permission denied
SQL state: 42501
So it looks like the right path is:
\\shares.nyumc.org\research\
and that (Z:) is merely an alias(?!), since this time the error message is NOT about "No such file or directory" but rather about permissions.
Is there a way I could grant Postgres the permission it needs to write to the server drive?
The most common problem with running COPY tablename TO filename is dealing with paths and permissions from the point of view of the PostgreSQL server.
Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. (Source: the PostgreSQL documentation on COPY)
If you try to write to a file that the PostgreSQL server can't "see", you'll get "No such file or directory". If you try to write to a file in a directory for which the PostgreSQL server lacks permissions, you'll get "Permission denied".
So odds are good that the PostgreSQL user (the user ID the server runs as) lacks permissions on Z:.
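If granting the server's service account rights on the share is not an option, a client-side alternative that sidesteps server permissions entirely is psql's \copy, which writes the file as the connected client user. Reusing the query from the question (all on one line, since \copy is a psql meta-command; the forward slash in the path avoids any backslash-escaping concerns, and Windows accepts it):

\copy (select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4 from patient_agk_p_expr where status_agk_p_k4='Preferentially') to 'Z:/boom.csv' with csv header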