Computer: Mac OS X, version 10.8
Database: Postgres
Trying to import a CSV file into Postgres.
pg> copy items_ordered from '/users/darchcruise/desktop/items_ordered.csv' with CSV;
ERROR: could not open file "/users/darchcruise/desktop/items_ordered.csv" for reading: Permission denied
Then I tried
$> chown postgres /users/darchcruise/desktop/items_ordered.csv
chown: /users/darchcruise/desktop/items_ordered.csv: Operation not permitted
Lastly, I tried
$> ls -l
-rw-r--r-- 1 darchcruise staff 1016 Oct 18 21:04 items_ordered.csv
Any help is much appreciated!
Assuming the psql command-line tool, you may use \copy instead of copy.
\copy opens the file and feeds the contents to the server, whereas copy tells the server to open the file itself and read it, which may be problematic permission-wise, or even impossible if client and server run on different machines with no file sharing in-between.
Under the hood, \copy is implemented as COPY FROM stdin and accepts the same options as the server-side COPY.
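For example, with the path from the question (a sketch; enter it at the psql prompt on a single line):
\copy items_ordered FROM '/users/darchcruise/desktop/items_ordered.csv' WITH CSV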
Copy the CSV file to /tmp
For me this solved the issue.
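Roughly (a sketch reusing the file and table names from the question):
cp /users/darchcruise/desktop/items_ordered.csv /tmp/
and then, back in psql:
copy items_ordered from '/tmp/items_ordered.csv' with CSV;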
chmod a+rX /users/darchcruise/ /users/darchcruise/desktop /users/darchcruise/desktop/items_ordered.csv
This will change access rights for your folder. Note that everyone will be able to read your file.
You can't use chown as a user without administrative rights.
Also consider learning umask to ease creation of shared files.
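For instance, with a umask of 022, files you create afterwards default to mode 644 (world-readable) and directories to 755:
umask 022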
Copy your CSV file into the /tmp folder
Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. COPY naming a file is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
I had this issue when I was trying to export data from a remote server to the local disk. I hadn't realised that SQL COPY is actually executed on the server and tries to write to a folder on the server. The correct thing to do instead was to use \copy, which is a psql command, and it writes to the local file system as I expected. http://www.postgresql.org/message-id/CAFjNrYsE4Za_KWzmfgN1_-MG7GTw_vpMRxPk=OEjAiLqLskxdA#mail.gmail.com
Perhaps that might be useful to someone else too.
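Something along these lines (a sketch; the table name and local path here are made up):
\copy (SELECT * FROM my_table) TO '/home/me/my_table.csv' WITH CSV HEADER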
Another way to do this, if you have pgAdmin and are comfortable using the GUI: go to the table in the schema, right-click on the table you wish to import the file into, and select "Import". Browse your computer for the file, select the file type, choose the columns you want the data imported into, and then select Import.
That was done using pgAdmin III and the 9.4 version of PostgreSQL
I resolved the same issue with a recursive chown on the parent folder:
sudo chown -R postgres:postgres /home/my_user/export_folder
(my export being in /home/my_user/export_folder/export_1.csv)
On a MacBook, I first opened Terminal and typed
open /tmp
or, in Finder, press Command+Shift+G and type /tmp in "Go to the folder".
This opens the temp folder in Finder. I then pasted the copied CSV file into this folder, went back to the Postgres terminal, and typed the command below, which copied my CSV data into the DB table:
\copy recharge_operator FROM '/private/tmp/operator.csv' DELIMITER ',' CSV;
COPY your table (Name, Latitude, Longitude) FROM 'C:\Temp\your file.csv' DELIMITERS ',' CSV HEADER;
Use c:\Temp\"Your File"\.
For me it worked to simply to add sudo (or run as root) for the chown command:
sudo chown postgres /users/darchcruise/desktop/items_ordered.csv
You must grant the pg_read_server_files role to the user if you are not using the postgres superuser.
Example:
GRANT pg_read_server_files TO my_user WITH ADMIN OPTION;
Just in case you're facing this problem under Windows 10: add the user group "yourcomputer\Users" on the Security tab and grant it full control. That solved my issue.
I had the same error message but was using psycopg2 to communicate with PostgreSQL. I fixed the permission issues by using the functions copy_from and copy_expert, which open the file on the client side as the user running the Python script and feed the data to the database over STDIN.
Refer to this link for further information.
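As a rough sketch of the copy_expert variant (the connection parameters, table, and file name below are placeholders, not from the original post):
import psycopg2

conn = psycopg2.connect(dbname="mydb", user="my_user")
with conn, conn.cursor() as cur, open("items_ordered.csv") as f:
    # copy_expert opens the file on the client side and streams it to the
    # server over STDIN, so only the user running this script needs read
    # access to the file
    cur.copy_expert("COPY items_ordered FROM STDIN WITH (FORMAT csv)", f)
conn.close()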
This answer is only for Linux Beginners.
Assuming initially the DB user didn't have file/folder(directory) permission on the client side.
Let's constrain ourselves to the following:
User: postgres
Purpose: You wanted to (write to / read from) a specific folder
Tool: psql
Connected to a specific database: YES
FILE_PATH: /home/user/training/sql/csv_example.csv
Query: \copy (SELECT * FROM table_name) TO 'FILE_PATH' DELIMITER ',' CSV HEADER;
Actual Results: After running the query you got an error: Permission Denied
Expected Results: COPY COUNT_OF_ROWS_COPIED
Here are the steps I'd follow to try and resolve it.
Confirm the FILE_PATH permissions on your File system.
Inside a terminal to view the permissions for a file/folder you need to long list them by entering the command ls -l.
The output has a section that shows something like this: drwxrwxr-x
Which is interpreted in the following way:
TYPE | OWNER RIGHTS | GROUP RIGHTS | OTHERS RIGHTS
rwx (r: read, w: write, x: execute)
TYPE (1 Char) = d: directory, -: file
OWNER RIGHTS (3 Chars after TYPE)
GROUP RIGHTS (3 Chars after OWNER)
OTHERS RIGHTS (3 Chars after GROUP)
If the permissions are not enough, ensure that the postgres user can at least enter (x) every folder in the path you want to use.
This means that for FILE_PATH, all the directories (home, user, training, sql) should have at least an x in the OTHERS RIGHTS.
Change permissions for all parent folders that you need to enter so they have an x. You can use chmod rights_you_want parent_folder
Assuming /training/ didn't have an execute permission.
I'd go to the user folder and enter chmod a+x training
Change the destination folder/directory to have a w if you want to write to it, or at least an r if you want to read from it.
Assuming /sql didn't have a write permission.
I would now chmod a+w sql
Restart the PostgreSQL server: sudo systemctl restart postgresql
Try again.
This would most probably help you now get a successful expected result.
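Put together, the sequence might look like this (a sketch using the example FILE_PATH above; adjust it to your own paths):
ls -ld /home /home/user /home/user/training /home/user/training/sql   # check that each parent directory has at least x for others
chmod a+x /home/user/training       # add execute on a parent that lacks it
chmod a+w /home/user/training/sql   # add write on the destination folder
sudo systemctl restart postgresql   # restart the server as above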
On Linux you can fix this by giving the postgres user read/write/execute permissions on the target directory. Eg:
setfacl -m u:postgres:rwx /home/hi
I just copied the source CSV file to another folder with more permissive access rights (C:/temp), and it worked fine.
Maybe you are using pgAdmin connected to a remote host and trying to import from your own system, but the server searches for the file on the remote system's file system. That is the error I faced; maybe it applies to you as well, so check it.
Related
If I set PGPASSFILE to an explicit path like /home/user/.pgpass, it works fine: when logged in as the user that owns that file, I can use psql with the entries in .pgpass.conf.
The problem I have is that I need multiple accounts to be able to use psql. If I change PGPASSFILE to a path in the user's home directory, like ~/.pgpass.conf, it doesn't read the file and gives a password error.
Because I can only specify one file it means only the owner of that file can run the commands I need to run.
I am running on Ubuntu 18.04 and I need root & www-data to have a .pgpass.conf file.
How do I do this?
If you have system users corresponding to your db users (root and www-data in your case), each has its own, separate .pgpass file in its respective home directory. Set each accordingly.
And simply do not set PGPASSFILE at all. The manual:
PGPASSFILE behaves the same as the passfile connection parameter.
And:
passfile
Specifies the name of the file used to store passwords (see Section 33.15). Defaults to ~/.pgpass, or
%APPDATA%\postgresql\pgpass.conf on Microsoft Windows. (No error is
reported if this file does not exist.)
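For example, a minimal ~/.pgpass (host, port, database, user, and password below are placeholders) holds one line per connection in the format hostname:port:database:username:password:
localhost:5432:mydb:www-data:secret
The file must be readable only by its owner, or libpq ignores it:
chmod 600 ~/.pgpass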
Related:
Run batch file with psql command without password
I use PostgreSQL 9.4.1
My query:
copy(select * from city) to 'C:\\temp\\city.csv'
copy(select * from city) to E'C:\\temp\\city.csv'
ERROR: relative path not allowed for COPY to file
********** Error **********
ERROR: relative path not allowed for COPY to file SQL state: 42602
As with this case, it seems likely that you are attempting to use copy from a computer other than the one which hosts your database. copy does I/O from the database host machine's local file system only. If you have access to that filesystem, you can adjust your attempt accordingly. Otherwise, you can use the \copy command in psql.
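For instance (a sketch, assuming you run psql on the Windows machine that has C:\temp; with \copy the path is resolved on the client):
\copy (select * from city) to 'C:/temp/city.csv' with csv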
I am using pgAdmin v1.5. The first query is:
select table_name from information_schema.tables where table_catalog = 'ofbiz' order by table_name
Then I press the Download button, and pgAdmin returns a CSV file, which is the result set of the first query.
It could be late but I think it can be helpful.
On Windows, make sure the output directory has read/write rights granted to Everyone (or to a specific user name).
Use a slash (/) instead of a backslash (\), for example:
COPY DT1111 TO 'D:/TEST/DT1111_POST.CSV' DELIMITER ',' CSV HEADER;
TLDR: Make sure you also have write permissions in your copy-to location!
I had the exact same first error, ERROR: relative path not allowed for COPY to file, even though I used '/tmp/db.csv' (which is not a relative path).
In my case, the error message was quite misleading, since I was on the host machine, had an absolute file path, and the location existed. My problem was that I used the bitnami postgres:12 docker image, and the tmp folder in the container belongs to root there, while postgres and psql use the postgres user. My solution was to create an export folder there and transfer ownership to the postgres user:
mkdir /tmp/export
chown postgres:postgres /tmp/export
Then I was able to use COPY tablename TO '/tmp/export/db.csv'; successfully.
I am trying to read a CSV file located on a postgres 8.4 server filesystem:
COPY ip2location_db1 FROM '/pgsrc/IP2LOCATION-LITE-DB9.CSV' WITH CSV QUOTE AS '"';
I am getting the error:
Cannot open file for read access: Permission denied
The file is owned by postgres, and I tried putting it in /var/lib/pgsql and also in the /pgsources folder, whose ownership I gave to the postgres user.
What am I doing wrong?
I have run into this issue before, and rather than jockey around with permissions all the time, I just import from STDIN.
This would accomplish what you want (albeit not precisely the way you want to do it), but I think it's a lot less cumbersome and error-prone. Try:
cat /pgsrc/IP2LOCATION-LITE-DB9.CSV | psql -c "COPY ip2location_db1 FROM STDIN (FORMAT CSV);"
This does imply that you're running the query from a shell script or similar, but to implement it the other way you'd have to script the permission changes anyway.
(Also, according to the docs, the default quote is the double quote, so you don't need to specify the quote.)
When I execute the following script:
copy (
select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4
from patient_agk_p_expr
where status_agk_p_k4='Preferentially')
to 'g:\boom.csv'
With CSV HEADER;
It works just beautifully, and creates the boom.csv file on my g drive.
I get:
Query returned successfully: 8486 rows affected, 631 ms execution time.
I should note that my 'g' drive is an external harddrive that is connected to my computer.
And my cygwin refers to my g harddrive like this:
blumr04#SRB524YBZ1 /cygdrive/g/
$ pwd
/cygdrive/g
Now, my computer also has access to a server harddrive of my organization.
In my Windows Explorer it is referred to as (Z:).
My cygwin refers to the 'Z' drive accordingly (just as it does to my C: drive):
blumr04#SRB524YBZ1 /cygdrive/z/
$ pwd
/cygdrive/z
But I have trouble getting Postgres to recognize this harddrive, when I attempt to run the following script in order to save my table to the Z harddrive:
copy (
select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4
from patient_agk_p_expr
where status_agk_p_k4='Preferentially')
to 'z:\boom.csv'
With CSV HEADER;
I get the following error message:
ERROR: could not open file "z:\boom.csv" for writing: No such file or directory
********** Error **********
ERROR: could not open file "z:\boom.csv" for writing: No such file or directory
SQL state: 58P01
Does anyone know how I can save (copy to) my files when it comes to a harddrive that is not physically connected to my computer, but rather is a server harddrive?
- Is there a command/script in Postgres that would show me which harddrives are accessible to Postgres? It looks like for some reason the Z harddrive is not accessible for Postgres read/write, at least not in the way I attempt it, while G, J, K, and other harddrives, which are external HDs, are accessible. I would be glad to know if I could expand Postgres's accessibility somehow.
Thanks!
@Mike Sherrill 'Cat Recall'
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
My Z drive is referred to in Windows Explorer also as:
(\\shares.nyumc.org\research) (Z:),
therefore I tried also the following:
copy (
select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4
from patient_agk_p_expr
where status_agk_p_k4='Preferentially')
to '\\shares.nyumc.org\research\boom.csv'
With CSV HEADER;
This scripts gives me the following error, indeed all about permissions:
ERROR: could not open file "\\shares.nyumc.org\research\boom.csv" for writing: Permission denied
********** Error **********
ERROR: could not open file "\\shares.nyumc.org\research\boom.csv" for writing: Permission denied
SQL state: 42501
So it looks like the right path is:
\\shares.nyumc.org\research\
And that in this case (Z:) is merely an alias name(?!) as the error message is this time NOT about "No such file or directory", but rather about permissions.
Is there a way I could facilitate the necessary permission to Postgres so it could write to the server drive?
The most common problem with running COPY tablename to filename is dealing with path and permissions from the point of view of the PostgreSQL server.
Files named in a COPY command are read or written directly by the
server, not by the client application. Therefore, they must reside on
or be accessible to the database server machine, not the client. They
must be accessible to and readable or writable by the PostgreSQL user
(the user ID the server runs as), not the client. Source
If you try to write to a file that the PostgreSQL server can't "see", you'll get "No such file or directory". If you try to write to a file in a directory for which the PostgreSQL server lacks permissions, you'll get "Permission denied".
So odds are good that the PostgreSQL user (the user ID the server runs as) lacks permissions on "z".
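If granting the service account rights on the network share is not practical, a client-side \copy from psql (which writes the file with your own Windows permissions, as described in the answers above) may work instead; a sketch, kept on a single line as psql requires:
\copy (select agk_p_id Promoter_agk, multiplication_lr_agk_p_k4, agk_lr_rvd, status_agk_p_k4 from patient_agk_p_expr where status_agk_p_k4='Preferentially') to 'z:/boom.csv' with csv header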