I am trying to populate a Postgres table with data from a CSV file when running docker-compose up -d; however, all the methods I have tried end up saying the file could not be found.
One of the approaches I tried uses golang-migrate. The migrations that create the table work, but when it attempts to run COPY customers FROM 'customers.csv' CSV HEADER; it gives the following error:
error: migration failed: could not open file "customers.csv" for reading: No such file or directory
My migrations step looks like this:
migrations:
  image: migrate/migrate
  command: -database postgres://postgres:password@database:5432/database?sslmode=disable -path /migrations up
  volumes:
    - ./migrations:/migrations
The customers.csv file is located in my migrations directory, along with the migration SQL files that create and drop the table (both of which work fine) and a third migration SQL file containing the COPY query. I was under the impression that setting the volume to ./migrations:/migrations would map all files from my project's ./migrations directory to /migrations in the container, so I really don't understand how it can't find the file.
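The third migration is essentially just the COPY statement, along these lines (the file name here is illustrative):

-- 000003_seed_customers.up.sql (illustrative file name)
COPY customers FROM 'customers.csv' CSV HEADER;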
Is there something else I need to do to get my csv file to my docker container or is there a better way to do this?
It seems the migration files themselves are found correctly via the -path /migrations flag, but that flag doesn't change the working directory. The COPY customers FROM 'customers.csv' CSV HEADER; command will try to read customers.csv relative to the working directory, since it is not an absolute path.
You could either change the command to COPY customers FROM '/migrations/customers.csv' CSV HEADER; or change the working directory of the container via the working_dir docker-compose option:
migrations:
  image: migrate/migrate
  command: -database postgres://postgres:password@database:5432/database?sslmode=disable -path /migrations up
  volumes:
    - ./migrations:/migrations
  working_dir: /migrations
The second option may be slightly nicer, as it doesn't require a SQL code change.
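One caveat worth noting: COPY ... FROM is executed by the PostgreSQL server process, so a server-side path is ultimately resolved inside the database container, not the migrations container. If the database service can't already see the CSV, mount the same directory there as well; a minimal sketch, with the service name taken from the connection string and the image name assumed:

database:
  image: postgres
  volumes:
    - ./migrations:/migrations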
Related
When running:
~/fidelity/releases/20220907033831$ ls -a
.
..
.browserslistrc
221005_users_all.csv
_private
the presence of a file is confirmed.
However, when launching a PostgreSQL command:
psql fidelity_development
COPY users (id,migrated_id,[...]) FROM '~/fidelity/releases/20220907033831/221005_users_all.csv' DELIMITER ';' CSV HEADER;
The response is unexpected:
ERROR: could not open file "~/fidelity/releases/20220907033831/221005_users_all.csv" for reading: No such file or directory
What am I missing to determine why postgresql cannot see this file?
Note this directory was also symlinked as fidelity/current, and the same result was obtained when referring to the file through that directory, whereas bash sees it.
Use the \copy command, as it is client-based and handles the local path correctly, while COPY is server-based, which can cause issues finding your file.
In particular, server-side COPY opens the path literally and performs no shell-style ~ expansion, so it looks for a directory literally named ~.
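For example, the command from the question should work with just a leading backslash added (the elided column list is kept as in the question; psql expands a leading ~ in \copy file arguments itself, and a psql meta-command needs no trailing semicolon):

\copy users (id,migrated_id,[...]) FROM '~/fidelity/releases/20220907033831/221005_users_all.csv' DELIMITER ';' CSV HEADER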
I recently tried copying my database contents to a csv file with the following command inside my containerized Postgres database:
\copy ${TABLE} TO ${FILE} DELIMITER ',' CSV HEADER;
I got a response indicating the file was successfully copied however I can't find where it was copied to. When I try specifying a different path to output the file, I get the response directory/file.csv: No such file or directory
Does anyone know where containerized databases output files and how I can direct them to accessible locations?
I am on macOS, and this is some of the relevant info from my docker-compose file with which the database was initiated:
db:
  image: kartoza/postgis:12.0
  volumes:
    - postgis:/var/lib/postgresql
Docker containers store their files in the container's own filesystem, which is not directly visible from the host; data meant to persist lives in what are called Docker volumes. You can read more on that in Use volumes.
Regarding your particular issue, you've got some options:
Copy to a temporary file and pull it from the container:
\copy ${TABLE} TO /tmp/file.csv DELIMITER ',' CSV HEADER;
Then run docker ps, find your container ID and run:
docker cp container_id:/tmp/file.csv file.csv
And you will have file.csv with the data in your current folder.
Another, simpler way is to export to STDOUT, if the output is going to be short:
\copy ${TABLE} TO STDOUT DELIMITER ',' CSV HEADER;
This will dump all the data through the terminal. Only use it if there are few enough rows that the output doesn't overflow your terminal's scrollback.
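If there are too many rows for the terminal, you can also redirect STDOUT straight into a file on the host; a sketch assuming a container named container_id and hypothetical database and table names:

docker exec -i container_id psql -U postgres -d database -c "\copy mytable TO STDOUT DELIMITER ',' CSV HEADER" > mytable.csv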
Third option, because two are never enough: you could temporarily publish port 5432 and connect from your local machine using psql; then running the \copy command will dump to your local machine. (Or use third-party tools like pgAdmin or DataGrip to dump the information.)
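A sketch of that third option, reusing the db service from the question (table name and credentials assumed):

db:
  image: kartoza/postgis:12.0
  ports:
    - "5432:5432"
  volumes:
    - postgis:/var/lib/postgresql

Then, from the host machine:

psql -h localhost -p 5432 -U postgres -c "\copy mytable TO 'mytable.csv' DELIMITER ',' CSV HEADER"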
I use PostgreSQL 9.4.1
My query:
copy(select * from city) to 'C:\\temp\\city.csv'
copy(select * from city) to E'C:\\temp\\city.csv'
ERROR: relative path not allowed for COPY to file
********** Error **********
ERROR: relative path not allowed for COPY to file SQL state: 42602
As with this case, it seems likely that you are attempting to use copy from a computer other than the one which hosts your database. copy does I/O from the database host machine's local file system only. If you have access to that filesystem, you can adjust your attempt accordingly. Otherwise, you can use the \copy command in psql.
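For instance, run from psql on the Windows machine, \copy goes through the client, so the local Windows path works (forward slashes sidestep the backslash-escaping problem):

\copy (select * from city) to 'C:/temp/city.csv' csv header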
I am using pgAdmin v1.5. The first query is:
select table_name from information_schema.tables where table_catalog = 'ofbiz' order by table_name
Then I press the Download button, and pgAdmin returns a CSV file containing the result set of the first query.
It could be late, but I think it can be helpful.
On Windows, make sure the output directory grants read/write rights to Everyone (or to a specific user name).
Use a slash (/) instead of a backslash (\), for example:
COPY DT1111 TO 'D:/TEST/DT1111_POST.CSV' DELIMITER ',' CSV HEADER;
TL;DR: Make sure you also have write permissions on your copy-to location!
I had the exact same first error, ERROR: relative path not allowed for COPY to file, even though I used '/tmp/db.csv' (which is not a relative path).
In my case, the error message was quite misleading, since I was on the host machine, had an absolute file path, and the location existed. My problem was that I used the bitnami postgres:12 Docker image, where the tmp folder in the container belongs to root, while postgres and psql run as the postgres user. My solution was to create an export folder there and transfer ownership to the postgres user:
mkdir /tmp/export
chown postgres:postgres /tmp/export
Then I was able to use COPY tablename TO '/tmp/export/db.csv'; successfully.
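If you'd rather not open a shell inside the container, the same preparation can be done from the host (container name assumed):

docker exec -u root container_id mkdir -p /tmp/export
docker exec -u root container_id chown postgres:postgres /tmp/export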
I'm trying to import a csv file using the COPY FROM command with postgres.
The db is stored on a Linux server, and my data is stored locally, i.e. C:\test.csv.
I keep getting the error:
ERROR: could not open file "C:\test.csv" for reading: No such file or directory
SQL state: 58P01
I know that I need to use an absolute path for the filename that the server can see, but everything I try brings up the same error.
Can anyone help please?
Thanks
Quote from the PostgreSQL manual:
The file must be accessible to the server and the name must be specified from the viewpoint of the server
So you need to copy the file to the server before you can use COPY FROM.
If you don't have access to the server, you can use psql's \copy command, which is very similar to COPY FROM but works with local files. See the manual for details.
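A minimal sketch of the \copy variant, run from psql on the Windows machine where the file lives (table name assumed; forward slashes avoid backslash escaping):

\copy mytable FROM 'C:/test.csv' DELIMITER ',' CSV HEADER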