Export to CSV from PostgreSQL

I want to export a PostgreSQL table to a CSV file.
I have tried two ways, but both fail for different reasons.
In the first case, you can see what I run and what I get below:
COPY demand.das_april18_pathprocess TO '/home/katerina/das_april18_pathprocess.csv' DELIMITER ',' CSV HEADER;
No such file or directory
SQL state: 58P01
I should mention that in /home/katerina/ I had already created an empty file named das_april18_pathprocess.csv and changed its permissions to allow read and write.
In my second try, the query executes without any errors, but I cannot see the CSV file. The command I run is the following:
COPY demand.das_april18_pathprocess TO '/tmp/das_april18_pathprocess.csv' DELIMITER ',' CSV HEADER;
In the /tmp directory there is no csv file.
Any advice on how to export the table to a CSV file, by whatever means, is really appreciated!

Ah, you've run into a common problem: you're creating the file on the server's filesystem, not your local filesystem. That can be a pain.
You can, however, COPY TO STDOUT and then redirect the result.
If you're using Linux or another Unix, the easiest way to do this is from the command line:
$ psql <connection options> -c "COPY demand.das_april18_pathprocess TO STDOUT (FORMAT CSV)" > das_april18_pathprocess.csv
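Alternatively, psql's \copy meta-command uses the same COPY machinery but opens the file on the client side, so a local path just works. A minimal sketch, reusing the table and path from the question:

\copy demand.das_april18_pathprocess to '/home/katerina/das_april18_pathprocess.csv' csv header

Since psql opens the file as your own user, no server-side permissions are involved and you don't need to pre-create an empty file.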

COPY (SELECT * FROM demand.das_april18_pathprocess) TO '/home/katerina/das_april18_pathprocess.csv' WITH CSV HEADER;

Related

Postgresql copy command not finding file

When running:
~/fidelity/releases/20220907033831$ ls -a
.
..
.browserslistrc
221005_users_all.csv
_private
the presence of a file is confirmed.
However, when launching a PostgreSQL command:
psql fidelity_development
COPY users (id,migrated_id,[...]) FROM '~/fidelity/releases/20220907033831/221005_users_all.csv' DELIMITER ';' CSV HEADER;
The response is unexpected:
ERROR: could not open file "~/fidelity/releases/20220907033831/221005_users_all.csv" for reading: No such file or directory
What am I missing to determine why postgresql cannot see this file?
Note: this directory is also symlinked as fidelity/current, and the same result occurs when referring to that directory for the file, even though bash can see it.
Use the \copy command instead: it is client-based and handles local paths correctly, whereas COPY is server-based and resolves paths on the server, which is why it cannot find your file (the server also does not expand ~).
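A minimal sketch of that fix, reusing the question's elided column list; \copy resolves relative paths against psql's working directory (and note that \copy does not expand ~ either, so cd first):

cd ~/fidelity/releases/20220907033831
psql fidelity_development
\copy users (id,migrated_id,[...]) from '221005_users_all.csv' delimiter ';' csv header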

PostgreSQL unable to read CSV files on my desktop

I am trying to import a CSV file into PostgreSQL, but I keep getting an error that no such file or directory exists.
This is the line of code I execute:
copy mu_data from 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER;
Can anyone suggest how to fix this?
copy is a command run on the server side, so unless your Postgres server happens to be on your localhost, the file very likely doesn't exist from the server's point of view.
So one solution is to transfer the file to the server's filesystem somehow. Or, if you're using the psql command-line tool (or at least can use it for this task), you can use the \copy command there.
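A sketch of the \copy variant, assuming the file really is on a macOS desktop, so the absolute client-side path starts with /Users (the original command was also missing the leading slash):

\copy mu_data from '/Users/mysurname/Desktop/FILE.CSV' delimiter ',' csv header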

How to export table data from PostgreSQL (pgAdmin) to CSV file?

I am using pgAdmin version 4.3 and I want to export one table's data to a CSV file. I used this query:
COPY (select * from product_template) TO 'D:\Product_template_Output.csv' DELIMITER ',' CSV HEADER;
but it shows the error:
a relative path is not allowed to use COPY to a file
How can I resolve this problem? Any help is appreciated.
From the query editor, once you have executed your query, you just have to click the "Download as CSV (F8)" button or press the F8 key.
Source: pgAdmin 4 Query Toolbar
Use absolute paths, or cd to a known location so that you can use relative paths.
For example, cd into your Documents directory and run the commands there.
Assuming you want to use psql from the command line, it would look like this:
cd ~/Documents && psql -h host -d dbname -U user
\COPY (select * from product_template) TO 'Product_template_Output.csv' DELIMITER ',' CSV HEADER;
The result would be Product_template_Output.csv in your current working directory (the Documents folder).
Again using psql: remove any double quotes around the path:
COPY (select * from product_template) TO 'D:\Product_template_Output.csv'
DELIMITER ',' CSV HEADER;
If your pgAdmin instance resides on a remote server and you have no access to that server, the solutions above may not be handy for you. In that case, simply select all the query results, copy them, and paste them into an Excel file. Simple!
You might have a tough time if your query result is too large, though.
Try this command:
COPY (select * from product_template) TO 'D:\Product_template_Output.csv' WITH CSV;
In pgAdmin, the export option is available in the File menu. Execute the query, view the data in the Output pane, then click FILE -> EXPORT from the query window.
Using psql to export data:
COPY noviceusers(code, name) TO 'C:\noviceusers.csv' DELIMITER ',' CSV HEADER;
See https://www.novicetechie.com/2019/12/export-postgresql-data-in-to-excel-file.html for reference.
1. Write your query to select data in the query tool and execute it.
2. Click the download button in the pgAdmin top bar.
3. Rename the file to your liking.
4. Select which folder to save the file in.
Congrats!!!

How to import Zipped file into Postgres Table

I would like to import a file into my PostgreSQL system (specifically Redshift). I have found an argument for copy that allows importing a gzip file, but the provider of the data I am trying to load only produces it as a .zip. Are there any built-in Postgres commands for opening a .zip?
From within Postgres:
COPY table_name FROM PROGRAM 'unzip -p input.csv.zip' DELIMITER ',';
From the man page for unzip -p:
-p  extract files to pipe (stdout). Nothing but the file data is sent to stdout, and the files are always extracted in binary format, just as they are stored (no conversions).
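Note that COPY ... FROM PROGRAM runs unzip on the database server and needs superuser rights (or, on newer versions, the pg_execute_server_program role). If the zip file lives on your client machine, a hedged alternative is to pipe the extracted data through psql's client-side \copy; table_name and mydb below are placeholders:

unzip -p input.csv.zip | psql -d mydb -c '\copy table_name from stdin with (format csv)'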
Can you just do something like:
unzip -p myfile.zip | gzip > myfile.gz
Easy enough to automate if you have enough files; see the sketch below.
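For instance, a small shell loop along these lines (a sketch; it assumes each archive holds a single data file and the glob matches your names):

for z in *.zip; do
  unzip -p "$z" | gzip > "${z%.zip}.gz"
done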
This might only work when loading Redshift from S3, but you can actually just include a "gzip" flag when copying data to Redshift tables. This is the format that works for me if my S3 bucket contains a gzipped .csv:
copy <table> from 's3://mybucket/<foldername>' '<aws-auth-args>' delimiter ',' gzip;
unzip -p /path/to/.zip | psql -U user
The 'user' must have superuser rights, or else you will get:
ERROR: must be superuser to COPY to or from a file
To learn more about this, see https://www.postgresql.org/docs/8.0/static/backup.html
Basically this approach is used when handling large databases.

Postgres COPY FROM csv file- No such file or directory

I'm trying to import a (rather large) .txt file into a table geonames in PostgreSQL 9.1. I'm in the /~ directory of my server, with a file named US.txt placed in that directory. I set the search_path variable to geochat, the name of the database I'm working in. I then enter this query:
COPY geonames
FROM 'US.txt',
DELIMITER E'\t',
NULL 'NULL');
I then receive this error:
ERROR: could not open file "US.txt" for reading: No such file or directory.
Do I have to type in \i US.txt or something similar first, or should it just get it from the present working directory?
Maybe a bit late, but hopefully useful:
Use \copy instead
https://wiki.postgresql.org/wiki/COPY
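Applied to this question, a minimal sketch would be to start psql in the directory that contains US.txt and let \copy resolve the relative path on the client:

\copy geonames from 'US.txt'

Tab-delimited text is COPY's default format, which should match a geonames dump, so extra options are only needed if your file differs.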
A couple of misconceptions:
1.
I'm in the /~ directory of my server
There is no directory /~. It's either / (root directory) or ~ (home directory of current user). It's also irrelevant to the problem.
2.
I set the search_path variable to geochat, the name of the database I'm working in
The search_path has nothing to do with the name of the database. It's for schemas inside the current database. You probably need to reset this.
3.
You are required to use the absolute path for your file. As documented in the manual here:
filename
The absolute path name of the input or output file.
4.
DELIMITER: just noise.
The default is a tab character in text format
5.
NULL: It's rather uncommon to use the actual string 'NULL' for a NULL value. Are you sure?
The default is \N (backslash-N) in text format, and an unquoted empty string in CSV format.
My guess (after resetting search_path - or you schema-qualify the table name):
COPY geonames FROM '/path/to/file/US.txt';
The paths are relative to the PostgreSQL server, not the psql client.
Assuming you are running PostgreSQL 9.4, you can put US.txt in the directory /var/lib/postgresql/9.4/main/.
Another option is to pipe it in from stdin:
cat US.txt | psql -c "copy geonames from STDIN WITH (FORMAT csv);"
If you're running your COPY command from a script, you can have a step in the script that creates the COPY command with the correct absolute path.
MYPWD=$(pwd)
echo "COPY geonames FROM '$MYPWD/US.txt' DELIMITER E'\t';"
MYPWD=
You can then redirect this output into a file and execute it:
./step_to_create_COPY_with_abs_path.sh >COPY_abs_path.sql
psql -f COPY_abs_path.sql -d your_db_name