Skipping first column data in CSV while using COPY command in PostgreSQL

I have a pipe-delimited data file with no headers. I need to import the data into a PostgreSQL table starting with the data from the second column in the file, i.e. skip the data before the first '|' on each line. How do I achieve this using the COPY command?

Use the cut command to remove the first column, then import the result:
cut -d "|" -f 2- file1.csv > file2.csv
psql -d test -h localhost -c "\copy table(f1,f2,f3) from 'file2.csv' delimiter '|' csv"
Not an answer related to PostgreSQL as such, but more about command-line tools.
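If you'd rather avoid the intermediate file, psql's \copy can also read from a program, so the cut can run inline. A minimal sketch, assuming the same table and file names as above:
psql -d test -h localhost -c "\copy table(f1,f2,f3) from program 'cut -d \"|\" -f 2- file1.csv' delimiter '|' csv"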


Invalid Input Syntax for Type Date

I'm trying to load data into a PostgreSQL database remotely from the command line in CMD, copying data from a CSV file to a specific table. It was working well, but somehow when I try to load data now, CMD throws an "invalid input syntax for type date" error (the error message was posted as an image).
Curiously, the CSV file has the fecha_carga field typed as a date, just as it is stored in the table in the database. Here is my attempt:
psql -h suggestedorder.postgres.database.azure.com -d DataAnalytics -U dev_ext@suggestedorder -c "TRUNCATE planning.env_cat_bloqueados" -c "\copy planning.env_cat_bloqueados (key, bloqueado, fecha_carga) from 'C:\Users\geradiaz.MODELO\Desktop\Envase\Catalogos\Outputs_Catalogos\Catalogo_Bloqueados\Catalogo_Bloqueados_Output.csv' with delimiter as ','"
Can someone explain what is happening here, and how I can fix it?
Best regards and thanks!
First, please do not use images for textual data; copy and paste the text instead. Second, I'm guessing the CSV file has a header line containing the column heading fecha_carga. Since you did not specify HEADER along with DELIMITER in the WITH clause, COPY is taking the header line as data, and the string fecha_carga is not a valid date.
@Adrian Klaver's comment gave me the idea of what was wrong with my command line: in effect, the header fields of my CSV file were being taken as data. So I changed the command line to specify that my CSV file has headers, using the following instruction at the end instead of with delimiter as ',':
with (format csv, header)
So, this is the whole command line:
psql -h suggestedorder.postgres.database.azure.com -d DataAnalytics -U dev_ext@suggestedorder -c "TRUNCATE planning.env_cat_bloqueados" -c "\copy planning.env_cat_bloqueados (key, bloqueado, fecha_carga) from 'C:\Users\geradiaz.MODELO\Desktop\Envase\Catalogos\Outputs_Catalogos\Catalogo_Bloqueados\Catalogo_Bloqueados_Output.csv' with (format csv, header)"
Thanks!

How to copy a csv file from a url to Postgresql

Is there any way to use the COPY command for batch data import and read the data from a URL? For example, COPY has syntax like:
COPY sample_table
FROM 'C:\tmp\sample_data.csv' DELIMITER ',' CSV HEADER;
What I want is to give a URL rather than a local path. Is there any way to do that?
It's pretty straightforward, provided you have an appropriate command-line tool available:
COPY sample_table FROM PROGRAM 'curl "http://www.example.com/file.csv"'
Since you appear to be on Windows, I think you'll need to install curl or wget yourself. There is an example using wget on Windows here which may be useful.
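For completeness, COPY options combine with PROGRAM the same way as with a file path. A sketch, assuming the remote file is comma-delimited with a header row:
COPY sample_table FROM PROGRAM 'curl -s "http://www.example.com/file.csv"' WITH (FORMAT csv, HEADER);
Keep in mind that PROGRAM runs the command on the database server (and requires superuser or pg_execute_server_program rights), so curl must be available there, not just on your client.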
My solution is:
tail -n "$numberLine" "$file" |
sed 's/ / ,/g' |
psql -q -d "$dataBaseName" -c "COPY tableName FROM STDIN DELIMITER ','"
You can insert an awk step between sed and psql to add a missing column, which is handy if you already know what to put in it:
awk '{print $0 ",value_for_missing_column"}'
I have done this; it works, and it is faster than INSERT.
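For reference, a sketch of the full pipeline with the awk step spliced in (the file, line count, database name, and appended constant are all placeholders):
tail -n "$numberLine" "$file" |
sed 's/ / ,/g' |
awk '{print $0 ",some_constant"}' |
psql -q -d "$dataBaseName" -c "COPY tableName FROM STDIN DELIMITER ','"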

Convert pipe delimited csv to tab delimited using batch script

I am trying to write a batch script that will query a Postgres database and output the results to a csv. Currently, it queries the database and saves the output as a pipe delimited csv.
I want the output to be tab delimited rather than pipe delimited, since I will eventually be importing the csv into Access. Does anyone know how this can be achieved?
Current code:
cd C:\Program Files\PostgreSQL\9.1\bin
psql -c "SELECT * from jivedw_day;" -U postgres -A -o sample.csv cscanalytics
postgres = username
cscanalytics = database
You should be using COPY to dump CSV:
psql -c "copy jivedw_day to stdout csv delimiter E'\t'" -o sample.csv -U postgres -d cscanalytics
The DELIMITER E'\t' part will get you output with tabs instead of commas as the delimiter. There are other options as well; please see the documentation for further details.
Using -A like you are just dumps the usual interactive output to sample.csv, without the normal padding to make the columns line up; that's why you're seeing the pipes:
-A
--no-align
Switches to unaligned output mode. (The default output mode is otherwise aligned.)
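If you'd rather keep the plain-SELECT approach from the question, psql can also emit tabs directly; a sketch, assuming a POSIX shell (the $'\t' quoting will not work as-is in a Windows batch file):
psql -U postgres -d cscanalytics -A -F $'\t' -P footer=off -c "SELECT * from jivedw_day;" -o sample.csv
This keeps the column headings and drops the "(N rows)" footer, but it does not apply CSV-style quoting, so the COPY form above is still the safer choice.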

Export to CSV and Compress with GZIP in postgres

I need to export a big table to a CSV file and compress it.
I can export it using the COPY command from Postgres:
COPY foo_table TO '/tmp/foo_table.csv' DELIMITER ',' CSV HEADER;
And then I can compress it using gzip:
gzip -c foo_table.csv > foo.gz
The problem with this approach is that I need to create the intermediate CSV file, which is itself huge, before I get my final compressed file.
Is there a way to export the table as CSV and compress the file in one step?
Regards,
Sujit
The trick is to make COPY send its output to stdout, then pipe the output through gzip:
psql -c "COPY foo_table TO stdout DELIMITER ',' CSV HEADER" \
| gzip > foo_table.csv.gz
You can do it directly with TO PROGRAM, as per the docs (https://www.postgresql.org/docs/9.4/sql-copy.html):
COPY foo_table TO PROGRAM 'gzip > /tmp/foo_table.csv.gz' DELIMITER ',' CSV HEADER;
Expanding a bit on @Joey's answer, the version below adds support for a couple more features available in the manual.
psql -c "COPY \"Foo_table\" (column1, column2) TO stdout DELIMITER ',' CSV HEADER" \
| gzip > foo_table.csv.gz
If you have capital letters in your table name (woe unto you), you need the \" before and after the table name.
The second thing I've added is the column list.
Also note from the docs:
This operation is not as efficient as the SQL COPY command because all data must pass through the client/server connection. For large amounts of data the SQL command might be preferable.
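Loading the compressed dump back in later is symmetric; a sketch, assuming the file was written with a header line as above:
gunzip -c foo_table.csv.gz | psql -c "COPY foo_table FROM STDIN WITH (FORMAT csv, HEADER)"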
PostgreSQL 13.4
The psql \copy command also works, combined with a SELECT of specific columns and a date +"%Y-%m-%d_%H%M%S" timestamp for the dump's filename. Note that \copy must all be on one line, since psql meta-commands end at a newline:
\copy (SELECT id, column_1, column_2, ... FROM foo_table) TO PROGRAM 'gzip > ~/Downloads/foo_table_dump_`date +"%Y-%m-%d_%H%M%S"`.csv.gz' DELIMITER ',' CSV HEADER

How to export table as CSV with headings on Postgresql?

I'm trying to export a PostgreSQL table with headings to a CSV file via the command line. I can get it to export to a CSV file, but without the headings.
My code looks as follows:
COPY products_273 to '/tmp/products_199.csv' delimiters ',';
COPY products_273 TO '/tmp/products_199.csv' WITH (FORMAT CSV, HEADER);
as described in the manual.
From psql command line:
\COPY my_table TO 'filename' CSV HEADER
no semi-colon at the end.
Instead of just a table name, you can also write a query to get only selected columns' data.
COPY (select id,name from tablename) TO 'filepath/aa.csv' DELIMITER ',' CSV HEADER;
The COPY above writes the file on the server and needs admin privilege; with \copy the file is written on the client instead:
\COPY (select id,name from tablename) TO 'filepath/aa.csv' DELIMITER ',' CSV HEADER;
When I don't have permission to write a file out from Postgres I find that I can run the query from the command line.
psql -U user -d db_name -c "Copy (Select * From foo_table LIMIT 10) To STDOUT With CSV HEADER DELIMITER ',';" > foo_data.csv
This works:
psql dbname -F , --no-align -c "SELECT * FROM TABLE"
The simplest way (using psql) seems to be by using the --csv flag:
psql --csv -c "SELECT * FROM products_273" > '/tmp/products_199.csv'
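One caveat: the --csv flag first appeared in psql 12. On older clients, a rough approximation (without proper CSV quoting) is:
psql -A -F ',' -P footer=off -c "SELECT * FROM products_273" > /tmp/products_199.csv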
For version 9.5, which I use, it would be like this:
COPY products_273 TO '/tmp/products_199.csv' WITH (FORMAT CSV, HEADER);
This solution worked for me using \copy.
psql -h <host> -U <user> -d <dbname> -c "\copy <table_name> FROM '<path to csvfile/file.csv>' with (format csv,header true, delimiter ',');"
Here's how I got it working in PowerShell, using psql connected to a Heroku PG database:
I had to first change the client encoding to utf8 like this: \encoding UTF8
Then I dumped the data to a CSV file like this:
\copy (SELECT * FROM my_table) TO C://wamp64/www/spider/chebi2/dump.csv CSV DELIMITER '~'
I used ~ as the delimiter because I don't like CSV files; I usually use TSV files, but it wouldn't let me add '\t' as the delimiter, so I used ~ because it's a rarely used character.
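For what it's worth, a tab delimiter can usually be passed as an escape string literal; a sketch, untested on the Heroku setup above:
\copy (SELECT * FROM my_table) TO 'C:/wamp64/www/spider/chebi2/dump.tsv' WITH (FORMAT csv, DELIMITER E'\t')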
The COPY command itself isn't what is restricted. What is restricted is directing its TO output anywhere except STDOUT. However, there is no restriction on specifying the output file via the \o command:
\o /tmp/products_199.csv
COPY products_273 TO STDOUT WITH (FORMAT CSV, HEADER);
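Afterwards, run \o with no argument to send query output back to the terminal:
\o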
COPY (<any SQL query whose data you want to export>) TO '<absolute path of the output file>' DELIMITER ',' CSV HEADER;
Using this you can export query data as well.
I am posting this answer because none of the other answers given here actually worked for me. I could not use COPY from within Postgres, because I did not have the correct permissions. So I chose "Export grid rows" and saved the output as UTF-8.
The psql version given by @Brian also did not work for me, for a different reason: apparently the Windows command prompt (I was using Windows) was meddling with the encoding on its own. I kept getting this error:
ERROR: character with byte sequence 0x81 in encoding "WIN1252" has no equivalent in encoding "UTF8"
The solution I ended up using was to write a short JDBC script (Java) which read the CSV file and issued insert statements directly into my Postgres table. This worked, but the command prompt also would have worked had it not been altering the encoding.
Try this:
COPY products_273 TO '/tmp/products_199.csv' DELIMITER ',' CSV HEADER;
In pgAdmin, highlight your query statement just like when you use F5 to execute, and press F9 - this will open the file browser so you can pick where to save your CSV.
If you are using Azure Data Studio, the instruction are here: Azure Data Studio: Save As CSV.
I know this isn't a universal solution, but most of the time you just want to grab the file by hand.