Dealing with errors during a copy from - postgresql

I have to import a file from an external source into a PostgreSQL table.
I tried to do it with \copy from, but I keep getting errors (additional columns) in the middle of the file.
Is there a way to tell PostgreSQL to ignore lines containing errors during a \copy from?
Thanks

Give it a try with pgloader instead.

No. Either all of the data loads or none of it does; those are the two options COPY gives you in PostgreSQL.
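Since COPY itself won't skip bad rows, another option is to pre-filter the file before handing it to \copy. A minimal Python sketch using only the standard library; the in-memory sample data and the expected column count of 2 are invented for illustration:

```python
import csv
import io

def filter_rows(src, expected_cols, good, bad):
    """Write rows with exactly expected_cols fields to `good`;
    divert everything else to `bad` for later inspection."""
    good_writer = csv.writer(good)
    bad_writer = csv.writer(bad)
    kept = skipped = 0
    for row in csv.reader(src):
        if len(row) == expected_cols:
            good_writer.writerow(row)
            kept += 1
        else:
            bad_writer.writerow(row)
            skipped += 1
    return kept, skipped

# Demo on in-memory data; with real files you would pass open file
# handles instead (input/clean/reject file names are up to you).
raw = "a,1\nb,2,EXTRA\nc,3\n"
good, bad = io.StringIO(), io.StringIO()
kept, skipped = filter_rows(io.StringIO(raw), 2, good, bad)
print(kept, skipped)  # prints: 2 1
```

The cleaned file then loads with a plain \copy, and the rejects file shows you exactly which rows were malformed.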

Related

COPY a CSV file with 108 columns into PostgreSQL

I have a CSV file with 108 columns which I am trying to import into my PostgreSQL table. Obviously I don't want to specify every column in my CREATE TABLE statement. But when I enter
\COPY 'table_name' FROM 'directory' DELIMITER ',' CSV HEADER; this error message shows up: "ERROR: Extra Data after Last Expected Column". With a few columns I would know how to fix this, but, like I said, I don't want to specify all 108 columns. By the way, my table doesn't contain any columns at all yet. Any help on how I could do that? Thx!
When dealing with problems like this, I often cheat. Plenty of tools exist online for converting CSV to SQL, https://www.convertcsv.com/csv-to-sql.htm being one of them.
Copy/paste your CSV, then copy/paste the generated SQL. Not the most elegant solution, but it will work as a one-off.
Now, if you're looking to repeat this process regularly (automated, I hope), then Python may be an interesting language to explore for quickly writing a script to do this for you. You can then schedule it as a cron job, or invoke it however you prefer with the correct input (the CSV file).
Please feel free to let me know if I've misunderstood your original question, or if I can provide any more help give me a shout and I'll do my best!
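One way to avoid typing 108 column definitions by hand is to generate the CREATE TABLE statement from the CSV's header row. A sketch, assuming every column can start out as text (you can cast or ALTER later); the sample data and the table name people are invented:

```python
import csv
import io
import re

def create_table_from_header(csv_text, table_name):
    """Build a CREATE TABLE statement with one text column per
    header field; load everything as text first, refine types later."""
    header = next(csv.reader(io.StringIO(csv_text)))
    cols = []
    for name in header:
        # Crude identifier sanitisation -- adjust to taste.
        ident = re.sub(r"\W+", "_", name.strip().lower())
        cols.append(f"    {ident} text")
    return f"CREATE TABLE {table_name} (\n" + ",\n".join(cols) + "\n);"

sample = "First Name,Last Name,Age\nAda,Lovelace,36\n"
ddl = create_table_from_header(sample, "people")
print(ddl)
```

With the table created this way, the \copy with CSV HEADER should line up with the file exactly.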

How to import CSV into PostgreSQL

Respected,
I have problems with importing CSV into PostgreSQL via pgAdmin. No matter what I do, it shows the following error:
ERROR: extra data after last expected column.
Can anyone please help me and point out a possible solution?
Thank you.
Milorad K.
Check that your data is formatted as PostgreSQL expects it to be.
That error could be caused by specifying the wrong quote character or the wrong field separator, or your input file could be corrupt.
I've had corrupt CSV files from banks before, so don't trust anyone.
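If you are not sure which separator or quote character the file actually uses, Python's csv.Sniffer can guess from a sample, and you can then pass matching DELIMITER and QUOTE options to \copy. The sample lines here are invented:

```python
import csv

# A few lines from the suspect file (invented for illustration).
sample = '1;"John Smith";"Cork"\n2;"Mary Byrne";"Galway"\n'

dialect = csv.Sniffer().sniff(sample)
print(dialect.delimiter)   # the field separator the sniffer detected
print(dialect.quotechar)   # the quote character it detected
```

The sniffer is a heuristic, so double-check its guess against the raw file before trusting it.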

Delimiter Issue

I am using a Mac laptop and am trying to copy a local CSV file into a PostgreSQL table. I have used the DELIMITER option, and the following query works:
copy c2013_levinj.va_clips_translation
from local '/Users/jacoblevin/Desktop/clips_translation_table.csv'
Delimiter ',' skip 1 rejectmax 1;
However, each time the query is submitted, I receive a message that says "0 rows fetched." I have tried dropping the table and re-creating it, as well as running a "select *" query. Suffice it to say, I have been unable to pull any data. Does anyone have any ideas what's wrong? Thanks in advance.
What happens if you try this instead? Note that local, skip, and rejectmax are not PostgreSQL COPY options (they look like Vertica syntax); in psql the client-side command is \copy, and CSV HEADER takes care of skipping the header row:
\copy c2013_levinj.va_clips_translation
from '/Users/jacoblevin/Desktop/clips_translation_table.csv'
WITH CSV HEADER;
That should be more robust and do what you want.

How should I open a PostgreSQL dump file and add actual data to it?

I have a pretty basic database, and I need to drop a good-sized users list into it. I have the dump file; I need to convert it to a .pg file and then somehow load this data into it.
The data I need to add are in CSV format.
I assume you already have a .pg file, which I assume is a database dump in the "custom" format.
PostgreSQL can load data in CSV format using the COPY statement. So the absolute simplest thing to do is just add your data to the database this way.
If you really must edit your dump, and the file is in the "custom" format, there is unfortunately no way to edit the file manually. However, you can use pg_restore to create a plain SQL backup from the custom format and edit that instead. pg_restore with no -d argument will generate an SQL script for insertion.
As suggested by Daniel, the simplest solution is to keep your data in CSV format and just import it into Postgres as is.
If you're trying to merge this CSV data into a third-party Postgres dump file, then you'll need to first convert the data into SQL INSERT statements.
One possible Unix solution (note that SQL string literals take single quotes, not double quotes; the '\'' sequence is how you embed a literal single quote inside a single-quoted shell string):
awk -F, '{printf "INSERT INTO my_tab VALUES ('\''%s'\'', '\''%s'\'', '\''%s'\'');\n", $1, $2, $3}' data.csv
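The awk approach breaks as soon as a field contains a comma or an apostrophe. A Python sketch that parses the CSV properly and escapes values by doubling embedded single quotes (the table name my_tab and the sample data are placeholders):

```python
import csv
import io

def csv_to_inserts(csv_text, table):
    """Turn CSV rows into INSERT statements, doubling any single
    quotes so values like O'Brien don't break the SQL."""
    stmts = []
    for row in csv.reader(io.StringIO(csv_text)):
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        stmts.append(f"INSERT INTO {table} VALUES ({vals});")
    return stmts

data = "Ada,Lovelace\nConor,O'Brien\n"
stmts = csv_to_inserts(data, "my_tab")
for stmt in stmts:
    print(stmt)
```

This emits every value as a string literal; if some columns are numeric, Postgres will usually coerce them, but you can drop the quoting for those columns if you prefer.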

How to create CSV file from SQL?

I am wondering how to create or export a CSV file from SQL. Is there a function for that, similar to pgsql2shp?
I would appreciate your ideas, tips, or solutions.
You can save a complete table as a file using this command:
COPY tablename TO STDOUT WITH CSV;
Ref: https://www.postgresql.org/docs/current/sql-copy.html
You can give this a try, but I believe the syntax varies a little between versions. Note that a server-side COPY ... TO writes a file on the database server; from psql, use \copy instead to write to a client-side file.
COPY (SELECT foo, bar FROM whatever) TO '/tmp/dump.csv' WITH CSV HEADER;
If you use pgAdmin, you can export any query you run to a CSV file.
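And if you are already fetching rows in a client program, Python's standard csv module takes care of quoting and headers, so you can write the file yourself. The rows below are invented; in practice they would come from a database driver's fetch call:

```python
import csv
import io

# Rows as a client program might receive them from a query
# (invented sample data).
rows = [(1, "Alice", "Dublin"), (2, 'Bob "Bobby" Byrne', "Cork")]

buf = io.StringIO()  # with a real file, use open("out.csv", "w", newline="")
writer = csv.writer(buf)
writer.writerow(["id", "name", "city"])  # header line
writer.writerows(rows)
print(buf.getvalue())
```

The writer handles the fiddly cases for you, such as doubling the embedded double quotes in the second row's name field.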