PostgreSQL COPY CSV format, double quote escape not working

Running into an issue copying the following data into a DB:
1, ab\"c
I receive an unterminated quote error when running the following SQL:
copy table_name from 'sample.tsv' CSV DELIMITER ',' QUOTE '"' ESCAPE E'\\';
Based on the PostgreSQL documentation, I expected the ESCAPE parameter to be used to escape the quote character, but it isn't working. Is there a solution to this that doesn't involve reformatting the data or changing the quote character?

Try this. If the quote character is ", it will clash with the double quote in ab"c, so use a different quote character:
copy table_name from 'sample.tsv' (FORMAT CSV, QUOTE '''', DELIMITER ',', ESCAPE E'\\');

COPY expects to find escaped quotes only inside quoted fields, so the command you show would work for 1,"ab\"c" but not for what you have.
The command that would work for the data you show is:
copy table_name from 'sample.tsv' DELIMITER ',';
But it is not likely to work for the rest of your data.
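For reference, here is a minimal psql sketch of that text-format behavior (the two-column table definition is an assumption, and the space after the comma is dropped so the stored value has no leading blank):
create table table_name (id int, val text);
\copy table_name from stdin delimiter ','
1,ab\"c
\.
select val from table_name;
 val
------
 ab"c
(1 row)
In text format the backslash before the quotation mark is consumed and the quote itself is kept, which is why this loads ab"c.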

Related

Copy Command to insert CSV file - Escape Special Characters

I am trying to do a bulk insert into a Postgres DB using the COPY command with a CSV file. All the columns in the DB table are of type character varying(1024). The COPY command fails on certain values that are in double quotes.
For example:
"TODD'S JAMES RENO PHCY,INC."
My copy command looks like below:
\copy file_tmp FROM /srv/data0/transfer/data_2.csv USING DELIMITERS ','
Could you please help with how to escape these special characters and get this working?
Although you have specified a delimiter, you have not specified a format, so it is still using "text". In "text" format, things are escaped by backslashes, not quotes.
Also, 'USING DELIMITERS' is an extremely obsolete syntax.
You probably want something like:
\copy file_tmp FROM /srv/data0/transfer/data_2.csv WITH (FORMAT CSV)
You don't need to specify the delimiter, because it defaults to ',' when using CSV format.
Of course this still might fail on parts of the data you haven't shown us.
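As a sketch of why this works, a single-column stand-in for the table (an assumption; the real table has more character varying(1024) columns) shows CSV mode keeping the embedded comma and single quote inside the quoted value:
create table file_tmp (name character varying(1024));
\copy file_tmp FROM stdin WITH (FORMAT CSV)
"TODD'S JAMES RENO PHCY,INC."
\.
select name from file_tmp;
            name
-----------------------------
 TODD'S JAMES RENO PHCY,INC.
(1 row)
The surrounding double quotes are stripped because they are the CSV quote character, and the comma between PHCY and INC. stays in the value instead of starting a new column.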

How can I reformat escape characters ready to use as Insert Value for MariaDb?

I'm using PowerShell with ODBC to transfer table data between Sage 50 and MariaDB.
Some data rows in a text column in Sage can contain both single and double quotes that need to be retained when the data is imported into MariaDB.
I'm struggling to get PowerShell to do the replacements for the VALUES portion of the INSERT statement:
I need to replace a single quote ' with backslash single quote \', and also a double quote " with backslash double quote \".
For anyone else searching for this, the answer is:
For single quotes
[regex]::replace($text, "'", "\'")
For double quotes
[regex]::replace($DETAILS, "`"", "\`"")
I found this URL helpful: https://vwiki.co.uk/MySQL_and_PowerShell and, as stated there, you may wish to put this in a function.

PostgreSQL escape character during copy

I'm trying to import a CSV file into PostgreSQL but I am having an issue with special characters.
I'm using the following command
./psql -d data -U postgres -c "copy users from 'users.csv' delimiter E'\t' quote '~' csv"
It works fine until it encounters a field containing '~', which I'm using as the quote character so that the existing quotes, inverted commas, etc. don't break the import.
How do I escape this character in the CSV file, e.g. 'Person~name', so that it imports as 'Person~name'?
CSV rules are listed in https://www.ietf.org/rfc/rfc4180.txt
To embed the quote character inside a string:
If double-quotes are used to enclose fields, then a double-quote
appearing inside a field must be escaped by preceding it with
another double quote. For example:
"aaa","b""bb","ccc"
In your case, substitute the tilde for the double quote, since you've chosen that as your quote character.
Example:
test=> create table copytest(t text);
CREATE TABLE
test=> \copy copytest from stdin delimiter E'\t' quote '~' csv
Enter data to be copied followed by a newline.
End with a backslash and a period on a line by itself.
>> ~foo~~bar~
>> \.
test=> select * from copytest;
    t
---------
 foo~bar
(1 row)
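Applying the same rule to the value from the question, and continuing the transcript above, Person~name is written as a quoted field with the embedded tilde doubled:
test=> \copy copytest from stdin delimiter E'\t' quote '~' csv
>> ~Person~~name~
>> \.
test=> select * from copytest;
      t
-------------
 foo~bar
 Person~name
(2 rows)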

PostgreSQL COPY csv including Quotes

This is a very simple problem. I am using the psql terminal command COPY as shown below:
COPY tbname FROM '/tmp/file.csv'
delimiter '|' csv;
However this file.csv contains data such as
random|stuff|32"
as well as
random|other "stuff"|15
I tried using a double quote to escape the quotes, as the Postgres site suggests:
random|stuff|32""
random|other ""stuff""|15
This seems to remove the quotes completely, which I don't want.
Is there a way to get the import to just treat these quotes as regular characters so that they appear in the database as they do in the csv file?
According to the documentation, the default quote symbol is ", so you need to provide a QUOTE argument with a different symbol. The quote symbol has to be a single one-byte character.
COPY tbname FROM '/tmp/file.csv'
delimiter '|' QUOTE '}' csv; -- use a symbol you know does not appear in your file.
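A minimal sketch (the table and column definitions here are assumptions) of how that reads the sample rows:
create table tbname (a text, b text, c text);
\copy tbname from stdin delimiter '|' quote '}' csv
random|stuff|32"
random|other "stuff"|15
\.
select * from tbname;
   a    |       b       |  c
--------+---------------+-----
 random | stuff         | 32"
 random | other "stuff" | 15
(2 rows)
Because } never appears in the data, the " characters are treated as ordinary text and survive the import unchanged.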

Ignore quotation marks when importing a CSV file into PostgreSQL?

I'm trying to import a tab-delimited file into my PostgreSQL database. One of the fields in my file is a "title" field, which occasionally contains actual quotation marks. For example, my tsv might look like:
id title
5 Hello/Bleah" Foo
(Yeah, there's just that one quotation mark in the title.)
When I try importing the file into my database:
copy articles from 'articles.tsv' with delimiter E'\t' csv header;
I get this error, referencing that line:
ERROR: unterminated CSV quoted field
How do I fix this? Quotation marks are never used to surround entire fields in the file. I tried copy articles from 'articles.tsv' with delimiter E'\t' escape E'\\' csv header; but I get the same error on the same line.
Assuming the file never actually tries to quote its fields:
The option you want is "with quote"; see http://www.postgresql.org/docs/8.2/static/sql-copy.html
Unfortunately, I'm not sure how to turn off quote processing altogether; one kludge would be to specify a character that does not appear in your file at all.
Tab-separated is the default format for COPY statements; treating them as CSV is just silly. (Are you only taking this path to skip the header?)
copy articles from 'articles.tsv';
does exactly what you want.
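A minimal stdin sketch of that (the table definition is an assumption, the header line is left out since the bare command above does not skip it, and the gap between the two fields is a literal tab):
create table articles (id int, title text);
\copy articles from stdin
5	Hello/Bleah" Foo
\.
select * from articles;
 id |      title
----+------------------
  5 | Hello/Bleah" Foo
(1 row)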
I struggled with the same error and a few more. After gathering knowledge from a few SO questions, I came up with the following setup that makes COPY TO/FROM work even for quite sophisticated JSON columns:
COPY "your_schema_name.yor_table_name" (your, column_names, here)
FROM STDIN WITH CSV DELIMITER E'\t' QUOTE '\b' ESCAPE '\';
--here rows data
\.
The most important parts:
QUOTE E'\b' - quote with a backspace character (thanks a lot #grautur!)
DELIMITER E'\t' - tab as the delimiter
ESCAPE '\' - and a backslash as the escape character
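Applied back to the original articles.tsv question, the same idea as a single command (a sketch; it assumes the articles table from the question) would be:
-- backspace never occurs in the data, so quote processing is effectively disabled
-- and embedded " characters load verbatim; HEADER still skips the first line
\copy articles from 'articles.tsv' with (format csv, delimiter E'\t', quote E'\b', header)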