This is a quick question on using the PostgreSQL COPY feature to import a CSV file.
If I have a row with data such as
random, 1689, rnd\\168
how do I include the backslash characters so that they appear in the database as
random
1689
rnd\\168
What about simply using the copy command?
copy my_table from 'full_csv_file_path' with CSV delimiter ',';
with the CSV file containing:
random,1689,rnd\\168
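In CSV format COPY treats backslashes as ordinary characters (only the text format uses \ as an escape), so the backslashes load literally. A minimal sketch checking this with psycopg2, with a hypothetical connection string and column layout:
import psycopg2

conn = psycopg2.connect("dbname=test")  # hypothetical connection string
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS my_table (a text, b int, c text)")

# copy_expert streams the file through the client, so no server-side
# file access is needed
with open('full_csv_file_path') as f:
    cur.copy_expert("COPY my_table FROM STDIN WITH (FORMAT csv)", f)

cur.execute("SELECT c FROM my_table")
print(cur.fetchone()[0])  # rnd\\168 -- both backslashes preserved
conn.commit()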
I am trying to import the following csv file into YugaByte DB YSQL. Note that the second entry in each row is a JSON object.
"15-06-2018","{\"file_name\": \"myfile1\", \"remote_ip\": \"X.X.X.X\"}"
"15-06-2018","{\"file_name\": \"myfile2\", \"remote_ip\": \"Y.Y.Y.Y\"}"
My table schema is:
postgres=# create table downloads_raw (request_date text, payload jsonb);
I want the JSON snippet in the imported file to become a JSONB value.
I tried doing the following:
postgres=# COPY downloads_raw FROM 'data.csv';
I'm hitting the following error:
ERROR: 22P04: missing data for column "payload"
CONTEXT: COPY downloads_raw, line 1: ""15-06-2018","{\"file_name\": \"myfile1\", \"remote_ip\": \"X.X.X.X\"}""
LOCATION: NextCopyFrom, copy.c:3443
Time: 2.439 ms
You need to specify FORMAT csv and ESCAPE '\'. Also, the format and escape options need to be enclosed in parentheses. This should work:
COPY downloads_raw FROM 'data.csv' WITH (FORMAT csv, ESCAPE '\');
The list of supported options for the COPY command can be found here:
https://docs.yugabyte.com/latest/api/ysql/commands/cmd_copy/
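Incidentally, backslash-escaped quotes like \" inside quoted fields are what Python's csv module produces when doublequote is turned off, which gives a quick way to generate a compatible test file (a sketch; only data.csv comes from the question):
import csv, json

rows = [
    ('15-06-2018', {'file_name': 'myfile1', 'remote_ip': 'X.X.X.X'}),
    ('15-06-2018', {'file_name': 'myfile2', 'remote_ip': 'Y.Y.Y.Y'}),
]

with open('data.csv', 'w') as f:
    # doublequote=False + escapechar='\\' writes \" inside quoted fields,
    # which is exactly why the COPY above needs ESCAPE '\'
    writer = csv.writer(f, quoting=csv.QUOTE_ALL, doublequote=False, escapechar='\\')
    for date, payload in rows:
        writer.writerow([date, json.dumps(payload)])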
I'm using Python 2.7 and psycopg2 to connect to my DB server (PostgreSQL 9.3), and I have a list of Product objects holding the items I want to insert:
products_list = []
products_list.append(product1)
products_list.append(product2)
I want to use copy_from to insert this products list into the product table. I tried some tutorials and had a problem converting the products list to CSV format, because the values contain single quotes, new lines, tabs, and double quotes. For example (a product description):
<div class="product_desc">
Details :
Product's Name : name
</div>
The escaping corrupted the HTML by adding a quote before every single quote. Is there a safe way to convert the list into CSV so I can COPY it, or some other way to insert the list without converting it to CSV format at all?
I figured it out. First of all, I created a function to convert an object to a CSV row:
import csv

def adding_product_to_csv(item, out):
    # QUOTE_MINIMAL quotes only the fields that need it; embedded double
    # quotes are doubled, matching the ESCAPE '"' in the COPY below
    writer = csv.writer(out, quoting=csv.QUOTE_MINIMAL, quotechar='"',
                        delimiter=',', lineterminator="\r\n")
    writer.writerow([item.name, item.description])
Then I created a CSV file using Python I/O to hold the data for the COPY, writing every object to the file with the previous function:
file_name = "/tmp/file.csv"
myfile = open(file_name, 'a')
for item in object_items:
    adding_product_to_csv(item, myfile)
Now the CSV file is created and ready to be copied into the table using psycopg2's copy_expert:
# For some reason it needs to be closed before copying it to the table
myfile.close()
cursor.copy_expert("COPY products(name, description) from stdin with delimiter as ',' csv QUOTE '\"' ESCAPE '\"' NULL 'null' ",open(file_name))
conn.commit()
# Clearing the file
open(file_name, 'w').close()
And it's working now.
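If the temp file bothers you, the same COPY can read from an in-memory buffer instead. A sketch of that variant, written for Python 3 (on 2.7 you would use cStringIO rather than io.StringIO):
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL, quotechar='"',
                    delimiter=',', lineterminator="\r\n")
for item in object_items:
    writer.writerow([item.name, item.description])

buf.seek(0)  # rewind so COPY reads from the start
cursor.copy_expert(
    "COPY products(name, description) FROM STDIN WITH (FORMAT csv, QUOTE '\"', ESCAPE '\"', NULL 'null')",
    buf)
conn.commit()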
I have been trying to import several CSV files into MongoDB using the mongoimport tool. The problem is that, despite what the name says, in several countries CSV files are saved with semicolons instead of commas, which keeps me from using mongoimport properly.
There are some workarounds for this, such as changing the delimiter option in the region settings, but for several reasons I don't have access to the machine that generates these CSV files, so I can't do that.
Is there any way to import these CSV files using the mongo tools, instead of having to write something that replaces all the semicolons in a file with commas? I find it pretty strange that mongo overlooks the fact that semicolons are used in some countries.
mongoimport supports TSV, so you can replace ";" with "\t":
tr ";" "\t" < file.csv | mongoimport --type tsv ...
It looks like this is not supported; I can't find an option to specify a delimiter among the allowed arguments for mongoimport on the documentation page http://docs.mongodb.org/manual/reference/program/mongoimport/#bin.mongoimport .
You can file a feature request on jira if it's something you'd like to see supported.
I have an input CSV file containing something like:
SD-32MM-1001,"100.00",4/11/2012
SD-32MM-1001,"1,000.00",4/12/2012
I was trying to import that with COPY into a PostgreSQL table (varchar, float8, date) and ran into an error:
# copy foo from '/tmp/foo.csv' with header csv;
ERROR: invalid input syntax for type double precision: "1,000.00"
Time: 1.251 ms
Aside from preprocessing the input file, is there some setting in PG that will have it read a file like the one above and convert the values to numeric form during COPY? Or something other than COPY?
If preprocessing is required, can it be done as part of the COPY command (not the psql \copy)?
Thanks a lot.
The preprocessing option is to first COPY into a temporary table as text. From there, insert into the definitive table using the to_number function:
select to_number('1,000.00', 'FM000,009.99')::double precision;
It's an odd CSV file that surrounds numeric values with double quotes, but leaves values like SD-32MM-1001 unquoted. In fact, I'm not sure I've ever seen a CSV file like that.
If I were in your shoes, I'd try copy against a file formatted like this:
"SD-32MM-1001",100.00,4/11/2012
"SD-32MM-1001",1000.00,4/12/2012
Note that numbers have no commas. I was able to import that file successfully with
copy test from '/fullpath/test.dat' with csv
I think your best bet is to get better formatted output from your source.
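If you do end up preprocessing, stripping the thousands separators is equally small. A sketch with Python's csv module, assuming the three-column layout shown above and hypothetical file names:
import csv

with open('/tmp/foo.csv') as src, open('/tmp/foo_clean.csv', 'w') as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    for sku, amount, bought in reader:
        # "1,000.00" -> "1000.00", which float8 accepts
        writer.writerow([sku, amount.replace(',', ''), bought])
The cleaned file then loads with the plain copy ... with csv from the question.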
There's one column that contains commas. When I output my query to CSV, these commas break the CSV format. What I've been doing to avoid this is a simple
replace(A."Sales Rep",',','')
Is there a better way of doing this, so that I can keep the commas in the final output without breaking the CSV file?
Thanks!
You can use the COPY command to get PostgreSQL to build the CSV for you:
COPY -- copy data between a file and a table
Something like one of these:
copy your_table to 'filename' csv
copy your_table to 'filename' csv force quote *
copy your_table to stdout csv force quote *
copy your_table to stdout csv force quote * header
...
You have to be a superuser to COPY to a filename, though. If you're inside psql, you can use the \copy command:
Performs a frontend (client) copy. This is an operation that runs an SQL COPY command, but instead of the server reading or writing the specified file, psql reads or writes the file and routes the data between the server and the local file system.
The syntax is pretty much the same:
\copy your_table to 'filename.csv' csv force quote * header
...
Quote the fields with "
a,this has a , in it,b
would become
a,"this has a, in it",b
and if the fields have BOTH a , and a ", double the quotes:
a,this has a " and , in it,b
becomes
a,"this has a "" and , in it",b