COPY table FROM error in PostgreSQL

I defined a new table, say
CREATE TABLE IF NOT EXISTS face_axis ( face INTEGER, normal real[]);
where normal should be a vector.
I have already written the data to disk, like
5, 1,0,0,
....
Then I want to use
COPY face_axis FROM 'face_axis.csv' with csv;
but it reports error
ERROR: extra data after last expected column
What's wrong with it? Thanks.

There are two problems:
The array contents must be enclosed in curly braces in all cases.
The comma is used ambiguously as both the CSV delimiter and the delimiter of values inside the array.
You may use a different CSV separator or quote the contents according to CSV rules:
To be loaded with COPY table FROM file with csv delimiter ';'
1;{1.0,2.0}
To be loaded with COPY table FROM file with csv
1,"{1.0,2.0}"

Related

ERROR: extra data after last expected column on postgres

When I tried to copy a very large txt file into my Postgres database, I got the following error.
Note that I created a table with a single column and am not using any delimiter when importing the txt file.
db1=# create table travis_2018_data (v text);
db1=# \COPY travis_2018_data FROM 'C:\Users\testu\Downloads\travis_2018\2018-Certification\PROP.txt';
The error:
ERROR: extra data after last expected column
CONTEXT: COPY travis_2018_data, line 295032: "000000561125P 02018000000000000
I'm wondering why I still get the error about the extra data (or column) on line 295032?
Your text probably contains a tab character, which is the default column delimiter for the TEXT format when using \copy (or copy) without specifying a format.
So \copy thinks the line contains two columns but expects only one, hence the error message.
You need to specify a delimiter that will not occur in the file. The character with ASCII value 1 is highly unlikely to occur in such a file, so you can try:
\COPY travis_2018_data FROM '.....' DELIMITER E'\x01'
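If you want to confirm the tab hypothesis after loading, one way is to search the imported lines for the tab character (a quick sketch using PostgreSQL's strpos and chr functions; chr(9) is the tab character):
SELECT v FROM travis_2018_data WHERE strpos(v, chr(9)) > 0 LIMIT 5;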

Bug during COPY in Postgres

I have a table named basic_data which contains more than 8 million rows, and I want to copy all of this data into a CSV file.
So I use the COPY command like this:
copy basic_data to '/tmp/data_fdw.csv' delimiter ';' null '';
COPY 8792481
This works great, but when I try to insert the data into another table with exactly the same schema (it's a foreign table), I get the following error:
ERROR: value out of range: overflow

Missing data for column while trying to copy a csv file in a postgresql database

I have an issue trying to copy a CSV file into a table.
Here is my SQL statement:
DROP TABLE IF EXISTS nom_graph;
CREATE TABLE nom_graph
(
DATE VARCHAR(50),
EDP_REC FLOAT,
EDP_EC FLOAT,
NB_KO FLOAT
);
\copy nom_graph FROM '/home/giutools/EDP/out/SYNTHESE_RESYNC.csv' (DELIMITER('|'));
and this is the error I get:
psql:nom_graph.sql:179: ERROR: missing data for column "edp_rec"
CONTEXT: COPY nom_graph, line 1: "DATE;EDP_REC;EDP_EC;NB_KO"
The CSV file is composed of a date, and all the other values are FLOATs.
I really can't understand what the issue is; I've been trying to solve it for two days now.
The problem is with your CSV file.
Step 1: Convert the Excel file to CSV, for example using http://www.zamzar.com.
Step 2: Create a table in PostgreSQL with the same columns that you see in your Excel file.
Step 3: Copy the CSV file into the newly created table using the command below:
copy table_name (column1,column2,..) from 'C:\Users\Public\file_name.csv' delimiter ',' csv header;
Done, hope you find this helpful!
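Applied to the file from the question, where the header row quoted in the error message is separated by semicolons, the same command might look like this (a sketch, assuming the file really is semicolon-delimited as that header suggests):
\copy nom_graph FROM '/home/giutools/EDP/out/SYNTHESE_RESYNC.csv' delimiter ';' csv header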

Import excel file into teradata using tpt

I am required to load an Excel file into a Teradata table which already has data in it. I have used the TPT Inserter operator to load data from CSV files. I am not sure how to directly load an Excel file using TPT Inserter.
When I tried providing the Excel file with TextDelimiter='TAB', the parser threw an error:
data_connector: TPT19134 !ERROR! Fatal data error processing file 'd:\sample_data.csv'. Delimited Data Parsing error: Too few columns in row 1.
1) Could someone explain what options are required when directly importing an Excel file into Teradata?
2) How do I load a TAB-delimited file into Teradata using tptLoad / tptInserter?
The script that I have used is:
define job insert_data
description 'Load from Excel to TD table'
(
    define operator insert_operator
    type inserter
    schema *
    attributes
    (
        varchar logonmech='LDAP',
        varchar username='username',
        varchar userpassword='password',
        varchar tdpid='tdpid',
        varchar targettable='excel_to_table'
    );
    define schema upload_schema
    (
        quarter varchar(20),
        cust_type varchar(20)
    );
    define operator data_connector
    type dataconnector producer
    schema upload_schema
    attributes
    (
        varchar filename='d:\sample_data.xlsx',
        varchar format='delimited',
        varchar textdelimiter='TAB',
        varchar openmode='Read'
    );
    apply ('insert into excel_to_table(quarter, cust_type) values(:quarter, :cust_type);')
    to operator (insert_operator[1])
    select quarter, cust_type
    from operator (data_connector[1]);
);
Thanks!!
The script actually seems fine at a glance, except that the error is about delimited data while a file with an .xlsx extension is specified in the script. Are you sure that the specified file is tab-delimited?
Formats supported by the TPT DataConnector operator are:
Binary - Binary data fitting exactly in the defined Schema plus indicator bytes
Delimited - Easier for multiple column human readable files, limited to all varchar schema
Formatted - For working with data exported by Teradata TTUs
Text - For text files containing fixed width columns, also human readable, limited to all varchar schema
Unformatted - For working with data exported by Teradata TTUs
The original Excel data (in true xls or xlsx format) is not directly supported by native TPT operators. But if your data is really tab-delimited then this shouldn't be a problem; you should be able to load it. An obvious point to consider when loading a delimited file is that Char or Varchar fields must not contain the delimiter within the data. You can escape delimiter characters in the data with a '\'. A more subtle point is that you cannot specify the TAB delimiter in lowercase, i.e. varchar textdelimiter='TAB' works but varchar textdelimiter='tab' doesn't. Also, no other control characters (besides TAB) can be specified as delimiters.
If you truly need to load Excel files, you will need to pre-process them into a loadable format such as delimited, binary, or text data. You can write separate code in any language to achieve this, as sketched below.
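For example, after converting the workbook to a tab-delimited text export, only the producer's filename attribute needs to change; the rest of the script from the question can stay as-is (a sketch; 'd:\sample_data.csv' stands for the pre-converted file):
define operator data_connector
type dataconnector producer
schema upload_schema
attributes
(
    varchar filename='d:\sample_data.csv',
    varchar format='delimited',
    varchar textdelimiter='TAB',
    varchar openmode='Read'
);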

COPY command does not ignore sequence column

I have a table whose first column is declared as serial not null primary key. The table creation and automatic sequence creation succeed. However, whenever I do:
copy <table_name> from '/path/to/file' delimiter ',' CSV HEADER;
PostgreSQL tries to read the first CSV column into the serial column, which fails because the first column in my CSV file contains characters (not an integer).
How can I tell the COPY command to let the serial column populate itself?
I determined that the import worked if I specified the column list explicitly, naming the columns exactly like the headers in my CSV file:
copy <table_name>(column1, column2, etc) from '/path/to/file' delimiter ',' CSV HEADER;
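As a concrete sketch with a hypothetical table: the serial column id is simply omitted from the column list, so the sequence fills it while the CSV supplies the remaining columns:
CREATE TABLE items (id serial NOT NULL PRIMARY KEY, name text, qty integer);
copy items (name, qty) from '/path/to/items.csv' delimiter ',' CSV HEADER;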