Copy *.csv file to view in PostgreSQL

I want to import data from a *.csv file into a view in PostgreSQL 9.3. Here is the script I have tried.
Example:
\copy "viewName" from 'D:\filename.csv' DELIMITER ';' CSV HEADER;
Error
ERROR: cannot copy to view "viewName"
Questions:
Where am I going wrong?
Or do I need to copy it into a table and then select from there?

From http://www.postgresql.org/docs/9.3/static/sql-copy.html:
COPY can only be used with plain tables, not with views. However, you can write COPY (SELECT * FROM viewname) TO ....
Since COPY is the basis for \copy, you might want to try your code with a table instead of a view and then select from there.
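For example, assuming "viewName" reads from a single base table (the table name and export path below are illustrative), you could load the file into that table and read it back through the view; exporting from the view works directly, because COPY TO accepts a query:
\copy base_table from 'D:\filename.csv' DELIMITER ';' CSV HEADER
\copy (SELECT * FROM "viewName") to 'D:\export.csv' DELIMITER ';' CSV HEADER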

Related

Postgres - Copy Command

I have several text files and would like to import their contents into a table.
Each file must be imported as a record in a table (inside just one field).
For this I created the following code:
create table my_table ("content" text);
copy my_table from '/Users/julio/Desktop/my_file.txt';
I would like the text to be placed in the table exactly as it is in the file, including spaces, tabs, and line breaks.
However, when I run the command above, I get the error:
ERROR: extra data after last expected column
I realized that the error is caused by the tab characters in the file.
Is there any way to escape these characters?
Thank you!
I solved the problem with a solution that is not very elegant.
I uploaded the file as a large object using lo_import:
SELECT lo_import('/Users/julio/Desktop/myfile.txt');
Then I imported it into the table using convert_from and lo_get (1172557 is the OID that lo_import returned):
insert into my_table select convert_from(lo_get(1172557), 'latin1');
Finally, I deleted the large object using lo_unlink:
select lo_unlink(1172557);
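If you would rather not copy the OID between the steps by hand, the same three calls can be chained in one DO block (a sketch using the same functions; path and table name as above):
DO $$
DECLARE
    file_oid oid;
BEGIN
    -- import the file, decode it as latin1 text, then drop the large object
    file_oid := lo_import('/Users/julio/Desktop/myfile.txt');
    INSERT INTO my_table SELECT convert_from(lo_get(file_oid), 'latin1');
    PERFORM lo_unlink(file_oid);
END $$;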
That's it!
I hope this helps someone with the same problem!
Julio

How to prevent file creation with psql \copy when the query returns zero rows, without checking the query's row count?

I'm using Postgres \copy to create a CSV file with a header, as below.
\copy (select * from result_table) To '/ldb_db/shared/data/cctl_reports/output/CCTL_cfs_in_missing.csv' With CSV HEADER ;
The result_table is a temporary table that I populate with data from a function that executes before the \copy.
The problem is that it creates an empty file containing only the header when no records are found. I don't want that to happen. Is there any way I can achieve this?
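One workaround that avoids inspecting the query result is to clean up afterwards: export as usual, then delete the file if it holds nothing but the header row. A sketch from within psql, using the \! shell escape and the path from the question:
\copy (select * from result_table) To '/ldb_db/shared/data/cctl_reports/output/CCTL_cfs_in_missing.csv' With CSV HEADER
\! [ "$(wc -l < /ldb_db/shared/data/cctl_reports/output/CCTL_cfs_in_missing.csv)" -le 1 ] && rm /ldb_db/shared/data/cctl_reports/output/CCTL_cfs_in_missing.csv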

PostgreSQL 9.5: Append exported data to a text file

I want to export selected records into a text file.
Using:
\COPY (SELECT * FROM Table_Name) TO '/root/Exported_Data.txt'
Note: The above command just gives me the same records every time; it does not append anything, whether the records are duplicates or not.
The following link might help with appending data to a file using the copy command: https://dba.stackexchange.com/questions/149745/copy-command-in-postgresql-to-append-data/149774#149774
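Since COPY ... TO overwrites its target file each time, appending has to happen on the client side. One way (a sketch; the database name is illustrative) is to stream the rows to stdout and let the shell's >> redirection do the appending:
psql -d mydb -c "\copy (SELECT * FROM Table_Name) TO stdout" >> /root/Exported_Data.txt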

SQL Server OPENQUERY equivalent in PostgreSQL

Is there an equivalent of SQL Server's OPENQUERY or OPENROWSET in PostgreSQL, for querying from Excel or CSV files?
You can use PostgreSQL's COPY
As per the docs:
COPY moves data between PostgreSQL tables and standard file-system files. COPY TO copies the contents of a table to a file, while COPY FROM copies data from a file to a table (appending the data to whatever is in the table already). COPY TO can also copy the results of a SELECT query.
COPY works like this:
Importing a table from CSV
Assuming you already have a table in place with the right columns, the command is as follows:
COPY tblemployee FROM '~/empsource.csv' DELIMITER ',' CSV;
Exporting a CSV from a table.
COPY (select * from tblemployee) TO '~/exp_tblemployee.csv' DELIMITER ',' CSV;
It's important to mention here that if your data is in Unicode or needs strict encoding, you should always set client_encoding before running any of the above commands.
To set the client_encoding parameter in PostgreSQL:
set client_encoding to 'UTF8';
or
set client_encoding to 'latin1';
Another thing to guard against is nulls: when exporting, if some fields are null then PostgreSQL will write '\N' to represent a null field. This is fine, but may cause issues if you are trying to import that data into, say, SQL Server.
A quick fix is to modify the export command, specifying what you would prefer as the null placeholder in the exported file:
COPY (select * from tblemployee) TO '~/exp_tblemployee.csv' DELIMITER ',' NULL AS '';
Another common requirement is import or export with the header.
Import a CSV into a table, with the header for the columns present in the first row of the csv file:
COPY tblemployee FROM '~/empsource.csv' DELIMITER ',' CSV HEADER;
Export a table to CSV with headers present in the first row:
COPY (select * from tblemployee) TO '~/exp_tblemployee.csv' DELIMITER ',' CSV HEADER;
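Closer in spirit to OPENROWSET is the file_fdw extension, which lets you query a server-side CSV file in place, without loading it into a table first. A sketch, where the file path and column definitions are illustrative:
CREATE EXTENSION file_fdw;
CREATE SERVER csv_source FOREIGN DATA WRAPPER file_fdw;
CREATE FOREIGN TABLE emp_csv (
    emp_id   integer,
    emp_name text
) SERVER csv_source
  OPTIONS (filename '/path/to/empsource.csv', format 'csv', header 'true');
-- the file can now be queried like any other table
SELECT * FROM emp_csv;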

export table to csv on postgres

How can I export a table to .csv in Postgres, when I'm not superuser and can't use the copy command?
I can still import data into Postgres with the "Import" option in the right-click menu, but there is no export option.
Use psql and redirect the stream to a file:
psql -U <USER> -d <DB_NAME> -c "COPY <YOUR_TABLE> TO stdout DELIMITER ',' CSV HEADER;" > file.csv
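Because COPY TO also accepts a SELECT query, the same redirection works for exporting a filtered or ordered result set, e.g. (placeholders as before):
psql -U <USER> -d <DB_NAME> -c "COPY (SELECT * FROM <YOUR_TABLE> ORDER BY 1) TO stdout DELIMITER ',' CSV HEADER;" > file.csv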
COPY your_table TO '/path/to/your/file.csv' DELIMITER ',' CSV HEADER;
Note that this server-side form writes the file on the database server and requires the appropriate file-access privileges there. For more details, see the COPY manual: http://www.postgresql.org/docs/9.3/static/sql-copy.html
Besides what marvinorez suggests in his answer, from psql you can do:
\copy your_table TO '/path/to/your/file.csv' DELIMITER ',' CSV HEADER
On the other hand, from pgadmin3 you can also open the table by right-clicking on its name and selecting View Data. Then you can click on the upper-left corner of the table (the gray empty square where the column name row joins the row number column) to select all rows. Finally, you can copy with Ctrl+C or Edit -> Copy in the menu. The data will be copied to the clipboard in csv format, delimited by semicolon ;.
You can then paste it into LibreOffice Calc or MS Excel to view it, for instance.
If your table is large (what counts as large depends on the amount of RAM in your machine, among other things) it might not fit in the clipboard, so in that case I would not use this method but the first one (\copy).
The easiest way would indeed be a COPY to stdout, I think. If you can't do this, how about using pg_dump and then transforming the output file with sed, AWK or even a text editor? This should work, even with search and replace, in an acceptable amount of time :)
I was having trouble with superuser rights and running psql, so I took the simple, stupid way using PGAdmin III.
1) SELECT * FROM <table_name>;
Before running it, select Query in the menu bar and choose 'Query to File'.
This will save it to a folder of your choice. You may have to play with the export settings; it likes quoting and ;.
2) SELECT * FROM <table_name>;
Run it normally, then save the output by selecting Export in the File menu. This will save as a .csv.
This is not a good approach for large tables. The tables I have done this for are a few hundred thousand rows with 10-30 columns. Large tables may have problems.