export table to csv on postgres - postgresql

How can I export a table to .csv in Postgres when I'm not a superuser and can't use the COPY command?
I can still import data into Postgres with the "Import" option in the right-click menu, but there is no export option.

Use psql and redirect stream to file:
psql -U <USER> -d <DB_NAME> -c "COPY <YOUR_TABLE> TO stdout DELIMITER ',' CSV HEADER;" > file.csv

COPY your_table TO '/path/to/your/file.csv' DELIMITER ',' CSV HEADER;
For more details, see the COPY page of the manual.

Besides what marvinorez suggests in his answer, from psql you can do:
\copy your_table TO '/path/to/your/file.csv' DELIMITER ',' CSV HEADER
On the other hand, from pgadmin3 you can also open the table by right-clicking on its name and then selecting View Data. Then you can click on the upper-left corner of the grid (the grey empty square where the column-name row meets the row-number column) to select all rows. Finally, you can copy with Ctrl+C or Edit -> Copy in the menu. The data will be copied to the clipboard in CSV format, delimited by a semicolon ;.
You can then paste it into LibreOffice Calc or MS Excel to display it, for instance.
If your table is large (what counts as large depends on the amount of RAM of your machine, among other things) it might not fit in the clipboard, so in that case I would not use this method but the first one (\copy).

The easiest way would indeed be COPY to stdout, I think. If you can't do this, how about using pg_dump and then transforming the output file with sed, awk or even a text editor? This should work, even with search and replace, in an acceptable amount of time :)
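A minimal sketch of that pg_dump idea (file names and the table name are illustrative; COPY ... TO STDOUT is still preferable whenever you are allowed to run it):

# Dump only the data of one table as INSERT statements; pg_dump needs no superuser rights,
# only read access to the table.
pg_dump -U <USER> -d <DB_NAME> --table=your_table --data-only --column-inserts > your_table_dump.sql

# Crude grep/sed transformation: keep only the INSERT lines and strip the SQL syntax.
# Fragile if values contain commas or parentheses, so treat it as a last resort.
grep '^INSERT INTO' your_table_dump.sql \
  | sed -e 's/^INSERT INTO[^(]*([^)]*) VALUES (//' -e 's/);$//' > your_table.csv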

I was having trouble with superuser rights and running psql, so I took the simple, stupid way and used PGAdmin III.
1) SELECT * FROM <table_name>;
Before running it, select Query in the menu bar and choose 'Query to File'.
This will save it to a folder of your choice. You may have to play with the export settings; it likes quoting and the ; delimiter.
2) SELECT * FROM <table_name>;
Run it normally and then save the output by selecting Export in the File menu. This will save it as a .csv.
This is not a good approach for large tables. The tables I have done this for are a few 100,000 rows with 10-30 columns. Large tables may cause problems.

Related

PostgreSQL 9.5: Append export data into text file

I want to export the selected records into a text file.
Using:
\COPY (SELECT * FROM Table_Name) TO '/root/Exported_Data.txt'
Note: The above script just gives me the same records again; it does not append any records, duplicate or otherwise.
The following link might help with appending data to a file using the COPY command: https://dba.stackexchange.com/questions/149745/copy-command-in-postgresql-to-append-data/149774#149774
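One way to get append behaviour (a sketch along the lines of that link; the WHERE clause here is purely illustrative) is to let COPY write to STDOUT and append it with the shell, since \copy ... TO 'file' always overwrites the target file:

# The first run creates the file, later runs append to it.
psql -U <USER> -d <DB_NAME> \
     -c "COPY (SELECT * FROM Table_Name WHERE id > 100) TO STDOUT" >> /root/Exported_Data.txt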

Copy *.csv file to view in PostgreSQL

I want to import data from a *.csv file into a view in PostgreSQL 9.3. Here is the script I have tried.
Example:
\copy "viewName" from 'D:\filename.csv' DELIMITER ';' CSV HEADER;
Error
ERROR: cannot copy to view "viewName"
Questions:
Where am I going wrong?
Or do I need to copy the data into a table first and then select it from there?
From http://www.postgresql.org/docs/9.3/static/sql-copy.html:
COPY can only be used with plain tables, not with views. However, you can write COPY (SELECT * FROM viewname) TO ....
Since COPY is the basis for \copy, you might want to try your code with a table instead of the view and then select from the table.
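A minimal sketch of that workaround, assuming the view sits on top of a table called base_table (the table name is an assumption):

-- Load the CSV into the underlying (or a staging) table instead of the view.
\copy base_table FROM 'D:\filename.csv' DELIMITER ';' CSV HEADER

-- Exporting through the view is fine as long as you wrap it in a SELECT:
\copy (SELECT * FROM "viewName") TO 'D:\export.csv' DELIMITER ';' CSV HEADER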

Command to read a file and execute script with psql

I am using PostgreSQL 9.0.3. I have an Excel spreadsheet with lots of data to load into a couple of tables, on Windows.
I have written a script to get the data from the input file and insert it into some 15 tables. This can't be done with COPY or Import. I named the input file DATALD.
I found the psql options -d to point to the database and -f for the SQL script, but I need to know how to feed the input file along with the script so that the data gets inserted into the tables.
For example this is what I have done:
begin
for emp in (select distinct w_name from DATALD where w_name <> 'w_name')
--insert in a loop
INSERT INTO tblemployer( id_employer, employer_name,date_created, created_by)
VALUES (employer_id,emp.w_name,now(),'SYSTEM1');
Can someone please help?
For an SQL script you must
either have the data inlined in your script (in the same file),
or utilize COPY to import the data into Postgres.
I suppose you would use a temporary staging table, since the format doesn't seem to fit the target tables. Code example:
How to bulk insert only new rows in PostgreSQL
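A rough sketch of that staging-table approach (the column names come from the snippet in the question; everything else is an assumption, e.g. that id_employer is filled by a default or sequence):

-- Staging table matching the layout of the input file.
CREATE TEMP TABLE datald_staging (w_name text /* , ... further columns ... */);

-- Load the raw data (the spreadsheet saved as CSV) into the staging table.
\copy datald_staging FROM 'DATALD.csv' DELIMITER ',' CSV HEADER

-- Then distribute it into the target tables:
INSERT INTO tblemployer (employer_name, date_created, created_by)
SELECT DISTINCT w_name, now(), 'SYSTEM1'
FROM   datald_staging
WHERE  w_name <> 'w_name';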
There are other options like pg_read_file(). But:
Use of these functions is restricted to superusers.
Intended for special purposes.

How to export table data in MySql Workbench to csv?

I am wondering how I can export table data to a CSV file. I read that I need to use the MySQL Workbench command line, but I cannot figure out how to launch it (I don't know what the command is).
I am running Windows 7 64-bit.
You can select the rows you want to export from the table in the MySQL Workbench SQL Editor. You will find an Export button in the result set that allows you to export the records to a CSV file.
Please also keep in mind that by default MySQL Workbench limits the size of the result set to 1000 records. You can easily change that in the Preferences dialog.
Hope this helps.
You can use mysqldump or a query to export data to a CSV file:
SELECT *
INTO OUTFILE '/tmp/products.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM products
MySQL Workbench 6.3.6
Export the SELECT result
After you run a SELECT: Query > Export Results...
Export table data
In the Navigator, right click on the table > Table Data Export Wizard
All columns and rows are included by default, so click on Next.
Select the file path, file type and field separator (by default it is ;, not ,!) and click Next.
Click Next > Next > Finish and the file is created in the specified location.

PostgreSQL: How to modify the text before /copy it

Let's say I have some customer data like the following saved in a text file:
|Mr |Peter |Bradley |72 Milton Rise |Keynes |MK41 2HQ |
|Mr |Kevin |Carney |43 Glen Way |Lincoln |LI2 7RD | 786 3454
I copied the aforementioned data into my customer table using the following command:
\copy customer(title, fname, lname, addressline, town, zipcode, phone) from 'customer.txt' delimiter '|'
However, as it turns out, there are some extra space characters before and after various parts of the data. What I'd like to do is call trim() before copying the data into the table - what is the best way to achieve this?
Is there a way to call trim() on every value of every row and avoid inserting unclean data in the first place?
Thanks,
I think the best way to go about this is to add a BEFORE INSERT trigger to the table you're inserting into. This way you can write a stored procedure that executes before every record is inserted and trims whitespace (or does any other transformations you may need) on any columns that need it. When you're done, simply remove the trigger (or leave it, which will improve data integrity if you never want that whitespace in those columns). I think explaining how to create a trigger and stored procedure in PostgreSQL is probably outside the scope of this question, but I will link to the documentation for each.
I think this is the best way because it is simpler than parsing through a text file or writing shell code to do this. This kind of sanitization is the kind of thing triggers do very well and very simply.
Creating a Trigger
Creating a Trigger Function
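A minimal sketch of such a trigger, assuming only the columns from the \copy command above need trimming (adjust names as needed):

-- Trigger function that trims leading/trailing whitespace before each insert.
CREATE OR REPLACE FUNCTION trim_customer_fields() RETURNS trigger AS $$
BEGIN
    NEW.title       := trim(NEW.title);
    NEW.fname       := trim(NEW.fname);
    NEW.lname       := trim(NEW.lname);
    NEW.addressline := trim(NEW.addressline);
    NEW.town        := trim(NEW.town);
    NEW.zipcode     := trim(NEW.zipcode);
    NEW.phone       := trim(NEW.phone);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trim_customer_before_insert
    BEFORE INSERT ON customer
    FOR EACH ROW EXECUTE PROCEDURE trim_customer_fields();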
I have a somewhat similar use case in one of my projects. My input files:
have the number of lines in the file as the last line;
need to have line numbers added to every line;
need to have a file_id added to every line.
I use the following piece of shell code:
FACT=$( dosql "TRUNCATE tab_raw RESTART IDENTITY;
COPY tab_raw(file_id,lnum,bnum,bname,a_day,a_month,a_year,a_time,etype,a_value)
FROM stdin WITH (DELIMITER '|', ENCODING 'latin1', NULL '');
$(sed -e '$d' -e '=' "$FILE"|sed -e 'N;s/\n/|/' -e 's/^/'$DSID'|/')
\.
VACUUM ANALYZE tab_raw;
SELECT count(*) FROM tab_raw;
" | sed -e 's/^[ ]*//' -e '/^$/d'
)
dosql is a shell function that executes psql with the proper connection info and runs everything that was given to it as an argument.
As a result of this operation, the $FACT variable holds the total count of inserted records (for error detection).
Later I do another dosql call:
dosql "SET work_mem TO '800MB';
SELECT tab_prepare($DSID);
VACUUM ANALYZE tab_raw;
SELECT tab_duplicates($DSID);
SELECT tab_dst($DSID);
SELECT tab_gaps($DSID);
SELECT tab($DSID);"
to analyze the data and move it from the auxiliary table into the final tables.