I am wondering how to create or export a CSV file from SQL. Is there a function for that, similar to pgsql2shp?
I would appreciate your ideas, tips, or solutions.
You can save a complete table as a file using this command:
COPY tablename TO STDOUT CSV
Ref: https://www.postgresql.org/docs/current/static/sql-copy.html
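Since COPY ... TO STDOUT streams the rows back to the client, you can capture them in a local file from psql with the \copy meta-command. A minimal sketch (the table name and output path are placeholders):
\copy tablename TO '/tmp/tablename.csv' WITH (FORMAT csv, HEADER)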
You can give this a try, but I believe there may be some syntax changes depending on the version. Note that COPY with a file path writes the file on the server and requires appropriate privileges; from psql, use \copy for a client-side file.
COPY (SELECT foo, bar FROM whatever) TO '/tmp/dump.csv' WITH CSV HEADER;
If you use pgAdmin, you can export any query you run to a CSV file.
I have a CSV file with 108 columns which I'm trying to import into my PostgreSQL table. Obviously, I don't want to specify every column in my CREATE TABLE statement. But when I enter
\COPY 'table_name' FROM 'directory' DELIMITER ',' CSV HEADER; this error message shows up: "ERROR: extra data after last expected column". When there are only a few columns I know how to fix this problem, but, like I said, I don't want to specify all 108 columns. By the way, my table doesn't contain any columns at all yet. Any help on how I could do that? Thanks!
When dealing with problems like this, I often cheat. Plenty of tools exist online for converting CSV to SQL, https://www.convertcsv.com/csv-to-sql.htm being one of them.
Copy/paste your CSV, copy/paste the generated SQL. Not the most elegant solution, but it will work as a one-off.
Now, if you're looking to repeat this process regularly (automated, I hope), then Python may be an interesting language to explore for quickly writing a script to do this for you, then schedule it as a cron job or invoke it by whatever method you prefer with the correct input (the CSV file). Even without a full script, a short shell pipeline can get you there; see the sketch below.
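For instance, here is a minimal sketch that derives the CREATE TABLE statement from the header row (every column simply created as text) and then loads the file; the file name data.csv, the table name my_table, and the database name my_db are all placeholders:
head -1 data.csv | awk -F, '{printf "CREATE TABLE my_table ("; for (i = 1; i <= NF; i++) printf "%s\"%s\" text", (i > 1 ? ", " : ""), $i; print ");"}' | psql my_db
psql my_db -c "\copy my_table FROM 'data.csv' WITH (FORMAT csv, HEADER)"
This assumes the header names are valid identifiers without embedded quotes or commas.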
Please feel free to let me know if I've misunderstood your original question; if I can provide any more help, give me a shout and I'll do my best!
Hello Stack Overflowers!
I'm currently exporting a Postgres table as a .csv using a C# application I developed. I'm able to export it without a problem using the following commands...
set PGPASSWORD=password
psql -U USERNAME Database_Name
\copy (SELECT * FROM table1) TO 'C:\xyz\exportfile.csv' CSV DELIMITER ',' HEADER;
The problem I am running into is that the .csv is meant to be used with Tableau, but text fields get turned into integers there, and I run into the same issue when importing into Excel. This specifically causes problems when joining on serial numbers on the Tableau side.
I know I can change these fields in Tableau/Excel manually, but I am trying to find a way to make sure the end user wouldn't need to do this. I'd like for them to just drag and drop the updated .csv PostgreSQL data extracts and be able to start Tableau with no problem; they don't seem very tech-savvy. I know you can connect Tableau directly to Postgres, but in this particular case I am not allowed to, due to limitations beyond my control.
I'm using PostgreSQL 12 and Tableau v2019.4.0
EDIT: As requested, providing example data! Both of the fields are TEXT inside of PostgreSQL, but the export doesn't specify that.
Excel Formatting
ASSETNUM,ITEMNUM
1834,8.11234E+12
1835,8.11234E+12
Notepad Formatting
ASSETNUM,ITEMNUM
1834,8112345673294
1835,8112345673295
Note: If you select the specific cell in Excel it shows the full number.
CSV files don't carry any type information, so programs like Excel/Tableau are free to interpret the data however they like.
However, @JorgeCampos's link provides useful information. For example,
"=""123""","=""123"""
gets interpreted differently than
123,123
when you load it into Excel.
If you want to add quotes to your data, the easiest way is to use PostgreSQL's string functions, e.g.
SELECT '"=""' || my_column || '"""' FROM my_database
I am using a Mac laptop and I am trying to copy a local CSV file and import it into a PostgreSQL table. I have used the DELIMITER option, and the following query runs:
copy c2013_levinj.va_clips_translation
from local '/Users/jacoblevin/Desktop/clips_translation_table.csv'
Delimiter ',' skip 1 rejectmax 1;
However, each time the query is submitted, I receive a message that says "0 rows fetched." I have tried dropping the table and re-creating it, as well as running a "SELECT *" query. Suffice it to say, I have been unable to pull any data. Does anyone have any ideas about what's wrong? Thanks in advance.
What happens if you try this, using psql's \copy meta-command (which reads the file on the client, i.e. your laptop, rather than on the server)?
\copy c2013_levinj.va_clips_translation FROM '/Users/jacoblevin/Desktop/clips_translation_table.csv' WITH CSV HEADER
That should be more robust and do what you want.
I have a pretty basic database. I need to drop a good-sized users list into the db. I have the dump file; I need to convert it to a .pg file and then somehow load this data into it.
The data I need to add are in CSV format.
I assume you already have a .pg file, which is presumably a database dump in the "custom" format.
PostgreSQL can load data in CSV format using the COPY statement. So the absolute simplest thing to do is just add your data to the database this way.
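A minimal sketch from psql (the table name users and the file name are placeholders, and the CSV is assumed to have a header row):
\copy users FROM 'users.csv' WITH (FORMAT csv, HEADER)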
If you really must edit your dump, and the file is in the "custom" format, there is unfortunately no way to edit the file manually. However, you can use pg_restore to create a plain SQL backup from the custom format and edit that instead. pg_restore with no -d argument will generate an SQL script for insertion.
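For example (file names are placeholders):
pg_restore -f dump.sql dump.pg
You can then edit dump.sql in a text editor and replay it with psql.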
As suggested by Daniel, the simplest solution is to keep your data in CSV format and just import it into Postgres as is.
If you're trying to merge this CSV data into a third-party Postgres dump file, then you'll need to first convert the data into SQL INSERT statements.
One possible Unix solution, assuming a three-column file with no embedded commas or quotes (values are single-quoted, since double quotes denote identifiers in SQL; \047 is the octal escape for a single quote):
awk -F, '{printf "INSERT INTO my_tab VALUES (\047%s\047, \047%s\047, \047%s\047);\n", $1, $2, $3}' data.csv
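The generated statements can then be piped straight into psql (the database name my_db is a placeholder):
awk -F, '{printf "INSERT INTO my_tab VALUES (\047%s\047, \047%s\047, \047%s\047);\n", $1, $2, $3}' data.csv | psql my_db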
I have to import a file from an external source into a PostgreSQL table.
I tried to do it with \copy from, but I keep getting errors (additional columns) in the middle of the file.
Is there a way to tell PostgreSQL to ignore lines containing errors during a "\copy from"?
Thanks
Give pgloader a try instead; it can be configured to keep going and set aside the rows that fail to load.
No. Either all the data is loaded or none of it is; those are the two options you have with COPY in PostgreSQL. If you need to work around that, see the staging-table sketch below.
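One common workaround, sketched here under loud assumptions (a comma-delimited file with three fields per good row, no quoted commas or backslashes, and a hypothetical three-text-column target table), is to load every line into a one-column staging table and filter from there:
-- one column wide enough to hold any input line
CREATE TABLE staging (line text);
-- use a delimiter byte that never occurs in the data, so each line loads as a single value
\copy staging FROM 'data.csv' WITH (FORMAT text, DELIMITER E'\x01')
-- keep only the rows with the expected number of fields
INSERT INTO target (col1, col2, col3)
SELECT parts[1], parts[2], parts[3]
FROM (SELECT string_to_array(line, ',') AS parts FROM staging) s
WHERE array_length(parts, 1) = 3;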