Importing a csv file to Postgresql - postgresql

I am trying to import a csv file into a postgres database. The csv file is on a local server while the db is on another server. From what I've seen, the recommendation was to use \copy.
\COPY tablename from '/path/to/local/file.csv' with csv
But it shows me a syntax error, and the problem is at \copy.
Any suggestions on how I can approach this problem?
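For what it's worth, \copy is a psql meta-command, not server-side SQL, so sending it through pgAdmin's query tool or a database driver fails with exactly this kind of syntax error at the backslash. A minimal sketch of running it from a shell on the machine that holds the file (host, user, and database names are placeholders):
psql -h db.example.com -U myuser -d mydb -c "\copy tablename FROM '/path/to/local/file.csv' WITH (FORMAT csv)"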

Related

Creating Postgres table on AWS RDS using CSV file

I'm having an issue creating a table on my postgres DB on AWS RDS by importing the raw csv data. Here are the steps I've already done:
CSV file has been uploaded on my S3 bucket
Followed AWS's tutorial to give RDS permission to import data from S3
Created an empty table on postgres
Tried using pgAdmin's 'import' feature to import the local csv file into the table, but it kept giving me an error.
So I'm using this query below to import the data into the table:
SELECT aws_s3.table_import_from_s3(
    'public.bayarea_property_data',
    '',
    '(FORMAT CSV, HEADER true)',
    'cottage-prop-data',
    'clean_ta_file_edit.csv',
    'us-west-1'
);
However, I keep getting this message:
ERROR: extra data after last expected column
CONTEXT: COPY bayarea_property_data, line 2: ",2009.0,2009.0,0.0,,0,2019,13061.0,,0,0.0,0.0,,2019,0.0,6767.0,576040,172810,403230,70,1,,1.0,,6081,..."
SQL statement "copy public.bayarea_property_data from '/rdsdbdata/extensions/aws_s3/amazon-s3-fifo-6261-20200819T083314Z-0' with (FORMAT CSV, HEADER true)"
SQL state: 22P04
Can anyone help me with this? I'm an AWS noob and have been struggling over the past few days. Thanks!
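For what it's worth, that error means a data row contains more fields than the target table has columns, and the leading comma in the CONTEXT line suggests an extra unnamed first column (for example, a pandas index written by to_csv). A hedged sketch of stripping such a column before re-uploading to S3, assuming the file really does carry a pandas-style index:
import pandas as pd

# Re-read the exported file, treating the unnamed first column as an index,
# then write it back without that column so each row matches the table width.
df = pd.read_csv('clean_ta_file_edit.csv', index_col=0)
df.to_csv('clean_ta_file_edit.csv', index=False)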

How to upload a 900 MB csv file from a website to postgresql

I want to do some data analysis from NYCopendata. The file is ~900 MB, so I am using a postgresql database to store it. I am using pgAdmin 4 but could not figure out how to store the csv in postgresql directly, without first downloading it to my machine. Any help is greatly appreciated.
Thanks.
You can use:
pgAdmin to upload a CSV file from the import/export dialog
https://www.pgadmin.org/docs/pgadmin4/4.21/import_export_data.html
COPY statement on the database server
\copy command from psql on any client
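If the goal is to never store the 900 MB file locally, one hedged option is to stream it from the web straight into \copy through a pipe (the URL, connection details, and table name are placeholders):
# Stream the CSV from the website into the table without saving it to disk
curl -s 'https://data.cityofnewyork.us/path/to/file.csv' | psql -h myhost -U myuser -d mydb -c "\copy nyc_data FROM STDIN WITH (FORMAT csv, HEADER true)"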

How do I import a .csv file from a remote server to a PostgreSQL database?

The original code is a simple MySQL import:
LOAD DATA LOCAL INFILE 'D:/FTP/foo/foo.csv'
INTO TABLE error_logs
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY ''
LINES STARTING BY ''
TERMINATED BY '\n'
IGNORE 1 LINES
(Server,Client,Error,Time);
I need to migrate a web portal (from MySQL to Postgres [I know there are tools for that, but it's not the question]), and the issue is that I am no longer working locally.
I didn't see anybody ask the question this way: import a .csv from a remote server to a postgres db.
I think I have to use COPY but I can't get the right syntax...
Thanks for your attention.
The COPY command is an option to do this. I had to do this once.
How to import CSV file data into a PostgreSQL table?
Copying PostgreSQL database to another server
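For reference, a rough \copy equivalent of the LOAD DATA statement above, run from psql on the client so the file can stay where it is (CSV mode already defaults to comma delimiters and double-quote enclosures, and HEADER skips the first line, approximating the MySQL options):
\copy error_logs (Server, Client, Error, Time) FROM 'D:/FTP/foo/foo.csv' WITH (FORMAT csv, HEADER true)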

copy CSV file to postgres table over the wire

I'm connecting to a postgres db with SQLAlchemy. In testing, COPY works great for appending rows to a local db; it's very fast.
COPY ratings FROM '/path/blah.csv' DELIMITER ',' CSV;
However, what I would like to do is COPY a CSV file over the SQL Alchemy connection into a remote postgres db. PG documentation indicates that something like
COPY ratings FROM STDIN '/path/blah.csv' DELIMITER ',' CSV;
might work. But it doesn't. I've tried a bunch of reasonable variations.
Ideas? Thanks for any help, and my apologies if this question is redundant.
I'm not an expert with SQLAlchemy, but I've used copy_from with a file-like object (aka stream) many times with psycopg2. And I think you can specify the psycopg2 dialect in SQLAlchemy. Please see the documentation for SQLAlchemy and for copy_from in psycopg2.
Again, I've never done it. Worst case, you might have to get a connection via psycopg2.
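To make that concrete, here is a minimal sketch of the idea, assuming a psycopg2-backed SQLAlchemy engine. It uses copy_expert, a sibling of copy_from that accepts a full COPY ... FROM STDIN statement so the CSV options can be passed through (the connection URL is a placeholder):
from sqlalchemy import create_engine

engine = create_engine('postgresql+psycopg2://user:pass@remotehost/mydb')

conn = engine.raw_connection()  # the underlying psycopg2 connection
try:
    with conn.cursor() as cur, open('/path/blah.csv') as f:
        # Streams the local file to the remote server's COPY ... FROM STDIN
        cur.copy_expert('COPY ratings FROM STDIN WITH (FORMAT csv)', f)
    conn.commit()
finally:
    conn.close()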

Can PostgreSQL COPY read CSV from a remote location?

I've been using JDBC with a local Postgres DB to copy data from CSV files into the database with the Postgres COPY command. I use Java to parse the existing CSV file into a CSV format that matches the tables in the DB, save this parsed CSV to my local disk, and then have JDBC execute a COPY command using the parsed CSV against my local DB. Everything works as expected.
Now I'm trying to perform the same process on a Postgres DB on a remote server using JDBC. However, when JDBC tries to execute the COPY I get
org.postgresql.util.PSQLException: ERROR: could not open file "C:\data\datafile.csv" for reading: No such file or directory
Am I correct in understanding that the COPY command tells the DB to look locally for this file? I.e., the remote server is looking on its own C: drive, where the file doesn't exist.
If this is the case, is there any way to tell the COPY command to look on my computer rather than "locally" on the remote machine? Reading through the COPY documentation, I didn't find anything that indicated this functionality.
If the functionality doesn't exist, I'm thinking of just populating the whole database locally and then copying the database to the remote server, but I wanted to check that I wasn't missing anything.
Thanks for your help.
Create your SQL file as follows on your client machine:
COPY testtable (column1, c2, c3) FROM STDIN WITH CSV;
1,2,3
4,5,6
\.
Then execute, on your client:
psql -U postgres -f /mylocaldrive/copy.sql -h remoteserver.example.com
If you use JDBC, the best solution for you is to use the PostgreSQL COPY API
http://jdbc.postgresql.org/documentation/publicapi/org/postgresql/copy/CopyManager.html
Otherwise (as already noted by others) you can use \copy from psql, which allows accessing local files on the client machine.
To my knowledge, the COPY command can only read locally (either from stdin or from a file) on the machine where the database is running.
You could make a shell script where you run the Java conversion, then use psql to do a \copy command, which reads from a file on the client machine, as in the sketch below.
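A hedged sketch of that shell script (the converter jar, table name, and connection details are hypothetical placeholders):
# Run the Java CSV conversion, then stream the result to the remote DB with \copy
java -jar csv-converter.jar raw.csv parsed.csv
psql -h remoteserver.example.com -U postgres -d mydb -c "\copy mytable FROM 'parsed.csv' WITH (FORMAT csv)"
Since \copy runs on the client, the parsed file never has to exist on the remote server's disk.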