Writing the content of a CSV file to PostgreSQL with Npgsql fails

I am writing the content of a CSV file to a PostgreSQL table with the help of Npgsql version 3.2.5.
The content of my CSV is the following:
id, value
1, 89
2, 286
3, 80
4, 107
I use the following command to write it:
Using writer = conn.BeginTextImport("COPY tbl_test (id,value) FROM 'C:/temp/test.csv' DELIMITER ',' CSV HEADER")
When I run my code, the values are written into my database, but the command throws the following error message:
Received unexpected backend message CompletedResponse. Please file a bug.
When I run the command directly in the SQL Shell, everything works fine, so the problem seems to be caused by Npgsql.
Here is the command I use in the SQL Shell:
\COPY tbl_test(id,value) FROM 'C:/temp/test.csv' DELIMITER ',' CSV HEADER;
Does anybody else have experience with this message?

As answered in the GitHub issue, you're using the API incorrectly.
If you have your CSV file on the client side (where the Npgsql app is running), then you should be using COPY tbl_test (id, value) FROM STDIN, not FROM 'C:/temp/test.csv'. The latter is used when the CSV file is on the PostgreSQL server. See the documentation.
If you simply want to import a file present on your server, just execute the COPY command as regular SQL: create a command and execute it with ExecuteNonQuery. Don't use the BeginTextImport API.
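For reference, here is a minimal sketch of the two variants, reusing the table and path from the question (the server-side variant assumes the file is readable by the PostgreSQL server process):

-- Client side: pass this to BeginTextImport, then write the file's text through the returned TextWriter
COPY tbl_test (id, value) FROM STDIN DELIMITER ',' CSV HEADER

-- Server side: execute this as an ordinary command (e.g. with ExecuteNonQuery)
COPY tbl_test (id, value) FROM 'C:/temp/test.csv' DELIMITER ',' CSV HEADER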

Related

Creating Batch Files with the PostgreSQL \copy Command in JetBrains DataGrip

I'm familiarizing myself with the standalone version of DataGrip and having a bit of trouble understanding the different approaches to composing SQL via the console, external files, scratch files, etc.
I'm managing by referencing the documentation, and I'm happy to figure things out as I go.
However, I'm trying to ingest CSV data into tables via batch files using the Postgres \copy command. DataGrip executes this command without error, but no data is populated.
This is my syntax, composed and run in the console view:
\copy tablename from 'C:\Users\username\data_file.txt' WITH DELIMITER E'\t' csv;
Note that the data is tab-separated and stored in a .txt file.
I'm able to use DataGrip's import functions (via the context menu) just fine, but I'd like to understand how to issue commands that do the same.
\copy is a command of the command-line PostgreSQL client psql.
I doubt that DataGrip invokes psql, so it won't be able to use \copy or any other “backslash command”.
You probably have to use DataGrip's import facilities. Or you start using psql.
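A minimal sketch of the psql route, assuming psql is installed on the client machine and that the connection details (host, database, user) are placeholders to adjust:

psql -h localhost -d mydb -U myuser -c "\copy tablename from 'C:\Users\username\data_file.txt' with delimiter E'\t' csv"

Because \copy is interpreted by psql itself, the file path is resolved on the client machine, which is exactly what a batch file needs.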
OK, but what about the SQL COPY command (https://www.postgresql.org/docs/12/sql-copy.html)?
How can I run something like that with DataGrip?
BEGIN;
CREATE TEMPORARY TABLE temp_json(values text) ON COMMIT DROP;
COPY temp_json FROM 'MY_FILE.JSON';
SELECT values->>'aJsonField' as f
FROM (select values::json AS values FROM temp_json) AS a;
COMMIT;
I tried replacing 'MY_FILE.JSON' with the full path, with a parameter (?), putting the file in the SQL directory, etc.
The DataGrip answer is:
[2021-05-05 10:30:45] [58P01] ERROR: could not open file '...' for reading : No such file or directory
EDIT:
I know why. RTFM! -_-
COPY with a file name instructs the PostgreSQL server to directly read from or write to a file. The file must be accessible by the PostgreSQL user (the user ID the server runs as) and the name must be specified from the viewpoint of the server.
Sorry.....
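For anyone else hitting this: the client-side workaround is psql's \copy, which reads the file from the machine psql runs on. Keeping the rest of the script unchanged, the COPY line would become something like the following, where 'MY_FILE.JSON' stays a placeholder for a path on the client machine:

\copy temp_json FROM 'MY_FILE.JSON'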

How do I load a CSV into AWS RDS using an Airflow Postgres hook?

I'm trying to use the copy_expert method of the Postgres hook documented here: https://airflow.apache.org/docs/stable/_modules/airflow/hooks/postgres_hook.html
but I don't understand the syntax and I don't have an example to follow. My goal is to load a CSV into an AWS RDS instance running Postgres.
hook_copy_expert = airflow.hooks.postgres_hook.PostgresHook('postgres_amazon')

def import_to_postgres():
    sql = f"DELETE FROM amazon.amazon_purchases; COPY amazon.amazon_purchases FROM '{path}' DELIMITER ',' CSV HEADER;"
    hook_copy_expert(sql, path, open=open)

t4 = PythonOperator(
    task_id='import_to_postgres',
    python_callable=import_to_postgres,
    dag=dag,
)
When I run this, I get an error saying name 'sql' is not defined. Can someone help me understand what I'm doing wrong?
Edit: I got the hook to run but I got an error:
ERROR - must be superuser or a member of the pg_read_server_files role to COPY from a file
HINT: Anyone can COPY to stdout or from stdin. psql's \copy command also works for anyone.
I thought the whole point of using the Postgres hook was to use the COPY command in SQL without having superuser status? What am I doing wrong?
You can't run server-side COPY FROM a file on RDS (you have no access to the instance's filesystem), and you can't run psql's \copy from a PostgreSQL operator.
Unless it's an enormous file, try loading the CSV data into memory with the Python csv module and then inserting it into the DB.
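For completeness, the HINT in the error refers to the stream form of the command, which any client may run against RDS, for example:

COPY amazon.amazon_purchases FROM STDIN DELIMITER ',' CSV HEADER

copy_expert works by sending the contents of a local file as exactly this kind of stream, which is why it needs no superuser rights or server-side file access.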

PostgreSQL unable to read CSV files on my desktop

I am trying to import a CSV file into PostgreSQL; however, I keep getting an error that no such file or directory exists.
This is the line of code I execute:
copy mu_data from 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER;
Can anyone suggest how to fix this?
COPY is a command run on the server side, so unless your Postgres server happens to be on your localhost, the file very likely doesn't exist from the server's point of view.
So one solution is to transfer the file to the server's filesystem somehow. Or, if you're using the psql command-line tool (or at least can use it for this task), you can use the \copy command there.
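For example, in psql (a sketch assuming the file sits in the usual macOS location, hence the absolute path with a leading slash):

\copy mu_data FROM '/Users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER

\copy resolves the path on the client machine, so no server-side file access is required.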

How to import CSV data into a Postgres table

I tried to import CSV file data into a Postgres table by running the following line as a pgScript in pgAdmin:
\copy users_page_rank FROM E'C:\\Users\\GamulinN\\Desktop\\users page rank.csv' DELIMITER ';' CSV
It returned an error:
[ERROR ] 1.0: syntax error, unexpected character
Does anyone know what could be wrong here? I checked this post but couldn't figure out what the problem is.
To import a file into Postgres with COPY, you need one of the following:
1) Connect with psql to the DB and run your command:
\copy users_page_rank FROM E'C:\\Users\\GamulinN\\Desktop\\users page rank.csv' DELIMITER ';' CSV
It will copy the file from the current computer to the table. Details here.
2) Connect with any tool to the DB and run this SQL script:
COPY users_page_rank FROM E'C:\\Users\\GamulinN\\Desktop\\users page rank.csv' DELIMITER ';' CSV
It will copy the file from the server running Postgres to the table. Details here. (The file path is interpreted on the server and the file must be readable by the PostgreSQL server's OS user, so you will need to transfer the file to the server first.)

Importing CSV file into PostgreSQL

Using the MySQL Administrator GUI tool, I exported some data tables retrieved from an SQL dump file to CSV files.
I then tried to import these CSV files into a PostgreSQL database using the Postgres COPY command. I've tried entering
COPY articles FROM '[insert .csv dir here]' DELIMITERS ',' CSV;
and also the same command without the delimiters part.
I get an error saying
ERROR: invalid input syntax for integer: "id"
CONTEXT: COPY articles, line 1, column id: "id"
In conclusion, my question is: what are some thoughts on and solutions to this problem? Could it be something to do with the way I created the CSV files, or have I made a rookie mistake elsewhere?
If you have header columns, just add the HEADER qualifier to the COPY statement to skip that line, as per the documentation:
http://www.postgresql.org/docs/8.4/static/sql-copy.html
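Applied to the command from the question (keeping its placeholder path), that would be:

COPY articles FROM '[insert .csv dir here]' DELIMITERS ',' CSV HEADER;

With HEADER, the server skips the first line instead of trying to parse "id" as an integer.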