Import CSV to Azure PostgreSQL without Superuser - azure-postgresql

I'm trying to upload a CSV file into Azure PostgreSQL. However, when I run the COPY command, I get the following error:
ERROR: must be superuser to COPY to or from a file
However, Azure PostgreSQL doesn't allow users to be superusers. How can I import the file?

Have you tried the \copy command available in psql? I'm fairly sure it allows reading/writing files local to the client and doesn't require you to be a superuser. See https://codeburst.io/two-handy-examples-of-the-psql-copy-meta-command-2feaefd5dd90
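For example, a minimal sketch (the table and file names here are placeholders, not from the question):
\copy mytable FROM 'mydata.csv' WITH (FORMAT csv, HEADER)
Under the hood, \copy issues COPY ... FROM STDIN and streams the file from the client connection, so no server-side file access (and no superuser role) is needed.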

Related

How do I launch postgresql on terminal without using homebrew? (SQL state: 42501, trying to import CSV to pgAdmin4 with COPY)

I am unable to import a CSV into Postgres (v12) using pgAdmin4 and the COPY command.
COPY revenue
FROM '../Desktop/Home/revenue.csv'
DELIMITER ','
CSV HEADER;
The result I get is:
ERROR: could not open file "../Desktop/Home/revenue.csv" for reading: Permission denied
HINT: COPY FROM instructs the PostgreSQL server process to read a file. You may want a client-side facility such as psql's \copy.
SQL state: 42501
I found the article
How to import a CSV file into PostgreSQL using Mac
but it was unsolved, and I already tried giving the user 'postgres' read and write permissions. Maybe I did it wrong: I right-clicked on the CSV file, and I also tried granting privileges in pgAdmin4 (properties, default privileges).
I found the solution below suggesting I launch Postgres from the command line:
Postgres ERROR: could not open file for reading: Permission denied
However, everything I read about launching Postgres on a Mac via the command line involves Homebrew, and I didn't install PostgreSQL with Homebrew.
- Can I launch Postgres on a Mac without using Homebrew?
- If I install Homebrew, will it work to launch PostgreSQL even though I didn't use it to install Postgres initially?
- Is there another way to use the COPY command via pgAdmin? (I tried using DBeaver but it wouldn't connect to my database.)
I am super lost.
Thank you!
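(For what it's worth, the HINT above already names the fix: psql's client-side \copy reads the file with your own OS permissions rather than the server's. Using the same table and path from the question, that would be something like:
\copy revenue FROM '../Desktop/Home/revenue.csv' WITH (FORMAT csv, HEADER)
run from psql rather than the pgAdmin4 query tool; pgAdmin4's own Import/Export dialog does the client-side equivalent.)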

How to grant pg_read_server_files OR an alternative on AWS RDS PostgreSQL

I have a Python script that modifies a PostgreSQL database using a CSV file. The script, the CSV, and the database used to be on the same private server. I'd like to migrate to AWS, so I'm running some experiments. My first attempt is to run the script on my server against the new PostgreSQL database created with RDS. Everything seems to work, except that my "main" database user doesn't have the pg_read_server_files role. As far as I understand this is normal, but I don't know what I should do to accomplish my goal.
Any ideas?
Thanks for reading
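(A note on the usual workaround: RDS never grants server-side file roles, so the standard approach is to stream the CSV over the client connection instead with COPY ... FROM STDIN, which needs no special role. From psql that's the \copy meta-command — table and file names below are placeholders:
\copy mytable FROM 'data.csv' WITH (FORMAT csv, HEADER)
Python PostgreSQL drivers expose the same COPY FROM STDIN mechanism, so the existing script can be adapted without server-side file access.)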

Export Database from Google Cloud SQL to external Database

I'm trying to export my database created in Google Cloud SQL and import it into a new external server.
I created a SQL backup through the Google console, downloaded it, copied it to the new server via FileZilla, and then ran the following command:
psql -U postgres -d ciclods-db -1 -f Backup-db_Cloud_SQL_Export_2019-03-23\ \(17_01_19\)
but I get this output:
ERROR: role "cloudsqladmin" does not exist
REVOKE
ERROR: role "cloudsqlsuperuser" does not exist
GRANT
What is the right procedure to follow in these cases?
I resolved the same problem by locating and deleting the two lines in the exported SQL file that mention "cloudsqladmin". My app doesn't use that role anyway.
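For example, a one-liner along these lines (GNU sed; dump.sql is an illustrative name for the exported file):
sed -i '/cloudsqladmin/d; /cloudsqlsuperuser/d' dump.sql
This drops every line mentioning either role before the import; in a stock export those are typically just the REVOKE/GRANT statements seen in the errors above.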
To do this, you can follow the official GCP guide on how to export data from Cloud SQL [1]. That document gives you the option to export the data as a dump file or as CSV files, which can be used with other tools.
https://cloud.google.com/sql/docs/mysql/import-export/exporting
In order to create the export file, you have to do it from the command line and use additional flags. Per the documentation's "Exporting data to a SQL dump file" page, there is also a section on exporting data from an externally-managed database server.
You can also find there the option to export the data into a CSV file.
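For reference, a command-line export along those lines would look like this (the instance and bucket names are placeholders; the database name is taken from the question):
gcloud sql export sql my-instance gs://my-bucket/ciclods-db-backup.sql --database=ciclods-db
gsutil cp gs://my-bucket/ciclods-db-backup.sql .
The first command writes the dump to a Cloud Storage bucket; the second downloads it to the current directory.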

Export Postgres table to csv

I am trying to export my Postgres table to a CSV on my desktop, and I get this error:
ERROR: could not open file "C:\Users\blah\Desktop\countyreport.csv" for writing: Permission denied
SQL state: 42501
This is my query, which I believe has the correct syntax:
COPY countyreport TO 'C:\\Users\\blah\\Desktop\\countyreport.csv' DELIMITER ',' CSV HEADER;
According to the user manual:
Files named in a COPY command are read or written directly by the
server, not by the client application.
https://www.postgresql.org/docs/current/static/sql-copy.html
The common mistake is to assume that filesystem access will happen as the (client) user, but it doesn't. It's normal to run the PostgreSQL server as its own user, so actions carried out by the server are done as a different OS user than the client. The server usually runs as the OS user postgres.
Assuming that you are running the server on your local machine, the simplest fix is to give postgres access to your home directory or desktop. This can be done by changing the Windows security settings on your home directory.
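For example, from an elevated command prompt (the service account name here is an assumption — check which account your PostgreSQL service actually runs under):
icacls "C:\Users\blah\Desktop" /grant "NETWORK SERVICE:(OI)(CI)M"
This grants the service account modify rights on the Desktop folder and everything inside it.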
Before you do this... stop and think. Is this really what you are looking for? If the server is in development, will it always run on the user's machine? If not, you may need COPY to write to STDOUT instead. See the manual for information on this.
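A third option, not mentioned above: psql's client-side \copy writes the file as your own OS user rather than as the server, so the permission problem disappears entirely. Using the table and path from the question (forward slashes avoid psql's backslash-escape handling, and Windows accepts them):
\copy countyreport TO 'C:/Users/blah/Desktop/countyreport.csv' WITH (FORMAT csv, HEADER)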

Can PostgreSQL COPY read CSV from a remote location?

I've been using JDBC with a local Postgres DB to copy data from CSV files into the database with the Postgres COPY command. I use Java to parse an existing CSV file into a CSV format that matches the tables in the DB. I then save this parsed CSV to my local disk and have JDBC execute a COPY command using the parsed CSV against my local DB. Everything works as expected.
Now I'm trying to perform the same process on a Postgres DB on a remote server using JDBC. However, when JDBC tries to execute the COPY I get
org.postgresql.util.PSQLException: ERROR: could not open file "C:\data\datafile.csv" for reading: No such file or directory
Am I correct in understanding that the COPY command tells the DB to look for this file locally? I.e., the remote server is looking on its own C: drive (where the file doesn't exist).
If this is the case, is there any way to tell the COPY command to look on my computer rather than "locally" on the remote machine? Reading through the COPY documentation, I didn't find anything that indicated this functionality.
If the functionality doesn't exist, I'm thinking of just populating the whole database locally and then copying the database to the remote server, but I wanted to check that I wasn't missing anything.
Thanks for your help.
Create your SQL file as follows on your client machine:
COPY testtable (column1, c2, c3) FROM STDIN WITH CSV;
1,2,3
4,5,6
\.
Then execute, on your client:
psql -U postgres -f /mylocaldrive/copy.sql -h remoteserver.example.com
If you use JDBC, the best solution for you is to use the PostgreSQL COPY API:
http://jdbc.postgresql.org/documentation/publicapi/org/postgresql/copy/CopyManager.html
Otherwise (as already noted by others) you can use \copy from psql, which allows accessing local files on the client machine.
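For example, run from the client machine (the table definition is borrowed from the answer above, the file path from the question; forward slashes sidestep psql's backslash-escape handling):
\copy testtable (column1, c2, c3) FROM 'C:/data/datafile.csv' WITH (FORMAT csv)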
To my knowledge, a COPY command that names a file can only read files on the machine where the database server is running; reading from the client side goes through COPY FROM STDIN instead.
You could make a shell script that runs the Java conversion and then uses psql to do a \copy command, which reads from a file on the client machine.
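A sketch of such a script, with hypothetical names for the converter jar, files, and database:
#!/bin/sh
# convert the raw CSV into the table's layout (csv-converter.jar is a stand-in for your Java step)
java -jar csv-converter.jar raw.csv parsed.csv
# stream the parsed file from this machine with psql's client-side \copy (host/db names are placeholders)
psql -h remoteserver.example.com -U postgres -d mydb -c "\copy testtable FROM 'parsed.csv' WITH (FORMAT csv)"
psql processes backslash meta-commands inside -c strings, so the \copy runs client-side and streams parsed.csv over the connection.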