query database and export via txt file - firebird

I need to run a SELECT query against a Firebird database and export the result to a .txt file.
Is there a way to do this via SQL, or from the command line?
I'd appreciate any help.
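One option, assuming you can use Firebird's command-line tool isql (the database path, credentials, table name and output location below are all placeholders): inside an isql session, the OUTPUT command redirects everything isql prints to a file, and a bare OUTPUT switches back to the screen.

isql -user SYSDBA -password masterkey localhost:C:\data\mydb.fdb
OUTPUT C:\export\result.txt;
SELECT * FROM customers;
OUTPUT;

For a non-interactive run, isql also accepts -i script.sql to read the statements from a file and -o result.txt to write the output to a file.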

Related

Function/procedure to save a schema as an .sql file in PostgreSQL

I want to create a procedure which, when called, creates a backup by generating an .sql file and saving it on my computer.
The procedure's name is trial_gen(). When I execute call trial_gen(), it should create a plain .sql file of the schema.
All solutions I found were only using the SQL shell
SQL code is a script, so I think it makes sense to run it from the SQL shell; it would be a stored script (text) in a file anyway.
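For what it's worth, a server-side function can't easily write a file onto your own computer, which is why the suggestions end up going through a client tool. If a plain shell command is acceptable, pg_dump produces the schema-only .sql file directly; a minimal sketch, assuming the database is named mydb (add host/user options as needed):

pg_dump --schema-only --file=schema_backup.sql mydb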

Creating Batch Files with PostgreSQL \copy Command in Jetbrains Datagrip

I'm familiarizing myself with the standalone version of Datagrip and having a bit of trouble understanding the different approaches to composing SQL via console, external files, scratch files, etc.
I'm managing by referencing the documentation, and I'm happy to figure things out as I go.
However, I'm trying to ingest CSV data into tables via batch files using the Postgres \copy command. Datagrip will execute this command without error but no data is being populated.
This is my syntax, composed and ran in the console view:
\copy tablename from 'C:\Users\username\data_file.txt' WITH DELIMITER E'\t' csv;
Note that the data is tab-separated and stored in a .txt file.
I'm able to use the import functions of DataGrip (via the context menu) just fine, but I'd like to understand how to issue commands that do the same thing.
\copy is a command of the command-line PostgreSQL client psql.
I doubt that Datagrip invokes psql, so it won't be able to use \copy or any other “backslash command”.
You probably have to use DataGrip's import facilities, or start using psql.
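For reference, a rough sketch of the same import run through psql itself (host, user and database names are placeholders); because \copy is executed by psql, the Windows path is read on your machine, not on the server:

psql -h localhost -U myuser -d mydb
\copy tablename from 'C:\Users\username\data_file.txt' WITH DELIMITER E'\t' CSV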
OK, but what about the SQL COPY command (https://www.postgresql.org/docs/12/sql-copy.html)?
How can I run something like that with DataGrip?
BEGIN;
CREATE TEMPORARY TABLE temp_json("values" text) ON COMMIT DROP;
COPY temp_json FROM 'MY_FILE.JSON';
SELECT "values"->>'aJsonField' as f
FROM (select "values"::json AS "values" FROM temp_json) AS a;
COMMIT;
I tried replacing 'MY_FILE.JSON' with the full path, with a parameter (?), putting the file in the sql directory, etc.
DataGrip's response is:
[2021-05-05 10:30:45] [58P01] ERROR: could not open file '...' for reading : No such file or directory
EDIT :
I know why. RTFM! -_-
COPY with a file name instructs the PostgreSQL server to directly read from or write to a file. The file must be accessible by the PostgreSQL user (the user ID the server runs as) and the name must be specified from the viewpoint of the server.
Sorry.....
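For the record, one possible workaround when the JSON file lives on your machine rather than on the server is to run the import step through psql's client-side \copy, so the path is resolved locally (a sketch only, keeping the placeholder filename from above):

BEGIN;
CREATE TEMPORARY TABLE temp_json("values" text) ON COMMIT DROP;
\copy temp_json FROM 'MY_FILE.JSON'
SELECT "values"->>'aJsonField' as f
FROM (select "values"::json AS "values" FROM temp_json) AS a;
COMMIT;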

PostgreSQL unable to read CSV files on my desktop

I am trying to import a CSV file into PostgreSQL, however, I keep getting an error that no such file or directory exists.
This is the line of code I execute:
copy mu_data from 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV
HEADER;
Can anyone suggest how to fix this?
copy is a command run on the server side. So unless your Postgres server happens to be on your localhost, the file very likely doesn't exist from the view of the server.
One solution is to transfer the file to the server's filesystem somehow. Or, if you're using the psql command-line tool (or can at least use it for this task), you can use the \copy command there.
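A sketch of that second option, assuming the file really is on a macOS desktop (note the leading slash; the path in the question is relative, which could be part of the problem even on the server). Run inside psql:

\copy mu_data FROM '/Users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER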

How to export table data from PostgreSQL (pgAdmin) to CSV file?

I am using pgAdmin version 4.3 and I want to export one table's data to a CSV file. I used this query
COPY (select * from product_template) TO 'D:\Product_template_Output.csv' DELIMITER ',' CSV HEADER;
but it shows the error:
a relative path is not allowed to use COPY to a file
How can I resolve this problem? Any help please?
From the query editor, once you have executed your query, you just have to click the "Download as CSV (F8)" button or press the F8 key.
Source: pgAdmin 4 Query Toolbar
Use absolute paths or cd to a known location so that you can ignore the path.
For example, cd into your Documents directory and then run the commands there.
Assuming you want to use psql from the command line, it would look like this:
cd ~/Documents && psql -h host -d dbname -U user
\COPY (select * from product_template) TO 'Product_template_Output.csv' DELIMITER ',' CSV HEADER;
The result would be Product_template_Output.csv in your current working directory (the Documents folder).
Again using psql.
If you wrapped the path in double quotes, remove them:
COPY (select * from product_template) TO 'D:\Product_template_Output.csv'
DELIMITER ',' CSV HEADER;
If your pgAdmin instance resides on a remote server, the aforementioned solutions might not be handy if you do not have remote access to the server. In this case, simply select all the query data and copy it, then open an Excel file and paste it in. Simple!
You might have a tough time if the query result is very large, though.
Try this command:
COPY (select * from product_template) TO 'D:\Product_template_Output.csv' WITH CSV;
In pgAdmin, an export option is available in the File menu. Execute the query and view the data in the Output pane, then click FILE -> EXPORT from the query window.
PSQL to export data
COPY noviceusers(code, name) TO 'C:\noviceusers.csv' DELIMITER ',' CSV HEADER;
https://www.novicetechie.com/2019/12/export-postgresql-data-in-to-excel-file.html for reference.
Write your query to select data in the Query Tool and execute it
Click on the download button in the pgAdmin top bar
Rename the file to your liking
Select which folder to save the file
Congrats!!!

Importing CSV file into PostgreSQL

Using the MySQL Administrator GUI tool, I have exported some data tables retrieved from an SQL dump file to CSV files.
I then tried to import these CSV files into a PostgreSQL database using the postgres COPY command. I've tried entering
COPY articles FROM '[insert .csv dir here]' DELIMITERS ',' CSV;
and also the same command without the delimiters part.
I get an error saying
ERROR: invalid input syntax for integer: "id"
CONTEXT: COPY articles, line 1, column id: "id"
In conclusion, my question is: what are some thoughts and solutions to this problem? Could it be something to do with the way I created the CSV files, or have I made a rookie mistake elsewhere?
If your file has a header row, just add the HEADER qualifier to the COPY statement to skip that line, as per the documentation:
http://www.postgresql.org/docs/8.4/static/sql-copy.html
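In other words, something along these lines should skip the header row (the path is a placeholder, and as in the other threads above, COPY with a file name reads it on the server; use psql's \copy instead if the file sits on your client machine):

COPY articles FROM '/path/to/articles.csv' DELIMITER ',' CSV HEADER;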