DBeaver Exporting to CSV

I need to switch from Aginity Workbench to something else for a Redshift database (read-only user), and DBeaver was suggested.
I have searched Google every way I can think of, but I cannot find a way to export a result set to CSV without manually running the script and then exporting.
With Aginity I could run exports from the command line (it opens a SQL file and copies the result set directly to CSV) and batch them up (around 150 or so extracts each morning, so running them manually would take too long).
Does DBeaver have similar functionality? Even a copy-to-CSV SQL script that I can run from inside DBeaver as a select query would do (which is what I currently do with Postgres).

Yes, it does.
Right-click the table (Redshift or Azure) and choose the data export option.
You can then select the format of the output file (.csv or any of the other listed formats).
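For the scripted route the asker mentions (a copy-to-CSV statement run like an ordinary query), a minimal sketch against a PostgreSQL target could look like the one below. The query, path, and table names are placeholders; server-side COPY writes to the database server's filesystem and needs the corresponding privileges (superuser or pg_write_server_files); and Redshift itself does not support COPY TO a local file, its equivalent being UNLOAD to an S3 location.

-- Sketch only (PostgreSQL, not Redshift): export a query's result set to CSV.
-- Path, schema, and columns are placeholders, not names from the question.
COPY (
    SELECT id, created_at, amount
    FROM   reporting.daily_extract
    WHERE  created_at >= CURRENT_DATE - 1
) TO '/tmp/daily_extract.csv'
WITH (FORMAT csv, HEADER);

Because the file lands on the server rather than the client, extracts like this are often driven from psql with \copy instead, which writes the file on the machine running the script and so is easier to batch.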

Related

Exporting script, output and result from SQL Workbench

I want to know how I can export every command from SQL Workbench to a text file, like tee in cmd. I want to capture every command or query that I run, along with its results.

Is there a way to run psql without errors on Ubuntu when a script calls multiple SQL files in a directory?

I am experiencing issues running this psql script in an Ubuntu terminal to map the MIMIC-III database to the OMOP common data model. The command used is:
psql "mimic3" --set=OMOP_SCHEMA="$OMOP_SCHEMA" -f "mimic-omop/etl/etl.sql"
The script stops at the last TRUNCATE TABLE command, where it should call the SQL script titled pg_function, but it gives this error:
psql:mimic-omop/etl/etl.sql:28: etl/pg_function.sql: No such file or directory
I have attached a section of the SQL file below as proof that it really exists.
The last part of the query calls all the SQL files listed in my screenshot below.
I am following the instructions at https://github.com/MIT-LCP/mimic-omop/blob/master/README-run-etl.md
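One plausible cause, offered here only as an assumption since the question does not confirm it, is how psql resolves relative paths in includes: \i paths are resolved against the directory psql was started from, not against the script that contains them, while \ir resolves against the including file's location. A sketch of the difference, using the file names from the question:

-- Inside etl/etl.sql (sketch)
\i  etl/pg_function.sql   -- relative to psql's current working directory;
                          -- fails unless psql is launched from the
                          -- mimic-omop repository root
\ir pg_function.sql       -- relative to the file containing the command,
                          -- i.e. resolved next to etl.sql itself

Under that assumption, starting psql from inside the mimic-omop directory (so the -f argument becomes etl/etl.sql) would also let the existing \i includes resolve without editing the script.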

PostgreSQL COPY imports 0 rows?

I have written a simple batch script which loops over a directory and echoes some details about each file. When I view its results in the CMD terminal or redirect them to a file, I see the output as expected.
The problem comes with PostgreSQL: when I try to import its results into a table by executing the following command:
copy schema.table(field) from program 'C:\\...\\my_bat.bat' with CSV header delimiter E'\t';
it imports 0 rows, whereas if I run the same command pointing to a similar batch file in another directory, it works as expected.
What is going on? I am using Windows.
Update: I have tried running the COPY ... FROM PROGRAM command again on another batch script, and this time only part of the string output is imported.
The postgres service user needs sufficient permissions to run the program.
I remember that it was hard to change settings for that account on Windows XP; I have not tried on more recent versions of Windows, since service users are hidden by most GUI tools.
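As a sketch of what this answer describes rather than a confirmed fix, the same COPY can be written with the option-list syntax, keeping in mind that the batch file runs under the PostgreSQL service account, so that account must be able to execute the script and read everything the script touches (the path below is a placeholder):

-- Sketch: explicit option list; FROM PROGRAM needs superuser or membership
-- in pg_execute_server_program, and the .bat runs as the postgres service user.
COPY schema.table(field)
FROM PROGRAM 'C:\scripts\my_bat.bat'
WITH (FORMAT csv, HEADER, DELIMITER E'\t');

If only part of the output is imported, it is also worth checking what the script prints to stdout when run under that service account, since its working directory and permissions differ from those of the logged-in user.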

How to run a SQL script in InterSystems Caché?

I have a script in a file (mysqlscript.sql) that is basically a bunch of inserts/updates/deletes separated by GO statements:
insert into ....
GO
update .....
GO
How do I run this script?
You can try using $system.SQL.ImportDir().
And of course, you can read the file yourself and execute each SQL query in your program.
The tool Caché Monitor uses GO as a statement separator and connects to InterSystems Caché. With this tool you can execute your script.

Redirect input from another command in Windows batch files

In Linux I can do something like this:
mysql -u user -p=pass somedb <(echo "create database foo;")
How can I do that with Windows batch scripts?
Basically, I want to make a batch file that runs a SQL script without having to keep the script in a separate file.
Thanks
One way is to echo the SQL commands into a file, run the mysql command with the option to read that SQL file, and then remove the file (if you really don't want to keep it).
You can do
echo create database foo;|mysql ...
just fine, but for multiple lines you really want to write a temporary file and pass it to MySQL to read.