Syntax to execute command in pentaho kettle shell - postgresql

"CMD.EXE /C " psql -h ipaddress -d dbname -u user -p password -c "\copy table to 'd:/bcptest/file.csv' with delimiter as '|'"
I have to execute this command in the Pentaho shell, but it shows a parse error in the script. With this command I have to copy data from a remote Postgres table and save it as a .csv file locally. Please help.

You do not have to write any code inside Pentaho to transfer data.
Just create a sample transformation in Spoon with a Table Input step, and create your database connection to the remote server in that step.
Then use a Text File Output step to store your result.
Understand the concept of an ETL tool before implementing anything.
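That said, if you do still need the Shell-step route from the question, the parse error comes from the stray quotes around CMD.EXE /C and from flags psql does not accept: the user name is passed with uppercase -U, and -p is the port, not the password. A corrected sketch for a Windows shell step (assuming psql.exe is on PATH; ipaddress, dbname, user, password, and the table name are the question's own placeholders):
REM psql has no password flag; supply it through the PGPASSWORD environment variable
SET PGPASSWORD=password
psql -h ipaddress -d dbname -U user -c "\copy table to 'd:/bcptest/file.csv' with delimiter as '|'"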

Related

A PowerShell script to read data from a Postgres table

I am creating a PowerShell script to read data from a Postgres DB,
but any lines after the psql.exe command do not run.
After the psql.exe line the console asks for the password and then does nothing; only when I press Ctrl+C do the other lines get executed.
I tried using Start-Job, but then I am unable to read the output of my SELECT command; it only returns a line such as
Job started: System.Management.Automation.PSRemotingJob, stating that the job has started.
I also tried Invoke-Command, but that didn't help either.
Can anyone help me with a simple sample that explains how to supply the password to psql.exe and how to read the output of the SELECT command?
I am sharing the approach that worked for me:
$env:PGPASSWORD = 'password'   # avoids the interactive password prompt
# $psql holds the path to psql.exe
$result = Write-Output "SELECT * FROM public.table_name" | & $psql -h 127.0.0.1 -p 5432 -U postgres -d database_name
Now you can access the output of the SELECT from the $result variable.
You can iterate over $result with a foreach loop to read each row.
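For comparison, here is the same capture-and-iterate pattern from a POSIX shell; this is only a sketch, assuming psql is on PATH and reusing the same placeholder connection values:
#!/bin/sh
# PGPASSWORD avoids the interactive prompt (a ~/.pgpass file also works)
export PGPASSWORD='password'
# --tuples-only/--no-align strip headers and padding, so each output line is one row
result=$(psql -h 127.0.0.1 -p 5432 -U postgres -d database_name \
             --tuples-only --no-align -c "SELECT * FROM public.table_name")
# read the rows back one line at a time
printf '%s\n' "$result" | while IFS= read -r row; do
    echo "row: $row"
done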

Not able to dump data to sql file

I am trying to take a backup of a table as an SQL file in Postgres with the command below. The command is not executing properly. Can someone help?
pg_dump -U postgres-a test_1 > test_1_dump.sql
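The likely problem is the missing space between the username and the -a flag, which makes pg_dump treat postgres-a as the user name. A corrected sketch, assuming test_1 is the database (-a dumps data only; use -t instead to dump a single table, where mydb is a hypothetical database name):
# data-only dump of the test_1 database
pg_dump -U postgres -a test_1 > test_1_dump.sql
# or: back up just the table test_1 from database mydb (schema plus data)
pg_dump -U postgres -t test_1 mydb > test_1_dump.sql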

How to run postgres sql script from another script?

I have a bunch of SQL scripts that create tables in the database. Each table is located in a separate file so that editing them is much easier.
I wanted to prepare a single SQL script that will create the full schema, create tables, insert test data and generate sequences for the tables.
I was able to do such a thing for an Oracle database, but I am having problems with Postgres.
The thing is, I do not know how to run the table-creating script from another script.
In Oracle I do it using the following syntax:
@@'path of the script relative to the path of the currently running sql file'
And everything works like a charm.
In postgres I was trying to search for something alike and found this:
\ir 'relative path to the file'
Unfortunately when I run my main script I get the message:
No such file or directory.
The example call is here:
\ir './tables/map_user_groups.sql'
I use Postgres 9.3. I tried to run the script using psql:
psql -U postgres -h localhost -d postgres < "path to my main sql file"
The file executes fine except for the calling of those other scripts.
Does anybody know how to solve the problem?
If something in the question is unclear - just let me know :)
Based on the answer It is possible to reference another SQL file from SQL script: on PostgreSQL you can include other SQL files just by using the \i syntax. I just tested it, and it works well on PostgreSQL 9.6:
\i other_script.sql
SELECT * FROM table_1;
SELECT * FROM table_2;
Per the @wildplasser comment:
psql -U postgres -h localhost -d postgres < "path to my main sql file"
From psql's perspective the main SQL file is just stdin, and stdin has no "filename". Use -f filename to submit a file with a name:
psql -U postgres -h localhost -d postgres -f filename.sql
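Putting the two together: once the main script is submitted with -f, \ir resolves each include relative to the directory of the script being read, so the layout from the question works. A minimal sketch (the file and table names echo the question; the final SELECT is just a hypothetical sanity check):
-- main.sql, run as: psql -U postgres -h localhost -d postgres -f main.sql
\ir ./tables/map_user_groups.sql
-- hypothetical check that the included script created its table
SELECT * FROM map_user_groups LIMIT 1;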
This seems to be the same question as: How to import external sql scripts from a sql script in PostgreSQL?

PostgreSQL - read an SQL file into a PostgreSQL database from the command line

I use Ruby to generate a bunch of SQL commands, and store this into a file.
I then login to my PostgreSQL database. Then I do something like:
\i /tmp/bla.sql
And this populates my database.
This all works fine as it is, no problem here.
I dislike the manual part where I have to use \i, though (because I need this to work in a cron job eventually, and I think commands like \i are only available when you are directly in the interactive psql prompt).
So my question now is:
Is it possible to use a psql command from the command line that directly will start to read in an external file?
You can use the psql command directly, as shown below.
It works for me on Ubuntu and Mint; on Windows it should be much the same...
psql -U user -d database -f filepath
Example:
psql -U postgres -d testdb -f /home/you/file.sql
For more information take a look at the official documentation: http://www.postgresql.org/docs/current/static/app-psql.html
When you try to execute an SQL file using cron, you will also need to set up the environment: database name, password, etc. This is a short shell script snippet that does it all:
# load the connection settings (PGHOST, PGPORT, ...) exported by the env file
source /var/lib/pgsql/scripts/.pgenv
echo $PATH
# everything up to the closing AAA marker is sent to psql as SQL
psql << AAA
select current_date;
select sp_pg_myprocedure(current_date);
AAA
In .pgenv, you set values such as:
export PGPORT=<yourport>
export PGHOST=<yourhost>
export PGDATA=<yourdatadir>
Also have a .pgpass file so that the password is supplied.
http://www.postgresql.org/docs/current/static/libpq-pgpass.html
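For reference, .pgpass takes one connection per line in the form hostname:port:database:username:password, and libpq ignores the file unless its permissions are 0600; the values below are placeholders:
echo '192.168.1.1:5432:mydb:myuser:secret' >> ~/.pgpass
chmod 600 ~/.pgpass   # libpq skips the file if it is group- or world-readable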
Replace the part where the SELECT is being done with whatever you want to do, or do it as @Kuchi has shown.

Bulk loading into PostgreSQL from a remote client

I need to bulk load a large file into PostgreSQL. I would normally use the COPY command, but this file needs to be loaded from a remote client machine. With MSSQL, I can install the local tools and use bcp.exe on the client to connect to the server.
Is there an equivalent way for PostgreSQL? If not, what is the recommended way of loading a large file from a client machine if I cannot copy the file to the server first?
Thanks.
The COPY command is supported in PostgreSQL protocol v3.0 (PostgreSQL 7.4 or newer).
The only thing you need to use COPY from a remote client is a libpq-enabled client such as the psql command-line utility.
From the remote client run:
$ psql -d dbname -h 192.168.1.1 -U uname < yourbigscript.sql
You can use the \copy command from the psql tool, like:
psql -h IP_REMOTE_POSTGRESQL -d DATABASE -U USER_WITH_RIGHTS -c "\copy TABLE(FIELD_LIST_SEPARATE_BY_COMMA) from 'FILE_IN_CLIENT_MACHINE(MAYBE IN THE SAME DIRECTORY)' with csv header"
Assuming you have some sort of client in order to run the query, you can use the COPY FROM STDIN form of the COPY command: http://www.postgresql.org/docs/current/static/sql-copy.html
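A minimal sketch of that form with psql (the connection values and the table name are placeholders): when the statement is COPY ... FROM STDIN, psql feeds its own standard input to the server, so the file only needs to exist on the client:
# stream a client-side CSV straight into the remote table
cat /path/to/file.csv | psql -h 192.168.1.1 -d dbname -U uname \
    -c "COPY tablename FROM STDIN WITH (FORMAT csv, HEADER)"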
Use psql's \copy command to load the data:
$ psql -h <IP> -p <port> -U <username> -d <database>
database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' delimiter as '|'
database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' with csv header