I am migrating Oracle code to PostgreSQL, where I need to append query output to an existing log file.
Basically, I want the equivalent of the Oracle command "SPOOL test.log APPEND" in PostgreSQL. Is there a way to do that?
I tried to append new data to the log file using \o or \o+ or COPY in PostgreSQL, but it overwrites the log file.
My code is something like this:
Oracle:
spool test.log
select uid from users where uid='1111';
spool off
select sysdate from dual;
//other business logic code
-
spool test.log append
select balance from balances where uid='1111';
spool off
Postgresql:
\o test.log
select uid from users where uid='1111';
\o
select current_date;
//other business logic code
-
\o test.log
select balance from balances where uid='1111';
\o
I want the two queries in the \o blocks to append to the same file in PostgreSQL.
You could use
\o | cat >> test.log
on UNIX platforms.
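The >> redirection is what makes this work: each time \o reopens the pipe, cat appends its standard input to test.log instead of truncating it. The effect can be sketched in plain shell, with printf standing in for the query output of each \o block:

```shell
# Each pipeline below stands in for one "\o | cat >> test.log" block:
# "cat >>" opens test.log in append mode, so earlier contents survive.
printf 'uid output\n'     | cat >> test.log
printf 'balance output\n' | cat >> test.log
cat test.log   # shows both lines
```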
I'm attempting to dynamically create a script that gets saved as a bat file that will be scheduled to execute daily via Windows Task Scheduler. The script performs full database backups for each Postgres database using pg_dump.
The current script is as follows:
COPY (SELECT 'pg_dump '|| datname || ' > e:\postgresbackups\FULL\' || datname || '_%date:~4,2%-%date:~7,2%-%date:~10,4%_%time:~0,2%_%time:~3,2%_%time:~6,2%.dump' FROM pg_database) TO 'E:\PostgresBackups\Script\FULL_Postgres_Backup_Job_TEST.bat' (format csv, delimiter ';');
An example of the output is as follows:
pg_dump postgres > e:\postgresbackups\FULL\postgres_%date:~4,2%-%date:~7,2%-%date:~10,4%%time:~0,2%%time:~3,2%_%time:~6,2%.dump
I need help with updating my code so that the output will include double quotes around the name of the dump file; however, when I add them to my COPY script, it adds more than what is necessary to the output. I would like the output to look like the following, which includes the double quotes:
pg_dump postgres > "e:\postgresbackups\FULL\postgres_%date:~4,2%-%date:~7,2%-%date:~10,4%%time:~0,2%%time:~3,2%_%time:~6,2%.dump"
Any help would be greatly appreciated!
Thanks to @Mike Organek's comment, my issue has been resolved by switching the format from CSV to TEXT. Now, when I enclose the dump filename in double quotes, the output is closer to what is expected and works as intended. The only odd thing is that the output contains a doubled backslash in the filename. My code has been updated as follows:
COPY (SELECT 'pg_dump '|| datname || ' > "e:\postgresbackups\FULL\' || datname || '_%date:~4,2%-%date:~7,2%-%date:~10,4%_%time:~0,2%_%time:~3,2%_%time:~6,2%.dump"' FROM pg_database) TO 'E:\PostgresBackups\Script\FULL_Postgres_Backup_Job.bat' (format text, delimiter ';');
An example of the output that gets created within the bat file is as follows:
pg_dump postgres > "e:\\postgresbackups\\FULL\\postgres_%date:~4,2%-%date:~7,2%-%date:~10,4%_%time:~0,2%_%time:~3,2%_%time:~6,2%.dump"
As you can see, it adds a double backslash: in TEXT format, COPY escapes each backslash in the output as \\. Windows accepts the doubled backslashes in the path, so the pg_dump executes successfully!
DB2 command to Postgres command.
db2 IMPORT FROM test.csv OF DEL MODIFIED BY USEDEFAULTS COMMITCOUNT 100000 "INSERT_UPDATE INTO TEST.person (name,old,sex)" > ${TEMPTXT}
How can I use a Postgres command to do the same thing as this DB2 command, i.e. import from a file and insert/update the table?
Postgres has COPY, but it doesn't perform updates. So, first run COPY into a TEMP table and then merge into the main table.
For a comma delimiter,
CREATE TEMP TABLE tmp_person (name text, old text, sex text);
-- ^ temp tables cannot be schema-qualified, so the TEST. prefix is dropped
COPY tmp_person FROM 'test.csv' WITH (FORMAT csv, DELIMITER ',');
There are various techniques in Postgres to do INSERT_UPDATE or merge; refer to this post. Use proper casting to the appropriate target data types while inserting/updating.
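For example, on PostgreSQL 9.5 or later the merge step can be written with INSERT ... ON CONFLICT. A minimal sketch, assuming name is the unique key of TEST.person (the key column is an assumption based on the DB2 command above; adjust it to your actual constraint):

```sql
-- Upsert the staged rows into the main table; ON CONFLICT needs a
-- unique constraint or index on TEST.person(name) to target.
INSERT INTO TEST.person (name, old, sex)
SELECT name, old, sex
FROM tmp_person
ON CONFLICT (name) DO UPDATE
SET old = EXCLUDED.old,
    sex = EXCLUDED.sex;
```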
I am using the \copy command to migrate my data, but the table is 30GB and it is taking hours to migrate. Can I use a WHERE clause so that I migrate only the data that was available a month back?
\copy hotel_room_types TO | (select hotel_room_types.* from hotel_room_types limit 1) $liocation CSV DELIMITER ',';
ERROR: syntax error at or near "."
LINE 1: ...otel_room_types TO STDOUT (select hotel_room_types.* from h...
You can specify a query with psql's \copy just like you can with the SQL command COPY:
\copy (SELECT ... WHERE ...) TO 'filename'
After all, \copy just calls COPY ... TO STDOUT under the hood.
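Applied to the question, a filter on a date column could look like this (created_at is an assumed column name; use whatever timestamp column hotel_room_types actually has). Note that \copy must be written on a single line:

```sql
\copy (SELECT * FROM hotel_room_types WHERE created_at < now() - interval '1 month') TO 'hotel_room_types.csv' WITH CSV DELIMITER ','
```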
I have SAS code that will write to a Postgres table if it is already created but still empty. How can I create/alter a Postgres table from SAS (or using a script that pulls in SAS macro variables) if it does not exist or already has data? The number of fields may change. Currently, I use the SAS filename statement with a pipe to write to the Postgres table.
filename pgout pipe %unquote(%bquote(')/data/dwight/IFS6.2/app/PLANO/sas_to_psql.sh
%bquote(")&f_out_schema.%bquote(").&file_name.
%bquote(')
)
;
I've tried using this version, but it does not work:
filename pgout pipe %unquote(%bquote(')/data/dwight/IFS6.2/app/PLANO/sas_to_psql.sh
%bquote('')CREATE TABLE mdo_backend.fob_id_desc
SELECT * FROM &library_name..&file_name.
%bquote(")&f_out_schema.%bquote(").&file_name./('')/
%bquote(')
)
;
This is the script I use:
LOAD_TO_PSQL.SH
#!/bin/bash
. /data/projects/ifs/psql/config.sh
psql -d $DB -tAq -c "COPY $1 FROM STDIN USING DELIMITERS '|'"
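The script above only loads data; it never creates the table. One way to cover the "create it if it does not exist" part is to run CREATE TABLE IF NOT EXISTS (available since PostgreSQL 9.1) before the COPY. A sketch of an extended script, assuming the column definitions are passed as a second argument (that argument convention, and the $DB variable coming from config.sh, are assumptions):

```shell
#!/bin/bash
# Usage: load_to_psql.sh schema.table 'col1 type1, col2 type2, ...' < data.psv
# Creates the target table if it is missing, then bulk-loads stdin into it.
. /data/projects/ifs/psql/config.sh
psql -d "$DB" -tAq -c "CREATE TABLE IF NOT EXISTS $1 ($2)"
psql -d "$DB" -tAq -c "COPY $1 FROM STDIN USING DELIMITERS '|'"
```

Note that CREATE TABLE IF NOT EXISTS does not reconcile a changed column list; since the number of fields may change, a DROP TABLE IF EXISTS followed by CREATE TABLE may be the simpler route for a staging table.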
Is it possible in the psql console to export a file with the current date at the end of the file name?
The name of the exported file should be like this: table_20140710.csv. Is it possible to do this dynamically? The format of the date can be different from the above; that isn't so important.
This is an example of what I mean:
\set curdate current_date
\copy (SELECT * FROM table) To 'C:/users/user/desktop/table_ ' || :curdate || '.csv' WITH DELIMITER AS ';' CSV HEADER
The fact that the \copy meta-command does not expand variables is (meanwhile) documented:
Unlike most other meta-commands, the entire remainder of the line is always taken to be the arguments of \copy, and neither variable interpolation nor backquote expansion are performed in the arguments.
To workaround you can build, store and execute the command in multiple steps (similar to the solution Clodoaldo Neto has given):
\set filename 'my fancy dynamic name'
\set command '\\copy (SELECT * FROM generate_series(1, 5)) to ' :'filename'
:command
With this, you need to double (escape) the \ in the embedded meta-command. Keep in mind that \set concatenates all further arguments into the second one, so quote spaces between the arguments. You can show the command before execution with \echo :command.
As an alternative to the local \set command, you could also build the command server side with SQL (the best way depends on where the dynamic content is originating):
SELECT '\copy (SELECT * FROM generate_series(1, 5)) to ''' || :'filename' || '''' AS command \gset
Dynamically build the \copy command and store it in a file. Then execute it with \i
First set tuples only output
\t
Set the output to a file
\o 'C:/users/user/desktop/copy_command.txt'
Build the \copy command
select format(
$$\copy (select * from the_table) To 'C:/users/user/desktop/table_%s.csv' WITH DELIMITER AS ';' CSV HEADER$$
, current_date
);
Restore the output to stdout
\o
Execute the generated command from the file
\i 'C:/users/user/desktop/copy_command.txt'