Log firing of triggers in postgres 9.1 - postgresql

We have a database with a massive amount of business logic stored in triggers. Is there a way to log the firing of triggers, along with the arguments they were fired with and what they changed?
I have seen a lot of tutorials on how to do table auditing with triggers, but I would like to audit the triggers, not the tables :)

Take one of the examples that do table auditing with triggers. Use their approach to extract the changed data, but instead of writing the data to an audit table, use it in a RAISE NOTICE.
That notice will then be written to the PostgreSQL log file if you set the logging configuration accordingly (log_min_messages = notice).
See the manual for details on RAISE: http://www.postgresql.org/docs/current/static/plpgsql-errors-and-messages.html
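To make the idea concrete, here is a minimal sketch of such a trigger function. The table name, trigger name, and arguments are placeholders; adapt them to your schema. TG_NAME, TG_OP, and TG_ARGV are the standard plpgsql trigger variables, and OLD/NEW are only defined for the operations shown:

```sql
-- Sketch: trace every firing of a trigger via RAISE NOTICE.
CREATE OR REPLACE FUNCTION log_trigger_firing() RETURNS trigger AS $$
BEGIN
    -- Log which trigger fired, on which table, and with which arguments
    RAISE NOTICE 'trigger % fired on %.% (op: %, args: %)',
        TG_NAME, TG_TABLE_SCHEMA, TG_TABLE_NAME, TG_OP, TG_ARGV;

    -- OLD/NEW are only assigned for the matching operations
    IF TG_OP = 'DELETE' THEN
        RAISE NOTICE 'OLD row: %', OLD;
        RETURN OLD;
    ELSIF TG_OP = 'UPDATE' THEN
        RAISE NOTICE 'OLD row: %, NEW row: %', OLD, NEW;
        RETURN NEW;
    ELSE  -- INSERT
        RAISE NOTICE 'NEW row: %', NEW;
        RETURN NEW;
    END IF;
END;
$$ LANGUAGE plpgsql;

-- Placeholder table and arguments; attach one such trigger per audited table
CREATE TRIGGER trace_changes
    BEFORE INSERT OR UPDATE OR DELETE ON my_table
    FOR EACH ROW EXECUTE PROCEDURE log_trigger_firing('some', 'args');
```

With log_min_messages = notice, each statement touching my_table then leaves a line per row in the server log.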

Related

How does DDL replication work in AWS DMS?

Could you please explain how DDL replication works in AWS DMS (in the case of two Postgres databases)? I didn't find an explanation of this process in the official documentation.
As far as I can see, the replication task installs the awsdms_ddl_audit trigger (here is some information about this trigger). This trigger intercepts DDL operations and writes them to the awsdms_ddl_audit table. I don't understand what happens with these intercepted DDL operations after that.
P.S.
I am asking this because I've noticed that DMS applies these DDL operations in the middle of the CDC process, i.e. it doesn't order them along the timeline of CDC changes.
In my case, I update the source database for an hour and at the end remove several columns. DMS removes these columns before the CDC synchronization process has finished.
It's very strange behavior.

Is it recommended to use write ahead logs instead of db triggers for audit logging of table changes?

In a microservices environment using PostgreSQL, what would be the best method of capturing audit logs for table changes?
It needs to not impact DB performance too much, while capturing a reliable log of table changes and/or DDL, along with the time and the person performing each change.
Many thanks in advance!

Optimize the trigger to add audit log

I have a local database, which is the production database, where all operations are done in real time. On each action, a trigger stores a log entry in an audit log table in another database. The trigger checks whether any of a row's columns changed; if so, it removes the audit row and adds it AGAIN (which I don't think is a good approach, as it should simply update the row, but for certain reasons I need to delete and re-insert it).
Some tables see rapid activity, with hundreds of rows being added at a time. This slows down saving the data into the audit log table: if the trigger has to delete 100 rows and insert 100 again, performance obviously suffers, and it degrades further as the row count grows.
What is the best practice for tackling this? I have been looking into read replicas and Foreign Data Wrappers, but a PostgreSQL read replica is read-only, not writable, and I don't really see how a Foreign Data Wrapper would help me (it was suggested by one of my colleagues).
I hope someone can guide me in the right direction.
A log is append-only by definition. Loggers should never be modifying or removing existing entries.
Audit logs are no different. Audit triggers should INSERT an entry for each change (however you want to define "change"). They should never UPDATE or DELETE anything*.
The change and the corresponding log entry should be written to the same database within the same transaction, to ensure atomicity/consistency; logging directly to a remote database will always leave you with a window where the log is committed but the change is not (or vice versa).
If you need to aggregate these log entries and push them to a different database, you should do it from an external process, not within the trigger itself. If you need this to happen in real time, you can inform the process of new changes via a notification channel.
* In fact, you should revoke UPDATE/DELETE privileges on the audit table from the user inserting the logs. Furthermore, the trigger should ideally be a SECURITY DEFINER function owned by a privileged user with INSERT rights on the log table. The user connecting to the database should not be given permission to write to the log table directly.
This ensures that if your client application is compromised (whether due to a malfunction, or a malicious user e.g. exploiting an SQL injection vulnerability), then your audit log retains a complete and accurate record of everything it changed.
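The points above can be sketched in SQL. All names here (audit_log, app_user, my_table) are placeholders, and the function must be owned by a privileged role for SECURITY DEFINER to have the intended effect:

```sql
-- Append-only audit log: inserted in the same transaction as the change.
CREATE TABLE audit_log (
    id         bigserial   PRIMARY KEY,
    logged_at  timestamptz NOT NULL DEFAULT now(),
    logged_by  text        NOT NULL DEFAULT current_user,
    table_name text        NOT NULL,
    operation  text        NOT NULL,
    old_row    text,
    new_row    text
);

-- SECURITY DEFINER: runs with the owner's rights, so the connecting
-- role never needs (and should never get) direct INSERT on audit_log.
CREATE OR REPLACE FUNCTION write_audit() RETURNS trigger
SECURITY DEFINER AS $$
BEGIN
    IF TG_OP = 'DELETE' THEN
        INSERT INTO audit_log (table_name, operation, old_row, new_row)
        VALUES (TG_TABLE_NAME, TG_OP, OLD::text, NULL);
    ELSIF TG_OP = 'UPDATE' THEN
        INSERT INTO audit_log (table_name, operation, old_row, new_row)
        VALUES (TG_TABLE_NAME, TG_OP, OLD::text, NEW::text);
    ELSE  -- INSERT
        INSERT INTO audit_log (table_name, operation, old_row, new_row)
        VALUES (TG_TABLE_NAME, TG_OP, NULL, NEW::text);
    END IF;
    NOTIFY audit_changes;  -- wake an external aggregator process, if any
    RETURN NULL;           -- AFTER trigger: return value is ignored
END;
$$ LANGUAGE plpgsql;

-- A log is append-only: the application role may not rewrite history.
REVOKE UPDATE, DELETE, TRUNCATE ON audit_log FROM app_user;

CREATE TRIGGER audit_my_table
    AFTER INSERT OR UPDATE OR DELETE ON my_table
    FOR EACH ROW EXECUTE PROCEDURE write_audit();
```

An external process can LISTEN on audit_changes and ship committed entries to the remote database in batches, keeping the trigger itself fast and local.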

DB2 "Triggers" on actions beyond update, insert and delete

After researching triggers, I've only found examples showing how to react to UPDATE, INSERT and DELETE. It seems that's even part of the syntax itself. DB2 Docs on Triggers
Is there any kind of trigger, or something similar, which would let me track a larger set of actions, things like SELECT and ALTER TABLE?
We (unfortunately) share a database with some teams we don't entirely trust not to do things like run insane SELECT statements (locking up the database) or run ALTER TABLE without us knowing. We'd like to be able to track when these happen and which user made the change.
Please, no suggestions recommending we get our database separated in some way. We're working towards that in the long term, but we need this in the short term.
The link to the DB2 docs in your post points to IBM i. Is your database DB2 for i?
For IBM i, you can use the detailed database monitor to capture all SQL statements, including DDL commands like ALTER TABLE. However, running the detailed database monitor for all users causes performance problems.
We were in the same situation as you, with multiple teams using the same server as a database. We ended up writing custom user exit programs to capture all SQL statements (with user details) in our case.
Link to database monitor:
https://www.ibm.com/support/knowledgecenter/en/ssw_ibm_i_72/rzajq/strdbmon.htm

PostgreSQL Table-Tracking Utility?

I'd like to see a transactional history of operations that have been executed on one of my tables, and which user executed each operation. Does PostgreSQL offer any tools that allow that kind of historical lookup?
Maybe others can point you to good utilities that handle this for you, but I know triggers can be used to create audit logs of tables. If you need more complex logic for how and what you want to audit, you can also write procedural functions and incorporate them into your triggers. Example: Postgres trigger function
See this link: http://wiki.postgresql.org/wiki/Audit_trigger_91plus
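As a rough illustration of what such an audit trigger captures (the linked wiki page has a far more complete version), here is a minimal per-table history sketch. Table and trigger names are placeholders; txid_current() lets you group entries by the transaction that made them:

```sql
-- Sketch: per-row history recording who changed what, and in which transaction.
CREATE TABLE my_table_history (
    changed_at timestamptz NOT NULL DEFAULT now(),
    changed_by text        NOT NULL DEFAULT session_user,
    txid       bigint      NOT NULL DEFAULT txid_current(),
    operation  text        NOT NULL,
    row_data   text
);

CREATE OR REPLACE FUNCTION track_my_table() RETURNS trigger AS $$
BEGIN
    INSERT INTO my_table_history (operation, row_data)
    VALUES (TG_OP,
            CASE WHEN TG_OP = 'DELETE' THEN OLD::text ELSE NEW::text END);
    RETURN NULL;  -- AFTER trigger: return value is ignored
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER track_changes
    AFTER INSERT OR UPDATE OR DELETE ON my_table
    FOR EACH ROW EXECUTE PROCEDURE track_my_table();

-- Operations that ran in the same transaction share a txid:
-- SELECT * FROM my_table_history ORDER BY changed_at;
```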