If a notification is sent in PostgreSQL, how can I trigger a function by using LISTEN? I know it can be done from another programming language, and I know you can fire triggers after changes to tables, but how do I trigger a function only after a notification, entirely inside PostgreSQL? Something like:
LISTEN <channel> THEN PERFORM ...
Related
I want to listen to every transaction that is recorded in the database, and I noticed that Postgres has a concept called LISTEN and NOTIFY (of course, I am not sure that this is the right way).
Now I want to write a trigger that sends a notification to a channel when any operation occurs in any table.
Is it possible? Is my approach right?
Thanks
Postgres TRIGGER to call NOTIFY with a JSON payload
I'm new to Google Cloud SQL and Pub/Sub. I couldn't find documentation anywhere about this, but another question's accepted and upvoted answer seems to say it is possible to publish a Pub/Sub message whenever an insert happens in the database. Excerpt from that answer:
2 - The ideal solution would be to create the Pub/Sub topic and publish to it when you insert new data to the database.
But since my question is a different one, I asked a new question here.
Background: I'm using a combination of Google Cloud SQL, Firestore and Realtime Database for my app, each for its own unique strengths.
What I want to do is to be able to write into Firestore and the Realtime Database once an insert is successful in Google Cloud SQL. According to the answer above, these are the steps I should follow:
The app calls a Cloud Function to insert data into the Google Cloud SQL database (PostgreSQL). Note: the Postgres tables have some important constraints and trigger Postgres functions, which is why we want to start here.
When the insert is successful I want Google Cloud SQL to publish a message to Pub/Sub.
Then there is another Cloud Function that subscribes to the Pub/Sub topic. This function will write into Firestore / Realtime Database accordingly.
I got steps #1 & #3 all figured out. The solution I'm looking for is for step #2.
The answer in the other question is simply suggesting that your code do both of the following:
Write to Cloud SQL.
If the write is successful, send a message to a pubsub topic.
There isn't anything that will automate or simplify either of these tasks. There are no triggers for Cloud Functions that will respond to writes to Cloud SQL. You write code for task 1, then write the code for task 2. Both of these things should be straightforward and covered in product documentation. I suggest making an attempt at both (separately), and posting again with the code you have that isn't working the way you expect.
If you need to get started with pubsub, there are SDKs for pretty much every major server platform, and the documentation for sending a message is here.
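For illustration, here is a rough sketch of a Cloud Function handler that does both tasks with node-postgres and the @google-cloud/pubsub client. The table, columns, and topic name are placeholders I made up, and the connection details are assumed to come from the usual PG* environment variables:
const { Pool } = require('pg')
const { PubSub } = require('@google-cloud/pubsub')
const pool = new Pool() // reads PGHOST, PGUSER, PGDATABASE, ... from the environment
const pubsub = new PubSub()
async function insertAndPublish(record) {
  // Task 1: write to Cloud SQL (Postgres)
  const { rows } = await pool.query(
    'INSERT INTO my_table (name) VALUES ($1) RETURNING *', // hypothetical table and column
    [record.name]
  )
  // Task 2: only after a successful insert, publish the new row to a Pub/Sub topic
  await pubsub.topic('my-topic').publishMessage({ json: rows[0] })
  return rows[0]
}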
While Google Cloud SQL doesn't manage triggers automatically, you can create a trigger in Postgres:
CREATE OR REPLACE FUNCTION notify_new_record() RETURNS TRIGGER AS $$
BEGIN
PERFORM pg_notify('on_new_record', row_to_json(NEW)::text);
RETURN NULL;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER on_insert
AFTER INSERT ON your_table
FOR EACH ROW EXECUTE FUNCTION notify_new_record();
Then, in your client, listen to that event:
import pg from 'pg'
const client = new pg.Client()
client.connect()
client.query('LISTEN on_new_record') // same as arg to pg_notify
client.on('notification', msg => {
console.log(msg.channel) // on_new_record
console.log(msg.payload) // {"id":"...",...}
// ... do stuff
})
In the listener, you can either push to pubsub or cloud tasks, or, alternatively, write to firebase/firestore directly (or whatever you need to do).
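As a concrete sketch, the listener above could forward each notification to Pub/Sub; the topic name 'on-new-record' and the @google-cloud/pubsub usage here are my own assumptions, not part of the original answer:
import pg from 'pg'
import { PubSub } from '@google-cloud/pubsub'
const client = new pg.Client()
const pubsub = new PubSub()
client.connect()
client.query('LISTEN on_new_record')
client.on('notification', msg => {
  // msg.payload is the JSON row produced by the trigger above
  pubsub.topic('on-new-record')
    .publishMessage({ data: Buffer.from(msg.payload) })
    .catch(console.error)
})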
Source: https://edernegrete.medium.com/psql-event-triggers-in-node-js-ec27a0ba9baa
You could also check out Supabase which now supports triggering cloud functions (in beta) after a row has been created/updated/deleted (essentially does the code above but you get a nice UI to configure it).
I want to implement a client-server mechanism in kdb where clients can register themselves to receive a callback when some table is updated.
I know how callbacks work in kdb, but I was not able to figure out how to bind table updates on the server to a function from which I can call the client's callback.
Basically you want to implement a 'Publish-Subscribe' mechanism. kdb+ already has a script, u.q, in the tick library which provides that:
https://code.kx.com/q/cookbook/publish-subscribe/
On the server, it maintains a list of clients along with their handles, subscription tables and callback functions. You will have to change the function on the server that handles data inserts/updates so that it also publishes the data:
q) .u.pub[table name; table data]
This will take care of calling the callback function of each client that is registered for this table.
On the client side, create a connection to the publisher and call the library function for subscription:
q) .u.sub[tablename;list_of_symbols_to_subscribe_to]
You can also look into example publisher and subscriber code: https://github.com/KxSystems/cookbook/tree/master/pubsub
Is it possible to use data from the row a trigger is firing on, as the channel of a pg_notify, like this:
CREATE OR REPLACE FUNCTION notify_pricesinserted()
RETURNS trigger AS $$
DECLARE
BEGIN
PERFORM pg_notify(
NEW.my_label,
row_to_json(NEW)::text);
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER notify_pricesinserted
AFTER INSERT ON prices
FOR EACH ROW
EXECUTE PROCEDURE notify_pricesinserted();
EDIT: I found out the reason it was not working was the case of my label. If I replace it with lower(NEW.my_label), and do the same for the listener, then it works.
The pg_notify() part would work without throwing an error. PostgreSQL places very few restrictions on what a channel name can be. But in practice it is probably useless, because you would need to establish a LISTEN some_channel command prior to the pg_notify() statement to pick up the payload message somewhere outside of the trigger function, and doing that for some dynamic value is difficult in most situations and probably terribly inefficient in all cases.
If - in your trigger - NEW.my_label has a small number of well-defined values, then you might work it out by establishing listening channels on all possible values, but you are probably better off defining a single channel identifier for your table, or perhaps for this specific trigger, and then constructing the payload message in such a way that you can easily extract the appropriate information for some response. If you cannot predict the values of NEW.my_label then it is plainly impossible.
In your specific case you could have a channel name 'prices' and then do something like:
pg_notify('prices', format('%s: %s', NEW.my_label, row_to_json(NEW)::text));
The session with LISTEN prices will receive:
Asynchronous notification "prices" with payload "some_label: {new_row_to_json}" received from server process with PID 8448.
That is a rather silly response (why the "Asynchronous notification "channel" with payload ..." instead of just the payload and the PID?) but you can easily extract the relevant parts and work with those. Since you would have to manipulate the string anyway it is not a big burden to strip away all the PG overhead in one go, on a single channel, making management of the trigger actions far easier.
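As a small sketch of that extraction on the client side (using node-postgres; the channel name and the '%s: %s' payload format follow the example above, the rest is an assumption), the handler only receives the channel and payload, so the label can be split off directly:
import pg from 'pg'
const client = new pg.Client()
client.connect()
client.query('LISTEN prices')
client.on('notification', msg => {
  // payload looks like "some_label: {...row as JSON...}"
  const sep = msg.payload.indexOf(': ')
  const label = msg.payload.slice(0, sep)
  const row = JSON.parse(msg.payload.slice(sep + 2))
  console.log(label, row)
})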
I'm trying to understand how a Java (client) application that communicates, through JDBC, with a PostgreSQL database (server) can "catch" the result produced by a query that is fired (using a trigger) whenever a record is inserted into a table.
So, to clarify: via JDBC I install a trigger procedure prepared to execute a query whenever a record is inserted into a given database table, and this query's execution will produce an output (wrapped in a ResultSet, I suppose). My problem is that I have no idea how the client will become aware of those results, which are produced asynchronously.
I wonder if JDBC supports any "callback" mechanism able to catch the results produced by a query that is fired through a trigger procedure under the "INSERT INTO table" condition. And if there is no such "callback" mechanism, what is the best approach to achieve this result?
Thank you in advance :)
Triggers can't return a resultset.
There's no way to send such a result to the JDBC driver.
There are a few dirty hacks you can use to get results from a trigger to the client, but they're all exactly that. Things like:
DECLARE a cursor for the resultset, then send the cursor name as a NOTIFY payload, so the app can FETCH ALL FROM <cursorname>;
Create a TEMPORARY table and report the name via NOTIFY
It is more typical to append anything the trigger needs to communicate to the app to a table that exists for that purpose and have the app SELECT from it after the operation that fired the trigger ran.
In most cases if you need to do this, you're probably using a trigger where a regular function is a better fit.