How to deploy Firestore listener functions to GCP Cloud? - google-cloud-firestore

I'm using the following approach (source) to build a GCP Cloud Function that handles Firestore events.
const functions = require('firebase-functions');
exports.myFunction = functions.firestore
  .document('my-collection/{docId}')
  .onWrite((change, context) => { /* ... */ });
There is no example of how to deploy this correctly to GCP Cloud Functions, though, only Firebase ones.
Using a regular gcloud deploy command such as the following one doesn't work:
gcloud functions deploy FUNCTION_NAME \
  --entry-point ENTRY_POINT \
  --runtime RUNTIME \
  --trigger-event "providers/cloud.firestore/eventTypes/document.write" \
  --trigger-resource "projects/YOUR_PROJECT_ID/databases/(default)/documents/messages/{pushId}"
Any ideas on how to do this?

You can achieve the required effect by creating a Cloud Firestore trigger of your choice in the Cloud Console.
For Cloud Functions (1st gen):
In the Trigger type field, select Cloud Firestore.
Select Event Type: Write
Specify the document path on which you want to trigger this function, e.g. users/{doc_id}
Check the Retry on failure checkbox if you want failed invocations to be retried
Click on Save
Modify your function code as per your requirements
Click Deploy
Make a modification to a document under the configured path, in our case users/{doc_id}
Check the Cloud Functions logs; you will see that the function was triggered.
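If you deploy the 1st gen function directly with gcloud (as in the command from the question) rather than through the Firebase CLI, the handler is written as a plain background function, not with the firebase-functions wrapper. A minimal sketch, assuming the collection from the question (myFunction is just a placeholder name):
// index.js -- plain background function for a 1st gen Firestore trigger
exports.myFunction = (event, context) => {
  // event.value is the new document state, event.oldValue the previous one
  console.log('Triggered by', context.resource);
  console.log('Event type:', context.eventType);
  console.log('New value:', JSON.stringify(event.value));
};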
For Cloud Functions (2nd gen):
Under HTTPS, in the Authentication field, select an option depending on whether you want to allow unauthenticated invocations of your function. By default, authentication is required.
Click on Add Eventarc Trigger (A modal will appear)
Choose Trigger Type: First party
Choose Event Provider: Cloud Firestore
Event: google.firestore.v1.Firestore.Write
Resource: if you have an exact path to the document, choose Specific resource; if you want to target multiple documents using a wildcard pattern, choose Path pattern
Check the Retry on failure checkbox if you want failed invocations to be retried
Click on Save Trigger
Click Next
Modify your Function as per your requirement
Click Deploy
Make a modification to one of the targeted documents
Check the Cloud Functions logs; you will see that the function was triggered.
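For 2nd gen, the deployed code is a CloudEvent function using the Functions Framework rather than the firebase-functions style handler. A rough sketch (the handler name is a placeholder, and the exact shape of cloudEvent.data depends on which Firestore event you selected above):
// index.js -- CloudEvent function behind the Eventarc trigger configured above
const functions = require('@google-cloud/functions-framework');

functions.cloudEvent('myFirestoreHandler', (cloudEvent) => {
  console.log('Event type:', cloudEvent.type);
  console.log('Subject (document path):', cloudEvent.subject);
  // cloudEvent.data holds the event payload; its format depends on the chosen
  // Firestore event (audit-log based events vs. direct document triggers)
});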

Related

Is it possible to publish a message to Google Pub/Sub whenever a data is inserted or updated in Google Cloud SQL?

I'm new to Google Cloud SQL and Pub/Sub. I couldn't find documentation about this anywhere, but another question's accepted and upvoted answer seems to say it is possible to publish a Pub/Sub message whenever an insert happens in the database. Excerpt from that answer:
2 - The ideal solution would be to create the Pub/Sub topic and publish to it when you insert new data to the database.
But since my question is a different one, I asked a new question here.
Background: I'm using a combination of Google Cloud SQL, Firestore and Realtime Database for my app for its own unique strengths.
What I want to do is to be able to write into the Firestore and Realtime databases once an insert is successful in Google Cloud SQL. According to the answer above, these are the steps I should take:
The app calls a Cloud Function to insert data into a Google Cloud SQL database (PostgreSQL). Note: the Postgres tables have some important constraints and trigger Postgres functions; that's why we want to start here.
When the insert is successful I want Google Cloud SQL to publish a message to Pub/Sub.
Then there is another Cloud Function that subscribes to the Pub/Sub topic. This function will write into Firestore / Realtime Database accordingly.
I got steps #1 & #3 all figured out. The solution I'm looking for is for step #2.
The answer in the other question is simply suggesting that your code do both of the following:
Write to Cloud SQL.
If the write is successful, send a message to a pubsub topic.
There isn't anything that will automate or simplify either of these tasks. There are no triggers for Cloud Functions that will respond to writes to Cloud SQL. You write code for task 1, then write the code for task 2. Both of these things should be straightforward and covered in product documentation. I suggest making an attempt at both (separately), and posting again with the code you have that isn't working the way you expect.
If you need to get started with pubsub, there are SDKs for pretty much every major server platform, and the documentation for sending a message is here.
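To make task 2 concrete, publishing from a Node.js Cloud Function after the Cloud SQL insert succeeds is only a few lines with the Pub/Sub client library. A sketch, assuming a hypothetical topic name:
const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

// call this right after the Cloud SQL insert succeeds
async function publishInsertEvent(row) {
  // 'cloudsql-inserts' is an assumed topic name; create it beforehand
  await pubsub.topic('cloudsql-inserts').publishMessage({ json: row });
}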
While Google Cloud SQL doesn't manage triggers automatically, you can create a trigger in Postgres:
CREATE OR REPLACE FUNCTION notify_new_record() RETURNS TRIGGER AS $$
BEGIN
  PERFORM pg_notify('on_new_record', row_to_json(NEW)::text);
  RETURN NULL;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER on_insert
  AFTER INSERT ON your_table
  FOR EACH ROW EXECUTE FUNCTION notify_new_record();
Then, in your client, listen to that event:
import pg from 'pg'

const client = new pg.Client()
client.connect()

client.query('LISTEN on_new_record') // same as arg to pg_notify
client.on('notification', msg => {
  console.log(msg.channel) // on_new_record
  console.log(msg.payload) // {"id":"...",...}
  // ... do stuff
})
In the listener, you can push to Pub/Sub or Cloud Tasks, or write to Firebase/Firestore directly (or whatever else you need to do).
Source: https://edernegrete.medium.com/psql-event-triggers-in-node-js-ec27a0ba9baa
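For example, writing to Firestore directly from that notification handler could look roughly like this with the Admin SDK (a sketch; the collection name, credentials setup, and the id field are assumptions):
const admin = require('firebase-admin');
admin.initializeApp(); // uses application default credentials on GCP

client.on('notification', async msg => {
  const record = JSON.parse(msg.payload); // the row_to_json payload from pg_notify
  // mirror the new Postgres row into an assumed 'sql_records' collection
  await admin.firestore().collection('sql_records').doc(String(record.id)).set(record);
});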
You could also check out Supabase, which now supports triggering cloud functions (in beta) after a row has been created, updated, or deleted (it essentially does what the code above does, but gives you a nice UI to configure it).

Triggering Kusto commands using 'ADX Command' activity in ADFv2 vs calling WebAPI on it

In ADFv2 (Azure Data Factory V2), if we need to trigger a command on an ADX (Azure Data Explorer) cluster, we have two choices:
Use the 'Azure Data Explorer Command' activity
Use the POST method provided by the 'WebActivity' activity
Having figured out that both methods work, I would say that from a development/maintenance point of view the first method sounds more slick and systematic, especially because it is an out-of-the-box feature to support Kusto in ADFv2. Is there any scenario where the Web Activity method would be preferable or more performant? I am trying to figure out whether it's alright to simply use the ADX Command activity all the time to run any Kusto command from ADFv2 instead of ever using the Web activity.
It is indeed recommended to use the "Azure Data Explorer Command" activity:
That activity is more comfortable, as you don't have to construct the HTTP request yourself.
That activity takes care of a few things for you, such as:
In case you are running an async command, it will poll the Operations table until your async command is completed.
Logging.
Error handling.
In addition, you should take into consideration that the result format differs between the two approaches, and that each activity has its own limits in terms of response size and timeout.
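For reference, taking the WebActivity route means constructing the Kusto management REST call yourself, roughly along these lines (a sketch: the cluster URL, database, and command are placeholders, and you still have to supply an AAD bearer token):
POST https://<your-cluster>.<region>.kusto.windows.net/v1/rest/mgmt
Authorization: Bearer <AAD access token>
Content-Type: application/json

{
  "db": "MyDatabase",
  "csl": ".set-or-append MyTable <| MyStagingQuery"
}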

Google Cloud REST API - How can return compute engine images newer than a specified creationTimestamp?

I'm using Google's Cloud API to only return disk images (compute.instances.list) created after a certain date.
I'm using the following for the filter parameter: creationTimestamp > 2019-08-02 but it's not working. I'm getting Invalid value for field 'filter': 'creationTimestamp \u003e 2019-08-02'. Invalid list filter expression.
Any ideas, or is it not possible? I can have it work using a partial date & a wildcard, using creationTimeStamp = 2019-08-0*, but that's not the same as everything after this date.
This is a known issue with Google Cloud Platform, and you can follow its progress here.
As an alternative, you could use a gcloud command: first format the list as, for instance, a table, and then filter on one of the columns (in this case, the creation timestamp).
The following command
gcloud compute instances list --format="table(name,creationTimestamp)" --filter="CREATION_TIMESTAMP > 2019-08-23"
will give you a list of Compute Engine instances created after 2019-08-23 and, in this case, you will only obtain, for each GCE instance, its name and creation date.
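If you are actually after images rather than instances (as the question title suggests), the same pattern should work with the images list command, e.g.:
gcloud compute images list --format="table(name,creationTimestamp)" --filter="CREATION_TIMESTAMP > 2019-08-23"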
This blog post is a good resource on how to use filters, formats, tables and more with gcloud commands.

Firestore trigger temporal information

Hi, so I understand Firestore write triggers run out of order with respect to time. Is it possible to get timestamp information on when a write occurred within the trigger function's execution context?
If you're using the Firebase CLI to deploy, every background function is delivered an EventContext object as its second parameter. You can use its timestamp property. Or, you can have the client write it into the document.
I assume something similar is available for the context object provided to code deployed by gcloud.
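A quick sketch with the Firebase CLI flavor (the collection path and function name are placeholders):
const functions = require('firebase-functions');

exports.onMessageWrite = functions.firestore
  .document('messages/{msgId}')
  .onWrite((change, context) => {
    // ISO 8601 time at which the triggering write occurred
    console.log('Write occurred at', context.timestamp);
    console.log('Event ID:', context.eventId);
  });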

How to create a Logic App Custom Connector polling trigger?

I've been able to create a Logic App Custom Connector with a webhook trigger by following the docs, however I can't find any documentation on creating a polling trigger. I was only able to find Jeff Hollan's trigger examples, but the polling trigger doesn't seem compatible with the custom connector.
I tried setting up a polling trigger by performing the following steps:
Create an Azure Function with a GET operation expecting a date-time query parameter
Have the function return the set of entities that have changed since the last poll (see the sketch after this list)
Configure the custom connector to call the Azure Function with the date time query parameter
Configure the response body of the custom connector
Tried different things in the 'Trigger configuration' section, but this is the most confusing part to me.
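A rough sketch of what the Azure Function from steps 1 and 2 can look like (Node.js, v3 programming model; the query parameter name, data shape, and lookup are assumptions):
// hypothetical lookup; replace with your real data store
async function getCompletedTasks() {
  return [{ id: 1, completedAt: '2021-01-01T00:00:00Z' }];
}

module.exports = async function (context, req) {
  // the connector passes the last-seen timestamp as a query parameter
  const since = req.query.since ? new Date(req.query.since) : new Date(0);

  const tasks = await getCompletedTasks();
  const changed = tasks.filter(t => new Date(t.completedAt) > since);

  context.res = {
    headers: { 'Content-Type': 'application/json' },
    body: changed
  };
};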
Whatever I tried, the trigger always fails with a 404 in the trigger outputs, similar to what I initially had with the webhook trigger type.
There are a few things that confuse me:
1. Path of trigger query seems screwed up
It looks like the custom connector UI screws up the path to the trigger. I noticed this when I downloaded the OpenAPI file. The path to my trigger API should be /api/trigger/tasks/completed, but in the OpenAPI file it read /trigger/api/trigger/tasks/completed. It appears the custom connector adds /trigger in front of the path. I sometimes noticed it doing this multiple times, giving me something similar to /trigger/trigger/trigger/api/trigger/tasks/completed. I fixed this in the OpenAPI file and re-imported it into the custom connector.
2. Trigger Configuration section
I don't understand what to do in the Trigger Configuration section of a polling trigger.
I assume the query parameter to monitor state change is some parameter I define myself, e.g. a timestamp, to determine what entities to return.
For 'select value to pass to selected query param', I would expect to be able to pick a timestamp from the trigger response. It looks like I can only pick values from a collection, not scalar values from the response as I would expect. How does that work?
Is 'trigger hint' just some information or does it actually control something?