Cannot find pubsub in firebase init - google-cloud-firestore

I'm trying to run a scheduled function in the Firebase emulator, but I cannot find a Pub/Sub option when I run firebase init.
The options I get are Realtime Database, Firestore, Functions, Hosting, Storage, Emulators, and Remote Config; there is no Pub/Sub.

"pubsub" functions are created using cloud functions. It is part of Cloud Functions.
You only have to initialize(init) Functions.
Write the pubsub triggers there. View documentation for extra details on how to code pubsub - DOCS https://firebase.google.com/docs/functions/pubsub-events
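For example, here is a minimal sketch of a scheduled Pub/Sub function declared alongside your other functions (the schedule string and function name below are just placeholders):

const functions = require('firebase-functions');

// runs on a Pub/Sub-backed schedule managed by Cloud Scheduler
exports.myScheduledJob = functions.pubsub.schedule('every 5 minutes').onRun((context) => {
  console.log('Scheduled run at', context.timestamp);
  return null;
});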

Related

Running periodic queries on google cloud sql instance

I have a Google Cloud SQL PostgreSQL instance and I'd like to run periodic SQL queries on it, then use the monitoring system to alert the user with the results.
How can I accomplish this using only the GCP platform, without having to develop a separate app?
As far as I am aware, there is no built-in feature for recurring queries in Cloud SQL at the moment, so you have to implement your own.
You can use Cloud Scheduler to trigger a Cloud Function (via an HTTP/S endpoint) that runs the query on Cloud SQL and then notifies the user in whatever way suits your needs (I would recommend Pub/Sub); a sketch of such a function is shown below.
You might also want to save the result in a GCS bucket and have the user pull it from there.
Also, you might want to check BigQuery: it has a built-in feature for scheduling queries.
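Here is a minimal sketch of that Cloud Scheduler + Cloud Function approach, assuming a PostgreSQL instance reached over the Cloud SQL unix socket; the connection name, database, credentials and query are placeholders you would replace with your own:

const functions = require('@google-cloud/functions-framework');
const { Pool } = require('pg');

const pool = new Pool({
  host: '/cloudsql/my-project:us-central1:my-instance', // Cloud SQL unix socket path (placeholder)
  user: 'postgres',
  password: process.env.DB_PASSWORD,
  database: 'mydb',
});

// HTTP endpoint that Cloud Scheduler calls on a cron schedule
functions.http('runPeriodicQuery', async (req, res) => {
  const { rows } = await pool.query('SELECT count(*) AS overdue FROM orders WHERE due_date < now()');
  // from here you could publish the result to Pub/Sub or write it to a GCS bucket
  res.status(200).json(rows);
});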

flutter/firebase schedule a function at given time

I want to schedule a Cloud Function at a specific time, and that time is stored in a Firestore document. When I add data to Firestore, a Cloud Function should trigger, read the date and time from the newly added document, and then schedule a function to run at that specific time to perform a specific task (update a status in Firestore).
For time-based scheduling, check out the official documentation:
https://firebase.google.com/docs/functions/schedule-functions
If you want to execute code on database changes, check out the triggers:
https://firebase.google.com/docs/functions/database-events
You will need something like onUpdate(), depending on your needs; a minimal example follows.
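As a rough illustration for Firestore (the collection and field names are just placeholders), an onUpdate trigger looks like this:

const functions = require('firebase-functions');

// fires whenever a document in the hypothetical "tasks" collection is updated
exports.onTaskUpdate = functions.firestore.document('tasks/{taskId}').onUpdate((change, context) => {
  const before = change.before.data();
  const after = change.after.data();
  console.log(`Task ${context.params.taskId} changed`, { before, after });
  return null;
});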
You can store files in Firebase Storage. Check out the official documentation for a code example:
https://firebase.google.com/docs/storage/web/list-files
If you want to read data from Firebase in your Flutter app, you can use the FlutterFire packages.
Here you can find instructions for all the Firebase packages:
https://firebase.flutter.dev/
Stack Overflow is for asking questions. Your question sounds more like you expect someone to give you the whole code so you don't have to do any research yourself.
If that's not the case, sorry for the misunderstanding.
Cloud Functions can trigger on Firestore writes, but there is nothing built in to trigger them at a time that is specified in the document being written.
You can build that yourself using Cloud Tasks, as Doug shows in this excellent blog post: How to schedule a Cloud Function to run in the future with Cloud Tasks (to build a Firestore document TTL). A rough sketch of the idea follows.
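Roughly sketched, the idea from that post is to enqueue an HTTP task whose scheduleTime comes from the Firestore document; the project, location, queue and target URL below are placeholders:

const { CloudTasksClient } = require('@google-cloud/tasks');
const client = new CloudTasksClient();

// schedules an HTTP callback for the given Date, carrying an arbitrary payload
async function scheduleCallback(runAtDate, payload) {
  const parent = client.queuePath('my-project', 'us-central1', 'my-queue');
  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url: 'https://us-central1-my-project.cloudfunctions.net/performTask',
      headers: { 'Content-Type': 'application/json' },
      body: Buffer.from(JSON.stringify(payload)).toString('base64'),
    },
    scheduleTime: { seconds: Math.floor(runAtDate.getTime() / 1000) },
  };
  const [response] = await client.createTask({ parent, task });
  return response.name;
}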
I used the node-schedule package inside Firebase Cloud Functions, after which I was able to schedule tasks at dynamic times read from Firestore.
Here is sample code:
const functions = require('firebase-functions');
const schedule = require('node-schedule');

exports.scheduleMessage = functions.firestore.document('users/{userID}/pending/{messageID}').onCreate(async (snapshot, context) => {
  const messageId = context.params.messageID;
  // assumes the document carries a Firestore Timestamp field named "date"
  const myDate = snapshot.data().date.toDate();
  // cron pattern: run at the stored minute and hour (UTC)
  schedule.scheduleJob(messageId, `${myDate.getUTCMinutes()} ${myDate.getUTCHours()} * * *`, async () => {
    // your logic inside
  });
});

How to store PubSub data to big query using cloud functions?

I am once again asking for your help.
Let me tell you my current situation first.
I have a device that connects to the "Cloud IoT core" and sends data using mqtt.
The data then goes to the Pub/Sub topic.
Then a "Cloud Function" gets triggered which stores the data inside "Firestore".
Another "Cloud Function" gets triggered which sends me an email with the data stored inside Firestore.
The size of the data is about 1 kilobyte and I expect to send about 10K messages per month.
I need that data to create a dashboard, for which I am using "Google Data Studio".
To get my data in there I installed the Firebase extension "Stream Collections to BigQuery" to send the data to "BigQuery". From there I just had to click a few buttons to automatically stream data from BigQuery to "Google Data Studio".
Everything works so far, but as you can see I store the data four times: once via email, once inside Firestore, once inside BigQuery, and once in Data Studio. All of this is going to cost a lot of money in the long term, because the stored data doubles every month.
What I need from you guys is some advice on best practices.
Is there a way to store the data directly inside BigQuery when it arrives in the Pub/Sub?
If so can I also send an email with the data as an attachment?
Is BigQuery a good solution or should I use "Cloud SQL"?
To save data inside Firestore I can execute the following inside a Cloud Function. Is there a similar way for BigQuery?
firestore.collection("put Collection name here").doc("put document name here").set({
  'name': name,
  'age': age
}).then((writeResult) => {
  // console.log('Successfully executed set');
  return;
}).catch((err) => {
  console.log(err);
  return;
});
Is there a way to store the data directly inside BigQuery when it arrives in the Pub/Sub?
Yes, you can use Dataflow to build a streaming pipeline, as explained in various documentation pages and blog posts:
GCP Doc: Pub/Sub Topic to BigQuery
A Dataflow Journey: from PubSub to BigQuery
Write a Pub/Sub Stream to BigQuery
But you could also use the Node.js Client for BigQuery in a Cloud Function, triggered by Pub/Sub. However, one could consider that this doesn't "store the data directly"...
If so can I also send an email with the data as an attachment?
If you use a Cloud Function, that's quite easy, for example by using the dedicated "Trigger Email" Firebase Extension.
You can also directly send an email from a Cloud Function by using the nodemailer package, see this official Cloud Function sample.
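For instance, here is a rough sketch with nodemailer that attaches the stored document as JSON; the SMTP host, credentials and addresses are placeholders:

const nodemailer = require('nodemailer');

const transporter = nodemailer.createTransport({
  host: 'smtp.example.com',
  port: 465,
  secure: true,
  auth: { user: 'alerts@example.com', pass: process.env.SMTP_PASSWORD },
});

// sends the given data object as a JSON attachment
async function sendDataEmail(data) {
  await transporter.sendMail({
    from: 'alerts@example.com',
    to: 'me@example.com',
    subject: 'New device reading',
    text: 'See the attached data.',
    attachments: [{ filename: 'reading.json', content: JSON.stringify(data, null, 2) }],
  });
}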
Is BigQuery a good solution or should I use "Cloud SQL"?
It all depends on your exact use case... There is a lot of literature on the net: https://www.google.com/search?client=firefox-b-d&q=difference+between+Cloud+SQL+and+BigQuery
However, since you are going to use Data Studio, a classical answer would be to use BigQuery, since it is best suited for analytics. But again, it depends on your exact use case.
(Note that this question alone would probably be closed on SO because it is opinion-based.)
To save data inside Firestore I can execute the following inside a Cloud Function. Is there a similar way for BigQuery?
Yes, as said above, use the Node.js Client for BigQuery in your Cloud Function.
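As a minimal sketch mirroring your Firestore snippet (the dataset, table and row fields are placeholders, and the table schema must already exist), a streaming insert looks like this:

const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

// streaming insert of a single row into an existing table
async function saveToBigQuery(name, age) {
  await bigquery.dataset('my_dataset').table('my_table').insert([{ name, age }]);
}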

API Access to setting up Endurance Storage/iSCSI snapshots and schedules?

Is there a way, through either the IBM Cloud API or the SoftLayer API, to programmatically run/schedule/set up snapshots on an Endurance storage device (i.e., an iSCSI drive)?
I've looked through the documentation but have not found anything.
You need to take a look at these methods:
https://sldn.softlayer.com/reference/services/softlayer_network_storage/createsnapshot
The method above will allow you to create a new manual snapshot.
https://sldn.softlayer.com/reference/services/softlayer_network_storage/enablesnapshots
The method above will allow you to schedule snapshots.
See below for some code examples:
https://softlayer.github.io/php/enableSnapshots/
https://softlayer.github.io/rest/createsnapshot/
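For reference, here is a rough sketch of the createSnapshot REST call in Node.js 18+, following the URL pattern of the REST example linked above; the username, API key, volume id and notes are placeholders:

const SL_USER = 'your-username';
const SL_API_KEY = 'your-api-key';
const volumeId = 1234567; // id of the Endurance volume

async function createSnapshot() {
  const auth = Buffer.from(`${SL_USER}:${SL_API_KEY}`).toString('base64');
  const res = await fetch(
    `https://api.softlayer.com/rest/v3/SoftLayer_Network_Storage/${volumeId}/createSnapshot.json`,
    {
      method: 'POST',
      headers: { Authorization: `Basic ${auth}`, 'Content-Type': 'application/json' },
      body: JSON.stringify({ parameters: ['manual snapshot via API'] }), // optional notes parameter
    }
  );
  console.log(await res.json());
}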

Automate / schedule a script

I have read a number of blogs and watched tutorials, but I cannot find anything to help me with my problem.
I have a stakeholder that drops files into Google Cloud Storage, and I have already written a script that performs ETL tasks on them.
It would be great if I could create a trigger that runs my script as soon as a file is dropped in a specific place in Google Cloud Storage.
Google Cloud Storage supports Google Cloud Pub/Sub Notifications. This allows you to programmatically receive notifications when new objects are uploaded to your bucket.
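As a rough sketch, assuming you have already created a notification on the bucket (for example with gsutil notification create) and deployed the function with that notification topic as its trigger, a Pub/Sub-triggered function can then kick off your ETL logic:

const functions = require('@google-cloud/functions-framework');

// CloudEvent function subscribed to the bucket's notification topic
functions.cloudEvent('onObjectUploaded', (cloudEvent) => {
  const attrs = cloudEvent.data.message.attributes || {};
  // GCS notifications carry the bucket and object name as message attributes
  if (attrs.eventType === 'OBJECT_FINALIZE') {
    console.log(`New object: gs://${attrs.bucketId}/${attrs.objectId}`);
    // run your ETL script here
  }
});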