I have a Google Cloud Storage bucket that I would like to reveal (make public) at a specific date and time. How can this be achieved?
I have tried the bucket's permissions, only to find out that I cannot attach any condition to the allUsers principal.
Another option that comes to mind is a Google Compute Engine instance with a startup script combined with Cloud Scheduler; however, this has an unpredictable delay that my use case cannot tolerate.
So is there any other way? I do not necessarily need to use GCS; any other service that lets me reveal a folder/files at a specific time would be enough.
You can run the function that makes your objects public as a scheduled Cloud Function for Firebase, using functions.pubsub.schedule().onRun():
In Cloud Functions for Firebase, scheduling logic resides in your functions code, with no special deploy-time requirements. To create a scheduled function, use functions.pubsub.schedule('your schedule').onRun((context)).
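As a minimal sketch in TypeScript, assuming a hypothetical bucket name "my-bucket" and an illustrative reveal time of 12:00 UTC on November 1st:

```ts
import * as functions from "firebase-functions";
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

// Hypothetical bucket name -- replace with your own.
const BUCKET = "my-bucket";

// Fires at 12:00 UTC on November 1st (cron syntax). Note the schedule
// recurs yearly, so you may want to delete the function after it has
// run once.
export const revealBucket = functions.pubsub
  .schedule("0 12 1 11 *")
  .timeZone("UTC")
  .onRun(async () => {
    const [files] = await storage.bucket(BUCKET).getFiles();
    // Make every object that exists at run time publicly readable.
    await Promise.all(files.map((file) => file.makePublic()));
    return null;
  });
```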
Is there a way to apply an upload limit to a Google Cloud Storage bucket per day/month/year?
Is there a way to apply a limit on the amount of network traffic?
Is there a way to apply a limit on Class A operations?
Is there a way to apply a limit on Class B operations?
Following the instructions at https://cloud.google.com/docs/quota, I found only "Queries per 100 seconds per user" and "Queries per day", but these are JSON API quotas (and I am not even sure which kind of API is used inside the StorageClient C# client class).
To define quotas (and, for that matter, SLOs), you need SLIs: service level indicators. That means having metrics on whatever you want to observe.
That's not the case here. Cloud Storage has no indicator for the volume of data per day, so you have no built-in indicator, no metrics, and therefore no quotas.
If you want this, you have to build something on your own: wrap all Cloud Storage calls in a service that counts the volume of blobs per day; you can then apply your own rules on this custom indicator.
Of course, to prevent any bypass, you have to deny direct access to the buckets and grant access only to your "indicator service". The same goes for bucket creation, so that new buckets get registered in your service.
Not an easy task...
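A minimal sketch of such a wrapper in TypeScript, assuming a hypothetical daily limit of 10 GiB and an in-memory counter (a real service would persist the counter, e.g. in Firestore or Redis, so it survives restarts and is shared across instances):

```ts
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

// Assumed self-imposed limit: 10 GiB of uploads per day.
const DAILY_LIMIT_BYTES = 10 * 1024 ** 3;

// In-memory indicator; illustrative only. Persist it in a database
// for any real deployment.
let currentDay = new Date().toISOString().slice(0, 10);
let bytesToday = 0;

export async function guardedUpload(
  bucketName: string,
  destination: string,
  data: Buffer
): Promise<void> {
  const today = new Date().toISOString().slice(0, 10);
  if (today !== currentDay) {
    // New day: reset the custom indicator.
    currentDay = today;
    bytesToday = 0;
  }
  if (bytesToday + data.length > DAILY_LIMIT_BYTES) {
    throw new Error("Custom daily upload quota exceeded");
  }
  await storage.bucket(bucketName).file(destination).save(data);
  bytesToday += data.length;
}
```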
How to create a background service in Flutter with push notifications?
I created an app, but I want to integrate a service that checks the database when the app is not running.
Thank you.
I would recommend using Google Cloud Scheduler, which lets you create cron jobs that send requests to your API on a regular basis. The first three jobs are free.
If you also need to implement the actual function that checks your database, have a look at Google Cloud Functions. Those can be written in JavaScript or TypeScript and can make calls to external APIs as long as you are on the Blaze Plan (which includes the monthly free quotas).
The advantages are:
you get free credit when you create your account
depending on your needs, you might not need more than the free monthly quota for Cloud Functions invocations (the first 2 million invocations are free every month)
it's very easy to create a scheduled function that will be picked up and run by Cloud Scheduler (see the sketch after this list)
it's highly scalable and reliable so you don't have to worry about managing your own servers
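As a rough sketch (TypeScript, with hypothetical collection and field names "tasks", "due", and "fcmToken"), a scheduled function that checks Firestore and sends a push notification through FCM could look like this:

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Runs every 15 minutes; "tasks", "due" and "fcmToken" are assumed
// names -- adapt them to your own schema.
export const checkDatabase = functions.pubsub
  .schedule("every 15 minutes")
  .onRun(async () => {
    const due = await admin
      .firestore()
      .collection("tasks")
      .where("due", "<=", admin.firestore.Timestamp.now())
      .get();

    // Send one push notification per due task via FCM.
    await Promise.all(
      due.docs.map((doc) =>
        admin.messaging().send({
          token: doc.get("fcmToken"),
          notification: {
            title: "Reminder",
            body: `Task ${doc.id} is due`,
          },
        })
      )
    );
    return null;
  });
```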
We are developing an app on Flutter on the client side and Firebase on the server side. I'm thinking of running Cloud Functions regularly using Cloud Scheduler based on each user's timestamp.
My idea is to run a Cloud Function via Cloud Scheduler every day at 12:00, performing a specific action only for users whose timestamp is older than 10 days. Is this a best practice?
Or is it possible to trigger a Cloud Function based on the user's timestamp?
For example, a Cloud Function that fires once 10 days have passed since the user's timestamp.
Update
The scenario is as follows.
Cloud Firestore path: /user/${userId}/funcStatus/status
The status document has one field:
timestamp: last update date (e.g. 2019/10/31)
I want to execute the Cloud Function after 10 days, i.e. on 11/10.
However, the timestamp varies per user, e.g. userA: 10/31, userB: 10/20.
The first option is possible with scheduled functions.
The second option is not possible with scheduled functions alone. You would have to use a Firestore onCreate trigger, then set up a callback with Cloud Tasks to get the function to execute at the right time.
Whichever one you choose is a matter of preference and whatever meets the needs of your app. There is no right or wrong way.
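For the first option, here is a minimal sketch of a function scheduled for 12:00 every day. It assumes the /user/${userId}/funcStatus/status layout from the question and uses a collection-group query (which may require an index); the per-user action is a placeholder:

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

const TEN_DAYS_MS = 10 * 24 * 60 * 60 * 1000;

export const processStaleUsers = functions.pubsub
  .schedule("0 12 * * *") // every day at 12:00
  .onRun(async () => {
    const cutoff = admin.firestore.Timestamp.fromMillis(
      Date.now() - TEN_DAYS_MS
    );
    // Collection-group query over every user's funcStatus subcollection.
    const stale = await admin
      .firestore()
      .collectionGroup("funcStatus")
      .where("timestamp", "<=", cutoff)
      .get();

    for (const doc of stale.docs) {
      // Placeholder: perform the specific action for this user.
      console.log(`Processing ${doc.ref.path}`);
    }
    return null;
  });
```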
I have a Google Cloud Function triggered by a Google Cloud Storage object.finalize event. When I deploy a new version of this function, I would like to run it for every existing file in the bucket (files that have already been processed by the previous version of the function). Processing all the existing files is a long-running task, so I don't think a single Cloud Function that processes all files in one run is an option.
The best option I can see for now is to make a Google Cloud Function I can trigger via HTTP that lists all the files in the bucket and publishes one event per file via Google Pub/Sub, and then process each of these events with a slightly modified version of my initial Cloud Function that accepts a Pub/Sub event in place of the object.finalize storage event.
I think it can work but I was wondering if there was an easier way to perform this operation.
If the operation you're trying to perform may take longer than the maximum time a Cloud Function can run, you will need to split that operation into multiple steps. Your approach of using a Pub/Sub trigger for each individual file sounds like a valid way to do that.
One option might be to write a small program that lists all of the objects in a bucket and, for each object, posts a message to Cloud Pub/Sub that triggers your function in the same way a GCS change would.
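A minimal sketch of such a backfill script in TypeScript ("my-bucket" and "reprocess-files" are hypothetical names), listing every object and publishing one Pub/Sub message per object:

```ts
import { Storage } from "@google-cloud/storage";
import { PubSub } from "@google-cloud/pubsub";

// Hypothetical bucket and topic names -- replace with your own.
const BUCKET = "my-bucket";
const TOPIC = "reprocess-files";

const storage = new Storage();
const pubsub = new PubSub();

async function backfill(): Promise<void> {
  // List every object currently in the bucket.
  const [files] = await storage.bucket(BUCKET).getFiles();

  // Publish one message per object; the Pub/Sub-triggered variant of
  // your function reads the bucket/name pair instead of the storage event.
  await Promise.all(
    files.map((file) =>
      pubsub.topic(TOPIC).publishMessage({
        json: { bucket: BUCKET, name: file.name },
      })
    )
  );
  console.log(`Published ${files.length} messages`);
}

backfill().catch(console.error);
```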
It's the first time I used Google Cloud, so I might ask the question in the wrong place.
An information provider uploads a new file to Google Cloud Storage every day.
The file contains information about all my clients/departments.
I have to sort through the information and create new file(s) containing the relevant information for each department in my company, so that everyone gets only the information relevant to them (security).
I can't figure out the steps I need to follow to complete this task.
Can you help me?
You want to have a process that starts automatically and subsequently generates a new file once you upload something to Google Cloud Storage.
The easiest way to handle this is using Object Change Notifications. You can set up Object Change Notifications per bucket, and this will send a POST request to a URL that you define.
You can then easily set up a server (or run it on App Engine) that executes an action based on the POST request it receives.
There is an even simpler option (although still in alpha) named Cloud Functions. Cloud Functions is a serverless service that provides event-based microservices (e.g. 'do this' when a new file is uploaded to GCS). This means you only have to write the code that defines what needs to happen when a new file is uploaded, and Cloud Functions will take care of executing it. See this tutorial on using Cloud Functions with Google Cloud Storage.
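A rough sketch of such a function in TypeScript (firebase-functions API), assuming the uploaded file is a CSV whose first column is the department, and a hypothetical destination bucket "sorted-output":

```ts
import * as functions from "firebase-functions";
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

export const splitByDepartment = functions.storage
  .object()
  .onFinalize(async (object) => {
    if (!object.bucket || !object.name) return;

    // Download the uploaded file and group its rows by department
    // (assumed to be the first CSV column).
    const [contents] = await storage
      .bucket(object.bucket)
      .file(object.name)
      .download();

    const byDept = new Map<string, string[]>();
    for (const row of contents.toString("utf8").split("\n")) {
      const dept = row.split(",")[0];
      if (!dept) continue;
      if (!byDept.has(dept)) byDept.set(dept, []);
      byDept.get(dept)!.push(row);
    }

    // Write one file per department to the hypothetical output bucket;
    // per-department access would then be enforced with IAM on the
    // destination paths.
    await Promise.all(
      [...byDept.entries()].map(([dept, rows]) =>
        storage
          .bucket("sorted-output")
          .file(`${dept}/${object.name}`)
          .save(rows.join("\n"))
      )
    );
  });
```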