Can anybody help: is it possible to create a folder in a Google Cloud Storage bucket programmatically, named by year, month, and date? I don't want to create it manually, as it would be difficult to create a new bucket folder each day.
Buckets don't have folders or subdirectories. Please have a look here. However, you can use the workaround described in this question and create a folder through the API. I would put the code in a Cloud Function.
Please note that you should grant the required permissions to the App Engine default service account, since Cloud Functions runs as that account. You can then trigger the function with Cloud Scheduler and Pub/Sub by following the instructions here.
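For reference, a minimal sketch of such a Pub/Sub-triggered function might look like the following (the bucket name and function name are placeholders, and it assumes the google-cloud-storage client library in the Python runtime):

```python
from datetime import datetime, timezone

from google.cloud import storage

BUCKET_NAME = "your-bucket-name"  # placeholder

def create_daily_folder(event, context):
    """Background Cloud Function triggered by Pub/Sub (via Cloud Scheduler)."""
    today = datetime.now(timezone.utc)
    folder_name = today.strftime("%Y/%m/%d/")  # e.g. 2021/03/15/

    client = storage.Client()
    bucket = client.bucket(BUCKET_NAME)
    # Uploading a zero-byte object whose name ends in '/' is the usual
    # workaround for making a "folder" appear in the Cloud Storage browser.
    bucket.blob(folder_name).upload_from_string("")
```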
I'm new to S3 storage with Flutter, and have the following (basic) question, which I cannot get to the bottom of.
I understand with Amplify for Flutter, the app can upload and read stuff from the S3 storage bucket. I have succeeded in doing this.
However, my question is: if I already have items stored on the S3 server and I want my app to be able to read and list them, how can I do that? In other words, I don't want to upload from the app; it is an existing bucket with data that I want the app to be able to read. Also, I don't want to require a login, and all the examples I see for using S3 with Flutter involve an authentication login.
Sorry for the basic question, but I'm getting confused from the start. Any basic guidance, or links to tutorials, etc., would be of great help.
Thanks
Try this: https://docs.amplify.aws/lib/storage/existing-resources/q/platform/ios/
I know these are the iOS docs, but the same approach appears to work for Flutter.
Run amplify import storage and follow the prompts to select your bucket.
Run amplify push.
Your storage should now use your pre-existing S3 bucket. Look in the Amplify console to confirm.
Ideally, I would like to write a function that starts a Dataprep job on one of the following events:
a Kafka message
a file added or changed in GCS
I'm thinking I could write the triggers in Python if there is a supporting library, but I can't find one. I'm happy to use a different language if Python isn't available.
Thanks
Yes, there is an API available now that you can use.
https://cloud.google.com/dataprep/docs/html/API-Workflow---Run-Job_145281449
This explains the Dataprep API and how you can run and schedule jobs.
If you are able to do it using Python and this API, please post an example here as well.
The API documentation for the Trifacta related product is available at https://api.trifacta.com.
Note that to use the Google Dataprep API, you will need to obtain an access token (see https://cloud.google.com/dataprep/docs/html/Manage-API-Access-Tokens_145281444).
You must be a project owner to create access tokens and to use the Dataprep API for that project. Once that's done, you can create access tokens using the Access Tokens page, under the user preferences.
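As a rough sketch of what such a call could look like in Python (assuming the v4 jobGroups endpoint described in the docs above; the access token and the wrangled dataset/recipe id below are placeholders):

```python
import requests

ACCESS_TOKEN = "your-dataprep-access-token"  # placeholder
RECIPE_ID = 123456  # placeholder id of the wrangled dataset / recipe

response = requests.post(
    "https://api.clouddataprep.com/v4/jobGroups",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json={"wrangledDataset": {"id": RECIPE_ID}},
)
response.raise_for_status()
print(response.json())  # includes the id of the job group that was started
```

Code like this could be dropped into a Cloud Function triggered by a GCS change, or by a Pub/Sub message carrying the Kafka event.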
I want to give a service account read-only access to every bucket in my project. What is the best practice for doing this?
The answers here suggest one of:
creating a custom IAM policy
assigning the Legacy Bucket Viewer role on each bucket
using ACLs to allow bucket.get access
None of these seem ideal to me because:
Giving read-only access seems too common a need to require a custom policy
Putting "Legacy" in the name makes it seem like this permission will be retired relatively soon and any new buckets will require modification
Google recommends IAM over ACL and any new buckets will require modification
Is there some way to avoid the bucket.get requirement and still access objects in the bucket? Or is there another method for providing access that I don't know about?
The closest pre-built role is Storage Object Viewer (roles/storage.objectViewer). This allows listing and reading objects. It doesn't include the storage.buckets.get permission, but that is not commonly needed - messing with bucket metadata is really an administrative function. It also doesn't include storage.buckets.list, which is a bit more commonly needed but still not part of normal usage patterns for GCS - generally, when designing an app, you have a fixed set of buckets for specific purposes, so listing buckets is not useful.
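For example, a minimal sketch of what that looks like with the Python client (the key file and bucket name are placeholders, and the service account is assumed to have roles/storage.objectViewer):

```python
from google.cloud import storage

client = storage.Client.from_service_account_json("service-account-key.json")

# Listing and reading objects works with Object Viewer:
for blob in client.list_blobs("your-bucket-name"):
    print(blob.name)

data = client.bucket("your-bucket-name").blob("some/object.txt").download_as_bytes()

# Listing buckets, however, needs storage.buckets.list, which Object Viewer lacks:
# list(client.list_buckets())  # would raise 403 Forbidden
```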
If you really do want to give a service account bucket list and get permissions, you will have to create a custom role on the project. This is pretty easy; you can do it with:
gcloud iam roles create StorageViewerLister --project=$YOUR_PROJECT --permissions=storage.objects.get,storage.objects.list,storage.buckets.get,storage.buckets.list
gcloud projects add-iam-policy-binding $YOUR_PROJECT --member=serviceAccount:$YOUR_SERVICE_ACCOUNT --role=projects/$YOUR_PROJECT/roles/StorageViewerLister
I want a storage bucket to be owned by multiple projects, without the use of a service account. How can I accomplish this?
Google Cloud Storage buckets can only belong to a single project, and that project will foot the bill for that bucket. Permissions can absolutely span multiple accounts, though. Any permission for a user or a group that can be applied to a bucket in one project can also be applied to a bucket in another project.
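For example, a minimal sketch with the google-cloud-storage Python client (all names below are placeholders): the bucket lives in one project, but a user, group, or account from anywhere else can be added to its IAM policy:

```python
from google.cloud import storage

client = storage.Client(project="project-a")  # the project that owns the bucket
bucket = client.bucket("shared-bucket")

# Grant a user from outside the project read access to the bucket's objects.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {"user:collaborator@example.com"},
})
bucket.set_iam_policy(policy)
```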
What exactly are you hoping to do?
I don't think you can automatically push a 'bucket' to another user's console, in another project.
I do think you could have your users mount a shared bucket using gcsfuse, somewhere outside of the console.
https://github.com/GoogleCloudPlatform/gcsfuse
Then, when you add 'stuff' to the bucket, it would show up in their mounted folder, just like Dropbox, Google Drive, Box, OneDrive, etc.
As with those other services, you don't get to automatically push folders into another user's folder; similarly, you can't automatically push a 'bucket' into another user's 'project'.
But you can offer a share that the users have the option of using.
I have designed a website where users can upload images and videos. Is it possible to use Google Cloud Storage with Compute Engine?
Yes, it is possible to use Google Cloud components together like this.
This is configured while creating your instance: you need to grant the instance access to the other Google Cloud service you want to use with it.
In the Console, while creating a new instance, expand the additional options and, under Access/Security, choose Project access; then select which component (Cloud Storage in your case) the instance should be able to use and assign the permission level.
The same is also possible with gcloud compute when creating the instance.
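As an illustration, a minimal sketch of uploading a user's file from the instance to a bucket with the Python client (bucket and paths are placeholders; it assumes the instance was created with a storage access scope, so Application Default Credentials are picked up automatically):

```python
from google.cloud import storage

def upload_user_file(local_path: str, destination_name: str) -> str:
    client = storage.Client()  # uses the instance's service account credentials
    bucket = client.bucket("your-uploads-bucket")
    blob = bucket.blob(destination_name)
    blob.upload_from_filename(local_path)
    return f"gs://{bucket.name}/{blob.name}"

# e.g. upload_user_file("/tmp/photo.jpg", "images/photo.jpg")
```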
I hope this will be helpful.