How to create a gcs bucket without selecting a project - google-cloud-storage

I am trying to create a GCS bucket using this link: https://cloud.google.com/storage/docs/creating-buckets, but when I click on "Open The Cloud Storage Browser", it asks me to select a project. I want the bucket content to be publicly available. Any idea how I can do that? I don't want my bucket to be associated with a project; instead it should be in a global namespace. TIA.

Creating a bucket requires specifying a project; bucket names are globally unique, but every bucket belongs to exactly one project.
Making the bucket content publicly accessible is a separate matter: you can set ACLs or IAM policies on the bucket that grant public access.
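For example, a minimal gsutil sketch (the project ID and bucket name below are placeholders) would be:
# Create the bucket inside a project you own.
gsutil mb -p your-project-id gs://your-bucket-name
# Grant public read access to the bucket's objects via IAM.
gsutil iam ch allUsers:objectViewer gs://your-bucket-name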

Related

How to migrate GCS bucket from one project to another in different account

How to transfer GCS bucket from one account to another account without downloading data
Is Transfer Service for Cloud Data Chargeable?
You don't transfer a GCS bucket from one account to another one. A GCS bucket belongs to a project.
You can grant new users on the project, or only on the bucket, to allow them access. You can also create another bucket, in another project, with another name (project IDs and bucket names are global resources; no two can have the same name anywhere in the world) and use the Storage Transfer Service to duplicate the data. The service is free of charge if the data stays in the same region (if not, egress costs will be applied).
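If you prefer the command line to the Transfer Service UI, gsutil can also copy the data between buckets; this is only a sketch, and the bucket names are placeholders:
# Mirror the contents of the old bucket into the new one.
gsutil -m rsync -r gs://source-bucket gs://destination-bucket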

Cannot create a bucket in google cloud with the name even if i have deleted the previous one?

I created a bucket with my domain name in Google Cloud to host my static website there. I deleted it for some reason and now I am not able to create a bucket with the same name. Is there some reason I cannot do that again? If so, is there any way I can create a bucket to point to my website?

Right way of using Google Storage on a GCE VM

I want to know the right/best way of having one machine copy data to Google Storage.
I need one machine to be able to write to a bucket, but not be able to create or delete other buckets.
While researching, I found out that you should create a service account so this account can log in to Google Cloud and then use the storage.
But the problem is that when the machine is a GCE VM, there are scopes. With the "Default" scope it can read from Google Storage, but cannot write to it, even after authenticating with a service account.
With the devstorage.read_write scope, the machine can create and remove buckets from that storage without logging in. I find that too risky.
Does anyone have any recommendations?
Thanks
The core problem here is that the "write" scope covers both write and delete, and that the GCE service account is likely a member of project-editors, which can create and delete buckets. It sounds like what you want to do is restrict a service account to only being able to affect a single bucket. You should be able to do this with these steps:
Create a service account in your project (and save the private key file).
In the permissions page for the project, make sure that service account is not a project editor for your project.
Using an account that does have full permissions to your project, create the bucket, then grant the service account write access to the bucket. Example gsutil commands to do this:
gsutil mb gs://yourbucket
gsutil acl ch -u your-service-account-name@gserviceaccount.com:W gs://yourbucket
Create a VM that does not have a GCE service account enabled.
Push the service account's private key file to that VM.
On the VM, gcloud auth activate-service-account --key-file=your-key-file.json
Now gsutil commands run on the VM should be able to write to (and delete) objects in that bucket, but not any other buckets in your project.
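If you prefer bucket-level IAM to ACLs, a roughly equivalent grant would be the following sketch (the service account address and bucket name are placeholders); objectAdmin lets the account read, write, and delete objects in that one bucket without letting it create or delete buckets:
# Grant the service account object read/write/delete on just this bucket.
gsutil iam ch serviceAccount:your-service-account@your-project.iam.gserviceaccount.com:objectAdmin gs://yourbucket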

How to use Service Accounts with gsutil, for downloading from CS - DCM Google private owned bucket

A project and a Google Group have been set up for controlling data access following the DCM guide: https://support.google.com/dcm/partner/answer/3370481?hl=en-GB&ref_topic=6107456
The project does not contain the bucket I want to access (under Storage -> Cloud Storage), since it's a Google-owned bucket to which I only have read-only access. I can see the bucket in my browser since I am allowed to with my Google account (I am a member of the ACL).
I used the gsutil tool to configure the service account of the project that was linked with the private bucket using
gsutil config -e
but when I try to access that private bucket with
gsutil ls gs://<bucket_name>
I always get 403 errors, and I don't know why that is. Has anyone tried this before? Any ideas are welcome.
Since the bucket is private and in project A, service accounts in your project (project B) will not have access. The service account for your project (project B) would need to be added to the ACL for that bucket.
Note that since you can access this bucket with read access as a user, you can run gsutil config to grant your user credentials to gsutil and use that to read the bucket.
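For example, to switch gsutil from the service account to your user credentials (the bucket name is a placeholder, as in the question):
# Run the OAuth flow for your own Google account instead of the service account.
gsutil config
# List the private bucket using your user credentials.
gsutil ls gs://<bucket_name>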

GCS bucket permission for Project

I use a GCS bucket to upload content. I am distributing a script to my users that helps them download the content in the GCS bucket to their local directory. Each of the users is also a GCP project owner.
How do I set permissions in GCS bucket to enable only selected GCP projects to access the contents?
Thanks
In order to give users in a different project access to a Cloud Storage bucket, you can edit the bucket permissions and add the "Owners, Editors or Viewers" of that project to the bucket ACLs. For example, if you want the owners of project A to access a bucket in project B with full permissions, you can change the permissions of the bucket and add project A's "Project owners" as an entry. By doing that, all the owners of project A will have full control of the bucket in project B.
Note: if you change the ACLs of the bucket, the changes will only apply to new objects uploaded to the bucket. Objects already in the bucket will still have their old ACLs.
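As a sketch of the same grant with gsutil (the project number and bucket name are placeholders):
# Give project A's owners OWNER permission on the bucket in project B.
gsutil acl ch -p owners-123456789012:O gs://your-bucket-in-project-b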
You can read more about bucket and object ACLs at this link [1]:
[1] https://cloud.google.com/storage/docs/accesscontrol