gcsfuse on two separate instances - google-cloud-storage

So I have two separate accounts I'm sharing a Cloud Storage bucket between. At first I had problems getting the credentials right, but eventually I just added all the email-looking accounts listed under IAM on the second account to the storage bucket's permissions. I gave those accounts all the roles, as I want to be able to read and write from both accounts' VM instances. At this point I can mount the bucket using gcsfuse, but I can't read or write to it. I can see the filesystem, but any time I try to copy to or from it I get an input/output error.
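For reference, the mount itself is done with a plain gcsfuse invocation along these lines (the bucket name and mount point are placeholders, not values from the question):

gcsfuse my-shared-bucket /mnt/shared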

Related

Unable to transfer GCS bucket from one account to another

I am trying to create a transfer job in Data Transfer, to copy all files in a bucket belonging to one account to an existing bucket belonging to another account.
I have access to both the source and destination buckets, and I get a "green light" in the wizard, but when I try to run the transfer job I get the following error message:
To complete this transfer, you need the 'storage.buckets.setIamPolicy'
permission for the source bucket. Ask the bucket's administrator to
grant you the required permission and try again.
I have tried to apply various roles to the user running the transfer job, but I can't figure out how to overcome this problem.
Can anyone help me on this?
The storage.buckets.setIamPolicy permission can be granted with either the roles/storage.legacyBucketOwner or the roles/iam.securityAdmin role. It may be needed so the transfer can preserve the permissions applied to the source objects.
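For example (a sketch only; the user email and bucket name are placeholders), the role could be granted on the source bucket with gsutil:

gsutil iam ch user:transfer-user@example.com:roles/storage.legacyBucketOwner gs://source-bucket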
Permissions for copying an object:
storage.objects.create (for the destination bucket)
storage.objects.delete (for the destination bucket)
storage.objects.get (for the source object)
storage.objects.getIamPolicy (for the source object)
storage.objects.setIamPolicy (for the destination bucket)
Please see:
Cloud IAM > Documentation > Understanding roles
Cloud Storage > Documentation > Reference > Cloud IAM roles

Data transfer between Google Storage buckets under different service accounts

I have two Google service credentials and a bucket on each account. I have to transfer files from one bucket to another. How can I do this programmatically?
Can I achieve this with two Storage client objects, or by using the Cloud Storage Transfer Service?
Yes, with the Storage Transfer Service you can create a transfer job and send the data to a destination bucket (in another project), but keep in mind that it is documented that:
To access the data source and the data sink, this service account must
have source permissions and sink permissions.
This means that you can't use two different service accounts; you will need to grant access to just one of the two service accounts you have.
If you want to transfer files from one bucket to another programmatically, you must first grant permission to the service account associated with the Storage Transfer Service so it can access the data sink (destination bucket); please follow these steps.
Please note that if you are not creating the transfer job in the same project where the source bucket is located, then you must also grant that service account permission to access the source bucket.
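As an illustration (a sketch only; the project number, role, and bucket name are placeholders based on the documented setup), the Storage Transfer Service's own service account could be given write access to the destination bucket with gsutil:

gsutil iam ch serviceAccount:project-PROJECT_NUMBER@storage-transfer-service.iam.gserviceaccount.com:roles/storage.legacyBucketWriter gs://destination-bucket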
With the Storage Transfer Service you can create a transfer job programmatically in Java or Python; the examples include creating the transfer job and checking the transfer operation status. Full code examples can be found for Java and Python.
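As a minimal sketch (assuming the google-api-python-client library; the project ID, bucket names, and schedule date below are placeholders, not values from the question), creating a one-off transfer job from Python could look like this:

import googleapiclient.discovery

# Build a client for the Storage Transfer API using application default credentials.
storagetransfer = googleapiclient.discovery.build('storagetransfer', 'v1')

transfer_job = {
    'description': 'Copy objects from one bucket to another',
    'status': 'ENABLED',
    'projectId': 'your-project-id',
    'transferSpec': {
        'gcsDataSource': {'bucketName': 'source-bucket'},
        'gcsDataSink': {'bucketName': 'destination-bucket'},
    },
    # Using the same start and end date makes the job run once.
    'schedule': {
        'scheduleStartDate': {'year': 2021, 'month': 1, 'day': 1},
        'scheduleEndDate': {'year': 2021, 'month': 1, 'day': 1},
    },
}

result = storagetransfer.transferJobs().create(body=transfer_job).execute()
print('Created transfer job:', result.get('name'))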

Access google cloud data without GCP account

I have created a bucket and files in it with Google Cloud Storage. I have also edited the permissions of the bucket to allow access to people within a Google Group.
Now, if they need to access the data, do they need to "sign up" for Google Cloud Platform?
Is there any way they can copy all the files in the bucket using gsutil without a GCP account?
No, there isn't a way to allow access to a bucket without an account, but Google accounts and GCP accounts are the same, so anyone with a Gmail account can be granted access.
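For example (a sketch; the group and bucket names are placeholders), read access for the whole group could be granted with gsutil:

gsutil iam ch group:your-group@googlegroups.com:objectViewer gs://your-bucket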
The closest thing to your use case is Signed URLs, which can grant access to individual objects.
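As a sketch, a time-limited signed URL for a single object can be generated with gsutil and a service account key (the key path, duration, bucket, and object name are placeholders):

gsutil signurl -d 10m /path/to/service-account-key.json gs://your-bucket/your-object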

working with Google Cloud Storage without gsutil

I have developed a piece of software that saves files to configured directories. I run it on Linux, and the directories are specified in a config file.
I would like to use Compute Engine nodes because I need to increase performance. Therefore, I would like to use Google Cloud Storage to save these files in a shared repository.
[1] shows how to mount a bucket as a file system. I tried it, but with no success: I get an authentication error.
Can anyone help me get access to my bucket from my Compute Engine nodes?
[1] https://cloud.google.com/compute/docs/disks/gcs-buckets
Best regards,
It sounds like you did not start your GCE instance with a service account.
According to the docs you linked, you need to configure a service account or run gcloud auth login to configure your credentials for accessing cloud storage.
If you are trying to set up gcsfuse without running on GCE you will need to use the gcloud auth login approach.
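A minimal sketch of the non-GCE setup described above (the bucket name, mount point, and key path are placeholders; depending on the gcsfuse version, user credentials may need to be set up with gcloud auth application-default login rather than gcloud auth login):

gcloud auth application-default login
gcsfuse my-bucket /mnt/my-bucket

or, with an explicit service account key file:

gcsfuse --key-file=/path/to/service-account.json my-bucket /mnt/my-bucket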

Right way of using Google Storage on a GCE VM

I want to know the right/best way to have one machine copy data to Google Storage.
I need one machine to be able to write to a bucket, but not be able to create or delete other buckets.
While researching, I found out that you should create a service account so that this account can log in to GCP and then use the storage.
But the problem is that when the machine is a GCE instance, there are scopes. With the "Default" scope it can read from Google Storage, but cannot write to it, even after authenticating with a service account.
With the devstorage.read_write scope, the machine can create and remove buckets in that storage without logging in. I find that too risky.
Does anyone have any recommendations?
Thanks
The core problem here is that the "write" scope covers both write and delete, and that the GCE service account is likely a member of project-editors, which can create and delete buckets. It sounds like what you want to do is restrict a service account to only being able to affect a single bucket. You should be able to do this with these steps:
Create a service account in your project (and save the private key file).
In the permissions page for the project, make sure that service account is not a project editor for your project.
Using an account that does have full permissions to your project, create the bucket, then grant the service account write access to the bucket. Example gsutil commands to do this:
gsutil mb gs://yourbucket
gsutil acl ch -u your-service-account-name@gserviceaccount.com:W gs://yourbucket
Create a VM that does not have a GCE service account enabled.
Push the service account's private key file to that VM.
On the VM, gcloud auth activate-service-account --key-file=your-key-file.json
Now gsutil commands run on the VM should be able to write to (and delete) objects in that bucket, but not any other buckets in your project.
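As a quick sanity check (the file name is a placeholder), object writes to the permitted bucket should succeed, while project-level operations such as creating a new bucket should fail:

gsutil cp local-file.txt gs://yourbucket/        # should succeed
gsutil mb gs://some-other-new-bucket             # should fail with an access denied error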