I have a couple of Cloud Storage buckets I want to access programmatically via the API, but the buckets are visible only to members of a Google group. (They're for DoubleClick Bid Manager reports, if that helps.) I provided the name and group email address of a group that I made specifically to get access to these Cloud Storage buckets.
Looking around in my individual Developer Console, I can get the authentication pieces needed to access my own Google account's buckets. But when I use the credentials from my Developer Console project, hoping it grants me access based on which account I'm using (which I'm assuming is totally wrong), with this code:
require_once 'vendor/autoload.php'; // google/apiclient (v1)

// Service-account (assertion) credentials built from the .p12 key
$key = file_get_contents("[PATH_TO_KEY].p12");
$cred = new Google_Auth_AssertionCredentials(
    "[ACCOUNT STUFF]",
    array("https://www.googleapis.com/auth/devstorage.full_control"),
    $key
);

$client = new Google_Client();
$client->setAssertionCredentials($cred);

// Web-flow credentials from the Developer Console project
$client->setClientId($client_id);
$client->setClientSecret($client_secret);
$client->setDeveloperKey($dev_api_key);
$client->setRedirectUri($redirect);
$client->setScopes("https://www.googleapis.com/auth/devstorage.full_control");

$service = new Google_Service_Storage($client);
$bucket = $service->objects->listObjects($da_bucket);
I get a 403. (I might also be authenticating wrong; sorry, I'm trying to POC something here.)
Is there some place where I can gain API access to a bucket that's accessible to all members of a Google group, but not to me specifically?
The service account associated with that .p12 key needs to be a member of the group that can read the bucket.
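For example, once the service account's email address is a member of that group (or is granted access on the bucket directly), a plain server-to-server flow is enough; no client ID, client secret, developer key, or redirect URI is needed. A minimal sketch using the same legacy google-api-php-client classes as in the question, with placeholder paths and names:

require_once 'vendor/autoload.php'; // google/apiclient (v1)

// Placeholder service-account email and key path
$serviceAccountEmail = 'my-service-account@my-project.iam.gserviceaccount.com';
$key = file_get_contents('/path/to/key.p12');

$cred = new Google_Auth_AssertionCredentials(
    $serviceAccountEmail,
    array('https://www.googleapis.com/auth/devstorage.read_only'),
    $key
);

$client = new Google_Client();
$client->setAssertionCredentials($cred);

$service = new Google_Service_Storage($client);
$objects = $service->objects->listObjects('my-bucket'); // placeholder bucket

foreach ($objects->getItems() as $object) {
    echo $object->getName(), PHP_EOL;
}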
We’ve been trying to understand if Google Cloud can access data stored in Google Storage when using Google-managed encryption keys.
We want to understand if Google potentially has access to the data stored. If yes, is there a way to restrict such access?
Yes, Google can. No, you cannot restrict Google.
Google publishes data-policy documents on its website covering how, when, and if they access your data. Data access is logged, so you can see such accesses, and there is a process requiring approval; a Google employee cannot just poke around in your data. As with most legal documents, you must read them to understand the details and conditions.
Start with this privacy page:
Privacy Resource Center
Is there a way to do a PUT or POST into a Google Storage bucket with an API key
In the API Explorer there is the ability to test this out with OAuth and the API key, but the explorer doesn't allow me to use the API key alone.
Is this possible?
PUT https://www.googleapis.com/storage/v1/b/bucket/o/object?key="InsertSomeKey"
OR
https://www.googleapis.com/upload/storage/v1/b/bucket/o?uploadType=media&name=testobject&key="InsertSomeKey"
Not exactly. First of all, you need to authenticate as an account that has access to the bucket, such as a user account or a service account; only then can you make the request. This is the API reference:
https://cloud.google.com/storage/docs/json_api/v1/
I'd suggest doing it in code, using the client libraries:
https://cloud.google.com/storage/docs/reference/libraries
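For instance, once you're authenticated with a service account, the media upload above maps onto the google/cloud-storage client library like this; a minimal sketch with placeholder bucket, object, and key-file names:

require 'vendor/autoload.php'; // composer require google/cloud-storage

use Google\Cloud\Storage\StorageClient;

// Placeholder key file; any credentials with write access to the bucket work
$storage = new StorageClient([
    'keyFilePath' => '/path/to/service-account.json'
]);

$bucket = $storage->bucket('bucket'); // placeholder bucket name

// Equivalent of POST .../upload/storage/v1/b/bucket/o?uploadType=media&name=testobject
$object = $bucket->upload(
    fopen('/path/to/local/file.txt', 'r'),
    ['name' => 'testobject']
);

echo 'Uploaded ' . $object->name() . PHP_EOL;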
Tried sharing a bucket with a colleague
Initially I added the "Storage.Object.Viewer" role, and sent the link https://console.cloud.google.com/storage/browser/bucket_name/
However on opening the link the following error was received:
You need the storage.objects.list permission to list objects in this
bucket. Ask a project or bucket owner to give you this permission and
try again.
I added more roles, and finally gave admin rights, but kept getting the same error.
How can I share a bucket with all its files? Specifically, I would like to share it with read-only permissions.
Although a solution has been discovered for this issue, I'm going to summarise some relevant information which may be useful to someone who stumbles on something similar.
Project information isn't required in requests to storage buckets because bucket names must be globally unique on Google Cloud Platform. If you specify a bucket name in any request, the request points to the correct bucket no matter which project it resides in, so permissions for a given user to access that bucket must have been set up in some capacity.
To allow users to list objects in a bucket, they must be assigned a role with the storage.objects.list permission. Minimal roles that allow listing objects in buckets include:
Storage Object Viewer
Allows users to view objects and their metadata (except ACLs) and to list the objects in a bucket.
Project Viewer
This role also provides users permission to view other resources in the project. In terms of Cloud Storage, users can list buckets. They can also view bucket metadata (excluding ACLs) when listing. This role can only be applied to a project.
There are other storage-specific roles which allow users to list objects in buckets while also granting other permissions, for example to edit, create, or delete objects. They include:
Storage Object Admin
Users have full control over objects, including listing, creating, viewing, and deleting objects.
Storage Admin
Users have full control of buckets and objects.
For more information on Cloud Storage IAM roles, see https://cloud.google.com/storage/docs/access-control/iam-roles
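As a concrete illustration, assigning the Storage Object Viewer role on a single bucket can also be done programmatically. A sketch using the google/cloud-storage library, with a placeholder bucket name and member email; the caller needs permission to change the bucket's IAM policy:

require 'vendor/autoload.php'; // composer require google/cloud-storage

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient(); // uses Application Default Credentials
$bucket = $storage->bucket('bucket_name'); // placeholder

// Read the current bucket-level IAM policy and append a read-only binding
$policy = $bucket->iam()->policy();
$policy['bindings'][] = [
    'role' => 'roles/storage.objectViewer',
    'members' => ['user:colleague@example.com'] // placeholder
];
$bucket->iam()->setPolicy($policy);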
Assuming the Google account your colleague used to access the URL is the one you gave permissions to, you also need to grant the "Viewer" role at the project level; otherwise they won't be able to log in to the GCP console and access the bucket.
I have uploaded some test images to a Google Cloud Storage bucket, but don't want to make them public (which would be cheating). When I try to run a REST call to the Google Vision API I get:
{
  "responses": [
    {
      "error": {
        "code": 7,
        "message": "image-annotator::User lacks permission.: Can not open file: gs://images-translate-156512/P1011234.JPG"
      }
    }
  ]
}
What are the steps to enable the Google Vision API to access Google Cloud Storage objects within the same project? At the moment I am using only the API key while I experiment with Google Vision. I suspect a service account may be required, along with an ACL on the GCS objects.
I could bypass GCS altogether, base64-encode the image, and send it to the Google Vision API, but I really want to try and solve this use case. I haven't used ACLs or service accounts yet.
Any help appreciated
API keys are not used for authorization. They are used in situations where you want to specify which project should be billed for an operation and which project's quota should be used, but they do not authenticate you as any particular entity.
In order to use the Cloud Vision API with a non-public GCS object, you'll need to send OAuth authentication information along with your request for a user or service account which has permission to read the GCS object.
Authorization is a complex topic. I recommend using the gcloud command-line utility for ad-hoc experimentation or the Google Cloud client library for the language you're developing in.
Google has a guide for authorization here: https://cloud.google.com/docs/authentication
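For example, with a service account that can read the object, the same annotation request succeeds against the gs:// URI directly. A sketch using the google/cloud-vision client library; the key-file path is a placeholder:

require 'vendor/autoload.php'; // composer require google/cloud-vision

use Google\Cloud\Vision\V1\ImageAnnotatorClient;

// Placeholder key file for a service account with read access to the object
$annotator = new ImageAnnotatorClient([
    'credentials' => '/path/to/service-account.json'
]);

// The helper accepts a gs:// URI, so the image never needs to be
// base64-encoded and inlined in the request
$response = $annotator->labelDetection('gs://images-translate-156512/P1011234.JPG');

foreach ($response->getLabelAnnotations() as $label) {
    echo $label->getDescription() . PHP_EOL;
}

$annotator->close();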
We are integrating Filepicker.io with Google Cloud Storage (i.e. asking Filepicker to write the uploaded files to our Google Cloud Storage account). Their documentation is pretty clear; however, I found that I have to give the service account "Editor" access to the whole project, which is a security concern for us: it means that if somebody gets access to the access tokens used by Filepicker, they can do whatever they want with our Google Cloud project instead of just having access to the files. Trying to use more restrictive permissions (like "Storage Object Creator" + "Storage Object Viewer") makes Filepicker fail.
Has anyone managed to configure the Google Cloud Storage integration of Filepicker.io with something less than "Editor" access to the project?
Just wanted to note that in the meantime I implemented a workaround: I created a separate Google Cloud project and gave Filepicker "Storage Admin" access only to that one. Then I gave the accounts from the other projects permission to use the storage bucket in this "upload" project. This way, at least, any token leaked on Filepicker's end is limited to accessing this "upload" project.
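The second half of that workaround, granting accounts from the other projects access to the bucket in the "upload" project, is an ordinary cross-project IAM binding. A sketch with the google/cloud-storage library; project, bucket, and service-account names are placeholders:

require 'vendor/autoload.php'; // composer require google/cloud-storage

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient(['projectId' => 'upload-project']); // placeholder
$bucket = $storage->bucket('upload-bucket'); // placeholder

$policy = $bucket->iam()->policy();
$policy['bindings'][] = [
    'role' => 'roles/storage.objectAdmin',
    // Service account that lives in a *different* project:
    'members' => ['serviceAccount:app@other-project.iam.gserviceaccount.com']
];
$bucket->iam()->setPolicy($policy);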
I just tested this and it seems that the minimum required role to store to GCS using Filestack/Filepicker is either the "Project Editor" role, or the "Storage Admin" role. I will submit a feature request to allow more varied role options.