gcloud auth for all users on a server

I am trying to set up gcloud auth login for an account on a server so that it covers all users.
i.e.
I log in using an administrator account and issue the command, e.g.
gcloud auth login auser@anemail.com
I go through the steps required, and when I issue the gcloud auth list command I get the right result.
But other users cannot see it.
i.e. we use SAP Data Services, which uses a proxy account on the server when it is running, e.g.
proxyaccount@mail.com
but that user cannot see the authorized user I set up with the administrator account.
I get the error "You do not currently have an active account selected".
The "other" accounts do not have administration access nor do we want them to, and besides I don't want to have to go through this process for each and every account that connects to the server.
Ian

Each user gets its own gcloud configuration folder. You can see which configuration folder gcloud is using by running gcloud info.
Note that if your server is a VM on GCP you do not need to configure credentials at all; they are obtained from the VM's metadata server.
Sharing user credentials is not good practice. If you must do this, your users can set the CLOUDSDK_CONFIG environment variable to point to one shared configuration folder. You should also at least use a service account for this purpose and activate it via gcloud auth activate-service-account, instead of using credentials obtained via gcloud auth login.
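For example, a minimal sketch of the shared setup on a Linux server (the configuration folder and key file paths are hypothetical, not from the original answer):
# Every user points at the same configuration folder, e.g. via /etc/profile.d/gcloud.sh
export CLOUDSDK_CONFIG=/opt/gcloud-shared-config
# Activate a service account once; all users sharing CLOUDSDK_CONFIG will then see it
gcloud auth activate-service-account --key-file=/opt/keys/shared-sa.json
gcloud auth list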

Related

Will gcloud use my SSH key to log in, or will I always need to log in via the web?

I am trying to perform a very basic command like:
gcloud compute machine-types list
And I get this error:
ERROR: (gcloud.compute.machine-types.list) There was a problem refreshing your current auth tokens: invalid_grant: Bad Request
Please run:
It tells me to log in using 'gcloud auth login', which opens up the browser.
Is it possible to use an SSH key to skip this authentication process, or do I have to do this every time? Are SSH keys for accessing compute instances only?
Just trying to understand what SSH keys are used for and how this web-based authorization fits into the picture here.
Generally, you authenticate to gcloud (and GCP services) using credentials from a Google (often Gmail) account. Such accounts use three-legged OAuth, and this requires the browser prompt so the human can confirm the scopes etc.
If you haven't already, confirm the prompt, copy the token provided, and paste it back into gcloud; after that, auth occurs transparently.
This process is different than SSH'ing to Compute Engine instances.
When you run gcloud compute machine-types list, you're authenticating (and being authorized) by Google Cloud Platform to invoke (meta)services.
When you run gcloud compute ssh ..., the command uses ssh to connect you to the (Linux) instance.
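To illustrate the distinction (the instance name below is a placeholder):
gcloud compute machine-types list   # calls the Compute Engine API using your OAuth credentials
gcloud compute ssh my-instance      # connects to the instance itself over SSH, using SSH keys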
NOTE: gcloud auth login --no-launch-browser is available too (link). This still requires you to separately launch a browser and complete the process, but the command itself doesn't launch the browser.
If you are trying to automate some sort of service that runs cloud commands on demand, without an operator or browser involved, your best bet is to create a Service Account for that task, get the key for that account, and activate it using:
gcloud auth activate-service-account --key-file=my-service-account-key-file.json
If this service runs on Google Cloud Platform, you don't even need to deal with the key. Just associate the service account with the instance you are running.
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances
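For completeness, attaching a service account to an existing instance can be done with something like the following sketch (the instance, zone, and account names are placeholders; the instance must be stopped first):
gcloud compute instances set-service-account my-instance \
    --zone=us-central1-a \
    --service-account=my-sa@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform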

Recovering access after initially provisioning wrong scopes for an instance

I recently created a VM, but mistakenly gave the default service account Storage: Read Only permissions instead of the intended Read Write under "Identity & API access", so GCS write operations from the VM are now failing.
I realized my mistake, so following the advice in this answer, I stopped the VM, changed the scope to Read Write and started the VM. However, when I SSH in, I'm still getting 403 errors when trying to create buckets.
$ gsutil mb gs://some-random-bucket
Creating gs://some-random-bucket/...
AccessDeniedException: 403 Insufficient OAuth2 scope to perform this operation.
Acceptable scopes: https://www.googleapis.com/auth/cloud-platform
How can I fix this? I'm using the default service account, and don't have the IAM permissions to be able to create new ones.
$ gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* (projectnum)-compute@developer.gserviceaccount.com
I suggest you try adding the "cloud-platform" scope to the instance by running the gcloud command below:
gcloud alpha compute instances set-scopes INSTANCE_NAME [--zone=ZONE] [--scopes=[SCOPE,…]] [--service-account=SERVICE_ACCOUNT]
As the scope, use "https://www.googleapis.com/auth/cloud-platform", since it gives full access to all Google Cloud Platform resources.
Here is the gcloud documentation.
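For example, a hypothetical invocation (the instance name and zone are placeholders; the instance must be stopped before its scopes can be changed):
gcloud alpha compute instances set-scopes my-instance \
    --zone=us-central1-a \
    --scopes=https://www.googleapis.com/auth/cloud-platform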
Try creating the Google Cloud Storage bucket with your user account.
Type gcloud auth login and open the link you are given; once there, copy the code and paste it into the command line.
Then do gsutil mb gs://bucket-name.
The security model has two things at play: API scopes and IAM permissions. Access is determined by the AND of them, so you need both an acceptable scope and sufficient IAM privileges to perform a given action.
API scopes are bound to the credentials. They are represented by a URL such as https://www.googleapis.com/auth/cloud-platform.
IAM permissions are bound to the identity. These are set up in the Cloud Console's IAM & admin > IAM section.
This means you can have two VMs running as the same default service account but with different levels of access.
For simplicity, you generally want to just set the IAM permissions and use the cloud-platform API auth scope.
To check if you have this setup go to the VM in cloud console and you'll see something like:
Cloud API access scopes
Allow full access to all Cloud APIs
When you SSH into the VM, gcloud is by default logged in as the VM's service account. I'd discourage logging in as yourself there; otherwise you more or less override gcloud's configuration to use the default service account.
Once you have this setup you should be able to use gsutil properly.
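As a quick sanity check (a suggestion, not from the original answer), from inside the VM you can ask the metadata server which scopes the attached service account actually has:
curl -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"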

Google Cloud credentials totally hosed after attempting to set up boto

I had a gcloud user authenticated and was running gsutil fine from the command line (Windows 8.1). But I needed to access gsutil from a Python application, so I followed the instructions here:
https://cloud.google.com/storage/docs/xml-api/gspythonlibrary#credentials
I got as far as creating a .boto file, but now not only does my Python code fail (boto.exception.NoAuthHandlerFound: No handler was ready to authenticate.), I can't run gsutil from the command line any more. I get this error:
C:\>gsutil ls
You are attempting to access protected data with no configured
credentials. Please visit https://cloud.google.com/console#/project
and sign up for an account, and then run the "gcloud auth login"
command to configure gsutil to use these credentials.
I have run gcloud auth login and it appears to work; I can list my accounts:
C:\>gcloud auth list
Credentialed Accounts:
- XXXserviceuser@XXXXX.iam.gserviceaccount.com ACTIVE
- myname@company.name
To set the active account, run:
$ gcloud config set account `ACCOUNT`
I have tried both with the account associated with my email active, and the new serveruser account (created following instructions above). Same "protected data with no configured credentials." error. I tried removing the .boto file, and adding the secret CLIENT_ID and CLIENT_SECRET to my .boto file.
Anyone any ideas what the issue could be?
So I think the latest documentation/examples showing how to use (and authenticate to) Google Cloud Storage via Python are in this repo:
https://github.com/GoogleCloudPlatform/python-docs-samples/tree/master/storage/api
That just works for me without messing around with keys and service users.
It would be nice if there were a comment somewhere in the old gspythonlibrary docs pointing this out.
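As a hedged follow-up: the samples in that repo generally rely on application default credentials, so if your Cloud SDK includes the application-default command group, establishing those credentials once from the shell should let the Python client pick them up:
gcloud auth application-default login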

How can I create a signed URL for Google Cloud Storage with a project level service account?

For every Google Compute instance, there is a default service account like this:
1234567890123-compute@developer.gserviceaccount.com
I can create my instance with the proper scope (i.e. https://www.googleapis.com/auth/devstorage.full_control) and use this account to make API requests.
On this page: https://cloud.google.com/storage/docs/authentication#service_accounts it says:
Every project has a service account associated with it, which may be used for authentication and to enable advanced features such as Signed URLs and browser uploads using POST.
This implies that I can use this service account to create Signed URLs. However, I have no idea how to create a signed URL with this service account, since I can't seem to get the private key (.p12 file) associated with it.
I can create a new, separate service account from the developer console, and that has the option of downloading a .p12 file for signing, but the project level service accounts do not appear under the "APIs and auth / Credentials" section. I can see them under "Project / Permissions", but I can't do anything with them there.
Am I missing some other way to retrieve the private key for these default accounts, or is there no way to sign urls when using them?
You can use the p12 key of any of your service accounts while you're authenticated through your main account, a GCE service account, or another service account that has appropriate permissions on the bucket and the file.
In this case, just create a service account, download its p12 key, and use the following command to sign your URL:
$ gsutil signurl -d 10m privatekey.p12 gs://bucket/foo
You can also authenticate as a different service account using the following command:
gcloud auth activate-service-account service-account-email --key-file key.p12
You can list and switch your accounts using these commands:
$ gcloud auth list
$ gcloud config set account
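Putting the two together, a hypothetical end-to-end flow (the account, key file, bucket, and object names are placeholders):
gcloud auth activate-service-account signer@my-project.iam.gserviceaccount.com --key-file=key.p12
gsutil signurl -d 10m key.p12 gs://my-bucket/my-object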

gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE

I am logged in to a GCE instance via SSH. From there I would like to access the Storage with the help of a Service Account:
GCE> gcloud auth list
Credentialed accounts:
- 1234567890-compute@developer.gserviceaccount.com (active)
I first made sure that this service account is flagged "Can edit" in the permissions of the project I am working in. I also made sure to give it the Write ACL on the bucket to which I would like it to copy a file:
local> gsutil acl ch -u 1234567890-compute@developer.gserviceaccount.com:W gs://mybucket
But then the following command fails:
GCE> gsutil cp test.txt gs://mybucket/logs
(I also made sure that "logs" is created under "mybucket").
The error message I get is:
Copying file://test.txt [Content-Type=text/plain]...
AccessDeniedException: 403 Insufficient Permission 0 B
What am I missing?
One other thing to look for is to make sure you set up the appropriate scopes when creating the GCE VM. Even if a VM has a service account attached, it must be assigned devstorage scopes in order to access GCS.
For example, if you had created your VM with devstorage.read_only scope, trying to write to a bucket would fail, even if your service account has permission to write to the bucket. You would need devstorage.full_control or devstorage.read_write.
See the section on Preparing an instance to use service accounts for details.
Note: the default compute service account has very limited scopes (including read-only access to GCS). This is done because the default service account has Project Editor IAM permissions. If you use a user-created service account this is typically not a problem, since user-created service accounts get all scope access by default.
After adding the necessary scopes to the VM, gsutil may still be using cached credentials which don't have the new scopes. Delete ~/.gsutil before trying the gsutil commands again. (Thanks to @mndrix for pointing this out in the comments.)
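One way to confirm which scopes your current credentials actually carry (a diagnostic sketch, not part of the original answer) is Google's tokeninfo endpoint:
curl "https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=$(gcloud auth print-access-token)"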
You have to log in with an account that has the permissions you need for that project:
gcloud auth login
gsutil config -b
Then browse to the URL it provides, click Allow, and copy the verification code back into the terminal.
Stop the VM.
Go to VM instance details.
Under "Cloud API access scopes", select "Allow full access to all Cloud APIs", then click "Save".
Restart the VM and delete ~/.gsutil.
I have written this as an answer since I cannot post comments:
This error can also occur if you're running the gsutil command with a sudo prefix in some cases.
After you have created the bucket, go to the Permissions tab and add your email with the Storage Admin permission.
Access the VM instance via SSH, run gcloud auth login, and follow the steps.
Ref: https://groups.google.com/d/msg/gce-discussion/0L6sLRjX8kg/kP47FklzBgAJ
So I tried a bunch of things trying to copy from GCS bucket to my VM.
Hope this post helps someone.
Via an SSHed connection, I ran this command:
sudo gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION_IN_LOCAL]
Got this error:
AccessDeniedException: 403 Access Not Configured. Please go to the Google Cloud Platform Console (https://cloud.google.com/console#/project) for your project, select APIs and Auth and enable the Google Cloud Storage JSON API.
What fixed this was following the "Activating the API" section mentioned in this link:
https://cloud.google.com/storage/docs/json_api/
Once I activated the API, I authenticated myself in the SSHed window via
gcloud auth login
Following the authentication procedure, I was finally able to download from the Google Storage bucket to my VM.
PS
I did make sure to:
1. Make sure that gsutil is installed on my VM instance.
2. Go to my bucket's Permissions tab and add the desired service accounts with the Storage Admin permission/role.
3. Make sure my VM had the proper Cloud API access scopes:
From the docs:
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances#changeserviceaccountandscopes
You need to first stop the instance -> go to the edit page -> go to "Cloud API access scopes" and choose "storage full access or read/write or whatever you need it for".
Changing the service account and access scopes for an instance: If you want to run the VM as a different identity, or you determine that the instance needs a different set of scopes to call the required APIs, you can change the service account and the access scopes of an existing instance. For example, you can change access scopes to grant access to a new API, or change an instance so that it runs as a service account that you created, instead of the Compute Engine default service account.
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance. Use one of the following methods to change the service account or access scopes of the stopped instance.
Change the permissions of the bucket: add the "allUsers" member and grant it "Storage Admin" access.
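For reference, the equivalent gsutil command would be something along these lines (the bucket name is a placeholder; note that this grants public access to the bucket):
gsutil iam ch allUsers:roles/storage.admin gs://my-bucket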