PowerShell for Google Cloud: Authenticate with a service account

I'm trying to build an automatic sync solution that uses a Google Cloud storage bucket for storing data.
When I install the Cloud SDK it asks me to authenticate, but obviously I don't want to use my own credentials on the client's server; it should be done with a service account with specific permissions, right?
The documentation just says to authenticate with your credentials. What is the security best practice here?

Found it, it's this simple command:
gcloud auth activate-service-account --key-file=credentials.json
And it works! I can upload files with PowerShell.
The documentation is here.
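For anyone landing here, a minimal sketch of what the upload step can look like from PowerShell after activating the service account (the bucket and file names below are placeholders, not from the original setup):
# copy a local file into the bucket with the Cloud SDK's gsutil (works from PowerShell)
gsutil cp C:\data\export.csv gs://my-sync-bucket/export.csv
# list the bucket contents to confirm the upload
gsutil ls gs://my-sync-bucket/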

Related

How to authenticate to the Google Cloud API from Java the same way I authenticated with the gcloud CLI

Using the gcloud command line I can do the following operation:
gcloud builds describe 74f859e9-d621-4632-b6dd-XXXXXXXX
However, I wish to use the Google Cloud API from Java. As I understand it, the gcloud CLI is not using a service account; it is using a user account. How can I use the same authentication from the Google Cloud Java API to perform this same operation and describe a build?
Google provides decent documentation that explains how to use its SDKs (Client Libraries) with all of its services.
Here's the Cloud Build client libraries documentation. Pick your preferred language and go.
If you can't use one of Google's SDKs, then you can write code directly against the underlying API. Google's APIs Explorer is an excellent tool for navigating all of Google's services. Here's Cloud Build and projects.builds.get, which I think (!?) maps to gcloud builds describe. You can confirm that by running gcloud builds describe --log-http to see which underlying calls are made.
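For example, a sketch of calling projects.builds.get directly (the project and build IDs are placeholders), reusing your existing gcloud credentials for the bearer token:
# GET the build resource via the Cloud Build REST API
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://cloudbuild.googleapis.com/v1/projects/MY_PROJECT/builds/BUILD_ID"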
Code that doesn't access user data (data owned by a user account) should run as a service account. Code that accesses user data or operates on behalf of a user should use the OAuth flow for that user with an OAuth client ID. The latter is what gcloud does: as a program operating on behalf of users, it authenticates you (the user) with a regular OAuth flow, but it operates using an OAuth client ID against a hidden backing project. Your code should probably just run as a service account.
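A minimal sketch of that, assuming a service account named build-reader (hypothetical) with a downloaded key file; the Google Cloud Java client libraries pick up Application Default Credentials automatically, so pointing the standard environment variable at the key is enough:
# grant the service account a role that can read builds
gcloud projects add-iam-policy-binding MY_PROJECT \
    --member="serviceAccount:build-reader@MY_PROJECT.iam.gserviceaccount.com" \
    --role="roles/cloudbuild.builds.viewer"
# point Application Default Credentials at the key file; the Java client uses it automatically
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/build-reader-key.json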

Will gcloud use my SSH key to log in, or will I always need to log in via the web?

I am trying to perform a very basic command like:
gcloud compute machine-types list
And I get this error:
ERROR: (gcloud.compute.machine-types.list) There was a problem refreshing your current auth tokens: invalid_grant: Bad Request
Please run:
It tells me to log in using 'gcloud auth login', which opens up the browser.
Is it possible to use an SSH key to skip this authentication process, or do I always have to do this? Are SSH keys only for accessing compute instances?
I'm just trying to understand what SSH keys are used for and how this web-based authorization fits into the picture here.
Generally, you authenticate to gcloud (and GCP services) using credentials from a Google (often Gmail) account. Such accounts use 3-legged OAuth, and this requires the browser prompt so the human can confirm the scopes etc.
If you haven't already, confirm the prompt, copy the verification code provided, and paste it back into gcloud; after that, auth occurs transparently.
This process is different than SSH'ing to Compute Engine instances.
When you run gcloud compute machine-types list, you're authenticating (and being authorized) by Google Cloud Platform to invoke (meta)services.
When you run gcloud compute ssh ..., the command uses ssh to connect you to the (Linux) instance.
NOTE: gcloud auth login --no-launch-browser is available too (link). It still requires you to separately open a browser and complete the process, but it doesn't launch the browser directly from the command.
If you are trying to automate some sort of service that runs cloud commands on demand, without an operator or browser involved, your best bet is to create a service account for that task, get a key for that account, and activate it using
gcloud auth activate-service-account --key-file=my-service-account-key-file.json
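The key file referenced above can be produced like this; the service-account name below is just an illustration, so adjust it (and the IAM roles you grant it) for your actual task:
# create a dedicated service account for the automated task
gcloud iam service-accounts create sync-task --display-name="Automated sync task"
# download a JSON key for it (grant it the IAM roles your task needs separately)
gcloud iam service-accounts keys create my-service-account-key-file.json \
    --iam-account=sync-task@MY_PROJECT.iam.gserviceaccount.com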
If this service runs on Google Cloud Platform, you don't even need to deal with the key; just associate the service account with the instance you are running.
https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances

Recovering access after initially provisioning wrong scopes for an instance

I recently created a VM, but mistakenly gave the default service account Storage: Read Only permissions instead of the intended Read Write under "Identity & API access", so GCS write operations from the VM are now failing.
I realized my mistake, so following the advice in this answer, I stopped the VM, changed the scope to Read Write and started the VM. However, when I SSH in, I'm still getting 403 errors when trying to create buckets.
$ gsutil mb gs://some-random-bucket
Creating gs://some-random-bucket/...
AccessDeniedException: 403 Insufficient OAuth2 scope to perform this operation.
Acceptable scopes: https://www.googleapis.com/auth/cloud-platform
How can I fix this? I'm using the default service account, and don't have the IAM permissions to be able to create new ones.
$ gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* (projectnum)-compute@developer.gserviceaccount.com
I suggest you try adding the "cloud-platform" scope to the instance by running the gcloud command below:
gcloud alpha compute instances set-scopes INSTANCE_NAME [--zone=ZONE]
    [--scopes=[SCOPE,…]] [--service-account=SERVICE_ACCOUNT]
As the scope, use "https://www.googleapis.com/auth/cloud-platform", since it gives full access to all Google Cloud Platform resources.
Here is the gcloud documentation.
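For example, a sketch with placeholder instance and zone names (note that the instance must be stopped before its scopes can be changed):
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud alpha compute instances set-scopes my-vm --zone=us-central1-a \
    --scopes=https://www.googleapis.com/auth/cloud-platform
gcloud compute instances start my-vm --zone=us-central1-a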
Try creating the Google Cloud Storage bucket with your user account.
Type gcloud auth login and open the link you are provided; once there, copy the code and paste it into the command line.
Then do gsutil mb gs://bucket-name.
The security model has two things at play: API scopes and IAM permissions. Access is determined by the AND of them, so you need an acceptable scope and sufficient IAM privileges in order to perform a given action.
API scopes are bound to the credentials. They are represented by a URL, like https://www.googleapis.com/auth/cloud-platform.
IAM permissions are bound to the identity. These are setup in the Cloud Console's IAM & admin > IAM section.
This means you can have 2 VMs with the default service account but both have different levels of access.
For simplicity you generally want to just set the IAM permissions and use the cloud-platform API auth scope.
To check if you have this set up, go to the VM in the Cloud Console and you'll see something like:
Cloud API access scopes
Allow full access to all Cloud APIs
When you SSH into the VM, gcloud will by default be logged in as the VM's service account. I'd discourage logging in as yourself there, since that more or less breaks gcloud's configuration of reading the default service account.
Once you have this set up, you should be able to use gsutil properly.
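If you want to double-check from inside the VM which scopes its token actually carries, the standard Compute Engine metadata endpoint reports them (nothing below is specific to this particular setup):
# query the metadata server for the scopes granted to the default service account
curl -H "Metadata-Flavor: Google" \
    "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"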

Google Cloud Storage 500 Internal Server Error 'Google::Cloud::Storage::SignedUrlUnavailable'

Trying to get Google Cloud Storage working on my app. I successfully saved an image to a bucket, but when trying to retrieve the image, I receive this error:
GCS Storage (615.3ms) Generated URL for file at key: 9A95rZATRKNpGbMNDbu7RqJx ()
Completed 500 Internal Server Error in 618ms (ActiveRecord: 0.2ms)
Google::Cloud::Storage::SignedUrlUnavailable (Google::Cloud::Storage::SignedUrlUnavailable):
Any idea of what's going on? I can't find an explanation for this error in their documentation.
To provide some explanation here...
Google App Engine (as well as Google Compute Engine, Kubernetes Engine, and Cloud Run) provides "ambient" credentials associated with the VM or instance being run, but only in the form of OAuth tokens. For most API calls, this is sufficient and convenient.
However, there are a small number of exceptions, and Google Cloud Storage is one of them. Recent Storage clients (including the google-cloud-storage gem) may require a full service account key to support certain calls that involve signed URLs. This full key is not provided automatically by App Engine (or other hosting environments). You need to provide one yourself. So as a previous answer indicated, if you're using Cloud Storage, you may not be able to depend on the "ambient" credentials. Instead, you should create a service account, download a service account key, and make it available to your app (for example, via the ActiveStorage configs, or by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable).
I was able to figure this out. I had been following Rails' guide on Active Storage with Google Cloud Storage, and was unclear on how to generate my credentials file.
google:
service: GCS
credentials: <%= Rails.root.join("path/to/keyfile.json") %>
project: ""
bucket: ""
Initially, I thought I didn't need a keyfile due to this sentence in Google's Cloud Storage authentication documentation:
If you're running your application on Google App Engine or Google
Compute Engine, the environment already provides a service account's
authentication information, so no further setup is required.
(I am using Google App Engine)
So I commented out the credentials line and started testing. Strangely, I was able to write to Google Cloud Storage without issue. However, when retrieving the image I would receive the 500 server error Google::Cloud::Storage::SignedUrlUnavailable.
I fixed this by generating my private key and adding it to my rails app.
Another possible solution, as of google-cloud-storage gem version 1.27 in August 2020, is documented here. Calling Google::Auth.get_application_default as in the documentation returned an empty object for me, but using Google::Cloud::Storage::Credentials.default.client instead worked.
If you get a Google::Apis::ClientError: badRequest: Request contains an invalid argument response when signing, check that you have a dash for the project in the signing URL (i.e. projects/-/serviceAccounts; an explicit project name in the path is deprecated and no longer valid) and that the "issuer" string is correct: it must be the full email address identifier of the service account, not just the service account name.
If you get Google::Apis::ClientError: forbidden: The caller does not have permission, verify the roles your service account has:
gcloud projects get-iam-policy <project-name> \
    --filter="bindings.members:<sa_name>" \
    --flatten="bindings[].members" --format='table(bindings.role)'
=> ROLE
roles/iam.serviceAccountTokenCreator
roles/storage.admin
serviceAccountTokenCreator is required to call the signBlob service, and you need storage.admin to have ownership of the object you need to sign. I think these are project-wide rights; unfortunately I couldn't get it to work with more fine-grained permissions (i.e. one app being admin for only a certain Storage bucket).
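For what it's worth, a sketch of granting both roles at the project level (the project and service-account names are placeholders):
gcloud projects add-iam-policy-binding MY_PROJECT \
    --member="serviceAccount:my-app@MY_PROJECT.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountTokenCreator"
gcloud projects add-iam-policy-binding MY_PROJECT \
    --member="serviceAccount:my-app@MY_PROJECT.iam.gserviceaccount.com" \
    --role="roles/storage.admin"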

Google Cloud Platform: Logging in to GCP from commandline

I was sure it would be simple, but I couldn't find any documentation or a resolution.
I'm trying to write a script using gcloud to perform some operations in my GCP instances.
Is there any way to log in/authenticate using gcloud via the command line only?
Thanks
You have a couple of options here (depending on what exactly you're trying to do).
The first option is to log in using the --no-launch-browser option. This still requires interaction from a human user, but doesn't require a browser on the machine you're using:
> gcloud auth login --no-launch-browser
Go to the following link in your browser:
https://accounts.google.com/o/oauth2/auth?redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&prompt=select_account&response_type=code&client_id=32555940559.apps.googleusercontent.com&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute&access_type=offline
Enter verification code: *********************************************
Saved Application Default Credentials.
You are now logged in as [user@example.com].
Your current project is [None]. You can change this setting by running:
$ gcloud config set project PROJECT_ID
The non-interactive option involves service accounts. The linked documentation explains them better than I can, but the short version of what you need to do is as follows:
Create a service account in the Google Developers Console. Make sure it has the appropriate "scopes" (these are permissions that determine what this service account can do). Download the corresponding JSON key file.
Run gcloud auth activate-service-account --key-file <path to key file>.
Note that Google Compute Engine VMs come with a slightly different service account; the difference is described here.