How to authenticate to GCP for command line uploads

I have created a service account and I can authenticate with
gcloud auth activate-service-account --key-file <json file>
and I have successfully executed the gsutil rsync command to upload files. The issue is that the gcloud auth command appears to terminate the batch file, and the following gsutil rsync commands do not execute. How should the batch file be set up to allow the authentication, followed by the rsync commands, and then a final auth revoke?
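A minimal sketch of a fix, assuming a Windows .bat/.cmd script: gcloud and gsutil are themselves .cmd wrappers in the Cloud SDK, so invoking one without call hands control to it and never returns to your script. Prefixing each invocation with call keeps the batch file running (the key-file path and bucket below are placeholders):
rem "call" returns control to this script after each SDK command finishes
call gcloud auth activate-service-account --key-file C:\keys\service-account.json
call gsutil rsync -r C:\data gs://my-bucket/data
rem revoke the service account's credentials when done
call gcloud auth revoke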

Related

cloud_sql_proxy uses unspecified credential file instead of gcloud auth login

My computer rebooted and, for reasons I don't know, cloud_sql_proxy tries to use a credential file instead of my gcloud auth login:
cloud_sql_proxy --instances=instance:region:orcadb=tcp:54321
2022/09/06 12:43:07 Rlimits for file descriptors set to {Current = 8500, Max = 9223372036854775807}
2022/09/06 12:43:07 using credential file for authentication; email=example@example.iam.gserviceaccount.com
2022/09/06 12:43:08 errors parsing config:
googleapi: Error 403: The client is not authorized to make this request., notAuthorized
I checked that my gcloud login is correct by running gcloud auth login and then verifying with gcloud config list account.
I also tried adding the flag --enable_iam_login to the command.
My permissions are already set to owner.
How can I use cloud_sql_proxy without the credential file? Thanks! :)
If no credential file is passed to cloud_sql_proxy, it first takes the file path from the GOOGLE_APPLICATION_CREDENTIALS environment variable.
Check whether this env var has a value:
echo %GOOGLE_APPLICATION_CREDENTIALS%
If it does, clear it:
set GOOGLE_APPLICATION_CREDENTIALS=
Now cloud_sql_proxy should use the current gcloud account.
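The asker's log (the Rlimits line) suggests a Linux or macOS machine; in a POSIX shell the equivalent check would be (a sketch mirroring the cmd.exe commands above):
echo "$GOOGLE_APPLICATION_CREDENTIALS"    # inspect the current value
unset GOOGLE_APPLICATION_CREDENTIALS      # clear it for this shell session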

Google storage service auth not working with cron job

I have this script.sh file to upload a file to Google Storage:
gcloud auth activate-service-account production-storage@123-testing.iam.gserviceaccount.com --key-file /opt/key.json
gsutil cp /tmp/sampeFile gs://mybuket/backup/1/
When I execute it from a terminal, it works fine and uploads the file to Google Storage.
But when I run the script via a cron job, * * * * * /opt/script.sh, the script gets executed but the file does not get uploaded to Google Storage.
No error is printed either.
What am I doing wrong? Any help is appreciated.
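A common cause, offered as a hedged guess rather than a confirmed answer: cron runs jobs with a minimal environment (often PATH=/usr/bin:/bin), so gcloud and gsutil may not be found, and the failure stays silent because cron's output goes nowhere by default. Calling the tools by absolute path and logging the job's output makes the error visible; the SDK install path below is an assumption to adjust:
# In script.sh, call the SDK binaries by absolute path:
/opt/google-cloud-sdk/bin/gcloud auth activate-service-account production-storage@123-testing.iam.gserviceaccount.com --key-file /opt/key.json
/opt/google-cloud-sdk/bin/gsutil cp /tmp/sampeFile gs://mybuket/backup/1/
# In the crontab, redirect stdout and stderr to a log file:
* * * * * /opt/script.sh >> /tmp/script.log 2>&1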

`gcloud docker` command refreshing access token on each invocation

macOS
Google Cloud SDK 183.0.0
beta 2017.09.15
bq 2.0.27
core 2017.12.08
gcloud
gsutil 4.28
kubectl
I have tried revoking the login and logging in again, but every gcloud docker command refreshes the access token, so it hangs for about 30 seconds before actually executing.
$ gcloud --verbosity=debug docker -- --help
DEBUG: Running [gcloud.docker] with arguments: [--verbosity: "debug"]
INFO: Refreshing access_token
$ gcloud --verbosity=debug docker -- --help
DEBUG: Running [gcloud.docker] with arguments: [--verbosity: "debug"]
INFO: Refreshing access_token
Any ideas what could be causing this?
Login Debug Log
$ gcloud --verbosity=debug auth login
DEBUG: Running [gcloud.auth.login] with arguments: [--verbosity: "debug"]
Your browser has been opened to visit:
https://accounts.google.com/o/oauth2/auth?redirect_uri=http%3A%2F%2Flocalhost%3A8085%2F&prompt=select_account&response_type=code&client_id=32555940559.apps.googleusercontent.com&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Faccounts.reauth&access_type=offline
INFO: Successfully retrieved access token
WARNING: `gcloud auth login` no longer writes application default credentials.
If you need to use ADC, see:
gcloud auth application-default --help
You are now logged in as [xxx].
Your current project is [xxx]. You can change this setting by running:
$ gcloud config set project PROJECT_ID
INFO: Display format "none".
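One workaround sketch, assuming the -a/--authorize-only flag that gcloud docker carried in SDK releases of this era: write the Docker credentials once, then invoke docker directly so individual commands skip the token refresh. The image name is a placeholder, and the credential is short-lived, so the authorize step must be re-run once it expires:
gcloud docker --authorize-only
docker push gcr.io/my-project/api:latest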

How can we run gcloud/gsutil/bq command for different accounts in parallel in one server?

I have installed the gcloud/bq/gsutil command-line tools on a Linux server.
And we have several accounts configured on this server.
gcloud config configurations list
NAME  IS_ACTIVE  ACCOUNT    PROJECT  DEFAULT_ZONE  DEFAULT_REGION
gaa   True       a@xxx.com  a
gab   False      b@xxx.com  b
Now I have a problem running gaa and gab on this server at the same time, because they have different access controls on BigQuery and Cloud Storage.
I will use the commands below (bq and gsutil):
Set up the account:
gcloud config set account a@xxx.com
Copy data from BigQuery to Cloud Storage:
bq extract --compression=GZIP --destination_format=NEWLINE_DELIMITED_JSON 'nl:82421.ga_sessions_20161219' gs://ga-data-export/82421/82421_ga_sessions_20161219_*.json.gz
Download data from Cloud Storage to the local system:
gsutil -m cp gs://ga-data-export/82421/82421_ga_sessions_20161219*gz
If only one account runs, it is not a problem.
But several accounts need to run on one server at the same time, and I have no idea how to deal with this case.
Per the gcloud documentation on configurations, you can switch your active configuration via the --configuration flag for any gcloud command. However, gsutil does not have such a flag; you must set the environment variable CLOUDSDK_ACTIVE_CONFIG_NAME:
$ # Shell 1
$ export CLOUDSDK_ACTIVE_CONFIG_NAME=gaa
$ gcloud # ...
$ # Shell 2
$ export CLOUDSDK_ACTIVE_CONFIG_NAME=gab
$ gsutil # ...
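The variable can also be set per command, which lets both accounts run in parallel from a single script. A sketch assuming a POSIX shell (the second bucket path is a placeholder, and whether bq honors the variable the same way is an assumption, though it shares the SDK's configuration):
# Each prefix scopes only that one invocation to the named configuration
CLOUDSDK_ACTIVE_CONFIG_NAME=gaa gsutil -m cp gs://ga-data-export/82421/82421_ga_sessions_20161219*gz . &
CLOUDSDK_ACTIVE_CONFIG_NAME=gab gsutil -m cp gs://other-export/file.gz . &
wait  # block until both background transfers finish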

Unable to push to 2 different container registries with 2 different service accounts

I'm automating our build process. Before the push is executed, I run the following script, which logs in the correct service account.
if [[ "${DEPLOY_ENV}" == "production" ]]; then
gcloud auth activate-service-account --key-file "$DIR/production-secret.json"
else
gcloud auth activate-service-account --key-file "$DIR/test-secret.json"
fi
However, no matter which account is logged in, I'm always pushing to our "test" account's registry when I execute this command:
gcloud docker -- push gcr.io/talk-like-humans/api:${IMAGE_VERSION}
Is there another command I need to run to set my push endpoint to be in the correct account?
Thanks,
Todd
The repository you push to is determined by the image name, not the credential used. Your command
gcloud docker -- push gcr.io/talk-like-humans/api:${IMAGE_VERSION}
means that you will always push an image called api:${IMAGE_VERSION} to the talk-like-humans repository (in the gcr.io registry). If both your test and production credentials have write access there, they will both succeed.
It sounds to me like you want something like this:
if [[ "${DEPLOY_ENV}" == "production" ]]; then
gcloud auth activate-service-account --key-file "$DIR/production-secret.json"
REPO="${PROD_REPO}"
else
gcloud auth activate-service-account --key-file "$DIR/test-secret.json"
REPO="${TEST_REPO}"
fi
`gcloud docker -- push "gcr.io/${REPO}/api:${IMAGE_VERSION}"`
In addition, I would revoke the test-secret.json service account's access to ${PROD_REPO}; this way your dev process can't accidentally push to production.
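A sketch of that revocation with gcloud's IAM commands (the project, service-account email, and role are placeholders; write access to gcr.io at this time was granted through Cloud Storage roles on the registry's backing bucket):
gcloud projects remove-iam-policy-binding prod-project \
    --member=serviceAccount:test-sa@test-project.iam.gserviceaccount.com \
    --role=roles/storage.admin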