Google storage service auth not working with cron job - google-cloud-storage

I have this script.sh file to upload a file to Google Cloud Storage:
gcloud auth activate-service-account production-storage@123-testing.iam.gserviceaccount.com --key-file /opt/key.json
gsutil cp /tmp/sampeFile gs://mybuket/backup/1/
When I execute it from a terminal, it works fine and uploads the file to Google Cloud Storage.
But when I run the script from a cron job, * * * * * /opt/script.sh, the script gets executed but the file does not get uploaded to Google Cloud Storage.
No error is printed either.
What am I doing wrong?
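Cron runs jobs with a minimal environment, so a useful first step is to log the script's output and call the SDK tools by absolute path. A minimal sketch, assuming the Cloud SDK is installed under /opt/google-cloud-sdk (adjust to your actual install location):

#!/bin/bash
# script.sh - same two steps as above, but with explicit tool paths so
# cron's limited PATH doesn't matter (/opt/google-cloud-sdk is an assumed location)
GCLOUD=/opt/google-cloud-sdk/bin/gcloud
GSUTIL=/opt/google-cloud-sdk/bin/gsutil
$GCLOUD auth activate-service-account production-storage@123-testing.iam.gserviceaccount.com --key-file /opt/key.json
$GSUTIL cp /tmp/sampeFile gs://mybuket/backup/1/

And in the crontab, redirect output so any error actually shows up somewhere:

* * * * * /opt/script.sh >> /tmp/script.log 2>&1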

Related

How to authenticate to GCP for command line uploads

I have created a service account and I can authenticate with a
gcloud auth activate-service-account --key-file <json file>
and I have successfully executed the gsutil rsync command to upload files. The issue is that the gcloud auth command appears to terminate the batch file, and the following gsutil rsync commands do not execute. How should the batch file be set up to allow the authentication, followed by the rsync commands, and then a final auth revoke?
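If this is a Windows batch file, the likely culprit is that gcloud and gsutil are themselves .cmd wrapper scripts, so invoking them without call hands control over and never returns; prefixing each invocation with call keeps the batch file running. The intended sequence itself, written here as a shell sketch with placeholder key path, source directory, and bucket:

#!/bin/bash
# activate the service account, sync the files, then revoke the credentials
gcloud auth activate-service-account --key-file /path/to/key.json
gsutil -m rsync -r /data/to/upload gs://my-backup-bucket/uploads
gcloud auth revoke --all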

How do I elevate my gcloud scp and ssh commands?

I want to be able to fire commands at my instance with gcloud because it handles auth for me. This works well but how do I run them with sudo/root access?
For example I can copy files to my accounts folder:
gcloud compute scp --recurse myinst:/home/me/zzz /test --zone us-east1-b
But I can't copy to /tmp:
gcloud compute scp --recurse myinst:/tmp /test --zone us-east1-b
pscp: unable to open directory /tmp/.pki: permission denied
19.32.38.265147.log | 0 kB | 0.4 kB/s | ETA: 00:00:00 | 100%
pscp: unable to open /tmp/ks-script-uqygub: permission denied
What is the right way to run "gcloud compute scp" with sudo? Just to be clear, I of course can ssh into the instance and run sudo interactively.
Edit: for now I'm just editing the permissions on the remote host.
Just so I'm understanding correctly, are you trying to copy FROM the remote /tmp folder, or TO it? This question sounds like you're trying to copy to it, but the code says you're trying to copy from it.
This has worked for me in the past for copying from my local drive to a remote drive, though I have some concern over running sudo remotely:
gcloud compute scp myfile.txt [gce_user]@myinst:~/myfile.txt --project=[project_name];
gcloud compute ssh [gce_user]@myinst --command 'sudo cp ~/myfile.txt /tmp/' --project=[project_name];
You would reverse the process (and obviously rewrite the direction and sequence of the commands) if you needed to remotely access the contents of /tmp and then copy them down to your local drive.
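For the reverse direction, a sketch of that two-step process might look like the following, where "somefile" is a placeholder for whatever sits in /tmp on the instance:

gcloud compute ssh [gce_user]@myinst --command 'sudo cp /tmp/somefile ~/somefile && sudo chown $(whoami) ~/somefile' --project=[project_name];
gcloud compute scp [gce_user]@myinst:~/somefile ./ --project=[project_name];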
Hope this helps!

How can we run gcloud/gsutil/bq commands for different accounts in parallel on one server?

I have installed the gcloud/bq/gsutil command line tools on one Linux server.
And we have several accounts configured on this server.
gcloud config configurations list
NAME  IS_ACTIVE  ACCOUNT    PROJECT  DEFAULT_ZONE  DEFAULT_REGION
gaa   True       a@xxx.com  a
gab   False      b@xxx.com  b
Now I have a problem running both gaa and gab on this server at the same time, because they have different access controls on BigQuery and Cloud Storage.
I will use the commands below (bq and gsutil):
Set up the account:
gcloud config set account a@xxx.com
Copy data from BigQuery to Cloud Storage:
bq extract --compression=GZIP --destination_format=NEWLINE_DELIMITED_JSON 'nl:82421.ga_sessions_20161219' gs://ga-data-export/82421/82421_ga_sessions_20161219_*.json.gz
Download data from Cloud Storage to the local system:
gsutil -m cp gs://ga-data-export/82421/82421_ga_sessions_20161219*gz .
If I only run one account, it is not a problem.
But several accounts need to run on one server at the same time, and I have no idea how to deal with this case.
Per the gcloud documentation on configurations, you can switch your active configuration via the --configuration flag for any gcloud command. However, gsutil does not have such a flag; you must set the environment variable CLOUDSDK_ACTIVE_CONFIG_NAME:
$ # Shell 1
$ export CLOUDSDK_ACTIVE_CONFIG_NAME=gaa
$ gcloud # ...
$ # Shell 2
$ export CLOUDSDK_ACTIVE_CONFIG_NAME=gab
$ gsutil # ...
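Applied to the commands in the question, the two accounts can then run side by side by pinning the configuration per process rather than per login shell. A rough sketch, assuming the bq that ships with the Cloud SDK honors CLOUDSDK_ACTIVE_CONFIG_NAME the same way gcloud and gsutil do, and with /data/gaa/ as a placeholder download directory:

#!/bin/bash
# each subshell pins its own configuration, so the two pipelines run concurrently
# (assumes bq installed via the Cloud SDK picks up CLOUDSDK_ACTIVE_CONFIG_NAME)
(
  export CLOUDSDK_ACTIVE_CONFIG_NAME=gaa
  bq extract --compression=GZIP --destination_format=NEWLINE_DELIMITED_JSON 'nl:82421.ga_sessions_20161219' 'gs://ga-data-export/82421/82421_ga_sessions_20161219_*.json.gz'
  gsutil -m cp 'gs://ga-data-export/82421/82421_ga_sessions_20161219*gz' /data/gaa/
) &
(
  export CLOUDSDK_ACTIVE_CONFIG_NAME=gab
  # ... account b's bq extract and gsutil cp go here ...
) &
wait  # block until both pipelines have finished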

How to save a file from an https link to Google Cloud Storage

I would like to save a large file (approximately 50 GB) directly to Google Cloud Storage. I tried gsutil cp https://archive.org/download/archiveteam-twitter-stream-2015-08/archiveteam-twitter-stream-2015-08.tar gs://my/folder, but that didn't work (InvalidUrlError: Unrecognized scheme "https").
Is there a way of doing that, without having to first download the file to my local storage?
Thanks!
You can use curl to fetch the URL and pipe it to gsutil. For example:
curl -L https://archive.org/download/archiveteam-twitter-stream-2015-08/archiveteam-twitter-stream-2015-08.tar | gsutil cp - gs://your/folder/archiveteam-twitter-stream-2015-08.tar
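Since the file is around 50 GB, it may be worth running that pipeline detached so a dropped terminal doesn't interrupt it; one possible wrapper (the log path is arbitrary):

nohup bash -c 'curl -L https://archive.org/download/archiveteam-twitter-stream-2015-08/archiveteam-twitter-stream-2015-08.tar | gsutil cp - gs://your/folder/archiveteam-twitter-stream-2015-08.tar' > /tmp/transfer.log 2>&1 &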

Is it possible to automate Gsutil login?

Is it possible to automate gsutil-based file uploads to Google Cloud Storage so that user intervention is not required for login?
My use case is a Jenkins job which polls an SCM location for changes to a set of files. If it detects any changes, it will upload all the files to a specific Google Cloud Storage bucket.
After you configure your credentials once, gsutil requires no further intervention. I suspect that you ran gsutil config as user X but Jenkins runs as user Y. As a result, ~jenkins/.boto does not exist. If you place the .boto file in the right location, you should be all set.
Another alternative is to use multiple .boto files and then tell gsutil which one to use with the BOTO_CONFIG environment variable:
gsutil config # complete oauth flow
cp ~/.boto /path/to/existing.boto
# detect that we need to upload
BOTO_CONFIG=/path/to/existing.boto gsutil -m cp files gs://bucket
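Applied to the Jenkins use case above, the job's shell step might look something like this sketch (the .boto path, source directory, and bucket name are placeholders):

# Jenkins "Execute shell" build step
export BOTO_CONFIG=/var/lib/jenkins/gsutil/prod.boto
gsutil -m cp -r ./changed-files gs://my-release-bucket/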
I frequently use this pattern to use gsutil with multiple accounts:
gsutil config # complete oauth flow for user A
mv ~/.boto user-a.boto
gsutil config # complete oauth flow for user B
mv ~/.boto user-b.boto
BOTO_CONFIG=user-a.boto gsutil cp a-file gs://a-bucket
BOTO_CONFIG=user-b.boto gsutil cp b-file gs://b-bucket