gcloud file transfer does not show an error but files do not appear either - gcloud

I'm trying to transfer files to my virtual machine instance on GCP with the gcloud client, using gcloud compute scp --recurse ./local-repo foo:~/. It looks like the files transfer, but once I ssh into foo, I don't see anything. Any help would be much appreciated. Thank you!
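For reference, a fully explicit form of that transfer plus a quick check of the destination home directory might look like the sketch below; the VM name foo comes from the command above, while the zone is a placeholder:
# Copy the directory, being explicit about the zone (us-central1-a is an assumed placeholder):
gcloud compute scp --recurse ./local-repo foo:~/ --zone=us-central1-a
# Then list what actually landed in the home directory on the VM:
gcloud compute ssh foo --zone=us-central1-a --command='ls -la ~'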

Related

How to move files between two VMs using gcloud?

I have tried using scp provided by gcloud, however, that produces an error:
gcloud compute scp github-action-runner-0001:/tmp/images.tar github-action-runner-0002:/tmp/images.tar
ERROR: (gcloud.compute.scp) All sources must be local files when destination is remote. Got sources: [github-action-runner-0001:/tmp/images.tar], destination: github-action-runner-0002:/tmp/images.tar
I am evaluating if there are alternative ways of copying files from one VM to another using gcloud utilities.
Obviously, copying the files locally first would work; however, given the size of the files, this would not be reliable.
Imagine you have two Compute Engine virtual machines: VM-A and VM-B.
SSH login to VM-A.
Use the gcloud compute scp command to copy files to/from VM-B.
By logging into VM-A, it becomes the local system and VM-B becomes the remote system.
Note: Your question does not specify what github-action-runner-0001 is. I'm assuming that it is the name of a Compute Engine VM. If it is not, complete step #1, then copy the files from github-action-runner-0001 to VM-A, and then copy the files from VM-A to VM-B using step #2.
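As a concrete sketch of that flow, assuming both runners are Compute Engine VMs in zone us-central1-a (the zone is a placeholder):
# From your workstation, SSH into the first VM; it becomes the local system:
gcloud compute ssh github-action-runner-0001 --zone=us-central1-a
# On github-action-runner-0001, /tmp/images.tar is now a local file, so it can be
# copied to the second (remote) VM:
gcloud compute scp /tmp/images.tar github-action-runner-0002:/tmp/images.tar --zone=us-central1-a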

Automating gsutil commands

I'm trying to automate some gsutil commands, but I'm struggling to see where the authentication files are kept and how to re-use them (if that's what happens).
I've gone through the gcloud init process in bash...
curl https://sdk.cloud.google.com | bash
gcloud init
All works well when I run
'gsutil ls'
Now I'm trying to automate the process so that it would work on a new server when added to its crontab (rather than creating a new config each time).
I saw a mention of setting the env variable GOOGLE_APPLICATION_CREDENTIALS, so I copied my credentials from the web login to a file and tried it, e.g. trying as a different user to test:
export GOOGLE_APPLICATION_CREDENTIALS=/home/user/.gsutil/mycreds
and then ran gsutil ls, but it fails.
So I assume I've got the whole credentials thing a bit wrong. I'm assuming there is a file somewhere that was originally created by gcloud which I could use, but I can't see it anywhere?
I've looked at the answer here, but it doesn't seem up to date now, as per the last comment.
Edit: I have followed Zachary's steps: gcloud auth activate-service-account --key-file=myfilelocation
However, with 'gsutil ls' I now get:
You are attempting to perform an operation that requires a project id, with none configured. Please re-run gsutil config and make sure to follow the instructions for finding and entering your default project id.
So my next question would be: where is it looking for the project id? If I run gsutil config, it seems to create a new set of auth credentials, which then causes another error, so I have removed that.
You should be able to do this without diving too deep into the implementation of authentication for gsutil.
If you're using standalone gsutil (if you installed via this method), the instructions in the linked question are still valid (as Travis points out).
If you'd like to continue using the gsutil supplied via the Cloud SDK, you should use service accounts. Service accounts are the preferred method of authenticating on headless machines or in non-interactive contexts.
Your flow would look something like the following:
Create a service account via the Google Cloud Developers Console.
On the remote machine, install the Cloud SDK and gsutil. If you're not installing interactively, it's better to skip the curl ... | bash method. Instead, download this install archive, extract it, and run the install.sh script. This script has options (visible with --help); if you specify choices to all of these options, it won't prompt you.
Copy the service account key file to the remote machine. Run gcloud auth activate-service-account --key-file=/path/to/service-account.json.
Run gsutil. You should be appropriately authenticated.
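A minimal sketch of the last two steps, assuming the JSON key downloaded from the console lives at /opt/keys/backup-sa.json and the project is called my-project (both names are placeholders):
# Activate the service account credentials on the remote machine:
gcloud auth activate-service-account --key-file=/opt/keys/backup-sa.json
# Also set a default project so gsutil does not complain about a missing project id:
gcloud config set project my-project
# Verify that authentication works non-interactively:
gsutil ls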
You have to set a default project and user for gsutil. Run the following command:
gcloud init
Choose 1. It shows you different users; select the user and then select the project.
I was trying to create a bucket with the project id as its name:
$ gsutil mb -l eu gs://PROJECT-ID
Creating gs://root****/...
Error: You are attempting to perform an operation that requires a project id, with none configured. Please re-run gsutil config and make sure to follow the instructions for finding and entering your default project id.
Steps that resolved for me:
gcloud auth login
gcloud config set project <PROJECT-ID>
gsutil mb -l eu gs://<PROJECT-ID>
Creating gs://root***/...
The error went away and it now works as expected.

Fleetctl uses /root/.ssh instead of /home/core/.ssh on the remote machine

I can't manage to clone a private repo from a unit file. I get the Host key verification failed error message. Cloning it on the remote machine from the command line seems to work just fine.
After debugging, I saw that the fleet client on the remote is looking for keys in /root/.ssh, while on my remote machine they are in /home/core/.ssh.
Any idea how to fix this?
greetings A.
You can specify the user that a unit runs as with User=core. That should make it look in your home dir for the correct key. More details here: https://coreos.com/os/docs/latest/registry-authentication.html#the-.dockercfg-file
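A minimal unit-file sketch of that suggestion; the description, repo URL, and destination path are placeholders:
[Unit]
Description=Clone a private repo as the core user

[Service]
Type=oneshot
# Run as core so git looks for SSH keys in /home/core/.ssh instead of /root/.ssh:
User=core
ExecStart=/usr/bin/git clone git@github.com:example/private-repo.git /home/core/private-repo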

Using Google Cloud Storage with rsync

I am new to Google Cloud. We have historically used AWS for online backups -- essentially, our local servers ran rsync to an EC2 instance at AWS and it all worked fine. I'm now trying to migrate from AWS to Google, and of course the setup is pretty different. With gsutil rsync it looked to me as though I wouldn't need to spin up a Compute Engine instance at all; I could just push stuff straight into the gs://aws_mnt bucket.
Having installed the SDK on our AWS instance, I was able to push all our backups to the gs://aws_mnt bucket very easily using gsutil cp -n.
But going forward I want to run a cron job on the local server which uses rsync rather than cp for obvious reasons.
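For concreteness, such a cron job might look roughly like the entry below; the schedule, gsutil path, and log path are assumptions, while the source path and bucket match the command further down:
# Hypothetical crontab entry: sync the backup directory to the bucket nightly at 02:00
0 2 * * * /usr/local/bin/gsutil rsync /mnt/archive/backups gs://aws_mnt/kahless >> /var/log/gsutil-rsync.log 2>&1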
I have two issues:
Despite reading the appropriate documentation (here), I am so stupid I can't figure out how to permanently authorise the local server so that I don't have to do gcloud auth login and get a code from a browser each session; for a cron job that's not really going to work.
When I try to use gsutil rsync from the local server to the gs://aws_mnt bucket that was pre-populated from AWS, I get an error:
gsutil rsync /mnt/archive/backups gs://aws_mnt/kahless
Building synchronization state...
Skipping cloud sub-directory placeholder object gs://aws_mnt/kahless/
Starting synchronization
There is some discussion of this error on GitHub, and I've produced detailed output from
gsutil -D -m rsync /mnt/archive/backups gs://aws_mnt/kahless
But since this is a brand-new install of the SDK, I can't imagine that thread hasn't already been dealt with, so I must be doing something wrong?
Rus
In response to your questions:
Once you have configured credentials using gcloud auth, the 'gcloud auth login' command will cause them to be selected until you log in with a different credential... and that state will persist and not require you to go through the browser session again unless/until you revoke those credentials. Note: If you're thinking of running commands from an unattended script (e.g., via cron), please consider using service account credentials (see the sketch at the end of this answer). For more details please see https://developers.google.com/cloud/sdk/gcloud/#gcloud.auth
That "skipping..." message is not an error - it's just informing you that gsutil is skipping trying to download the placeholder object, because such objects aren't needed in (and would interfere with) directories in the local file system. I'll update the message in the next version of gsutil to make this more clear. So, what you saw was that the second run of gsutil rsync found nothing to do after comparing the source and destination, and completed normally.
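A minimal sketch of that cron-friendly setup, assuming the service-account JSON key downloaded from the console lives at /etc/backup/sa-key.json (the path is a placeholder); the rsync source and bucket are the ones from the question:
# One-time, on the local server: activate service account credentials so no browser login is needed.
gcloud auth activate-service-account --key-file=/etc/backup/sa-key.json
# From then on, an unattended cron job can run the sync directly:
gsutil rsync /mnt/archive/backups gs://aws_mnt/kahless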

gsutil authentication code failure

I need to download files from GCS to my local machine. I tried gsutil config and the download on my machine. On my Windows 8 64-bit machine, all worked fine. However, when I try the same setup on another dedicated machine, which runs Windows Vista 32-bit, entering the authentication code just shows Failure.
python gsutil.py config -b pops up the browser with the request URL; I got the authentication code in the browser, but pasting it gives Failure.
I am new to Python and could not trace the problem. Does gsutil have any limitation on the number of machines we can configure for the same login/project?
Appreciate any help in debugging this.
Also, is it possible to move GCS data into Google Cloud SQL? I am assuming that running a script on Google App Engine which reads from GCS, parses the data, and inserts it into Google Cloud SQL is possible. Are there any documentation/tools for this?
Thanks
Dhurka