Accessing Google Cloud Source Repositories from Google Cloud Storage

Is it possible to access or copy (transfer) a Google Cloud Source git repository to Google Cloud Storage?
The idea is to use the git repo as a website, like GitHub Pages.

You can do this as follows:
1. Clone the Google Cloud Source repo.
2. Use gsutil cp -r dir1/dir2 gs://my_bucket/subdir to copy the data to Google Cloud Storage, possibly after processing (e.g., if you want to use something like Jekyll or Middleman to generate your website). Note that this will also copy your .git directory, which you probably want to exclude.
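The two steps above can be sketched as shell commands. This is a hypothetical example: the repo name "my-site", project "my-project", and bucket "my_bucket" are placeholders, and it uses gsutil rsync with an exclusion pattern rather than plain cp so the .git directory is skipped.

```shell
# Clone the Cloud Source repo (names are placeholders)
gcloud source repos clone my-site --project=my-project

# (Optional) run your static site generator here, e.g. Jekyll or Middleman

# Copy to Cloud Storage; -x excludes paths matching the regex,
# so the .git directory is not uploaded
gsutil rsync -r -x "\.git/.*" my-site gs://my_bucket/subdir
```

Using rsync instead of cp also means subsequent runs only upload changed files.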

Related

How do I inject a file into a Netlify deploy from Github to avoid using git-lfs?

I have a Netlify site that deploys from GitHub. The problem is that it contains a WebGL Unity game which has a pretty big .data file. The file is bigger than 100MB, so it would need to be stored in GitHub's LFS. I don't want to use GitHub LFS, so is there a way to store this file at an external link like Google Drive or Dropbox or something like that and inject it into the deploy at build time?
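One common workaround is to download the large file during the Netlify build instead of committing it. A minimal sketch, where the function name, URL, and output path are all hypothetical (Netlify's build command just runs whatever shell you give it):

```shell
#!/usr/bin/env bash
set -euo pipefail

# fetch_asset: download one large file at build time instead of
# storing it in Git LFS. Both arguments are supplied by the caller.
fetch_asset() {
  local url="$1" dest="$2"
  mkdir -p "$(dirname "$dest")"
  curl -L --fail -sS -o "$dest" "$url"
}

# In Netlify, you could set the build command to something like:
#   bash fetch-assets.sh && npm run build
# where fetch-assets.sh calls, e.g. (placeholder URL/path):
#   fetch_asset "https://example.com/game.data" "public/Build/game.data"
```

Note that Drive/Dropbox share links are landing pages, not direct downloads; you would need a direct-download URL (for Dropbox, the `?dl=1` variant) for curl to fetch the raw file.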

How to do data protection for Azure Repos, restrict user in downloading as Zip file?

I am trying to find a way to restrict users from downloading an Azure Repos repo as a zip or cloning it, or, failing that, to encrypt the source code. I am unable to find any docs related to this.
Based on the following document, if a user can view your repo, they also have permission to clone and download it:
https://learn.microsoft.com/en-us/azure/devops/organizations/security/set-git-tfvc-repository-permissions?view=azure-devops#git
If you don't want users to clone or download the repo, you would need to deny them access to the repo entirely (i.e., deny the Read permission).
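Besides the web UI (Project Settings > Repositories > Security, set Read to Deny), this can in principle be scripted with the Azure DevOps CLI extension. The sketch below is hypothetical: the namespace GUID, permission bits, and token format must be looked up first, and flag names may vary by CLI version.

```shell
# 1. Find the "Git Repositories" security namespace and its action bits
az devops security permission namespace list --output table
az devops security permission namespace show --namespace-id <git-namespace-guid>

# 2. Deny Read on one repo for a user or group.
#    <...> values are placeholders; the repo security token is assumed
#    to have the form repoV2/<projectId>/<repoId>.
az devops security permission update \
    --id <git-namespace-guid> \
    --subject user@example.com \
    --token "repoV2/<projectId>/<repoId>" \
    --deny-bit <read-bit>
```

Verify the exact bit values from the namespace show output before applying a deny, since denies override allows in Azure DevOps.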

Automate mirroring GitHub to GCP Source Repository?

We run Google Cloud Functions (Python), which must be deployed from a Google Cloud Source Repository. Since all the code is stored on GitHub, we resort to first mirroring GitHub into Source Repository. Although this only requires a few mouse clicks, it becomes a burden to repeat over 3+ projects (dev, staging, production) times 5+ repos (5+ apps).
I am looking to automate the mirroring config, preferably to add into the Terraform automation we already use, into a hands-off project configuration. Does the Google API support this mirroring automation? So far on my Google Cloud expedition everything was available in their API!
I can't find Terraform examples, though, and would appreciate a tip.
Come to think of it, if I can take Source Repository out of the equation, that would be just fine with me too. After all, I only use it as a pass-through / empty shell.
The Cloud Source Repositories API includes a Repo resource with a MirrorConfig object where you can specify your GitHub repo's URL, webhook, and credentials to automate this procedure. I would initially test it with the create method; if you have an existing Cloud Source repository, the patch method is also worth exploring.
Additionally, there is an open feature request to connect a repository via the Cloud Build GitHub App, which I recommend you star and follow, as it could further ease your automation needs.
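For reference, a raw call against the create method the answer mentions might look like the sketch below. The project, repo, and GitHub URL are placeholders, and note that in practice the mirrorConfig fields may be managed by the console's GitHub-connection flow rather than settable directly, so check the current API reference before relying on this.

```shell
# Create a repo with a mirrorConfig via the Cloud Source Repositories API
# (all names are placeholders; requires an authenticated gcloud session)
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://sourcerepo.googleapis.com/v1/projects/my-project/repos" \
  -d '{
    "name": "projects/my-project/repos/my-mirror",
    "mirrorConfig": { "url": "https://github.com/my-org/my-repo.git" }
  }'
```

If the API accepts this shape, the same JSON body could be driven from Terraform via a generic REST provider even though the google provider's sourcerepo resource does not expose mirroring.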

GitHub Google Cloud Build - Multiple Repositories

I'm interested in trying the Google Cloud Build continuous integration application on GitHub.
My application currently has 2 repositories I would like to deploy in a single Docker image. One of them is a NodeJS API server; the other is a browser-based (no server-side rendering) ReactJS application.
The idea would be to have the NodeJS repo serve requests under /api/..., and for any other URIs it would serve up the React app.
My question: is it possible to have Google Cloud Build grab another repo as well, as long as it's on GitHub? Ideally, a commit to either repo (in the right branch) would trigger the same underlying build. Just curious if this is possible.
One approach would be for Google Cloud Build to grab a third repository: a "parent" repo referencing the right SHA1/branch of your two other repositories as submodules.
You can see an example of such a build in "Static Website with Hugo, Cloudflare and Automated Builds using Google Cloud".
That would allow you to still work with "one" repository, even though it would check out the two others into their own subfolders.
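Setting up such a parent repo is a few git commands. A sketch, where the repo URLs, folder names, and branch are placeholders:

```shell
# Create the parent repo and register the two app repos as submodules
git init parent-repo && cd parent-repo
git submodule add https://github.com/my-org/node-api.git api
git submodule add https://github.com/my-org/react-app.git web
git commit -m "Add api and web as submodules"

# Later, to pin a submodule to a newer commit:
cd api && git fetch && git checkout origin/main && cd ..
git add api && git commit -m "Bump api submodule"
```

Each commit in the parent repo pins an exact SHA1 of both submodules, which is what makes the combined build reproducible; the build config just needs to initialize submodules before building the Docker image.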

How to keep storage bucket synced with Google Cloud Source Repository

Question:
Does Google automatically update storage buckets with changes pushed to a project's Cloud Source Repository?
Example:
I create a Google Cloud Platform project called Cooking and store the file recipe.txt in a bucket.
I modify recipe.txt and push the changes from my local master branch to the Cloud Source Repository for Cooking.
When I look at the Source Code panel for my project, I see recipe.txt is up-to-date with my latest changes.
When I look at the storage bucket for my project, I see recipe.txt is not up-to-date (i.e. not in sync with the project's Cloud Source Repository).
No. Google Cloud Source Repositories can be configured to stay in sync with other git repository services, such as GitHub or Bitbucket, but there is no relationship between Google Cloud Source Repository repositories and GCS buckets.
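If you want that behavior, one way to approximate it is a Cloud Build trigger on the repo that syncs the checked-out source to the bucket on every push. A minimal hypothetical cloudbuild.yaml (the bucket name is a placeholder):

```yaml
steps:
  # Mirror the checked-out source into the bucket, excluding any .git paths
  - name: gcr.io/cloud-builders/gsutil
    args: ["rsync", "-r", "-x", "\\.git/.*", ".", "gs://my-cooking-bucket"]
```

With a trigger on the master branch, the bucket would then follow the repo within a build's latency of each push.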