How to make my GitHub master branch the only branch authorised to access Production resources on GCP?

I use GitHub for version control and Google Cloud Platform to orchestrate my resources and store data. I use Kubernetes (on Google Cloud Platform) and Jenkins to execute my scripts, and BigQuery, Google Cloud Storage (buckets), and Cloud SQL for MySQL to store my data.
I have a master branch which holds production code and a development branch which holds development code. Is there a way I can restrict write access to Production resources on Google Cloud Platform to code from the master branch only?

Related

Is app.yaml on Google Cloud Platform publicly visible?

I want to deploy a web API on Google Cloud, and for test purposes I would just put the API key in the app.yaml file as an environment variable. Is this a security issue?
It's generally problematic to persist secrets in files. Even if app.yaml were inaccessible from the runtime service, you'd still face the risk of it being exposed in build logs, or of inadvertently committing app.yaml to e.g. GitHub.
For "testing", you can run generally run an App Engine locally. This isn't a perfect replica of the production service but it should be sufficient for testing.
A better solution for managing secrets is a dedicated service, e.g. Google's Secret Manager; both SDKs (encouraged) and the underlying REST API (discouraged) are available.
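Rather than an env_variables entry in app.yaml, the app can fetch the key at runtime. A minimal sketch using the Secret Manager Python SDK (the project and secret IDs below are placeholders; the secret must already exist, and the app's service account needs the roles/secretmanager.secretAccessor role):

```python
from google.cloud import secretmanager

def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Fetch a secret's payload from Secret Manager at runtime."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

# Placeholder IDs for illustration.
api_key = get_secret("my-project", "my-api-key")
```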

Deploy from Github to multiple clouds?

Greetings from Brazil!
I have an app on GitHub which I am deploying to a cloud service. I want to deploy this same app to other services such as Heroku, AWS and/or IBM Cloud, using GitHub diff changes (i.e. when I update the repo it automatically updates the cloud app, like magic). Currently GitHub diff changes work fine with Streamlit share and Heroku, but I have separate repos.
My question is: can I deploy an app to multiple services from just one repository on GitHub?
Irrelevant to the question: currently the app is Python 3 and I share it on Streamlit share and Heroku, using separate repos. My question, however, is app-agnostic.
You can use GitHub Actions to define your deployment workflows.
You can deploy to various cloud providers using the actions available for each:
Amazon ECS
Azure
Heroku
Your project can define a workflow for each cloud provider and, within each workflow, decide when the deployment occurs (automatically on every push, only on selected branches, or manually, at the push of a button).
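As an illustration of that layout, here is a minimal sketch of one such workflow file; a sibling file (e.g. deploy-aws.yml) would target another provider. The file name, branch and deploy step are all hypothetical, and you would substitute the provider-specific action or CLI call you actually use:

```yaml
# .github/workflows/deploy-heroku.yml
name: Deploy to Heroku

on:
  push:
    branches: [main]    # deploy automatically, but only from main
  workflow_dispatch:    # or trigger manually from the Actions tab

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder: swap in the Heroku action or CLI call you use.
      - name: Deploy
        run: echo "deploy to Heroku here"
```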

Automate mirroring GitHub to GCP Source Repository?

We run Google Cloud Functions (Python), which must be deployed from a Google Cloud Source Repository. Since all the code is stored on GitHub, we resort to first mirroring GitHub into Source Repository. Although this only requires a few mouse clicks, it becomes a burden to repeat over 3+ projects (dev, staging, production) times 5+ repos (5+ apps).
I am looking to automate the mirroring config, preferably by adding it to the Terraform automation we already use, for a hands-off project configuration. Does the Google API support automating this mirroring? So far on my Google Cloud expedition, everything has been available in their API!
I fail to find Terraform examples, though, and would appreciate a tip.
Come to think of it, if I can take Source Repository out of the equation, that would be just fine with me too. After all, I only use it as a pass-through / empty shell.
The Cloud Source Repositories API includes a Repo resource that has a MirrorConfig object where you could supply your GitHub URL, webhook and credentials to automate this procedure. I would initially test it with the create method, but if you have an existing Cloud Source Repository, I believe the patch method will also be worth exploring.
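A minimal sketch of that create call with the Google API Python client; the project, repo and GitHub URL are placeholders, and I haven't verified that mirrorConfig is writable through the API (the console wizard may be the only supported path), so treat this strictly as a starting point:

```python
from googleapiclient import discovery

# Cloud Source Repositories API v1; uses Application Default Credentials.
service = discovery.build("sourcerepo", "v1")

# Placeholder names; mirrorConfig fields follow the Repo resource schema.
body = {
    "name": "projects/my-project/repos/my-mirror",
    "mirrorConfig": {"url": "https://github.com/my-org/my-repo.git"},
}

response = service.projects().repos().create(
    parent="projects/my-project", body=body
).execute()
print(response)
```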
Additionally, there is an open Feature Request to connect a repository via the Cloud Build GitHub App, which I recommend you star and follow, as it could further ease your automation needs.

How to integrate OnPrem Azure DevOps Server with the cloud one?

My firm has the Azure DevOps online version, where we have all our projects and repos. We were not able to configure CI/CD for the repos because our internal server network doesn't have access to the internet.
To overcome this issue, we built a new server that has access to the internet and also to the internal network. On the new server, we installed and configured Azure DevOps Server 2019. We don't want to migrate our repos from the cloud version to the on-premise version.
I am trying to link the OnPrem repo to the cloud repo, but it is not working. I issued a PAT on the cloud version and added it as a service connection under Pipelines in the OnPrem version, but I am still not able to see and link the cloud repos.
I can clone the repo from the cloud to the OnPrem server, but that will not get the latest code, as the code is being checked in to the cloud repos.
Can anyone guide me on how to link the two, please?
Thanks!!!
I don't think there's a meaningful way to integrate Azure DevOps Services and Azure DevOps Server, as they are essentially the same product. I assume (but don't know) that you're looking to connect Azure DevOps Services to on-premise builds and deployments, as you state that you want to keep the repos in Azure DevOps Services. So, in essence, you want to run build and deployment group agents in an on-premise environment.
Take a look at the agent documentation, especially the communication subsection:
https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops
Or this old blog post, from which the communication section originates:
https://devblogs.microsoft.com/devops/deploying-to-on-premises-environments-with-visual-studio-team-services-or-team-foundation-server/
The ideal solution would probably be to run self-hosted build agents on the server that's open to the internet and configure an agent pool for them in Azure DevOps Services. For deployments, you'll want to use Deployment Groups and install deployment group agents on the target servers, where they'll just need outbound 443 access for communicating with Azure DevOps Services.
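Once such a pool exists, a pipeline opts into it with a single pool entry in its YAML; a minimal sketch (the pool name is hypothetical):

```yaml
# azure-pipelines.yml - run this pipeline on the self-hosted agents
pool:
  name: OnPremPool    # hypothetical self-hosted agent pool

steps:
  - script: echo "Building inside the internal network"
```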
If that's not possible, you'd have to install deployment agents on the build machine, which can then see your other on-premise servers, but this is a rather unsatisfactory solution, since you'd either have to rely on WinRM capabilities for deployments or expose too much network between your build server and the other on-premise servers.

How to keep storage bucket synced with Google Cloud Source Repository

Question:
Does Google automatically update storage buckets with changes pushed to a project's Cloud Source Repository?
Example:
I create a Google Cloud Platform project called Cooking and store the file recipe.txt in a bucket.
I modify recipe.txt and push the changes from my local master branch to the Cloud Source Repository for Cooking.
When I look at the Source Code panel for my project, I see recipe.txt is up-to-date with my latest changes.
When I look at the storage bucket for my project, I see recipe.txt is not up-to-date (i.e. not in sync with the project's Cloud Source Repository).
No. Google Cloud Source Repositories can be configured to stay in sync with other git repository services, such as GitHub or Bitbucket, but there is no relationship between Google Cloud Source Repository repositories and GCS buckets.
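If you do need a bucket to track files in the repo, you have to copy them yourself, for example as a step in a build or deploy pipeline. A minimal sketch with the google-cloud-storage Python client (the bucket name and file paths are placeholders):

```python
from google.cloud import storage

def upload_to_bucket(bucket_name: str, source_path: str, dest_name: str) -> None:
    """Upload a local file to a GCS bucket, overwriting any existing object."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    bucket.blob(dest_name).upload_from_filename(source_path)

# Placeholder names for illustration.
upload_to_bucket("cooking-bucket", "recipe.txt", "recipe.txt")
```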