Automate mirroring GitHub to GCP Source Repository? - github

We run Google Cloud Functions (Python), which must be deployed from a Google Cloud Source Repository. Since all the code is stored on GitHub, we resort to first mirroring GitHub into Source Repository. Although this only requires a few mouse clicks, it becomes a burden to repeat over 3+ projects (dev, staging, production) times 5+ repos (5+ apps).
I am looking to automate the mirroring configuration, preferably by adding it to the Terraform automation we already use, so that project setup is hands-off. Does the Google API support automating this mirroring? So far on my Google Cloud expedition, everything has been available in their API!
I have failed to find Terraform examples, though, and would appreciate a tip.
Come to think of it, if I can take Source Repository out of the equation, that would be just fine with me too. After all, I only use it as a pass-through / empty shell.

The Cloud Source Repositories API includes a Repo resource with a MirrorConfig object where you could supply your GitHub URL, webhook, and credentials to automate this procedure. I would initially test it with the create method, but if you already have a Cloud Source Repository, I believe the patch method is also worth exploring.
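If you want to script this outside of Terraform, a minimal sketch against the sourcerepo v1 API (using the google-api-python-client discovery client) could look like the following. The project ID, repository name, and mirror settings are placeholders, and whether the API will accept a mirrorConfig in a create or patch body is worth verifying against the current Repo resource documentation; treat this as a starting point, not a confirmed recipe.

from googleapiclient.discovery import build

# Uses Application Default Credentials; needs permission to administer
# repositories in the target project.
service = build("sourcerepo", "v1")

project = "projects/my-gcp-project"   # placeholder project ID
repo_body = {
    "name": f"{project}/repos/github_my-org_my-app",   # placeholder repo name
    # Mirror settings as suggested above; field names follow the MirrorConfig
    # schema (url, webhookId, deployKeyId). Verify the API accepts them here.
    "mirrorConfig": {"url": "https://github.com/my-org/my-app.git"},
}

# Create a brand-new repository.
created = service.projects().repos().create(parent=project, body=repo_body).execute()
print(created)

# For an existing repository, patch takes an UpdateRepoRequest with the
# fields to change plus an update mask.
patched = service.projects().repos().patch(
    name=repo_body["name"],
    body={"repo": {"mirrorConfig": repo_body["mirrorConfig"]},
          "updateMask": "mirrorConfig"},
).execute()
print(patched)

Running this requires the google-api-python-client package and credentials with permission to administer repositories in the target project.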
Additionally, there is an open feature request to connect a repository via the Cloud Build GitHub App, which I recommend you star and follow, as it could further ease your automation needs.

Related

Modifying pre-receive hooks on Bitbucket Cloud

We have Bitbucket Cloud, not Bitbucket Server. Is there a way to modify the "pre-receive" hooks on Bitbucket? The goal is to audit pushes to make sure there are no obvious vulnerabilities before the code is available on Bitbucket. Git hooks might work, but there's not really a way to get them into version control in the same repo; the only way I can think of doing that would be to SSH into a Bitbucket server and modify the remote repo, but I don't think you can do that?
My only guess is there might be a way to keep the pre-receive hooks in source control by putting the hook somewhere like this in the repo:
.bitbucket/pre-receive
But it's hard to find any info on this online.
Unfortunately, this isn't possible.
The GitHub documentation is talking about GitHub Enterprise Server, a product you would install on your own infrastructure. GitHub as in github.com does not support creating pre-receive hooks at all. This is pretty much the norm amongst the popular cloud git hosting providers: no cloud provider will let you write your own arbitrary code and run it on the same infrastructure that holds your git repo; there is too much danger of you breaking out into other data on the same physical storage.
Until someone develops a safe/sandboxed implementation of server-side hooks, you'll need to find another way.
Full disclosure: I work for Atlassian (though I don't work on Bitbucket Cloud)

GitHub Google Cloud Build - Multiple Repositories

I'm interested in trying the Google Cloud Build continuous integration application on GitHub.
My application currently has two repositories I would like to deploy in a single Docker image. One of them is a NodeJS API server, the other a browser-based (no server-side rendering) ReactJS application.
The idea would be to have the NodeJS server handle requests under /api/... and serve up the React app for any other URI.
My question: is it possible to have Google Cloud Build grab another repo as well, as long as it's on GitHub? Ideally, a commit to either repo (in the right branch) would trigger the same underlying build. Just curious if this is possible.
One approach would be for Google Cloud Build to grab a third repository: a "parent" repo referencing the right SHA1/branch of your two other repositories as submodules.
You can see an example of such a build in "Static Website with Hugo, Cloudflare and Automated Builds using Google Cloud".
That would allow you to still work with "one" repository, even though it would check out the two others into their own subfolders.
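As a rough illustration, here is a small sketch of assembling such a parent repository. The repository URLs and folder names are made up; git does all the real work, and Python merely sequences the commands.

import subprocess

def git(*args, cwd=None):
    """Run a git command and raise if it fails."""
    subprocess.run(["git", *args], cwd=cwd, check=True)

# Hypothetical repository URLs -- substitute your own.
API_REPO = "git@github.com:your-org/api-server.git"   # NodeJS API server
WEB_REPO = "git@github.com:your-org/react-app.git"    # ReactJS application

git("init", "parent-repo")
git("submodule", "add", API_REPO, "api", cwd="parent-repo")
git("submodule", "add", WEB_REPO, "web", cwd="parent-repo")
git("commit", "-m", "Pin api and web submodules", cwd="parent-repo")
# Later, update the submodules to new SHA1s and commit in the parent repo;
# a Cloud Build trigger on the parent repo then rebuilds the combined image.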

Google Cloud Build Trigger for GitHub Repo Always Says "No tag matches"

Build Trigger Setup
I set up a build trigger for a GitHub repo:
Trigger type: Tag
Tag (regex): .*
Build configuration: Cloud Build configuration file
Substitution variables:
_DEPLOYMENT_ENV: staging
The config is below:
Things I've Looked at
I've checked out the GitHub Applications and Authorizations; Google Cloud Platform is approved.
The Google Cloud GitHub Marketplace plugin is enabled for the repo and functions: I get a green check mark on pull requests after it builds the containers.
Current Results and Expectation
I expect some tags to be matched because the repo has tags. I have pushed some new tags and nothing has changed.
Note: Google Cloud Build's GitHub Marketplace Plugin is still in Alpha, so its features are not reliable and it is not unusual to run into breaking changes... and there is no active support for it.
The screen you are seeing is from Google Cloud Platform -> Cloud Build -> Triggers. It is different from, and unrelated to, the Google Cloud plugin found in the GitHub Marketplace (I know, it is confusing).
The triggers you set up there currently mirror GitHub repos into Google Cloud Source Repositories before they are executed.
As for the GitHub plugin, I have been having some issues with it these past few days, and I think they will be introducing some breaking changes to it soon, but when it worked, it did not require any triggers and purely looked at the cloudbuild.yaml file to run builds automatically. I had to create separate steps inside cloudbuild.yaml to set up different builds based on tags / branches (Cloud Build triggers let you do all of this in the UI), but what you get with the GitHub plugin is the "GitHub Checks" events (the green check / red cross) next to the corresponding commits in GitHub, plus a very brief details page. The GitHub plugin is currently acting up for me, and I am in the process of switching over to Cloud Build triggers until they have sorted it out.
I think they are working on something to bridge the difference between Google Cloud Build Triggers and the Cloud Build GitHub plugin... just a feeling from the current log messages I see inside Cloud Build...

GitHub feature like Bitbucket Pipelines

Is there any service / feature of github.com just like Bitbucket Pipelines?
I actually want to push my master branch to an FTP server (cPanel, Apache). It's really easy with Bitbucket Pipelines, but is there any way to do that on GitHub?
GitHub now has a feature called GitHub Actions, which, similar to Bitbucket Pipelines, allows you to execute arbitrary commands and processes triggered by events such as pushes, pull request merges, and others. So your build/test/deploy stages can run on GitHub's infrastructure, or you can move your app code to a remote location such as an FTP server to kick off a pipeline or update remote artifacts.
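The workflow definition itself is out of scope here, but as an illustration, a step in such a workflow could call a small script like the sketch below to upload the built files over FTP. The host, credentials, and paths are placeholders; in a real workflow you would supply them through repository secrets.

import os
from ftplib import FTP
from pathlib import Path

# Placeholders -- supply real values via environment variables / secrets.
FTP_HOST = os.environ.get("FTP_HOST", "ftp.example.com")
FTP_USER = os.environ.get("FTP_USER", "deploy")
FTP_PASS = os.environ.get("FTP_PASS", "change-me")
LOCAL_DIR = Path("dist")        # build output to upload
REMOTE_DIR = "public_html"      # target directory on the cPanel host

with FTP(FTP_HOST) as ftp:
    ftp.login(FTP_USER, FTP_PASS)
    ftp.cwd(REMOTE_DIR)
    for path in LOCAL_DIR.rglob("*"):
        if path.is_file():
            # Remote subdirectories must already exist (or be created with
            # ftp.mkd) for nested files to upload successfully.
            remote_name = path.relative_to(LOCAL_DIR).as_posix()
            with path.open("rb") as fh:
                ftp.storbinary(f"STOR {remote_name}", fh)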
GitHub itself doesn't provide this feature, but you can use GitHub apps, such as Travis CI.
Travis CI enables your team to test and ship your apps with confidence. It’s built for everyone and for projects and teams of all sizes, supporting over 20 different languages out of the box, including Javascript and Node.js, Ruby, PHP, Python, Mac/iOS, as well as Docker, while giving you full control over the build environment to customize it to your own needs.
There are also other apps for continuous integration: https://github.com/marketplace/category/continuous-integration
Not that I know of. You could, however, set up a build server using Jenkins, CircleCI, or Travis CI. I have used both Jenkins and CircleCI; both integrate well with GitHub (it's a fairly straightforward process). Jenkins is open source, whereas CircleCI is a cloud-based solution (it has a free tier). I believe either could help solve your issue.

Automatically mirroring a Gitlab repo onto Github on push

I'm looking for a way to automatically mirror my GitLab repos to GitHub on push. I use GitLab repos as my main repos and would rather push to only one remote, but I also want my code to be browsable on GitHub.
I found similar questions on StackOverflow, such as this one.
But the answers are always the same: one should add a custom post-receive git hook to the GitLab repo. This requires shell access to the server running GitLab. As I'm hosting a Community Edition GitLab for many users, not only me, they can't have easy access to a shell (and this isn't the most user-friendly approach anyway), so it does not fit my needs.
I thought about two ways to implement it:
Either a MirrorOnPush project service, implementing such a git hook in Ruby, as the EmailOnPush project service currently does.
Or a custom server that clones and pushes the repo, triggered by a webhook.
The first one seems cleaner to me, but I can't find any docs about GitLab project services and their code structure… On the other hand, the second is a bad and ugly hack, but is almost straightforward.
I'd rather implement a project service to handle it. Do you have any docs or leads on how to write a project service for GitLab (without having to read all the GitLab source code, as there seems to be no dev documentation…)?
Thanks!
one should add a custom post-receive git hook to the GitLab repo.
Actually, that was the best solution up until GitLab 7.x, as I detailed in "Gitlab repository mirroring".
A true project service for repo mirroring has been requested, but not voted up enough: suggestion 4614663.
The main documentation remains:
the app/models/project_services folder,
the spec/models/project_services folder,
the doc/project_services folder,
the project services scenarios.
This isn't much, as the OP noted before.
That leaves you with the hack approach.
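For completeness, here is a rough sketch of what that webhook-based hack could look like: a tiny Flask endpoint that, on each push event, refreshes a local mirror clone and pushes it to GitHub. The paths, remote URL, and shared secret are placeholders, Flask is just one convenient choice of web framework, and the X-Gitlab-Token check assumes you configure a secret token on the GitLab webhook.

import subprocess

from flask import Flask, abort, request

app = Flask(__name__)

# Placeholders -- adjust to your setup.
MIRROR_PATH = "/srv/mirrors/myproject.git"                # local bare mirror clone
GITHUB_REMOTE = "git@github.com:your-org/myproject.git"   # mirror target on GitHub
WEBHOOK_TOKEN = "change-me"   # must match the secret token set on the GitLab webhook

@app.route("/mirror", methods=["POST"])
def mirror():
    # GitLab sends the configured secret in the X-Gitlab-Token header.
    if request.headers.get("X-Gitlab-Token") != WEBHOOK_TOKEN:
        abort(403)
    # Refresh the local mirror, then push all branches and tags to GitHub.
    subprocess.run(["git", "--git-dir", MIRROR_PATH, "fetch", "--prune", "origin"],
                   check=True)
    subprocess.run(["git", "--git-dir", MIRROR_PATH, "push", "--mirror", GITHUB_REMOTE],
                   check=True)
    return "ok", 200

if __name__ == "__main__":
    # The mirror is expected to have been created once beforehand with:
    #   git clone --mirror <gitlab-repo-url> /srv/mirrors/myproject.git
    app.run(host="0.0.0.0", port=8080)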