Code Style Testing for Chef using Jenkins and GitHub organizations

Currently, we commit Chef cookbooks to individual repos within our GitHub organization. We are configuring a Jenkins job per repo/cookbook that runs cookstyle when a commit occurs and, if cookstyle passes with no issues, then runs Test Kitchen. We have a template Jenkins job that we copy and configure for each cookbook we create.
Does anyone know if it's possible to have GitHub hooks in Jenkins listen for commit events across the entire organization, and then execute cookstyle on any repo containing a Chef cookbook where a commit occurred? I'd like to have one central job handling the lint testing for our organization.
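For context, each per-cookbook job essentially boils down to a couple of shell steps like the following (a minimal sketch; it assumes the cookstyle and kitchen commands from ChefDK/Chef Workstation are on the build node's PATH):

# Minimal sketch of what each per-cookbook Jenkins job runs.
set -e
cookstyle .      # lint the cookbook; a non-zero exit fails the build here
kitchen test     # only reached if cookstyle reported no issues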

You wouldn't generally use a single job for this. You would use Jenkins' GitHub Organization support to scan all repositories in the organization and build them when they change. You can use webhooks to ping the individual repos.

You can create GitHub webhooks for your GitHub organization and configure your Jenkins with the help of the GitHub plugin. Refer to this documentation to see how to configure it.
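As a rough illustration of the organization-level hook (the org name, token, and Jenkins URL below are placeholders, not values from the question), the GitHub API lets you create one webhook for the whole organization that points at the Jenkins GitHub plugin endpoint:

# Sketch: one organization-wide webhook pointing at Jenkins' /github-webhook/ endpoint.
# ORG, $TOKEN and jenkins.example.com are placeholders.
curl -X POST \
  -H "Authorization: token $TOKEN" \
  -H "Accept: application/vnd.github+json" \
  https://api.github.com/orgs/ORG/hooks \
  -d '{"name": "web", "active": true, "events": ["push"],
       "config": {"url": "https://jenkins.example.com/github-webhook/", "content_type": "json"}}'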

Related

Using Gitlab runners from within GitHub Actions, or: mirroring pull requests

My code is hosted on GitHub, and we perform some tests and quite a bit of post-processing using GitHub Actions. Now, we would like to (or, actually, have to) use Gitlab runners hosted by a supercomputing center to do some further testing and benchmarking. This cannot be done with self-hosted GitHub runners, because I cannot influence the supercomputing center's decision here. We do not want to move the whole workflow and community over to some Gitlab instance either. So here's my (general) question: is there a way to use Gitlab runners from within GitHub Actions?
What I have tried and what kind of works is to mirror the repository over to the Gitlab instance and let the runners do their magic there. Using this neat approach, the GitHub Action will wait for the results of the runners and integrate them into its own results. However, this does not work if contributors fork the repository and make pull requests.
In principle, it looks like this could be doable if the contributors also have accounts and corresponding permissions at the Gitlab instance. This is fine for now, because the community is small and the Gitlab instance is accessible to external contributors. Note that manual action from the maintainers of the code (i.e., me) is required before contributors can execute code with the runners for the first time, so we should be fine concerning security.
However, I cannot get this to work for pull requests, because I fail to mirror them. As said, direct pushes are fine, but nothing else works. This leads me to the more specific questions: how can I mirror a pull request from GitHub to a Gitlab repository? How can I enable this for both pull requests and pushes (and do I need even more cases)?
Any help is appreciated! I'm really no expert on GitHub Actions, Gitlab runners or even git itself (beyond the basics). If there's a better way to achieve this, I'm happy to hear about it!
I can think of several workarounds:
1. Change what triggers your pipelines
Since you cannot mirror pull requests directly, but you can mirror branches, adapt the pipeline triggers in Gitlab so the pipelines are launched whenever there is a new commit, instead of a new PR (one way to turn PR heads into branches is sketched below).
You can always use a staging branch if you want to limit the pipeline executions.
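One way to feed pull requests into that branch-based mirroring (the remote names and PR number below are just examples) is to fetch GitHub's read-only pull request refs and push them to Gitlab as ordinary branches:

# Sketch: mirror GitHub pull request #123 to Gitlab as a plain branch.
# "origin" is the GitHub repo, "gitlab" is the Gitlab mirror; 123 is an example PR number.
git fetch origin refs/pull/123/head:pr-123
git push gitlab pr-123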
2. Use webhooks
If the Gitlab instance is available on the internet, create a GitHub Action that triggers a Gitlab pipeline execution whenever there is a PR on GitHub, or even open a PR directly in Gitlab. It is well documented (a minimal trigger call is sketched after the links below):
Trigger a pipeline using curl
API to create merge request
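For reference, the trigger call from that documentation amounts to a single curl; the instance URL, project ID, trigger token, and ref below are placeholders:

# Sketch: trigger a Gitlab pipeline (e.g. from a GitHub Actions step).
# The URL, project ID 12345, token, and ref are placeholders.
curl -X POST --fail \
  -F "token=$GITLAB_TRIGGER_TOKEN" \
  -F "ref=main" \
  "https://gitlab.example.com/api/v4/projects/12345/trigger/pipeline"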

GitHub Pages Automation

Overview:
I'm using the GitHub Pages feature to host documentation. I'm working on a CI/CD process to automate the build so that when the source for the documentation is updated, it automatically rebuilds the content and deploys it to GitHub Pages.
Details:
So far, using AWS CodeBuild, I've implemented the following:
Pulls down source from GitHub Repo
Uses MkDocs to build and deploy to the special gh-pages branch using the "mkdocs gh-deploy" command.
This is done with command lines in the CodeBuild buildspec. The reason I'm using commands is that I want to use GitHub Deploy Keys as opposed to creating a user account (used as a machine account) that my team would need to manage.
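Roughly, my buildspec commands amount to loading the deploy key, pointing git at SSH, and letting MkDocs push the site (the use of SSM Parameter Store, the parameter name, and the repository URL below are just illustrative placeholders, not the exact setup):

# Sketch of the CodeBuild commands, assuming the private deploy key is kept in SSM Parameter Store.
mkdir -p ~/.ssh
aws ssm get-parameter --name /docs/deploy-key --with-decryption \
  --query Parameter.Value --output text > ~/.ssh/id_rsa
chmod 600 ~/.ssh/id_rsa
ssh-keyscan github.com >> ~/.ssh/known_hosts
git remote set-url origin git@github.com:example-org/example-docs.git
mkdocs gh-deploy --force   # builds the docs and pushes them to the gh-pages branch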
I have it all working except for what triggers the build. If the process were using a user account to authenticate, I could use AWS CodePipeline, which creates a webhook within the GitHub repo; notifications are then sent via the webhook when the master branch is updated, which would trigger a new build.
I'd like to implement a similar process but using the GitHub repo's Deploy Key. Any suggestions?

GitHub webhook is not created when creating a Google Cloud Build trigger

I have many projects which use a Google Cloud Build + GitHub build pipeline setup. However, there is one project for which I cannot create a webhook in GitHub.
It used to work, but commits to the repository no longer trigger the build process. I deleted the trigger and added it again, but the webhook in GitHub is not created automatically for this project.
When I run the trigger manually, it picks the wrong, fixed commit that I made about a year ago.
Any clue?
Could you try deleting the repository on Cloud Source Repositories and setting up Google Cloud Build again?
See:
https://cloud.google.com/cloud-build/docs/running-builds/automate-builds
Note: For external repositories, such as GitHub and Bitbucket, you must have owner-level permissions for the Cloud Platform project with which you're working. When you set up a build trigger with an external repository for the first time, you'll need to set up authorization with that repository.
After you've set up your external repository, Cloud Source Repositories creates a mirror of your repository.
https://source.cloud.google.com
https://cloud.google.com/source-repositories/docs/deleting-a-repository
https://cloud.google.com/source-repositories/docs/mirroring-a-github-repository
I am experiencing the same issue. I can create a trigger for a repo, but I cannot connect the repo to Cloud Build automatically. We also have many projects, and this manual labor is sort of annoying.
Is there any (under-the-hood) GitHub/gcloud API available with which I can connect a GitHub repo to Cloud Build? I am aware that this can only be done by someone with admin privileges on a repo or organization in GitHub.
After that, I will be able to run the command gcloud builds triggers create github [NAME]
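For what it's worth, once a repo has been connected (which still seems to require manual, admin-level action on the GitHub side), the trigger itself can be created from the CLI; the owner, repo, branch pattern, and build config below are placeholders:

# Sketch: create a Cloud Build trigger for a GitHub repo that is already connected.
gcloud builds triggers create github \
  --repo-owner="example-org" \
  --repo-name="example-repo" \
  --branch-pattern="^master$" \
  --build-config="cloudbuild.yaml"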

Run Jenkins job only once on changes pushed to GitHub

I followed this answer to set up a Jenkins job, and it's working fine.
I have scheduled the job to run on GitHub master commit pushes with
Poll SCM : * * * * *
But it continuously starts a build job every minute.
How can I restrict this so that it runs only once per commit push?
There are several options. The two I've used with most success are:
Use a git commit hook to call the Jenkins REST API to start the job. The simple approach is to call the job's build API directly (something like http://jenkinsmachine:8080/job/your-jobs-name/build), but that hardcodes the job name and branch into the git hook script. A more flexible approach is to use the Git plugin's own REST mini-API, as described by Kohsuke in his blog; both variants are sketched below.
Use something like the Throttle Concurrent Builds plugin, or a creative use of slave nodes and executors, to limit the number of concurrent builds of that job.
The first option is much preferred, but there are times when REST access to the Jenkins machine from the git machine is not available, so the second option can be used in those circumstances.
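As a rough sketch of option 1 (the Jenkins host, job name, credentials, and repository URL are placeholders), the hook script is essentially a single curl call:

# Sketch of a post-receive hook on the git server.
# Option A: call the job's build endpoint directly (hardcodes the job name).
curl -X POST --user user:apitoken "http://jenkinsmachine:8080/job/your-jobs-name/build"
# Option B: the Git plugin's notifyCommit endpoint, which makes Jenkins poll only
# the jobs watching this repository URL.
curl "http://jenkinsmachine:8080/git/notifyCommit?url=https://github.com/example-org/example-repo.git"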
This approach is actually polling. That means Jenkins scans the GitHub repository every minute to check whether there are any changes.
If you want a true push from GitHub to your Jenkins, you need to integrate GitHub webhooks with Jenkins. I wrote a blog post on this subject. Scroll to section 2: "Jenkins - Github Integration".
If you are just playing around or using it for your personal open source project, you may want to look into Jenkins alternatives like Drone.io or Codeship.io. These services are generally free for open source and can configure GitHub webhooks in a few clicks. But they are not suitable for complicated enterprise builds.

Jenkins GitHub pull request builder for multiple GitHub accounts

We have Jenkins set up on a server and are building projects from multiple GitHub Enterprise servers (currently three). I wanted to use the GitHub pull request builder plugin. However, this plugin only lets us configure one GitHub account.
Is there a method to include the other GitHub server accounts as well?
The GitHub pull-request builder plugin does mention:
The first fifty characters in the Description are used to differentiate credentials per job, so again use something semi-descriptive
So as long as your projects are built in separate jobs, you should be able to select different credentials in their respective job configurations.