CodeBuild <> GitHub - Hooks broken

I'm using AWS CodeBuild with GitHub on several projects, and I noticed today that it doesn't work anymore. Something's broken, and I don't know what.
I have configured CodeBuild to build automatically when a PR is updated. It used to work fine, but now the PR check shows "Expected", without any link to the CodeBuild build, and in the CodeBuild console there is no build running.
It's as if the commit on the PR didn't trigger any build on CodeBuild.
Everything was working fine before, and I don't believe I've made any change to either the GitHub or the CodeBuild configuration. So what could cause the build not to be triggered? What should I look for?
Manually triggering a build from the CodeBuild UI works fine, and its status is properly synced with the PR.

The reason was both stupid and simple: I had renamed the GitHub repository using a different case.
I.e.: 'myproject' > 'MyProject'
Changing the source in CodeBuild to load the new source (with the updated case) fixed it:
https://github.com/UnlyEd/MyProject.git
Simple, stupid, and so easy to miss. GitHub handles those name changes very gracefully, and there is no need to change the local git config, because it treats the older names as aliases. But CodeBuild must somehow match on the exact source repository name, and it doesn't handle case changes.
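For reference, here is a minimal sketch of what the corrected source and PR webhook could look like if the project were defined in CloudFormation. The project name, role ARN, environment image and filter events below are assumptions for illustration, not values from the original setup:

```yaml
# Hypothetical CloudFormation definition of the CodeBuild project,
# with the repository URL updated to the new (re-cased) name.
Resources:
  MyProjectBuild:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: MyProject-pr-build                                   # assumed project name
      ServiceRole: arn:aws:iam::123456789012:role/codebuild-role  # placeholder
      Source:
        Type: GITHUB
        Location: https://github.com/UnlyEd/MyProject.git         # updated case
      Artifacts:
        Type: NO_ARTIFACTS
      Environment:
        Type: LINUX_CONTAINER
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:7.0                         # assumed build image
      Triggers:
        Webhook: true
        FilterGroups:
          - - Type: EVENT
              Pattern: PULL_REQUEST_CREATED, PULL_REQUEST_UPDATED, PULL_REQUEST_REOPENED
```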

Related

How to automatically deploy main branch changes to staging site using Github Actions

I'm collaborating on a website using GitHub for source control. The site is hosted on a shared server at Dreamhost. I'd like to set up an easy way for myself and my collaborator to see changes that have been merged into the main branch reflected on the staging site, and then also run a couple of other shell commands (composer update, for example) after deploying the changes.
I'm new to this. I've found pieces of relevant documentation but have not been able to tie it all together. So far I am running into at least two issues.
Setting up GitHub environments to point to development and staging environments
I looked into GitHub workflows, but it seemed GitHub Actions might be easier. I set up GitHub Environments called staging and development. When setting up the environments, I saw the option to add environment secrets but don't know exactly what to add there. So my environments in GitHub have names but don't really point to my development and staging environments. I think the first thing I need to figure out is how to link my GitHub Environments to my actual development and staging environments. I found "Deploying with GitHub Actions" but didn't find an answer there.
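As a rough illustration of how an environment gets "linked" to a real server, here is a minimal sketch of a workflow job that targets the staging environment and reads connection details from environment secrets. The secret names (STAGING_SSH_HOST, STAGING_SSH_USER, STAGING_SSH_KEY) and the server path are assumptions, not anything GitHub creates for you:

```yaml
# .github/workflows/deploy-staging.yml (hypothetical)
name: Deploy to staging
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    environment: staging          # ties this job to the GitHub Environment named "staging"
    steps:
      - name: Deploy over SSH and run composer
        run: |
          # Install the private key from an environment secret (assumed secret name)
          mkdir -p ~/.ssh
          echo "${{ secrets.STAGING_SSH_KEY }}" > ~/.ssh/id_ed25519
          chmod 600 ~/.ssh/id_ed25519
          # Pull the latest main branch on the server and run composer (assumed path)
          ssh -o StrictHostKeyChecking=no \
            "${{ secrets.STAGING_SSH_USER }}@${{ secrets.STAGING_SSH_HOST }}" \
            "cd ~/staging.example.com && git pull origin main && composer update"
```

In other words, the "link" is mostly convention: the environment holds the secrets (and optional protection rules) for a given server, and whichever job declares environment: staging gets access to them.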
Invalid workflow file error
Also, I found an action in the GitHub Marketplace called branch-deploy. I created a YAML file under .github/workflows to test it. When this runs, I see the following error on the workflow in GitHub:
Invalid workflow file
The workflow is not valid. .github/workflows/deploy.yml (Line: 2, Col: 1): Unexpected value 'id'
I'm not sure what is going on with this error, because the "basic usage" example on the Marketplace page uses the same value for id.
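That error at Line 2, Col 1 most likely means a step-level key ended up at the top level of the file: id is only valid inside an entry of a job's steps list, while the top level of a workflow only accepts keys such as name, on, permissions, env and jobs. Below is a minimal sketch of a structurally valid file; the action reference, version and permissions are assumptions to be checked against the Marketplace page:

```yaml
# Hypothetical deploy.yml - "id" belongs under "steps", never at the top level
name: branch-deploy test
on:
  issue_comment:
    types: [created]

permissions:
  contents: write
  pull-requests: write
  deployments: write

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: branch-deploy
        id: branch-deploy                  # step-level key: valid here
        uses: github/branch-deploy@vX.Y.Z  # pin to the version shown on the Marketplace page
      - name: Checkout
        uses: actions/checkout@v4
```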

Azure Data Factory deployment automation from multiple branches

I want to create an automated deployment pipeline for Azure Data Factory.
For a single stream of development we can configure it using this doc:
https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
But when it comes to deploying to two different test data factories for parallel feature development (in two different branches), it is not working, because the adf_publish branch that gets generated is specific to only one data factory.
Currently we are doing deployment using PowerShell scripts and passing the list of objects that need to be deployed.
Our repo is in Azure DevOps.
I tried:
Linking the repo to multiple data factories, but that causes issues, perhaps when finding the deltas to publish.
Creating forks of the repo instead of branches, so that adf_publish can be separate for every data factory - but this approach will not work when there is a conflict that needs a manual merge, because the testing will have to be done again instead of moving to prod.
adf_publish gets generated whenever you publish. Publishing takes whatever you have in your repo and updates the data factory with it.
To develop multiple features in parallel, you just need to use "Save". Save commits your changes to the branch you are actually working on; other branches do the same. Whenever you want to publish, you first need to make a pull request from your branch to master, then publish. Any merge conflict should be resolved when merging everything into the master branch. Then just publish and there shouldn't be any conflicts, and adf_publish will get generated after that.
Hope this helped!
A GitHub repository can be associated with only one data factory, and you are only allowed to publish to the Data Factory service from your collaboration branch. Check this.
It seems there is no direct and easy way to accomplish this. If you fork the repo as a workaround, you may have to resolve the conflicts before merging, as @Martin suggested.
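If it helps to visualize the last step, here is a minimal sketch of an Azure DevOps pipeline that deploys the ARM template from the generated adf_publish branch to a test factory. The service connection, subscription, resource group, factory folder and factory name are placeholders; the template file names are the defaults Data Factory generates, so adjust them to your setup:

```yaml
# Hypothetical azure-pipelines.yml triggered by the generated adf_publish branch
trigger:
  branches:
    include:
      - adf_publish

pool:
  vmImage: ubuntu-latest

steps:
  - task: AzureResourceManagerTemplateDeployment@3
    displayName: Deploy ADF ARM template to the test factory
    inputs:
      deploymentScope: Resource Group
      azureResourceManagerConnection: my-service-connection      # placeholder
      subscriptionId: 00000000-0000-0000-0000-000000000000       # placeholder
      resourceGroupName: rg-datafactory-test                     # placeholder
      location: West Europe
      templateLocation: Linked artifact
      csmFile: $(Build.SourcesDirectory)/my-factory/ARMTemplateForFactory.json
      csmParametersFile: $(Build.SourcesDirectory)/my-factory/ARMTemplateParametersForFactory.json
      overrideParameters: -factoryName adf-test-01               # placeholder
      deploymentMode: Incremental
```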

VSTS Filter by repository folder?

I'm using Visual Studio Team Services to build my project which is stored in GitHub (here). The master branch contains multiple projects which make up the solution. Amongst those are a WebAPI project and a Cordova project. I need to build those using two separate build definitions in VSTS.
Previously I had set up my build definition and used the branch filters to filter on what had been pushed to the repo. For instance:
master/src/API
This worked, but it doesn't anymore; it seems as if the underlying code has changed. A filter of 'master' still works, and I understand that this feature is probably meant to filter specifically on branches, not on folders within a branch.
It's not a huge problem, but at the moment all of my builds trigger with every check-in, even if nothing changed for that project's source code. So I'm now wondering what a good solution for this issue would be:
Put every project in its own branch. Seems like a workaround.
Some other filter option, or maybe another syntax or something?
Leave it as is and don't worry about the extra builds (but that itches, you know...)
Anyone running a similar set-up?
Path filters are not supported for GitHub CI builds on VSTS; they are only available for CI builds of Git repositories hosted on VSTS. You can vote for this UserVoice suggestion: https://visualstudio.uservoice.com/forums/330519-team-services/suggestions/15140571-enable-continuous-integration-path-filters-for-git
The workaround is, as you said, to put every project in its own branch.
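For comparison, this is roughly what a CI path filter looks like for a Git repo hosted in the project itself (shown in today's Azure Pipelines YAML syntax; the path is an assumption based on the example filter in the question):

```yaml
# Hypothetical azure-pipelines.yml for the WebAPI project only
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - src/API        # build only when files under this folder change
```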

Build an open source project from GitHub (not mine) with a CI

There is an open source project (https://github.com/firebase/firebase-jobdispatcher-android) which I would like to get built using Travis/CircleCI or another cloud CI. However, those CIs don't let you build repos that aren't yours.
I didn't try, but I have a hunch that I also won't be able to set up a webhook to get notified when that repo's 'master' branch is updated.
Why not fork? Because then I would somehow need to update my forked repo manually, or run a cron server to do it! That defeats the point of building an open source repo...
Why do I want to build it continually? Because they do not upload their .aar output to Maven Central or JCenter, and I don't want to put the .aars in my project and keep updating them all the time - it bloats the repo...
In any case, I don't get it: there's an open source project, the repo exists and is open to everyone, and pulling the code and receiving webhooks doesn't compromise that repo in any way, so why isn't this possible?
If I'm mistaken and a webhook is possible, how can I set up a build that ends up uploading to Maven Central (probably via a Gradle plugin; I have an account and would be happy to host a public copy there)?
(I also thought of some kind of free microservice plus a Docker-based CI that I can use to pull and build whatever I want; I don't mind if a build takes time.)
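One way the "pull and build whatever" idea from the last paragraph can be realized without forking is a scheduled pipeline in a small repo of your own that simply checks out the upstream project. Below is a minimal sketch using GitHub Actions (a tool that postdates this question); the schedule, JDK version and Gradle task are assumptions, and the upstream build may need additional SDK setup:

```yaml
# Hypothetical workflow in my own repo (not a fork of the upstream project)
name: Build upstream firebase-jobdispatcher-android
on:
  schedule:
    - cron: '0 3 * * *'    # nightly poll, since webhooks on someone else's repo aren't available
  workflow_dispatch: {}    # allow manual runs

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Check out the upstream repository read-only (no fork needed)
        uses: actions/checkout@v4
        with:
          repository: firebase/firebase-jobdispatcher-android
          ref: master
      - name: Set up a JDK (version is an assumption)
        uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '8'
      - name: Build the library (Gradle task is an assumption)
        run: ./gradlew assemble
```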

How to set up a GitHub pull request build in a Jenkinsfile?

So, I've been using Jenkins for quite a while. I have set up numerous projects with the GitHub Pull Request Builder plugin to run tests whenever someone opens a pull request, and then trigger some other job (build, push, deploy, etc.) whenever the pull request actually gets merged to master.
So, is there any way to set this up with a Jenkinsfile, or the organization folders, or the multibranch build deal?
The github-organization-folder plugin in combination with the multibranch plugin offers exactly this awesome feature: it scans a whole organization (optionally restricted to certain patterns in repo/branch names) for Jenkinsfiles and automatically adds the jobs. This also happens for pull requests.
Once the PR is closed, it automatically removes the job.
To avoid arbitrary code execution, an organization member has to trigger the build of the job (same as for the GHPRB plugin). The trigger phrase can be configured in the Jenkins system settings.
EDIT: Under the Advanced section in Jenkins, you find options for which types of PR you want to build. If you build fork PRs, then as far as I know there's no way to prevent running the code without inspecting it first.