Say I have a project that depends on, and is built with, the latest commit from a repository managed by someone else. Is there a generic way to trigger a build when that repository changes? I am not talking about a project you own, where you have access to the webhook settings, but one where the repository is someone else's.
An example is Docker images. When I dockerise an application, I want a CI system to rebuild that image whenever the application's source repository is updated. I don't have control over the webhooks of the application vendor's Git hosting, so I cannot add one, but I would still like a trigger when the repository changes. A short delay is acceptable (it does not need to be instant).
For argument's sake, we can assume the repo is hosted on GitHub and that the CI supports webhooks.
Is there a tool/service that does this? I don't think GitHub, or any of the other large Git hosts (GitLab, Bitbucket), provides a way to do this, but please let me know if I am mistaken. All I can think of is to poll the repo in a scheduled job and trigger the build from that. I suspect there may be a Jenkins plugin for this, but I would like something generic, and if polling can be avoided in favour of a publish/subscribe model, that would be perfect.
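For illustration, here is a minimal sketch of the polling fallback, assuming the public GitHub commits API, a cron entry to run it every few minutes, and a hypothetical CI trigger URL that accepts a POST:

```sh
#!/bin/sh
# Polling sketch: remember the last seen commit SHA of the upstream repo and
# notify the CI system when it changes. REPO, BRANCH and CI_TRIGGER_URL are
# placeholders for the vendor repository and your own CI endpoint.
REPO="vendor/application"
BRANCH="master"
CI_TRIGGER_URL="https://ci.example.com/hooks/rebuild-image"
STATE_FILE="/var/tmp/${REPO##*/}.last-sha"

# Latest commit SHA on the branch, via the public GitHub API.
latest=$(curl -s "https://api.github.com/repos/$REPO/commits/$BRANCH" \
  | grep -m1 '"sha"' | cut -d '"' -f4)

previous=$(cat "$STATE_FILE" 2>/dev/null)

if [ -n "$latest" ] && [ "$latest" != "$previous" ]; then
  echo "$latest" > "$STATE_FILE"
  # The upstream repo moved: ask the CI system to rebuild the image.
  curl -s -X POST -H 'Content-Type: application/json' \
    -d "{\"repo\": \"$REPO\", \"sha\": \"$latest\"}" "$CI_TRIGGER_URL"
fi
```

Run from cron, this gives the "short delay is acceptable" behaviour without needing any access to the upstream webhooks.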
I'm looking for a workflow solution. We need something like the ad-hoc sharing workflow (https://docs.bit.dev/docs/workflows/projects) with one addition: publishing a component may only happen after a code review. Let me try to describe the scenario briefly:
there is a repo with the shared components
there are several consumer projects, each in its own repo
there is no dedicated team to maintain the repo with the shared components
the developer of a consumer project imports a shared component and makes changes
the developer wants to create a pull request for the component changes
So far I see only one solution: the developer manually applies the changes he made locally to the shared-library repo and manually creates a pull request. Kind of tedious. Does bit.dev provide an automated solution for such a case?
While a PR-like feature is still not available in Bit, you can use Git's PR workflow to set up a code review process for components with some automation.
Note that this flow can work regardless of the specific workflow your team implements. In this answer I'll focus on the ad-hoc flow, since that is what your team uses.
You'll first need to set up automation on your projects so that when a component's code changes, your CI runs bit tag && bit export on the modified components. This should happen only when a PR is approved and merged to the master branch (in Git).
Then, using the Git integration feature, set up your projects to receive PRs when new versions of components are available.
With these two pieces in place, this is the workflow your team can use:
Import the component into any project and modify it.
Submit a PR to the project.
Have a peer do a code review.
When the change is merged, run bit tag && bit export --eject during CI (see the sketch after this list).
Commit and push the resulting changes to package.json back to the repo (with a skip-ci flag, per your automation infrastructure).
All projects that use that component get a PR from Bit with the newly available version.
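A minimal sketch of that CI step, assuming it only runs on master after the merge; the remote scope name and commit message are placeholders, and the exact flags may differ per Bit version:

```sh
#!/bin/sh
# CI step sketch: runs only after a PR has been merged to master.
set -e

# Tag the modified components with a new version and export them to the
# remote scope ("my-org.my-scope" is a placeholder); --eject swaps the local
# component sources for the published package in package.json.
bit tag --all
bit export my-org.my-scope --eject

# Push the package.json changes back without retriggering the CI run.
git add package.json
git commit -m "chore: bump exported component versions [skip ci]"
git push origin master
```

The important part is that tagging, exporting and committing the package.json change happen in one automated step on master, so consumers only ever see reviewed versions.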
I will update this answer whenever a new feature in Bit improves on this workflow.
As Itay says, you can use the GitHub integration on bit.dev.
But if you want, I have created demo projects that show how to use GitHub Actions or Azure DevOps to integrate a project with Bit, export new components when code is pushed to master, and also run Bit scripts on PRs.
https://github.com/teambit/bit-with-github-actions
https://github.com/teambit/bit-with-azure-devops
I hope it will help you.
I would like to use the MyGet build service to build my project hosted on GitHub. However, the service is triggered unnecessarily when README.md or other documents in the repository are updated. Is there any way to skip these kinds of commits?
Unfortunately, there's no way to filter commits out of the box, as there's no way for a machine to know what you'd like to build and what you wouldn't (without extensive configuration). The GitHub commit webhook will fire either way, no matter what you commit, and any service subscribed to it will receive the event, including MyGet.
However, you could build your own trigger, as MyGet Build Services supports POST webhooks. You can add your own filtering to that custom trigger and choose when to fire the webhook and when not to.
Details about how to create a custom build trigger for MyGet Build Services can be found here: http://docs.myget.org/docs/how-to/auto-trigger-a-myget-build-using-an-http-post-hook-url
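For illustration, a minimal sketch of such a filter, assuming it runs somewhere you control that sees the push (for example a small script invoked with the before/after commit SHAs from the GitHub payload), and that MYGET_HOOK_URL holds the HTTP POST hook URL of your MyGet build source:

```sh
#!/bin/sh
# Sketch: only fire the MyGet build when the push touches something other
# than documentation. BEFORE_SHA/AFTER_SHA come from the GitHub push payload
# and MYGET_HOOK_URL is the POST hook URL of your MyGet build source.
BEFORE_SHA="$1"
AFTER_SHA="$2"

# Files changed by the push, with documentation filtered out.
changed=$(git diff --name-only "$BEFORE_SHA" "$AFTER_SHA" \
  | grep -v -E '(^docs/|\.md$)')

if [ -n "$changed" ]; then
  # Something other than docs changed: trigger the MyGet build.
  curl -s -X POST "$MYGET_HOOK_URL"
fi
```

MyGet only sees the POST when the script decides the commit is worth building, so README-only commits no longer trigger a build.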
There is an open source project (https://github.com/firebase/firebase-jobdispatcher-android) that I would like to build using Travis/CircleCI or another cloud CI. However, those CIs don't let you build repos that are not yours.
I haven't tried it, but I have a hunch that I also won't be able to set up a webhook to get notified when that repo's master branch is updated.
Why not fork? Because then I would somehow need to update the fork manually, or run a cron server to keep it in sync, which defeats the point of building an open source repo...
Why do I want to build it continually? Because they do not upload their .aar output to Maven Central or JCenter, and I don't want to check the .aars into my project and update them all the time; that bloats the repo...
In any case, I don't get it: the project is open source, the repo exists and is open to everyone, and pulling the data or receiving webhooks doesn't compromise that repo in any way, so why isn't this possible?
If I'm mistaken and a webhook is possible, how can I set up a build that ends with an upload to Maven Central (probably via a Gradle plugin; I have an account and would be happy to host a public copy there)?
(I thought of some kind of free microservice plus a Docker-based CI that I can use to pull and build whatever; I don't mind if a build takes time.)
The only similar question I found is the following: Trigger travis ci builds if another git repository updates
Keep in mind that our E2E tests are full stack: we have a live server running, and our tests are run against the UI which hits the server. Nothing is stubbed or mocked.
Now, I can trigger a Travis rebuild that way; however, a problem arises when there is an interdependency between a branch in one repository and a branch in another repository.
So let's say I have 3 repositories: backend, frontend and e2etests. If I create a new branch frontend/foo that also requires backend/foo, there is no way the e2e tests will ever pass, because they will run once with frontend/foo against backend/master and another time with frontend/master against backend/foo.
Has anyone faced this problem before? How did you deal with it?
After a lot of digging it became clear that there is no ready-made solution and everyone builds their own.
What worked for us is a git pre-push hook plus environment variables pointing to our repositories. When you push, it asks whether you want to use a different branch for the UI project (or, if you are on the UI, whether you want to use a different branch for the server project); by default it uses the current branch of the project you are on. So it's quite fast on push: usually you just press ENTER one extra time, the two branches are effectively the required dependencies, and a Travis build is triggered through an API endpoint.
Works fine so far.
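A minimal sketch of such a pre-push hook, assuming it lives in the frontend repo, uses the Travis API v3 build-request endpoint, and finds a token in TRAVIS_TOKEN; the repository slug and variable names are placeholders:

```sh
#!/bin/sh
# .git/hooks/pre-push (sketch), run from the frontend repo. Triggers the e2e
# build on Travis with both branch names as environment variables.
E2E_REPO_SLUG="my-org%2Fe2etests"                 # URL-encoded owner/repo
CURRENT_BRANCH=$(git rev-parse --abbrev-ref HEAD)

# Ask which backend branch the e2e run should use; ENTER keeps the default
# (here the same name as the branch being pushed, e.g. foo/foo).
printf "Backend branch for e2e tests [%s]: " "$CURRENT_BRANCH" > /dev/tty
read -r BACKEND_BRANCH < /dev/tty
BACKEND_BRANCH=${BACKEND_BRANCH:-$CURRENT_BRANCH}

# Create a build request on the e2etests repo via the Travis v3 API.
curl -s -X POST "https://api.travis-ci.com/repo/$E2E_REPO_SLUG/requests" \
  -H "Travis-API-Version: 3" \
  -H "Authorization: token $TRAVIS_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{\"request\": {\"branch\": \"master\", \"config\": {\"env\": {\"global\": [\"FRONTEND_BRANCH=$CURRENT_BRANCH\", \"BACKEND_BRANCH=$BACKEND_BRANCH\"]}}}}"

exit 0
```

The e2e build then reads FRONTEND_BRANCH and BACKEND_BRANCH to decide which branches to check out, falling back to master when a variable is unset.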
I'm looking for a way to automatically mirror my GitLab repos to GitHub on push. I use the GitLab repos as my main repos and would rather push to only one remote, but I also want my code to be browsable on GitHub.
I found similar questions on StackOverflow, such as this one.
But the answers are always the same: add a custom post-receive git hook to the GitLab repo. This requires shell access to the server running GitLab. As I'm hosting a Community Edition GitLab for many users, not only myself, they can't easily be given shell access (and it isn't the most user-friendly approach anyway), so it does not fit my needs.
I thought about two ways to implement it:
Either a MirrorOnPush project service, implementing such a git hook in Ruby, as the EmailOnPush project service currently does.
Or a custom server that clones and pushes the repo, triggered by a webhook.
The first one seems cleaner to me, but I can't find any documentation about GitLab project services and their code structure… On the other hand, the second is an ugly hack, but it is almost straightforward.
I'd rather implement a project service to handle it. Do you have any documentation or leads on how to write a project service for GitLab (without having to read all of the GitLab source code, as there seems to be no dev documentation…)?
Thanks!
one should add a custom post-receive git hook to the gitlab repo.
Actually, that was the best solution up until GitLab 7.x, as I detailed in "Gitlab repository mirroring".
A true project service for repo mirroring has been requested, but not voted up enough: suggestion 4614663.
The main documentation remains:
the app models project services folder,
the spec models project services folder,
the doc/project_services,
the project services scenarios.
This isn't much, as the OP noted before.
That leaves you with the hack approach.
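For that hack approach, a minimal sketch of a webhook-driven mirror, assuming a small receiver on a server you control runs a script like this whenever the project's push webhook fires; the repository URLs and paths are placeholders:

```sh
#!/bin/sh
# Sketch: mirror a GitLab repo to GitHub whenever the push webhook fires.
# Assumes the server has push access to both remotes (deploy keys/tokens);
# the URLs and the mirror directory are placeholders.
set -e
GITLAB_URL="git@gitlab.example.com:group/project.git"
GITHUB_URL="git@github.com:my-org/project.git"
MIRROR_DIR="/var/lib/mirrors/project.git"

# Keep a bare mirror clone around and refresh it on every webhook delivery.
if [ ! -d "$MIRROR_DIR" ]; then
  git clone --mirror "$GITLAB_URL" "$MIRROR_DIR"
fi

cd "$MIRROR_DIR"
git fetch --prune origin
git push --mirror "$GITHUB_URL"
```

Later GitLab versions also ship built-in push mirroring under the repository settings, which removes the need for this workaround, but at the time of the question a webhook receiver like this was the pragmatic option.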