I am attempting to set up a deployment on AppHarbor for code hosted on Bitbucket. I have a number of projects that make use of a library project of mine, so I use subrepos to keep my code manageable. This prevents AppHarbor from deploying, because the subrepos aren't included in the download.
This post led me to the cause of the problem:
AppHarbor, BitBucket and SubRepo Work Around
What I'm struggling with is how to implement the workaround they describe. Is this cron job/zip file something that I'm going to have to host myself, or is it possible to do with just Bitbucket and AppHarbor's post-build events? Thanks for any help that can point me in the right direction.
Instead of complicated workarounds, you might want to look into adding your library dependency as a NuGet package from a private feed. That feed could be hosted at http://www.myget.org
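As a rough sketch of that approach (the feed URL, API key, and package name below are placeholders, not real values), you would pack the library and push it to a private MyGet feed, then have the dependent projects restore it at build time:

    # Pack the library project into a .nupkg
    nuget pack MyLibrary/MyLibrary.csproj -Build -Properties Configuration=Release

    # Push it to your private MyGet feed (URL and key are placeholders)
    nuget push MyLibrary.1.0.0.nupkg <your-api-key> -Source https://www.myget.org/F/your-feed/api/v2/package

The consuming projects then reference the package like any other NuGet dependency, so the deployment no longer depends on the subrepo contents at all.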
I'm using Visual Studio Team Services to build my project which is stored in GitHub (here). The master branch contains multiple projects which make up the solution. Amongst those are a WebAPI project and a Cordova project. I need to build those using two separate build definitions in VSTS.
Previously I had set up my build definition and used the branch filters to filter on what had been pushed to the repo. For instance:
master/src/API
This worked, but it doesn't anymore. It seems as if the underlying code has changed: a filter of 'master' still works, so I suspect this feature is meant to filter specifically on branches, not on folders within a branch.
It's not a huge problem, but right now all of my builds trigger with every check-in, even if nothing changed in that project's source code. So I'm now wondering what a good solution for this issue would be:
Put every project in its own branch. Seems like a workaround, though.
Some other filter option or maybe another syntax or something?
Leave it as is and don't worry about the extra builds (but that itches, you know...)
Anyone running a similar set-up?
Path filters are not supported for VSTS GitHub CI builds; they are only available for CI builds of Git repositories hosted on VSTS. You can vote for this User Voice suggestion: https://visualstudio.uservoice.com/forums/330519-team-services/suggestions/15140571-enable-continuous-integration-path-filters-for-git
The workaround is, as you said, to put every project in its own branch; a sketch of one way to do that follows.
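For example (the branch names are illustrative, matching the projects mentioned in the question), the split could look like this, with each VSTS build definition's branch filter pointing at its own branch:

    # Create one branch per project from master and publish them
    git checkout -b api master
    git push origin api
    git checkout -b cordova master
    git push origin cordova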
Say I have a project that depends on, and builds against, the latest commit from a repository managed by someone else. Is there a generic way to get builds triggered? I am not talking about a project that you own, where you have access to the webhook settings, but one where the repository is someone else's.
An example I have of this is Docker images. Where I dockerise an application, I want a CI system to rebuild that image whenever the application's source repository is updated. I don't have control over the webhooks of the application vendor's Git hosting, so I cannot add a webhook, but I would like a trigger when the repository is updated. A short delay is reasonable (it does not need to be instant).
For argument's sake, we can assume that the repo is hosted on GitHub and that the CI supports web hooks.
Is there a tool/service that does this? I don't think GitHub or any of the other large Git hosts (GitLab or Bitbucket) provide a way to do this, but if I am mistaken please let me know. All I can think of is to poll the repo in some scheduled job and trigger the build from that. I suspect there may be a Jenkins plugin to do this, but I would like something generic, and if polling can be avoided in favour of the publish/subscribe model, that would be perfect.
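Absent a publish/subscribe option, a minimal sketch of the polling fallback could look like this (the repo URL, state file, and CI trigger URL are all placeholders for your own setup):

    #!/bin/sh
    # Poll an upstream repo we don't control; fire a CI trigger when HEAD moves.
    REPO=https://github.com/vendor/app.git
    STATE=/var/tmp/upstream-head

    NEW=$(git ls-remote "$REPO" HEAD | cut -f1)
    OLD=$(cat "$STATE" 2>/dev/null)

    if [ -n "$NEW" ] && [ "$NEW" != "$OLD" ]; then
        echo "$NEW" > "$STATE"
        # How to trigger the build depends on the CI; Jenkins, for instance,
        # accepts remote triggers on a job URL with a token.
        curl -fsS -X POST "https://ci.example.com/job/docker-image/build?token=TOKEN"
    fi

Run from cron every few minutes (e.g. */5 * * * *), this gives the short, acceptable delay mentioned above.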
I'm looking for a way to automatically mirror my GitLab repos to GitHub on push. I use the GitLab repos as my main repos and would rather push to only one remote, but I want my code to be browsable on GitHub as well.
I found similar questions on StackOverflow, such as this one.
But the answers are always the same: one should add a custom post-receive Git hook to the GitLab repo. This requires shell access to the server running GitLab. As I'm hosting a community edition GitLab for many users, not only myself, they can't easily be given shell access (and it isn't the most user-friendly way to do this anyway), so it does not fit my needs.
I thought about two ways to implement it:
Either a MirrorOnPush project service, implementing such a Git hook in Ruby, as the EmailOnPush project service currently does.
Or use a custom server to clone and push the repo, using a webhook.
The first one seems the cleaner of the two to me, but I can't find any documentation about GitLab project services and code structure… On the other hand, the second is a bad and ugly hack, but it is almost straightforward.
I'd rather implement a project service to handle it. Do you have any documentation or leads on how to write a project service for GitLab (without having to read all the GitLab source code, as there seems to be no dev doc…)?
Thanks!
one should add a custom post-receive git hook to the gitlab repo.
Actually, that was the best solution up until GitLab 7.x, as I detailed in "Gitlab repository mirroring".
A true project service for repo mirroring has been requested, but it is not voted up enough: suggestion 4614663.
The main documentation remains:
the app models project services folder,
the spec models project services folder,
the doc/project_services,
the project services scenarios.
This isn't much, as the OP noted before.
That leaves you with the hack approach.
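A minimal sketch of that hack, assuming you can place a hook in the bare repository on the server and that a remote named "github" with push access has already been added (both are assumptions, not built-in GitLab features):

    #!/bin/sh
    # post-receive hook: mirror every push to the GitHub remote.
    # Assumes the remote was added to the bare repo beforehand, e.g.:
    #   git remote add github git@github.com:user/mirror.git
    git push --mirror github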
I've been reading up on how people do continuous delivery with some of the popular toolsets.
Lots of posts (like this one) seem to indicate that a common way of doing things is to use something like capistrano to push software from your builds to your machines, and then chef or puppet to configure anything related to it.
My question is: do people generally push their software directly into a special Git repo for binary assets, or can Capistrano fetch it out of a Maven repo? The Maven approach seems most natural to me, but I don't seem to be able to find much information on it, which is what makes me think it's not the approach people generally take.
Basically, I'm slightly confused, as there seems to be a gap between the build output (which one would normally publish to a Maven repo) and where the delivery tools expect to find the software you have asked them to deploy (which seems to be a file system or a Git repo).
When it comes to artifacts, I leverage the Jenkins plugin for uploading to S3. Here's a link to it.
Basically, right now all my CI goes through Jenkins, and when I get a complete build I upload it to a bucket and have Chef pull the tarball/war/gem from there and install it.
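The pull-and-install step boils down to something like the following (the bucket, artifact key, and install path are placeholders; in practice it lives inside a Chef recipe rather than a standalone script):

    #!/bin/sh
    # Fetch the latest build artifact from S3 and unpack it into place.
    aws s3 cp s3://my-ci-artifacts/myapp-latest.tar.gz /tmp/myapp.tar.gz
    mkdir -p /opt/myapp
    tar -xzf /tmp/myapp.tar.gz -C /opt/myapp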
I've searched the web for several hours now but couldn't get past this:
Is there an easy way to deploy a private repository from GitHub to a staging/development server on each push (or at least manually)? Ideally, only the FTP credentials of the development server would be needed for this.
I found this: How can I automatically deploy my app after a git push (GitHub and node.js)? But this kind of "tutorial" in the best answer stops at the point of what exactly to put into the build.sh. And what modules are needed for this on the development server? SSH, Git, Ruby? Maybe this sounds stupid to you, or reflects wrong thinking on my part, but nowhere on the net did I find an answer to this.
The problem is that most of the time, the server to which the contents of the master branch should be deployed is a shared hosting server, where you don't always have SSH, Git, Python, Ruby, etc., which most solutions for deploying from GitHub seem to rely on... :/
http://beanstalkapp.com/ is really great at this: you can just enter FTP credentials and deploy automatically or manually for chosen repositories and branches. So I wondered why I couldn't find a similarly easy way to deploy from GitHub.
Thank you very much in advance!
Jonas
It isn't really clear what type of project you have, but here are a couple of ideas.
If your code is written in a compiled language, then you could:
Have a Jenkins server as mentioned in the other comment
Write a simple bash script that does a git pull and compiles, and run it from a cron job (a minimal sketch follows this list).
Use an automation framework like Chef or Puppet which would automatically keep the compiled binary up to date.
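A minimal sketch of the cron-driven script from the second option above (the repo path and build command are placeholders for whatever your project actually uses):

    #!/bin/sh
    # pull-and-build.sh: pull the latest code and rebuild it.
    cd /srv/myapp || exit 1
    git pull origin master
    make   # or msbuild, mvn package, etc., depending on your toolchain

A crontab entry such as */10 * * * * /usr/local/bin/pull-and-build.sh then keeps the server at most a few minutes behind the repo.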
If your code is an interpreted language (like HTML & JavaScript), then you could:
Use Vagrant for local testing. The biggest reason is that changes are live on your local system; it then only takes a git push on your machine and a git pull on the production server to make your changes live globally.
Your best bet is probably going to be the second option (the cron-driven script).