What's the best way to use external artifacts as input to a buildbot build?

I am using buildbot to build a firmware image from a bunch of other pre-built artifacts. The pre-built artifacts are built using a different system (Jenkins) which is not under my control. I want to automatically kick off a new build when new artifacts are available. Since Jenkins is not under my control, I can't add anything to it to "trigger" buildbot; I need a poll-based approach. What's the best way to do something like this with buildbot? Should I treat the pre-built artifacts as "source" and write a JenkinsPoller that extends PollingChangeSource? Should I create a new scheduler that polls Jenkins?

So I ended up implementing a PollingChangeSource called JenkinsPoller with code lifted from GoogleCodeAtomPoller.
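For reference, a minimal sketch of what such a poller can look like, modeled on the buildbot 0.8.x PollingChangeSource API; the Jenkins job URL, the JSON fields, and the exact addChange keyword arguments are assumptions that may need adjusting for your versions:

import json

from twisted.internet import defer
from twisted.web.client import getPage

from buildbot.changes.base import PollingChangeSource

class JenkinsPoller(PollingChangeSource):
    def __init__(self, jenkins_job_url, pollInterval=300):
        # e.g. http://jenkins.example.com/job/firmware-parts (placeholder)
        self.jenkins_job_url = jenkins_job_url
        self.pollInterval = pollInterval
        self.last_build = None

    @defer.inlineCallbacks
    def poll(self):
        # Jenkins exposes the last successful build through its JSON API.
        url = "%s/lastSuccessfulBuild/api/json" % self.jenkins_job_url
        body = yield getPage(url)
        build = json.loads(body)
        # Only report a change when a build newer than the last seen one
        # has appeared; the first poll just records the baseline.
        if self.last_build is not None and build["number"] > self.last_build:
            yield self.master.addChange(
                author="jenkins",
                revision=str(build["number"]),
                files=[],
                comments="New artifacts from Jenkins build %d" % build["number"],
            )
        self.last_build = build["number"]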

I'd use a PBChangeSource in buildbot and have Jenkins call buildbot sendchange as the last step after creating its artifacts:
buildbot sendchange --master {MASTERHOST}:{PORT} --auth {USER}:{PASS} --who {USER} {FILENAMES..}
see: http://buildbot.readthedocs.org/en/latest/manual/cmdline.html
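On the master side, the receiving end is a PBChangeSource configured in master.cfg; a minimal sketch, where user and passwd must match the --auth values passed to sendchange:

# master.cfg fragment: accept changes sent over PB by "buildbot sendchange".
from buildbot.changes.pb import PBChangeSource

c['change_source'] = PBChangeSource(user="jenkins", passwd="changepw")

A scheduler watching these changes then kicks off the firmware build whenever Jenkins reports new artifacts.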

Related

How can you convert a GitHub action that uses Docker images into Azure Pipelines custom task

I'm trying to create a custom task to be published on the Azure Pipelines Marketplace so that people can use my security tool within Azure Pipelines. The task requires a lot of additional software, so Docker has been used for packaging.
I've similarly created the action for GitHub Actions, https://github.com/tonybaloney/pycharm-security/blob/master/action.yml
The action will:
Use a custom Docker image (hosted on Docker Hub)
Mount the code after checkout
Run a custom entry point, passing the arguments provided to the action
I cannot see how to achieve the same thing in Azure Pipelines. All of the example code for custom tasks is written in TS/Node or PowerShell.
The only TS/Node.js example doesn't show how to do anything like downloading a Docker image and running it.
The only other documentation I can find is about how to build a Docker image from within a Pipeline. But I want to download and run an image as a Task.
The Task SDK documentation clearly shows how to construct the package, but not how to do anything beyond getting it to pass arguments.
One possibility is to clone the DockerV2 Task and customize it to run the Docker commands that I need, but this seems quite convoluted compared with how simple it is in GitHub Actions.
I am afraid you have to clone the DockerV2 Task and customize it to run the Docker commands that you need.
The reason for the extra complexity is that the two systems implement extensions differently.
When you publish a custom GitHub Action to the Marketplace, the action is not compiled and packaged from source; the action definition is essentially a reference that tells the runner where to fetch the code or Docker image, what entry point to run, and what arguments to pass. That is why you don't need to download and customize any source code for a GitHub Action.
An Azure Pipelines custom task is different: it has to be compiled and packaged into a .vsix file, which means you need the source code and must rebuild it after making your changes.
Besides, Azure DevOps provides task groups, which let you encapsulate a sequence of tasks, already defined in a build or release pipeline, into a single reusable task that can be added to a build or release pipeline just like any other task. You can choose to extract the parameters from the encapsulated tasks as configuration variables, and abstract away the rest of the task information.
Hope this helps.

Build multiple projects/repositories with one build definition VSTS

I am using VSTS for my OPA5 tests, and everything works for a single project: I created a build for each project I wanted to test.
But if I want to test all projects, do I need to create a build for every project, or is there a way to build all projects with one build definition?
The build should always do the same things, defined in a YAML file.
I have seen that it's possible to build different branches, but not different repositories.
Does anyone have a solution for this, or is it impossible at the moment?
Yes, it's possible.
You just need to clone the other git repositories at the beginning of the build.
So you can add a PowerShell task as the first task and execute a git clone command there; a sketch of the idea follows below.
If you are using a YAML file, just add a script step that runs the same command.
Besides, you can also refer to the post VSTS build from multiple repositories.
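For illustration, here is that first step as a short Python script rather than PowerShell (both just shell out to git); the repository URLs are placeholders, and private repositories would additionally need credentials such as the job's System.AccessToken:

# Hypothetical first build step: clone the additional repositories
# before the rest of the build runs.
import subprocess

EXTRA_REPOS = [
    "https://myaccount.visualstudio.com/MyProject/_git/repo-two",
    "https://myaccount.visualstudio.com/MyProject/_git/repo-three",
]

for url in EXTRA_REPOS:
    subprocess.run(["git", "clone", url], check=True)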

How to copy build artifact to external server in Teamcity?

I am working on Angular 4 deployment using TeamCity. I need to copy the artifacts generated by the build to an external FTP server in an automated manner. Can anyone help me achieve this?
You can add a specific Build Step with the runner type: FTP Upload
You should run it either:
after the step that produces the content to push, or
in another build configuration, with artifact dependencies.

Calling a Jenkinsfile from a remote repo into build pipeline

I would like to pull a source-controlled version of a declarative Jenkinsfile into a multibranch Jenkins job.
For example, I have 20 multibranch build jobs, each building and deploying an application; each job would have a static Jenkinsfile that points to, pulls, and uses a version-controlled Jenkinsfile.
This would reduce the need to make the same change across all repositories.
(we do use shared libraries where relevant)
Thanks in advance
You can define the whole pipeline as a global variable within a Shared Library, and then use a single step in your Jenkinsfile, as explained here.
That way you can update the content of the pipeline without updating each Jenkinsfile across all your repositories.

TeamCity: Best Practices to deploy produced installers (artifacts)

We have a TeamCity server which produces nightly deployable builds. We want our beta testers to have access to these nightly builds.
What are the best practices for doing this? The TeamCity server is not public; it is in our office, so I assume the best approach would be pushing the artifacts via FTP or something like that.
Also, I have no clue how to trigger a script when an artifact is created successfully. Does TeamCity provide a way to do that?
I don't know of a way to trigger a script, but I wouldn't worry about that. You can retrieve artifacts via a URL. Depending on what makes sense for your project, you could have a script set up on a scheduler (cron or Windows Task Scheduler) that pulls the artifact and sends it to the FTP site for the beta testers. You can configure it to pull only the latest successful artifact. If you set up the naming right and a build fails, the beta testers won't notice: the new build number simply won't be there, and no bad builds will be pushed to them.
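As a sketch of that scheduled script, assuming Python, TeamCity's standard ".lastSuccessful" artifact-download URL pattern, and placeholder values for the host names, build configuration ID, artifact name, and FTP credentials:

import ftplib
import urllib.request

# Pull the artifact of the latest successful build; ".lastSuccessful"
# is TeamCity's alias for exactly that, so failed builds are skipped.
ARTIFACT_URL = (
    "http://teamcity.example.com/guestAuth/repository/download/"
    "MyProject_Nightly/.lastSuccessful/installer.zip"
)
local_file = "installer.zip"
urllib.request.urlretrieve(ARTIFACT_URL, local_file)

# Push it to the FTP site the beta testers can reach.
with ftplib.FTP("ftp.example.com", "betauser", "secret") as ftp:
    with open(local_file, "rb") as f:
        ftp.storbinary("STOR " + local_file, f)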
Read the following help page from the documentation. It shows how to send commands from your build script to tell TeamCity to publish the artifacts to a given path.
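The mechanism is a TeamCity service message printed to the build's standard output; a one-line example (the artifact path is a placeholder):

# TeamCity scans build output for service messages; this one tells the
# server to publish the given path as a build artifact.
print("##teamcity[publishArtifacts 'dist/installer.zip']")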
In TeamCity 7.0+ you can use the Deployer plugin. Installation steps can be found here. It also allows uploading artifacts via SMB and SSH.
I suggest you start looking at something like (n)Ant to handle your build process. That way you can handle the entire "build artifacts" -> "publish artifacts" chain in an automated manner. These tools are dependency based, so the artifacts would only be published if the build succeeded.