Trying to set up a Triggerable scheduler in Buildbot nine (0.9.1), but I think I'm misunderstanding how this is supposed to work.
I have two builders:
CompilerBuilder
PackageBuilder
The last step of the CompilerBuilder is a trigger step:
steps.Trigger(schedulerNames=['package'],
              waitForFinish=True)
The master configuration file has a Triggerable scheduler:
c['schedulers'].append(schedulers.Triggerable(
    name="package",
    builderNames=['package']))
What I want to achieve/expect
A SingleBranch Scheduler starts the CompilerBuilder
When the CompilerBuilder reaches the last step, the PackageBuilder is triggered, so the freshly compiled software is packaged.
What really happens
The SingleBranch Scheduler starts the CompilerBuilder
When the CompilerBuilder reaches the last step, the PackageBuilder is triggered
...so far so good. Unfortunately, the PackageBuilder is started in a different working directory from the one where the code was checked out and compiled, so the package creation process fails.
My understanding was that a triggered builder would run in the same working directory as the "calling" builder, but I think I'm missing how to configure the builders/schedulers correctly.
Any hint?
Unfortunately, each builder has its own working directory. I recommend using addSteps (not addStep) to append the list of package steps to the builder driven by the single-branch scheduler. If you don't always want to package, add a doStepIf with some additional logic.
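A minimal sketch of what that could look like (the repo URL, the make commands and the should_package condition are placeholders for illustration, not taken from your config):

from buildbot.plugins import steps, util

def should_package(step):
    # hypothetical condition: only package builds coming from the release branch
    return step.getProperty("branch") == "release"

compile_factory = util.BuildFactory()
compile_factory.addSteps([
    steps.Git(repourl="https://example.com/project.git", mode="incremental"),
    steps.ShellCommand(name="compile", command=["make"]),
])
# packaging steps are appended to the same builder, so they share the checkout/compile workdir
compile_factory.addSteps([
    steps.ShellCommand(name="package",
                       command=["make", "package"],
                       doStepIf=should_package),
])

With the packaging steps living in the same factory, the Trigger step and the Triggerable scheduler are no longer needed.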
I'm designing a device which will need to perform a number of setup activities at first boot, and I'm trying to figure out the best way to do it. One of the tools at my disposal seems to be the fantastically incompletely documented pkg_postinst_ontarget.
One of the activities I need to perform depends on an SD card being successfully mounted. Would pkg_postinst_ontarget get executed after all fstab mounting activities have completed?
The Yocto build places the post-installation scripts in /etc/ipk-postinsts if you are using ipk packages. Those scripts are then typically run by systemd on the target: the run-postinsts.service unit runs /usr/sbin/run-postinsts, which runs and then deletes all the scripts stored in /etc/ipk-postinsts. Hence, the scripts run once at first startup and disappear after they have been executed.
Following this Microsoft tutorial (Run unit tests with your builds), I was expecting to be able to run my unit-tests automatically when a build is triggered, such as a Pull request.
However, when I look in the Pipeline / Builds tab and try to edit my pipeline, there is nothing that allows me to add a new task (see screenshot below).
There doesn't appear to be any way of adding a task, and I can't even switch to the YAML without navigating to the source via the repo. I was hoping to use the GUI, though, as my YAML is non-existent.
I have created a test solution with the following structure, which is held in the repo:
Core Solution
|_ Class Library Project (.NET Core)
|_ MSTest Test Project (.NET Core)
I was hoping to have a build step followed by a Unit Test step using the tests in my MSTest Test Project (.NET Core) project once they were built.
There appears to be a Tasks option in the Releases tab by the way, but I was expecting to be able to add tasks for builds as well, especially Unit Tests.
Being new at this, perhaps I have missed or misunderstood something. I would be grateful for any help and to be pointed in the right direction.
It seems that there is an obscure link that takes you through to the correct process; I found it quite by accident (see screenshot below):
It turns out that there is a little link titled Use the visual designer that I'd missed. It seems a little odd that most of the tutorials discuss this process and yet the link is partially obscured. I guess this shows that the platform is a work in progress and still being added to and improved.
Another few caveats for those descending this little rabbit hole: after selecting the Use the visual designer link, be sure to select the Empty pipeline template (or whatever is appropriate to your project/solution type), and not the YAML option at the top of the list, otherwise you'll be back where you started.
Finally, it seems that automated Unit Tests don't work on agents other than Hosted VS2017 agents (that said, I haven't tried the Hosted option). This is set on the very next screen by selecting the Pipeline and picking the Agent pool from the drop-down.
One last thing... The pipeline won't run automatically unless you check the Enable continuous integration checkbox on the Triggers tab.
Once this is all done, I simply chose the tasks needed for the build by clicking the plus symbol on the Agent job 1 item.
Good luck
Kaine
You have created a new YAML-style build configuration. Currently, Azure DevOps does not support having the YAML and GUI editors at the same time (this is an upcoming feature in Q1/2019).
To get a GUI editor, create a new build pipeline.
On that screen, click "Use the visual designer".
Then, at the template phase, don't select YAML; any other template will do. You can easily delete all the build steps after selecting a template, so don't be afraid to choose any of them.
I am finishing a Continuous Integration system with Jenkins and Gradle for a REST service. It will build the app and its dependent sub-libraries, build a Docker image, and start the main container and secondary ones (database, ...), all in Gradle.
As it is a REST service, I have a separate project that executes the REST tests entirely from outside my project, just as a REST client would, and it works OK...
Once my project is built and everything is running, I need to execute the build of the other project (which is just for tests) as a subproject, and whether or not it passes the tests, I want to continue the main script, as the Docker containers need to be stopped and deleted. What is the best approach for this?
Thanks
You just need to create a task of type GradleBuild.
Example:
task buildAnotherProjectTask(type: GradleBuild) {
    buildFile = '../pathToBuildFileInTheOtherProject/build.gradle'
    tasks = ['build', 'test'] // you can list whichever tasks of the other project you want to run
}
and to run it you can use the following command:
gradle buildAnotherProjectTask
This worked for me when I tried it.
Hope my answer helps :)
When running our Release build (which ultimately labels and versions a changeset), I want the variables to be supplied at queueing time, for example 1.0.23 below:
Is there any way to set these variables as required in order to execute the build?
This new "vNext" build platform is incredibly difficult to Google for.
The best I have come up with thus far is to add a task as the first step in the first phase of the build that checks the required variables are set. If any are not, it fails the build.
I use PowerShell for this:
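# Queue-time build variables are surfaced to the agent as environment variables, hence $env:Major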
if ([string]::IsNullOrWhitespace($env:Major)) { throw "Major not set" }
This is not ideal, as the build still has to wait to get scheduled on an agent, sync sources, etc. before the validation code runs and fails the build. But it's still better than building everything just to have, say, packaging (step 14/15) fail because the version wasn't set.
I've opened a feature request on the VSTS UserVoice page asking for "required queue variables".
We have a perfectly working Talend workflow with 4 sub-jobs. One of the jobs needed a change, so we modified it and re-built the job within Talend Open Studio, then copied the jar to our production machine. However, when the Task executed, it failed with a "NoClassDefFound" error message.
So, is this not how it's supposed to be done? Do we have to re-build and re-deploy the main task and all the sub-jobs even for a minor change in a sub-job? Any ideas?
TIA,
Bee
You need to rebuild and deploy the main job.
I don't know why; maybe you have increased the version of your subjob?