Trigger another configuration and send the current build status with Jenkins - triggers

In a certain Jenkins configuration, I wish to trigger another configuration as a post-build action.
I want to pass the current build status as one of the parameters,
i.e. a string / int that represents the status (SUCCESS/FAIL/UNSTABLE).
I have two options to create post-build triggers:
Using the Join plugin
Using the Trigger Parameterized Build plugin in post-build actions
I wish there were some kind of accessible env variable at the end of the run...
Any ideas?
Thanks!

Here is a simple solution that covers most cases:
Use the 'Trigger Parameterized Build' plugin, and set two triggers:
'Stable or unstable but not fail'
'Fail'
Each of those triggers should run the same job - let's call it 'JOB_B'.
For each of the triggers, pass whatever parameters you like, and also pass a user-defined value:
for trigger '1' use: JOB_A_STATUS=SUCCESS
for trigger '2' use: JOB_A_STATUS=FAIL
Now, all you need to do is test the value of ${JOB_A_STATUS} from JOB_B, to see if it is set to 'SUCCESS' or 'FAIL'.
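For example, a minimal sketch of that check, assuming JOB_B tests the parameter in an 'Execute system Groovy script' build step (a plain shell step reading $JOB_A_STATUS would work just as well; the messages are placeholders):
// 'build' is bound by the system Groovy step; JOB_A_STATUS is the user-defined
// parameter passed by the trigger above.
def status = build.buildVariables['JOB_A_STATUS']
if (status == 'SUCCESS') {
    println 'Upstream JOB_A finished successfully'
} else {
    println 'Upstream JOB_A failed'
}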
Note that this solution does not distinguish between 'stable' and 'unstable'; it only knows the difference between 'fail' and 'success'.
Good luck!

You can check the status using a Groovy script in a post-build step via the Groovy Post-Build plugin, which can access Jenkins internals through the Jenkins Java API. The plugin provides the script with a 'manager' variable that can be used to access important parts of the API (see the Usage section in the plugin documentation).
For example, this is how you can output the build result to the build console:
def result = manager.build.result
manager.listener.logger.println "And the result is: ${result}"
Now, instead of just printing it, you can use that value to create a properties file and pass that file to the Parameterized Trigger post-build step (it has such an option).
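For example, a rough sketch under the same 'manager' binding as above (the properties file name is arbitrary, not something the plugin prescribes):
// Write the build result into a properties file in the workspace so the
// Parameterized Trigger step can read it via its 'Parameters from properties file' option.
def result = manager.build.result.toString()
manager.build.workspace.child('job_a_status.properties').write("JOB_A_STATUS=${result}\n", 'UTF-8')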
One caveat: I am not sure if it is possible to arrange post-build steps to execute in a particular order, to ensure that the Groovy post-build step runs before the Parameterized Trigger step.

Related

How to pull code from different branches at runtime and pass parameters to the NUnit.xml file?

We recently moved from Java (TestNG) to C#.NET (NUnit). At the same time we migrated from Jenkins to TeamCity. Currently we are facing some challenges while configuring the new build pipeline in TeamCity.
Scenario 1: Our project has multiple branches; we generally pull code from different Git branches and then trigger the automation.
In Jenkins we used to create a build parameter (a list); when users execute the job, they select the branch name from the list of build parameters, Git pulls the code from the selected branch, and the execution is triggered.
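For reference, that Jenkins setup looks roughly like the following in Pipeline syntax (the repository URL and branch names are placeholders, not taken from the question):
// Hypothetical Jenkins pipeline: a choice build parameter drives which branch Git pulls.
pipeline {
    agent any
    parameters {
        choice(name: 'BRANCH_NAME', choices: ['main', 'develop', 'release'], description: 'Branch to run the automation against')
    }
    stages {
        stage('Checkout') {
            steps {
                // Pull the branch the user selected when starting the job
                git branch: params.BRANCH_NAME, url: 'https://example.com/your/repo.git'
            }
        }
        stage('Test') {
            steps {
                echo "Running automation on ${params.BRANCH_NAME}"
            }
        }
    }
}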
Can you please help with how to implement a similar process in TeamCity?
How do we configure the default value in the list parameter?
Scenario 2: In Jenkins, build parameters used to be passed to TestNG.xml, e.g. browser and environment. When the user selects the browser and environment from the build parameters and the execution is triggered, TestNG pulls those values and initiates the regression.
How should we create build parameters (browser, environment) and pass those values to NUnit / the config file?
Thanks
Raghu

Trigger a Prow job with parameters read from the PR comment

GitHub link to the issue that I have raised:
https://github.com/kubernetes/test-infra/issues/25654
We have API tests that are triggered once a user comments /test smoke on a PR, but we want to make this job parameterized, which will help run these tests with some parameter.
E.g. /test smoke skip
Here we want to utilise the skip keyword in our job and take action accordingly.
This would enable jobs to run on some user-based runtime condition, helping to create fewer jobs.
As of now I don't see any way to pass a parameter with the PR comment that can be utilised.
Any workaround to execute the job with parameters would be helpful.

Azure DevOps - get custom Task Reference ID

I want to update a Pipeline with the Definitions - Update REST API call.
That works fine, but when I want to add a custom task (a self-made build pipeline task extension), I am struggling to find the correct task reference id:
Invoke-RestMethod : {"$id":"1","innerException":null,"message":"The pipeline is not valid. A task is missing. The pipeline references a task called '7f1fe94f-b811-4ba1-9d6a-b6c27de758d7'. This
usually indicates the task isn't installed, and you may be able to install it from the Marketplace: https://marketplace.visualstudio.com. (Task version 1.*, job 'Job_1', step ''.),Job Job_1: Step
has an invalid task definition reference. A valid task definition reference must specify either an ID or a name and a version specification with a major version
specified.","typeName":"Microsoft.TeamFoundation.DistributedTask.Pipelines.PipelineValidationException,
Microsoft.TeamFoundation.DistributedTask.WebApi","typeKey":"PipelineValidationException","errorCode":0,"eventId":3000}
I checked the registrationId of my custom task with the Installed Extensions - List REST API call, but it is not the correct one (7f1fe94f-b811-4ba1-9d6a-b6c27de758d7).
I also added the custom task manually to a pipeline and read out the correct task reference id with the Definitions - Get REST API call. I could find the id in:
$pipeline.process.phases.steps.task.id -> 2c7efb3e-3267-4ac6-addc-86e88a6dab34
But how can I read out this id without adding the custom task manually?
This id is obviously dynamic and changes every time the custom task gets installed, so there must be a way to get this reference.
The task id does not actually change every time the custom task gets installed; it exists in the task.json of the task:
{
"id": "2f159376-f4dk-4311-a49c-392f9d534113",
"name": "TaskName",
"friendlyName": "Task Name",
...
}
Another option is to use this API:
https://dev.azure.com/{organiztion}/_apis/distributedtask/tasks
You will get a long list of all the tasks; search for your task and you will see the id.
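For instance, a quick lookup script; Groovy is used here purely for illustration (the same GET works from PowerShell's Invoke-RestMethod), and the organization, PAT and task name are placeholders:
// Query the distributedtask/tasks endpoint and pick out the id of a task by name.
// Assumes the usual Azure DevOps response envelope { "count": ..., "value": [ ... ] }.
import groovy.json.JsonSlurper

def org = 'your-organization'           // placeholder
def pat = 'your-personal-access-token'  // placeholder
def taskName = 'TaskName'               // the "name" field from the task.json shown above

def conn = new URL("https://dev.azure.com/${org}/_apis/distributedtask/tasks").openConnection()
conn.setRequestProperty('Authorization', 'Basic ' + ":${pat}".bytes.encodeBase64().toString())

def tasks = new JsonSlurper().parse(conn.inputStream).value
def match = tasks.find { it.name == taskName }
println "Task reference id: ${match?.id}"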

Printing the Console output in the Azure DevOps Test Run task

I am doing some initial one-off setup using the [BeforeTestRun] hook for my SpecFlow tests. This checks on some users to make sure they exist, and creates them with specific roles and permissions if they do not, so the automated tests can use them. The function that does this prints a lot of useful information via Console.WriteLine.
When I run the tests on my local system, I can see the output from this hook function on the main feature file, and the output of each scenario under each of them. But when I run the tests via an Azure DevOps pipeline, I am not sure where to find the output for [BeforeTestRun], because it is not bound to a particular test scenario. The console of the Run Tests task has no information about this.
Can someone please help me show this output somewhere so I can act accordingly?
I tried System.Diagnostics.Debug.Print, System.Diagnostics.Debug.WriteLine and System.Diagnostics.Trace.WriteLine, but nothing seems to work in the pipeline console.
[BeforeTestRun]
public static void BeforeRun()
{
Console.WriteLine(
"Before Test run analyzing the users and their needed properties for performing automation run");
}
I want my output to be visible somewhere so I can act based on that information if needed.
It's not possible for the console logs.
The product currently does not support printing console logs for passing tests and we do not currently have plans to support this in the near future.
(Source: https://developercommunity.visualstudio.com/content/problem/631082/printing-the-console-output-in-the-azure-devops-te.html)
However, there's another way:
Your build will have an attachment with the file extension .trx. This is an XML file and contains an Output element for each test (see also https://stackoverflow.com/a/55452011):
<TestRun id="[omitted]" name="[omitted] 2020-01-10 17:59:35" runUser="[omitted]" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
<Times creation="2020-01-10T17:59:35.8919298+01:00" queuing="2020-01-10T17:59:35.8919298+01:00" start="2020-01-10T17:59:26.5626373+01:00" finish="2020-01-10T17:59:35.9209479+01:00" />
<Results>
<UnitTestResult testName="TestMethod1">
<Output>
<StdOut>Test</StdOut>
</Output>
</UnitTestResult>
</Results>
</TestRun>
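If you want to pull those messages out automatically instead of reading the XML by hand, here is a small sketch (Groovy, purely for illustration; the .trx file name is a placeholder):
// Print each test's captured StdOut from a downloaded .trx attachment (Groovy 3+).
import groovy.xml.XmlSlurper

def trx = new XmlSlurper().parse(new File('TestResults.trx'))
trx.Results.UnitTestResult.each { result ->
    println "${result.@testName}: ${result.Output.StdOut.text()}"
}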

Inject all GERRIT env variables as if the Jenkins job was started by gerrit event

This SO answer has the list of environment variables that get injected automatically when a Jenkins job is triggered by a Gerrit event. But if the job is started manually with a Gerrit number as an input parameter, how do I fetch those GERRIT_* env variables and inject them, so that the list of environment variables is the same whether the job is started by a Gerrit event or started manually with a Gerrit number as an input parameter?
You can't do that easily; you would have to use the Gerrit REST API to search for the GERRIT_* values you're interested in.
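A rough sketch of that REST approach (the Gerrit URL, the input-parameter name and the change number are placeholders; for private changes you will likely need authenticated access, i.e. the /a/ prefix plus credentials): query Gerrit for the change and write the usual GERRIT_* values to a properties file that an 'Inject environment variables' (EnvInject) build step can load.
// Fetch change info from Gerrit's REST API and emit the common GERRIT_* variables.
import groovy.json.JsonSlurper

def gerritUrl = 'https://gerrit.example.com'                  // placeholder
def changeNumber = System.getenv('GERRIT_NUMBER') ?: '12345'  // the manual input parameter

def conn = new URL("${gerritUrl}/changes/${changeNumber}?o=CURRENT_REVISION").openConnection()
def body = conn.inputStream.text.replaceFirst(/\)\]\}'/, '')  // Gerrit prefixes JSON with )]}'
def change = new JsonSlurper().parseText(body)
def rev = change.revisions[change.current_revision]

new File('gerrit.properties').text = [
    "GERRIT_PROJECT=${change.project}",
    "GERRIT_BRANCH=${change.branch}",
    "GERRIT_CHANGE_NUMBER=${change._number}",
    "GERRIT_PATCHSET_NUMBER=${rev._number}",
    "GERRIT_REFSPEC=${rev.ref}"
].join('\n') + '\n'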
But there's another option that, maybe, can solve your problem:
You can re-trigger any job, as if it had been triggered at that moment, with all environment variables set. Do the following:
Go to Jenkins web interface
Click on Jenkins > Query and Trigger Gerrit Patches
Search/select the Changes/Patchsets you want
Click on Trigger Selected