I want to analyze the log of an Ansible task inside a release pipeline with a custom task extension (Node). In general this is no big deal, but I am asking myself whether it is possible to get the position of my executing task inside the pipeline.
Use case
The user can add an Ansible task and, if they want, add my custom task to analyze the log (and publish it somewhere). To decide which Ansible task inside the release pipeline should be analyzed, the user should put the custom task directly after the Ansible task.
Something like this:
Release Pipeline 1.) Stage:
- Initialize job
- Download artifacts
- CopyFilesOverSSH
- Ansible <--- this should be analyzed
- Custom Task
- Ansible
- Finalize job
Scenario
A release pipeline with 3 stages:
1.) Stage runs first (linear)
2.) Stage & 3.) Stage run in parallel on the second level
All stages contain an Ansible task, and all stages should contain my custom task extension to analyze the Ansible task's log.
The Problem
When I now call the Get Release REST endpoint, I can loop through all environments, jobs, and tasks, which looks like this:
let release: Release = await Api.getRelease(Env.System.TeamProject, Env.Release.ReleaseId);
// loop environments (stages)
for (let environment of release.environments) {
    // loop deploy steps
    for (let deployStep of environment.deploySteps || []) {
        // loop phases
        for (let releaseDeployPhase of deployStep.releaseDeployPhases || []) {
            // loop jobs
            for (let deploymentJob of releaseDeployPhase.deploymentJobs || []) {
                let ansible: ReleaseTask[] = [];
                // loop tasks
                for (let task of deploymentJob.tasks || []) {
                    if (task.startTime && task.finishTime) {
                        if (task.name === "Ansible") ansible.push(task);
                    }
                }
                console.log(ansible);
            }
        }
    }
}
At runtime, at this point, my executing custom task is not finished yet (TaskStatus.InProgress), so together with the id of my task I should be able to detect its position inside the pipeline.
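Building on the loop above, this detection can be sketched as a small helper: find the index of the in-progress custom task in a job's ordered task list and take the task directly before it. A minimal sketch; the TaskInfo shape and the "inProgress" status string are simplifications of the real ReleaseTask type:

```typescript
// Simplified task shape for illustration; the real ReleaseTask interface
// from the REST response has more fields.
interface TaskInfo {
  name?: string;
  status?: string; // e.g. "inProgress", "succeeded"
}

// Return the task that sits directly before the currently running custom
// task in a job's ordered task list, or undefined if there is none.
function findPrecedingTask(tasks: TaskInfo[], customTaskName: string): TaskInfo | undefined {
  const idx = tasks.findIndex(
    t => t.name === customTaskName && t.status === "inProgress",
  );
  return idx > 0 ? tasks[idx - 1] : undefined;
}

const jobTasks: TaskInfo[] = [
  { name: "CopyFilesOverSSH", status: "succeeded" },
  { name: "Ansible", status: "succeeded" },
  { name: "MyCustomTask", status: "inProgress" },
];
console.log(findPrecedingTask(jobTasks, "MyCustomTask")?.name); // "Ansible"
```

This keeps the "put the custom task directly after the Ansible task" convention explicit: whatever precedes the in-progress custom task is the task whose log gets analyzed.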
A better solution?
But I hope there is a much better solution; I mean, the task should know which position it has inside the pipeline, right? Maybe there is information you can get from the azure-pipelines-task-lib/task library?
Something like:
task.getRank() >>> 3
task.getEnvironment() >>> 2
What would also help is being able to request the task name and id from inside the task's own program. Currently I have to create my own variables with this information.
Something like this:
me.Id() >>> "8bb50e0a-8efb-47a4-b12a-2190f3a4d16a"
me.Name() >>> "MyCustomTask"
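Some of this identity information may already be exposed through the agent's predefined variables, which azure-pipelines-task-lib reads from the environment (variable names upper-cased, dots replaced by underscores). Whether variables like System.TaskInstanceId are populated on a given agent version is an assumption worth verifying; a minimal sketch without the task-lib dependency:

```typescript
// Read a pipeline variable the way the task lib does:
// "System.TaskInstanceId" -> process.env["SYSTEM_TASKINSTANCEID"].
// Availability of these specific variables is an assumption - check your agent.
function getPipelineVariable(name: string): string | undefined {
  return process.env[name.replace(/[. ]/g, "_").toUpperCase()];
}

// Hypothetical "me" helper mirroring the wished-for API above.
const me = {
  id: () => getPipelineVariable("System.TaskInstanceId"),
  jobId: () => getPipelineVariable("System.JobId"),
  stage: () => getPipelineVariable("Release.EnvironmentName"),
};
```

If the variables turn out not to be set, falling back to custom pipeline variables (as the question describes) remains the reliable option.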
We can analyze the log of the Ansible task via PowerShell and the REST API; an example is shown below:
$listurl = "https://vsrm.dev.azure.com/{Org name}/{Project name}/_apis/release/releases/{Release id}?api-version=6.1-preview.8"
$PAT = "{PAT}"
$base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($PAT)"))
$result = Invoke-RestMethod -Uri $listurl -Headers @{Authorization = ("Basic {0}" -f $base64AuthInfo)} -Method Get
$result.environments.deploySteps.releaseDeployPhases.deploymentJobs.tasks.name
foreach ($deploySteps in $result.environments.deploySteps)
{
    Write-Host $deploySteps.releaseDeployPhases.deploymentJobs.tasks.name
    foreach ($Task in $deploySteps.releaseDeployPhases.deploymentJobs.tasks) {
        if ($Task.name -eq "{analyze task display name}")
        {
            #Write-Host $Task.logUrl
            $TaskLogURL = $Task.logUrl
        }
    }
}
#Write-Host $TaskLogURL
$TaskLog = Invoke-RestMethod -Uri $TaskLogURL -Headers @{Authorization = ("Basic {0}" -f $base64AuthInfo)} -Method Get
Write-Host $TaskLog
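The same download can also be done from the task extension itself. A sketch in the extension's TypeScript (assumes Node 18+ for the global fetch; logUrl and the PAT come from the release data, as in the PowerShell above):

```typescript
// Build the Basic auth header the same way the PowerShell does:
// base64 of ":<PAT>" (empty user name, PAT as the password).
function basicAuthHeader(pat: string): string {
  return "Basic " + Buffer.from(`:${pat}`).toString("base64");
}

// Download the plain-text log behind a task's logUrl.
async function downloadTaskLog(logUrl: string, pat: string): Promise<string> {
  const res = await fetch(logUrl, {
    headers: { Authorization: basicAuthHeader(pat) },
  });
  if (!res.ok) throw new Error(`log download failed: ${res.status}`);
  return res.text();
}
```

This pairs naturally with the position-detection idea: once the preceding Ansible task is found, its logUrl can be fetched and analyzed in the same task run.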
Related
I have a pipeline 'A' which has two stages: dev and prod. After the dev environment finishes, I want the prod stage to be triggered only if pipeline 'B' ran successfully. I want stage 'prod' of pipeline 'A' to depend on pipeline 'B'. Is this feasible?
You can get pipeline B's result in the dev stage (link here), set it as a variable, and in the prod stage evaluate the variable's value to determine whether the stage runs or not (link here).
Code sample as below:
stages:
- stage: Dev
  jobs:
  - job: DevJob
    steps:
    - task: PowerShell@2
      name: GetpipelineBresult
      inputs:
        targetType: 'inline'
        script: |
          $url = "https://dev.azure.com/{organization}/{pipelineBProject}/_apis/build/builds?definitions={definitionid}&api-version=5.1"
          $personalToken = "$(PAT)"
          $token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($personalToken)"))
          $header = @{authorization = "Basic $token"}
          $buildPipeline = Invoke-RestMethod -Uri $url -Headers $header -Method Get
          $BuildResult = $buildPipeline.value.result | Select-Object -First 1
          Write-Host "This is Build Result: $BuildResult"
          echo "##vso[task.setvariable variable=Buildresult;isOutput=true]$BuildResult"
- stage: Prod
  condition: eq(dependencies.Dev.outputs['DevJob.GetpipelineBresult.Buildresult'], 'succeeded')
  dependsOn: Dev
  jobs:
  - job:
    steps:
    - script: echo job Prod
Pipeline B result: succeeded
Pipeline B result: not succeeded (canceled, failed, ...)
1. Install this Azure DevOps extension.
2. In your Dev stage, add the Trigger Build task to ensure you can trigger pipeline B and check its latest build result.
3. Create the Generic service connection.
4. Use the Invoke REST API check in Environment Approvals and checks.
API: GET https://dev.azure.com/{organization}/{project}/_apis/build/latest/{definition}?api-version=6.0-preview.1
5. After the check passes, the second stage will depend on a successful build of pipeline B.
trigger:
- none

stages:
- stage: Dev
  jobs:
  - job: CI
    pool:
      vmImage: windows-latest
    steps:
    - task: TriggerBuild@4
      inputs:
        definitionIsInCurrentTeamProject: true
        buildDefinition: 'PipelineB'
        queueBuildForUserThatTriggeredBuild: true
        ignoreSslCertificateErrors: false
        useSameSourceVersion: false
        useCustomSourceVersion: false
        useSameBranch: true
        waitForQueuedBuildsToFinish: false
        storeInEnvironmentVariable: false
        authenticationMethod: 'Personal Access Token'
        password: 'PAT'
        enableBuildInQueueCondition: false
        dependentOnSuccessfulBuildCondition: false
        dependentOnFailedBuildCondition: false
        checkbuildsoncurrentbranch: false
        failTaskIfConditionsAreNotFulfilled: false
- stage: Prod
  dependsOn: Dev
  jobs:
  - deployment: CD
    environment: {EnvironmentName}
    pool:
      vmImage: windows-latest
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CmdLine@2
            inputs:
              script: |
                echo Write your commands here
                echo Hello world
I'm trying to pass parameters through the Builds REST API (triggered from Jira), but it doesn't override the parameter.
Pipeline:
parameters:
- name: "Testplan"
  type: string
  default: "NoPlanDefined"

stages:
- stage: Test
  jobs:
  - job: Testing_And_Transfer
    steps:
    - task: PowerShell@2
      displayName: "Testing API Call"
      inputs:
        targetType: 'filepath'
        filePath: './script/Jira.ps1'
        arguments: '-Jira_id ${{ parameters.Testplan }}'
Jira.ps1 content:
Param(
[string]$Jira_id = "no ID"
)
#-----------------------Jira API--------------------------
echo 'This is a test \n ID: '
echo $Jira_id
My rest command is setup like so:
URL: https://dev.azure.com/{My corp}/MyHello/_apis/build/builds?api-version=6.0
Body:
{
"definition": {
"id": 1
},
"parameters": "{ \"Testplan\":\"Postman\" }"
}
When using the trigger, the ps1 returns NoPlanDefined, as expected. When using a manual trigger and changing the parameter, the parameter gets changed as expected. When trying to change the parameter through the REST API, Testplan is empty instead of Postman.
Am I doing something wrong with the REST API?
That's because those are not parameters, despite the name used in the REST call. They are run-time variables, which behave differently and are available at a different scope than parameters.
There is a different API that allows you to specify templateParameters: https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run-pipeline?view=azure-devops-rest-6.1
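A sketch of what a call to that Runs endpoint could look like; the request body carries templateParameters, and the names in braces are placeholders, as in the question:

```typescript
// The Runs API accepts template parameters directly, unlike the
// "parameters" field of the build-queue API (which sets runtime variables).
function buildRunRequestBody(templateParameters: Record<string, string>): string {
  return JSON.stringify({ templateParameters });
}

// POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=6.1-preview.1
// with Content-Type: application/json and the body below.
const body = buildRunRequestBody({ Testplan: "Postman" });
console.log(body); // {"templateParameters":{"Testplan":"Postman"}}
```

Note that templateParameters is a plain JSON object here, whereas the build-queue API expects a JSON-encoded string inside the "parameters" field.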
If you are familiar with PowerShell you can use the AzurePipelinesPS module and the command below to pass parameters to pipelines when invoking them.
$id = '2' # your pipeline id
$templateParameters = @{
    Testplan = 'myTestPlan' # your parameter and value
}
Invoke-APPipeline -Instance 'https://dev.azure.com' -Collection 'yourCollectionName' -Project 'yourProjectName' -ApiVersion '6.0-preview.1' -PipelineId $id -TemplateParameters $templateParameters
The module supports "sessions" to limit the number of required parameters. See the module's GitHub page for how to create a session.
I am trying to execute a process written in C# through a Jenkins pipeline during the build and deployment process.
It is a simple executable that takes 3 arguments. When it gets called from the Jenkins pipeline using a PowerShell function, it doesn't write any of the logs that are plentiful within the exe's code, and the pipeline log shows nothing about what happened to this process, whereas the log output is clean before and after the execution, i.e. "Started..." and "end." get printed in the Jenkins build log.
When I try to run the same exe on a server directly with the same PowerShell script, it runs perfectly fine. Could you please let me know how I can determine what's going wrong here, or how I can make the logs more verbose so I can figure out the root cause?
Here is the code snippet
build-utils.ps1
function disable-utility($workspace) {
    # the code here fetches the executable and its supporting libraries from the Artifactory location and unzips them on the build agent server
    # below is the call to the executable
    type xmlPath # this prints the whole contents of the xml file which is being used as an input to my exe
    echo "disable exe path exists : $(Test-Path ""C:\Jenkins\workspace\utils\disable.exe"")" # output is TRUE
    echo "Started..."
    Start-Process -NoNewWindow -FilePath "C:\Jenkins\workspace\utils\disable.exe" -ArgumentList "-f xmlPath 0" # xmlPath is a path to a xml file
    echo "end."
}
jenkinsfile
library identifier: 'jenkins-library@0.2.14',
    retriever: legacySCM([
        $class: 'GitSCM',
        userRemoteConfigs: [[
            credentialsId: 'BITBUCKET_RW',
            url: '<https://gitRepoUrl>'
        ]]
    ])
def executeStep(String stepName) {
    def butil = '.\\build\\build-utils.ps1'
    if (fileExists(butil)) {
        def status = powershell(returnStatus: true, script: "& { . '${butil}'; ${stepName}; }")
        echo "${status}"
        if (status != 0) {
            currentBuild.result = 'FAILURE'
            error("${stepName} failed")
        }
    }
    else {
        error("failed to find the file")
    }
}
pipeline {
    agent {
        docker {
            image '<path to the docker image to pull a server with VS2017 build tools>'
            label '<image name>'
            reuseNode true
        }
    }
    environment {
        // loading the env variables here
    }
    stages {
        stage('Disable utility') {
            steps {
                executeStep("disable-utility ${env.workspace}")
            }
        }
    }
}
Thanks a lot in advance !
Have you changed it? Go to Regedit, open [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System] and set "EnableLUA" = 0.
I am trying to update a Power BI datasource using PowerShell in an Azure DevOps pipeline. This works sometimes, and sometimes I get an error:
##[error]System.Threading.Tasks.TaskCanceledException: A task was canceled.
The command on which I get this error is :
$urlUpdateDatasource = [string]::Format("groups/{0}/datasets/{1}/Default.UpdateDatasources", $GroupID, $destinationDataset.Id);
$body= "{
""updateDetails"": [
{
""datasourceSelector"": {
""datasourceType"": ""$srcDBType"",
""connectionDetails"": {
""server"": ""$srcServer"",
""database"": ""$srcDB""
}
},
""connectionDetails"": {
""server"": ""$destinationServer"",
""database"": ""$destinationDB""
}
}
]
}"
Write-Host $body
Write-Host $urlUpdateDatasource
try {
    $updateDS = Invoke-PowerBIRestMethod -Url $urlUpdateDatasource -Method Post -Body $body
}
catch {
    Write-Host "Failed to UpdateDatasources $_"
}
I have observed that I get this error when the step takes more than about 1.5 minutes, even though the update is successful. So the error is spurious and marks my pipeline as failed even though it is not. Please suggest how I can handle this.
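One way to handle a spurious client-side timeout is to treat the exception as inconclusive and verify whether the update actually landed before failing the pipeline. A generic sketch of that pattern (not Power BI-specific; the op and verify callbacks are placeholders for the update call and a follow-up read of the datasource):

```typescript
// Run an operation; if it throws (e.g. a client-side timeout/cancel even
// though the server finished the work), fall back to a verification call
// instead of failing outright.
async function runWithVerification(
  op: () => Promise<void>,
  verify: () => Promise<boolean>,
): Promise<boolean> {
  try {
    await op();
    return true;
  } catch {
    // The request may still have succeeded server-side: check before failing.
    return verify();
  }
}
```

In the pipeline, the verify step would re-read the dataset's datasources and compare the connection details against the intended target, failing only on a genuine mismatch.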
I'm trying to run a pipeline that does some Pester Testing and publish the NUnit results.
New tests were introduced, and for whatever reason Jenkins no longer publishes the test results and errors out immediately after the PowerShell script. Hence, it doesn't get to the NUnit publish piece. I receive this:
ERROR: script returned exit code 128
Finished: FAILURE
I've been trying to include the publish in the always section of the post section of the Jenkinsfile; however, I'm running into problems making that NUnit test file available.
I've tried establishing an agent and unstashing the file (even though it probably won't stash if the PowerShell script cancels the whole pipeline). When I use agent I get the following exception:
java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps
Here is the Jenkinsfile:
pipeline {
    agent none
    environment {
        svcpath = 'D:\\svc\\'
        unitTestFile = 'UnitTests.xml'
    }
    stages {
        stage('Checkout and Stash') {
            agent { label 'Agent1' }
            steps {
                stash name: 'Modules', includes: 'Modules/*/**'
                stash name: 'Tests', includes: 'Tests/*/**'
            }
        }
        stage('Unit Tests') {
            agent { label 'Agent1' }
            steps {
                dir(svcpath + 'Modules\\') { deleteDir() }
                dir(svcpath + 'Tests\\') { deleteDir() }
                dir(svcpath) {
                    unstash name: 'Modules'
                    unstash name: 'Tests'
                }
                dir(svcpath + 'Tests\\') {
                    powershell """
                        \$requiredCoverageThreshold = 0.90
                        \$modules = Get-ChildItem ../Modules/ -File -Recurse -Include *.psm1
                        \$result = Invoke-Pester -CodeCoverage \$modules -PassThru -OutputFile ${unitTestFile} -OutputFormat NUnitXml
                        \$codeCoverage = \$result.CodeCoverage.NumberOfCommandsExecuted / \$result.CodeCoverage.NumberOfCommandsAnalyzed
                        Write-Output \$codeCoverage
                        if (\$codeCoverage -lt \$requiredCoverageThreshold) {
                            Write-Output "Build failed: required code coverage threshold of \$(\$requiredCoverageThreshold * 100)% not met. Current coverage: \$(\$codeCoverage * 100)%."
                            exit 1
                        } else {
                            Write-Output "Required code coverage threshold of \$(\$requiredCoverageThreshold * 100)% met. Current coverage: \$(\$codeCoverage * 100)%."
                        }
                    """
                    stash name: 'TestResults', includes: unitTestFile
                    nunit testResultsPattern: unitTestFile
                }
            }
        }
    }
    post {
        always {
            echo 'This will always run'
            agent { label 'Agent1' }
            unstash name: 'TestResults'
            nunit testResultsPattern: unitTestFile
        }
        success {
            echo 'This will run only if successful'
        }
        failure {
            echo 'This will run only if failed'
        }
        unstable {
            echo 'This will run only if the run was marked as unstable'
        }
        changed {
            echo 'This will run only if the state of the Pipeline has changed'
            echo 'For example, if the Pipeline was previously failing but is now successful'
        }
    }
}
Any and all input is welcome! Thanks!
The exception you are getting is due to Jenkins' strict pipeline DSL. Documentation of the allowable uses of agent is here.
Currently agent {...} is not allowed in the post section; maybe this will change in the future. If you require the whole job to run on the node behind label 'Agent1', the only way to do that currently is to:
- Put agent {label 'Agent1'} immediately under pipeline { to make it global
- Remove all instances of agent {label 'Agent1'} from each stage
- Remove agent {label 'Agent1'} from the post section
The post section acts more like the traditional scripted DSL than the declarative pipeline DSL, so you have to use node() instead of agent.
I believe I've had this same question myself, and this SO post has the answer and some good context.
This Jenkins issue isn't exactly the same thing, but it shows the node syntax in the post stage.