Question about Azure DevOps pipeline task conditions

I've been trying to set up an Azure DevOps pipeline for testing purposes and I'm struggling to understand why one of my tasks runs its script line despite being skipped.
Here's the pipeline yaml code:
## Example azure-pipelines.yml
## Event (branch to trigger the pipeline execution)
trigger:
  branches:
    include:
      - main
    exclude:
      - My-branch # Will not run
# Configures pipeline execution on pull requests
pr:
  branches:
    include:
      - main
    exclude:
      - My-branch # Will not run
# Environment variables created
variables:
  - group: my-keys
## OS where the pipeline will run
pool:
  vmImage: 'ubuntu-latest'
# List of stages for your application
stages:
  - stage: Test
    displayName: Application Testing
    # List of jobs the pipeline stage will run
    jobs:
      - job: MyJob
        displayName: Install packages and publish
        variables:
          # Sets the environment variable to cache the application packages
          npm_config_cache: $(Pipeline.Workspace)/.npm
        # List of steps for the job
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '12.x'
            displayName: 'Install Node.js'
          - task: Cache@2
            displayName: Install and cache packages
            inputs:
              key: 'npm | "$(Agent.OS)" | package-lock.json'
              restoreKeys: |
                npm | "$(Agent.OS)"
              path: $(npm_config_cache)
          - script: npm ci
            condition: ne(variables.CACHE_RESTORED, 'true')
          - task: Npm@1
            displayName: Publish and auto accept
            condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/main'))
          - script: npx my-package --with-token=${my-keys} --auto-publish-changes
          - task: Npm@1
            displayName: Publish
            condition: eq(variables['Build.Reason'], 'PullRequest')
          - script: npx my-package --with-token=${my-keys}
          - script: echo ${{ variables['Build.Reason'] }} ${{ eq(variables['Build.Reason'], 'PullRequest') }}
For example, when a push is made to the main branch it runs Publish and auto accept followed by Publish, when it should only run the first one. Another thing I noticed: when a pull request targets a branch other than main, it should not trigger the script associated with Publish and auto accept but skip it and run only the script in Publish; instead, it runs the scripts of both.
If anyone could provide some help with this, I would appreciate it.
Thanks in advance.

I think the problem is that you are running four tasks instead of two.
Take a look at the Npm task syntax; it has no 'script' parameter:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/package/npm?view=azure-devops
The 'script' step you are using is actually a shortcut for another task, CmdLine@2:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/command-line?view=azure-devops&tabs=yaml
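For illustration, a minimal sketch of that equivalence (the echo command is just a placeholder):
- script: echo "hello from the shortcut"
  displayName: 'Shortcut form'
# is effectively the same step as
- task: CmdLine@2
  displayName: 'Expanded form'
  inputs:
    script: echo "hello from the shortcut"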
First, you run the Npm task with the specified condition, but it does nothing:
- task: Npm@1
  displayName: Publish and auto accept
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/main'))
Then you run a script step without any condition (so it always runs); this is the step that actually does the npm work:
- script: npx my-package --with-token=${my-keys} --auto-publish-changes
Then you run the Npm task again, without the desired parameters but with a condition:
- task: Npm@1
  displayName: Publish
  condition: eq(variables['Build.Reason'], 'PullRequest')
And finally you run a fourth step that does the work, again without a condition, so it always runs:
- script: npx my-package --with-token=${my-keys}
To fix this, either use the Npm@1 task with the parameters described in the documentation linked above, or add the conditions to your script steps (CmdLine@2).
I think the snippet below should work:
- task: CmdLine@2
  displayName: 'Publish and auto accept'
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/main'))
  inputs:
    script: 'npx my-package --with-token=${my-keys} --auto-publish-changes'
- task: CmdLine@2
  displayName: 'Publish'
  condition: eq(variables['Build.Reason'], 'PullRequest')
  inputs:
    script: 'npx my-package --with-token=${my-keys}'
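Alternatively, a rough sketch keeping the script shortcut and attaching the conditions to it directly (the ${my-keys} token reference is carried over from the question as-is):
- script: npx my-package --with-token=${my-keys} --auto-publish-changes
  displayName: 'Publish and auto accept'
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
- script: npx my-package --with-token=${my-keys}
  displayName: 'Publish'
  condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))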

Related

Pipeline trigger doesn't run because it's getting the wrong branch

How do I make "Run pipeline" pick the desired branch by default?
My trigger is not firing because the pipeline run looks for the YAML file in the master branch, where it does not exist; I want it to automatically load the branch feat-regression-test.
My YAML file:
trigger:
  - feat-regression-test
pool:
  vmImage: 'ubuntu-latest'
steps:
  - task: UseRubyVersion@0
    inputs:
      versionSpec: '>= 2.5'
  - script: |
      gem install bundler
      bundle install --retry=3 --jobs=4
    displayName: 'bundle install'
  - script: bundle exec cucumber -t @incluir_setor_resp_em_branco
    displayName: 'bundle exec cucumber'
  - task: PublishTestResults@2
    inputs:
      testResultsFormat: 'NUnit'
      testResultsFiles: '**/TEST-*.xml'
      mergeTestResults: true
      searchFolder: '$(Common.TestResultsDirectory)'
      testRunTitle: 'Regression Test Geocall Gab'
Can you try this syntax:
trigger:
  branches:
    include:
      - feat-regression-test
You can change the pipeline's default branch by editing the pipeline definition via the UI and going to the "Triggers" section. There will be a "YAML" tab where you can configure this behavior.

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline environments with Approvals and Checks to mandate this.
The issue is that, when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
trigger:
  - master
# Don't run against PRs
pr: none
stages:
  - stage: Development
    displayName: Development stage
    jobs:
      - job: install
        displayName: Install and test
        pool:
          vmImage: $(vmImageName)
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '12.x'
            displayName: 'Install Node.js'
          - script: |
              npm install
            displayName: Install node modules
          - script: |
              npm run build
            displayName: 'Build it'
          # Build creates a ./dist folder. The contents will need to be copied to blob store
          - task: ArchiveFiles@2
            inputs:
              rootFolderOrFile: '$(Build.BinariesDirectory)'
              includeRootFolder: true
              archiveType: 'zip'
              archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
              replaceExistingArchive: true
              verbose: true
      - deployment: ToDev
        environment: development
        dependsOn: install
        strategy:
          runOnce:
            deploy:
              steps:
                - task: DownloadPipelineArtifact@2
                  inputs:
                    buildType: 'current'
                    targetPath: '$(Pipeline.Workspace)'
                - task: ExtractFiles@1
                  inputs:
                    archiveFilePatterns: '**/*.zip'
                    cleanDestinationFolder: true
                    destinationFolder: './cpDist/'
                # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
                - task: AzureCLI@2
                  inputs:
                    azureSubscription: MYTEST-Development
                    scriptLocation: "inlineScript"
                    scriptType: "bash"
                    inlineScript: |
                      az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
                  displayName: "Copy build files to Development blob storage davey"
                - script: |
                    pwd
                    ls
                    cd cpDist/
                    pwd
                    ls -al
                  displayName: 'list'
                - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps that print the values of the known system variables to understand what is going on, using a PowerShell script like the one below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
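Since the question runs on an ubuntu-latest agent, a quick bash equivalent of the same debug step could be (a sketch using only the predefined pipeline variables):
- script: |
    echo "Build.ArtifactStagingDirectory: $(Build.ArtifactStagingDirectory)"
    echo "System.DefaultWorkingDirectory: $(System.DefaultWorkingDirectory)"
    echo "Pipeline.Workspace:             $(Pipeline.Workspace)"
    ls -la "$(Pipeline.Workspace)"
  displayName: 'Debug paths (bash)'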
You should simply publish the generated build artifacts to the drop folder.
Kindly check the official doc on artifact selection; it explains that you can define the path to download the artifacts to with the following task:
steps:
  - download: none
  - task: DownloadPipelineArtifact@2
    displayName: 'Download Build Artifacts'
    inputs:
      patterns: '**/*.zip'
      path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify download: none in your steps.
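Putting the pieces together, the deploy steps could look roughly like the sketch below; the cpDist destination follows the question's ExtractFiles step, and the assumption that the built dist folder sits directly inside the extracted archive may need adjusting (authentication details are omitted):
steps:
  - download: none   # skip the implicit artifact download to $(Pipeline.Workspace)
  - task: DownloadPipelineArtifact@2
    inputs:
      buildType: 'current'
      patterns: '**/*.zip'
      path: '$(Pipeline.Workspace)'
  - task: ExtractFiles@1
    inputs:
      archiveFilePatterns: '$(Pipeline.Workspace)/**/*.zip'
      cleanDestinationFolder: true
      destinationFolder: '$(Pipeline.Workspace)/cpDist'
  - task: AzureCLI@2
    displayName: 'Copy build files to Development blob storage'
    inputs:
      azureSubscription: MYTEST-Development
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        # assumes the dist folder ends up at this path after extraction
        az storage blob upload-batch -d '$web' --account-name davey -s "$(Pipeline.Workspace)/cpDist/dist"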

File from previous step cannot be found in Azure DevOps-Pipeline

In a pipeline I have two different steps. The first one generates some files, the second should take these files as an input.
The YAML for that pipeline is the following:
name: myscript
stages:
  - stage: Tes/t
    displayName: owasp-test
    jobs:
      - job: owasp_test
        displayName: run basic checks for site
        pool:
          name: default
          demands: Agent.OS -equals Windows_NT
        steps:
          - task: DotNetCoreCLI@2
            inputs:
              command: 'build'
              projects: '**/*.sln'
          - task: dependency-check-build-task@5
            inputs:
              projectName: 'DependencyCheck'
              scanPath: '**/*.dll'
              format: 'JUNIT'
          - task: PublishTestResults@2
            inputs:
              testResultsFormat: 'JUnit'
              testResultsFiles: '**/*-junit.xml'
The dependency-check-build-task produces an XML file:
File upload succeed.
Upload 'P:\Azure-Pipelines-Agent\_work\2\TestResults\dependency-check\dependency-check-junit.xml' to file container: '#/11589616/dependency-check'
Associated artifact 53031 with build 21497
The following step (PublishTestResults) SHOULD take that file but returns
##[warning]No test result files matching **/*-junit.xml were found.
instead. I can see that file in the artifact after the pipeline is run.
This is because your report is written to Common.TestResultsDirectory, which is c:\agent\_work\1\TestResults (for Microsoft-hosted agents), while the publish test results task looks in System.DefaultWorkingDirectory, which is c:\agent\_work\1\s.
Please try:
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/*-junit.xml'
    searchFolder: '$(Common.TestResultsDirectory)'
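If in doubt, a small debug sketch (the agent in this question is Windows, so script runs under cmd.exe) can confirm where the report actually lands before the publish step:
- script: |
    echo Common.TestResultsDirectory:    $(Common.TestResultsDirectory)
    echo System.DefaultWorkingDirectory: $(System.DefaultWorkingDirectory)
    dir /s /b *-junit.xml
  displayName: 'Locate dependency-check report'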
I had the same trouble; I fixed it by changing the Agent Specification.

How to consume artifacts in AzureDevops build pipeline from one job to another job

I have created a build pipeline where:
in Job1 I create a specific file in a Python container;
in Job2 I would like to consume that file in the next process, which runs in a Docker container.
To achieve this, I created an artifact called configCreate in Job1, and in Job2 I tried to download that artifact into $(System.DefaultWorkingDirectory). But when I try to access the file, I still see the file from the source, not the file generated in Job1. How can I access the artifact from Job1 in Job2?
So,
stages:
  - stage: Build
    displayName: BUILD NON MASTER BRANCH
    condition: and(succeeded(), ne(variables['Build.SourceBranch'], 'refs/heads/master'))
    variables:
      - group: Common_DEV
    jobs:
      - job: buildConfig
        displayName: 'Create Properties file.'
        container: python3
        pool: rhel
        steps:
          - bash: echo "Hello World!!! - $(Build.SourceBranch)"
            displayName: "Started building for $(Build.SourceBranch)"
          - bash: |
              echo PythonV3
              python3 -m venv venv
              source venv/bin/activate
              python --version
              pip3 install -r $(System.DefaultWorkingDirectory)/requirements.txt
              python3 injectConfigProperties.py
              echo "Finish creating the Properties."
              deactivate
              rm -r venv/
              cat $(System.DefaultWorkingDirectory)/properties/config.properties
            displayName: Build Config file
          - task: CopyFiles@2
            inputs:
              contents: |
                $(System.DefaultWorkingDirectory)
              targetFolder: $(Build.ArtifactStagingDirectory)
          - task: PublishBuildArtifacts@1
            inputs:
              pathtoPublish: $(Build.ArtifactStagingDirectory)
              artifactName: configCreate
      - job: Job2
        displayName: 'Job2'
        container: docker
        pool: rhel
        dependsOn:
          - buildConfig
        condition: succeeded()
        steps:
          - task: DownloadBuildArtifacts@0
            displayName: 'Download Build Artifacts from buildConfig.'
            inputs:
              artifactName: configCreate
              downloadPath: $(System.DefaultWorkingDirectory)
          - bash: |
              cat $(System.DefaultWorkingDirectory)/properties/config.properties
              ls -ltr $(System.DefaultWorkingDirectory)/pipelines/
            displayName: 'Build Docker Image'
In Job2, the artifact has to be accessed as cat $(System.DefaultWorkingDirectory)/<artifactName>/properties/config.properties.
I had missed the artifact name in the path when accessing the artifact. Now it's fixed.
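Based on that, a sketch of how the Job2 steps look with the artifact name (configCreate, from the publish step above) included in the path:
- task: DownloadBuildArtifacts@0
  displayName: 'Download Build Artifacts from buildConfig.'
  inputs:
    artifactName: configCreate
    downloadPath: $(System.DefaultWorkingDirectory)
- bash: |
    # DownloadBuildArtifacts places the files under <downloadPath>/<artifactName>/
    cat $(System.DefaultWorkingDirectory)/configCreate/properties/config.properties
  displayName: 'Read the generated properties file'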

Azure pipelines run specific task if and ONLY IF branch is equal to master in yaml

I am trying to change my YAML file to add some more tasks. This is my current YAML file:
trigger:
  - master
pool:
  vmImage: 'Ubuntu-16.04'
steps:
  - task: Maven@3
    inputs:
      mavenPomFile: 'pom.xml'
      # according to: https://github.com/MicrosoftDocs/vsts-docs/issues/3845,
      # maven options should go to goals instead, as mavenOptions is for jvm options
      mavenOptions: '-Xmx3072m'
      javaHomeOption: 'JDKVersion'
      jdkVersionOption: '1.11'
      jdkArchitectureOption: 'x64'
      publishJUnitResults: true
      testResultsFiles: '**/surefire-reports/TEST-*.xml'
      goals: 'verify -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=WARN -Dorg.slf4j.simpleLogger.showDateTime=true -Djava.awt.headless=true --batch-mode --show-version'
I want to run one goal only if the branch being built is master. Basically, as part of my tests I create a Dockerfile and I want to push the resulting image to Docker Hub, but I don't want that to happen every time someone opens a pull request; I want it to happen only when master is running the tests. Something like this:
if branch == master
  steps: *
but I can't seem to find anything in the Azure Pipelines documentation on how to do this.
You can use the following condition on the task you want to make conditional:
eq(variables['Build.SourceBranch'], 'refs/heads/master')
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml#examples
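For example, a rough sketch of how it could be attached to a push step (the image name is a placeholder and registry login is assumed to have happened in an earlier step, neither is from the question):
- script: |
    docker build -t myorg/myimage:$(Build.BuildId) .
    docker push myorg/myimage:$(Build.BuildId)
  displayName: 'Build and push image (master only)'
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))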
I do something like this, as I find it easier to read later (template expressions are evaluated at compile time, so a step is only inserted when the branch matches):
steps:
  - ${{ if contains(variables['Build.SourceBranch'], 'master') }}:
      - task: Maven@3
        inputs:
          etcetera: 'etc.'
  - ${{ else }}:
      - task: Some other task