Pipeline trigger doesn't run because it's getting the wrong branch - azure-devops

Friends, how can I make "Run pipeline" pick the desired branch by default?
My trigger is not firing because the pipeline run looks for the YAML file on the master branch, where it does not exist; I want it to automatically load the feat-regression-test branch.
My YAML file:
trigger:
- feat-regression-test

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: UseRubyVersion@0
  inputs:
    versionSpec: '>= 2.5'

- script: |
    gem install bundler
    bundle install --retry=3 --jobs=4
  displayName: 'bundle install'

- script: bundle exec cucumber -t @incluir_setor_resp_em_branco
  displayName: 'bundle exec cucumber'

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'NUnit'
    testResultsFiles: '**/TEST-*.xml'
    mergeTestResults: true
    searchFolder: '$(Common.TestResultsDirectory)'
    testRunTitle: 'Regression Test Geocall Gab'

Can you try this syntax:
trigger:
  branches:
    include:
    - feat-regression-test

You can change the pipeline's default branch by editing the pipeline definition in the UI and going to the "Triggers" section. Under the "YAML" tab, "Get sources", you can set the default branch used for manual and scheduled builds.
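Also note that CI triggers are normally evaluated from the version of the YAML file in the branch being pushed, so the file (with this trigger) has to exist on feat-regression-test itself; the default-branch setting above mainly affects manual and scheduled runs. Purely for illustration, a sketch of the fuller trigger syntax:

trigger:
  batch: false
  branches:
    include:
    - feat-regression-test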

Related

Manually triggering Devops pipeline with pipeline resource should use latest resource pipeline run for that branch

I have 2 pipelines in the same repo:
Build
Deploy
The Build pipeline is declared as a pipeline resource in the Deploy pipeline:
resources:
  pipelines:
  - pipeline: Build
    source: BuildPipelineName
    trigger: true
When I run the Build pipeline, the Deploy pipeline is correctly triggered on the same branch. However, when I run the Deploy pipeline manually, it does not use the latest pipeline run from same branch.
I tried adding a couple of variations of the line below to the pipeline resource, but the variable does not expand:
branch: ${{ variables.Build.SourceBranchName }}
Is there any way to make this work?
Workaround that achieves the result I am looking for, but is not very elegant:
- ${{ if ne(variables['Build.Reason'], 'ResourceTrigger') }}:
  - task: DeleteFiles@1
    displayName: 'Remove downloaded artifacts from pipeline resource'
    inputs:
      SourceFolder: $(Pipeline.Workspace)
  - task: DownloadPipelineArtifact@2
    displayName: 'Download artifacts for branch'
    inputs:
      source: 'specific'
      project: 'myProject'
      pipeline: <BuildPipelineId>
      runVersion: 'latestFromBranch'
      runBranch: $(Build.SourceBranch)
For example, if I have a build pipeline named 'BuildPipelineAndDeployPipeline',
then the below YAML definition can get the latest build pipeline run from a specific branch:
resources:
  pipelines:
  - pipeline: BuildPipelineAndDeployPipeline
    project: xxx
    source: BuildPipelineAndDeployPipeline
    trigger:
      branches:
      - main

pool:
  vmImage: 'windows-latest'

steps:
- task: CmdLine@2
  inputs:
    script: |
      echo Write your commands here
      echo Hello world
      echo $(resources.pipeline.BuildPipelineAndDeployPipeline.runID)
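As a side note on consuming that resource, artifacts from a pipeline resource can also be pulled in with the shorthand download step, which places them under $(Pipeline.Workspace)/<resource alias>. A small sketch; the listing step is only for illustration:

steps:
# Downloads all artifacts from the run selected for the resource
# (the triggering run, or the run chosen when queueing manually).
- download: BuildPipelineAndDeployPipeline

# Files land under $(Pipeline.Workspace)/<resource alias>/<artifact name>.
- powershell: Get-ChildItem -Recurse "$(Pipeline.Workspace)/BuildPipelineAndDeployPipeline"
  displayName: 'List downloaded artifacts'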

Question about Azure DevOps pipeline task conditions

I've been trying to set up an Azure DevOps pipeline for testing purposes, and I'm struggling to understand why one of my tasks runs the script line despite being skipped.
Here's the pipeline YAML code:
## Example azure-pipelines.yml
## Event (branch to trigger the pipeline execution)
trigger:
  branches:
    include:
    - main
    exclude:
    - My-branch # Will not run

# Configures pipeline execution on pull requests
pr:
  branches:
    include:
    - main
    exclude:
    - My-branch # Will not run

# Environment variables created
variables:
- group: my-keys

## OS where the pipeline will run
pool:
  vmImage: 'ubuntu-latest'

# List of stages for your application
stages:
- stage: Test
  displayName: Application Testing
  # List of jobs the pipeline stage will run
  jobs:
  - job: MyJob
    displayName: Install packages and publish
    variables:
      # Sets the environment variable to cache the application packages
      npm_config_cache: $(Pipeline.Workspace)/.npm
    # List of steps for the job
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - task: Cache@2
      displayName: Install and cache packages
      inputs:
        key: 'npm | "$(Agent.OS)" | package-lock.json'
        restoreKeys: |
          npm | "$(Agent.OS)"
        path: $(npm_config_cache)
    - script: npm ci
      condition: ne(variables.CACHE_RESTORED, 'true')
    - task: Npm@1
      displayName: Publish and auto accept
      condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/main'))
    - script: npx my-package --with-token=${my-keys} --auto-publish-changes
    - task: Npm@1
      displayName: Publish
      condition: eq(variables['Build.Reason'], 'PullRequest')
    - script: npx my-package --with-token=${my-keys}
    - script: echo ${{variables['Build.Reason']}} ${{eq(variables['Build.Reason'], 'PullRequest')}}
For example, when a push is made to the main branch it runs Publish and auto accept followed by Publish, when it should only run the first one. Another thing I saw is that when a pull request targets a branch other than main, it shouldn't trigger the script associated with Publish and auto accept but instead skip it and run only the script in Publish; instead, it runs the scripts for both.
If anyone could provide some help with this I would appreciate it.
Thanks in advance.
I think the problem is that you run four tasks instead of two.
Take a look at the Npm task syntax; it has no 'script' parameter:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/package/npm?view=azure-devops
The 'script' step you are using is actually a shortcut for another task, CmdLine@2:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/command-line?view=azure-devops&tabs=yaml
First you run the Npm task with the specified condition, but it does nothing:
- task: Npm@1
  displayName: Publish and auto accept
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/main'))
Then you run a script task without a condition (so it always runs); this task does the npm work:
- script: npx my-package --with-token=${my-keys} --auto-publish-changes
Then you run the Npm task again, without the desired parameters but with a condition:
- task: Npm@1
  displayName: Publish
  condition: eq(variables['Build.Reason'], 'PullRequest')
And finally you run a fourth task that does the work, again without a condition, so it always runs:
- script: npx my-package --with-token=${my-keys}
To fix this problem, you need to use the Npm@1 task with the parameters described in the documentation above, or just add conditions to your script tasks (CmdLine@2).
I think the snippet below should work:
- task: CmdLine@2
  displayName: 'Publish and auto accept'
  condition: and(succeeded(), eq(variables['build.sourceBranch'], 'refs/heads/main'))
  inputs:
    script: 'npx my-package --with-token=${my-keys} --auto-publish-changes'

- task: CmdLine@2
  displayName: 'Publish'
  condition: eq(variables['Build.Reason'], 'PullRequest')
  inputs:
    script: 'npx my-package --with-token=${my-keys}'
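If you prefer to stay with the Npm@1 task instead of CmdLine@2, one option is to wrap the npx call in a package.json script and run it through npm's custom command. A sketch, assuming a hypothetical "deploy" script in package.json that wraps the npx call:

- task: Npm@1
  displayName: 'Publish and auto accept'
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
  inputs:
    command: 'custom'
    # Runs "npm run deploy"; the hypothetical "deploy" script would wrap
    # npx my-package --with-token=... --auto-publish-changes
    customCommand: 'run deploy'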

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure Pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline Environments with Approvals and Checks to mandate this.
The issue is that when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

trigger:
- master

# Don't run against PRs
pr: none

stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      displayName: Install node modules
    - script: |
        npm run build
      displayName: 'Build it'
    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true
  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'
          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"
          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'
          - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps that print the values of the known system variables to see what is going on, using a PowerShell script like the one below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
      Write-Host "$(System.ArtifactsDirectory)"
You should simply publish the generated build artifacts to the drop folder.
Kindly check this official doc -- Artifact selection -- which explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none
- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify "download: none" in your steps.
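Putting this together for the original question, a minimal sketch of the deploy steps might look like the following. The service connection and storage account names are the ones from the question, the dist path inside the archive depends on your ArchiveFiles settings, and --auth-mode login assumes the service connection's identity has a Storage Blob Data role:

strategy:
  runOnce:
    deploy:
      steps:
      - download: none
      - task: DownloadPipelineArtifact@2
        inputs:
          buildType: 'current'
          targetPath: '$(Pipeline.Workspace)/artifacts'
      - task: ExtractFiles@1
        inputs:
          archiveFilePatterns: '$(Pipeline.Workspace)/artifacts/**/*.zip'
          cleanDestinationFolder: true
          destinationFolder: '$(Pipeline.Workspace)/unzipped'
      - task: AzureCLI@2
        displayName: 'Copy dist to Development blob storage'
        inputs:
          azureSubscription: MYTEST-Development
          scriptType: 'bash'
          scriptLocation: 'inlineScript'
          inlineScript: |
            # Upload only the built SPA; adjust the source path to wherever
            # the dist folder ends up after extraction.
            az storage blob upload-batch -d '$web' --account-name davey \
              -s '$(Pipeline.Workspace)/unzipped/dist' --auth-mode login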

Azure DevOps build scheduled trigger for every hour is not working

I have defined a .yml file like the one below. I expect it to run every hour on my branch 'kabhukya-cicd', but it is not working.
pool:
  name: Hosted Windows 2019 with VS2019
  demands:
  - npm
  - azureps

# Scheduled triggers
schedules:
- cron: "0 */1 * * *"
  displayName: every one hour trigger
  branches:
    include:
    - kabhukya-cicd
  always: true

# npm install, npm start, npm link steps
steps:
- task: Npm@1
  displayName: 'npm install'
  inputs:
    verbose: false
- task: Npm@1
  displayName: 'npm start'
  inputs:
    command: custom
    verbose: false
    customCommand: start
- task: Npm@1
  displayName: 'npm link'
  inputs:
    command: custom
    verbose: false
    customCommand: link
I have tried changing the cron schedule, but no luck.
Due to the way YAML pipelines work, only the YAML file in a given branch is considered for that branch. So if the branch filters should include a branch "kabhukya-cicd", but the YAML file in the "kabhukya-cicd" branch does not contain that schedule with those branch filters, the schedule will not be evaluated. This is a common complication that you could be hitting.
You can also refer to this case for other solutions; there, different solutions are provided by contributors who have the same issue as you.
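As a side note, the cron expression itself looks fine; Azure DevOps evaluates schedules in UTC, and "0 */1 * * *" is equivalent to the simpler form below, which (per the answer above) must be present in the YAML file on the kabhukya-cicd branch. A sketch:

schedules:
- cron: "0 * * * *"   # at minute 0 of every hour, UTC
  displayName: Hourly build
  branches:
    include:
    - kabhukya-cicd
  always: true        # run even if nothing has changed since the last run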

Azure DevOps test -xml not found after running the Cypress tests

I added a Publish Test Results task in an Azure DevOps CI/CD pipeline. The tests were successful, but after running them it complains: ##[warning]No test result files matching **/test-*.xml were found. Could someone please advise on how we can resolve this problem?
Publish Test Results task configuration:
Test result format = JUnit
Test results files = **/test-*.xml
Search folder = $(System.DefaultWorkingDirectory)
Test results title = Cypress Test Results
Note: I have tried adding the search folder path as follows: C:\agent_work\r5\a\drop\ui-tests\cypress
package.json script to run the tests:
"scripts": {
  "test": "cypress run --record --key <key value here>"
}
My directory path on the server:
C:\agent_work\r5\a\drop\ui-tests\cypress
My friend, I was facing the same issue on Azure DevOps.
In my case, the folder where the XML files were generated was reports at the root of the repo; that depends on how you have JUnit configured in your cypress.json file.
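For reference, the same JUnit output can also be configured on the Cypress command line instead of cypress.json; a sketch of such a script step, assuming Cypress's bundled junit reporter and a results output folder (the [hash] token produces one XML file per spec):

- script: npx cypress run --reporter junit --reporter-options "mochaFile=results/test-output-[hash].xml"
  displayName: 'Run Cypress tests with JUnit output'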
So in my case, the solution was changing this in azure-pipelines.yml:
  testResultsFiles: "results/*.xml"
  searchFolder: $(System.DefaultWorkingDirectory)
So that's the entire setup of the testing pipeline:
# Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '12.x'
  displayName: 'Install Node.js'

- script: "npm i"
  displayName: "Install project dependencies"

- script: "npm run cy:verify"
  displayName: "Cypress Verify"

- script: "source cypress.env" # comment this script to run tests against production
  displayName: "Using env variables to change url to test against development branch"

- script: "npm run cy:run-report"
  displayName: "Run Cypress Tests"

- task: PublishBuildArtifacts@1
  displayName: "Publish Artifact: cypress-azure-devops screenshots"
  inputs:
    PathtoPublish: cypress/screenshots
    ArtifactName: "CypressAzureDevopsTestRunScreenshots"
  condition: failed()

- task: PublishTestResults@2
  displayName: "Publish Test Results"
  condition: succeededOrFailed()
  inputs:
    testResultsFormat: "JUnit"
    testResultsFiles: "results/*.xml"
    searchFolder: $(System.DefaultWorkingDirectory)
    mergeTestResults: true
    testRunTitle: 'Test Results'
  continueOnError: true
Greetings from Argentina 🇦🇷