Azure Pipelines: multi-stage YAML pipeline using the same work directory on the build server. How does it not corrupt?

Clarifications and corrections:
Testing with one self-hosted agent.
By version I mean a version of the application, or any new commit.
The same work directory is being used for builds of different commits, when there are still pending stages (requiring approval) in multiple build runs.
We have an Azure DevOps multi-stage YAML pipeline with approvers. I noticed that running different build versions of the same pipeline uses the same work directory on the build server.
How does this not cause corruption of content, for example if the pipeline runs simultaneously for different build versions?
For example what if the newer pipeline run checks out source code while the other run is building and creating artifacts for its own version? I have checked the current path for two concurrent builds and it is the same.
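For illustration, a step like the following prints the work-directory paths for a run (a minimal sketch; all of the variables are standard Azure Pipelines predefined variables):

steps:
- script: |
    # Print the directories this run is using. On a single self-hosted agent,
    # every run of the same pipeline definition maps to the same numbered
    # folder under the agent's _work directory (e.g. _work/1), which is why
    # two runs report the same path.
    echo "Agent.BuildDirectory:           $(Agent.BuildDirectory)"
    echo "Build.SourcesDirectory:         $(Build.SourcesDirectory)"
    echo "Build.ArtifactStagingDirectory: $(Build.ArtifactStagingDirectory)"
  displayName: 'Show work directory paths'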

What will happen if you run two commits on the same agent:
Here is my example of a multi-stage pipeline:
pool: Default

stages:
- stage: A
  jobs:
  - job: A
    steps:
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Build.SourcesDirectory)'
        artifact: 'drop'
        publishLocation: 'pipeline'

- stage: B
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool: Default
    workspace:
      clean: all
    environment: 'env'
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: CopyFiles@2
            inputs:
              SourceFolder: '$(Build.SourcesDirectory)'
              Contents: '**'
              TargetFolder: '$(Build.ArtifactStagingDirectory)'
I added an approval check in the environment. My running order is stage A (commit1) -> stage A (commit2) -> stage B (commit1) -> stage B (commit2).
stage A (commit1):
This job will check out the source code of commit1 and publish the files in the Sources Directory of commit1.
stage A (commit2):
This job will check out the source code of commit2 and publish the files in the Sources Directory of commit2.
stage B (commit1):
It is a deployment job and will not check out source code by default.
The deployment job will download the artifact of commit1 as expected.
If I don't clean the workspace, it will continue to use the source code of commit2. This may cause some issues.
If I add a checkout step to this stage, it will check out the source of commit1.
So you can add a checkout step and clean the workspace in your deployment jobs. Non-deployment jobs check out source code automatically and will use the correct source.
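Condensed, the two mitigations in the deployment job look like this (a sketch of the relevant lines from stage B above):

- deployment: DeployWeb
  pool: Default
  workspace:
    clean: all            # wipe the shared work directory before this job starts
  environment: 'env'
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self  # deployment jobs skip checkout by default; this pins the job to its run's commit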

Terraform: Error while loading schemas for plugin components

I have an Azure DevOps Build pipeline that publishes the entire repository as an artifact to be used with the Release pipeline.
# Publish artifacts to be used in release
- task: PublishBuildArtifacts@1
  displayName: 'publish artifacts'
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)'
    ArtifactName: 'TerraformModule'
    publishLocation: 'Container'
The build pipeline triggers a release pipeline where I try to deploy the Terraform configuration.
I can successfully run terraform init in this pipeline but when I try to run plan or apply, I get the following error:
Looking at the screenshot, it looks like it tries to execute the command from /usr/local/bin instead of the working directory I specified in the step? I'm confused by this. Below is the YAML for my plan step:
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV3@3
  displayName: 'terraform plan'
  inputs:
    provider: aws
    command: plan
    workingDirectory: '/home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod'
    environmentServiceNameAWS: 'AWS-Terraform-Build'
I manually changed workingDirectory to where the Artifacts from the build pipeline were downloaded to. See log below for example:
2022-08-14T23:41:31.3359557Z Downloaded TerraformModule/Projects/Potentium/Prod/main.tf to /home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod/main.tf
The plan step in my build pipeline executes without any issues, so I have a feeling it is something to do with the artifact extraction that is occurring in the download step. Looking for any advice.
I've had similar issues in the extraction phase when using ExtractFiles@1 for a similar thing with Terraform. I think there's a bug in it: I could not get it to extract files back to the root of System.DefaultWorkingDirectory unless the root folder was included in the archive (I am using ArchiveFiles@2), so I was ending up with paths like /opt/az_devops/_work/*/s/s.
My solution was to shell out to a command to do the extraction; that had no problems extracting to the root of System.DefaultWorkingDirectory.
Just remember that if you're running a subsequent terraform plan, the working directory System.DefaultWorkingDirectory can change between runs, so reference these variables rather than hard-coded paths.
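For example, a script step along these lines can replace ExtractFiles@1 (a sketch only; it assumes a Linux agent and an archive named drop.zip under System.ArtifactsDirectory, so adjust the name and path to match your pipeline):

steps:
- script: |
    # Extract the downloaded archive straight into the working directory root,
    # avoiding the nested s/s folder the extract task produced.
    unzip -o "$(System.ArtifactsDirectory)/drop.zip" -d "$(System.DefaultWorkingDirectory)"
  displayName: 'Extract artifact archive'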

Get PublishBuildArtifacts URL in a YAML pipeline

Is there any way to get the URL of a published artifact in the YAML pipeline, so it can be used in further pipeline tasks/steps?
Sadly, the Microsoft docs on the two tasks don't give any hint as to whether the published path is exposed in any way.
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: report.html
    artifactName: HtmlReport
It depends on where you're using the artifacts. Deployment jobs will typically download the artifacts automatically into the $(Pipeline.Workspace) folder, under the same name you declared in the build task.
So in your case it would be located at $(Pipeline.Workspace)\HtmlReport
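A quick way to confirm the location from inside a deployment job (a sketch, assuming a Linux agent and the HtmlReport artifact from above):

steps:
- script: |
    # Deployment jobs download pipeline artifacts automatically,
    # so the file should already be on disk at this point.
    ls -la "$(Pipeline.Workspace)/HtmlReport"
  displayName: 'Show downloaded artifact'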
You can also use the Download Build Artifacts task to download a specific artifact:
- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'current'
    downloadType: 'single'
    artifactName: 'HtmlReport'
This is useful if you have multiple published artifacts and you only want to download one of them in a later stage. There are other options if you wish to download an artifact from a different pipeline.
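For example, pulling an artifact from a different pipeline looks roughly like this (a sketch; the project name and build definition ID are placeholders):

- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'specific'            # another pipeline instead of the current run
    project: 'MyProject'             # placeholder project name
    pipeline: 42                     # placeholder build definition ID
    buildVersionToDownload: 'latest'
    downloadType: 'single'
    artifactName: 'HtmlReport'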
Note that the Publish Build Artifacts task is now deprecated and you are recommended to use the newer Publish Pipeline Artifacts and matching Download Pipeline Artifacts tasks:
We recommend upgrading from build artifacts (PublishBuildArtifacts@1 and DownloadBuildArtifacts@0) to pipeline artifacts (PublishPipelineArtifact@1 and DownloadPipelineArtifact@2) for faster performance.
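The pipeline-artifact equivalent of the example above would be something like this (a sketch using the same artifact name):

- task: PublishPipelineArtifact@1
  inputs:
    targetPath: report.html
    artifact: HtmlReport

- task: DownloadPipelineArtifact@2
  inputs:
    artifact: HtmlReport
    path: '$(Pipeline.Workspace)/HtmlReport'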

Cypress Integration with DevOps

What I want to achieve:
I have a repository on Azure DevOps which hosts my web application. I wrote a test suite for UI Automation using Cypress. I created a separate repository for my test cases to check if they are working properly or not. I created a pipeline which has the following content:
trigger:
- manual-tests

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm install
  displayName: 'npm install'

- task: Npm@1
  inputs:
    command: 'custom'
    customCommand: 'run test'
  continueOnError: true

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-output-*.xml'
    testRunTitle: 'My Test Cases'
I have a trigger set on a branch of the repository in which my UI automation code is stored. What I want is to trigger my automation script when there is a push to some branch of the web application repository. Is there a way of doing this? Can we store our test case files in the application repository and give the path of the test script?
It seems that the UI Automation Repo and Web Application Repo are two separate repos.
To trigger my automation script when there is a push to some branch of the web application repository: is there a way of doing this?
Triggering a pipeline from a different repo is not available right now.
This feature is still under development; multi-repository support for YAML pipelines is expected to come to the Azure DevOps service soon.
See "Multi-repository support for YAML pipelines" in the Azure DevOps Feature Timeline 2020 Q2; it is planned to roll out to everyone by the end of July 2020.
Workaround:
You could try to use the Pipeline triggers.
Here are the steps:
Step 1: Create a pipeline for the web application repository, and set the trigger branch on it.
Step 2: Add the pipeline trigger in the YAML file of the UI Automation repo.
For example:
resources:
  pipelines:
  - pipeline: Name
    source: Pipeline name
    trigger:
      branches:
      - releases/*
      - master
When you make a change in the web application repository, the web application pipeline will be triggered.
Once that pipeline has run, the pipeline in the UI Automation repo will be triggered in turn.
Can we store our test case files in the application repository and give the path of the test script?
Of course you can.
If you want to use the test files in the UI Automation repo's pipeline, you could add the web application repository as a resource in that pipeline.
For example:
resources:
  repositories:
  - repository: MyAzureReposGitRepository
    type: git
    name: MyProject/WebapplicationRepo

...

steps:
- checkout: MyAzureReposGitRepository
Note: the repo will be checked out to the agent's sources folder.
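With more than one checkout step, each repository lands in its own subfolder under $(Build.SourcesDirectory), named after the repo. A quick sketch to see the resulting layout (using the repository alias from above):

steps:
- checkout: self
- checkout: MyAzureReposGitRepository
- script: |
    # With multiple checkouts, each repo sits in a subfolder named
    # after the repository under the sources directory.
    ls "$(Build.SourcesDirectory)"
  displayName: 'Show checked-out repos'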
Hope this helps.

Azure DevOps Linked Work Items in pipeline

How does Azure DevOps identify linked work items during a pipeline run (Azure DevOps YAML)?
I believe it is supposed to pick up only the work items linked to new commits (i.e. commits that were not included in a previous pipeline run).
However, it sometimes links all work items, though not consistently; I have not managed to identify the pattern yet. I did notice that when I change the pipeline YAML, it seems to trigger this behaviour of linking all work items again (even ones that were linked to a previous commit and not to newly included commits).
Updated to include additional information
This is my build pipeline YAML
name: 03.01.00$(Rev:.r)

pool:
  name: Hosted VS2017
  demands:
  - msbuild
  - visualstudio
  - vstest

steps:
- checkout: self
  clean: true
  persistCredentials: true

- task: NuGetCommand@2
  displayName: 'NuGet restore'
  inputs:
    restoreSolution: MySol/MySol.sln

- task: VSBuild@1
  displayName: MySol/MySol.sln
  inputs:
    solution: MySol/MySol.sln
    vsVersion: 15.0

- task: WorkItemUpdater@2
  inputs:
    workitemsSource: 'Build'
    workItemType: 'Task,Bug'
    updateAssignedTo: 'Never'
    updateFields: 'Microsoft.VSTS.Build.IntegrationBuild,v$(Build.BuildNumber)'

- task: VSTest@2
  displayName: 'VsTest - testAssemblies'
Pipeline settings:
Processing of new run requests: Enabled
Automatically link work items included in this run: checked, with the dev branch selected
Triggers:
Override the YAML continuous integration trigger from here: checked
Enable continuous integration: checked
Batch changes while a build is in progress: unchecked
Branch filters: the dev branch plus another feature branch
Path filters: none
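For reference, that UI override corresponds roughly to this trigger block in YAML (a sketch; the feature branch name is a placeholder):

trigger:
  batch: false                 # "Batch changes" unchecked
  branches:
    include:
    - dev
    - feature/my-feature       # placeholder for the other feature branch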

Azure DevOps release pipeline: Angular and .NET Core application

We're trying to release an Angular 7 / .NET Core application into Azure using the DevOps release pipelines. I have my build set up to create the .NET and Angular builds as separate artifacts, which you can see in the screenshots below (under the Package or Folder box).
From what I've read it seems that you need to create two separate release tasks to deploy the builds to the web app. However, the second deployment seems to be overriding the first, which is causing the API not to start.
Does anyone know of a way to ensure the deployments in a given stage simply append the changes rather than replacing them? Or is there something else I am missing here?
My recommendation would be to implement the following pattern for your pipeline:
1. 'ng build --prod' the Angular app in its own job, and add the artifacts to your pipeline.
2. 'dotnet publish' the .NET Core API in its own job, running in parallel with the Angular job, and add the artifacts to your pipeline.
3. Append the Angular and .NET Core artifacts together into a new artifact. This serves as your final package to deploy.
4. Deploy the final package.
You're missing step 3, so you'd want something like the following logic defined in YAML, where you create a new zip that represents your actual deployed bits in your pipeline. Then release that artifact, since it is the representation of what you have running on your instances.
- job: CreateReleaseArtifact
  displayName: 'Package for shared-hosting of angular app and web api'
  pool:
    vmImage: windows-2019
  dependsOn:
  - BuildNetcore
  - BuildAngularApp
  condition: succeeded()
  steps:
  - checkout: none
  - download: current

  - task: CopyFiles@2
    displayName: 'Copy WebApi Files'
    inputs:
      SourceFolder: $(Pipeline.Workspace)/api
      Contents: '**/*'
      TargetFolder: $(Pipeline.Workspace)/package

  - task: CopyFiles@2
    displayName: 'Copy Angular Files'
    inputs:
      SourceFolder: $(Pipeline.Workspace)/webapp
      Contents: 'wwwroot/**'
      TargetFolder: $(Pipeline.Workspace)/package
      OverWrite: true

  - publish: $(Pipeline.Workspace)/package
    artifact: package
Does anyone know of a way to ensure the deployments in a given stage simply appends the changes rather than replacing them?
Based on my experience, in your case, after deploying the API or the Angular app, you could use the Kudu zip API to upload the other one to the Azure Web App.
You could use the PowerShell task to do that. For more information and demo PowerShell code, you could refer to this link.
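For illustration, a Kudu zip upload from the pipeline could look roughly like this (a sketch only; the app name, secret variable, and zip path are placeholders, and the credentials come from the app's publishing profile). The api/zip endpoint extracts the archive over the existing content without deleting files, which is what gives the append behaviour:

- task: PowerShell@2
  displayName: 'Append second app via Kudu zip API'
  inputs:
    targetType: 'inline'
    script: |
      # Upload a zip to Kudu's zip API, which extracts it over site/wwwroot
      # without removing files deployed earlier by the other task.
      $user = '$myapp'                  # deployment user from the publishing profile (placeholder)
      $pass = '$(kuduPassword)'         # secret pipeline variable (placeholder)
      $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("${user}:${pass}"))
      Invoke-RestMethod -Uri 'https://myapp.scm.azurewebsites.net/api/zip/site/wwwroot' `
        -Method PUT `
        -Headers @{ Authorization = "Basic $auth" } `
        -InFile '$(Pipeline.Workspace)/package/angular.zip' `
        -ContentType 'application/zip'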
If creating another web app is acceptable, you could add a new web app on the same service plan (no extra cost) and then deploy the two applications separately.