Azure DevOps: Duplicate dist folder in pipelines build? Why? - azure-devops

There is a duplicate dist folder in the artifacts produced by Azure DevOps -> Pipelines: the build publishes both a /dist folder and a /drop/dist folder. EDIT: Full azure-pipeline.yml file
# Node.js with Angular
# Build a Node.js project that uses Angular.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/javascript
# Major modification referencing
# https://dev.to/thisdotmedia/continuously-integrating-angular-with-azure-devops-2k9l

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

# Build angular app area
- script: npm install
  displayName: 'npm install'

- script: npx ng build --prod
  displayName: 'npm build'

# Testing area
- script: npm install puppeteer --save-dev
  displayName: 'Installing puppeteer (Headless browser for testing)'

- script: npx ng test --watch=false --codeCoverage=true
  displayName: 'Running Tests'

- task: PublishTestResults@2
  condition: succeededOrFailed()
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/TEST-*.xml'
  displayName: 'Publish Test Results'

# Publishing items
# deploy.ps1 (PowerShell script to deploy)
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: 'deploy.ps1'
    ArtifactName: 'drop'
    publishLocation: 'Container'

# Firebase.json
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: Firebase.json'
  inputs:
    PathtoPublish: 'firebase.json'
    ArtifactName: 'drop'
    publishLocation: 'Container'

# App
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: 'dist'
    ArtifactName: 'drop/dist'
    publishLocation: 'Container'
  displayName: 'Publish Artifacts'

# Code Coverage Results
- task: PublishCodeCoverageResults@1
  condition: succeededOrFailed()
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(Build.SourcesDirectory)/coverage/ng-azure-devops/cobertura-coverage.xml'
  displayName: 'Publish Code Coverage Results'

- script: npx ng lint
  displayName: 'Code Analysis'
I've tried using 'drop' as the ArtifactName, which does NOT produce a duplicate folder artifact anywhere. I am very confused about why 'drop/dist' produces another '/dist' artifact.

I could reproduce this issue on my side.
When we publish the dist folder with ArtifactName: drop/dist, Azure DevOps first creates a new folder drop, then publishes the dist folder into that drop folder.
You can see this in the build log:
Upload '/home/vsts/work/1/s/dist' to file container:
'#/3620698/drop/dist'
However, a drop folder is already present by default. So when we publish the dist folder with ArtifactName: drop/dist, there are two drop folders, and Azure DevOps publishes the dist folder into both of them.
To see this more clearly, you can disable Multi-stage pipelines under Preview features; the resulting artifact view shows two drop folders, which is why you get the duplicate dist folder in your pipeline build.
So, to resolve this issue, we could change the ArtifactName: drop/dist to ArtifactName: dropTest/dist:
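For example, the last publish step would then look like this (a minimal sketch based on the pipeline above):

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: 'dist'
    ArtifactName: 'dropTest/dist'
    publishLocation: 'Container'
  displayName: 'Publish Artifacts'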
Now, the duplicate dist folder disappears.
Hope this helps.

Related

How to consume artifacts published by CI pipeline in CD pipeline

I have CI and CD pipelines using Azure DevOps for a frontend Angular project. Both are separate pipelines.
Here is the YAML file for the CI pipeline, which produces the published artifact output_final.zip. The pipeline below uses Azure Pipelines hosted agents to generate the published artifact.
# Node.js with Angular
# Build a Node.js project that uses Angular.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- integration

pool:
  vmImage: ubuntu-latest

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '14.x'
  displayName: 'Install Node.js'

- powershell: |
    $buildNumber="$(Build.BuildNumber)"
    echo $buildNumber > src/version.txt

- script: |
    npm install -g @angular/cli
    npm install
    ng build --prod
  displayName: 'npm install and build'

- task: CopyFiles@2
  displayName: 'Copy Files of UI'
  inputs:
    SourceFolder: 'dist/source'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/output'
    OverWrite: true

- task: ArchiveFiles@2
  displayName: 'Archive'
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)/output/'
    includeRootFolder: false
    archiveFile: '$(Build.ArtifactStagingDirectory)/output/output_final.zip'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/output/output_final.zip'
    ArtifactName: drop
Now I have a separate CD pipeline which runs on a self-hosted private agent. In this CD pipeline, I want to consume the artifacts published by the CI pipeline.
Can anyone help me understand how to consume the artifacts published by the CI pipeline and use them in the CD pipeline, with a sample YAML example?
My suggestion would be to use the Publish Pipeline Artifact task to publish the CI artifact. Then, in your CD pipeline, use the Download Pipeline Artifact task to consume it.
You can configure the download task to get the artifact from the current run or from a specific run. In your case, it sounds like you want a specific run since the CI pipeline is separate.
Here's an example of what the YAML tasks might look like:
# CI Task
- task: PublishPipelineArtifact@1
  displayName: 'Publish pipeline Artifact'
  inputs:
    targetPath: '$(Pipeline.Workspace)'
    artifact: '<Some Artifact Name>'
    publishLocation: 'pipeline'

# CD Task example
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: ''
    definition: ''
    specificBuildWithTriggering: true
    buildVersionToDownload: 'latest'
    targetPath: '$(Pipeline.Workspace)'
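As a rough follow-up sketch (the artifact name and paths here are assumptions, not part of the answer above), a later CD step could unpack the downloaded output_final.zip before deploying it:

- task: ExtractFiles@1
  displayName: 'Extract CI artifact'
  inputs:
    # 'drop' matches the ArtifactName used by the CI pipeline in the question;
    # adjust the folder to whatever artifact name you actually publish under.
    archiveFilePatterns: '$(Pipeline.Workspace)/drop/output_final.zip'
    destinationFolder: '$(Pipeline.Workspace)/site'
    cleanDestinationFolder: true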

dotnet build failing because project not found

I have a YAML file with tasks to copy and publish files to an artifact and to download the artifact.
Log for task DownloadPipelineArtifact@2:
Downloading: D:\a\1\s\Service\ProjectName\Repositories.Logging/Repositories.Logging.csproj
Then, I have a task to build. The build fails on this task.
Log for task DotNetCoreCLI@2:
Skipping project "D:\a\1\s\Service\ProjectName\Repositories.Logging\Repositories.Logging.csproj" because it was not found.
Why does it say the project is being skipped because it was not found, even though the log for DownloadPipelineArtifact@2 shows the same path? What am I missing and how do I fix it?
UPDATE:
I know that there is a difference in the slashes. However, I don't have control over updating the slashes:
Tasks to copy, publish, download:
- task: CopyFiles@2
  displayName: 'copy service'
  inputs:
    SourceFolder: 'Service\ProjectName'
    contents: '**'
    TargetFolder: '$(build.artifactstagingdirectory)'

- task: PublishBuildArtifacts@1
  displayName: 'publish artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: '$(Build.BuildNumber)'

- task: DownloadPipelineArtifact@2
  inputs:
    artifactName: '$(Build.BuildNumber)'
    downloadPath: Service\ProjectName
On updating the download task to:
- task: DownloadBuildArtifacts@0
  inputs:
    artifactName: '$(Build.BuildNumber)'
    downloadPath: $(System.DefaultWorkingDirectory)\Service\ProjectName
I see the following log from DownloadBuildArtifacts@0:
Downloaded to D:\a\1\s\Service\ProjectName\20201118.10\Repositories.Logging\Repositories.Logging.csproj
and the following log from DotNetCoreCLI@2:
D:\a\1\s\Service\ProjectName\Repositories.Logging\Repositories.Logging.csproj because it was not found
In this case I see the slashes correctly. Is it possible to remove 20201118.10 from the downloadPath so that it becomes:
D:\a\1\s\Service\ProjectName\Repositories.Logging\Repositories.Logging.csproj
Is it possible to remove 20201118.10 from downloadPath?
We cannot remove the artifact name 20201118.10 when we use the Download Build Artifacts task.
When we check the Download Build Artifacts task in the classic editor, we can see that the Artifact name option is required.
To resolve this issue, we could add a copy task to copy the files to the $(System.DefaultWorkingDirectory)\Service\ProjectName folder:
- task: CopyFiles@2
  displayName: 'Move artifact Name'
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)\Service\ProjectName\$(Build.BuildNumber)'
    TargetFolder: '$(System.DefaultWorkingDirectory)\Service\ProjectName'
Or you could specify the path including the artifact Name 20201118.10 when you build the project:
- task: DotNetCoreCLI@2
  displayName: 'dotnet build'
  inputs:
    projects: '$(System.DefaultWorkingDirectory)\Service\ProjectName\$(Build.BuildNumber)\Repositories.Logging\Repositories.Logging.csproj'

React JS Azure DevOps Web app won't run but files are present in server and no errors in pipelines

I am doing a simple Azure DevOps CI/CD deployment and following all the steps. First up, here is my YAML file. I have kept the comments as they are, just in case I am making more mistakes than I am aware of.
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

# Set variables
variables:
  directory: ReactJSRecipeApp

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '12.13.0'
  displayName: 'Install Node.js'

- script:
    npm install
  displayName: 'npm install'

- script:
    npm run build
  displayName: 'npm build'

- task: CopyFiles@2
  displayName: 'Copy files'
  inputs:
    sourceFolder: 'build'
    Contents: '**/*'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
    cleanTargetFolder: true

- task: ArchiveFiles@2
  displayName: 'Archive files'
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
    includeRootFolder: false
    archiveType: zip
    archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
    replaceExistingArchive: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Build Artifacts'
  inputs:
    pathtoPublish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
    ArtifactName: 'www' # output artifact named www

- task: AzureWebApp@1
  displayName: 'Deploy to App Service'
  inputs:
    azureSubscription: 'ReactJSRecipeAppConnection'
    appName: 'reactjsrecipeappwebapp2'
    appType: 'webApp'
    package: '$(System.ArtifactsDirectory)/$(Build.BuildId).zip'
    customWebConfig: '-Handler iisnode -NodeStartFile server.js -appType node'
So, when I run this, I get no errors in the Pipeline.
Further, I want to point out the following.
If I download the artifacts zip folder, I am able to run it locally and get my React app running on a localhost server just fine.
If I check my Azure web app via the Kudu tools, I see all the files inside wwwroot, with timestamps that match the zip file from the artifact folder. So I am assuming that the files are indeed getting pushed to the correct spot on the web server.
Before I ran the CI/CD trigger, these Azure web apps were brand new, and I got the standard Azure welcome/landing page. So the web apps themselves are fine.
After all this, the website itself does not serve the pages; I get a 404. I have tried two different web apps on Azure but get the same results.
Any advice on where I am going wrong?
Update 1
I decided to manually check the files via FileZilla. But it's empty!
Yet Kudu shows files. I don't understand!
Update 2
So, I did a direct deploy from Visual Studio Code with the artifact publish folder, and the web app runs fine. I did this step to make sure that the web app is configured correctly.
Alright, so it looks like my YAML file was not correct. I finally got it to work.
I am posting it here in case someone comes around looking for a ready-to-use React YAML file (because the Azure DevOps documentation is not that useful in its current form):
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
  displayName: 'npm install and build'

- task: CopyFiles@2
  inputs:
    Contents: 'build/**' # Pull the build directory (React)
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    pathtoPublish: $(Build.ArtifactStagingDirectory) # dist or build files
    ArtifactName: 'www' # output artifact named www

# Default archiveFile value: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)/build/'
    includeRootFolder: false

- task: AzureWebApp@1
  inputs:
    azureSubscription: 'ReactJSRecipeAppConnection'
    appName: 'ReactJSRecipeApp4'
    package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
The full repository, with simple React code including the YAML, is here:
https://github.com/Jay-study-nildana/ReactJSRecipeApp

Managing consistent Webpack builds in DevOps

Within our company we build and release a lot through DevOps for .NET Core applications.
For these applications, our setup is to have a single pipeline that builds the artifact, and the release pipeline then manages pushing this same artifact out through the various stages (test / UAT / staging / live).
The advantage we see here is that the package is the same, and it is just environment variables on the deployment target that allow for variation in how it runs, such as different 3rd-party endpoints, a different database, etc.
We are now looking to move a Vue.js application, built using Webpack, into DevOps to automate its builds and deployments too, but here is where we hit a conundrum.
We need to encapsulate the same variation in our solutions (different APIs, configs, etc.); this is currently managed by running npm run build:uat, npm run build:live, etc.
This works fine, but it means we would need to set up different builds for each environment and lose the reassurance that the package we put out for UAT is consistent with the release we put out for Live.
Are there any best practices around managing builds like this?
Possible options I can see, although open to others:
Build test / uat / live at the same time and then selectively copy the right code for the right environment
Different builds for each release
Can we have a single artifact build and then swap out the configuration variable in some other way?
Any support or advice would be appreciated.
Option 2 should be the simplest solution to understand and operate.
You do not need to keep using a single artifact. A release can be configured to trigger off of multiple artifact sources (builds), so you can simply combine multiple builds in one release.
You could create a task which adds a label (tag) to the current build if there is no other build running/queued.
Additionally, add a tag condition to the release trigger.
For more details, please take a look at boindiil's answer at this link: Combine multiple builds into one release
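For the tagging part, a rough sketch (it omits the 'no other build running/queued' check, and the tag name is just an example) is a script step that uses the build.addbuildtag logging command:

# Adds a tag to the current run; the release trigger can then filter on this tag.
- script: echo "##vso[build.addbuildtag]ready-for-release"
  displayName: 'Tag build for release trigger'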
We ended up going down the following route. We would ideally have had a single build, but as the build for each environment is done at the same time from the same npm install etc., I'm happy with this as a compromise.
trigger:
- release/*

variables:
- template: config/app-names.yml

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '12.x'
  displayName: 'Install Node.js'

- script:
    npm install
  displayName: 'npm install'

- script:
    npm run build:test -- --vn "$(Build.SourceBranchName)" --n "$(Build.RequestedFor)"
  displayName: 'npm build test'

- task: ArchiveFiles@2
  displayName: 'Archive test'
  inputs:
    rootFolderOrFile: 'dist/$(TestAppName)'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: 'dist/test/$(TestAppName).zip'
    replaceExistingArchive: true

- script:
    npm run build:acc -- --vn "$(Build.SourceBranchName)" --n "$(Build.RequestedFor)"
  displayName: 'npm build acc'

- task: ArchiveFiles@2
  displayName: 'Archive acc'
  inputs:
    rootFolderOrFile: 'dist/$(AccAppName)'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: 'dist/acc/$(AccAppName).zip'
    replaceExistingArchive: true

- script:
    npm run build:live -- --vn "$(Build.SourceBranchName)" --n "$(Build.RequestedFor)"
  displayName: 'npm build live'

- task: ArchiveFiles@2
  displayName: 'Archive live'
  inputs:
    rootFolderOrFile: 'dist/$(LiveAppName)'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: 'dist/live/$(LiveAppName).zip'
    replaceExistingArchive: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish test build'
  inputs:
    PathtoPublish: 'dist/test/'
    ArtifactName: 'test'
    publishLocation: 'Container'

- task: PublishBuildArtifacts@1
  displayName: 'Publish acc build'
  inputs:
    PathtoPublish: 'dist/acc/'
    ArtifactName: 'acc'
    publishLocation: 'Container'

- task: PublishBuildArtifacts@1
  displayName: 'Publish live build'
  inputs:
    PathtoPublish: 'dist/live/'
    ArtifactName: 'live'
    publishLocation: 'Container'

Can you copy files from one VSTS build agent to another?

Is it possible to copy files from one build agent to another and kick that off as part of a pipeline task?
One build agent is Linux, but I need to continue my work on a Windows agent.
Following Hanna's solution, here is a more detailed working solution:
Agent-1 and Agent-2 are two different machines from different agent pools.
Agent-1 runs two steps:
CopyFiles - copies the file 'livne.txt' (can be any pattern) from the default working directory into the artifact staging directory
PublishPipelineArtifact - publishes an artifact called 'PROJECT_NAME' containing all the files copied into the artifact staging directory
Agent-2 runs one main task:
DownloadPipelineArtifact - downloads the file 'livne.txt' (can be any pattern) from the 'PROJECT_NAME' artifact into the current agent's artifact staging directory
A simple Bash script then makes sure 'livne.txt' indeed exists in that directory.
jobs:
- job: Job_1
  pool:
    name: Agent-1
  steps:
  - task: CopyFiles@2
    displayName: 'Copy txt file'
    inputs:
      SourceFolder: '$(system.DefaultWorkingDirectory)'
      Contents: livne.txt
      TargetFolder: '$(build.ArtifactStagingDirectory)'
  - task: PublishPipelineArtifact@1
    displayName: 'Publish Pipeline Artifact'
    inputs:
      targetPath: '$(build.ArtifactStagingDirectory)'
      artifact: 'PROJECT_NAME'

# The second job's name is illustrative; the original snippet only shows its dependsOn.
- job: Job_2
  dependsOn: Job_1
  pool:
    name: Self Hosted Ubuntu for Docker Multiplatform
  steps:
  - task: DownloadPipelineArtifact@2
    displayName: 'Download Pipeline Artifact'
    inputs:
      artifactName: 'PROJECT_NAME'
      itemPattern: livne.txt
      targetPath: '$(build.ArtifactStagingDirectory)'
  - bash: |
      ls $(build.ArtifactStagingDirectory)
      cat $(build.ArtifactStagingDirectory)/livne.txt
    displayName: 'Bash Script'
I think the best way to do this is typically to publish the files as artefacts of the pipeline and then download those artefacts again on the second agent. I have done this in projects before when one machine is consuming test results from the test agent to build reports.
You might imagine your pipeline would look something like this:
- job: Build
  displayName: Build on Linux
  steps:
  ...
  - task: PublishPipelineArtifact@1
    displayName: Publish Built binaries from Linux
    inputs:
      path: $(Build.SourcesDirectory)/bin/
      artifact: Binaries

- job: Additional
  displayName: Do something with the binaries on windows
  steps:
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: Binaries
      targetPath: $(Pipeline.Workspace)/Binaries
  ...
I hope this helps! :)