Adding artifacts one level up in the root folder - azure-devops

I have an artifact containing onnxruntime.dll which is downloaded from the pipeline, and its folder structure is MyProject_x64_windows/bin/Nodes/onnxruntime.dll. I would like this file to be downloaded one level up, i.e. MyProject_x64_windows/bin/onnxruntime.dll.
I am not sure why it is getting downloaded at that level or how I can fix this. I can't copy the complete YAML, but am providing the part which I think is relevant:
variables:
  IppRoot: $(Build.SourcesDirectory)/packages/IPP
  ONNXRoot: $(Build.SourcesDirectory)/packages/ONNXRuntime

- stage: MyProject
  jobs:
  - job: MyProject_Build
    strategy:
      matrix:
        win:
          imageName: 'windows-2019'
          OrzRootSuffix: 'x64-windows-staticlib'
          osSuffix: 'windows'
          LibFT4222Suffix: 'windows'
          matlabVersion: '9.6.0-2'
          extraCmakeOptions: '-D MyProject_ONNX_SUPPORT=On
            -D ONNX_RUNTIME_ROOT:PATH=$(ONNXRoot)'
    pool:
      vmImage: $(imageName)
    steps:
    - checkout: self
      lfs: true
    - task: UniversalPackages@0
      displayName: 'Download pre-built ONNX Runtime headers and libraries'
      inputs:
        command: 'download'
        vstsFeed: 'MyProjectPackages'
        vstsFeedPackage: 'microsoft.ml.onxxruntime'
        vstsPackageVersion: '*' # use the latest
        downloadDirectory: '$(ONNXRoot)'
    - download: SCMockPipeline
      displayName: Download SCMock
      artifact: scmock
      condition: eq(variables['Agent.OS'], 'Windows_NT')
    - script: python -m pip install jinja2
      displayName: Install python jinja2 template engine
    - task: CMake@1
      displayName: CMake configure
      inputs:
        workingDirectory: '$(Build.BinariesDirectory)'
        cmakeArgs: '-G Ninja
          $(extraCmakeOptions)
          -DLibFT4222_ROOT=$(LibFT4222Root)
          -DIPP_ROOT=$(IppRoot)
          -DCMAKE_BUILD_TYPE=Release
          -DCMAKE_INSTALL_PREFIX=$(Build.ArtifactStagingDirectory)
          $(Build.SourcesDirectory)'
    - task: CMake@1
      displayName: CMake build
      inputs:
        workingDirectory: '$(Build.BinariesDirectory)'
        cmakeArgs: '--build . --target install'
    - task: PublishPipelineArtifact@1
      displayName: 'Publish MyProject'
      inputs:
        targetPath: $(Build.ArtifactStagingDirectory)
        artifactName: 'MyProject_x64-$(osSuffix)'
    - task: DownloadPipelineArtifact@2
      displayName: Download MyProject artifact
      inputs:
        artifact: 'MyProject_x64-$(osSuffix)'
        path: '$(Build.SourcesDirectory)/MyProject_x64-$(osSuffix)'

You can try to set the destination directory in the DownloadPipelineArtifact task to download the artifact to the bin folder.
- task: PublishPipelineArtifact@1
  displayName: 'Publish MyProject'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)/bin/Nodes/onnxruntime.dll
    artifactName: 'MyProject_x64-$(osSuffix)'
- task: DownloadPipelineArtifact@2
  displayName: Download MyProject artifact
  inputs:
    artifact: 'MyProject_x64-$(osSuffix)'
    path: '$(Build.SourcesDirectory)/MyProject_x64-$(osSuffix)/bin'
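Note that the publish task above would produce an artifact containing only the dll itself. If the goal is to keep the rest of the artifact and only move onnxruntime.dll up one level, an alternative (a sketch only, assuming the install step is what places the file under bin/Nodes inside $(Build.ArtifactStagingDirectory)) is to copy it up before publishing the whole staging directory:

# Sketch: adjust the staging layout before publishing.
# Assumes the dll currently lands at $(Build.ArtifactStagingDirectory)/bin/Nodes/onnxruntime.dll.
- task: CopyFiles@2
  displayName: 'Copy onnxruntime.dll up to bin'
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)/bin/Nodes'
    Contents: 'onnxruntime.dll'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/bin'
# CopyFiles copies rather than moves, so delete the bin/Nodes copy here if it should not be published.
- task: PublishPipelineArtifact@1
  displayName: 'Publish MyProject'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)
    artifactName: 'MyProject_x64-$(osSuffix)'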

Related

Copy a selection of files to a server using Azure Devops

I am trying to copy a selection of files to a destination folder on a target machine.
In my first version, I can already copy all files to the destination. For that, I use the following task to build an artifact.
steps:
- task: CopyFiles@2
  displayName: 'copy files'
  inputs:
    SourceFolder: $(workingDirectory)
    Contents: '**/files/*'
    flattenFolders: true
    targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(Build.ArtifactStagingDirectory)
    artifactName: files
Later I try to use that artifact for a deployment
- stage: Deploy
  displayName: 'Deploy files to destination'
  jobs:
  - deployment: VMDeploy
    displayName: 'download artifacts'
    pool:
      vmImage: 'ubuntu-latest'
    environment:
      name: local_env
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            displayName: 'download files'
            inputs:
              artifact: dags
              downloadPath: /opt/myfolder/files
This works perfectly fine for all files.
But what I need is the following:
The 'local_env' environment contains multiple servers. The first three letters of each server name would be the perfect wildcard for the files I need.
In other words, if the environment contains names such as 'Capricorn', 'Aries' and 'Pisces', I would like to copy 'cap*.*', 'ari*.*' or 'pis*.*' onto the corresponding server.
The way I fixed it for now was:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      HN=$(hostname | head -c 3)
      cd /opt/myfolder/files/
      rm -r $(ls -I "$HN*.*")
It does its job, but I am open to marking a better solution as the resolution.
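If it helps, a minimal alternative sketch (the staging path is an assumption) downloads the artifact to a separate location and copies across only the files whose names start with the host's three-letter prefix, instead of deleting the rest afterwards:

- task: DownloadPipelineArtifact@2
  displayName: 'download files to a staging location'
  inputs:
    artifact: dags
    downloadPath: '$(Pipeline.Workspace)/dags'
- task: Bash@3
  displayName: 'copy only files matching this host prefix (sketch)'
  inputs:
    targetType: 'inline'
    script: |
      # Keep the same prefix logic as above: first three letters of the hostname.
      HN=$(hostname | head -c 3)
      mkdir -p /opt/myfolder/files
      cp "$(Pipeline.Workspace)/dags/"$HN*.* /opt/myfolder/files/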

Azure Pipeline Copy Secure file into build folder

I've a Vite/Svelte project which uses .env files for environment settings. I also have an Azure Pipeline which contains a secure file .env.staging; this file is on the .gitignore list of the associated repo. I'd like to download this secure file, copy it to my build directory and then have its contents read when I run vite build --mode staging (well, npm run build:staging, which includes vite build...).
When run locally from my machine, npm run build:staging works as expected and reads the .env.staging file; however, it seems to get ignored when used in the pipeline. Am I doing anything wrong?
Here's my YAML:
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: DownloadSecureFile@1
  name: "dotenvStaging"
  inputs:
    secureFile: '.env.staging'
  displayName: "Download .env.staging"
- task: NodeTool@0
  inputs:
    versionSpec: 14.15.4
  displayName: "Install Node.JS"
- task: CopyFiles@2
  inputs:
    contents: "$(Agent.TempDirectory)/.env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"
- script: npm install
  displayName: "npm install"
- script: npm run build:staging
  displayName: "npm run build:staging"
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'dist'
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    #replaceExistingArchive: true
    #verbose: # Optional
    #quiet: # Optional
  displayName: "Create archive"
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    ArtifactName: 'drop'
    publishLocation: 'Container'
  displayName: "Publish archive"
I'm not sure if CopyFiles@2 is doing what I expect, as it just matches the contents parameter and copies whatever files match, which could be 0 if I'm writing it wrong...
Another note: I also tried using $(dotenvStaging.secureFilePath) as the contents parameter, but it doesn't seem to do anything either.
Naturally, I figured it out as soon as I posted: I needed to update the CopyFiles part to specify sourceFolder; clearly it didn't like my absolute file path for contents.
- task: CopyFiles@2
  inputs:
    sourceFolder: "$(Agent.TempDirectory)"
    contents: ".env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"
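For reference, since the DownloadSecureFile@1 task above is named dotenvStaging, its secureFilePath output variable could be used instead of hard-coding $(Agent.TempDirectory); a small sketch, assuming the build still expects the file in $(Agent.BuildDirectory):

# Sketch: copy the downloaded secure file using the task's output variable.
- script: cp "$(dotenvStaging.secureFilePath)" "$(Agent.BuildDirectory)/.env.staging"
  displayName: "Import .env.staging (via secureFilePath)"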

How to copy specific folders/files to artifact?

I have the following build pipeline:
pool:
name: Azure Pipelines
demands:
- npm
- msbuild
steps:
- task: Npm#1
displayName: 'npm install'
inputs:
workingDir: Project123/Angular
verbose: false
steps:
- task: Npm#1
displayName: 'npm custom: angular build'
inputs:
command: custom
workingDir: Project123/Angular
verbose: false
customCommand: 'run-script build --prod --extractCss'
steps:
- task: NuGetCommand#2
displayName: 'NuGet restore'
steps:
- task: MSBuild#1
displayName: '.Net build'
inputs:
solution: 'Project123/*.csproj'
msbuildArchitecture: x64
configuration: Release
msbuildArguments: '/p:OutputPath=$(Build.ArtifactStagingDirectory)'
- task: CopyFiles#2
displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
inputs:
SourceFolder: Project123/Bundles
TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts#1
displayName: 'Publish Artifact: Release'
inputs:
ArtifactName: Release
When we build the project from VS manually, this is the (expected) artifact we deploy:
However, the YAML I have generates this artifact instead:
How do I produce the expected artifact?
I am actually surprised that AngularOutput was copied into the root of the artifact, and not into Bundles... I specified in the copy task to copy the Bundles folder, which would contain the AngularOutput...
Since I do not know the .csproj configuration, I will just provide a workaround based on the screenshots.
The Angular folder is not needed in this artifact.
The Web.Debug and Web.Release files are not needed in the artifact either.
Add a PowerShell task and delete the folder and files via the script below:
#Delete Angular folder and files
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Remove-Item '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Angular'
      Remove-Item '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Web.Debug.configure'
      Remove-Item '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Web.Release.configure'
The dlls, .pdb, .xml and .configure files are not needed here.
There needs to be a 'Bundles' directory here, which is generated by the Angular build task.
Check this Copy Files task: these files are in the folder Project123/Bundles, right?
- task: CopyFiles@2
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: Project123/Bundles
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
We need to change the Copy Files task as below:
- task: CopyFiles@2
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: Project123/Bundles
    TargetFolder: '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Bundles'
It will save these files in the Bundles folder instead of the root.
The AngularOutput folder needs to be located under the 'Bundles' directory inside the 'Project123' folder above.
We could copy the folder into the Project123 folder and then publish the artifact:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)/AngularOutput'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/AngularOutput'
You could run the YAML build below and check the result:
pool:
  name: Azure Pipelines
  demands:
  - npm
  - msbuild

steps:
- task: Npm@1
  displayName: 'npm install'
  inputs:
    workingDir: Project123/Angular
    verbose: false
- task: Npm@1
  displayName: 'npm custom: angular build'
  inputs:
    command: custom
    workingDir: Project123/Angular
    verbose: false
    customCommand: 'run-script build --prod --extractCss'
- task: NuGetCommand@2
  displayName: 'NuGet restore'
- task: MSBuild@1
  displayName: '.Net build'
  inputs:
    solution: 'Project123/*.csproj'
    msbuildArchitecture: x64
    configuration: Release
    msbuildArguments: '/p:OutputPath=$(Build.ArtifactStagingDirectory)'
- task: CopyFiles@2
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: Project123/Bundles
    TargetFolder: '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Bundles'
#Delete Angular folder and files
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Remove-Item '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Angular'
      Remove-Item '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Web.Debug.configure'
      Remove-Item '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Web.Release.configure'
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)/AngularOutput'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/AngularOutput'
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: Release'
  inputs:
    ArtifactName: Release
Update 1
If the issue is deleting the folder and files, please update the PowerShell script as below and try it again:
#Delete Angular folder and files
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Remove-Item -Path $(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Angular -Recurse -Force
      Remove-Item -Path $(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Web.Debug.configure -Recurse -Force
      Remove-Item -Path $(Build.ArtifactStagingDirectory)/_PublishedWebsites/Project123/Web.Release.configure -Recurse -Force
And the test result:

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure Pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline environments with Approvals and Checks to mandate this.
The issue is that, when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

trigger:
- master

# Don't run against PRs
pr: none

stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      displayName: Install node modules
    - script: |
        npm run build
      displayName: 'Build it'
    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true
  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'
          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"
          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'
          - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps to check the locations of the known system variables and understand what is going on, using a PowerShell script like the one below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
      Write-Host "$(System.ArtifactsDirectory)"
You should simply publish the build-generated artifacts to the drop folder.
Kindly check the official doc -- Artifact selection; it explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none
- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify download: none in your steps.
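Putting this together, a rough sketch of the deploy steps follows (assumptions: the zip was published as a pipeline artifact, the extracted archive contains the dist folder at its root, and the upload command mirrors the one from the question):

strategy:
  runOnce:
    deploy:
      steps:
      - download: none
      - task: DownloadPipelineArtifact@2
        displayName: 'Download Build Artifacts'
        inputs:
          buildType: 'current'
          patterns: '**/*.zip'
          path: '$(Pipeline.Workspace)'
      - task: ExtractFiles@1
        inputs:
          archiveFilePatterns: '$(Pipeline.Workspace)/**/*.zip'
          cleanDestinationFolder: true
          destinationFolder: '$(Pipeline.Workspace)/cpDist'
      - task: AzureCLI@2
        displayName: 'Copy build files to Development blob storage'
        inputs:
          azureSubscription: MYTEST-Development
          scriptType: 'bash'
          scriptLocation: 'inlineScript'
          inlineScript: |
            # Assumes the extracted archive contains the dist folder at its root.
            az storage blob upload-batch -d '$web' --account-name davey -s '$(Pipeline.Workspace)/cpDist/dist' --connection-string '<connection string>'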

Azure DevOps Multi Stage Pipeline Error: No package found with specified pattern: /home/vsts/work/1/s/**/*.zip - How do I fix?

I have an Azure DevOps Build (YAML) and Release Pipeline (Classic) successfully deploying to Azure.
I am trying to convert these 2 separate steps into a multi-stage YAML pipeline.
On the Azure App Service Deploy task (AzureRmWebAppDeployment@4), I am getting the following error:
No package found with specified pattern: /home/vsts/work/1/a/*.zip
Below is my multi-stage YAML pipeline:
stages:
- stage: Build
  jobs:
  - job: 'Build'
    pool:
      vmImage: 'windows-latest'
    variables:
      buildConfiguration: 'Release'
    steps:
    - task: DotNetCoreCLI@2
      displayName: Restore
      inputs:
        command: restore
        projects: '**/*.csproj'
        vstsFeed: 'dd55642d-8943-411f-8856-9714dd0da8af'
    - task: DotNetCoreCLI@2
      displayName: Build
      inputs:
        projects: '**/*.csproj'
        arguments: '--configuration $(buildConfiguration)'
    - task: DotNetCoreCLI@2
      displayName: Test
      inputs:
        command: test
        projects: '**/*[Tt]ests/*.csproj'
        arguments: '--configuration $(buildConfiguration)'
    - task: DotNetCoreCLI@2
      displayName: Publish
      inputs:
        command: publish
        publishWebProjects: false
        projects: '**/Tools.Client.Blazor.ServerApp.csproj'
        arguments: '--configuration $(buildConfiguration) --output $(build.artifactstagingdirectory)'
    - task: PublishSymbols@2
      displayName: 'Publish symbols path'
      inputs:
        SearchPattern: '**\bin\**\*.pdb'
        PublishSymbols: false
      continueOnError: true
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)\AzureDeploy'
      inputs:
        SourceFolder: AzureDeploy
        TargetFolder: '$(build.artifactstagingdirectory)\AzureDeploy'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: drop'
      inputs:
        PathtoPublish: '$(build.artifactstagingdirectory)'
      condition: succeededOrFailed()

- stage: Systest
  jobs:
  - job: 'Systest'
    variables:
      resourceGroupName: '$(appName)-rg-$(environment)'
      location: 'East US'
      appServiceName: '$(appName)-svc-$(environment)'
      appInsightsName: '$(appName)-ins-$(environment)'
      appServicePlanName: '$(appName)-asp-$(environment)'
      appName: 'tools'
      owner: 'Pod'
      environment: 'systest'
    steps:
    - task: AzureResourceManagerTemplateDeployment@3
      displayName: 'ARM Template deployment: Resource Group scope'
      inputs:
        azureResourceManagerConnection: 'Dev/Test Connection'
        subscriptionId: ''
        resourceGroupName: '$(resourceGroupName)'
        location: '$(location)'
        csmFile: '$(System.DefaultWorkingDirectory)/AzureDeploy/Tools.azureDeploy.json'
        csmParametersFile: '$(System.DefaultWorkingDirectory)/AzureDeploy/Tools.azureDeploy.parameter.json'
        overrideParameters: '-appServiceName "$(appServiceName)" -appInsightsName "$(appInsightsName)" -appServicePlanName "$(appServicePlanName)" -owner "$(owner)" -environment "$(environment)" -location "$(location)"'
    - task: AzureRmWebAppDeployment@4
      displayName: 'Azure App Service Deploy: $(appServiceName)'
      inputs:
        ConnectionType: 'AzureRM'
        azureSubscription: ''
        appType: 'webApp'
        WebAppName: '$(appServiceName)'
        packageForLinux: '$(Build.ArtifactStagingDirectory)/*.zip'
Any help / suggestions would be appreciated.
Because there are 2 stages, the second stage doesn't have the file you published in the first stage; you need to download it.
You can use Pipeline artifacts instead of build artifacts.
Pipeline artifacts provide a way to share files between stages in a pipeline or between different pipelines. They are typically the output of a build process that needs to be consumed by another job or be deployed. Artifacts are associated with the run they were produced in and remain available after the run has completed.
To publish (upload) an artifact for the current run:
steps:
- publish: $(build.artifactstagingdirectory)
  artifact: drop
And in the second stage, you download the artifact:
steps:
- download: current
  artifact: drop
You can also achieve it with build artifacts, downloading them with the DownloadBuildArtifacts@0 task.
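A minimal sketch of that build-artifact variant, assuming the artifact was published as 'drop' by the PublishBuildArtifacts@1 step in the Build stage above:

steps:
- task: DownloadBuildArtifacts@0
  displayName: 'Download Build Artifacts'
  inputs:
    buildType: 'current'
    downloadType: 'single'
    artifactName: 'drop'
    downloadPath: '$(System.ArtifactsDirectory)'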
During Publish it will not work like this. Instead of using the path "/home/vsts/work/1/a/*.zip", this path can be used: "$(System.DefaultWorkingDirectory)/_Releasepipelinename/drop/*.zip".