I have made two different attempts to reference the correct pipeline when downloading specific artifacts, via two different built-in steps, and cannot seem to get either to work.
The first way errors in the oddest way I have ever seen: the run shows 'pending' when I select it, but when I go back one level to view the pipeline as a whole it shows a red X as if the run has failed, and when I try to dig into it I get no output.
#attempt 1
- download: $(PIPELINE_NAME)
The second way will "successfully" download the package in under a second, meaning it isn't actually downloading anything; the next step then fails because no package is found.
#attempt 2
- task: DownloadPipelineArtifact@2
  displayName: 'Download artifact test'
  inputs:
    buildType: specific
    project: <project name here>
    pipeline: '$(PIPELINE_NAME)' # this just doesn't work for some reason
    runVersion: specific
    runId: $(resources.pipeline.$(PIPELINE_NAME).runID)
    downloadPath: $(Pipeline.Workspace)
Is this just not meant to work? Neither of these will resolve the variables, and both fail. Does anyone have any other suggestions? Do these values NEED to be hard-coded, or am I missing some syntax? We have several nearly identical micro-services that I would like to share the same template between if I could.
According to the arguments documentation for the Download Pipeline Artifacts task, the value of the pipeline parameter should be the definition ID of the build pipeline, not its name. That may be the reason your parameter doesn't work.
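As for the first attempt: the - download: shortcut only works against a pipeline resource whose alias is a literal string known at template-compile time, so a runtime variable like $(PIPELINE_NAME) cannot be used there. A minimal sketch of that pattern (the alias and source name below are purely illustrative):

resources:
  pipelines:
    - pipeline: upstream                # literal alias; cannot be a runtime variable
      source: 'My-Upstream-Pipeline'    # name of the producing pipeline (illustrative)

steps:
  - download: upstream                  # downloads that run's published artifacts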
To download an artifact from another pipeline, you don't need to specify pipeline resources at all. This is my method of downloading artifacts from another pipeline:
Firstly, in my previous pipeline, I have a Publish pipeline artifact task:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifact: 'drop'
    publishLocation: 'pipeline'
And then, in another pipeline, I have a Download Pipeline Artifacts task:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: '$(project)'
    definition: '$(pipelineID)'
    buildVersionToDownload: 'specific'
    pipelineId: '$(buildID)'
    artifactName: 'drop'
    targetPath: '$(Pipeline.Workspace)'
Tip:
You can click "Settings" above the task and fill in the information with the UI if you don't need to use variables. The UI will produce pick lists of projects, pipelines, etc. that you can choose from.
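Since the question mentions sharing one template across several nearly identical micro-services, one possible approach (a sketch, not from the original answer; the template and parameter names are illustrative) is to pass the numeric definition ID into a step template as a compile-time parameter instead of a runtime variable:

# download-artifact.yml (illustrative template name)
parameters:
  - name: sourceDefinitionId
    type: string
  - name: artifactName
    type: string
    default: 'drop'

steps:
  - task: DownloadPipelineArtifact@2
    displayName: 'Download ${{ parameters.artifactName }}'
    inputs:
      buildType: 'specific'
      project: '$(System.TeamProject)'
      definition: '${{ parameters.sourceDefinitionId }}'   # numeric definition ID, not the pipeline name
      buildVersionToDownload: 'latest'
      artifactName: '${{ parameters.artifactName }}'
      targetPath: '$(Pipeline.Workspace)'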
Related
I have quite a bit of experience with Azure DevOps on-premise and know where the build artifacts are stored when I run a build pipeline. However, we are moving to Microsoft-hosted build agents, and won't have any on-prem build server to store the artifacts from the build.
My question is, how do I get the build artifacts created from a build pipeline processed by a Microsoft-hosted build agent? Ultimately, I would like to download those artifacts to a file share that we have on-prem. Is this something that can be done?
I created a build pipeline in Azure DevOps (not on-prem) and ran the build. I added the marketplace extension Publish Build Artifact to the build pipeline and I expected that an artifact would be published so that I could download it to our company server. No artifact was produced.
In Azure YAML-based pipelines you have two options: the pipeline artifacts method, where the artifacts are stored with and associated with the pipeline run, or the Universal Packages method, which requires additional setup and stores the artifacts in an internal feed. For pipeline artifacts you don't necessarily need to version the artifact, since it is contained inside the pipeline run. For Universal Packages, unless you are overwriting the artifact, you need to version each package uniquely using whatever versioning mechanism you want, either leveraging the app's source-code versioning or your own custom scheme, such as a timestamp.
Below is an example of a .NET build with both publish methods in a YAML pipeline.
# To run and store your build
- task: DotNetCoreCLI@2
  displayName: "Build Project"
  enabled: true
  inputs:
    command: "build"
    projects: $(SolutionPath)
    arguments: "--configuration Release --output $(Build.ArtifactStagingDirectory)"

# Here we start the post-build steps
# This archives the build artifacts, named by build ID.
- task: ArchiveFiles@2
  displayName: "Archive Artifacts"
  inputs:
    rootFolderOrFile: "$(Build.ArtifactStagingDirectory)"
    includeRootFolder: false
    archiveType: "zip"
    archiveFile: "$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip"
    replaceExistingArchive: true

# This publishes the artifact inside the pipeline, named by build ID.
- task: PublishBuildArtifacts@1
  displayName: "Publish Artifacts"
  inputs:
    PathtoPublish: "$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip"
    ArtifactName: "drop"
    publishLocation: "Container"

# This publishes the artifact to a Universal Packages feed, versioned by a date/timestamp variable set in another step.
- task: UniversalPackages@0
  inputs:
    command: 'publish'
    publishDirectory: '$(Build.ArtifactStagingDirectory)'
    feedsToUsePublish: 'internal'
    vstsFeedPublish: 'your feed'
    vstsFeedPackagePublish: 'azure-pipeline-dotnetapi'
    versionOption: 'custom'
    versionPublish: '$(setBuildValues.ApplicationVersion)'
    packagePublishDescription: '$(setBuildValues.appVersion)'
To download artifacts to a path, you can use the task below and pass your network location as the download path. I am not sure whether it accepts UNC paths; you might have to map the network path first (see the copy sketch after the task below).
steps:
  - task: DownloadBuildArtifacts@1
    displayName: 'Download Build Artifacts'
    inputs:
      buildType: "current"
      downloadType: "single"
      artifactName: "drop" # required when downloadType is 'single'; matches the artifact published above
      downloadPath: ${{ parameters.DownloadPath }} # INSERT YOUR DOWNLOAD PATH HERE.
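If DownloadPath is set to a local folder on the agent rather than the share itself, one option is to follow the download with a CopyFiles@2 step that pushes the files to the on-premises share. This is only a sketch, not from the original answer; the UNC path is illustrative and assumes the agent has network access and permissions to the share:

  - task: CopyFiles@2
    displayName: 'Copy downloaded artifacts to on-prem share'
    inputs:
      SourceFolder: ${{ parameters.DownloadPath }}          # same local path used by the download step above
      Contents: '**'
      TargetFolder: '\\fileserver\builds\$(Build.BuildId)'  # hypothetical UNC path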
I need to create a YAML-based Azure build pipeline that runs only the build step when it is triggered automatically by a PR, but runs the build task along with the archive and publish-artifact tasks when the same pipeline is run manually.
You can use Build.Reason (https://learn.microsoft.com/en-US/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#build-variables-devops-services) in the YAML pipeline to distinguish why a build was triggered, and set it as a condition on your stages/jobs/steps.
See the following example from one of our build pipelines:
- task: DotNetCoreCLI@2
  displayName: "Publish NuGet"
  condition: and(succeeded(), ne(variables['Build.Reason'], 'Schedule'))
  inputs:
    command: 'push'
    searchPatternPush: '$(Build.SourcesDirectory)/source/**/*.nupkg'
    nuGetFeedType: 'internal'
    feedPublish: 'MyFeed'
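Applied to the scenario in the question above (build always, archive and publish only on manual runs), the same Build.Reason pattern might look like the sketch below. $(SolutionPath) is assumed to be defined elsewhere, and 'Manual' and 'PullRequest' are documented Build.Reason values:

steps:
  - task: DotNetCoreCLI@2
    displayName: 'Build'
    inputs:
      command: 'build'
      projects: '$(SolutionPath)'

  # Archive and publish only when the pipeline was started manually,
  # so PR-triggered builds stop after the build step.
  - task: ArchiveFiles@2
    displayName: 'Archive Artifacts (manual runs only)'
    condition: and(succeeded(), eq(variables['Build.Reason'], 'Manual'))
    inputs:
      rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
      archiveType: 'zip'
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'

  - task: PublishBuildArtifacts@1
    displayName: 'Publish Artifacts (manual runs only)'
    condition: and(succeeded(), eq(variables['Build.Reason'], 'Manual'))
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
      ArtifactName: 'drop'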
I am trying to specify the "Build pipeline" value in the Download Build Artifacts task as a variable, like "$(Pipeline)", but the classic editor does not allow me to do so.
Is there any way to achieve this?
It looks like you can't use a variable in this input when you use the Classic Editor, but you can use a variable if you use a YAML pipeline:
variables:
  buildName: TestBuild

steps:
  - task: DownloadBuildArtifacts@0
    inputs:
      buildType: 'specific'
      project: 'xxxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx'
      pipeline: '$(buildName)'
      buildVersionToDownload: 'latest'
      downloadType: 'single'
      downloadPath: '$(System.ArtifactsDirectory)'
      artifactName: test
If you still want to use the Classic Editor, you can "hack" the system by exporting the build definition to JSON, replacing the value with a variable in the JSON, and then importing the build definition again.
Can we see a more complete screenshot of what you're trying to accomplish? There are two tasks for this, "Download Build Artifacts" and "Download Pipeline Artifact", and neither is giving me any trouble with a variable reference.
I am trying to download the latest available artifact for a given tag from the current build pipeline and branch, but I am getting the following error.
##[error]No builds currently exist in the pipeline definition supplied.
This is a three-stage pipeline for automation testing, with build, deploy, and run-tests stages. In the run-tests stage I am trying to download the most recently available artifact from the build stage, which might be from this run or from an earlier run.
If I leave the tags option out, it will try and fetch it from the last available run, but this artifact may not have been created then, hence my use of tags to try and filter it.
- task: DownloadPipelineArtifact@2
  displayName: 'Download Latest DLLs'
  inputs:
    source: 'specific'
    project: $(System.TeamProjectId)
    pipeline: $(System.DefinitionId)
    runVersion: 'latestFromBranch'
    runBranch: $(Build.SourceBranch)
    tags: 'myBuildTag'
    allowPartiallySucceededBuilds: true
    artifact: myArtifactName
    patterns: '**/IntegrationTests/**/*'
    path: '$(Agent.TempDirectory)\myArtifactName'
  continueOnError: true
Any help would be appreciated
Downloading latest Pipeline Artifact from branch based on Tag
I could reproduce this issue on my side.
I assume this is an issue with the DownloadPipelineArtifact task when it is used with tags in a multi-stage pipeline.
After much investigation, I found that if we use the DownloadPipelineArtifact task in a multi-stage pipeline like this:
- task: DownloadPipelineArtifact@2
  displayName: 'Download Latest DLLs'
  inputs:
    source: 'specific'
    project: $(System.TeamProjectId)
    pipeline: $(System.DefinitionId)
    runVersion: 'latestFromBranch'
    runBranch: $(Build.SourceBranch)
it will try to download the latest build on the specified branch. Because we are using multiple stages and the build stage succeeded in the current pipeline, the DownloadPipelineArtifact task tries to download the artifact from that build stage. However, the tag has not been added at this point; it only gets added after the pipeline completes.
In this case we receive the error No builds currently exist in the pipeline definition supplied., because the current run whose build stage produced the artifact has not been tagged yet, so its tag list is empty.
The key to this issue is that the multi-stage YAML puts the build and the tests in the same pipeline. This is different from a classic pipeline, where the DownloadPipelineArtifact task is not used until after the producing pipeline has completed.
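One possible workaround (a sketch, not part of the original answer): when the artifact you need was produced by an earlier stage of the same run, download it from the current run instead of querying a specific pipeline by tag, which avoids the error above:

- task: DownloadPipelineArtifact@2
  displayName: 'Download DLLs from current run'
  inputs:
    source: 'current'
    artifact: myArtifactName
    patterns: '**/IntegrationTests/**/*'
    path: '$(Agent.TempDirectory)\myArtifactName'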
I submitted this issue on the Azure DevOps tasks repository: https://github.com/microsoft/azure-pipelines-tasks/issues/13101. You can check that ticket for feedback.
Hope this helps.
We're in the process of moving our product to an Azure web app. We have an extensive existing pipeline containing multiple parallel jobs. One of these jobs compiles an ASP.NET web application; some others compile a Vue.js website. Currently, the results of the web application and the Vue projects are combined in a separate stage using a PowerShell script.
Now I can convert the publish step of the web application to generate a deployment package for Azure, but what is the best way to also add the Vue outputs into this package so I can deploy it to Azure correctly, without losing the parallel jobs? I cannot include these output files in my web application project, because they don't exist within the web application build job.
You can use the Publish Build Artifacts task to upload the build results of the web application and Vue projects to Azure DevOps, as @Krzysztof mentioned. Then you can add a new job to download the artifacts.
Please check the simple YAML example below.
In order to combine the build results, you can use the Extract Files task to extract the zipped artifacts and publish the unpacked artifacts in the Build_Web job. In the Combine job you can use the Copy Files task to copy the results of the Vue artifact into the web artifact folder, and then use the Archive Files task to pack the artifacts, which now contain the results of both the web and Vue applications.
The Combine job should depend on (dependsOn) the Build_Web and Build_Vue jobs.
jobs:
  - job: Build_Web
    pool:
      vmImage: "windows-latest"
    steps:
      - task: ExtractFiles@1
        inputs:
          archiveFilePatterns: '*.zip'
          destinationFolder: '$(Build.ArtifactStagingDirectory)\unzip'
      - task: PublishBuildArtifacts@1
        inputs:
          PathtoPublish: '$(Build.ArtifactStagingDirectory)\unzip'
          artifactName: webapp

  - job: Build_Vue
    pool:
      vmImage: "windows-latest"
    steps:
      - task: PublishBuildArtifacts@1
        inputs:
          PathtoPublish: 'path to the build results'
          artifactName: vueapp

  - job: Combine
    dependsOn:
      - Build_Web
      - Build_Vue
    pool:
      vmImage: "windows-latest"
    steps:
      - task: DownloadBuildArtifacts@0
        inputs:
          buildType: 'current'
          artifactName: webapp
          downloadPath: "$(System.ArtifactsDirectory)"
      - task: DownloadBuildArtifacts@0
        inputs:
          buildType: 'current'
          artifactName: vueapp
          downloadPath: "$(System.ArtifactsDirectory)"
      - task: CopyFiles@2
        inputs:
          SourceFolder: '$(System.ArtifactsDirectory)\vueapp'
          TargetFolder: 'path to web application result folder' # e.g. $(System.ArtifactsDirectory)\webapp\Content\d_C\a\1\s\AboutSite\AboutSite\obj\Release\netcoreapp2.0\PubTmp\Out\
      - task: ArchiveFiles@2
        inputs:
          rootFolderOrFile: $(System.ArtifactsDirectory)\webapp
          archiveType: 'zip'
          archiveFile: '$(Build.ArtifactStagingDirectory)/webapplication.zip'
The above example only shows the general idea. You can also move the ExtractFiles task to the Combine job; either way, you will have to use the Extract Files, Copy Files, and Archive Files tasks.
For the TargetFolder parameter in the Copy Files task, you can check the Download Build Artifacts log for the webapp artifact to get the full path.
You can use PublishPipelineArtifact@1 to create artifacts for your projects and then, in a separate job, DownloadPipelineArtifact@2. By defining the download path you can compose your final artifact (if mixing the projects is no more complicated than putting one inside the other), and then publish it as a build or pipeline artifact, depending on how you have organized your release.
# Download an artifact named 'WebApp' to 'bin' in $(Build.SourcesDirectory)
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'WebApp'
    path: $(Build.SourcesDirectory)/bin
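For completeness, the matching publish step in the producing job might look like this (a sketch; the 'WebApp' name and the path simply mirror the download example above):

# Publish the web application output as a pipeline artifact named 'WebApp'
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)/WebApp'   # illustrative path
    artifact: 'WebApp'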
Here you have more info about publishing and downloading artifacts.