How to fix my Azure deployment issue by rearranging my azure-pipelines.yml? - azure-devops

I am trying to set up a pipeline using Azure cloud and DevOps, but after a successful build the deployment fails with the error below. How can I solve this issue?
I read an excellent article about this: "http://www.alessandromoura.com.br/2020/04/23/azure-devops-publish-and-download-artifacts/"
I applied its rule sets, but I always get this error:
Error: No package found with specified pattern: D:\a\r1\a***.zip
Check if the package mentioned in the task is published as an artifact in the build or a previous stage and downloaded in the current job.
azure-pipelines.yml:
# Docker
# Build and push an image to Azure Container Registry
# https://learn.microsoft.com/azure/devops/pipelines/languages/docker
trigger:
- main
resources:
- repo: self
variables:
  # Container registry service connection established during pipeline creation
  dockerRegistryServiceConnection: 'xxxxx'
  imageRepository: 'xxxhelloaspnetcore'
  containerRegistry: 'xxxcontainer01.azurecr.io'
  dockerfilePath: '$(Build.SourcesDirectory)/Dockerfile'
  tag: '$(Build.BuildId)'
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
stages:
- stage: Build
  displayName: Build and push stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: Docker@2
      displayName: Build and push an image to container registry
      inputs:
        command: buildAndPush
        repository: $(imageRepository)
        dockerfile: $(dockerfilePath)
        containerRegistry: $(dockerRegistryServiceConnection)
        tags: |
          $(tag)
    - download: none
    - task: DownloadPipelineArtifact@2
      displayName: 'Download Build Artifacts'
      inputs:
        patterns: '**/*.zip'
        path: '$(Build.ArtifactStagingDirectory)'
    - task: PowerShell@2
      displayName: 'Debug parameters'
      inputs:
        targetType: Inline
        script: |
          Write-Host "$(Build.ArtifactStagingDirectory)"
          Write-Host "$(System.DefaultWorkingDirectory)"
          Write-Host "$(System.ArtifactsDirectory)"
          Write-Host "$(Pipeline.Workspace)"
          Write-Host "$(System.ArtifactsDirectory)"

Your pipeline builds and pushes an image to a container registry, so there are no pipeline artifacts in it. That is why DownloadPipelineArtifact throws the error.
DownloadPipelineArtifact only makes sense if you used PublishPipelineArtifact before. This doc - Publish and download artifacts in Azure Pipelines - describes it very well.
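As a minimal sketch of that pairing (the artifact name myDrop is an assumption), the publish step lives in the build job and the download step in a later job or stage:

```yaml
# In the build job: publish the build output as a pipeline artifact.
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifact: 'myDrop'   # hypothetical artifact name

# In a later job or stage: download what was published above.
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'myDrop'
    path: '$(Build.ArtifactStagingDirectory)'
```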
There is also a way to download artifacts from another pipeline, but that requires a pipeline resource, and your pipeline doesn't define one.
So everything works as expected. Can you explain what you actually want to download and what you hope to achieve by it?
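For illustration, the cross-pipeline route would require a pipeline resource roughly like this (the source pipeline name and artifact name are hypothetical):

```yaml
resources:
  pipelines:
  - pipeline: upstream            # alias referenced in download steps
    source: my-build-pipeline     # hypothetical name of the other pipeline

steps:
- download: upstream              # downloads artifacts from that pipeline's run
  artifact: myDrop                # hypothetical artifact name
```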

Related

How to Create Azure Devops Release pipeline from Github to DevOps Using Yaml?

Need Help With Fixing this Yaml Code for Release Pipeline in Azure Synapse that Would Take json Templates from GitHub Branch and Deploy Them to Azure Synapse
Error While Pipeline Run : Error during execution: Error: No file found with this pattern
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- main
pool:
  vmImage: ubuntu-latest
variables:
  system.debug: true
stages:
- stage: deploy
  displayName: 'Deploy to Synapse Workspace'
  jobs:
  - job: deploy
    displayName: 'Synapse Workspace Deployment'
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: AzureSynapseWorkspace.synapsecicd-deploy.synapse-deploy.Synapse workspace deployment@1
      displayName: 'Synapse deployment task for workspace: synapse-qa-van '
      inputs:
        TemplateFile: 'github.com/PromitXI/Synapse-dev/blob/workspace_publish/synapse-data-dev-van/TemplateParametersForWorkspace.json'
        ParametersFile: 'github.com/PromitXI/Synapse-dev/blob/workspace_publish/synapse-data-dev-van/TemplateParametersForWorkspace.json'
        azureSubscription: 'Promit''s Cloud (58xxxxxxxxxxf09-82b8-882exxxxxxxx)'
        ResourceGroupName: 'RG-Van-Test-DataPlatform-westus3'
        TargetWorkspaceName: 'synapse-qa-van '
        workspacePublish: true
        workspacePublishPath: Synapse-dev
        workspacePublishRepository: workspace_publish
      timeoutInMinutes: 15
      retryCountOnTaskFailure: 1
I was trying to implement the YAML code in an Azure DevOps pipeline after connecting it to the GitHub repository.
I expected the ARM templates in the workspace_publish branch to get deployed to my Synapse QA instance.

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure Pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD, so I am using Azure DevOps pipeline environments with Approvals and Checks to mandate it.
The issue is that, when using a deploy job to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
trigger:
- master
# Don't run against PRs
pr: none
stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      displayName: Install node modules
    - script: |
        npm run build
      displayName: 'Build it'
    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true
  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'
          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"
          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'
          - bash: echo "Done"
If you are confused by the folder paths, you can add a few debug steps that print the known system variables, using a PowerShell task like this:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
      Write-Host "$(System.ArtifactsDirectory)"
You should simply publish the build-generated artifacts to a drop folder.
Kindly check the official doc on artifact selection; it explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none
- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify download: none in your steps.
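For completeness, that download only finds something if the install job publishes the zip first. A minimal sketch of the missing step (the artifact name drop is an assumption), placed at the end of the install job's steps:

```yaml
# Publish the zip created by ArchiveFiles@2 so the ToDev deployment
# job has a pipeline artifact to download.
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifact: 'drop'   # hypothetical artifact name
```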

Accessing Docker Images In A Multi-Stage Azure DevOps Pipeline

I am building an Azure DevOps pipeline and was trying out the multi-stage feature, which is defined in a yml file.
In the yml definition I have two stages: one builds docker images using a docker-compose command, and the second pushes these images to ACR. It seems this is not possible, as I haven't had any success accessing the recently built images from the first stage.
Here's a sample yml file:
stages:
- stage: Build
  displayName: Build image
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - publish: $(Build.ArtifactStagingDirectory)
      artifact: docker-images
    - task: DockerCompose@0
      inputs:
        containerregistrytype: 'Azure Container Registry'
        azureSubscription: '************'
        azureContainerRegistry: '************'
        dockerComposeFile: '**/docker-compose.yml'
        action: 'Build services'
        additionalImageTags: '$(Build.BuildId)'
- stage: Push
  displayName: Push image
  jobs:
  - job: Push
    displayName: Push
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - download: current
      artifact: docker-images
    - task: DockerCompose@0
      inputs:
        containerregistrytype: 'Azure Container Registry'
        azureSubscription: '************'
        azureContainerRegistry: '************'
        dockerComposeFile: '**/docker-compose.yml'
        action: 'Push services'
        additionalImageTags: '$(Build.BuildId)'
The question is: how do I access the docker images that were built in the previous stage? Where are they stored? I tried downloading $(Build.ArtifactStagingDirectory) from the first stage, but it didn't seem to contain them. The same applies if I have one stage but separate jobs.
If I do both build and push in one stage it works fine, but I want a separate stage for each.
First of all, you should always put the publish artifacts task at the end of the stage; otherwise you will just publish an empty folder.
Second, the docker-compose command builds and saves the image in the local Docker image store on the hosted machine. Nothing is output to the agent's artifacts folder $(Build.ArtifactStagingDirectory).
As a workaround to pass the Docker image between stages, you can use the docker image save command to save the image into the folder $(Build.ArtifactStagingDirectory), and use a publish artifacts task to publish the image to the Azure DevOps server. Then you can use download artifacts to download the image in the next stage.
You can check the example below:
1. In the Build stage, add a Docker@0 (version 0.*) task after the DockerCompose task that runs an image save command to save the image to the folder $(Build.ArtifactStagingDirectory):
- task: Docker@0
  displayName: 'Run a Docker command'
  inputs:
    containerregistrytype: 'Azure Container Registry'
    azureSubscription: '************'
    azureContainerRegistry: '************'
    action: 'Run a Docker command'
    customCommand: 'image save <imageName>:$(Build.BuildId) -o $(Build.ArtifactStagingDirectory)/imagetest.tar'
2. Put the publish artifact step at the end of the Build stage to publish the image:
- publish: $(Build.ArtifactStagingDirectory)
  artifact: docker-images
3. Now you can download the image archive file from the Build stage into the Push stage and run a docker load command to load the archived image. Once it is loaded, you can push it to ACR:
- download: current
  artifact: docker-images
- task: Docker@0
  displayName: 'Run a Docker command'
  inputs:
    containerregistrytype: 'Azure Container Registry'
    azureSubscription: '************'
    azureContainerRegistry: '************'
    action: 'Run a Docker command'
    customCommand: 'load --input $(Pipeline.Workspace)/docker-images/imagetest.tar'
Hope above helps!
You're specifying:
pool:
  vmImage: 'ubuntu-latest'
This means every stage pulls a fresh, blank VM image from Microsoft's hosted pipeline pool and runs its commands there. Nothing from your build persists between stages.
So the short answer is "you can't". If you want state to persist across jobs, you need to create a dedicated private agent.
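As a sketch of that alternative, assuming a self-hosted pool named MyPrivatePool (hypothetical), both stages would target the same pool so the local Docker image cache can survive between them, provided the same agent picks up both jobs:

```yaml
stages:
- stage: Build
  pool:
    name: MyPrivatePool   # hypothetical self-hosted agent pool
  jobs:
  - job: Build
    steps:
    # The built image stays in the agent's local Docker cache
    - script: docker build -t myimage:$(Build.BuildId) .
- stage: Push
  pool:
    name: MyPrivatePool   # same pool; the image persists only if the same agent runs this job
  jobs:
  - job: Push
    steps:
    - script: docker push myimage:$(Build.BuildId)
```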

Unable to resolve $(Release.ReleaseId) in Azure DevOps yml pipelines

When using the Releases tab in Azure DevOps web console, to create release definitions, the tasks can resolve $(Release.ReleaseId) inside of a bash task.
But if I instead do my deployment in the azure-pipelines.yml file and do echo $(Release.ReleaseId), I get null because the variable doesn't exist. How come?
Here is part of the yml file:
- stage: Deploy
  dependsOn: BuildAndPublishArtifact
  condition: succeeded('BuildAndPublishArtifact')
  jobs:
  - deployment: DeployToAWSDev
    displayName: My display name
    pool:
      vmImage: 'Ubuntu-16.04'
    environment: 'dev'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: MyArtifact
          - task: Bash@3
            inputs:
              targetType: 'inline'
              script: |
                echo $(Release.ReleaseId) # Nothing
Thanks for any help pointing me in the right direction on how to retrieve my release ID.
Refer to the documentation on predefined variables. There is no differentiation between "build" and "release" in a YAML pipeline, so Build.BuildId is the run's ID.
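A minimal sketch of using it inside the deploy steps (only the script content changes from the question's snippet):

```yaml
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      # Build.BuildId identifies the run, in both build and deployment jobs
      echo "Run ID: $(Build.BuildId)"
      echo "Run number: $(Build.BuildNumber)"
```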

How to capture and retain the artifact package version for Universal artifacts in azure pipelines for cd

I have an Azure DevOps CI/CD pipeline defined in YAML with two stages, CI and CD. The CI stage has one job called BuildandDeploy; the CD stage has one deployment job. I am using Universal artifacts to publish, and I download the same artifact in the CD phase with the UniversalPackages DevOps task. That task has an input variable called vstsPackageVersion, which is the package version shown in Universal artifacts. I know of two other variables that could be used, $(Build.BuildId) and $(Build.BuildNumber). As a temporary workaround I am hard-coding the package version for the Universal artifact.
I wasn't able to download the artifact with either of the built-in variables. Since CI and CD are in the same pipeline, is there any way to store and retrieve the package version of the artifact? Is there an identifier like latest that I could use to get the latest artifact from the Universal package feed?
# specific branch build with batching
trigger:
  batch: true
  branches:
    include:
    - master
stages:
- stage: CI
  jobs:
  - job: BuildAndPublish
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - script: |
        docker build -t $(dockerId).azurecr.io/$(imageName):$(version) .
        docker login -u $(dockerId) -p $(pswd) $(dockerId).azurecr.io
        docker push $(dockerId).azurecr.io/$(imageName):$(version)
    - task: Bash@3
      displayName: Initialize Helm Client - create local repo
      inputs:
        targetType: 'inline'
        script: 'helm init --client-only'
    - task: HelmDeploy@0
      displayName: Package helm chart
      inputs:
        connectionType: 'Kubernetes Service Connection'
        command: 'package'
        chartPath: 'my-helm-dir'
    - task: UniversalPackages@0
      displayName: Publish helm package to my-company-artifacts
      inputs:
        command: 'publish'
        publishDirectory: '$(Build.ArtifactStagingDirectory)'
        feedsToUsePublish: 'internal'
        vstsFeedPublish: '$(my-feed-guid)'
        vstsFeedPackagePublish: 'my-artifact-name'
        versionOption: patch
        packagePublishDescription: 'My helm package description'
- stage: CD
  jobs:
  - deployment: DeployJob
    displayName: Deploy Job
    pool:
      vmImage: Ubuntu-16.04
    environment: dev
    strategy:
      runOnce:
        deploy:
          steps:
          - task: UniversalPackages@0
            displayName: 'Universal download'
            inputs:
              command: download
              vstsFeed: '$(my-feed-name)'
              vstsFeedPackage: 'my-artifact-name'
              vstsPackageVersion: 0.0.32
          - task: ExtractFiles@1
            displayName: 'Extract files'
            inputs:
              archiveFilePatterns: '*.tgz'
              destinationFolder: 'my-folder'
              cleanDestinationFolder: true
The Universal Packages task is based on the az artifacts universal CLI, which does not support a "latest version" option, only specific versions (and this CLI is in preview).
As a workaround, you can use the REST API to retrieve the latest version and set it as a new variable, then use that variable in the download task.
For example, add a PowerShell task that gets the version number and sets the variable:
- powershell: |
    $head = @{ Authorization = "Bearer $env:TOKEN" }
    $url = "https://feeds.dev.azure.com/{organization}/_apis/packaging/Feeds/{feed-name}/packages/{package-guid}?api-version=5.0-preview.1"
    $package = Invoke-RestMethod -Uri $url -Method Get -Headers $head -ContentType application/json
    $latestVersion = ($package.versions.Where({ $_.isLatest -eq $True })).version
    Write-Host "The latest version is $latestVersion"
    Write-Host "##vso[task.setvariable variable=latestVersion]$latestVersion"
  env:
    TOKEN: $(system.accesstoken)
Now, use it in the download task:
vstsPackageVersion: $(latestVersion)
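Put together, the CD-stage download task would then look roughly like this (feed and package names carried over from the question):

```yaml
- task: UniversalPackages@0
  displayName: 'Universal download'
  inputs:
    command: download
    vstsFeed: '$(my-feed-name)'
    vstsFeedPackage: 'my-artifact-name'
    vstsPackageVersion: '$(latestVersion)'   # variable set by the PowerShell step
```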