Deploy to another workspace in Synapse with CI/CD - Azure DevOps

Currently I am able to deploy to Synapse via the Azure DevOps pipeline that I built in the portal UI. I am trying to achieve the same result via a YAML file, but I am encountering the error "[error]Encountered with exception: Error: No file found with this pattern". I have the following code:
trigger:
  branches:
    include:
      - workspace_publish

pool:
  name: Azure Pipelines

stages:
  - stage: Build
    displayName: Build stage
    jobs:
      - job: Deploying
        steps:
          - task: CopyFiles@2
            displayName: 'Copy ARM Template Files to: $(Build.ArtifactStagingDirectory)'
            inputs:
              SourceFolder: testsaws
              Contents: '*json'
              archiveFile: '$(Build.ArtifactStagingDirectory)'
              TargetFolder: '$(Build.ArtifactStagingDirectory)'
          - task: PublishPipelineArtifact@1
            displayName: 'Publish Pipeline Artifact'
            inputs:
              #PathtoPublish: ./testsaws
              targetPath: '$(Build.ArtifactStagingDirectory)'
              artifact: 'ASA_Drop'
          - task: AzureSynapseWorkspace.synapsecicd-deploy.synapse-deploy.Synapse workspace deployment@1
            displayName: 'Synapse deployment task for workspace: qasaws'
            inputs:
              azureSubscription: 'dataplatform-qa-group-SPN'
              subscriptionId: 'XXX'
              action: 'Create Or Update Resource Group'
              resourceGroupName: 'dataplatform-qa-group'
              location: 'West US 2'
              TemplateFile: '$(Build.ArtifactStagingDirectory)/ASA_Drop/ARM/TemplateForWorkspace.json'
              ParametersFile: '$(Build.ArtifactStagingDirectory)/ASA_Drop/ARM/TemplateParametersForWorkspaceQA.json'
Any help or suggestions in finding the problem are welcome and appreciated.

When in doubt, print out the contents. I used a PowerShell task (template below) to print the contents of the current working directory and found that the artifact didn't contain any files.
# PowerShell
# Run a PowerShell script on Linux, macOS, or Windows
- task: PowerShell@2
  inputs:
    #targetType: 'filePath' # Optional. Options: filePath, inline
    #filePath: # Required when targetType == FilePath
    #arguments: # Optional
    #script: '# Write your PowerShell commands here. Write-Host Hello World' # Required when targetType == Inline
    #errorActionPreference: 'stop' # Optional. Options: default, stop, continue, silentlyContinue
    #warningPreference: 'default' # Optional. Options: default, stop, continue, silentlyContinue
    #informationPreference: 'default' # Optional. Options: default, stop, continue, silentlyContinue
    #verbosePreference: 'default' # Optional. Options: default, stop, continue, silentlyContinue
    #debugPreference: 'default' # Optional. Options: default, stop, continue, silentlyContinue
    #failOnStderr: false # Optional
    #ignoreLASTEXITCODE: false # Optional
    #pwsh: false # Optional
    #workingDirectory: # Optional
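The exact script used isn't shown above, so here is a minimal sketch of how that task can be filled in to dump the working and staging directories; the display name and the choice of directories are my assumptions:
# Hypothetical inline variant of the task above: recursively list the working
# directory and the artifact staging directory to see which files were copied.
- task: PowerShell@2
  displayName: 'Debug: list pipeline directories'
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Contents of $(System.DefaultWorkingDirectory):"
      Get-ChildItem -Path '$(System.DefaultWorkingDirectory)' -Recurse | Select-Object -ExpandProperty FullName
      Write-Host "Contents of $(Build.ArtifactStagingDirectory):"
      Get-ChildItem -Path '$(Build.ArtifactStagingDirectory)' -Recurse | Select-Object -ExpandProperty FullName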
Then I changed the following in the YAML file, pointing TemplateFile and ParametersFile at the checked-out repository folder instead of the staging directory:
- task: PublishPipelineArtifact@1
  displayName: 'Publish Pipeline Artifact'
  inputs:
    #PathtoPublish: ./testsaws
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifact: 'ASA_Drop'
- task: AzureSynapseWorkspace.synapsecicd-deploy.synapse-deploy.Synapse workspace deployment@1
  displayName: 'Synapse deployment task for workspace: qasaws'
  inputs:
    azureSubscription: 'qa-group-SPN'
    subscriptionId: 'XXX'
    action: 'Create Or Update Resource Group'
    resourceGroupName: 'qa-group'
    location: 'West US 2'
    TemplateFile: 'testsaws/TemplateForWorkspace.json'
    ParametersFile: 'testsaws/TemplateParametersForWorkspaceQA.json'
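For what it's worth, the original CopyFiles@2 step may also have contributed to the empty artifact: '*json' only matches files directly under testsaws (not in subfolders), and archiveFile is not a CopyFiles@2 input. A sketch of a corrected copy step, assuming the JSON templates live in subfolders of testsaws (untested against this repo layout):
# Hypothetical fix for the original copy step: recurse into subfolders and
# drop the stray archiveFile line, which CopyFiles@2 does not accept.
- task: CopyFiles@2
  displayName: 'Copy ARM Template Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: testsaws
    Contents: '**/*.json'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'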

Related

How can I include and use a PowerShell file from another repo

I want to use a generic PowerShell file from a specific repo in multiple build pipelines. I want to use this file in my release stages to set our servers to maintenance mode and back to running.
At the moment I check out the second repo in my build stage, but the DotNetCoreCLI@2 restore task also does a checkout and my PowerShell file disappears.
steps:
  - checkout: templates
  - checkout: self
  - task: CopyFiles@2
    displayName: 'Copy ADC Powershell script to: $(Build.ArtifactStagingDirectory)'
    inputs:
      SourceFolder: '$(Build.SourcesDirectory)/MYREPONAME/scripts/'
      Contents: changestate.ps1
      TargetFolder: $(Build.ArtifactsDirectory)
And then the task that runs the script doesn't find my PowerShell file:
- task: PowerShell@2
  displayName: 'Set $(Agent.MachineName) to maintenance'
  inputs:
    targetType: filePath
    filePath: '$(System.ArtifactsDirectory)/${{ parameters.artifactName }}/changestate.ps1'
    arguments: '-server $(AGENT.MACHINENAME) -state "maintenance"'
Does anyone have a better idea of how I can do that?
Thanks for the help.
Update: solved it this way
I used a separate stage and the publish task:
- checkout: templates
- task: CopyFiles@2
  displayName: 'Copy ADC Powershell script to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/scripts/'
    Contents: '*.ps1'
    TargetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishBuildArtifacts@1
  displayName: "Publish artifact"
  inputs:
    pathtoPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: '${{ parameters.artifactName }}'
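For completeness, a sketch of how the published script could then be consumed in a later stage; the download location under $(Pipeline.Workspace) and the reuse of the same artifactName parameter are assumptions on my part:
# Hypothetical consuming steps in a release stage: fetch the published
# artifact and run the maintenance script from the downloaded location.
- download: current
  artifact: ${{ parameters.artifactName }}
- task: PowerShell@2
  displayName: 'Set $(Agent.MachineName) to maintenance'
  inputs:
    targetType: filePath
    filePath: '$(Pipeline.Workspace)/${{ parameters.artifactName }}/changestate.ps1'
    arguments: '-server $(Agent.MachineName) -state "maintenance"'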

Azure Pipeline - Self Hosted Agent YAML how to copy release files to my Azure VM

I have an Azure Pipeline using a pool with a self-hosted agent, and the following YAML is working great:
# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/apps/windows/dot-net

trigger:
  - master

# When using the Default pool, the organization pool I created as the
# self-hosted agent CustomAgentSelfInsurance will be used
pool:
  name: Default
  #vmImage: 'windows-latest'
  #container: mcr.microsoft.com/windows/servercore:ltsc2019

variables:
  - group: 'CertPass'
  #solution: '**/SelfInsurance.sln'
  #buildPlatform: 'Any CPU'
  #buildConfiguration: 'Release'
  #pathToMageTool: "\"C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v10.0A\\bin\\NETFX 4.8 Tools\\mage.exe\""

steps:
  - task: DownloadSecureFile@1
    displayName: Download Pfx
    name: myCertificatePfx
    inputs:
      secureFile: ventasmlcert.pfx
  - task: DownloadSecureFile@1
    displayName: Download sni
    name: snInstallPfx
    inputs:
      secureFile: SnInstallPfx.exe
  - task: DownloadSecureFile@1
    displayName: Download Testpipelineapp
    name: testPipelineapp
    inputs:
      secureFile: TestPipelineapp.zip
  - task: ExtractFiles@1
    inputs:
      archiveFilePatterns: '$(Agent.TempDirectory)/TestPipelineapp.zip'
      cleanDestinationFolder: false
      overwriteExistingFiles: true
  #- task: PowerShell@2
  #  env:
  #    SN_INSTALL_PFX: $(snInstallPfx.secureFilePath)
  #    MYCERTIFICATE_PFX: $(myCertificatePfx.secureFilePath)
  #    MYCERTIFICATE_PFX_PASSWORD: $(certpass)
  #  inputs:
  #    targetType: 'inline'
  #    script: '&"$($ENV:SN_INSTALL_PFX)" "$($ENV:MYCERTIFICATE_PFX)" "$($ENV:MYCERTIFICATE_PFX_PASSWORD)"'
  - task: NuGetToolInstaller@1
  - task: NuGetCommand@2
    inputs:
      restoreSolution: '$(solution)'
  - task: VSBuild@1
    displayName: 'Build .csproj file'
    inputs:
      solution: '$(solution)'
      msbuildArgs: '/p:DeployOnBuild=true /p:SkipInvalidConfigurations=false /p:OutDir="$(System.DefaultWorkingDirectory)\publish_output"'
      platform: '$(buildPlatform)'
      configuration: '$(buildConfiguration)'
  # Command line
  # Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
  - task: CmdLine@2
    inputs:
      script: TestPipelineapp.exe #'echo Write your commands here.'
      #workingDirectory: # Optional
      #failOnStderr: false # Optional
What I need is to create a task that takes my build release files and copies them to a folder on my Azure VM.
Any clue or example of how I can do that?
You can use an Environment to configure your VM so that Azure DevOps has access to it, and then use a deployment job targeting that environment to run a download task and copy the files to your folder.
Docs: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops
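A minimal sketch of that approach; the environment name, artifact name, and target folder below are assumptions, and the VM is assumed to be registered as a virtual machine resource in the environment:
# Hypothetical deployment stage: the job runs on the agent that was installed
# on the VM when it was registered in the 'my-vm-env' environment.
- stage: DeployToVM
  jobs:
    - deployment: CopyToVM
      environment:
        name: my-vm-env              # assumed environment name
        resourceType: VirtualMachine
      strategy:
        runOnce:
          deploy:
            steps:
              - download: current    # pipeline artifacts land under $(Pipeline.Workspace)
                artifact: drop       # assumed artifact name published by the build
              - task: CopyFiles@2
                displayName: 'Copy release files to a folder on the VM'
                inputs:
                  SourceFolder: '$(Pipeline.Workspace)/drop'
                  Contents: '**'
                  TargetFolder: 'C:\deploy\release'   # assumed target folder on the VM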

How to use an output variable value from one PowerShell script in another PowerShell script

Hi all, I'm working on a YAML release pipeline where I have two PowerShell scripts:
test1.ps1
test2.ps1
Step 1) In test1.ps1 I'm setting an output variable using:
$link="google.com"
Write-Host "##vso[task.setvariable variable=packageurl]$link"
Step 2) Now I want to use this packageurl variable value in the test2.ps1 script.
The YAML code looks like this:
- task: AzurePowerShell@3
  displayName: 'Query Latest Package'
  inputs:
    azureSubscription: 'test1'
    ScriptType: FilePath
    ScriptPath: 'source\test1.ps1'
- task: PowerShell@2
  displayName: 'Download package'
  inputs:
    targetType: filePath
    filePath: 'source\test2.ps1'
So basically I have to pass the output variable value from the first task to the second task via the PowerShell scripts.
I also tried to follow this link: https://developercommunity.visualstudio.com/content/problem/676342/how-to-use-output-variable-from-a-powershell-scrip.html
Can anyone help me with this?
Thanks in advance.
This works as you expect:
trigger: none
pr: none

pool:
  vmImage: 'windows-latest'

steps:
  - task: AzurePowerShell@3
    displayName: 'Query Latest Package'
    inputs:
      azureSubscription: 'rg-the-code-manual'
      ScriptType: FilePath
      ScriptPath: 'stackoverflow\94-variables\script-1.ps1'
      azurePowerShellVersion: LatestVersion
  - task: PowerShell@2
    displayName: 'Download package'
    inputs:
      targetType: filePath
      filePath: 'stackoverflow\94-variables\script-2.ps1'
The YAML file is as you have it. script-1.ps1 sets the variable:
$link="google.com"
Write-Host "##vso[task.setvariable variable=packageurl]$link"
And script-2.ps1 reads that variable; a value set with task.setvariable in one task is exposed to subsequent tasks as an environment variable with an uppercased name:
Write-Host $env:PACKAGEURL
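If the value were ever needed directly in the YAML rather than inside the script, it could also be passed explicitly with macro syntax; a small sketch (the -PackageUrl parameter of script-2.ps1 is hypothetical):
# Hypothetical alternative: hand the pipeline variable to the script as an argument.
- task: PowerShell@2
  displayName: 'Download package'
  inputs:
    targetType: filePath
    filePath: 'stackoverflow\94-variables\script-2.ps1'
    arguments: '-PackageUrl "$(packageurl)"'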

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure Pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline environments with Approvals and Checks to mandate this.
The issue is that when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

trigger:
  - master

# Don't run against PRs
pr: none

stages:
  - stage: Development
    displayName: Development stage
    jobs:
      - job: install
        displayName: Install and test
        pool:
          vmImage: $(vmImageName)
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '12.x'
            displayName: 'Install Node.js'
          - script: |
              npm install
            displayName: Install node modules
          - script: |
              npm run build
            displayName: 'Build it'
          # Build creates a ./dist folder. The contents will need to be copied to blob store
          - task: ArchiveFiles@2
            inputs:
              rootFolderOrFile: '$(Build.BinariesDirectory)'
              includeRootFolder: true
              archiveType: 'zip'
              archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
              replaceExistingArchive: true
              verbose: true
      - deployment: ToDev
        environment: development
        dependsOn: install
        strategy:
          runOnce:
            deploy:
              steps:
                - task: DownloadPipelineArtifact@2
                  inputs:
                    buildType: 'current'
                    targetPath: '$(Pipeline.Workspace)'
                - task: ExtractFiles@1
                  inputs:
                    archiveFilePatterns: '**/*.zip'
                    cleanDestinationFolder: true
                    destinationFolder: './cpDist/'
                # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
                - task: AzureCLI@2
                  inputs:
                    azureSubscription: MYTEST-Development
                    scriptLocation: "inlineScript"
                    scriptType: "bash"
                    inlineScript: |
                      az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
                  displayName: "Copy build files to Development blob storage davey"
                - script: |
                    pwd
                    ls
                    cd cpDist/
                    pwd
                    ls -al
                  displayName: 'list'
                - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps to check the values of the known system variables and understand what is going on, using a PowerShell script like the one below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
      Write-Host "$(System.ArtifactsDirectory)"
You should simply publish the build-generated artifacts to a drop folder.
Kindly check the official doc on artifact selection; it explains that you can define the path to download the artifacts to with the following task:
steps:
  - download: none
  - task: DownloadPipelineArtifact@2
    displayName: 'Download Build Artifacts'
    inputs:
      patterns: '**/*.zip'
      path: '$(Build.ArtifactStagingDirectory)'
Please be aware that in a deployment job the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify "download: none" in your steps.
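Putting the two answers together, a sketch of what the deploy steps could look like; the extraction folder and the assumption that the service connection's identity has blob data access to the storage account are mine:
# Hypothetical deploy steps: skip the implicit download, fetch only the zip,
# extract it to a known folder, then upload the dist folder from the archive.
- download: none
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'current'
    patterns: '**/*.zip'
    targetPath: '$(Pipeline.Workspace)'
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(Pipeline.Workspace)/**/*.zip'
    cleanDestinationFolder: true
    destinationFolder: '$(Pipeline.Workspace)/cpDist'
- task: AzureCLI@2
  displayName: 'Copy build files to Development blob storage davey'
  inputs:
    azureSubscription: MYTEST-Development
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # the dist folder's exact path inside cpDist depends on includeRootFolder
      # and on what was archived; adjust -s accordingly
      az storage blob upload-batch -d '$web' --account-name davey \
        -s '$(Pipeline.Workspace)/cpDist/dist' --auth-mode login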

Azure DevOps Multi Stage Pipeline Error: No package found with specified pattern: /home/vsts/work/1/s/**/*.zip - How do I fix?

I have an Azure DevOps build pipeline (YAML) and a release pipeline (Classic) successfully deploying to Azure.
I am trying to convert these 2 separate pipelines into a multi-stage YAML pipeline.
On the Azure App Service Deploy task (AzureRmWebAppDeployment@4), I am getting the following error:
No package found with specified pattern: /home/vsts/work/1/a/*.zip
Below is my multi-stage YAML pipeline:
stages:
  - stage: Build
    jobs:
      - job: 'Build'
        pool:
          vmImage: 'windows-latest'
        variables:
          buildConfiguration: 'Release'
        steps:
          - task: DotNetCoreCLI@2
            displayName: Restore
            inputs:
              command: restore
              projects: '**/*.csproj'
              vstsFeed: 'dd55642d-8943-411f-8856-9714dd0da8af'
          - task: DotNetCoreCLI@2
            displayName: Build
            inputs:
              projects: '**/*.csproj'
              arguments: '--configuration $(buildConfiguration)'
          - task: DotNetCoreCLI@2
            displayName: Test
            inputs:
              command: test
              projects: '**/*[Tt]ests/*.csproj'
              arguments: '--configuration $(buildConfiguration)'
          - task: DotNetCoreCLI@2
            displayName: Publish
            inputs:
              command: publish
              publishWebProjects: false
              projects: '**/Tools.Client.Blazor.ServerApp.csproj'
              arguments: '--configuration $(buildConfiguration) --output $(build.artifactstagingdirectory)'
          - task: PublishSymbols@2
            displayName: 'Publish symbols path'
            inputs:
              SearchPattern: '**\bin\**\*.pdb'
              PublishSymbols: false
            continueOnError: true
          - task: CopyFiles@2
            displayName: 'Copy Files to: $(build.artifactstagingdirectory)\AzureDeploy'
            inputs:
              SourceFolder: AzureDeploy
              TargetFolder: '$(build.artifactstagingdirectory)\AzureDeploy'
          - task: PublishBuildArtifacts@1
            displayName: 'Publish Artifact: drop'
            inputs:
              PathtoPublish: '$(build.artifactstagingdirectory)'
            condition: succeededOrFailed()
  - stage: Systest
    jobs:
      - job: 'Systest'
        variables:
          resourceGroupName: '$(appName)-rg-$(environment)'
          location: 'East US'
          appServiceName: '$(appName)-svc-$(environment)'
          appInsightsName: '$(appName)-ins-$(environment)'
          appServicePlanName: '$(appName)-asp-$(environment)'
          appName: 'tools'
          owner: 'Pod'
          environment: 'systest'
        steps:
          - task: AzureResourceManagerTemplateDeployment@3
            displayName: 'ARM Template deployment: Resource Group scope'
            inputs:
              azureResourceManagerConnection: 'Dev/Test Connection'
              subscriptionId: ''
              resourceGroupName: '$(resourceGroupName)'
              location: '$(location)'
              csmFile: '$(System.DefaultWorkingDirectory)/AzureDeploy/Tools.azureDeploy.json'
              csmParametersFile: '$(System.DefaultWorkingDirectory)/AzureDeploy/Tools.azureDeploy.parameter.json'
              overrideParameters: '-appServiceName "$(appServiceName)" -appInsightsName "$(appInsightsName)" -appServicePlanName "$(appServicePlanName)" -owner "$(owner)" -environment "$(environment)" -location "$(location)"'
          - task: AzureRmWebAppDeployment@4
            displayName: 'Azure App Service Deploy: $(appServiceName)'
            inputs:
              ConnectionType: 'AzureRM'
              azureSubscription: ''
              appType: 'webApp'
              WebAppName: '$(appServiceName)'
              packageForLinux: '$(Build.ArtifactStagingDirectory)/*.zip'
Any help / suggestions would be appreciated.
Because there are 2 stages, the second stage doesn't have the file you published in the first stage; you need to download it.
You can use pipeline artifacts instead of build artifacts.
Pipeline artifacts provide a way to share files between stages in a pipeline or between different pipelines. They are typically the output of a build process that needs to be consumed by another job or be deployed. Artifacts are associated with the run they were produced in and remain available after the run has completed.
To publish (upload) an artifact for the current run:
steps:
  - publish: $(build.artifactstagingdirectory)
    artifact: drop
And in the second stage, you download the artifact:
steps:
  - download: current
    artifact: drop
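Applied to the pipeline above, the second stage would then look roughly like this; the wildcard under $(Pipeline.Workspace)/drop is an assumption about where the published zip ends up inside the artifact:
# Hypothetical shape of the second stage once the artifact is downloaded:
# the 'drop' artifact lands under $(Pipeline.Workspace)/drop, so the deploy
# task should search there instead of $(Build.ArtifactStagingDirectory).
- stage: Systest
  jobs:
    - job: 'Systest'
      steps:
        - download: current
          artifact: drop
        - task: AzureRmWebAppDeployment@4
          displayName: 'Azure App Service Deploy: $(appServiceName)'
          inputs:
            ConnectionType: 'AzureRM'
            azureSubscription: ''
            appType: 'webApp'
            WebAppName: '$(appServiceName)'
            packageForLinux: '$(Pipeline.Workspace)/drop/**/*.zip'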
You can also achieve it with build artifacts, downloading them with the DownloadBuildArtifacts@0 task.
During publish it will not work like this. Instead of using the path "/home/vsts/work/1/a/*.zip", you can use "$(System.DefaultWorkingDirectory)/_Releasepipelinename/drop/*.zip".