AzureCLI task in Azure DevOps pipeline cannot find the location of scripts

I am using an Azure DevOps pipeline, and in one of my tasks I need to run a bash script which contains some Azure CLI commands. I have put this script in a folder called scripts, and my pipeline YAML lives in a folder called pipelines; both folders are at the same level in the repository root. The following shows the part of my pipeline where I run the AzureCLI@2 task, but when the pipeline runs it raises an error that it cannot find the file.
I have already pushed everything to the repository and I can see the files there; however, the pipeline cannot find them. I am using the AzureCLI@2 documentation to provide values for this task. The part of the pipeline that uses AzureCLI is as follows:
pool:
  vmImage: ubuntu-20.04

trigger:
  branches:
    include:
    - "feature/ORGTHDATAMA-4810"
    exclude:
    - "main"
    - "release"
  paths:
    include:
    - "dip-comma-poc/**"

variables:
- group: proj-comma-shared-vg

stages:
- stage: DownloadArtifact
  displayName: "Download python whl from artifactory"
  jobs:
  - job: DownloadArtifactJob
    steps:
    - checkout: self
    ## To download from devops artifactory with AZ CLI
    ## https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-cli-v2?view=azure-pipelines
    - task: AzureCLI@2
      inputs:
        azureSubscription: "sc-arm-pa042-man"
        scriptType: 'bash'
        scriptLocation: 'scriptPath'
        scriptPath: 'dip-comma-poc/deployment-pipelines/scripts/sp-login.sh'
        arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
      displayName: "Download python whl from artifactory"
This caused a "No such file or directory" error.
To resolve the error I tried using a relative path in scriptPath as follows, but it raised the same error:
- task: AzureCLI@2
  inputs:
    azureSubscription: "sc-arm-pa042-man"
    scriptType: 'bash'
    scriptLocation: 'scriptPath'
    scriptPath: './scripts/sp-login.sh'
    arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
  displayName: "Download python whl from artifactory"
I also tried an inlineScript, but again it could not find the file.
- task: AzureCLI@2
  inputs:
    azureSubscription: "sc-arm-pa042-man"
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
    inlineScript: './scripts/sp-login.sh $1 $2 $3'
  displayName: "Download python whl from artifactory"
This also raised the same error.
How can I refer to my script in the pipeline YAML file so that it does not raise the "No such file or directory" error shown above? Thank you.

Open the Git repository in the web UI of your Azure DevOps and check its file structure. If deployment-pipelines is a top-level folder of the repository (that is, the repository root does not itself contain a dip-comma-poc folder), you need to change the file path set on the Azure CLI task to "deployment-pipelines/scripts/sp-login.sh" instead of "dip-comma-poc/deployment-pipelines/scripts/sp-login.sh". A relative scriptPath is resolved against the repository root checked out on the agent, not against the repository name.
- task: AzureCLI@2
  inputs:
    azureSubscription: "sc-arm-pa042-man"
    scriptType: 'bash'
    scriptLocation: 'scriptPath'
    scriptPath: 'deployment-pipelines/scripts/sp-login.sh'
    arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
  displayName: "Download python whl from artifactory"
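If you are unsure where the checked-out files actually land on the agent, a quick listing step before the Azure CLI task will confirm the layout (a minimal debug sketch):

- script: ls -R $(System.DefaultWorkingDirectory)
  displayName: 'List checked-out files'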

Related

Azure DevOps deploy to function app with access restrictions

I am trying to deploy a function app via an Azure DevOps pipeline; however, I am receiving the following error:
##[error]Failed to deploy web package to App Service.
##[error]To debug further please check Kudu stack trace URL : $URL_REMOVED
##[error]Error: Error: Failed to deploy web package to App Service. Ip Forbidden (CODE: 403)
From some googling, a suggested solution seems to be to whitelist the agent IP before the deployment and then remove it afterwards. I have added this to my pipeline, and I can see the agent IP get added to the access restrictions; however, the deployment still fails.
Here is my pipeline file:
# Node.js Function App to Linux on Azure
# Build a Node.js function app and deploy it to Azure as a Linux function app.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- main

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureSubscription: 'xxx'
  # Function app name
  functionAppName: 'xxx'
  # Environment name
  environmentName: 'xxx'
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '10.x'
      displayName: 'Install Node.js'
    - script: |
        if [ -f extensions.csproj ]
        then
          dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
        fi
      displayName: 'Build extensions'
    - script: |
        npm install
        npm run build --if-present
        npm run test --if-present
      displayName: 'Prepare binaries'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: $(environmentName)
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureCLI@2
            inputs:
              azureSubscription: '$(azureSubscription)'
              scriptType: 'bash'
              scriptLocation: 'inlineScript'
              inlineScript: |
                agentIP=$(curl -s https://api.ipify.org/)
                az functionapp config access-restriction add -g xxx -n xxx --action Allow --ip-address $agentIP --priority 200
          - task: AzureFunctionApp@1
            displayName: 'Azure Functions App Deploy: xxx'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionAppLinux
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
Is anyone able to advise where I am going wrong?
I've had a similar issue while adding the agent IP to the network restrictions of a storage account (using PowerShell, but you'll get the idea); we added a 60-second sleep to be sure the settings are taken into account by Azure.
$sa_name = "sapricer$env_prefix"

if ($null -ne (Get-AzStorageAccount -ResourceGroupName $sa_rg -AccountName $sa_name -ErrorAction Ignore)) {
    Write-Output "Storage account '$sa_name' exists"
    if ($enable) {
        Write-Output "Add ip rule for $current_ip on $sa_name..."
        Add-AzStorageAccountNetworkRule -ResourceGroupName $sa_rg -AccountName $sa_name -IPAddressOrRange $current_ip
    }
    else {
        Write-Output "Remove ip rule for $current_ip on $sa_name..."
        Remove-AzStorageAccountNetworkRule -ResourceGroupName $sa_rg -AccountName $sa_name -IPAddressOrRange $current_ip
    }
}

Start-Sleep -Seconds 60
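In a YAML pipeline, the same idea applies to the whitelisting step from the question; a sketch (assuming the same placeholder resource group and app name):

- task: AzureCLI@2
  inputs:
    azureSubscription: '$(azureSubscription)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      agentIP=$(curl -s https://api.ipify.org/)
      az functionapp config access-restriction add -g xxx -n xxx --action Allow --ip-address $agentIP --priority 200
      # Give Azure time to apply the new rule before the deployment starts
      sleep 60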
I found the solution to this.
Function Apps have two IP Restriction sections, one for the App and one for the SCM site. The SCM site is the one that requires the IP to be whitelisted in order for the deployment to work:
az functionapp config access-restriction add --scm-site true -g xxx -n xxx --action Allow --ip-address $agentIP --priority 200
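Since the goal was to whitelist the agent before deployment and remove it afterwards, a matching cleanup step can be placed after the AzureFunctionApp task (a sketch with the question's placeholders; condition: always() removes the rule even if the deployment fails):

- task: AzureCLI@2
  condition: always()
  inputs:
    azureSubscription: '$(azureSubscription)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      agentIP=$(curl -s https://api.ipify.org/)
      # Remove the temporary SCM-site rule that was added before the deployment
      az functionapp config access-restriction remove --scm-site true -g xxx -n xxx --ip-address $agentIP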
You can deploy an Azure function app from an Azure DevOps pipeline using the Azure Function App task.
Here is a sample snippet for deploying an Azure function app:
variables:
  azureSubscription: Contoso
  # To ignore SSL error, uncomment the below variable
  # VSTS_ARM_REST_IGNORE_SSL_ERRORS: true

steps:
- task: AzureFunctionApp@1
  displayName: Azure Function App Deploy
  inputs:
    azureSubscription: $(azureSubscription)
    appName: samplefunctionapp
    package: $(System.DefaultWorkingDirectory)/**/*.zip
Here is the Microsoft documentation for deploying an Azure function app.

How can I execute and schedule a Databricks notebook from an Azure DevOps pipeline using YAML

I wanted to do CI/CD of my Azure Databricks notebook using a YAML file. I have followed the flow below:
Pushed my code from the Databricks notebook to Azure Repos.
Created a build using the YAML script below.
stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(System.DefaultWorkingDirectory)'
        TargetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: notebooks'
      inputs:
        ArtifactName: dev_release
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'publish build'
        publishLocation: 'Container'
By doing the above I was able to create an artifact.
Now I have added another task to deploy that artifact to my Databricks workspace, using the YAML script below.
- stage: Deploy
  displayName: Deploy stage
  jobs:
  - job: Deploy
    displayName: Deploy
    pool:
      vmImage: 'vs2017-win2016'
    steps:
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: 'current'
        downloadType: 'single'
        artifactName: 'dev_release'
        downloadPath: '$(System.ArtifactsDirectory)'
    - task: databricksDeployScripts@0
      inputs:
        authMethod: 'bearer'
        bearerToken: 'dapj0ee865674cd9tfb583dbad61b78ce9b1-4'
        region: 'Central US'
        localPath: '$(System.DefaultWorkingDirectory)'
        databricksPath: '/Shared'
Now I want to run the deployed notebook from here as well, so I have a "Configure Databricks CLI" task and an "Execute Databricks" task to execute the notebook.
I got the errors below:
##[error]Error: Unable to locate executable file: 'databricks'. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also verify the file has a valid extension for an executable file.
##[error]The given notebook does not exist.
How can I execute a notebook from Azure DevOps? My notebooks are in Scala.
Is there any other way to do this on production servers?
As you have deployed the Databricks notebook using Azure DevOps and are asking for another way to run it, I would like to suggest the Azure Data Factory service.
In Azure Data Factory, you can create a pipeline that executes a Databricks notebook against a Databricks jobs cluster. You can also pass Azure Data Factory parameters to the Databricks notebook during execution.
Follow the official tutorial Run a Databricks Notebook with the Databricks Notebook Activity in Azure Data Factory to deploy and run a Databricks notebook.
Additionally, you can schedule the pipeline trigger at any particular time or event to make the process completely automatic. Refer to https://learn.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers
Try this:
- job: job_name
  displayName: test job
  pool:
    name: agent_name(selfhostedagent)
  #pool:
  workspace:
    clean: all
  steps:
  - checkout: none
  - task: DownloadBuildArtifacts@0
    displayName: 'Download Build Artifacts'
    inputs:
      artifactName: app
      downloadPath: $(System.DefaultWorkingDirectory)
  - task: riserrad.azdo-databricks.azdo-databricks-configuredatabricks.configuredatabricks@0
    displayName: 'Configure Databricks CLI'
    inputs:
      url: '$(Databricks_URL)'
      token: '$(Databricks_PAT)'
  - task: riserrad.azdo-databricks.azdo-databricks-deploynotebooks.deploynotebooks@0
    displayName: 'Deploy Notebooks to Workspace'
    inputs:
      notebooksFolderPath: '$(System.DefaultWorkingDirectory)/app/path/to/notebook'
      workspaceFolder: /Shared
  - task: riserrad.azdo-databricks.azdo-databricks-executenotebook.executenotebook@0
    displayName: 'Execute /Shared/path/to/notebook'
    inputs:
      notebookPath: '/Shared/path/to/notebook'
      existingClusterId: '$(cluster_id)'

Can't run a script from a YAML Azure DevOps pipeline

My YAML Azure DevOps pipeline fails when running a script.
Situation
I have this pipeline in the Tuto-BuildDeploy repository:
trigger:
- none

pool:
  vmImage: windows-latest

resources:
  repositories:
  - repository: TutoDeploy
    ref: main
    type: git
    name: Tuto-Deploy

jobs:
- job: checkout
  steps:
  - checkout: self
  - checkout: TutoDeploy
- job: Deploy
  dependsOn:
  - checkout
  steps:
  - task: AzurePowerShell@5
    inputs:
      azureSubscription: 'ToAzureCnx'
      ScriptType: 'FilePath'
      ScriptPath: .\Tuto-Deploy\build.ps1
      azurePowerShellVersion: 'LatestVersion'
This is my build.ps1 file:
param()

$resourceGroup = "RG2"
$location = "westeurope"

New-AzResourceGroup -Name $resourceGroup -Location $location -Force
What happened
I get this error message:
##[error]The term 'D:\a\1\s\Tuto-Deploy\build.ps1' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that
the path is correct and try again.
What I tested
I added the following step to check that build.ps1 was downloaded:
- script: dir $(Build.SourcesDirectory)\Tuto-Deploy
I also tried to run it from a pipeline in the project Tuto-Deploy:
trigger:
- main

pool:
  vmImage: windows-latest

steps:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'ToAzureCnx'
    ScriptType: 'FilePath'
    ScriptPath: '$(System.DefaultWorkingDirectory)/build.ps1'
    azurePowerShellVersion: 'LatestVersion'
It works fine.
So I don't think I have a problem with the script.
What I need
I don't understand why it is not working. What can I do?
Thanks.
You ran the checkout steps in a separate job, and this caused the problem.
Each job runs on a fresh agent (see the documentation on jobs), so the Tuto-Deploy repository downloaded in the first job is not accessible in the second job. You should combine the checkout job with the Deploy job. You can set a condition on the AzurePowerShell task if it should only be executed when the checkout steps succeed. See below:
- job:
  steps:
  - checkout: self
  - checkout: TutoDeploy
  - task: AzurePowerShell@5
    inputs:
      azureSubscription: 'ToAzureCnx'
      ScriptType: 'FilePath'
      ScriptPath: .\Tuto-Deploy\build.ps1
      azurePowerShellVersion: 'LatestVersion'
    condition: succeeded()
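Note that when a job checks out multiple repositories, each repository is placed in its own folder named after the repository under $(Build.SourcesDirectory), which is why the script is referenced as .\Tuto-Deploy\build.ps1. If in doubt, a listing step between the checkouts and the task will confirm the layout (a debug sketch):

- script: dir $(Build.SourcesDirectory)
  displayName: 'Show checked-out repositories'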

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure Pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD, so I am using Azure DevOps pipeline environments with approvals and checks to mandate this.
The issue is that when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

trigger:
- master

# Don't run against PRs
pr: none

stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      displayName: Install node modules
    - script: |
        npm run build
      displayName: 'Build it'
    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true
  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'
          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"
          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'
          - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps to check the locations of the known system variables and understand what is going on, using a PowerShell script as below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
You should simply publish the build-generated artifacts to the drop folder.
Kindly check the official doc on artifact selection; it explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none
- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify "download: none" in your steps.
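Putting this together for the pipeline in the question, the deploy steps could download the zip, extract it into a known folder, and upload the dist folder from there (a sketch reusing names from the question; the --auth-mode login flag assumes the service connection's principal has data-plane access to the storage account, otherwise pass the connection string as before):

steps:
- download: none
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'current'
    patterns: '**/*.zip'
    targetPath: '$(Pipeline.Workspace)'
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(Pipeline.Workspace)/**/*.zip'
    cleanDestinationFolder: true
    destinationFolder: '$(Pipeline.Workspace)/unzipped'
- task: AzureCLI@2
  displayName: 'Copy build files to Development blob storage'
  inputs:
    azureSubscription: MYTEST-Development
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Upload the dist folder that the build placed inside the archive
      az storage blob upload-batch -d '$web' --account-name davey -s '$(Pipeline.Workspace)/unzipped/dist' --auth-mode login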

Pass variable set in PowerShell task to another task (YAML)

I have a YAML file with the following tasks:
parameters:

steps:
- task: AzurePowerShell@4
  displayName: 'script'
  inputs:
    azureSubscription:
    ScriptPath:
    ScriptArguments:
    azurePowerShellVersion: LatestVersion
- task: AzureResourceGroupDeployment@2
  displayName: 'deployment'
  inputs:
    azureSubscription:
    resourceGroupName:
    location:
    overrideParameters: '-abc $(var1) -def $(var2)'
    deploymentMode: 'Incremental'
In the PowerShell script, I'm setting two variables as follows:
$ABC = 1
$DEF = 2
Write-Host "##vso[task.setvariable variable=var1;isOutput=true]$ABC"
Write-Host "##vso[task.setvariable variable=var2;isOutput=true]$DEF"
When trying to use these variables in the second task (in the overrideParameters section), I see the following errors:
[error]InvalidContentLink: Unable to download deployment content from 'xxxx$(var1)'
[error]InvalidContentLink: Unable to download deployment content from 'xxxx$(var2)'
Am I setting the variables in the PowerShell script incorrectly?
You can try to add a reference name to the first task. For example:
- task: AzurePowerShell@4
  displayName: 'script'
  inputs:
    azureSubscription:
    ScriptPath:
    ScriptArguments:
    azurePowerShellVersion: LatestVersion
  name: test
Then in the second task, get the variable value in the form of $(test.var1).
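Applied to the second task from the snippet above, that looks like:

- task: AzureResourceGroupDeployment@2
  displayName: 'deployment'
  inputs:
    azureSubscription:
    resourceGroupName:
    location:
    overrideParameters: '-abc $(test.var1) -def $(test.var2)'
    deploymentMode: 'Incremental'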
This is because, in the definition editor, downstream tasks won't get variable-name IntelliSense for output variables published by an ad-hoc script. You can refer to this document for details.
In addition, here is a blog with some examples on how to pass variables in Azure Pipelines YAML tasks.