Inline PythonScript Azure Pipelines task in external file - azure-devops

I am developing an Azure task template, and I have a large .py file that I want to be executed in one step:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: inline
    script: |
      ... really long python code
Is it possible to store the code in another file, at the same level as the yml template, and consume it from there? Or what would be the best approach to keep the template clean?
I know that it's possible to use scriptSource with a file path:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'my_python.py'
    arguments: '${{ parameters.my_param }}'
But as the template is in a different repository than the one the pipeline runs in, I don't think I can reach that my_python.py without downloading it with wget, cloning, or doing additional steps. Am I right?
Regards!

To use a template from another repo you need to define a repository resource, like here:
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
  repositories:
    - repository: templates
      type: github
      name: Contoso/BuildTemplates

steps:
- template: common.yml@templates # Template reference
Once you have that, you just need to check out this repo:
# Repo: Contoso/LinuxProduct
# File: azure-pipelines.yml
resources:
  repositories:
    - repository: templates
      type: github
      name: Contoso/BuildTemplates

steps:
- checkout: self
- checkout: templates # this downloads the whole repo
- template: common.yml@templates # Template reference
Now you need to figure out where it is downloaded ;)
Multiple repositories: If you have multiple checkout steps in your job, your source code is checked out into directories named after the repositories, as a subfolder of s in $(Agent.BuildDirectory). If $(Agent.BuildDirectory) is C:\agent\_work\1 and your repositories are named tools and code, your code is checked out to C:\agent\_work\1\s\tools and C:\agent\_work\1\s\code.
So if you have the script in a scripts folder in the templates repo, you will find it at $(Agent.BuildDirectory)\templates\scripts\script.py.
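If you want to confirm where the agent actually placed the checked-out repos, a throwaway diagnostic step can print the layout first (a minimal sketch using PowerShell@2, assuming a Windows agent as in the paths above):
- task: PowerShell@2
  displayName: 'Show checkout layout'
  inputs:
    targetType: 'inline'
    script: |
      # Print the build directory and its first two levels to locate the templates repo
      Write-Host "Agent.BuildDirectory is: $(Agent.BuildDirectory)"
      Get-ChildItem -Recurse -Depth 2 "$(Agent.BuildDirectory)"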
So then you can use it like this:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptSource: 'filePath'
    scriptPath: '$(Agent.BuildDirectory)\templates\scripts\script.py'
    arguments: '${{ parameters.my_param }}'


How do you copy azure repo folders to a folder on a VM in an Environment in a pipeline?

I have an Environment called 'Dev' that has a resource, which is a VM. As part of the 'Dev' pipeline I want to copy files from a specific folder on the develop branch of a specific repo to a specific folder on the VM that's in the Environment.
I've not worked with Environments before, or YAML pipelines much, but I gather I need to use the CopyFiles@2 task.
So I've got an Azure Pipelines YAML file something like this:
variables:
  isDev: $[eq(variables['Build.SourceBranch'], 'refs/heads/develop')]

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files'
      inputs:
        contents: 'myFolder\**'
        Overwrite: true
        targetFolder: $(Build.ArtifactStagingDirectory)
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: $(Build.ArtifactStagingDirectory)
        artifactName: myArtifact
- stage: Deployment
  dependsOn: Build
  condition: and(succeeded(), eq(variables.isDev, true))
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    pool:
      vmImage: 'windows-latest'
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Foo Bar
The first question is: how do I get this to copy the files to a specific path on the Dev environment?
Is the PublishBuildArtifacts step really needed? The reason I ask is that I want this to copy files every time the pipeline is run and not error if the artifact already exists.
It also feels a bit dirty to have to check that the branch is the correct one this way. Is there a better way to do it?
The deployment strategy you're using relies on specifying an agent pool, which means it doesn't run on the machines in the environment. If you use a strategy such as rolling, it will run the specified steps on those machines automatically, including any download steps to download artifacts.
Ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#deployment-strategies
You need to publish artifacts as part of the pipeline if you want them to be automatically available to downstream jobs. Each run will get a different set of artifacts, even if the actual artifact contents are the same.
That said, based on the YAML you posted, you probably don't need to. In fact, you don't need the "build" stage at all. You could just add a checkout step during your rolling deployment, and the repo would be cloned on each of the target machines.
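As a rough sketch of that suggestion (hedged: this reuses the 'Dev' environment and isDev variable from the question, and assumes the VM is registered as a VirtualMachine resource in that environment), the deployment stage could look something like:
- stage: Deployment
  condition: eq(variables.isDev, true)
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    environment:
      name: Dev
      resourceType: VirtualMachine   # run the steps on the VMs registered in the environment
    strategy:
      rolling:
        deploy:
          steps:
          - checkout: self           # clones the repo onto each target VM
          - script: echo The repo is now on this VM under $(Build.SourcesDirectory)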
OK, I worked this out with help from this article: https://dev.to/kenakamu/azure-devops-yaml-release-pipeline-trigger-when-build-pipeline-completed-54d5.
I've taken the advice from Daniel Mann regarding the strategy being 'rolling'. I then split my pipeline into two pipelines: one for building the artifacts and one for releasing (copying them).
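For reference, the release pipeline can be triggered automatically when the build pipeline completes by declaring a pipeline resource (a minimal sketch; the build pipeline name 'MyBuildPipeline' and the branch filter are assumptions, while the artifact name comes from the question above):
resources:
  pipelines:
  - pipeline: buildPipeline        # alias used by the download step below
    source: 'MyBuildPipeline'      # name of the build pipeline (assumption)
    trigger:
      branches:
        include:
        - develop

steps:
- download: buildPipeline          # pulls the artifacts published by the triggering run
  artifact: myArtifact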
If you want to download just particular folders instead of all the source files from the repository, you can try using the "Items - Get" REST API to download each folder individually.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path={path}&download=true&$format=zip&versionDescriptor.version={versionDescriptor.version}&resolveLfs=true&api-version=6.0
For example, suppose the repository contains a folder res/TestFolder01. In the YAML pipeline, I just want to download the 'TestFolder01' folder from the main branch:
jobs:
- job: build
  . . .
  steps:
  - checkout: none  # Do not check out all the source files.
  - task: Bash@3
    displayName: 'Download particular folder'
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
    inputs:
      targetType: inline
      script: |
        curl -X GET \
          -o TestFolder01.zip \
          -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'
This will download the 'TestFolder01' folder as a ZIP file (TestFolder01.zip) into the current working directory. You can use the unzip command to decompress it.
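For example, a follow-up step like this would extract it (a small sketch; it assumes the agent image provides the unzip utility):
- script: unzip -o TestFolder01.zip -d TestFolder01   # -o overwrites on re-runs
  displayName: 'Unzip downloaded folder'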
[UPDATE]
If you want to download particular folders in a deploy job that targets your VM environment, then yes, the folders will be downloaded into the pipeline working directory on the VM.
You can think of a VM-type environment resource as a self-hosted agent installed on the VM. So when your deploy job targets the VM environment resource, it runs on the self-hosted agent on that VM.
The pipeline working directory is under the directory where you installed the VM environment resource (self-hosted agent). Normally, you can use the variable $(Pipeline.Workspace) to get the value of this path.
stages:
- stage: Deployment
  jobs:
  - deployment: Deploy
    displayName: 'Deploy to Dev'
    environment: 'Dev.VM-01'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: Bash@3
            displayName: 'Download particular folder'
            env:
              SYSTEM_ACCESSTOKEN: $(System.AccessToken)
            inputs:
              targetType: inline
              script: |
                echo "Current working directory: $PWD"
                curl -X GET \
                  -o TestFolder01.zip \
                  -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'

AzureStaticWebApp@0 push from artifacts

I have a CD pipeline that builds a project (multiple times, for different environments) and publishes/saves the ./dist directories in one stage. I can download each environment's build and run it locally as expected.
Each environment build is a stage that needs a manual approval. This is where I am getting lost: each stage shows the correct artifact being pulled into the stage, BUT the AzureStaticWebApp@0 app_location input results in a "Could not detect this directory." error.
To recap:
After building the project and saving it as an artifact (which I can manually download and verify), I am unable to push that built code to Azure Static Web Apps because it cannot be found. I've tried any number of combinations to no effect. Any advice?
I'm using templates; here is the 'Push Built Project to Azure Static Web Apps' template.
When this template runs, I can see jobs running and successfully pulling down the right artifact with this output:
Successfully downloaded artifacts to /home/vsts/work/1/
Finishing: Download Artifact
But the AzureStaticWebApp@0 task gives this error:
App Directory Location: '/home/vsts/work/1/DEV' is invalid. Could not detect this directory. Please verify your deployment configuration file reflects your repository structure.
parameters:
- name: environment
  default: development
  type: string
- name: variableGroup
  default: development-variables-group
  type: string

jobs:
- deployment:
  displayName: 'Deploy to'
  environment: ${{parameters.environment}}
  variables:
  - group: ${{parameters.variableGroup}}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - task: AzureStaticWebApp@0
          inputs:
            app_location: '$(Pipeline.Workspace)/DEV'
            api_location: 'api'
            output_location: 'dist'
            skip_app_build: true
          env:
            azure_static_web_apps_api_token: $(deployment-token)
EDIT
Does the task AzureStaticWebApp not have access to anything outside the project?
- deployment:
  displayName: 'Deploy to'
  environment: ${{parameters.environment}}
  variables:
  - group: ${{parameters.variableGroup}}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
          submodules: true
        # This step pulls down a compiled site, e.g. DEV/index.htm, ./images, staticwebapp.config.json
        # It has output like:
        #   Downloading DEV/index.html to /home/vsts/work/1/DEV/index.html
        #   Successfully downloaded artifacts to /home/vsts/work/1/
        - download: current
          artifact: DEV
        - task: AzureStaticWebApp@0
          inputs:
            # I've tried many different values for app_location but all return the same not-found error
            app_location: '/DEV'
            api_location: 'api'
            output_location: 'dist'
            skip_app_build: true
          env:
            azure_static_web_apps_api_token: $(deploymenttoken)
Solved: well, I found a way to make it work. The build step creates the 'app/dist' directory and its content, and only the 'app/dist' folder is published as an artifact. When downloading the artifact you need to 'put it back' into the project; in this case DEV/ -> app/dist.
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: DEV
    path: ./app/dist # Put build artifact back into the project
  displayName: 'Download artifacts'
- task: AzureStaticWebApp@0
  inputs:
    app_location: 'app/dist'
    api_location: 'api'
    output_location: 'dist'
    skip_app_build: true
  env:
    azure_static_web_apps_api_token: $(deploymenttoken)
app_location specifies the root of your application code. The property should point to a location in your repo.
Check the documentation here: https://learn.microsoft.com/en-us/azure/static-web-apps/publish-devops
Also, https://github.com/Azure/static-web-apps/issues/5#issuecomment-855309544
There are two other solutions described in #552 that do not require adding the DownloadPipelineArtifact@2 step.
The solution that worked for me was to set the workingDirectory to the pipeline's artifact location, then set the app_location value relative to that workingDirectory.
For example, if your artifact was downloaded to $(Pipeline.Workspace)/MyBuild/drop, your properties would be:
app_location: /drop
workingDirectory: $(Pipeline.Workspace)/MyBuild
yml snippet:
- task: AzureStaticWebApp@0
  displayName: Publish Static Web App
  inputs:
    app_location: /drop # The name of your artifact
    output_location: "" # Leave this empty
    skip_app_build: true
    azure_static_web_apps_api_token: $(deployment-token)
    # The path where your artifact was downloaded
    workingDirectory: $(Pipeline.Workspace)/MyBuild
This is really confusing, because every other pipeline task accepts absolute paths, but AzureStaticWebApp@0 requires a relative path for app_location.

Checkout another repository in azure pipelines yml

Well, I have the following code that uses a template file in Azure DevOps:
resources:
  repositories:
    - repository: templates
      type: git
      name: "Framework Back-end/templates-devops"

extends:
  template: azure-pipelines-template.yml@templates
This works very well, downloading the yml file from another project inside the same organization. But inside my azure-pipelines-template.yml I'm trying to do the following:
- job: Deploy
  pool: ${{parameters.agent}}
  displayName: Deploy on Kubernetes
  dependsOn: Push
  condition: and(succeeded(), in(variables['Build.SourceBranchName'], 'master', 'main', 'qas', 'develop'))
  steps:
  - checkout: self
  - checkout: templates
But I got the error:
remote: TF401019: The Git repository with name or identifier templates-devops does not exist or you do not have permissions for the operation you are attempting.
fatal: repository 'https://xx#xx/xxx/Framework%20Back-end/_git/templates-devops/' not found
I need to do a checkout because in other steps I will need to use the files that exist in the templates-devops repository. I can't understand why my pipeline can download the azure-pipelines-template.yml file but can't check out the repository.
SOLVED
It was a permission problem in the project settings; I disabled these flags:
- Limit job authorization scope to referenced Azure DevOps repositories
- Limit job authorization scope to current project for non-release pipelines
- Limit job authorization scope to current project for release pipelines
I created a demo to reproduce your environment, but it works well on my side; the checkout step works fine. Here are my YAML file and template file:
Main.yaml
resources:
  repositories:
    - repository: templates
      type: git
      name: MyAgile TestPro/yaml

pool:
  name: 'default'

extends:
  template: azure-pipelines-template.yml@templates
Temp.yaml
stages:
- stage:
  jobs:
  - job: Deploy
    steps:
    - checkout: self
    - checkout: templates
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # Write your PowerShell commands here.
          Write-Host "Hello World MyAgileTestPro temp"
At present, we recommend checking whether your account has project-admin permissions for the project Framework Back-end, and whether the project or repository name has been changed.

Unable to resolve $(Release.ReleaseId) in Azure DevOps yml pipelines

When using the Releases tab in the Azure DevOps web console to create release definitions, tasks can resolve $(Release.ReleaseId) inside a bash task.
But if I instead do my deployment in the azure-pipelines.yml file and do echo $(Release.ReleaseId), I get null because the variable doesn't exist. How come?
Here is part of the yml file:
- stage: Deploy
  dependsOn: BuildAndPublishArtifact
  condition: succeeded('BuildAndPublishArtifact')
  jobs:
  - deployment: DeployToAWSDev
    displayName: My display name
    pool:
      vmImage: 'Ubuntu-16.04'
    environment: 'dev'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: MyArtifact
          - task: Bash@3
            inputs:
              targetType: 'inline'
              script: |
                echo $(Release.ReleaseId) # Nothing
Thanks for any help pointing me in the right direction on how I can retrieve my release ID.
Refer to the documentation on variables. There's no differentiation of "build" vs "release" in a YAML pipeline. Thus, Build.BuildId would be the run's ID.
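So in the deploy step above, echoing the run's identifiers would look like this (a small sketch using the predefined Build variables):
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo "Run ID: $(Build.BuildId)"          # the run's ID in a YAML pipeline
      echo "Run number: $(Build.BuildNumber)"  # human-readable run number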

Using ARM Templates from external repository

I'm working with Azure multi-stage pipelines, using deployment jobs with templates in a separate repo. I'm currently starting to use ARM templates in my deployment process and want to run ARM templates that are located in a different repository as well. This is where I get a little stuck; any help/advice appreciated.
To illustrate my setup:
Repo A -> Source code that has to be build and deployed to azure
Repo B -> Azure pipeline templates (only consists of yml files)
Repo C -> ARM templates
So what I want to accomplish: A uses B uses C.
REPO A: Documentation build and release yml
resources:
  repositories:
    - repository: templates
      type: git
      name: <ACCOUNT>/Azure.Pipelines.Templates
      ref: refs/tags/2.2.40

stages:
- stage: Build
  jobs:
  - template: src/jobs/doc-build.yml@templates
- stage: DEV
  jobs:
  - template: src/deployment-jobs/doc.yml@templates
....
REPO B: Documentation deployment
parameters:
  webAppName: ''
  connectedServiceName: 'DEV'

jobs:
- deployment: doc_deploy
  pool:
    name: 'DOC'
  environment: 'doc'
  strategy:
    runOnce:
      deploy:
        steps:
        - template: ../deployment/arm-template.yml
          parameters:
            connectedServiceName: ${{ parameters.connectedServiceName }}
            resourceGroupName: 'documentation'
            templateFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation.jsonc
            paramFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation-params.json
            parameters: -name ${{ parameters.webAppName }}
...
REPO C: contains the ARM template + parameter file
The issue I'm facing is that I can't seem to get to the files of repo C. I tried adding another repository entry at multiple levels, but it does not seem to clone the dependent repo at all.
My current workaround/solution:
Use a PowerShell script to manually clone repo C and reference its files on disk directly.
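Roughly, that workaround is a step like the following (a sketch only; the repo name Azure.ARM.Templates, the <ACCOUNT>/<PROJECT> placeholders, and the use of $(System.AccessToken) as the clone credential are assumptions):
- task: PowerShell@2
  displayName: 'Clone ARM templates repo'
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  inputs:
    targetType: 'inline'
    script: |
      # Clone repo C next to the sources so the template paths above resolve on disk
      git -c http.extraheader="AUTHORIZATION: bearer $env:SYSTEM_ACCESSTOKEN" `
        clone https://dev.azure.com/<ACCOUNT>/<PROJECT>/_git/Azure.ARM.Templates `
        "$(Build.SourcesDirectory)/Azure.ARM.Templates"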
Related github issue: https://github.com/microsoft/azure-pipelines-yaml/issues/103
I've also stumbled upon this issue, having to load ARM templates from another repo into the current build. What I did was set up a build on the repo containing the ARM templates, producing a build artifact with the following azure-pipelines.yml (this would be your repo C):
trigger:
- master

steps:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)/templates'
    ArtifactName: 'templates'
    publishLocation: 'Container'
Afterwards I could add the following step to the actual pipeline:
- task: DownloadPipelineArtifact@2
  displayName: 'Get ARM Templates'
  inputs:
    buildType: 'specific'
    project: '<YOUR-PROJECT-ID>'
    definition: '<ARM-BUILD-DEFINITION-ID>'
    buildVersionToDownload: 'latest'
    artifactName: 'templates'
    targetPath: '$(Pipeline.Workspace)/templates'
and I was able to access the files as follows:
- task: AzureResourceGroupDeployment@2
  displayName: 'Create Queues $(ResourceGroup.Name)'
  inputs:
    azureSubscription: '<YOUR-SUBSCRIPTION>'
    resourceGroupName: '$(ResourceGroup.Name)'
    location: '$(ResourceGroup.Location)'
    csmFile: '$(Pipeline.Workspace)/templates/servicebus.json'
For more information about the Download Pipeline Artifact task, check out the following link:
Download Pipeline Artifact task