AzureStaticWebApp@0 push from artifacts - azure-devops

I have a CD pipeline that builds a project (multiple times, for different environments) and publishes/saves the ./dist directories as one stage. I can download each environment's build and run it locally as expected.
Each environment build is a stage that needs a manual approval. This is where I am getting lost. Each stage shows the correct artifact being pulled into the stage, BUT the AzureStaticWebApp@0 app_location input results in a "Could not detect this directory." error.
To recap:
After building the project and saving it as an artifact (which I can manually download and verify), I am unable to push that built code to Azure Static Web Apps because it cannot be found. I've tried any number of combinations to no effect. Any advice?
I'm using templates; here is the 'Push Built Project to Azure Static Web Apps' template.
When this template runs, I can see jobs running and successfully pulling down the right artifact with this output:
Successfully downloaded artifacts to /home/vsts/work/1/
Finishing: Download Artifact
But the AzureStaticWebApp@0 task gives this error:
App Directory Location: '/home/vsts/work/1/DEV' is invalid. Could not detect this directory. Please verify your deployment configuration file reflects your repository structure.
parameters:
- name: environment
  default: development
  type: string
- name: variableGroup
  default: development-variables-group
  type: string

jobs:
- deployment:
  displayName: 'Deploy to'
  environment: ${{parameters.environment}}
  variables:
  - group: ${{parameters.variableGroup}}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - task: AzureStaticWebApp@0
          inputs:
            app_location: '$(Pipeline.Workspace)/DEV'
            api_location: 'api'
            output_location: 'dist'
            skip_app_build: true
          env:
            azure_static_web_apps_api_token: $(deployment-token)
EDIT
Does the task AzureStaticWebApp not have access to anything outside the project?
- deployment:
  displayName: 'Deploy to'
  environment: ${{parameters.environment}}
  variables:
  - group: ${{parameters.variableGroup}}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self
          submodules: true
        # This step pulls down a compiled site, e.g. DEV/index.html, ./images, staticwebapp.config.json
        # It produces output like:
        #   Downloading DEV/index.html to /home/vsts/work/1/DEV/index.html
        #   Successfully downloaded artifacts to /home/vsts/work/1/
        - download: current
          artifact: DEV
        - task: AzureStaticWebApp@0
          inputs:
            # I've tried many different values for app_location, but all return the same not-found error
            app_location: '/DEV'
            api_location: 'api'
            output_location: 'dist'
            skip_app_build: true
          env:
            azure_static_web_apps_api_token: $(deploymenttoken)

Solved --- well, found a way to make it work.
- The build step creates the 'app/dist' directory and its content.
- Only the 'app/dist' folder is published as an artifact.
- When downloading the artifact, you need to 'put it back' into the project; in this case, DEV/ -> app/dist.
- task: DownloadPipelineArtifact@2
  displayName: 'Download artifacts'
  inputs:
    artifact: DEV
    path: ./app/dist # Put the build artifact back into the project
- task: AzureStaticWebApp@0
  inputs:
    app_location: 'app/dist'
    api_location: 'api'
    output_location: 'dist'
    skip_app_build: true
  env:
    azure_static_web_apps_api_token: $(deploymenttoken)

app_location specifies the root of your application code. The property should point to a location in your repo.
Check the documentation here: https://learn.microsoft.com/en-us/azure/static-web-apps/publish-devops
Also, https://github.com/Azure/static-web-apps/issues/5#issuecomment-855309544
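To make the contrast concrete, here is a sketch using the paths from this question (the app/dist value comes from the workaround above; this is an illustration of the rule, not an exhaustive list of valid values):

```yaml
# Fails: an absolute path outside the repository/working directory
app_location: '$(Pipeline.Workspace)/DEV'

# Works: a path relative to the working directory (the repo root by default)
app_location: 'app/dist'
```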

There are two other solutions described in #552 that do not require adding the DownloadPipelineArtifact@2 step.
The solution that worked for me was to set workingDirectory to the pipeline's artifact location, then set the app_location value relative to that workingDirectory.
For example, if your artifact was downloaded to $(Pipeline.Workspace)/MyBuild/drop
app_location: /drop
workingDirectory: $(Pipeline.Workspace)/MyBuild
YAML snippet:
- task: AzureStaticWebApp@0
  displayName: Publish Static Web App
  inputs:
    app_location: /drop # The name of your artifact
    output_location: '' # Leave this empty
    skip_app_build: true
    azure_static_web_apps_api_token: $(deployment-token)
    # The path where your artifact was downloaded
    workingDirectory: $(Pipeline.Workspace)/MyBuild
This is really confusing because every other pipeline task accepts absolute paths, but AzureStaticWebApp@0 requires a path relative to its working directory for app_location.

Related

XML transformation using only IIS Web App Deployment On Machine Group task (YAML pipeline)

I have created a pipeline using only YAML.
I have defined the deployment part like this:
- stage: AzureDevOpsStaging
  displayName: Deploy build artifacts to staging environment
  dependsOn: BuildSolution
  condition: succeeded('BuildSolution')
  jobs:
  - deployment: DeployArtifacts
    displayName: Deploy artifacts
    environment:
      name: AzureDevOpsStaging
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: drop
          - task: IISWebAppDeploymentOnMachineGroup@0
            displayName: Deploy artifacts to IIS
            inputs:
              webSiteName: 'mysite-staging'
              package: '$(Pipeline.Workspace)\drop\*.zip'
              xmlTransformation: true
When I run this I get:
##[warning]Unable to apply transformation for the given package. Verify the following.
##[warning]1. Whether the Transformation is already applied for the MSBuild generated package during build. If yes, remove the <DependentUpon> tag for each config in the csproj file and rebuild.
##[warning]2. Ensure that the config file and transformation files are present in the same folder inside the package.
Things I've checked:
- Both Web.config and Web.AzureDevOpsStaging.config are present in the zip/artifact.
- Name of the stage: the docs say the stage must have the same name as your transform config file, that is, Web.AzureDevOpsStaging.config.
- Name of the .config transform file: it is Web.AzureDevOpsStaging.config.
- Name of the environment: the docs don't say the name has to match Web.ThisPart.config, but I still named the environment AzureDevOpsStaging just in case.
But again, doing all of the above results in Web.config not being transformed.
I got it to work by using the File Transform task instead, which is referenced in the docs from the IIS Web App Deploy task:
- stage: AzureDevOpsStaging
  displayName: Deploy build artifacts to staging environment
  dependsOn: BuildSolution
  condition: succeeded('BuildSolution')
  jobs:
  - deployment: DeployArtifacts
    displayName: Deploy artifacts
    environment:
      name: AzureDevOpsStaging
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: drop
          - task: FileTransform@1
            inputs:
              folderPath: '$(Pipeline.Workspace)\drop\*.zip'
              enableXmlTransform: true
              xmlTransformationRules: '-transform **\*.AzureDevOpsStaging.config -xml **\*.config'
          - task: IISWebAppDeploymentOnMachineGroup@0
            displayName: Deploy artifacts to IIS
            inputs:
              webSiteName: 'mysite-staging'
              package: '$(Pipeline.Workspace)\drop\*.zip'
So can someone please explain how I am supposed to configure my YAML to get this to work using only the IISWebAppDeploymentOnMachineGroup@0 task?
And if this is not possible, am I using the FileTransform@1 task properly?
Also, I saw there is a FileTransform@2 version as well. That task didn't have one of the properties that @1 has, so I reverted to v1 instead. It would be great if someone has a bit more info on this newer version and whether it's going to deprecate @1 in the future.
Btw, I also got xmlTransformation: true to work with a classic release pipeline under the Releases tab in Azure DevOps using the UI. But I don't want to use the classic approach; I want to do everything in YAML.
And if this is not possible, am I using the task FileTransform@1 properly?
The answer is yes. FileTransform is the task I use, and I use it frequently.
When I use it in a YAML pipeline configured as you have:
- task: FileTransform@1
  displayName: 'File Transform'
  inputs:
    folderPath: '$(Pipeline.Workspace)\drop\*.zip'
    enableXmlTransform: true
    xmlTransformationRules: '-transform **\*.UAT.config -xml **\*.config'
    fileType: xml
It works fine on my side:
To perform the transformation correctly, make sure the transform syntax in the config files is correct and that the specified directory is correct.
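For reference, here is a minimal pair of files of the kind those transformation rules match (the Environment key is a made-up example; the file names follow the *.UAT.config pattern from the rules above):

```xml
<!-- Web.config (the base file matched by -xml **\*.config) -->
<configuration>
  <appSettings>
    <add key="Environment" value="Local" />
  </appSettings>
</configuration>

<!-- Web.UAT.config (the transform matched by -transform **\*.UAT.config) -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- Overwrites the attributes of the <add> element whose key matches -->
    <add key="Environment" value="UAT"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>
```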

How do you copy azure repo folders to a folder on a VM in an Environment in a pipeline?

I have an Environment called 'Dev' that has a resource, which is a VM. As part of the 'Dev' pipeline I want to copy files from a specific folder on the develop branch of a specific repo to a specific folder on the VM that's on the Environment.
I've not worked with Environments before, or with YAML pipelines much, but I gather I need to use the CopyFiles@2 task.
So I've got an azure pipeline yaml file something like this:
variables:
  isDev: $[eq(variables['Build.SourceBranch'], 'refs/heads/develop')]

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files'
      inputs:
        contents: 'myFolder\**'
        Overwrite: true
        targetFolder: $(Build.ArtifactStagingDirectory)
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: $(Build.ArtifactStagingDirectory)
        artifactName: myArtifact
- stage: Deployment
  dependsOn: Build
  condition: and(succeeded(), eq(variables.isDev, true))
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    pool:
      vmImage: 'windows-latest'
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Foo Bar
The first question is: how do I get this to copy the files to a specific path on the Dev environment?
Is the PublishBuildArtifacts really needed? The reason I ask is that I want this to copy files every time the pipeline is run and not error if the artifact already exists.
It also feels a bit dirty to have to check the branch is the correct branch this way. Is there a better way to do it?
The deployment strategy you're using relies on specifying an agent pool, which means it doesn't run on the machines in the environment. If you use a strategy such as rolling, it will run the specified steps on those machines automatically, including any download steps to download artifacts.
Ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#deployment-strategies
You need to publish artifacts as part of the pipeline if you want them to be automatically available to downstream jobs. Each run will get a different set of artifacts, even if the actual artifact contents are the same.
That said, based on the YAML you posted, you probably don't need to. In fact, you don't need the "build" stage at all. You could just add a checkout step during your rolling deployment, and the repo would be cloned on each of the target machines.
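A minimal sketch of that suggestion (the environment and resource names are placeholders; with a rolling strategy the steps run on each VM resource registered in the environment):

```yaml
- stage: Deployment
  jobs:
  - deployment: Deploy
    environment:
      name: Dev                    # environment containing the VM resources
      resourceType: VirtualMachine
    strategy:
      rolling:
        deploy:
          steps:
          - checkout: self         # clones the repo onto each target VM
          - script: echo Files available under $(Build.SourcesDirectory)
```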
OK, worked this out with help from this article: https://dev.to/kenakamu/azure-devops-yaml-release-pipeline-trigger-when-build-pipeline-completed-54d5.
I took Daniel Mann's advice about using the 'rolling' strategy. I then split my pipeline into two pipelines: one for building the artifacts and one for releasing (copying) them.
If you want to download just particular folders instead of all the source files from the repository, you can try using the "Items - Get" REST API to download each folder individually.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path={path}&download=true&$format=zip&versionDescriptor.version={versionDescriptor.version}&resolveLfs=true&api-version=6.0
For example, suppose the repository is structured as below.
Now, in the YAML pipeline, I just want to download the 'TestFolder01' folder from the main branch.
jobs:
- job: build
  . . .
  steps:
  - checkout: none  # Do not check out all the source files.
  - task: Bash@3
    displayName: 'Download particular folder'
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
    inputs:
      targetType: inline
      script: |
        curl -X GET \
          -o TestFolder01.zip \
          -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'
This will download the 'TestFolder01' folder as a ZIP file (TestFolder01.zip) into the current working directory. You can use the unzip command to decompress it.
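For example, a hypothetical follow-up step to decompress it could look like this:

```yaml
- task: Bash@3
  displayName: 'Unzip downloaded folder'
  inputs:
    targetType: inline
    script: |
      # Extract the archive into a TestFolder01 directory and list its contents
      unzip -o TestFolder01.zip -d TestFolder01
      ls -R TestFolder01
```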
[UPDATE]
If you download the particular folders in a deploy job that targets your VM environment, then yes, the folders will be downloaded into the pipeline working directory on the VM.
You can think of a VM-type environment resource as a self-hosted agent installed on the VM. So when your deploy job targets the VM environment resource, it runs on that self-hosted agent.
The pipeline working directory is under the directory where you installed the VM environment resource (self-hosted agent). Normally, you can use the variable $(Pipeline.Workspace) to get the value of this path (see here).
stages:
- stage: Deployment
  jobs:
  - deployment: Deploy
    displayName: 'Deploy to Dev'
    environment: 'Dev.VM-01'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: Bash@3
            displayName: 'Download particular folder'
            env:
              SYSTEM_ACCESSTOKEN: $(System.AccessToken)
            inputs:
              targetType: inline
              script: |
                echo "Current working directory: $PWD"
                curl -X GET \
                  -o TestFolder01.zip \
                  -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'

Task AzureStaticWebApp@0 'could not detect this directory' but it's present

I am working on building a pipeline using AzureDevOps, and I face a strange problem.
This is my pipeline:
- stage: 'Test'
  displayName: 'Deploy to the test environment'
  dependsOn: Build
  jobs:
  - job: 'Deploy'
    steps:
    - download: current
      artifact: lorehub-front
    - bash: cd $(Pipeline.Workspace); echo $(ls)
    - bash: cd $(Pipeline.Workspace)/lorehub-front; echo $(ls)
    - task: AzureStaticWebApp@0
      displayName: 'Publish to Azure Static WebApp'
      inputs:
        app_location: $(Pipeline.Workspace)/lorehub-front
        azure_static_web_apps_api_token: xxxx
The first bash step shows that the 'lorehub-front' folder is present.
The second bash step shows that the folder contains the correct files (index.html, etc.).
Script contents:
cd /home/vsts/work/1/lorehub-front; echo $(ls)
android-chrome-192x192.png android-chrome-512x512.png
apple-touch-icon.png css env-config.js favicon-16x16.png
favicon-32x32.png favicon.ico fonts index.html js site.webmanifest
But I am receiving this error:
App Directory Location: '/home/vsts/work/1/lorehub-front' is invalid. Could not
detect this directory. Please verify your deployment configuration
file reflects your repository structure.
App Directory Location: '/home/vsts/work/1/lorehub-front' is invalid.
To resolve this issue, change the path to /lorehub-front:
- task: AzureStaticWebApp@0
  displayName: 'Publish to Azure Static WebApp'
  inputs:
    app_location: /lorehub-front
    azure_static_web_apps_api_token: xxxx
For more detailed info, you could refer to this doc: Tutorial: Publish Azure Static Web Apps with Azure DevOps
Enter / if your application source code is at the root of the repository, or /app if your application code is in a directory called app.
In case anyone else comes across this post from Google or whatever, I had the exact same problem as Andrei above but for the life of me I couldn't get the accepted solution here to work.
No matter what I put in app_location:, the task just flat out refused to see any files.
Upon further investigation, I found this github issue which claims the following (Emphasis mine):
The AzureStaticWebApp task’s app location is relative to the current directory
Looking at the documentation, we can see that this task uses the default directory of $(System.DefaultWorkingDirectory), which is different to $(Pipeline.Workspace) for whatever reason.
So, if you find yourself stuck in the same position and cannot get this task to recognise your artifacts, the solution is to add cwd: $(Pipeline.Workspace) to your task, e.g.
strategy:
  runOnce:
    deploy:
      steps:
      - download: current
        artifact: WebApp
      - task: AzureStaticWebApp@0
        inputs:
          app_location: /
          skip_app_build: true
          azure_static_web_apps_api_token: $(deployment_token)
          cwd: $(Pipeline.Workspace)/WebApp

How to fix my Azure deployment issue by rearranging my azure-pipelines.yml?

I am trying to set up a pipeline using Azure cloud and DevOps, but I get the error below when deploying after a successful build. How can I solve this issue?
I read an awesome article: "http://www.alessandromoura.com.br/2020/04/23/azure-devops-publish-and-download-artifacts/"
I applied its rule sets, but I always get the error below:
Error: No package found with specified pattern: D:\a\r1\a\**\*.zip
Check if the package mentioned in the task is published as an artifact in the build or a previous stage and downloaded in the current job.
azure-pipelines.yml:
# Docker
# Build and push an image to Azure Container Registry
# https://learn.microsoft.com/azure/devops/pipelines/languages/docker

trigger:
- main

resources:
- repo: self

variables:
  # Container registry service connection established during pipeline creation
  dockerRegistryServiceConnection: 'xxxxx'
  imageRepository: 'xxxhelloaspnetcore'
  containerRegistry: 'xxxcontainer01.azurecr.io'
  dockerfilePath: '$(Build.SourcesDirectory)/Dockerfile'
  tag: '$(Build.BuildId)'
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

stages:
- stage: Build
  displayName: Build and push stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: Docker@2
      displayName: Build and push an image to container registry
      inputs:
        command: buildAndPush
        repository: $(imageRepository)
        dockerfile: $(dockerfilePath)
        containerRegistry: $(dockerRegistryServiceConnection)
        tags: |
          $(tag)
    - download: none
    - task: DownloadPipelineArtifact@2
      displayName: 'Download Build Artifacts'
      inputs:
        patterns: '**/*.zip'
        path: '$(Build.ArtifactStagingDirectory)'
    - task: PowerShell@2
      displayName: 'Debug parameters'
      inputs:
        targetType: Inline
        script: |
          Write-Host "$(Build.ArtifactStagingDirectory)"
          Write-Host "$(System.DefaultWorkingDirectory)"
          Write-Host "$(System.ArtifactsDirectory)"
          Write-Host "$(Pipeline.Workspace)"
          Write-Host "$(System.ArtifactsDirectory)"
Your pipeline builds and pushes an image to a container registry, so there are no pipeline artifacts in it. This is why DownloadPipelineArtifact throws the error.
DownloadPipelineArtifact only makes sense if you use PublishPipelineArtifact before it. This doc - Publish and download artifacts in Azure Pipelines - describes it very well.
There is also a way to download artifacts from another pipeline, but it requires a pipeline resource, and you don't have one defined in your pipeline.
So everything works as expected. Can you explain what you actually want to download and what you want to achieve by that?
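For reference, a minimal sketch of that publish/download pairing (the artifact name and paths are placeholders):

```yaml
# In the producing job or stage:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifact: 'drop'

# In a later job or stage of the same run:
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'drop'
    path: '$(Pipeline.Workspace)/drop'
```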

Using ARM Templates from external repository

I'm working with Azure multi-stage pipelines, using deployment jobs with templates in a separate repo. I'm currently starting to use ARM templates in my deployment process and want to run ARM templates that are located in a different repository as well. This is where I get a little stuck; any help/advice is appreciated.
To illustrate my setup:
Repo A -> Source code that has to be build and deployed to azure
Repo B -> Azure pipeline templates (only consists of yml files)
Repo C -> ARM templates
So what I want to accomplish: A uses B uses C.
REPO A: Documentation build and release yml
resources:
  repositories:
  - repository: templates
    type: git
    name: <ACCOUNT>/Azure.Pipelines.Templates
    ref: refs/tags/2.2.40

stages:
- stage: Build
  jobs:
  - template: src/jobs/doc-build.yml@templates
- stage: DEV
  jobs:
  - template: src/deployment-jobs/doc.yml@templates
....
REPO B: Documentation deployment
parameters:
  webAppName: ''
  connectedServiceName: 'DEV'

jobs:
- deployment: doc_deploy
  pool:
    name: 'DOC'
  environment: 'doc'
  strategy:
    runOnce:
      deploy:
        steps:
        - template: ../deployment/arm-template.yml
          parameters:
            connectedServiceName: ${{ parameters.connectedServiceName }}
            resourceGroupName: 'documentation'
            templateFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation.jsonc
            paramFile: $(Build.SourcesDirectory)/Azure.ARM.Templates/src/web-app/documentation-params.json
            parameters: -name ${{ parameters.webAppName }}
...
REPO C: contains arm template + param file
The issue I'm facing is that I can't seem to get to the files of repo C. I tried adding another repository entry at multiple levels, but it does not seem to clone the dependent repo at all.
My current workaround/solution:
Use a PowerShell script to manually clone repo C and directly reference the files on disk.
Related github issue: https://github.com/microsoft/azure-pipelines-yaml/issues/103
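That workaround might look something like the following sketch (the organization/project placeholders and the use of $(System.AccessToken) are assumptions; the pipeline's build service identity needs read access to repo C):

```yaml
- powershell: |
    # Clone the ARM templates repo next to the sources; the access token
    # authenticates as the pipeline's build service identity.
    git clone --depth 1 `
      "https://build:$(System.AccessToken)@dev.azure.com/<ACCOUNT>/<PROJECT>/_git/Azure.ARM.Templates" `
      "$(Build.SourcesDirectory)/Azure.ARM.Templates"
  displayName: 'Clone ARM templates repo (workaround)'
```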
I've also stumbled upon this issue, needing to load ARM templates from another repo into the current build. What I did was set up a build on the repo containing the ARM templates, producing a build artifact with the following azure-pipelines.yml (this would be your repo C):
trigger:
- master

steps:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)/templates'
    ArtifactName: 'templates'
    publishLocation: 'Container'
Afterwards, I could add the following step to the actual pipeline:
- task: DownloadPipelineArtifact@2
  displayName: 'Get ARM Templates'
  inputs:
    buildType: 'specific'
    project: '<YOUR-PROJECT-ID>'
    definition: '<ARM-BUILD-DEFINITION-ID>'
    buildVersionToDownload: 'latest'
    artifactName: 'templates' # must match the ArtifactName published above
    targetPath: '$(Pipeline.Workspace)/templates'
and I was able to access the files as follows:
- task: AzureResourceGroupDeployment@2
  displayName: 'Create Queues $(ResourceGroup.Name)'
  inputs:
    azureSubscription: '<YOUR-SUBSCRIPTION>'
    resourceGroupName: '$(ResourceGroup.Name)'
    location: '$(ResourceGroup.Location)'
    csmFile: '$(Pipeline.Workspace)/templates/servicebus.json'
For more information about the Download Pipeline Artifact task, check out the following link:
Download Pipeline Artifact task