Azure DevOps release pipeline: Angular and .NET Core application

We're trying to release an Angular 7 / .NET Core application into Azure using the DevOps release pipelines. I have my build set up to create the .NET and Angular builds as separate artifacts, which you can see in the screenshots below (under the Package or Folder box).
From what I've read, it seems that you need to create two separate release tasks to deploy the builds to the web app. However, the second deployment seems to be overwriting the first, which is causing the API not to start.
Does anyone know of a way to ensure the deployments in a given stage simply append the changes rather than replacing them? Or is there something else I am missing here?

My recommendation would be to implement the following pattern for your pipeline:
1. 'ng build --prod' the Angular app in its own job, and add the artifacts to your pipeline.
2. 'dotnet publish' the .NET Core API in its own job, running in parallel with the Angular job, and add the artifacts to your pipeline.
3. Append the Angular and .NET Core artifacts together into a new artifact. This serves as your final package to deploy.
4. Deploy the final package.
You're missing step 3, so you'd want something like the following logic defined in YAML, where you create a new artifact that represents your actual deployed bits in your pipeline. Then release that artifact, since it is the representation of what you have running on your instances.
- job: CreateReleaseArtifact
  displayName: 'Package for shared-hosting of angular app and web api'
  pool:
    vmImage: windows-2019
  dependsOn:
  - BuildNetcore
  - BuildAngularApp
  condition: succeeded()
  steps:
  - checkout: none
  - download: current
  - task: CopyFiles@2
    displayName: 'Copy WebApi Files'
    inputs:
      SourceFolder: $(Pipeline.Workspace)/api
      Contents: '**/*'
      TargetFolder: $(Pipeline.Workspace)/package
      includeRootFolder: false
  - task: CopyFiles@2
    displayName: 'Copy Angular Files'
    inputs:
      SourceFolder: $(Pipeline.Workspace)/webapp
      Contents: 'wwwroot/**'
      TargetFolder: $(Pipeline.Workspace)/package
      includeRootFolder: true
      OverWrite: true
  - publish: $(Pipeline.Workspace)/package
    artifact: package
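Step 4 then deploys that single combined artifact. As a minimal sketch of what that job could look like (the service connection name and web app name below are placeholders, not values from the question):

- job: DeployPackage
  displayName: 'Deploy combined package to the web app'
  dependsOn: CreateReleaseArtifact
  pool:
    vmImage: windows-2019
  steps:
  - download: current
    artifact: package
  - task: AzureWebApp@1
    inputs:
      azureSubscription: 'my-azure-service-connection'   # placeholder service connection
      appName: 'my-angular-api-app'                      # placeholder web app name
      package: '$(Pipeline.Workspace)/package'

Because only one deployment runs against the web app, nothing gets overwritten by a second deploy.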

Does anyone know of a way to ensure the deployments in a given stage simply append the changes rather than replacing them?
Based on my experience, in your case, after deploying the API or the Angular 7 app, you could use the Kudu zip API to upload the other one to the Azure WebApp.
You could use the PowerShell task to do that. For more information about PowerShell demo code, you could refer to this link.
If creating another WebApp is acceptable, you could add a new WebApp on the same service plan (no extra cost). Then you could deploy them separately.
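As a rough sketch of the Kudu approach (the app name, zip path, and credential variable below are assumptions; the Kudu zip API extracts a zip into a folder without deleting the files already there):

- task: PowerShell@2
  displayName: 'Upload second package via Kudu zip API'
  inputs:
    targetType: inline
    script: |
      # Placeholder values: replace with your web app name, deployment credentials and zip path
      $appName = 'my-web-app'
      $user = '$my-web-app'              # deployment user from the publish profile
      $pass = "$(kuduDeployPassword)"    # assumed secret pipeline variable
      $pair = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("${user}:${pass}"))
      # PUT the zip so its contents are extracted alongside the already-deployed files
      Invoke-RestMethod -Uri "https://$appName.scm.azurewebsites.net/api/zip/site/wwwroot/" `
        -Method PUT `
        -Headers @{ Authorization = "Basic $pair" } `
        -InFile "$(Pipeline.Workspace)/angular.zip"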

Terraform: Error while loading schemas for plugin components

I have an Azure DevOps Build pipeline that publishes the entire repository as an artifact to be used with the Release pipeline.
# Publish artifacts to be used in release
- task: PublishBuildArtifacts@1
  displayName: 'publish artifacts'
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)'
    ArtifactName: 'TerraformModule'
    publishLocation: 'Container'
The build pipeline triggers a release pipeline where I try to deploy the Terraform configuration.
I can successfully run terraform init in this pipeline, but when I try to run plan or apply, I get the following error:
It looks like it tries to execute the command from /usr/local/bin instead of what I specified in the step, which confuses me. Below is the YAML for my plan step:
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV3@3
  displayName: 'terraform plan'
  inputs:
    provider: aws
    command: plan
    workingDirectory: '/home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod'
    environmentServiceNameAWS: 'AWS-Terraform-Build'
I manually changed workingDirectory to where the artifacts from the build pipeline were downloaded to. See the log below for an example:
2022-08-14T23:41:31.3359557Z Downloaded TerraformModule/Projects/Potentium/Prod/main.tf to /home/vsts/work/r1/a/_terraform/TerraformModule/Projects/Potentium/Prod/main.tf
The plan step in my build pipeline executes without any issues, so I have a feeling it is something to do with the artifact extraction that occurs in the download step. Looking for any advice.
I've had similar issues with the extraction phase when using ExtractFiles@1 to do a similar thing with Terraform. I think there's a bug in it: I could not get it to extract files back to the root of System.DefaultWorkingDirectory unless the root folder was included in the archive (I am using ArchiveFiles@2), so I was ending up with /opt/az_devops/_work/*/s/s.
My solution was to shell out a command to do the extraction. No problems extracting to the root of System.DefaultWorkingDirectory.
Just remember, if you're running a subsequent terraform plan, by default the working directory System.DefaultWorkingDirectory will change between runs, so ensure you use these variables rather than an explicit reference.
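For illustration, the shell-out can be a simple script step like the one below (a sketch; the artifact alias and archive name are assumptions, so substitute the ones your ArchiveFiles@2 task produces):

- task: Bash@3
  displayName: 'Extract archive into the working directory'
  inputs:
    targetType: inline
    script: |
      # "_terraform" is an assumed artifact alias and "TerraformModule.zip" an assumed archive name
      unzip -o "$(System.DefaultWorkingDirectory)/_terraform/TerraformModule.zip" \
            -d "$(System.DefaultWorkingDirectory)"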

Single WCF project will not create build artifact under YAML Azure DevOps Build Pipeline

Our build pipeline includes a YML template that is used to build all of our WCF services and Web applications. For all of the WCF services but one, it works like a charm. For this one WCF service, however, the following output is generated in the build logs during the Publish Artifact stage:
##[warning]Directory 'D:\azagent\A2_work\1381\a' is empty. Nothing will be added to build artifact 'drop'.
Our Build stage invokes a separate YML file which includes the following to build and publish the project:
- task: VSBuild@1
  displayName: "Build ${{ parameters.solution }}"
  inputs:
    solution: ${{ parameters.solution }}
    msbuildArgs: >
      /p:DeployOnBuild=true
      /p:WebPublishMethod=Package
      /p:PackageAsSingleFile=true
      /p:SkipInvalidConfigurations=true
      /p:IgnoreDeployManagedRuntimeVersion=true
      /p:PackageLocation="$(build.ArtifactStagingDirectory)"
    platform: ${{ parameters.buildPlatform }}
    configuration: ${{ parameters.buildConfiguration }}
- task: PublishBuildArtifacts@1
  displayName: "Publish Build Artifact"
  inputs:
    PathtoPublish: "$(Build.ArtifactStagingDirectory)"
    ArtifactName: "drop"
    publishLocation: "Container"
As stated earlier, this works perfectly for all other WCF projects, and generates a build artifact. However, for the problematic WCF Service, no build artifact is generated. What we've observed is that no ZIP file is created in the D:\azagent\A2_work\1381\a folder (theoretically, Build.ArtifactStagingDirectory).
I have tried numerous recommended solutions to resolve this issue, all to no effect.
Adding a CopyFiles@2 task between the VSBuild@1 task and the PublishBuildArtifacts@1 task did place the files in the Build.ArtifactStagingDirectory, but they were not in a ZIP file. Further, the deployment task (later in the pipeline) failed because no ZIP file was present in the drop folder.
Adding /p:OutDir=$(Build.ArtifactStagingDirectory) did seem to produce some sort of artifact, but the deployment task still failed, claiming it could not find the ZIP file in the drop folder.
I created a test repo that contained only the WCF project (as it's normally contained in a solution containing it and a Web application) and ran the pipeline against that repo. No build artifact was created.
Ultimately, nothing I do seems to be able to get this project to generate a build artifact.
What am I missing here? What further information can I provide that will help you to help me resolve this issue?
According to Microsoft's Developer Community, this has been a known issue since August of 2019. WCF services built using a YAML pipeline do not produce build artifacts. Consequently, they cannot be deployed via YAML pipelines.
As of this date, there is neither a fix nor a workaround available from Microsoft.

Azure pipeline: multi-stage YAML pipeline using the same work directory on the build server. How does it not corrupt?

Clarifications and corrections:
Testing with one self-hosted agent.
By version I mean the version of the application, or any new commit.
The same work directory is being used for builds of different commits, when there are still pending stages (requiring approval) in multiple build runs.
We have an Azure DevOps YAML pipeline with multiple stages and approvers. I noticed that running different build versions of the same pipeline uses the same work directory on the build server.
How does this not cause corruption of content, for example if the pipeline runs simultaneously for different build versions?
For example, what if the newer pipeline run checks out source code while the other run is building and creating artifacts for its own version? I have checked the current path for two concurrent builds and it is the same.
Here is what will happen if you run two commits on the same agent.
Here is my example of a multi-stage pipeline:
pool: Default

stages:
- stage: A
  jobs:
  - job: A
    steps:
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Build.SourcesDirectory)'
        artifact: 'drop'
        publishLocation: 'pipeline'
- stage: B
  jobs:
  - deployment: DeployWeb
    displayName: deploy Web App
    pool: Default
    workspace:
      clean: all
    environment: 'env'
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: CopyFiles@2
            inputs:
              SourceFolder: '$(Build.SourcesDirectory)'
              Contents: '**'
              TargetFolder: '$(Build.ArtifactStagingDirectory)'
I added an approval check in the environment. My running order is stage A (commit1) -> stage A (commit2) -> stage B (commit1) -> stage B (commit2).
stage A (commit1):
This job will check out the source code of commit1 and publish the files in the Sources Directory of commit1.
stage A (commit2):
This job will check out the source code of commit2 and publish the files in the Sources Directory of commit2.
stage B (commit1):
It is a deployment job and will not check out source code by default.
The deployment job will download the artifact of commit1 as expected.
If I don't clean the workspace, it will continue to use the source code of commit2. This may cause some issues.
If I add a checkout step in this stage, it will check out the source of commit1.
So you can add a checkout step and clean the workspace in your deployment jobs. Non-deployment jobs automatically check out source code and will use the correct source.

How to generate DACPAC file

I'm trying to deploy my project in Azure DevOps through IIS website and SQL deployment. However, I am struggling with deploying the SQL part as I do not have a .dacpac file in my build artifacts. How do I generate this? Every option I have tried ended up with the process failing.
P.S. I do not have access to the VM where I am deploying due to restrictions. I can access the database as I am marked as DBO on the machine.
My question is also: do I need to generate this DACPAC file through the build every time, or can it be generated only once, stored on the machine, and pointed to from the deployment process?
Thank you for your help!
However, I am struggling with deploying the SQL part as I do not have a .dacpac file in my build artifacts. How do I generate this? Every option I have tried ended up with the process failing. I can access the database as I am marked as DBO on the machine.
First, you have to create a SQL Server Database Project using SSDT (or the Azure Data Studio insiders preview) by importing the objects of the live database.
The database project is then placed into a repository.
The pipeline (classic or YAML) needs an MSBuild@1 build task. Here is a YAML example; it generates the dacpac:
- task: MSBuild@1
  displayName: 'Build solution YourDatabase.sln'
  inputs:
    solution: 'src/YourDatabase.sln'
This task compiles the database project and produces dacpac file(s).
The produced files are then extracted:
- task: CopyFiles@2
  displayName: 'Extract DACPACs'
  inputs:
    CleanTargetFolder: false
    SourceFolder: '$(agent.builddirectory)\s\src\YourDatabase\bin\Debug\'
    Contents: '*.dacpac'
    TargetFolder: '$(build.artifactstagingdirectory)'
And finally, they are published as an artifact:
- task: PublishPipelineArtifact@1
  displayName: 'Publish Artifact'
  inputs:
    targetPath: '$(build.artifactstagingdirectory)'
    artifact: 'drop'
Deployment of the dacpac is the final goal and can be done using SqlDacpacDeploymentOnMachineGroup@0; however, this is out of the scope of the original question.
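For reference, a minimal sketch of such a deployment step might look like the following (the server, database, and dacpac path are placeholders, not values from the question):

- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: 'Deploy DACPAC'
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '$(System.DefaultWorkingDirectory)/drop/YourDatabase.dacpac'  # placeholder path
    TargetMethod: 'server'
    ServerName: 'localhost'        # placeholder server
    DatabaseName: 'YourDatabase'   # placeholder database
    AuthScheme: 'windowsAuthentication'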
My question is also: do I need to generate this DACPAC file through the build every time, or can it be generated only once, stored on the machine, and pointed to from the deployment process?
It depends.
Classic pipelines have a separation of BUILD and RELEASE phases. In this case, you can build it once and reuse that dacpac for many future releases.
In the case of multi-stage YAML pipelines, it is common that every pipeline run triggers the build and deployment stages, because they still belong to the same pipeline and run as a single unit of work.

How to Publish a ClickOnce application with Azure DevOps Pipeline on different environments?

I have been trying for several days now to publish my ClickOnce application with an Azure DevOps pipeline. Before going into detail, here is what I would like to do from my release view:
I started with one artifact and two release stages, modifying the config.deploy file with staging variables during my Staging stage and with production variables during my Production stage. Deployment was working fine, but installation of the application was not working because of the hash check system.
So I decided to create two builds with two artifacts. I renamed the classic drop to drop_staging for my first build and drop_production for my second build. I was hoping the build system (MSBuild) would be able to select the correct app.Debug.config or app.Release.config file during the build and publish process.
Here is my build definition
Here are my build arguments:
/target:publish
/p:ApplicationVersion=$(Build.BuildNumber)
/p:PublishURL=http://app-staging.example.com/
/p:UpdateEnabled=true
/p:UpdateMode=Foreground
/p:ProductName="App Staging"
/p:OutputPath="$(build.ArtifactStagingDirectory)\Publish\\"
Configuration is set to Staging for the first build and to Production for the second build. I have, of course, a Staging and a Production build configuration in Visual Studio. I have an app.config with app.Staging.config and app.Production.config in my project.
I cannot simply add a task to transform my config file after the build because that would not respect the hash. I need to find a way to tell my build to use the correct XML transformation config file. I don't see any other solution, except maybe applying this transformation before the build? Is that possible? What are your solutions?
Finally, I could solve this by adding a file transform before my build.
In case you need more help, here is the YAML detail for the transformation:
steps:
- task: FileTransform@1
  displayName: 'File Transform: '
  inputs:
    folderPath: App.Example
    enableXmlTransform: true
    xmlTransformationRules: '-transform **\*.Staging.config -xml **\*.config'
    fileType: xml

# Your build pipeline references the 'BuildPlatform' variable, which you've selected to be settable at queue time. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab, and then select the option to make it settable at queue time. See https://go.microsoft.com/fwlink/?linkid=865971
steps:
- task: VSBuild@1
  displayName: 'Build solution'
  inputs:
    solution: Example.sln
    msbuildArgs: '/target:publish /p:ApplicationVersion=$(Build.BuildNumber) /p:PublishURL=http://staging.example.com/ /p:UpdateEnabled=true /p:UpdateMode=Foreground /p:ProductName="App Staging" /p:OutputPath="$(build.ArtifactStagingDirectory)\Publish\\"'
    platform: '$(BuildPlatform)'
    configuration: Staging
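For the Production build, the same pattern would presumably apply, pointing the transform at the Production config instead (a sketch based on the file names mentioned in the question):

- task: FileTransform@1
  displayName: 'File Transform (Production)'
  inputs:
    folderPath: App.Example
    enableXmlTransform: true
    xmlTransformationRules: '-transform **\*.Production.config -xml **\*.config'
    fileType: xml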
To add to the build solution stage, you can use your Visual Studio publish profile in the msbuildArgs, as shown below. Please note this doesn't do the version incrementing for you:
- task: VSBuild@1
  displayName: 'Publish Project'
  inputs:
    solution: '$(projectSolution)'
    msbuildArgs: '/target:publish /p:ApplicationRevision=$(applicationRevision) /p:PublishProfile="Application/PublishProfiles/HerbalPublishProfile.pubxml" /p:PublishDir="$(build.ArtifactStagingDirectory)\Publish\\"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
    msbuildArchitecture: x64