Get a SQL Server (Express) during a pipeline build in Azure DevOps - azure-devops

I'm setting up a pipeline for an ASP.NET application. During the integration tests task I need to connect to a SQL Server. How can I tell the pipeline that I need a SQL service?
I have tried multiple Microsoft-hosted agent pools (Windows Server 1803, Hosted 2017 & 2019).
With Windows Server 1803 the issue is:
The operating system of the container does not match the operating system of the host.
I would like to correctly set up a temporary SQL Server for running the tests.
I have used LocalDB instead.
I run this script before my integration tests task:
SqlLocalDB.exe create "DeptLocalDB"
SqlLocalDB.exe share "DeptLocalDB" "DeptSharedLocalDB"
SqlLocalDB.exe start "DeptLocalDB"
SqlLocalDB.exe info "DeptLocalDB"
To connect with PowerShell: Invoke-Sqlcmd -Query "SELECT GETDATE() AS TimeOfQuery;" -ServerInstance "(localdb)\.\DeptSharedLocalDB"
To connect with sqlcmd: sqlcmd -S (localdb)\.\DeptSharedLocalDB
To connect a C# app (connectionString): "Data Source=(localdb)\.\DeptSharedLocalDB;Initial Catalog=DeptLocalDB;Integrated Security=True;"
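In the pipeline itself I run those commands as a single script step, roughly like this (a sketch assuming a Windows hosted agent; the display name is arbitrary):
- script: |
    SqlLocalDB.exe create "DeptLocalDB"
    SqlLocalDB.exe share "DeptLocalDB" "DeptSharedLocalDB"
    SqlLocalDB.exe start "DeptLocalDB"
    SqlLocalDB.exe info "DeptLocalDB"
  displayName: 'Create and start LocalDB instance'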
If someone knows how to mount a SQL Server in a container on an Azure pipeline, it would be appreciated. Thanks for reading.

Chocolatey is installed on windows-latest.
So, if you define the following in your YAML file:
pool:
  vmImage: windows-latest
you can then install SQL Server Express using choco:
- script: choco install sql-server-express
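To make sure the instance is reachable before the tests run, a quick sanity-check step could look like this (a sketch; .\SQLEXPRESS is the instance name the other answers here connect to after the Chocolatey install):
- script: sqlcmd -S .\SQLEXPRESS -Q "SELECT @@VERSION"
  displayName: 'Verify SQL Server Express is up'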

azure-pipelines.yml:
pool:
  vmImage: 'windows-latest'
...
- task: PowerShell@2
  displayName: 'start mssqllocaldb'
  inputs:
    targetType: 'inline'
    script: 'sqllocaldb start mssqllocaldb'
After that, the following connection string is valid:
Data Source=(localdb)\\MSSQLLocalDB;Initial Catalog=my-db;Integrated Security=True;
Source:
https://www.jannikbuschke.de/blog/azure-devops-enable-mssqllocaldb/
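A quick way to confirm the instance responds from the pipeline is an extra script step (a sketch; it reuses sqlcmd and the query from the question above):
- script: sqlcmd -S "(localdb)\MSSQLLocalDB" -Q "SELECT GETDATE() AS TimeOfQuery;"
  displayName: 'Verify LocalDB instance'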

In my case (some .NET Core integration tests that needed LocalDB), just using the windows-latest image did the job:
pool:
  vmImage: 'windows-latest'
The image comes with Visual Studio which includes LocalDB.
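If the instance is not already running, a start step before the tests doesn't hurt (a sketch mirroring the previous answer):
- script: sqllocaldb start mssqllocaldb
  displayName: 'Start LocalDB'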

Here is a full example of how I achieved running integration tests using SQL Express in Azure DevOps. I had to use the YAML-based pipelines so I could use simonauner's approach of installing SQL Express with Chocolatey. Note that I had to install the EF Core tools, since I use .NET Core 3.1 in this pipeline. Also note that I generate an EF Code First migration SQL file on the fly so that the rigged SQL Express instance is filled with contents.
Deploy a SQL Express instance in Azure DevOps, then generate and run an EF Code First migration SQL script to update the database with schema and seed data using the EF Code First tools.
# ASP.NET Core (.NET Framework)
# Build and test ASP.NET Core projects targeting the full .NET Framework.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/dotnet-core

trigger:
- feature/testability

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- script: choco install sql-server-express

- task: NuGetToolInstaller@1

- task: VisualStudioTestPlatformInstaller@1
  displayName: 'Visual Studio Test Platform Installer'
  inputs:
    versionSelector: latestStable

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    command: build
    projects: '**/*.csproj'
    arguments: '--configuration Debug' # Update this to match your need

- script: 'dotnet tool install --global dotnet-ef'
  displayName: 'Install dotnet-ef (EF Code First tools)'

- script: 'dotnet ef migrations script -i -o %BUILD_ARTIFACTSTAGINGDIRECTORY%\migrate.sql --project .\SomeAcme\SomeAcme.csproj'
  displayName: 'Generate EF Code First Migrations migrate.sql'

- script: 'sqlcmd -S .\SQLEXPRESS -Q "CREATE DATABASE [SomeAcmeDb]"'
  displayName: 'Create database SomeAcmeDb in Azure DevOps SQL Express'

- script: 'sqlcmd -i %BUILD_ARTIFACTSTAGINGDIRECTORY%\migrate.sql -S .\SQLEXPRESS -d SomeAcmeDb'
  displayName: 'Run migrate.sql on SQL Express in Azure DevOps'

# PowerShell
# Run a PowerShell script on Linux, macOS, or Windows
- task: PowerShell@2
  inputs:
    targetType: 'inline' # Optional. Options: filePath, inline
    #filePath: # Required when targetType == FilePath
    #arguments: # Optional
    script: 'gci -recurse -filter *.dll' # Required when targetType == Inline
    #errorActionPreference: 'stop' # Optional. Options: stop, continue, silentlyContinue
    #failOnStderr: false # Optional
    #ignoreLASTEXITCODE: false # Optional
    #pwsh: false # Optional
    #workingDirectory: # Optional

- task: VSTest@2
  displayName: 'VsTest - testAssemblies'
  inputs:
    testAssemblyVer2: |
     **\*SomeAcme.Tests.dll
     !**\*TestAdapter.dll
     !**\obj\**
    vsTestVersion: toolsInstaller
    testFiltercriteria: 'Category=IntegrationTest'
    runInParallel: false
    codeCoverageEnabled: false
    testRunTitle: 'XUnit tests SomeAcme solution integration test starting'
    failOnMinTestsNotRun: true
    rerunFailedTests: false

I need LocalDB in order to run database unit tests, and I also require at least SQL Server 2016 SP2 because my team is using some of the newer language features.
On a windows-2019 image, simply use choco or PowerShell to execute the following command:
choco upgrade sqllocaldb
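As a pipeline step this could look roughly like the following (a sketch assuming the windows-2019 hosted image; the display name is arbitrary):
pool:
  vmImage: 'windows-2019'
steps:
- script: choco upgrade sqllocaldb
  displayName: 'Install/upgrade SQL Server LocalDB'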


Azure Pipeline - DevOps19 + VS22: Building Solution with two different projects

This post will be a bit longer, as I not only describe my problem but also show my different attempts to solve it.
I have a solution containing a .NET 6 Web API project (csproj) and a C++/CLI wrapper project (vcxproj). The C# project references the C++ project. I use DevOps 2019 and VS 2022 on my local build agent.
I'm not able to successfully build and publish this solution through an Azure DevOps pipeline using the DotNetCoreCLI@2 or VSBuild@1 task, or a custom script as a workaround for MSBuild@1.
VSBuild
My initial approach was to simply use the VSBuild@1 task. With this task the pipeline does not even start, failing with the following error:
##[Error 1]
No agent found in pool My_Pool which satisfies the specified demands:
agent.name -equals My_Agend_Unity_1
Cmd
msbuild
visualstudio
Agent.Version -gtVersion 2.153.1
The cause is a compatibility issue between DevOps 2019 and VS 2022: the agent does not recognize VS 2022 and therefore does not create system capabilities for it. It's the same issue for MSBuild@1, and the reason I tried a custom script as a workaround, because the task couldn't find MSBuild.
DotNetCoreCLI
The first error I got was:
error MSB4019: The imported project "C:\Microsoft.Cpp.Default.props" was not found. Confirm that the expression in the Import declaration "\Microsoft.Cpp.Default.props" is correct, and that the file exists on disk.
So I fixed that by adding the env variable to the task:
env:
  PATH: 'C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170'
The resulting further error was:
##[error]Error: Unable to locate executable file: 'dotnet'. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also verify the file has a valid extension for an executable file.
So I tried to fix it by using the UseDotNet@2 task, even though that doesn't make sense to me. But in the end I still get an error similar to the first one:
MSBuild version 17.3.2+561848881 for .NET
C:\agent\_work\2\s\XXX\YYY\CPPWrapper\MyProject.vcxproj : warning NU1503: Skipping restore for project "C:\agent\_work\2\s\XXX\YYY\CPPWrapper\MyProject.vcxproj". The project file may be invalid or missing targets required for restore. [C:\agent\_work\2\s\XXX\YYY\MySolution.sln]
Determining projects to restore...
"C:\agent\_work\2\s\XXX\YYY\DotNet6Project\MyProject.csproj" restored (in "2,4 sec").
C:\agent\_work\2\s\XXX\YYY\CPPWrapper\MyProject.vcxproj(21,3):error MSB4019: The imported project "C:\Microsoft.Cpp.Default.props" was not found. Confirm that the expression in the Import declaration "\Microsoft.Cpp.Default.props" is correct, and that the file exists on disk.
C:\agent\_work\2\s\XXX\YYY\CPPWrapper\MyProject.vcxproj(21,3): error MSB4019: The imported project "C:\Microsoft.Cpp.Default.props" was not found. Confirm that the expression in the Import declaration "\Microsoft.Cpp.Default.props" is correct, and that the file exists on disk.
##[error]Error: The process 'C:\agent\_work\_tool\dotnet\dotnet.exe' failed with exit code 1
##[error]Dotnet command failed with non-zero exit code on the following projects : C:\agent\_work\2\s\XXX\YYY\MySolution.sln
##[section]Finishing: Build & Publish XXX Service - DotNetCoreCLI@2
MSBuild
My last hope was my custom script, which I already use in another pipeline that runs on the same agent and uses MSBuild from VS 2022. This is the approach I've gotten furthest with: the project appears to build fine, but then fails with this error:
(ResolvePackageAssets Target) -> C:\Program Files\dotnet\sdk\7.0.101\Sdks\Microsoft.NET.Sdk\targets\Microsoft.PackageDependencyResolution.targets(267,5):
error NETSDK1064: Package "Microsoft.EntityFrameworkCore.Analyzers",
Version 6.0.4, not found. It may have been deleted after the NuGet restore.
Otherwise, the NuGet restore may have been only partially completed due to limitations on the maximum path length.
[C:\agent\_work\2\s\XXX\YYY\DotNet6Project\MyProject.csproj]
I don't know how to proceed from here. I have already enabled long paths via Group Policy Editor → Administrative Templates → All Settings → Enable Win32 long paths.
My YAML file:
pool:
  name: 'My_Pool'
  demands:
  - agent.name -equals My_Agent

variables:
  buildPlatform: 'x64'
  buildConfiguration: 'Release'
  solution: '$(System.DefaultWorkingDirectory)/XXX/YYY/MySolution.sln'
  DotNet6Project: '$(System.DefaultWorkingDirectory)/XXX/YYY/DotNet6Project/MyProject.csproj'
  CPPWrapper: '$(System.DefaultWorkingDirectory)/XXX/YYY/CPPWrapper/MyProject.vcxproj'

steps:
- task: NuGetToolInstaller@0
  displayName: 'NuGet Tool Installer - NuGetToolInstaller@0'
  name: 'NuGetToolInstaller'
  inputs:
    versionSpec: '>=6.1.0'

- task: NuGetCommand@2
  displayName: 'NuGet Restore - NuGetCommand@2'
  inputs:
    command: 'restore'
    restoreSolution: '$(solution)'
    noCache: true

- task: BatchScript@1
  displayName: 'Run BatchScript to create DLLs, Libs & Header - BatchScript@1'
  inputs:
    filename: '$(System.DefaultWorkingDirectory)/ICP/ZZZ/build_release.bat'
  env:
    PATH: 'C:\Program Files\CMake\bin'

- task: PowerShell@2
  displayName: 'Run Powershell Script to unpack Packages from BatchScript for ZZZWrapper - PowerShell@2'
  inputs:
    filePath: '$(System.DefaultWorkingDirectory)/XXX/YYY/CPPWrapper/install_ZZZ_package.ps1'

# Workaround for MSBuild@1
- script: |
    @echo off
    setlocal enabledelayedexpansion
    for /f "usebackq tokens=*" %%i in (`"!ProgramFiles(x86)!\Microsoft Visual Studio\Installer\vswhere.exe" -latest -products * -requires Microsoft.Component.MSBuild -find MSBuild\**\Bin\MSBuild.exe`) do (set msbuild_exe=%%i)
    "!msbuild_exe!" "$(solution)" /p:Configuration="$(buildConfiguration)" /p:Platform="$(buildPlatform)" /p:PackageLocation="$(build.artifactStagingDirectory)" /t:rebuild
  displayName: 'Build - Script'

# ---------- VSBuild ----------------
#- task: VSBuild@1
#  inputs:
#    solution: '$(solution)'
#    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
#    platform: '$(buildPlatform)'
#    configuration: '$(buildConfiguration)'

# ---------- DotNetCoreCLI ----------
#- task: UseDotNet@2
#  inputs:
#    packageType: 'sdk'
#    version: '6.x'
#- task: DotNetCoreCLI@2
#  displayName: 'Build & Publish - DotNetCoreCLI@2'
#  inputs:
#    command: 'publish'
#    publishWebProjects: false
#    projects: '$(solution)'
#    arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'
#    zipAfterPublish: false
#  env:
#    PATH: 'C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Build Artifacts - PublishBuildArtifacts@1'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'XXXArtifact'
    publishLocation: 'Container'
From your YAML file, I understand you want to build a solution and then publish a build artifact. Based on the errors from the tasks you described, I'd like to offer some suggestions you can check.
1 VSBuild@1
Error information: No agent found in pool My_Pool which satisfies the specified demands. This indicates there are no agents on your machine that meet the demand requirements. You should check whether an agent named My_Agend_Unity_1 exists and whether it satisfies the demands Cmd, msbuild, visualstudio, and Agent.Version -gtVersion 2.153.1.
For more information, refer to the doc: pool definition
2 UseDotNet@2
There's a warning, NU1503: Skipping restore for project, which indicates the packages required for the project MyProject are not restored correctly. You should edit the affected project to add the targets required for restore.
Please refer to the doc: NuGet Warning NU1503
About the error MSB4019, you should check whether the path "C:\Microsoft.Cpp.Default.props" exists. Here's a ticket similar to your issue. You can try this workaround and see if it works.
3 MSBuild
About error NETSDK1064: this error occurs when the build tools can't find a NuGet package that's needed to build a project. This is typically due to a package restore issue related to warning NU1503 in the UseDotNet@2 task. You can refer to the doc NETSDK1064: Package not found for the actions provided to resolve this error.

dotnet test is not running on csproj

We are running a local Azure DevOps server. Since the last Microsoft patch day, the dotnet test task fails.
When I start dotnet test directly on the server it also fails with the same csproj file. If I start dotnet test without specifying the csproj file, but in the same folder, it runs and the tests succeed. When I start it from the Visual Studio developer command prompt, it runs as well. Any ideas?
And here is the YAML:
steps:
- task: DotNetCoreCLI@2
  displayName: 'dotnet test '
  inputs:
    command: test
    projects: |
     **/Test.csproj
    arguments: '-c Release'
    testRunTitle: 'Test'
  enabled: false
  continueOnError: true
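As a plain script step, the variant that works locally (running dotnet test from the project folder without naming the csproj) would look roughly like this (a sketch; the workingDirectory value is a placeholder):
- script: dotnet test -c Release
  workingDirectory: '$(System.DefaultWorkingDirectory)/path/to/test/project'
  displayName: 'dotnet test (folder-based)'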

How to generate EF Core migrations script when ConnectionString is only known after ARM template deployment?

I want to release an app to Azure and deploy migrations to a database before deploying the Web App. That sounds relatively simple: you can create a migrations.sql script with dotnet-ef in your Build pipeline and apply this script in your Release pipeline.
However, I cannot create a migrations.sql script in the Build pipeline, as I am using four different databases for a DTAP environment. Thus, I would need to generate a migrations.sql script per environment and run each one separately against its database (as I understand it).
In my Release pipeline I use an incremental ARM template to deploy resources and set the ConnectionString (which comes from an Azure Key Vault) in the Azure Web App application settings configuration.
How/where do I generate the migrations.sql script? Do I do this in a Release pipeline? Am I making a major mistake in my reasoning?
EDIT:
Thanks to Madej's answer, which shows that the environment doesn't matter. I tried implementing the creation of the migrations.sql script in my pipeline:
# ASP.NET Core (.NET Framework)
# Build and test ASP.NET Core projects targeting the full .NET Framework.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/dotnet-core

trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  projects: '**/*.csproj'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: DotNetCoreCLI@2
  displayName: "Install dotnet-ef"
  inputs:
    command: 'custom'
    custom: 'tool'
    arguments: 'install --global dotnet-ef'

- task: DotNetCoreCLI@2
  displayName: "Restore tools"
  inputs:
    command: 'custom'
    custom: 'tool'
    arguments: 'restore'

- task: DotNetCoreCLI@2
  displayName: "Restore"
  inputs:
    command: 'restore'
    projects: '$(projects)'
    feedsToUse: 'select'

- task: DotNetCoreCLI@2
  displayName: "Build"
  inputs:
    command: 'build'
    projects: '$(projects)'
    arguments: '--configuration $(BuildConfiguration)'

- task: DotNetCoreCLI@2
  displayName: "Create migrations.sql"
  inputs:
    command: 'custom'
    custom: 'ef'
    arguments: 'migrations script --configuration $(BuildConfiguration) --no-build --idempotent --output $(Build.ArtifactStagingDirectory)\migrations.sql'
    workingDirectory: 'WebApi.api'

- task: DotNetCoreCLI@2
  displayName: "Publish"
  inputs:
    command: 'publish'
    publishWebProjects: true
    arguments: '--configuration $(BuildConfiguration) --output $(Build.ArtifactStagingDirectory)'
    zipAfterPublish: false

- task: PublishBuildArtifacts@1
  displayName: "Publish to Azure Pipelines"
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
My pipeline doesn't work; in the "Create migrations.sql" task I run into the following error:
An error occurred while accessing the Microsoft.Extensions.Hosting services. Continuing without the application service provider. Error: DefaultAzureCredential failed to retrieve a token from the included credentials.
- EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
- ManagedIdentityCredential authentication unavailable. No Managed Identity endpoint found.
- Visual Studio Token provider can't be accessed at C:\Users\VssAdministrator\AppData\Local\.IdentityService\AzureServiceAuth\tokenprovider.json
- Stored credentials not found. Need to authenticate user in VSCode Azure Account.
- Please run 'az login' to set up account
This is because in my Program.cs I add a Key Vault and authenticate with the Azure.Identity DefaultAzureCredential as follows:
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.ConfigureAppConfiguration((hostingContext, config) =>
            {
                var settings = config.Build();
                var credentials = new DefaultAzureCredential(
                    new DefaultAzureCredentialOptions()
                    {
                        ExcludeSharedTokenCacheCredential = true,
                        VisualStudioTenantId = settings["VisualStudioTenantId"],
                    }
                );
                config.AddAzureKeyVault(new Uri(settings["KeyVault:Endpoint"]), credentials).Build();
            })
            .UseStartup<Startup>();
        });
The Azure Pipeline cannot get a token from DefaultAzureCredential. How do I authenticate the pipeline?
I have figured out the solution to the problem in my edit. The primary way the DefaultAzureCredential class gets credentials is via environment variables.
Thus, I had to define the environment variables somewhere. I didn't want to do this in the pipeline variables, to avoid having to manage them, since they should already be available to the project in the form of a service connection to Azure.
I did the following:
In my pipeline I added an AzureCLI task to read out the service principal id, key, and tenant id and set them as job variables, as follows:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<subscription>'
    scriptType: 'ps'
    scriptLocation: 'inlineScript'
    inlineScript: |
      Write-Host '##vso[task.setvariable variable=AZURE_CLIENT_ID]'$env:servicePrincipalId
      Write-Host '##vso[task.setvariable variable=AZURE_CLIENT_SECRET]'$env:servicePrincipalKey
      Write-Host '##vso[task.setvariable variable=AZURE_TENANT_ID]'$env:tenantId
    addSpnToEnvironment: true
In my "Create migrations.sql" task pass these variables as environment variables as follows:
- task: DotNetCoreCLI@2
  displayName: "Create migrations.sql"
  inputs:
    command: 'custom'
    custom: 'ef'
    arguments: 'migrations script --configuration $(BuildConfiguration) --no-build --idempotent --output $(Build.ArtifactStagingDirectory)\migrations.sql'
    workingDirectory: 'WebApi.api'
  env:
    AZURE_CLIENT_ID: $(AZURE_CLIENT_ID)
    AZURE_CLIENT_SECRET: $(AZURE_CLIENT_SECRET)
    AZURE_TENANT_ID: $(AZURE_TENANT_ID)
I added the service principal to the Azure Key Vault RBAC as a Key Vault Secrets User. I could only do this with az:
az role assignment create --role 'Key Vault Secrets User (preview)' --scope '/subscriptions/<subscription ID>/resourcegroups/<resource group name>/providers/Microsoft.KeyVault/vaults/<vault name>' --assignee '<service principal object id>'
This completely solved my problem without having to manage any more secrets/variables, as they are all contained in the pipeline itself and don't pose any security threats.
You can do this in a build pipeline because the migrations.sql script checks whether each specific migration has already been applied.
To create the migration script when you use Azure Key Vault in your configuration, the easiest way is to run the command from an Azure CLI task:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'rg-tcm-si'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: 'dotnet ef migrations script --configuration $(BuildConfiguration) --no-build --idempotent --output $(Build.ArtifactStagingDirectory)\migrations.sql'
    workingDirectory: 'Itan.Database'
Before that you need to add get and list permissions to the service principal behind your service connection:
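One way to grant those permissions is via the Azure CLI (a sketch; the vault name and service principal app id are placeholders):
az keyvault set-policy --name <vault-name> --spn <service-principal-app-id> --secret-permissions get list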
Then, even if you need to deploy the same script to different environments/databases, it is fine as long as they haven't drifted. So if you make all changes through EF Core, you can generate migrations.sql once and apply it many times.
In the database you should have a __EFMigrationsHistory table, which contains the already applied migrations. And then in the script you will find:
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20200101111512_InitialCreate')
BEGIN
    CREATE TABLE [SomeTable] (
        [Id] uniqueidentifier NOT NULL,
        [StorageDate] datetime2 NOT NULL,
        .....
    );
END;
GO
Thus you are safe to run it against multiple databases.
And then to deploy, you can use:
steps:
- task: SqlAzureDacpacDeployment@1
  displayName: 'Azure SQL SqlTask'
  inputs:
    azureSubscription: 'YourSubscription'
    ServerName: 'YourServerName'
    DatabaseName: 'YourDatabaseName'
    SqlUsername: UserName
    SqlPassword: '$(SqlServerPassword)'
    deployType: SqlTask
    SqlFile: '$(System.DefaultWorkingDirectory)/staging/drop/migrations.sql'

Azure CI and CD with NBGV

We're using NB.GV (nbgv) in our CI pipeline like this:
- task: DotNetCoreCLI@2
  inputs:
    command: custom
    custom: tool
    arguments: install --tool-path . nbgv --ignore-failed-sources
  displayName: Install NBGV tool

- script: nbgv cloud -c -a
  displayName: Set Version
It sets the version correctly, so in further tasks we're able to use the version variables (e.g. $(GitBuildVersion)).
The problem comes when we're trying to set up a CD pipeline based on this article. There we need to read $(Build.BuildNumber), which has a different value than expected. Based on the official documentation it should be e.g. 1.3.1.57621, but we're getting 1.3.1+g15e1898f47.
It seems like AssemblyInformationalVersion and BuildVersion got exchanged.
We have set setVersionVariables: true in version.json.
Thanks for any help in advance
You may add this after running nbgv to update your BuildNumber:
- powershell: Write-Host "##vso[build.updatebuildnumber]$(GitBuildVersion)"
  displayName: 'Update build number to $(GitBuildVersion)'
According to this, you should then get the expected value.
You may also check what variables you have by running - bash: env | sort

Publishing VS Code extension via Azure DevOps

After reading the VSCode Publish Extension docs, I've succeeded in publishing a VSCode extension manually with vsce.
I'm wondering if there is a way to publish extensions automatically via Azure DevOps pipelines (build or release) instead of doing it manually.
I've tried to use vsce there but I'm getting an authentication error
Resource not available for anonymous access. Client authentication required.
Using vsce publish -p <access_token> is not possible because the pipeline is public and everyone can see the access token...
So, is there a way to publish a Visual Studio Code extension automatically via Azure DevOps Pipeline or even Travis CI?
You can add the Personal Access Token as a secret variable; then nobody can see it.
Go to your pipeline in Azure DevOps and click on "Edit", then click on "Variables":
Now click on the + icon and add the variable, and mark the checkbox "Keep this value secret":
Now you can use it in this way: $(PAT), for example:
vsce publish -p $(PAT)
The variable value will not appear in the YAML :)
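Wired into the pipeline, this could look roughly like the following step (a sketch; it assumes vsce is available on the agent, e.g. installed via npm, and that the secret variable is named PAT as above):
- script: |
    npm install -g vsce
    vsce publish -p $(PAT)
  displayName: 'Publish extension to the Marketplace'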
Is there a way to publish a Visual Studio Code extension automatically via Azure DevOps Pipeline?
Of course, yes!
To have a good CI/CD experience in Azure DevOps, I recommend you store the source code in Azure DevOps or GitHub.
Build \ CI
In the build, most of the work is updating the version in the VSIX manifest and building/creating the package. To increment the version, I use the counter expression feature supported in VSTS:
counter('name', seed)
Use this expression in the variable declaration block. For the detailed and complete build process, refer to my sample YAML code:
trigger:
- '*'

pool:
  vmImage: 'windows-2019'

variables:
  VersionPatch: $[counter('versioncount', 24)]
  solution: '**/*.sln'
  BuildPlatform: 'Any CPU'
  BuildConfiguration: 'Release'

name: 2.0.$(VersionPatch)

steps:
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '3.0.100'
    includePreviewVersions: true

- task: NuGetToolInstaller@1
  inputs:
    versionSpec: 5.1.0

- task: PowerShell@2
  displayName: Update version
  inputs:
    filePath: 'Build\VersionUpdate.ps1'
    arguments: '$(Build.BuildNumber)'
    pwsh: true

- task: NuGetCommand@2
  inputs:
    command: 'restore'

- task: DotNetCoreCLI@2
  displayName:
  inputs:
    command: 'restore'
    projects: 'tests/**/*.csproj'
    vstsFeed: '{My feed ID}'
    includeNuGetOrg: false

- task: VSBuild@1
  inputs:
    solution: '**\*.sln'
    maximumCpuCount: true
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: |
      Build/**
      **/*.vsix
      **/*.nupkg
      README.md
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishPipelineArtifact@0
  inputs:
    artifactName: 'ExtensionDrop'
    targetPath: '$(Build.ArtifactStagingDirectory)'
In the VersionUpdate.ps1 file:
$VerbosePreference="Continue"
$version = $args[0]
if (!$version) {
    $version = "0.0.0"
}
Write-Host "This Version is: $version"
$FullPath = Resolve-Path $PSScriptRoot\..\src\Merlin.Compiler.Vsix\source.vsixmanifest
Write-Host $FullPath
[xml]$content = Get-Content $FullPath
$content.PackageManifest.Metadata.Identity.Version = $version
$content.Save($FullPath)
Release \ CD
After the build succeeds, set up the release pipeline for this repo. In the release, use a PowerShell script and VsixPublisher.exe to publish the VSIX file.
$PAToken = $args[0]
$VsixPath = "$PSScriptRoot\..\src\Merlin.Compiler.Vsix\bin\Release\Merlin.Compiler.Vsix"
$ManifestPath = "$PSScriptRoot\ExtensionManifest.json"
$Installation = & "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe" -latest -prerelease -format json | ConvertFrom-Json
$Path = $Installation.installationPath
$VsixPublisher = Join-Path -Path $Path -ChildPath "VSSDK\VisualStudioIntegration\Tools\Bin\VsixPublisher.exe" -Resolve
& $VsixPublisher publish -payload $VsixPath -publishManifest $ManifestPath -personalAccessToken $PAToken -ignoreWarnings "VSIXValidatorWarning01,VSIXValidatorWarning02,VSIXValidatorWarning08"
In CD, use VsixPublisher.exe, which ships with Visual Studio, to publish the VSIX file.
You can set the PAToken in the Variables tab and mark it as secret, so it is not visible to others. A PAT is required here and cannot be replaced by anything else. Also, when generating the token, you need to choose "All accessible organizations", or it will cause a permission error.
Further to @Shayki's answer, there are some more steps, because you can't just run vsce publish -p $(PAT).
vsce should be installed (it can be in devDependencies).
Add a "deploy" (or name it as you like) script to the package.json scripts:
"deploy": "vsce publish -p"
Add a "publish" step in the azure-pipeline.yml file. The condition is there so that the publish script runs only on master, so Pull Requests will not publish. Also run it only in the Linux build, in case you configured multiple platforms. If you configured only one (for example, Windows), replace Linux with that platform:
- bash: |
    echo ">>> Publish"
    yarn deploy $(token)
  displayName: Publish
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['Agent.OS'], 'Linux'))
Example azure-pipeline.yml