missing TfxInstaller task for YAML pipeline - azure-devops

In Add a custom pipelines task extension, Microsoft describes how to create a custom Azure DevOps task extension. Under Step 6: Create a build and release pipeline to publish the extension to Marketplace, they show an example of a YAML pipeline which should automatically build and publish your custom extension to the Marketplace.
The last stage of the YAML pipeline:
- stage: Download_build_artifacts_and_publish_the_extension
  jobs:
  - job:
    steps:
    - task: TfxInstaller@3
      inputs:
        version: "v0.7.x"
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: "current"
        downloadType: "single"
        artifactName: "$(ArtifactName)"
        downloadPath: "$(System.DefaultWorkingDirectory)"
    - task: PublishAzureDevOpsExtension@3
      inputs:
        connectTo: 'VsTeam'
        connectedServiceName: 'ServiceConnection' # Change to whatever you named the service connection
        fileType: 'vsix'
        vsixFile: '/Publisher.*.vsix'
        publisherId: '$(PublisherID)'
        extensionId: '$(ExtensionID)'
        extensionName: '$(ExtensionName)'
        updateTasksVersion: false
        extensionVisibility: 'private' # Change to public if you're publishing to the marketplace
        extensionPricing: 'free'
contains the tasks TfxInstaller and PublishAzureDevOpsExtension.
On our Azure DevOps 2019.1 (on-premise) server I get the feedback that these tasks are unknown. Also, when I try to find more information about these tasks, I cannot find anything: not in the docs, not on the Marketplace, and not on Google.
Where can I find the tasks Microsoft uses in their tutorials? Is there any more information about them?

You need to install Azure DevOps Extension Tasks in order to use TfxInstaller and PublishAzureDevOpsExtension.
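On an on-premise Azure DevOps Server, that typically means downloading the extension's .vsix from the Marketplace and installing it into your collection's local gallery; once it is installed, the task names from the tutorial resolve as written. As a quick sanity check, a minimal sketch (this snippet is only illustrative and not part of Microsoft's example; the tfx version check is an assumption about the tfx CLI being on the agent's PATH):

steps:
- task: TfxInstaller@3
  inputs:
    version: "v0.7.x"
- script: tfx version
  displayName: "Verify the tfx CLI is available"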

Related

Error 500 (Run From Package Initialization failed) while doing web deploy with Azure Pipelines

I am trying to publish a Blazor .NET Core app using Azure Pipelines, but I constantly get a 500 error on the web deployment stage.
Once the pipeline runs, I check through the Kudu console and the only two files on the server are an empty web.config and FAILED TO INITIALIZE RUN FROM PACKAGE.txt with Run From Package Initialization failed. inside.
Below is the YAML of the pipeline.
pool:
  name: Azure Pipelines
#Your build pipeline references an undefined variable named ‘Parameters.RestoreBuildProjects’. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab. See https://go.microsoft.com/fwlink/?linkid=865972
#Your build pipeline references the ‘BuildConfiguration’ variable, which you’ve selected to be settable at queue time. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab, and then select the option to make it settable at queue time. See https://go.microsoft.com/fwlink/?linkid=865971
steps:
- task: DotNetCoreCLI@2
  displayName: Restore
  inputs:
    command: restore
    projects: '$(Parameters.RestoreBuildProjects)'
    feedsToUse: config
    nugetConfigPath: NuGet.Config
- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: false
    projects: '**/TPL/Server/TPL.Server.csproj'
    arguments: '--configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)'
    modifyOutputPath: false
- task: AzureRmWebAppDeployment@4
  displayName: 'Azure App Service Deploy: tpl'
  inputs:
    azureSubscription: '**hidden**'
    WebAppName: tpl
    deployToSlotOrASE: true
    ResourceGroupName: TPL
    SlotName: test
    packageForLinux: '$(build.artifactstagingdirectory)/**/*.zip'
Deleting and recreating the slot fixed this. I previously had the old-style CI (set up from the portal's Deployment Center menu), and my hunch is that it didn't get disconnected properly, or something similar didn't get cleaned up.

Azure DevOPS - run task only if artifact exists from build

I have two pipelines - build and publish. The build pipeline can produce up to two artifacts, depending on the given parameters.
The publish pipeline is automatically triggered when the build pipeline completes. The publish pipeline then takes the published artifacts and deploys them. However, I want to run the publish tasks only if the particular artifacts exist from the build pipeline.
Right now, if an artifact does not exist, the "download" task fails.
Simplified to the important parts, with some secret info redacted:
resources:
  pipelines:
  - pipeline: buildDev # Internal name of the source pipeline, used elsewhere within app-ci YAML, e.g. to reference published artifacts
    source: "Build"
    trigger:
      branches:
      - dev
      - feat/*
stages:
- stage: publish
  displayName: "🚀🔥 Publish to Firebase"
  jobs:
  - job: publish_firebase_android
    displayName: "🔥🤖Publish Android to Firebase"
    steps:
    - script: |
    - download: buildDev
      artifact: android
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload>
      displayName: "Deploy APK to Firebase"
      workingDirectory: "$(Pipeline.Workspace)/buildDev/android/"
  - job: publish_firebase_ios
    displayName: "🔥🍏Publish iOS to Firebase"
    steps:
    - download: buildDev
      artifact: iOS
    - download: buildDev
      artifact: changelog
    - task: DownloadSecureFile@1
      name: firebaseKey
      displayName: "Download Firebase key"
      inputs:
        secureFile: "<secure>.json"
    - script: <upload...>
      workingDirectory: "$(Pipeline.Workspace)/buildDev/iOS/"
      displayName: "Deploy IPA to Firebase"
I've tried to find a solution, but every other solution only solves the problem within the same pipeline. Based on the MS docs, I can't find whether there is a predefined environment variable that could point to the "pipeline resources". With such a variable I could theoretically run a script which checks the presence of the artifact, sets a variable, and uses that variable as a condition for steps.
I think you can use stage filters in the trigger. I don't know what the structure of your build pipeline is, but you can set up a stage that publishes the artifacts: execute that stage if there are artifacts to publish, otherwise skip it. You can do this using conditions. Here is a simple sample:
stages:
- stage: Build
  jobs:
  - job: build
    steps:
    ...
- stage: Artifact
  condition: ... # Set the condition based on your parameter
  jobs:
  - job: artifact
    steps:
    ...
Then use the stage filter in the publishing pipeline. If the stage executes successfully, the publish pipeline will run; otherwise, it will not run.
resources:
  pipelines:
  - pipeline: buildpipeline
    source: buildpipeline
    trigger:
      stages:
      - Artifact
Using variable groups is an option as well. You can use a variable group to pass a variable from one pipeline to another. Here are the detailed steps:
(1). Create a variable group in Pipelines/Library and add a new Variable. I will call this variable "var" later.
(2). In your build pipeline, you can update "var" based on your parameters:
variables:
- group: {group name}

- bash: |
    az pipelines variable-group variable update --group-id {id} --name var --value yes
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
  condition: ...
Tip 1. If you don't know your variable group id, go to Pipelines/Library and select your variable group. You can find it in the URL: https://dev.azure.com/...&variableGroupId={id}&...
Tip 2. If you meet the error "You do not have permissions to perform this operation on the variable group.", go to Pipelines/Library and select your variable group. Click on "Security" and give "{pipeline name} Build Service" user the Administrator role.
Tip 3. Use your parameter in condition to decide whether to update var.
(3). In your publish pipeline, you can use var from variable group in condition:
condition: eq(variables['var'], 'yes')
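Putting the pieces together, the consuming job could look roughly like this sketch (the variable group name "build-flags" is a placeholder for the group created in step (1)):

variables:
- group: build-flags # placeholder name for the group from step (1)

jobs:
- job: publish_firebase_android
  condition: eq(variables['var'], 'yes') # only run when the build pipeline set var to 'yes'
  steps:
  - download: buildDev
    artifact: android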

Not found scriptPath in azure devops

I put a shell script file in a folder in my repo root and tried to run it in my DevOps pipeline, but it says it cannot find the scriptPath:
[error]Not found scriptPath: /home/vsts/work/1/s/pipelines/databricks-cli-config.sh
I am simply creating a task to run the shell script, like this:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
  displayName: "Install and configure the Databricks CLI"
Any idea?
Make sure you check out your code and that you are at the correct level. If you are in a regular job, please add the working directory:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
    cwd: '$(System.DefaultWorkingDirectory)'
  displayName: "Install and configure the Databricks CLI"
And if you use it in a deployment job, code is not checked out there by default. So you need to publish this script as an artifact and then download it in the deployment job (deployment jobs download artifacts by default), or add a
- checkout: self
step to download the code in the deployment job.
I assumed that you use YAML.
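For completeness, a sketch of what the deployment-job variant could look like (the deployment and environment names are placeholders):

jobs:
- deployment: configure_databricks # placeholder name
  environment: 'dev'               # placeholder environment
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self # deployment jobs do not check out the repo by default
        - task: ShellScript@2
          displayName: "Install and configure the Databricks CLI"
          inputs:
            scriptPath: 'pipelines/databricks-cli-config.sh'
            args: '$(databricks_host) $(databricks_token)'
            cwd: '$(System.DefaultWorkingDirectory)'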

Disable AzureFileCopy@2 pre job?

I am using Azure Devops Pipelines (YAML).
I have an AzureFileCopy@2 task which copies files from the source into a Storage Account. The Storage Account is created dynamically by an earlier ARM deploy task (the ARM task outputs the SA name, which is then parsed into a variable for later consumption).
The AzureFileCopy@2 task works perfectly and copies all the files into the Storage Account. But I notice in the run that the AzureFileCopy@2 task actually runs twice - once by me and once as a "pre-job". The pre-job of course fails with a warning that it can't reference the Storage Account (because at that stage I haven't created the variable).
Fortunately, it's only a warning but it is rather annoying to have that warning in every run.
I believe that pre-jobs can't be disabled (though I could raise that as a feature enhancement), so is there a better way of handling this presumably common scenario?
Thanks in advance
EDIT: Obfuscated YAML added:
variables:
  My.Configuration: 'Release'
  My.SQLProject: 'contoso.api'
  My.ARMProject: 'contoso.azure.templates'
  My.IntEnvironment: 'i'
  My.ResourceGroupNumber: 66
  My.ArtifactLocation: 'drop'

# BUILD STAGES ARE HERE

- stage: 'Stage_Deploy'
  displayName: 'Stage Deploy'
  jobs:
  - deployment: 'Job_Deploy'
    pool:
      vmImage: 'windows-2019'
    displayName: 'Job Deploy'
    environment: 'env1'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: none
          - task: DownloadPipelineArtifact@2
            displayName: 'Download Pipeline Artifacts from Drop'
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: AzureResourceManagerTemplateDeployment@3
            displayName: 'ARM Deployment'
            inputs:
              deploymentScope: 'Resource Group'
              azureResourceManagerConnection: 'CONTOSO CONNECTION'
              subscriptionId: 'aaaaaaaa-0000-0000-00000-aaaaaaaaaaaaa'
              action: 'Create Or Update Resource Group'
              resourceGroupName: 'contoso-$(My.IntEnvironment)-eun-core-$(My.ResourceGroupNumber)-rg'
              location: 'North Europe'
              templateLocation: 'Linked artifact'
              csmFile: '$(Pipeline.Workspace)/$(My.ArtifactLocation)/$(My.ARMProject)/azuredeploy.json'
              csmParametersFile: '$(Pipeline.Workspace)/$(My.ArtifactLocation)/$(My.ARMProject)/azuredeploy.parameters.json'
              overrideParameters: '-environment $(My.IntEnvironment)'
              deploymentMode: 'Incremental'
              deploymentOutputs: 'ARMOutput'
          - task: PowerShell@2
            condition: true
            displayName: 'Parse ARM Template Outputs'
            inputs:
              targetType: filePath
              filePath: '$(Pipeline.Workspace)/$(My.ArtifactLocation)/$(My.ARMProject)/Parse-ARMOutput.ps1'
              arguments: '-ARMOutput ''$(ARMOutput)'''
          - task: AzureFileCopy@2
            condition: true
            displayName: 'Copy Static Web Content to SA'
            inputs:
              SourcePath: '$(Pipeline.Workspace)/$(My.ArtifactLocation)'
              azureSubscription: 'CONTOSO CONNECTION'
              Destination: AzureBlob
              storage: '$(ARM.AppDataStorageName)'
              ContainerName: static
Then, when I run it, the following stages happen:
1. Initialize job
2. Pre-job: Copy Static Web Content to SA
It is this pre-job that, in the debug output, shows this:
##[debug]StorageAccountRM=$(ARM.AppDataStorageName)
<other debug lines followed by...>
##[warning]Can't find loc string for key: StorageAccountDoesNotExist
Later on, the task "Copy Static Web Content to SA" runs as a normal task and runs fine.
Fortunately, it's only a warning but it is rather annoying to have that warning in every run. I believe that pre-jobs can't be disabled (though I could raise that as a feature enhancement), so is there a better way of handling this presumably common scenario?
Sorry, but I'm afraid it's not supported to disable the warning. The warning occurs by design.
(The source code of the AzureFileCopyV2 task causes this behavior.)
More details:
We can find the source of that task here. It contains a task.json file which defines content like this:
"instanceNameFormat": "$(Destination) File Copy",
"prejobexecution": {
"Node": {
"target": "PreJobExecutionAzureFileCopy.js"
}
},
The task.json file describes the build or release task and is what the build/release system uses to render configuration options to the user and to know which scripts to execute at build/release time. Since the task.json defines prejobexecution, the predefined pre-job will check the inputs defined in the PreJobExecutionAzureFileCopy.js file, including the Storage Account.
So this is by design of the code of the AzureFileCopyV2 task, and we can't disable the warning in the pre-job. If you do want to resolve that warning, you can consider using AzureFileCopyV1, which doesn't define prejobexecution, but this is not recommended. Compared with version 1, version 2 includes improvements and fixes some old issues.
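If you do go down that route, the step could look roughly like this sketch (assuming the version 1 task accepts the same input names used in the AzureFileCopy@2 step above):

- task: AzureFileCopy@1
  displayName: 'Copy Static Web Content to SA'
  inputs:
    SourcePath: '$(Pipeline.Workspace)/$(My.ArtifactLocation)'
    azureSubscription: 'CONTOSO CONNECTION'
    Destination: AzureBlob
    storage: '$(ARM.AppDataStorageName)'
    ContainerName: static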

I have a build pipeline in Azure DevOps for my github repo - where are the binaries?

Here is the (unedited from template) YAML definition of the pipeline:
# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/apps/windows/dot-net
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1
- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
This pipeline triggers when code is pushed to the master branch of my repo as intended - however I can't find the binaries that it built! How do I access them so I can share them with folks? Are the binaries unavailable because some of my unit tests failed, causing the build to fail, or something?
You need to publish those somewhere. It's up to you to choose what to keep at what stage of the pipeline. You can copy files into a directory, or just grab the whole $(Build.SourcesDirectory). You can also instruct the VSBuild task to redirect its output to a specific directory by passing in the /p:OutputPath=$(Build.ArtifactStagingDirectory) command-line argument.
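For example, either of these sketches stages the build output for later publishing (the CopyFiles pattern and paths are assumptions, not part of the original answer):

# Option A: redirect VSBuild output straight to the artifact staging directory
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
    msbuildArgs: '/p:OutputPath=$(Build.ArtifactStagingDirectory)'

# Option B: copy the compiled binaries after the build
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**/bin/$(buildConfiguration)/**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'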
You then have a few options:
GitHub Release task - Creates a release in GitHub and associates the files you want to it.
# GitHub Release
# Create, edit, or delete a GitHub release
- task: GitHubRelease@0
  inputs:
    gitHubConnection:
    #repositoryName: '$(Build.Repository.Name)'
    #action: 'create' # Options: create, edit, delete
    #target: '$(Build.SourceVersion)' # Required when action == Create || Action == Edit
    #tagSource: 'auto' # Required when action == Create. Options: auto, manual
    #tagPattern: # Optional
    #tag: # Required when action == Edit || Action == Delete || TagSource == Manual
    #title: # Optional
    #releaseNotesSource: 'file' # Optional. Options: file, input
    #releaseNotesFile: # Optional
    #releaseNotes: # Optional
    #assets: '$(Build.ArtifactStagingDirectory)/*' # Optional
    #assetUploadMode: 'delete' # Optional. Options: delete, replace
    #isDraft: false # Optional
    #isPreRelease: false # Optional
    #addChangeLog: true # Optional
    #compareWith: 'lastFullRelease' # Required when addChangeLog == True. Options: lastFullRelease, lastRelease, lastReleaseByTag
    #releaseTag: # Required when compareWith == LastReleaseByTag
Publish Pipeline Artifact (Azure DevOps) - Links the selected files to the build as an artifact. You can download them from the pipeline's summary page in Azure DevOps. Works well in both Build and Release pipelines.
# Publish pipeline artifact
# Publish (upload) a file or directory as a named artifact for the current run
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Pipeline.Workspace)'
    artifact: 'Output'
Publish Build Artifact (Azure DevOps and TFS) - Similar to Publish Pipeline Artifact, but less efficient in its transfers and specific to build pipelines. Can also publish to a file share instead of an attachment to the pipeline summary.
# Publish build artifacts
# Publish build artifacts to Azure Pipelines or a Windows file share
- task: PublishBuildArtifacts@1
  inputs:
    #pathtoPublish: '$(Build.ArtifactStagingDirectory)'
    #artifactName: 'drop'
    #publishLocation: 'Container' # Options: container, filePath
    #targetPath: # Required when publishLocation == FilePath
    #parallel: false # Optional
    #parallelCount: # Optional
You are missing the "Copy and Publish Build Artifacts" task. This task copies the resulting compiled binaries as an artifact to be downloaded later.
For more information about this copy and publish artifact, visit the official documentation: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/copy-and-publish-build-artifacts?view=azure-devops
UPDATE: the Copy and Publish Build Artifacts task is deprecated in Azure DevOps; please use the newer approach: https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/build-artifacts?view=azure-devops&tabs=yaml
Use this in your YAML: PublishBuildArtifacts@1
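For example, a minimal uncommented sketch of that step (values mirror the commented defaults shown earlier and assume the build output was staged into $(Build.ArtifactStagingDirectory)):

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)' # folder that holds the compiled output
    ArtifactName: 'drop'                               # name shown on the run's Artifacts page
    publishLocation: 'Container'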