I put a shell script file in a folder in my repo root and tried to run it in my DevOps pipeline, but it fails saying it cannot find the scriptPath:
[error]Not found scriptPath: /home/vsts/work/1/s/pipelines/databricks-cli-config.sh
I am simply creating a task to run the shell script, like this:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
  displayName: "Install and configure the Databricks CLI"
Any idea?
Make sure you check out your code and that you are at the correct level. If you are in a regular job, please add a working directory:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
    cwd: '$(System.DefaultWorkingDirectory)'
  displayName: "Install and configure the Databricks CLI"
If you use it in a deployment job, the code is not checked out there by default. So you need to either publish this script as an artifact and then download it in the deployment job (deployment jobs download artifacts by default), or add a
- checkout: self
step to download the code in the deployment job.
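For illustration, a minimal sketch of a deployment job that checks out the repository before running the script (the environment name and pool here are just examples):
jobs:
- deployment: DeployScripts
  environment: 'my-environment'     # example environment name
  pool:
    vmImage: 'ubuntu-latest'        # example pool
  strategy:
    runOnce:
      deploy:
        steps:
        - checkout: self            # deployment jobs do not check out code by default
        - task: ShellScript@2
          inputs:
            scriptPath: 'pipelines/databricks-cli-config.sh'
            args: '$(databricks_host) $(databricks_token)'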
I assumed that you use YAML.
I am trying to publish a Blazor .NET Core app using Azure Pipelines, but I constantly get a 500 error on the Web Deployment stage.
Once the pipeline runs, I check through the Kudu console and the only two files on the server are an empty web.config and a FAILED TO INITIALIZE RUN FROM PACKAGE.txt containing "Run From Package Initialization failed."
Below is the YAML of the pipeline.
pool:
  name: Azure Pipelines
#Your build pipeline references an undefined variable named 'Parameters.RestoreBuildProjects'. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab. See https://go.microsoft.com/fwlink/?linkid=865972
#Your build pipeline references the 'BuildConfiguration' variable, which you've selected to be settable at queue time. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab, and then select the option to make it settable at queue time. See https://go.microsoft.com/fwlink/?linkid=865971

steps:
- task: DotNetCoreCLI@2
  displayName: Restore
  inputs:
    command: restore
    projects: '$(Parameters.RestoreBuildProjects)'
    feedsToUse: config
    nugetConfigPath: NuGet.Config

- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: false
    projects: '**/TPL/Server/TPL.Server.csproj'
    arguments: '--configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)'
    modifyOutputPath: false

- task: AzureRmWebAppDeployment@4
  displayName: 'Azure App Service Deploy: tpl'
  inputs:
    azureSubscription: '**hidden**'
    WebAppName: tpl
    deployToSlotOrASE: true
    ResourceGroupName: TPL
    SlotName: test
    packageForLinux: '$(build.artifactstagingdirectory)/**/*.zip'
Deleting and recreating the slot fixed this. I previously had the old-style CI set up from the portal's Deployment Center menu, and my hunch is that it didn't get disconnected or cleaned up properly.
I have an Environment called 'Dev' that has a resource, which is a VM. As part of the 'Dev' pipeline I want to copy files from a specific folder on the develop branch of a specific repo to a specific folder on the VM that's on the Environment.
I've not worked with Environments or YAML pipelines much before, but I gather I need to use the CopyFiles@2 task.
So I've got an azure pipeline yaml file something like this:
variables:
  isDev: $[eq(variables['Build.SourceBranch'], 'refs/heads/develop')]

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files'
      inputs:
        contents: 'myFolder\**'
        Overwrite: true
        targetFolder: $(Build.ArtifactStagingDirectory)
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: $(Build.ArtifactStagingDirectory)
        artifactName: myArtifact

- stage: Deployment
  dependsOn: Build
  condition: and(succeeded(), eq(variables.isDev, true))
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    pool:
      vmImage: 'windows-latest'
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Foo Bar
The first question is: how do I get this to copy the files to a specific path on the Dev environment?
Is the PublishBuildArtifacts really needed? The reason I ask is that I want this to copy files every time the pipeline is run and not error if the artifact already exists.
It also feels a bit dirty to have to check the branch is the correct branch this way. Is there a better way to do it?
The deployment strategy you're using relies on specifying an agent pool, which means it doesn't run on the machines in the environment. If you use a strategy such as rolling, it will run the specified steps on those machines automatically, including any download steps to download artifacts.
Ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#deployment-strategies
You need to publish artifacts as part of the pipeline if you want them to be automatically available to down-stream jobs. Each run will get a different set of artifacts, even if the actual artifact contents are the same.
That said, based on the YAML you posted, you probably don't need to. In fact, you don't need the "build" stage at all. You could just add a checkout step during your rolling deployment, and the repo would be cloned on each of the target machines.
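As a rough sketch of that suggestion (assuming the environment is named 'Dev' with VM resources, and using an example destination path on the machines), a rolling deployment that checks out the repo directly on each target VM and copies a folder could look like:
stages:
- stage: Deployment
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/develop'))
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    environment:
      name: Dev                      # assumed environment name
      resourceType: VirtualMachine
    strategy:
      rolling:
        deploy:
          steps:
          - checkout: self           # clones the repo on each target VM
          - task: CopyFiles@2
            inputs:
              sourceFolder: '$(Build.SourcesDirectory)\myFolder'
              contents: '**'
              Overwrite: true
              targetFolder: 'C:\Target\Folder'   # example destination path on the VM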
Ok, worked this out with help from this article: https://dev.to/kenakamu/azure-devops-yaml-release-pipeline-trigger-when-build-pipeline-completed-54d5.
I've taken the advice from Daniel Mann regarding the strategy being 'rolling'. I then split my pipeline into 2 pipelines; 1 for building the artifacts and 1 for releasing (copying them).
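For completeness, the trigger between the two pipelines can be declared as a pipeline resource in the release pipeline; a minimal sketch, where 'MyBuildPipeline' is an assumed name:
resources:
  pipelines:
  - pipeline: buildPipeline       # alias used to reference the resource
    source: 'MyBuildPipeline'     # assumed name of the build pipeline
    trigger: true                 # run this pipeline whenever the build pipeline completes
The published artifact from that run can then be pulled into a job with a download step, e.g. '- download: buildPipeline' plus 'artifact: myArtifact'.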
If you want to download just particular folders instead of all the source files from the repository, you can try using the REST API "Items - Get" to download each particular folder individually.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path={path}&download=true&$format=zip&versionDescriptor.version={versionDescriptor.version}&resolveLfs=true&api-version=6.0
For example:
Suppose the repository has a folder 'res/TestFolder01'.
Now, in the YAML pipeline, I just want to download the 'TestFolder01' folder from the main branch.
jobs:
- job: build
  . . .
  steps:
  - checkout: none  # Do not check out all the source files.
  - task: Bash@3
    displayName: 'Download particular folder'
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
    inputs:
      targetType: inline
      script: |
        curl -X GET \
          -o TestFolder01.zip \
          -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'
This will download the 'TestFolder01' folder as a ZIP file (TestFolder01.zip) into the current working directory. You can use the unzip command to decompress it.
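For example, a follow-up inline step could unzip it in place (the paths here are illustrative):
- task: Bash@3
  displayName: 'Unzip downloaded folder'
  inputs:
    targetType: inline
    script: |
      unzip -o TestFolder01.zip -d TestFolder01   # extract the archive into ./TestFolder01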
[UPDATE]
If you want to download particular folders in a deploy job that targets your VM environment, then yes, the folders will be downloaded into the pipeline working directory on the VM.
You can think of a VM-type environment resource as a self-hosted agent installed on the VM. So, when your deploy job targets the VM environment resource, it runs on that self-hosted agent on the VM.
The pipeline working directory is under the directory where you installed the VM environment resource (the self-hosted agent). Normally, you can use the variable $(Pipeline.Workspace) to get the value of this path (see here).
stages:
- stage: Deployment
  jobs:
  - deployment: Deploy
    displayName: 'Deploy to Dev'
    environment: 'Dev.VM-01'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: Bash@3
            displayName: 'Download particular folder'
            env:
              SYSTEM_ACCESSTOKEN: $(System.AccessToken)
            inputs:
              targetType: inline
              script: |
                echo "Current working directory: $PWD"
                curl -X GET \
                  -o TestFolder01.zip \
                  -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'
I have an Azure DevOps build pipeline that runs a Cypress test. In that Cypress test we have a test user log in with an e-mail and password. On my local system I have the password in a cypress.env.json file.
On the Azure build pipeline I get the message that the password is undefined, which makes sense, since we put the cypress.env.json file in the .gitignore so as not to expose it in the repo.
I've created an Azure variable to represent the password: $(ACCOUNT_PASSWORD)
So I think I need to create the cypress.env.json file in the build pipeline and use Azure variables for it, but I can't figure out how to create a file during the build step.
I have this task:
- task: CmdLine@2
  displayName: 'run Cypress'
  inputs:
    script: |
      npm run ci
So I need to add a task before this that creates the cypress.env.json file with the variable that represents the password:
{
"ACCOUNT_PASSWORD": $(ACCOUNT_PASSWORD)
}
You can add a simple PS script that creates the file:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $json = '{
        "ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)"
      }'
      $json | Out-File cypress.env.json
    workingDirectory: '$(Build.SourcesDirectory)'
    pwsh: true # For Linux
In the workingDirectory set the path to where you want the file to be created.
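Putting it together, the file-creation task just needs to run before the existing Cypress task; a sketch of the ordering (assuming both run from the default working directory):
steps:
- task: PowerShell@2
  displayName: 'Create cypress.env.json'
  inputs:
    targetType: 'inline'
    script: |
      $json = '{ "ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)" }'
      $json | Out-File cypress.env.json
    workingDirectory: '$(Build.SourcesDirectory)'
- task: CmdLine@2
  displayName: 'run Cypress'
  inputs:
    script: |
      npm run ci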
If you want to create a JSON file using Azure Pipelines' Bash@3 task on Linux environments, you can do:
steps:
- task: Bash@3
  inputs:
    targetType: "inline"
    script: |
      echo '{"ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)"}' > server/cypress.env.json
      cd server
      echo $(ls)
      cat cypress.env.json
- task: Docker@2
  inputs:
    command: buildAndPush
    ...
The cd, echo, and cat commands are not necessary; they are only there to log things to the console so you can see where the file is and what it contains. This task may seem trivial to many, but for someone like me with little Bash experience, even something this simple took quite a while to figure out how to debug and get right.
Azure Pipelines runs this Bash script in the Build.SourcesDirectory, which is the root of your project. In the example above I have a /server folder, which is where I want to put the .json file.
Later on, I run the Docker@2 task, which builds my server. The Docker@2 task looks at my Dockerfile, which has the instruction COPY . ./. I could be wrong here, being new to Docker, but my assumption is that the Azure VM running the pipeline executes the Docker COPY command and copies the file from the Build.SourcesDirectory to a destination inside the Docker container.
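For context, the truncated Docker@2 task above typically looks something like this; the repository name, Dockerfile path, and registry service connection below are placeholders, not values from the original post:
- task: Docker@2
  inputs:
    command: buildAndPush
    repository: 'my-org/server'                   # placeholder image repository
    Dockerfile: 'server/Dockerfile'               # placeholder Dockerfile path
    containerRegistry: 'my-registry-connection'   # placeholder service connection
    tags: |
      $(Build.BuildId)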
In "Add a custom pipelines task extension", Microsoft describes how to create a custom Azure DevOps task extension. Under "Step 6: Create a build and release pipeline to publish the extension to Marketplace" they show an example of a YAML pipeline which should automatically build and publish your custom extension to the Marketplace.
The last stage of the YAML pipeline:
- stage: Download_build_artifacts_and_publish_the_extension
  jobs:
  - job:
    steps:
    - task: TfxInstaller@3
      inputs:
        version: "v0.7.x"
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: "current"
        downloadType: "single"
        artifactName: "$(ArtifactName)"
        downloadPath: "$(System.DefaultWorkingDirectory)"
    - task: PublishAzureDevOpsExtension@3
      inputs:
        connectTo: 'VsTeam'
        connectedServiceName: 'ServiceConnection' # Change to whatever you named the service connection
        fileType: 'vsix'
        vsixFile: '/Publisher.*.vsix'
        publisherId: '$(PublisherID)'
        extensionId: '$(ExtensionID)'
        extensionName: '$(ExtensionName)'
        updateTasksVersion: false
        extensionVisibility: 'private' # Change to public if you're publishing to the marketplace
        extensionPricing: 'free'
contains the tasks TfxInstaller and PublishAzureDevOpsExtension.
On our Azure DevOps Server 2019.1 (on-premises) server I get feedback that these tasks are unknown. Also, when I try to find more information about these tasks, I cannot find anything: not in the docs, not on the Marketplace, and not on Google.
Where can I find these tasks Microsoft is using for their tutorial? Is there any more information about them?
You need to install the Azure DevOps Extension Tasks extension (from the Visual Studio Marketplace) in order to use TfxInstaller and PublishAzureDevOpsExtension.
After reading the VSCode Publish Extension docs, I've succeeded in publishing a VSCode extension manually with vsce.
I'm wondering if there is a way to publish extensions automatically via Azure DevOps pipelines (build or release) instead of doing it manually.
I've tried to use vsce there but I'm getting an authentication error
Resource not available for anonymous access. Client authentication required.
Using vsce publish -p <access_token> is not possible because the pipeline is public and everyone can see the access token...
So, is there a way to publish a Visual Studio Code extension automatically via Azure DevOps Pipeline or even Travis CI?
You can add the Personal Access Token as a secret variable; then nobody can see it.
Go to your pipeline in Azure DevOps and click on "Edit", then click on "Variables":
Now click on the + icon and add the variable, mark the checkbox "Keep this value secret":
Now you can use it in this way: $(PAT), for example:
vsce publish -p $(PAT)
The variable value will not appear in the YAML :)
Is there a way to publish a Visual Studio Code extension automatically
via Azure DevOps Pipeline?
Of course yes!
To have a good CI/CD experience in Azure DevOps, I recommend you store the source code in Azure DevOps or GitHub.
Build \ CI
In the build, most of the work is updating the version in the VSIX manifest and building/creating the package. To increase the version, I use the counter expression feature supported in Azure DevOps:
counter('name', seed)
Use this expression in the variable declaration block. For the detailed and complete build process, refer to my sample YAML code:
trigger:
- '*'

pool:
  vmImage: 'windows-2019'

variables:
  VersionPatch: $[counter('versioncount', 24)]
  solution: '**/*.sln'
  BuildPlatform: 'Any CPU'
  BuildConfiguration: 'Release'

name: 2.0.$(VersionPatch)

steps:
- task: UseDotNet@2
  inputs:
    packageType: 'sdk'
    version: '3.0.100'
    includePreviewVersions: true
- task: NuGetToolInstaller@1
  inputs:
    versionSpec: 5.1.0
- task: PowerShell@2
  displayName: Update version
  inputs:
    filePath: 'Build\VersionUpdate.ps1'
    arguments: '$(Build.BuildNumber)'
    pwsh: true
- task: NuGetCommand@2
  inputs:
    command: 'restore'
- task: DotNetCoreCLI@2
  displayName:
  inputs:
    command: 'restore'
    projects: 'tests/**/*.csproj'
    vstsFeed: '{My feed ID}'
    includeNuGetOrg: false
- task: VSBuild@1
  inputs:
    solution: '**\*.sln'
    maximumCpuCount: true
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'
- task: VSTest@2
  inputs:
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: |
      Build/**
      **/*.vsix
      **/*.nupkg
      README.md
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishPipelineArtifact@0
  inputs:
    artifactName: 'ExtensionDrop'
    targetPath: '$(Build.ArtifactStagingDirectory)'
In the VersionUpdate.ps1 file:
$VerbosePreference="Continue"
$version = $args[0]
if (!$version) {
    $version = "0.0.0"
}
Write-Host "This Version is: $version"
$FullPath = Resolve-Path $PSScriptRoot\..\src\Merlin.Compiler.Vsix\source.vsixmanifest
Write-Host $FullPath
[xml]$content = Get-Content $FullPath
$content.PackageManifest.Metadata.Identity.Version = $version
$content.Save($FullPath)
Release\ CD
After the build succeeds, set up a release pipeline for this repo. In the release, use a PowerShell script and VsixPublisher.exe to publish the VSIX file.
$PAToken = $args[0]
$VsixPath = "$PSScriptRoot\..\src\Merlin.Compiler.Vsix\bin\Release\Merlin.Compiler.Vsix"
$ManifestPath = "$PSScriptRoot\ExtensionManifest.json"
$Installation = & "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe" -latest -prerelease -format json | ConvertFrom-Json
$Path = $Installation.installationPath
$VsixPublisher = Join-Path -Path $Path -ChildPath "VSSDK\VisualStudioIntegration\Tools\Bin\VsixPublisher.exe" -Resolve
& $VsixPublisher publish -payload $VsixPath -publishManifest $ManifestPath -personalAccessToken $PAToken -ignoreWarnings "VSIXValidatorWarning01,VSIXValidatorWarning02,VSIXValidatorWarning08"
In CD, use VsixPublisher.exe, which ships with Visual Studio, to publish the VSIX file.
You can set the PAToken in the Variables tab and mark it as secret, so it is not visible to others. A PAT is required here and cannot be replaced by anything else. Also, when generating the token, you need to choose "All accessible organizations", or it will cause a permission error.
Further to @Shayki's answer, there are some more steps, because you can't just run vsce publish -p $(PAT).
The vsce should be installed (can be in devDependencies)
Add a "deploy" (or name it as you like) script to the package.json scripts.
"deploy": "vsce publish -p"
Add a "publish" step in the azure-pipeline.yml file. the condition is for running the publish script only on master so Pull Requests will not publish. Also run it only in Linux's build, in case you configured multiple platforms. If you configured only one (for example, windows) replace Linux with that platform
- bash: |
    echo ">>> Publish"
    yarn deploy $(token)
  displayName: Publish
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['Agent.OS'], 'Linux'))
Example azure-pipeline.yml
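A minimal sketch of what such a file could look like, assuming a yarn-based extension build running on both Windows and Linux (the Node version and build script names are illustrative):
trigger:
- master

strategy:
  matrix:
    linux:
      imageName: 'ubuntu-latest'
    windows:
      imageName: 'windows-latest'

pool:
  vmImage: $(imageName)

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '16.x'
- bash: |
    yarn install
    yarn compile   # assumed build script
  displayName: Build
- bash: |
    echo ">>> Publish"
    yarn deploy $(token)
  displayName: Publish
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['Agent.OS'], 'Linux'))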