Access GitVersion variable in Azure Pipeline for JavaScript build - azure-devops

I've got the following pipeline:
steps:
- task: GitVersion@4
- script: |
    echo '##vso[task.setvariable variable=buildVersion]$(GitVersion.FullSemVer")'
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'
- task: Npm@1
  inputs:
    command: 'install'
    workingDir: '$(Build.SourcesDirectory)'
  displayName: "NPM Install"
- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.SourcesDirectory)'
    customCommand: 'run-script build'
  displayName: "NPM Build"
- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.SourcesDirectory)'
    customCommand: 'npm version $(buildVersion)'
  displayName: "Add version"
But I can't access the GitVersion output. I've also tried referencing $(GitVersion.FullSemVer) directly, but it gives the same result. The output from npm version is:
[command]C:\windows\system32\cmd.exe /D /S /C "C:\hostedtoolcache\windows\node\10.16.0\x64\npm.cmd npm version "$(GitVersion.FullSemVer)'""
Usage: npm <command>
If I echo the variables themselves, they look fine.
Edit: The problem seems to be that the version number ends up quoted, which npm doesn't like. So the question is really how to prevent that.

You have an extra " in $(GitVersion.FullSemVer"); just remove it and it will be fine:
echo '##vso[task.setvariable variable=buildVersion]$(GitVersion.FullSemVer)'
For example:
- task: GitVersion@4
- script: 'echo ##vso[task.setvariable variable=buildVersion]$(GitVersion.FullSemVer)'
- script: 'echo $(buildVersion)'
Results:
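Separately, regarding the quoting issue mentioned in the edit: if the value still arrives quoted when passed through the Npm task's customCommand, one way to sidestep that quoting is to run npm directly from a script step. This is only a sketch, assuming the FullSemVer value is acceptable to npm and that you do not want npm to create a git commit or tag:
- script: npm version $(GitVersion.FullSemVer) --no-git-tag-version --allow-same-version  # plain script step, so the version string is not re-quoted by the Npm task
  workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Set package version from GitVersion'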

Related

Azure Pipeline Copy Secure file into build folder

I've a Vite/Svelte project which uses .env files for environment settings. I also have an Azure Pipeline which contains a secure file .env.staging; this file is on the .gitignore list of the associated repo. I'd like to download this secure file, copy it to my build directory, and then have its contents read when I run vite build --mode staging (well, npm run build:staging, which includes vite build...).
When run locally from my machine, npm run build:staging works as expected and reads the .env.staging file, however it seems to get ignored when used in the pipeline. Am I doing something wrong?
Here's my YAML:
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: DownloadSecureFile@1
  name: "dotenvStaging"
  inputs:
    secureFile: '.env.staging'
  displayName: "Download .env.staging"
- task: NodeTool@0
  inputs:
    versionSpec: 14.15.4
  displayName: "Install Node.JS"
- task: CopyFiles@2
  inputs:
    contents: "$(Agent.TempDirectory)/.env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"
- script: npm install
  displayName: "npm install"
- script: npm run build:staging
  displayName: "npm run build:staging"
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'dist'
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    #replaceExistingArchive: true
    #verbose: # Optional
    #quiet: # Optional
  displayName: "Create archive"
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    ArtifactName: 'drop'
    publishLocation: 'Container'
  displayName: "Publish archive"
I'm not sure if CopyFiles@2 is doing what I expect, as it just copies whatever files match the contents parameter, which could be zero files if I'm writing it wrong...
Another note: I also tried using $(dotenvStaging.secureFilePath) as the contents parameter, but it doesn't seem to do anything either.
Naturally, I figured it out as soon as I posted: I needed to update the CopyFiles task to specify sourceFolder; clearly it didn't like my absolute file path in contents.
- task: CopyFiles@2
  inputs:
    sourceFolder: "$(Agent.TempDirectory)"
    contents: ".env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"

Can't Switch to Windows Latest Agent in Azure Pipeline. Gives error gyp ERR! This is a bug in `node-gyp`

pool:
  name: Azure Pipelines
  demands: npm

#Your build pipeline references an undefined variable named ‘Parameters.solution’. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab. See https://go.microsoft.com/fwlink/?linkid=865972
#Your build pipeline references an undefined variable named ‘Parameters.ArtifactName’. Create or edit the build pipeline for this YAML file, define the variable on the Variables tab. See https://go.microsoft.com/fwlink/?linkid=865972

steps:
- task: NuGetToolInstaller@0
  displayName: 'Use NuGet 4.4.1'
  inputs:
    versionSpec: 4.4.1
- task: NuGetCommand@2
  displayName: 'NuGet restore'
  inputs:
    restoreSolution: '$(Parameters.solution)'
    feedsToUse: config
    nugetConfigPath: Nuget.config
- task: Npm@1
  displayName: 'npm custom'
  inputs:
    command: custom
    verbose: false
    customCommand: 'cache clean --force'
- task: Npm@1
  displayName: 'npm custom'
  inputs:
    command: custom
    verbose: false
    customCommand: 'i --unsafe-perm node-sass'
- task: Npm@1
  displayName: 'npm custom'
  inputs:
    command: custom
    verbose: false
    customCommand: 'install run-sequence'
- task: NodeTool@0
  displayName: 'Use Node 6.x'
- task: Npm@1
  displayName: 'npm install'
  inputs:
    verbose: false
- task: gulp@0
  displayName: 'gulp CI-Build'
  inputs:
    targets: 'CI-Build'
- task: ArchiveFiles@2
  displayName: Output
  inputs:
    rootFolderOrFile: output
    includeRootFolder: false
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    ArtifactName: '$(Parameters.ArtifactName)'
  condition: succeededOrFailed()
It gives the following error:
3675 error gyp ERR! cwd D:\a\1\s\node_modules\gulp-sass\node_modules\node-sass
3675 error gyp ERR! node -v v16.13.2
3675 error gyp ERR! node-gyp -v v3.8.0
3675 error gyp ERR! This is a bug in node-gyp.
3675 error gyp ERR! Try to update node-gyp and file an Issue if it does not help:
3675 error gyp ERR! https://github.com/nodejs/node-gyp/issues
3675 error Build failed with error code: 7
3676 verbose exit 1
I have tried different versions of Node, such as Node 11.x and Node 13.x, but the build still won't compile.
Please help!
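For context, the log shows node-gyp running under Node v16.13.2 even though the pipeline has a 'Use Node 6.x' step; that NodeTool task has no versionSpec and only runs after the npm steps, so those steps use the agent's default Node. Below is a sketch of pinning an older Node before any npm step, assuming the node-sass/gulp-sass versions in this project only build on older Node lines (the exact version to pin is an assumption):
steps:
- task: NodeTool@0
  displayName: 'Use Node 14.x'  # pin Node before npm runs so node-sass builds against a supported ABI
  inputs:
    versionSpec: '14.x'
- task: Npm@1
  displayName: 'npm install'
  inputs:
    verbose: false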

Adding artifacts at one level up in root folder

I have an artifact called onnxruntime.dll which is downloaded from the pipeline, and its folder structure is MyProject_x64_windows/bin/Nodes/onnxruntime.dll. I would like this artifact to be downloaded one level up, i.e. MyProject_x64_windows/bin/onnxruntime.dll.
I am not sure why it gets downloaded at that level or how I can fix this. I can't copy the complete YAML, but here is the part I think is relevant:
variables:
  IppRoot: $(Build.SourcesDirectory)/packages/IPP
  ONNXRoot: $(Build.SourcesDirectory)/packages/ONNXRuntime

- stage: MyProject
  jobs:
  - job: MyProject_Build
    strategy:
      matrix:
        win:
          imageName: 'windows-2019'
          OrzRootSuffix: 'x64-windows-staticlib'
          osSuffix: 'windows'
          LibFT4222Suffix: 'windows'
          matlabVersion: '9.6.0-2'
          extraCmakeOptions: '-D MyProject_ONNX_SUPPORT=On
            -D ONNX_RUNTIME_ROOT:PATH=$(ONNXRoot)'
    pool:
      vmImage: $(imageName)
    steps:
    - checkout: self
      lfs: true
    - task: UniversalPackages@0
      displayName: 'Download pre-built ONNX Runtime headers and libraries'
      inputs:
        command: 'download'
        vstsFeed: 'MyProjectPackages'
        vstsFeedPackage: 'microsoft.ml.onxxruntime'
        vstsPackageVersion: '*' # use the latest
        downloadDirectory: '$(ONNXRoot)'
    - download: SCMockPipeline
      displayName: Download SCMock
      artifact: scmock
      condition: eq(variables['Agent.OS'], 'Windows_NT')
    - script: python -m pip install jinja2
      displayName: Install python jinja2 template engine
    - task: CMake@1
      displayName: CMake configure
      inputs:
        workingDirectory: '$(Build.BinariesDirectory)'
        cmakeArgs: '-G Ninja
          $(extraCmakeOptions)
          -DLibFT4222_ROOT=$(LibFT4222Root)
          -DIPP_ROOT=$(IppRoot)
          -DCMAKE_BUILD_TYPE=Release
          -DCMAKE_INSTALL_PREFIX=$(Build.ArtifactStagingDirectory)
          $(Build.SourcesDirectory)'
    - task: CMake@1
      displayName: CMake build
      inputs:
        workingDirectory: '$(Build.BinariesDirectory)'
        cmakeArgs: '--build . --target install'
    - task: PublishPipelineArtifact@1
      displayName: 'Publish MyProject'
      inputs:
        targetPath: $(Build.ArtifactStagingDirectory)
        artifactName: 'MyProject_x64-$(osSuffix)'
    - task: DownloadPipelineArtifact@2
      displayName: Download MyProject artifact
      inputs:
        artifact: 'MyProject_x64-$(osSuffix)'
        path: '$(Build.SourcesDirectory)/MyProject_x64-$(osSuffix)'
You can try to set the destination directory in the DownloadPipelineArtifact task to download the artifact to the bin folder.
- task: PublishPipelineArtifact@1
  displayName: 'Publish MyProject'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)/bin/Nodes/onnxruntime.dll
    artifactName: 'MyProject_x64-$(osSuffix)'
- task: DownloadPipelineArtifact@2
  displayName: Download MyProject artifact
  inputs:
    artifact: 'MyProject_x64-$(osSuffix)'
    path: '$(Build.SourcesDirectory)/MyProject_x64-$(osSuffix)/bin'
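Another option (a sketch, not from the original answer) is to move the dll up one level in the staging directory before publishing, so the published artifact already has it directly under bin:
# Sketch: copy the dll from bin/Nodes up to bin before PublishPipelineArtifact runs.
- task: CopyFiles@2
  displayName: 'Move onnxruntime.dll one level up'
  inputs:
    sourceFolder: '$(Build.ArtifactStagingDirectory)/bin/Nodes'
    contents: 'onnxruntime.dll'
    targetFolder: '$(Build.ArtifactStagingDirectory)/bin'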

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline environments with Approvals and Checks to mandate this.
The issue is that, when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

trigger:
- master

# Don't run against PRs
pr: none

stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      displayName: Install node modules
    - script: |
        npm run build
      displayName: 'Build it'
    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true
  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'
          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"
          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'
          - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps to print the known system variables and understand what is going on, using a PowerShell script as below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
      Write-Host "$(System.ArtifactsDirectory)"
You should simply publish the generated build artifacts to the drop folder.
Kindly check the official doc -- Artifact selection -- which explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none
- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify "download: none" in your steps.
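Putting those pieces together, a deploy-job sketch might look like the following. This is only an illustration, assuming the published zip contains a dist folder at its root and reusing the subscription and storage account names from the question (authentication is omitted; the question used --connection-string with the account key):
- deployment: ToDev
  environment: development
  dependsOn: install
  strategy:
    runOnce:
      deploy:
        steps:
        - download: none  # skip the automatic download so the files are not fetched twice
        - task: DownloadPipelineArtifact@2
          inputs:
            buildType: 'current'
            patterns: '**/*.zip'
            path: '$(Pipeline.Workspace)'
        - task: ExtractFiles@1
          inputs:
            archiveFilePatterns: '$(Pipeline.Workspace)/**/*.zip'
            destinationFolder: '$(Pipeline.Workspace)/cpDist'
            cleanDestinationFolder: true
        - task: AzureCLI@2
          displayName: 'Upload dist to blob storage'
          inputs:
            azureSubscription: MYTEST-Development
            scriptType: 'bash'
            scriptLocation: 'inlineScript'
            inlineScript: |
              # auth omitted here; the question used --connection-string with the storage account key
              az storage blob upload-batch -d \$web --account-name davey -s '$(Pipeline.Workspace)/cpDist/dist'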

How to consume artifacts in an Azure DevOps build pipeline from one job to another job

I have created a build pipeline where:
in Job1: I create a specific file in a Python container
in Job2: I would like that file to be consumed by my next process in a Docker container
To achieve this, I created an artifact called configCreate in Job1, and in Job2 I tried to download the artifact created above into $(System.DefaultWorkingDirectory). But when I try to access the file, I still see the file from the source, not the file from Job1. How can I access the Job1 artifact from Job2?
So,
stages:
- stage: Build
  displayName: BUILD NON MASTER BRANCH
  condition: and(succeeded(), ne(variables['Build.SourceBranch'], 'refs/heads/master'))
  variables:
  - group: Common_DEV
  jobs:
  - job: buildConfig
    displayName: 'Create Properties file.'
    container: python3
    pool: rhel
    steps:
    - bash: echo "Hello World!!! - $(Build.SourceBranch)"
      displayName: "Started building for $(Build.SourceBranch)"
    - bash: |
        echo PythonV3
        python3 -m venv venv
        source venv/bin/activate
        python --version
        pip3 install -r $(System.DefaultWorkingDirectory)/requirements.txt
        python3 injectConfigProperties.py
        echo "Finish creating the Properties."
        deactivate
        rm -r venv/
        cat $(System.DefaultWorkingDirectory)/properties/config.properties
      displayName: Build Config file
    - task: CopyFiles@2
      inputs:
        contents: |
          $(System.DefaultWorkingDirectory)
        targetFolder: $(Build.ArtifactStagingDirectory)
    - task: PublishBuildArtifacts@1
      inputs:
        pathtoPublish: $(Build.ArtifactStagingDirectory)
        artifactName: configCreate
  - job: Job2
    displayName: 'Job2'
    container: docker
    pool: rhel
    dependsOn:
    - buildConfig
    condition: succeeded()
    steps:
    - task: DownloadBuildArtifacts@0
      displayName: 'Download Build Artifacts from buildConfig.'
      inputs:
        artifactName: configCreate
        downloadPath: $(System.DefaultWorkingDirectory)
    - bash: |
        cat $(System.DefaultWorkingDirectory)/properties/config.properties
        ls -ltr $(System.DefaultWorkingDirectory)/pipelines/
      displayName: 'Build Docker Image'
In Job2, the artifact is accessed with cat $(System.DefaultWorkingDirectory)/<artifactName>/properties/config.properties, where the artifact name here is configCreate.
I had missed the artifact name in the path when accessing the artifact. Now it's fixed.
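For completeness, a sketch of the Job2 steps with the artifact name included in the path (DownloadBuildArtifacts places the artifact in a folder named after the artifact under downloadPath; configCreate is the name published by the first job):
- task: DownloadBuildArtifacts@0
  displayName: 'Download Build Artifacts from buildConfig.'
  inputs:
    artifactName: configCreate
    downloadPath: $(System.DefaultWorkingDirectory)
- bash: |
    # the downloaded artifact lives under a folder named after the artifact
    cat $(System.DefaultWorkingDirectory)/configCreate/properties/config.properties
  displayName: 'Read config.properties from the configCreate artifact'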