Azure DevOps YAML Pipeline - PowerShell file move, then commit

I am able to move files (using PowerShell) in my Azure DevOps YAML pipeline.
I am able to commit to my git repo via an Azure DevOps YAML pipeline (as found here).
The problem is that I cannot do both in the same pipeline. Moving files worked fine until I added the following commands in a nested template (gitCalled.yml):
steps:
- checkout: self
  persistCredentials: true
- script: |
    git config --global user.email Continuous.Integrator@yourcompany.com
    git config --global user.name "Continuous.Integrator"
  workingDirectory: $(System.DefaultWorkingDirectory)
- script: |
    git checkout -b packaging-test
    echo 'This is a test' > data.txt
    git add -A
    git commit -m "deployment $(Build.BuildNumber)"
    git push --set-upstream origin packaging-test
  displayName: Add data.txt file
  workingDirectory: $(System.DefaultWorkingDirectory)
Specifically, the checkout step with persistCredentials appears to be the issue. When I use it, my PowerShell file can no longer be found. If I comment that step out, my git commands fail, but the PowerShell file is found and run.
I'm in an either-or situation, but I want to do both ;)
Here is the calling template:
parameters:
- name: aliasPackageName
  type: string

stages:
- stage: PipelineMoveFiles
  displayName: Pipeline
  jobs:
  - job: MoveFiles
    displayName: MoveFiles
    pool:
      vmImage: ubuntu-latest
    steps:
    - pwsh: |
        $numOfForceAppFiles = $(Build.SourcesDirectory)/PackageCreation/MoveFiles.ps1
        Write-Host "##vso[task.setvariable variable=numOfForceAppFiles;isOutput=true]$numOfForceAppFiles"
      name: movesFilesStep
      displayName: Move files to their package folders
    - pwsh: |
        Write-Host "Number of files still to move: "$(movesFilesStep.numOfForceAppFiles)
        if ( 0 -eq $numOfForceAppFiles )
        {
          Write-Host "All Files were moved"
        }
        else
        {
          Write-Host "All Files were NOT moved"
          # exit 1;
        }
      displayName: Validate that all files were moved
    - template: gitCalled.yml
ERROR:
Starting: Move files to their package folders
==============================================================================
Task : PowerShell
Description : Run a PowerShell script on Linux, macOS, or Windows
Version : 2.200.0
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/utility/powershell
==============================================================================
Generating script.
========================== Starting Command Output ===========================
/usr/bin/pwsh -NoLogo -NoProfile -NonInteractive -Command . '/home/vsts/work/_temp/33de9584-b260-4422-b6db-b8cef3ea129d.ps1'
/home/vsts/work/1/s/PackageCreation/MoveFiles.ps1: /home/vsts/work/_temp/33de9584-b260-4422-b6db-b8cef3ea129d.ps1:2
Line |
2 | … umOfForceAppFiles = /home/vsts/work/1/s/PackageCreation/MoveFiles.ps1
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| The term '/home/vsts/work/1/s/PackageCreation/MoveFiles.ps1'
| is not recognized as a name of a cmdlet, function, script
| file, or executable program. Check the spelling of the name,
| or if a path was included, verify that the path is correct and
| try again.
##[error]PowerShell exited with code '1'.
Finishing: Move files to their package folders
Any help would be much appreciated!!

Based on your YAML sample, you need to invoke the PowerShell file from within the inline script using the call operator and capture its output.
You can use the following line to execute the .ps1 file:
$numOfForceAppFiles = & "$(Build.SourcesDirectory)/PackageCreation/MoveFiles.ps1"
Refer to the following sample:
parameters:
- name: aliasPackageName
  type: string

stages:
- stage: PipelineMoveFiles
  displayName: Pipeline
  jobs:
  - job: MoveFiles
    displayName: MoveFiles
    pool:
      vmImage: ubuntu-latest
    steps:
    - pwsh: |
        $numOfForceAppFiles = & "$(Build.SourcesDirectory)/PackageCreation/MoveFiles.ps1"
        Write-Host "##vso[task.setvariable variable=numOfForceAppFiles;isOutput=true]$numOfForceAppFiles"
      displayName: Move files to their package folders

Instead of running the PowerShell and git commands in different steps (pwsh and script), I decided to roll them together. The following is partial, but working, code.
steps:
- checkout: self
  persistCredentials: true
  clean: true
- powershell: |
    git --version
    git config user.email Continuous.Integrator@yourcompany.com
    git config user.name "Continuous.Integrator"
    git checkout -b packaging-test
    $numOfForceAppFiles = $(Build.SourcesDirectory)/PackageCreation/MoveFiles.ps1
    git add -A
    git commit -m "deployment $(Build.BuildNumber)"
    git push --set-upstream origin packaging-test

Related

Permission denied while executing files between stages in Azure

I have a sample project with a simple C file (a hello world program). I am trying to get familiar with artifacts, so I have started with pipeline artifacts. The pipeline has two stages, Build and Test.
In the Build stage, I compile the C file and then publish the artifact. In the Test stage, I run the object file.
trigger:
  branches:
    include:
    - '*'

pool:
  vmImage: ubuntu-latest

stages:
- stage: build
  jobs:
  - job: buildjob
    steps:
    - script: |
        echo "building the test.c file"
        gcc test.c -o test
        echo "build completed"
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: $(System.DefaultWorkingDirectory)
        artifactName: Test

- stage: test
  jobs:
  - job: testJob
    steps:
    - download: current
      artifact: Test
    - script: |
        cd Test
        echo "Running the object file"
        ./test
        echo "job finished"
Error:
I can see that the artifacts have been published:
Concerns: What do I have to do to get the object file running? Also, can I just pass the object file alone to the artifact? How?
UPDATE
I have managed to find the correct path to the artifact folder, but I cannot seem to execute the file. It shows permission denied.
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
  branches:
    include:
    - '*'

pool:
  vmImage: ubuntu-latest

stages:
- stage: build
  jobs:
  - job: buildjob
    steps:
    - script: |
        echo "building the test.c file"
        gcc test.c -o test
        echo "build completed"
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: $(System.DefaultWorkingDirectory)
        artifactName: Test

- stage: test
  jobs:
  - job: testJob
    steps:
    - download: current
      artifact: Test
    - script: |
        echo "Running the object file"
        cd $(Pipeline.Workspace)/Test
        ./test
        echo "job finished"
New Error:
The execute permission bit on your compiled executable was lost between stages. When the file is downloaded in the test stage, it no longer has the execute bit set, hence the "Permission denied" error when you try to run it.
Setting the execute bit again via chmod solves this:
- stage: test
  jobs:
  - job: testJob
    steps:
    - download: current
      artifact: Test
    - script: |
        echo "Running the object file"
        cd $(Pipeline.Workspace)/Test
        chmod +x ./test
        ./test
        echo "job finished"
The issue is that under Linux the file permissions get lost when the artifact is compressed as a zip.
A working solution is to use tar, as it preserves file permissions.
PublishBuildArtifacts
StoreAsTar - Tar the artifact before uploading
boolean. Default value: false.
Adds all files from the publish path to a tar archive before uploading. This allows you to preserve the UNIX file permissions. Use the extractTars option of the DownloadBuildArtifacts task to extract the downloaded items automatically. This setting is ignored on Windows agents.
DownloadBuildArtifacts
extractTars - Extract all files that are stored inside tar archives
boolean.
Set to true to extract all downloaded files that have the .tar extension. This is helpful because you need to pack your artifact files into tar if you want to preserve Unix file permissions. Enabling the StoreAsTar option in the Publish build artifacts task will store artifacts as .tar files automatically.
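As a sketch of how those two options fit together (task versions, input casing and paths are assumptions to check against your environment, and this swaps the PublishPipelineArtifact/download steps for the Build Artifacts tasks the quoted docs refer to):
# Build stage: publish as a tar so the execute bit survives
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: $(System.DefaultWorkingDirectory)
    ArtifactName: Test
    StoreAsTar: true

# Test stage: download and automatically extract the tar
- task: DownloadBuildArtifacts@0
  inputs:
    buildType: current
    artifactName: Test
    downloadPath: $(System.ArtifactsDirectory)
    extractTars: true
# The extracted files keep their permissions, so ./test should run without a chmod
# (check the download task's log for the exact extraction path).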

Azure DevOps Pipeline - Checkout only folder [duplicate]

My repository in my organisation's DevOps project contains a lot of .NET solutions and some Unity projects as well. When I run my build pipeline, it fails due to several of these:
Error MSB3491: Could not write lines to file "obj\Release\path\to\file". There is not enough space on the disk.
I would like the pipeline to only checkout and fetch parts of the repository that are required for a successful build. This might also help with execution time of the pipeline since it currently also fetches the whole of my unity projects with gigabytes of resources which takes forever.
I would like to spread my projects across multiple repositories but the admin won't give me more than the one I already have. It got a lot better when I configured git fetch as shallow (--depth=1) but I still get the error every now and then.
This is how I configured the checkout:
steps:
- checkout: self
  clean: true
  # shallow fetch
  fetchDepth: 1
  lfs: false
  submodules: false
The build is done using the VSBuild@1 task.
I can't find a valid solution to my problem except for using multiple repositories, which is not an option right now.
Edit: Shayki Abramczyk's solution #1 works perfectly. Here is my full implementation.
GitSparseCheckout.yml:
parameters:
  access: ''
  repository: ''
  sourcePath: ''

steps:
- checkout: none
- task: CmdLine@2
  inputs:
    script: |
      ECHO ##[command] git init
      git init
      ECHO ##[command] git sparse-checkout: ${{ parameters.sourcePath }}
      git config core.sparsecheckout true
      echo ${{ parameters.sourcePath }} >> .git/info/sparse-checkout
      ECHO ##[command] git remote add origin https://${{ parameters.repository }}
      git remote add origin https://${{ parameters.access }}@${{ parameters.repository }}
      ECHO ##[command] git fetch --progress --verbose --depth=1 origin master
      git fetch --progress --verbose --depth=1 origin master
      ECHO ##[command] git pull --progress --verbose origin master
      git pull --progress --verbose origin master
Checkout is called like this (where template path has to be adjusted):
- template: ../steps/GitSparseCheckout.yml
  parameters:
    access: anything:<YOUR_PERSONAL_ACCESS_TOKEN>
    repository: dev.azure.com/organisation/project/_git/repository
    sourcePath: path/to/files/
In Azure DevOps you don't have an option to get only part of the repository, but there is a workaround:
Disable the "Get sources" step and fetch only the sources you want by manually executing the corresponding git commands in a script.
To disable the default "Get sources" step, just specify none in the checkout statement:
- checkout: none
In the pipeline, add a CMD/PowerShell task to get the sources manually, using one of the following 2 options:
1. Get only part of the repo with git sparse-checkout.
For example, get only the directories src_1 and src_2 within the test folder (lines starting with REM ### are just the usual batch comments):
- script: |
    REM ### this will create a 'root' directory for your repo and cd into it
    mkdir myRepo
    cd myRepo
    REM ### initialize Git in the current directory
    git init
    REM ### set Git sparsecheckout to TRUE
    git config core.sparsecheckout true
    REM ### write the directories that you want to pull to the .git/info/sparse-checkout file (without the root directory)
    REM ### you can add multiple directories with multiple lines
    echo test/src_1/ >> .git/info/sparse-checkout
    echo test/src_2/ >> .git/info/sparse-checkout
    REM ### fetch the remote repo using your access token
    git remote add -f origin https://your.access.token@path.to.your/repo
    REM ### pull the files from the source branch of this build, using the built-in Azure DevOps variable for the branch name
    git pull origin $(Build.SourceBranch)
  displayName: 'Get only test/src_1 & test/src_2 directories instead of entire repository'
Now, in the build tasks, make myRepo the working directory.
Fetching the remote repo using an access token is necessary, since using checkout: none prevents your login credentials from being used.
At the end of the pipeline you may want to add a step to clean up the myRepo directory.
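If you would rather not bake a personal access token into the pipeline, the same option can authenticate with the job's own $(System.AccessToken) through an HTTP header. A minimal sketch (the repository URL and folder are placeholders, and the project's build service identity needs read access to the repo):
- script: |
    mkdir myRepo
    cd myRepo
    git init
    git config core.sparsecheckout true
    echo test/src_1/ >> .git/info/sparse-checkout
    REM ### authenticate with the job access token instead of a personal access token
    git config http.extraheader "AUTHORIZATION: bearer $(System.AccessToken)"
    git remote add -f origin https://dev.azure.com/organisation/project/_git/repository
    git pull origin $(Build.SourceBranch)
  displayName: 'Sparse checkout using System.AccessToken'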
2. Get parts of the repo with Azure DevOps Rest API (Git - Items - Get Items Batch).
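Option 2 has no example above. As a rough, unverified sketch (assuming the Items endpoint accepts the $format=zip and download=true query parameters, with organisation/project/repository and the folder path as placeholders), downloading a single folder as a zip from a PowerShell step could look like this:
- powershell: |
    # Download one folder of the repo as a zip via the REST API, then unpack it.
    $headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
    $url = "https://dev.azure.com/organisation/project/_apis/git/repositories/repository/items" +
           "?path=/test/src_1&`$format=zip&download=true&api-version=6.0"
    Invoke-WebRequest -Uri $url -Headers $headers -OutFile "$(Pipeline.Workspace)/src_1.zip"
    Expand-Archive "$(Pipeline.Workspace)/src_1.zip" -DestinationPath "$(Pipeline.Workspace)/src_1"
  displayName: 'Download test/src_1 via REST API'
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)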
The other answers work well but I found a different way using potentially newer features of git.
This will fetch with a depth of 1 and give you all the files in the root folder plus folder1, folder2 and folder3:
- task: CmdLine@2
  inputs:
    script: |
      git init
      git sparse-checkout init --cone
      git sparse-checkout set folder1 folder2 folder3
      git remote add origin https://<github-username>:%GITHUB_TOKEN%@<your-git-repo>
      git fetch --progress --verbose --depth=1 origin
      git switch develop
  env:
    GITHUB_TOKEN: $(GITHUB_TOKEN)
Maybe it is helpful for you to check out only a specific branch. That works like this:
resources:
  repositories:
  - repository: MyGitHubRepo
    type: github
    endpoint: MyGitHubServiceConnection
    name: MyGitHubOrgOrUser/MyGitHubRepo
    ref: features/tools

steps:
- checkout: MyGitHubRepo
Or by using the inline syntax like so
- checkout: git://MyProject/MyRepo@features/tools # checks out the features/tools branch
- checkout: git://MyProject/MyRepo@refs/heads/features/tools # also checks out the features/tools branch
- checkout: git://MyProject/MyRepo@refs/tags/MyTag # checks out the commit referenced by MyTag
More information can be found here
A Solution For Pull Request and Master Support
I realized after posting this solution that it is similar to the updated one earlier in the post. However, this solution is a bit richer and more optimized. Most importantly, it uses the pull request merge branch in DevOps for deployments, like the native checkout does, and it fetches only the needed commits.
Supports multiple folder/path patterns as parameters
Minimal checkout of only what is needed, via sparse checkout
Shallow depth, multithreaded fetch, with a sparse index
Takes into account using the PR merge branch against main rather than the raw PR branch itself, if needed
Uses the native System.AccessToken already available in the pipeline
Handles detection and alternative ref flows for master where a merge branch does not exist
Example Use in your Script:
- job: JobNameHere
  displayName: JobDisplayName Here
  steps:
  - template: templates/sparse-checkout.yml
    parameters:
      checkoutFolders:
      - /Scripts
      - /example-file.ps1
  # other steps
templates/sparse-checkout.yml
parameters:
- name: checkoutFolders
  default: '*'
  type: object

steps:
- checkout: none
- task: PowerShell@2
  inputs:
    targetType: inline
    script: |
      $useMasterMergeIfAvailable = $true
      $checkoutFolders = ($env:CheckoutFolders | ConvertFrom-Json)
      Write-Host $checkoutFolders
      $sw = [Diagnostics.Stopwatch]::StartNew() # For timing the run.
      $checkoutLocation = $env:Repository_Path

      ################ Setup Variables ###############
      $accessToken = "$env:System_AccessToken";
      $repoUriSegments = $env:Build_Repository_Uri.Split("@");
      $repository = "$($repoUriSegments[0]):$accessToken@$($repoUriSegments[1])"
      $checkoutBranchName = $env:Build_SourceBranch;
      $prId = $env:System_PullRequest_PullRequestId;
      $repositoryPathForDisplay = $repository.Replace("$accessToken", "****");
      $isPullRequest = $env:Build_Reason -eq "PullRequest";

      ################ Configure Refs ##############
      if ($isPullRequest)
      {
        Write-Host "Detected Pull Request"
        $pullRequestRefMap = "refs/heads/$($checkoutBranchName):refs/remotes/origin/pull/$prId"
        $mergeRefMap = "refs/pull/$prId/merge:refs/remotes/origin/pull/$prId";
        $mergeRefRemote = $mergeRefMap.Split(":")[0];
        $remoteMergeBranch = git ls-remote $repository "$mergeRefRemote" # See if a remote merge ref exists for the PR.
        if ($useMasterMergeIfAvailable -and $remoteMergeBranch)
        {
          Write-Host "Remote Merge Branch Found: $remoteMergeBranch" -ForegroundColor Green
          $refMapForCheckout = $mergeRefMap
          $remoteRefForCheckout = "pull/$prId/merge"
        } else {
          Write-Host "No merge from master found (or merge flag is off in script), using pull request branch." -ForegroundColor Yellow
          $refMapForCheckout = $pullRequestRefMap
          $remoteRefForCheckout = "heads/$checkoutBranchName"
        }
        $localRef = "origin/pull/$prId"
      } else {
        Write-Host "This is not a pull request. Assuming master branch as source."
        $localRef = "origin/master"
        $remoteRefForCheckout = "master"
      }

      ######## Sparse Checkout ###########
      Write-Host "Beginning Sparse Checkout..." -ForegroundColor Green;
      Write-Host " | Repository: $repositoryPathForDisplay" -ForegroundColor Cyan
      if (-not (Test-Path $checkoutLocation)) {
        $out = mkdir -Force $checkoutLocation
      }
      $out = Set-Location $checkoutLocation
      git init -q
      git config core.sparsecheckout true
      git config advice.detachedHead false
      git config index.sparse true
      git remote add origin $repository
      git config remote.origin.fetch $refMapForCheckout
      git sparse-checkout set --sparse-index $checkoutFolders
      Write-Host " | Remote origin configured. Fetching..."
      git fetch -j 4 --depth 1 --no-tags -q origin $remoteRefForCheckout
      Write-Host " | Checking out..."
      git checkout $localRef -q
      Get-ChildItem -Name
      # tree . # Shows a graphical structure - can be large with lots of files.

      ############ Clean up ##################
      if (Test-Path -Path ..\$checkoutLocation)
      {
        Write-Host "`nChecked Out`n#############"
        Set-Location ../
      }
      $sw.Stop()
      Write-Host "`nCheckout Complete in $($sw.Elapsed.TotalSeconds) seconds." -ForegroundColor Green
  displayName: 'Sparse Checkout'
  env:
    Build_Repository_Uri: $(Build.Repository.Uri)
    Build_Reason: $(Build.Reason)
    System_PullRequest_SourceBranch: $(System.PullRequest.SourceBranch)
    System_PullRequest_PullRequestId: $(System.PullRequest.PullRequestId)
    System_PullRequest_SourceRepositoryURI: $(System.PullRequest.SourceRepositoryURI)
    Build_BuildId: $(Build.BuildId)
    Build_SourceBranch: $(Build.SourceBranch)
    CheckoutFolders: ${{ convertToJson(parameters.checkoutFolders) }}
    System_AccessToken: $(System.AccessToken)
    Repository_Path: $(Build.Repository.LocalPath)
With LFS support on Ubuntu and Windows agents
parameters:
  folders: '*'

steps:
- bash: |
    set -ex
    export ORIGIN=$(Build.Repository.Uri)
    export REF=$(Build.SourceVersion)
    export FOLDERS='${{ parameters.folders }}'
    git version
    git lfs version
    git init
    git sparse-checkout init --cone
    git sparse-checkout add $FOLDERS
    git remote add origin $ORIGIN
    git config core.sparsecheckout true
    git config gc.auto 0
    git config advice.detachedHead false
    git config http.version HTTP/1.1
    git lfs install --local
    git config uploadpack.allowReachableSHA1InWant true
    git config http.extraheader "AUTHORIZATION: bearer $(System.AccessToken)"
    git fetch --force --no-tags --progress --depth 1 origin develop $REF
    git checkout $REF --progress --force
  displayName: Fast sparse Checkout
Then use as a step
steps:
- checkout: none
- template: fastCheckout.yaml
  parameters:
    folders: 'Folder1 src/Folder2'
You can pass the folders as parameters.
The exports are there to make it easier to test the script locally.
This improved checkouts from 10 minutes to 2 minutes.

Avoid git clean with Azure DevOps self-hosted build agent

I have a YAML build script in an Azure-hosted git repository which gets triggered across 7 build agents running on a local VM. Every time this runs, the build performs a git clean, which takes a significant amount of time due to a large node_modules folder that is slow to clean up.
The MSDN page here seems to suggest this is configurable but gives no detail on how to configure it. I can't tell whether this is a setting that should be specified on the agent, in the YAML script, within DevOps on the pipeline, or somewhere else.
Is there any other documentation I'm missing or is this not possible?
Update:
The start of the YAML file is here:
variables:
  BUILD_VERSION: 1.0.0.$(Build.BuildId)
  buildConfiguration: 'Release'
  process.clean: false

jobs:
#############################################################
###### 1 - Build and publish .NET
#############################################################
- job: net_build_publish
  displayName: .NET build and publish
  pool:
    name: default
  steps:
  - script: echo $(BUILD_VERSION)
  - task: DotNetCoreCLI@2
    displayName: dotnet build $(buildConfiguration)
    inputs:
      command: 'build'
      projects: |
        myrepo/**/API/*.csproj
      arguments: '-c $(buildConfiguration) /p:Version=$(BUILD_VERSION)'
The complete YAML is a lot longer, but the first job includes this output in its Checkout task:
Checkout myrepo@master to s
Starting: Checkout myrepo@master to s
==============================================================================
Task : Get sources
Description : Get sources from a repository. Supports Git, TfsVC, and SVN repositories.
Version : 1.0.0
Author : Microsoft
Help : [More Information](https://go.microsoft.com/fwlink/?LinkId=798199)
==============================================================================
Syncing repository: myrepo (Git)
Prepending Path environment variable with directory containing 'git.exe'.
git version
git version 2.26.2.windows.1
git lfs version
git-lfs/2.11.0 (GitHub; windows amd64; go 1.14.2; git 48b28d97)
git config --get remote.origin.url
git clean -ffdx
Removing myrepo/Data/Core/API/bin/
Removing myrepo/Data/Core/API/customersettings.json
Removing myrepo/Data/Core/API/obj/
Removing myrepo/Data/Core/Shared/bin/
Removing myrepo/Data/Core/Shared/obj/
....
We have another job further down which runs npm install and npm build for an Angular project, and every build in the pipeline is taking 5 minutes to perform the npm install step, possibly because of this git clean when retrieving the repository?
1. Click on your pipeline to show the run history
2. Click Edit
3. Click the 3-dot kebab menu
4. Click Triggers
5. Click YAML
6. Click Get Sources
7. Set Clean to False and Save
To say this is obfuscated is an understatement!
I can't say what effect this will have, though. I think the agent reuses the same folder each time a pipeline runs, and I'm not a Node.js developer, so I don't know what leaving old node_modules hanging around will do!
P.S. What people were saying about pipeline caching is not, I think, what you were asking. Also, pipeline caching zips up the cached folder, uploads it to your artifacts storage, and then downloads it each time, so if you only have one build agent, not doing a git clean might actually be more efficient; I'm not 100% sure.
As I mentioned, you need to calculate a hash before you run npm install. If the hash is the same as the one kept next to node_modules, you can skip installing dependencies. This may help you achieve that:
steps:
- task: PowerShell@2
  displayName: 'Calculate and save packages.config hash'
  inputs:
    targetType: 'inline'
    pwsh: true
    script: |
      # generates a hash of package-lock.json
      $newHash = (Get-FileHash -Algorithm MD5 -Path (Get-ChildItem package-lock.json)).Hash
      $hashPath = "$(System.DefaultWorkingDirectory)/cache-npm/hash.txt"
      if (Test-Path -Path $hashPath) {
        if (Compare-Object -ReferenceObject $(Get-Content $hashPath) -DifferenceObject $newHash) {
          # hashes differ - dependencies changed, store the new hash and let npm install run
          $newHash > $hashPath
          Write-Host ("Hash File saved to " + $hashPath)
        } else {
          # hashes are the same - mark node_modules as up to date so npm install is skipped
          Write-Host "##vso[task.setvariable variable=NodeModulesAreUpToDate;]true"
          Write-Host "no need to install node_modules"
        }
      } else {
        # no stored hash yet - save it and let npm install run
        $newHash > $hashPath
        Write-Host ("Hash File saved to " + $hashPath)
      }
      $storedHash = Get-Content $hashPath
      Write-Host $storedHash
    workingDirectory: '$(System.DefaultWorkingDirectory)/cache-npm'
- script: npm install
  workingDirectory: '$(Build.SourcesDirectory)/cache-npm'
  condition: ne(variables['NodeModulesAreUpToDate'], true)
git clean -ffdx will remove any changes untracked by source control from the sources directory. You may try pipeline caching, which can help reduce build time by allowing the outputs or downloaded dependencies from one run to be reused in later runs, thereby reducing or avoiding the cost to recreate or redownload the same files again. Check the following link:
https://learn.microsoft.com/en-us/azure/devops/pipelines/release/caching?view=azure-devops#nodejsnpm
variables:
  npm_config_cache: $(Pipeline.Workspace)/.npm

steps:
- task: Cache@2
  inputs:
    key: 'npm | "$(Agent.OS)" | package-lock.json'
    restoreKeys: |
      npm | "$(Agent.OS)"
    path: $(npm_config_cache)
  displayName: Cache npm
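Note that the Cache@2 step only restores the npm download cache directory; you still run the install afterwards, which is then served mostly from that cache. A minimal follow-up step along the lines of the linked docs (assuming npm ci suits your project):
# Installs dependencies, pulling packages from the restored npm cache instead of the network where possible.
- script: npm ci
  displayName: npm ci (uses the restored npm cache)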
The checkout step allows us to set the boolean option clean to true or false. The default is true, so it runs git clean by default.
Below is a minimal example with clean set to false.
jobs:
- job: Build_Job
  timeoutInMinutes: 0
  pool: 'PoolOne'
  steps:
  - checkout: self
    clean: false
    submodules: recursive
  - task: PowerShell@2
    displayName: Make build
    inputs:
      targetType: 'inline'
      script: |
        bash -c 'make'
More documentation and related options can be found here
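One related knob, to the best of my knowledge, is the job-level workspace setting, which controls what the agent cleans before the job runs; a sketch on the same job layout as above:
jobs:
- job: Build_Job
  pool: 'PoolOne'
  workspace:
    clean: outputs   # accepts outputs, resources, or all; omit it to skip the extra cleaning
  steps:
  - checkout: self
    clean: false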

Using `git commit --no-verify` for a pre-commit Azure pipeline

I see that I can use pre-commit with pipelines. Is there a way to set up the YAML file for the Azure pipeline to use git commit --no-verify when it fails for specific cases? Or is there a way to troubleshoot the pipeline when the issue occurs?
This is what I have in the YAML file:
pool:
  vmImage: ubuntu-18.04

variables:
  PRE_COMMIT_HOME: $(Pipeline.Workspace)/pre-commit-cache

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: ${{ parameters.python }}
- script: |
    echo "##vso[task.setvariable variable=PY]$(python -VV)"
  displayName: set version variables
- task: CacheBeta@0
  inputs:
    key: pre-commit | .pre-commit-config.yaml | "$(PY)"
    path: $(PRE_COMMIT_HOME)
- script: python -m pip install --upgrade pre-commit
  displayName: install pre-commit
- script: pre-commit run --all-files --show-diff-on-failure
  displayName: run pre-commit
Check the documentation here:
Not all hooks are perfect so sometimes you may need to skip execution
of one or more hooks. pre-commit solves this by querying a SKIP
environment variable. The SKIP environment variable is a comma
separated list of hook ids. This allows you to skip a single hook
instead of --no-verifying the entire commit.
$ SKIP=flake8 git commit -m "foo"
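In the Azure pipeline YAML, that translates into setting SKIP on the step that runs pre-commit; for example, to skip a hypothetical flake8 hook id from your .pre-commit-config.yaml:
- script: pre-commit run --all-files --show-diff-on-failure
  displayName: run pre-commit (skipping flake8)
  env:
    SKIP: flake8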

Setting environment variable value from .ps1 script not working in Github Actions

I have two ps1 scripts in Github Actions.
My scenario:
The first script executes before build
Project builds
The second script executes after build.
I need to set the value inside the first script and use it inside the second script.
So I decided to use BUILD_NUMBER environment variable and set it to 10 as a default value.
jobs:
  Droid:
    runs-on: windows-latest
    env:
      BUILD_NUMBER: "10"
Inside the first script I tried to set this variable in several ways but in the second script the value of BUILD_NUMBER was 10.
My attempts to set it:
[Environment]::SetEnvironmentVariable($env:BUILD_NUMBER, $buildNumber, 'Machine')
$env:BUILD_NUMBER: '123'
But inside the second script I was still getting the value 10 from $newName = "${env:BUILD_NUMBER}".
The whole code of Github Actions side:
name: CI

# Controls when the action will run. Triggers the workflow on push or pull request
# events but only for the master branch
on:
  push:
    branches:
      - 'master'
      - 'develop'
      - 'feature/*'
      - 'rc/*'
  pull_request:
    branches:
      - 'master'
      - 'develop'
      - 'feature/*'
      - 'rc/*'

jobs:
  Droid:
    runs-on: windows-latest
    env:
      DOTNET_CLI_TELEMETRY_OPTOUT: 'true'
      BUILD_NUMBER: "10"
    steps:
      - uses: actions/checkout@v1
      - name: Run a calculate version and set sign in password script
        run: .\Scripts\CalculateVersionAndSetSignPassword.ps1
        shell: powershell
      # Build goes here. It is skipped by me for testing purposes
      - uses: actions/checkout@v1
      - name: Run a change apk name script
        run: |
          .\Scripts\ChangeApkName.ps1
        shell: powershell
set-env was deprecated - please check GitHub Actions: Deprecating set-env and add-path commands
As a replacement you may use
echo "BUILD_NUMBER=yellow" >> $GITHUB_ENV
and then:
jobs:
  show:
    runs-on: ubuntu-latest
    steps:
      - name: Is variable exported?
        run: |
          echo "BUILD_NUMBER=yellow" >> $GITHUB_ENV
      - name: PowerShell script
        # You may pin to the exact commit or the version.
        # uses: Amadevus/pwsh-script@25a636480c7bc678a60bbf4e3e5ac03aca6cf2cd
        uses: Amadevus/pwsh-script@v2.0.0
        continue-on-error: true
        with:
          # PowerShell script to execute in Actions-hydrated context
          script: |
            Write-Host $env:BUILD_NUMBER
      - name: Read exported variable
        run: |
          echo "${{ env.BUILD_NUMBER }}"
To set environment variables in a step that can be referenced in another, you will need to use the ::set-env syntax.
In your case, your first script will have to run this command:
Write-Output "::set-env name=BUILD_NUMBER::$buildNumber"
And the second script should be able to reference it with $env:BUILD_NUMBER.
[6/20/20] Update with full example.
Action YAML file (inline PowerShell will behave similarly to a .ps1 file):
name: StackOverFlow
on:
  push:
    branches: [ master ]
jobs:
  build:
    runs-on: windows-latest
    steps:
      - run: |
          $buildNumber = "12345"
          Write-Output "::set-env name=BUILD_NUMBER::$buildNumber"
      - run: Write-Output "Doing something else..."
      - run: Write-Output "The build number is $env:BUILD_NUMBER"
Output logs:
2020-06-20T23:13:23.3209811Z ##[section]Starting: Request a runner to run this job
2020-06-20T23:13:23.5144969Z Can't find any online and idle self-hosted runner in current repository that matches the required labels: 'windows-latest'
2020-06-20T23:13:23.5145013Z Can't find any online and idle self-hosted runner in current repository's account/organization that matches the required labels: 'windows-latest'
2020-06-20T23:13:23.5145038Z Found online and idle hosted runner in current repository's account/organization that matches the required labels: 'windows-latest'
2020-06-20T23:13:23.6348644Z ##[section]Finishing: Request a runner to run this job
2020-06-20T23:13:29.9867339Z Current runner version: '2.263.0'
2020-06-20T23:13:29.9982614Z ##[group]Operating System
2020-06-20T23:13:29.9983190Z Microsoft Windows Server 2019
2020-06-20T23:13:29.9983380Z 10.0.17763
2020-06-20T23:13:29.9983515Z Datacenter
2020-06-20T23:13:29.9983691Z ##[endgroup]
2020-06-20T23:13:29.9983875Z ##[group]Virtual Environment
2020-06-20T23:13:29.9984067Z Environment: windows-2019
2020-06-20T23:13:29.9984247Z Version: 20200608.1
2020-06-20T23:13:29.9984524Z Included Software: https://github.com/actions/virtual-environments/blob/win19/20200608.1/images/win/Windows2019-Readme.md
2020-06-20T23:13:29.9984752Z ##[endgroup]
2020-06-20T23:13:29.9985890Z Prepare workflow directory
2020-06-20T23:13:30.0151643Z Prepare all required actions
2020-06-20T23:13:30.9154166Z ##[group]Run $buildNumber = "12345"
2020-06-20T23:13:30.9154566Z [36;1m$buildNumber = "12345"[0m
2020-06-20T23:13:30.9154784Z [36;1mWrite-Output "::set-env name=BUILD_NUMBER::$buildNumber"[0m
2020-06-20T23:13:30.9820753Z shell: C:\Program Files\PowerShell\7\pwsh.EXE -command ". '{0}'"
2020-06-20T23:13:30.9821156Z ##[endgroup]
2020-06-20T23:13:43.2981407Z ##[group]Run Write-Output "Doing something else..."
2020-06-20T23:13:43.2981812Z [36;1mWrite-Output "Doing something else..."[0m
2020-06-20T23:13:43.3022226Z shell: C:\Program Files\PowerShell\7\pwsh.EXE -command ". '{0}'"
2020-06-20T23:13:43.3022501Z env:
2020-06-20T23:13:43.3022706Z BUILD_NUMBER: 12345
2020-06-20T23:13:43.3022906Z ##[endgroup]
2020-06-20T23:13:43.8091340Z Doing something else...
2020-06-20T23:13:43.8671648Z ##[group]Run Write-Output "The build number is $env:BUILD_NUMBER"
2020-06-20T23:13:43.8671986Z [36;1mWrite-Output "The build number is $($env:BUILD_NUMBER)"[0m
2020-06-20T23:13:43.8717102Z shell: C:\Program Files\PowerShell\7\pwsh.EXE -command ". '{0}'"
2020-06-20T23:13:43.8717288Z env:
2020-06-20T23:13:43.8718175Z BUILD_NUMBER: 12345
2020-06-20T23:13:43.8718286Z ##[endgroup]
2020-06-20T23:13:44.4148124Z The build number is 12345
2020-06-20T23:13:44.4368449Z Cleaning up orphan processes
Found the resolution in Michael Stum's repo that he provided in this question:
The key was Get-ChildItem Env: | Where-Object {$_.Name -Match "^MH_"} | %{ echo "::set-output name=$($_.Name)::$($_.Value)" } in the .yml and $Env:MH_BUILD_VERSION = $version in the .ps1 script file in his repository.
So I successfully retrieved an output from the .ps1 script and used it in GitHub Actions.
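For reference, a minimal sketch of that set-output pattern (the step id, variable name and value here are made up for illustration; ::set-output, like ::set-env, has since been deprecated in favour of writing to $GITHUB_OUTPUT):
jobs:
  build:
    runs-on: windows-latest
    steps:
      - id: vars
        shell: powershell
        run: |
          # Compute a value in PowerShell and expose it as a step output.
          $version = "1.2.3"
          echo "::set-output name=MH_BUILD_VERSION::$version"
      - name: Use the output in a later step
        run: echo "Version is ${{ steps.vars.outputs.MH_BUILD_VERSION }}"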