I'm trying to use the AWS CLI within a script section of an Azure pipeline. The script section is in a template file that is referenced from the main pipeline.
steps:
- bash: |
    step_function_state=`aws stepfunctions list-executions --state-machine-arn $(stateMachineArn) --status-filter RUNNING | jq -r '.executions[]|.status' | head -1`
    echo "State machine RUNNING status: ${step_function_state}"
    # Rest of the script
  displayName: "Test Script"
  env:
    AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
    AWS_DEFAULT_REGION: $(AWS_DEFAULT_REGION)
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
stateMachineArn, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION are stored in a variable group. When running the pipeline, it gives the following error:
An error occurred (UnrecognizedClientException) when calling the ListExecutions operation: The security token included in the request is invalid.
Using the same credentials I am able to run my local CLI and get the results.
I tried the printenv command and all the AWS variables are in the environment too. What could I possibly be doing wrong?
I realized that this issue occurred due to a credential mismatch.
After adding the correct credentials (the same as the local CLI), the pipeline CLI also started to work.
Based on the error log it felt like aws_session_token could be the issue, but the actual problem was in aws_access_key_id and aws_secret_access_key.
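If anyone needs to confirm which identity a pipeline is actually using, a quick check is to run aws sts get-caller-identity in a step before the real script. This is only a debugging sketch; the variable names are the same ones from the question:
- bash: |
    # Prints the account, user id and ARN that the supplied credentials resolve to.
    # If this fails with UnrecognizedClientException, the key pair in the variable group is wrong.
    aws sts get-caller-identity
  displayName: "Debug AWS identity"
  env:
    AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
    AWS_DEFAULT_REGION: $(AWS_DEFAULT_REGION)
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
Comparing the returned Account/Arn with what the local CLI reports makes a mismatch obvious.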
From a DevOps build pipeline, I'd like to run a Bicep file for a deployment into a resource group.
My DevOps service connection is named '365response-tfssc-dev', as seen below:
My yaml job is as follows:
- job: deployAzure
  displayName: deploy bicep to Azure
  pool:
    vmImage: "ubuntu-latest"
  dependsOn: [waitForValidation]
  steps:
  - task: AzureCLI@2
    displayName: Deploy Bicep To Azure
    inputs:
      azureSubscription: "365response-tfssc-dev"
      scriptType: "bash"
      scriptLocation: "inlineScript"
      inlineScript: |
        az deployment group create --resource-group rg-365Response-$(env)-001 \
          --template-file '$(System.DefaultWorkingDirectory)\bicep\365Response.main.bicep' \
          --parameters '$(System.DefaultWorkingDirectory)\bicep\365Response.parameters.$(env).json'
If I run this from the terminal window of VS Code then it works OK.
When this job runs it gives the following message:
/usr/bin/az account set --subscription 'correct subscription id is listed here'
/usr/bin/bash /home/vsts/work/_temp/azureclitaskscript1654444101122.sh
ERROR: unrecognized arguments: ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_SERVICEMANAGEMENTURL=https://m...
this line is very very long
The very long line is as follows:
ERROR: unrecognized arguments: ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_SERVICEMANAGEMENTURL=https://management.core.windows.net/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_ENVIRONMENT=AzureCloud ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_ARMMANAGEMENTPORTALURL=https://portal.azure.com/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_MANAGEMENTPORTALURL=https://manage.windowsazure.com/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_GALLERYURL=https://gallery.azure.com/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_SUBSCRIPTIONID=subIdHere ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8={"environment":"AzureCloud","scopeLevel":"Subscription","subscriptionId":"subIdHere","subscriptionName":"dev-001","creationMode":"Manual","environmentUrl":"https://management.azure.com/","galleryUrl":"https://gallery.azure.com/","serviceManagementUrl":"https://management.core.windows.net/","resourceManagerUrl":"https://management.azure.com/","activeDirectoryAuthority":"https://login.microsoftonline.com/","environmentAuthorityUrl":"https://login.windows.net/","graphUrl":"https://graph.windows.net/","managementPortalUrl":"https://manage.windowsazure.com/","armManagementPortalUrl":"https://portal.azure.com/","activeDirectoryServiceEndpointResourceId":"https://management.core.windows.net/","sqlDatabaseDnsSuffix":".database.windows.net","AzureKeyVaultDnsSuffix":"vault.azure.net","AzureKeyVaultServiceEndpointResourceId":"https://vault.azure.net","StorageEndpointSuffix":"core.windows.net","EnableAdfsAuthentication":"false"} ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_SQLDATABASEDNSSUFFIX=.database.windows.net ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_ENVIRONMENTAUTHORITYURL=https://login.windows.net/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_CREATIONMODE=Manual ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_AZUREKEYVAULTSERVICEENDPOINTRESOURCEID=https://vault.azure.net ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_SUBSCRIPTIONNAME=dev-001 ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_AZUREKEYVAULTDNSSUFFIX=vault.azure.net ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_SCOPELEVEL=Subscription agent.jobstatus=Succeeded ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_ACTIVEDIRECTORYSERVICEENDPOINTRESOURCEID=https://management.core.windows.net/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_GRAPHURL=https://graph.windows.net/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_ENVIRONMENTURL=https://management.azure.com/ ENDPOINT_URL_7940768d-1de7-44d9-92bf-05d293639bc8=https://management.azure.com/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_ACTIVEDIRECTORYAUTHORITY=https://login.microsoftonline.com/ ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_ENABLEADFSAUTHENTICATION=false ENDPOINT_DATA_7940768d-1de7-44d9-92bf-05d293639bc8_RESOURCEMANAGERURL=https://management.azure.com/ SELENIUM_JAR_PATH=/usr/share/java/selenium-server.jar COMMON_TESTRESULTSDIRECTORY=/home/vsts/work/1/TestResults GOROOT_1_17_X64=/opt/hostedtoolcache/go/1.17.10/x64 CONDA=/usr/share/miniconda SYSTEM_JOBNAME=__default AGENT_RETAINDEFAULTENCODING=false JAVA_HOME_11_X64=/usr/lib/jvm/temurin-11-jdk-amd64 SYSTEM_PIPELINESTARTTIME=2022-06-05 15:48:16+00:00 AZURE_CONFIG_DIR=/home/vsts/work/_temp/.azclitask SYSTEM_TASKINSTANCENAME=AzureCLI AGENT_HOMEDIRECTORY=/home/vsts/agents/2.204.0 AGENT_TEMPDIRECTORY=/home/vsts/work/_temp BUILD_REQUESTEDFOREMAIL=aza.'my email here' VSTS_PROCESS_LOOKUP_ID=vsts_8ec9ddb3-be14-4d39-96fe-b09bdd94b311 
SYSTEM_COLLECTIONURI=https://dev.azure.com/idsservicesbeta/ BUILD_DEFINITIONNAME=Scaffolding (1) ENDPOINT_URL_SYSTEMVSSCONNECTION=https://dev.azure.com/idsservicesbeta/ JAVA_HOME=/usr/lib/jvm/temurin-11-jdk-amd64 GRADLE_HOME=/usr/share/gradle-7.4.2 SYSTEM_STAGENAME=deployBicep SYSTEM_JOBPARALLELISMTAG=Private AGENT_OS=Linux BUILD_BUILDURI=vstfs:///Build/Build/1755 AGENT_JOBNAME=deploy bicep to Azure XDG_CONFIG_HOME=/home/vsts/.config DOTNET_SKIP_FIRST_TIME_EXPERIENCE=1 BUILD_REPOSITORY_URI=https://idsservicesbeta#dev.azure.com/idsservicesbeta/365-Response/_git/Scaffolding ANT_HOME=/usr/share/ant RESOURCES_TRIGGERINGALIAS= JAVA_HOME_8_X64=/usr/lib/jvm/temurin-8-jdk-amd64 BUILD_DEFINITIONVERSION=1 HOMEBREW_PREFIX=/home/linuxbrew/.linuxbrew RUNNER_TOOLSDIRECTORY=/opt/hostedtoolcache SYSTEM_SERVERTYPE=Hosted AGENT_USEWORKSPACEID=true BUILD_REQUESTEDFORID=08c91bb3-5fb2-6b27-a830-47c6829ed7f8 SYSTEM_JOBIDENTIFIER=deployBicep.deployAzure.__default SYSTEM_ARTIFACTSDIRECTORY=/home/vsts/work/1/a AGENT_VERSION=2.204.0 HOMEBREW_CLEANUP_PERIODIC_FULL_DAYS=3650 BUILD_SOURCEVERSIONAUTHOR=BizTalkers SYSTEM_JOBDISPLAYNAME=deploy bicep to Azure BUILD_REPOSITORY_NAME=Scaffolding BOOTSTRAP_HASKELL_NONINTERACTIVE=1 PWD=/home/vsts/work/1/s PIPX_BIN_DIR=/opt/pipx_bin BUILD_ARTIFACTSTAGINGDIRECTORY=/home/vsts/work/1/a AGENT_ACCEPTTEEEULA=True BUILD_SOURCEBRANCHNAME=main AGENT_UPLOADTIMELINEATTACHMENTSTOBLOB=true TASK_DISPLAYNAME=Deploy Bicep To Azure BUILD_CONTAINERID=27996509 ANDROID_NDK_LATEST_HOME=/usr/local/lib/android/sdk/ndk/23.2.8568313 RESOURCES_TRIGGERINGCATEGORY= POWERSHELL_DISTRIBUTION_CHANNEL=Azure-DevOps-ubuntu20 SYSTEM_STAGEDISPLAYNAME=deployBicep SYSTEM_PLANID=6892c8d0-c78e-4c67-b035-05b3489e50dc SYSTEM_POSTLINESSPEED=500 BUILD_BUILDNUMBER=Deploy Bicep files 1755 DOTNET_MULTILEVEL_LOOKUP=0 BUILD_REPOSITORY_LOCALPATH=/home/vsts/work/1/s VSTS_AGENT_PERFLOG=/home/vsts/perflog HOME=/home/vsts LANG=C.UTF-8 BUILD_REPOSITORY_PROVIDER=TfsGit STATS_KEEPALIVE=false SYSTEM_TIMELINEID=6892c8d0-c78e-4c67-b035-05b3489e50dc SYSTEM_PHASEDISPLAYNAME=deploy bicep to Azure SYSTEM_TASKDEFINITIONSURI=https://dev.azure.com/idsservicesbeta/ BUILD_STAGINGDIRECTORY=/home/vsts/work/1/a SYSTEM_HOSTTYPE=build AGENT_WORKFOLDER=/home/vsts/work SYSTEM_STAGEID=bc4f992b-d3a8-5fa4-4306-364494a1b562 SYSTEM_DEFINITIONID=45 INVOCATION_ID=ddfbd830e49e4577879f4d283f4ac321 INPUT_SCRIPTARGUMENTS= AGENT_DISABLELOGPLUGIN_TESTFILEPUBLISHERPLUGIN=true TF_BUILD=True JAVA_HOME_17_X64=/usr/lib/jvm/temurin-17-jdk-amd64 AGENT_TASKRESTRICTIONSENFORCEMENTMODE=Enabled AGENT_ROOTDIRECTORY=/home/vsts/work SYSTEM_JOBATTEMPT=1 ANDROID_NDK_HOME=/usr/local/lib/android/sdk/ndk-bundle SYSTEM_DEFINITIONNAME=Scaffolding (1) HOMEBREW_NO_AUTO_UPDATE=1 BUILD_BINARIESDIRECTORY=/home/vsts/work/1/b NVM_DIR=/home/vsts/.nvm SGX_AESM_ADDR=1 SYSTEM_PHASEATTEMPT=1 SYSTEM_ENABLEACCESSTOKEN=SecretVariable SYSTEM_TEAMFOUNDATIONSERVERURI=https://dev.azure.com/idsservicesbeta/ SYSTEM_TASKDISPLAYNAME=Deploy Bicep To Azure BUILD_BUILDID=1755 TEMPLATEFILE=bicep/365Response.main.json BUILD_REPOSITORY_ID=92e4e7ea-8e17-425b-ad1c-899f9922bc0f AGENT_NAME=Hosted Agent ANDROID_HOME=/usr/local/lib/android/sdk SYSTEM_JOBPOSITIONINPHASE=1 AGENT_MACHINENAME=fv-az414-868 ACCEPT_EULA=Y SYSTEM_PULLREQUEST_ISFORK=False SYSTEM_JOBTIMEOUT=60 SYSTEM_TEAMPROJECTID=4009b106-170a-496d-9af8-9ec836b38dc3 SYSTEM_COLLECTIONID=b3e27278-2d93-48a2-af86-fa3370179011 USER=vsts SYSTEM_TEAMPROJECT=365-Response HOMEBREW_CELLAR=/home/linuxbrew/.linuxbrew/Cellar 
BUILD_SOURCEVERSION=715f5872b0f65eade29314d0f30bf57a3f191896 PIPX_HOME=/opt/pipx AGENT_DISABLELOGPLUGIN_TESTRESULTLOGPLUGIN=true SYSTEM_PHASEID=f1ebf77f-30ac-526d-968c-fab23fa199f8 GECKOWEBDRIVER=/usr/local/share/gecko_driver BUILD_REASON=Manual SYSTEM_STAGEATTEMPT=1 CHROMEWEBDRIVER=/usr/local/share/chrome_driver SHLVL=0 SYSTEM=build ANDROID_SDK_ROOT=/usr/local/lib/android/sdk VCPKG_INSTALLATION_ROOT=/usr/local/share/vcpkg HOMEBREW_REPOSITORY=/home/linuxbrew/.linuxbrew/Homebrew ImageVersion=20220529.1 BUILD_SOURCEBRANCH=refs/heads/main AZURE_HTTP_USER_AGENT=VSTS_b3e27278-2d93-48a2-af86-fa3370179011_build_45_0 DOTNET_NOLOGO=1 BUILD_SOURCESDIRECTORY=/home/vsts/work/1/s MSDEPLOY_HTTP_USER_AGENT=VSTS_b3e27278-2d93-48a2-af86-fa3370179011_build_45_0 TASK_SKIPTRANSLATORFORCHECKOUT=False SYSTEM_CULTURE=en-US SYSTEM_WORKFOLDER=/home/vsts/work STATS_PFS=true GRAALVM_11_ROOT=/usr/local/graalvm/graalvm-ce-java11-22.1.0 AGENT_READONLYVARIABLES=true AGENT_ID=8 BUILD_QUEUEDBYID=08c91bb3-5fb2-6b27-a830-47c6829ed7f8 AZURE_EXTENSION_DIR=/opt/az/azcliextensions AGENT_BUILDDIRECTORY=/home/vsts/work/1 BUILD_REQUESTEDFOR=Rob Bowman ANDROID_NDK_ROOT=/usr/local/lib/android/sdk/ndk-bundle CHROME_BIN=/usr/bin/google-chrome AGENT_UPLOADBUILDARTIFACTSTOBLOB=true SYSTEM_DEFAULTWORKINGDIRECTORY=/home/vsts/work/1/s GOROOT_1_18_X64=/opt/hostedtoolcache/go/1.18.2/x64 JOURNAL_STREAM=8:23147 AGENT_OSARCHITECTURE=X64 LEIN_HOME=/usr/local/lib/lein LEIN_JAR=/usr/local/lib/lein/self-installs/leiningen-2.9.8-standalone.jar SYSTEM_ISSCHEDULED=False BUILD_REPOSITORY_GIT_SUBMODULECHECKOUT=False PATH=/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin:/home/vsts/.local/bin:/opt/pipx_bin:/home/vsts/.cargo/bin:/home/vsts/.config/composer/vendor/bin:/usr/local/.ghcup/bin:/home/vsts/.dotnet/tools:/snap/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin SYSTEM_JOBID=d562b731-90ac-599c-aa5d-b4e5e0c32cf4 BUILD_QUEUEDBY=Rob Bowman SWIFT_PATH=/usr/share/swift/usr/bin PIPELINE_WORKSPACE=/home/vsts/work/1 ImageOS=ubuntu20 BUILD_SOURCEVERSIONMESSAGE=ubuntu-latest SYSTEM_TEAMFOUNDATIONCOLLECTIONURI=https://dev.azure.com/idsservicesbeta/ AGENT_LOGTOBLOBSTORAGESERVICE=true LOCATION=uksouth SYSTEM_TASKINSTANCEID=44b963b8-127f-5c06-baab-44a1330fee42 AGENT_JOBSTATUS=Succeeded DEBIAN_FRONTEND=noninteractive GIT_TERMINAL_PROMPT=0 AGENT_TOOLSDIRECTORY=/opt/hostedtoolcache SYSTEM_PHASENAME=deployAzure OLDPWD=/home/vsts/work/1/s SYSTEM_TOTALJOBSINPHASE=1 GOROOT_1_16_X64=/opt/hostedtoolcache/go/1.16.15/x64 _=/usr/bin/env-001
Anyone see where I've gone wrong?
Please check if your service connection's Azure AD service principal has access to the Azure subscription you are trying to deploy to. Your error message doesn't really look like that, but the error message is not fully shown in your question. (That "very long" error message can be very important. ;) )
Most likely you should debug the az deployment group create ... script locally, with the variable values replaced manually, and see if you can reproduce the error. If you can, Azure Pipelines has nothing to do with this; you just need to make your deployment instruction work and all will be good.
If #1 is not applicable for you (e.g. your deployment instruction works totally fine locally but is still failing in the pipeline), my recommendation is to look into the Azure CLI version on the pipeline agent vs. the one you need, and maybe add an Azure CLI upgrade/downgrade task to suit your needs.
For example, we have used these two steps to update the Azure CLI when the MS-hosted agent version contained a bug.
- script: sudo apt-get update
- task: AzureCLI@2
  inputs:
    azureSubscription: ${{ parameters.armServiceConnection }}
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |-
      az --version
      curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
      az --version
The problem was using the ubuntu image; it should have been windows-2022.
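In other words, only the pool changes; a minimal sketch of the fix (everything else in the job stays as in the question):
pool:
  vmImage: "windows-2022"   # instead of "ubuntu-latest", so the Windows-style backslash paths in the inline script resolve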
I'm trying to upload blobs using "az storage blob upload-batch". I got the two errors below.
##[error]Azure CLI 2.x is not installed on this machine.
##[error]Script failed with error: Error: Unable to locate executable file: 'pwsh'. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also verify the file has a valid extension for an executable file.
- task: AzureCLI@2
  displayName: 'Upload Files'
  inputs:
    azureSubscription: 'xxxxxxx'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $subscriptionName = $(az account show --query 'name' --output tsv)
      az storage blob upload-batch `
        --subscription $subscriptionName `
        --account-name xxxxxxxxx `
        --source "C:\xxx\yyy" `
        --destination MyContainerName `
        --auth-mode login
Thank you @N MZ. Posting your suggestion as an answer to help other community members.
##[error]Azure CLI 2.x is not installed on this machine.
##[error]Script failed with error: Error: Unable to locate executable file: 'pwsh'. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also verify the file has a valid extension for an executable file.
For the above errors we need to install the Azure CLI,
and check whether it is installed by running az version.
For more information, please refer to the links below:
MS DOC: Azure File Copy task
SO THREAD: How to run Azure CLI tasks from an Azure DevOps Pipeline
Microsoft Documentation
Although not the best approach, for a quick test, you could install az cli while running the job steps:
jobs:
- job: my_job
  steps:
  - script: curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
    displayName: 'install az cli'
My pipeline publishes two different build artifacts when all its tests have passed - stage: publish_pipeline_as_build.
One of my tests needs to use the build that was made in the current run, of the current version.
But additionally, I need to get the build artifact of the previous version, in order to run some compatibility tests.
How do I download the build artifact from that other pipeline run?
I know the build artifact name (from a runtime script), but how would I find that run?
I tried playing around with azure-cli az pipelines runs artifact list. It requires a --run-id, and my script won't have that.
So far I kind of managed, assuming the response of az pipelines runs list returns the latest match to the query first:
az pipelines runs list --project PROJNAME --query "[?sourceBranch=='refs/heads/releases/R21.3.0.2']" | jq '.[0]'
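A hedged variant of the same call that makes the ordering explicit instead of relying on the default (the --query-order flag is the same one used in the answer further down):
az pipelines runs list --project PROJNAME \
  --query-order FinishTimeDesc \
  --query "[?sourceBranch=='refs/heads/releases/R21.3.0.2'] | [0]"
This way the JMESPath expression picks the first (latest-finished) match directly, so the jq step isn't needed.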
I currently seem to be running out of ideas.
Perhaps just some confused/frustrated questions that pop up:
How can I find that specific build artifact name's latest version and download it?
How are pipeline tasks fed with runtime generated values?
Is this so ridiculously difficult when doing it in Azure DevOps, or am I just going the wrong way?
The job I'm trying to get there with:
jobs:
- job: test_session_integration
  dependsOn: easysales_Build
  steps:
  - template: ./utils/cache_yarn_and_install.yml
  - template: ./utils/update_webdriver.yml
  - template: ./utils/download_artifact.yml
    parameters:
      artifact: easysales_$(Build.BuildId)_build
      path: $(System.DefaultWorkingDirectory)/dist
  # current release name as output
  - template: ./utils/get_release_name.yml
  # previous release name, branch and build name output
  - template: ./utils/get_prev_release.yml
  # clone prev version manually - can't use output variables as task input
  # (BTW: why? that is super inconvenient, is there really no way?)
  - bash: |
      git clone --depth 1 -b $(get_prev_release.BRANCH_NAME) \
        "https://${REPO_USERNAME}:${REPO_TOKEN}@dev.azure.com/organisation/PROJECTNAME/_git/frontend-app" \
        ./reference
    workingDirectory: $(System.DefaultWorkingDirectory)
    env:
      REPO_TOKEN: $(GIT_AUTH_TOKEN)
      REPO_USERNAME: $(GIT_AUTH_USERNAME)
    name: clone_reference_branch
Any clues?
I'd be glad for any rubber-ducking hints or clues on how I could achieve what I need.
I'm new to Azure DevOps and currently struggle to find my way around the vast, but in many places fragmented, documentation Microsoft offers. It's fine, but frankly I struggle quite a bit with it. Is it just me having this problem?
All stages and full YAML on pastebin
The main template with the stages (Expanded templates made with "download full YAML"):
stages:
- stage: install_prepare
  displayName: install & prepare
  jobs:
  - template: az_templates/install_hls_lib_build_job.yml
- stage: test_and_build
  displayName: test and build projects
  dependsOn: install_prepare
  jobs:
  - template: az_templates/build_projects_jobs.yml
  - template: az_templates/test_session_integration_job.yml
- stage: publish_pipeline_as_build
  displayName: Publish finished project artifacts as builds
  dependsOn: test_and_build
  jobs:
  - template: az_templates/build_artifact_publish_jobs.yml
I have by now found a solution. Perhaps not a definitive one, but it should sort of work:
In my library variable group I added an Azure DevOps personal access token with read access to the necessary groups as ADO_PAT_TOKEN
I sign in to azure-cli with that token
I get the latest run id with azure-cli
- bash: |
    az devops configure --defaults organization=$(System.CollectionUri)
    az devops configure --defaults project=$(System.TeamProject)
    echo "$AZ_DO_TOKEN" | az devops login
    AZ_QUERY="[?sourceBranch=='refs/heads/$(prev_release.BRANCH_NAME)'] | [0].id"
    ID=$(az pipelines runs list --query-order FinishTimeDesc --query "$AZ_QUERY")
    echo "##vso[task.setvariable variable=ID;isOutput=true]$ID"
  env:
    AZ_DO_TOKEN: $(ADO_PAT_TOKEN)
  name: prev_build_run
I then download the artifact with azure-cli and the queried run id
- bash: |
    az pipelines runs artifact download \
      --artifact-name 'easysales_$(prev_release.PREV_RELEASE_VERSION)' \
      --run-id $(prev_build_run.ID) \
      --path '$(System.DefaultWorkingDirectory)/reference/dist/easySales'
  workingDirectory: $(System.DefaultWorkingDirectory)
  name: download_prev_release_build_artifact
This roughly seems to work for me now... finally 😉
Missing
The personal access token I added to the secrets may work, but as far as I can see these tokens cannot be created with an expiry date more than one year in the future.
That is not ideal, since I don't want my pipeline to stop working when perhaps no one around knows how to fix it.
Perhaps someone knows how I can use the Azure CLI within the current pipeline without separate authentication, given it accesses only the current organization and project?
Or does anyone see a more elegant alternative to my admittedly clumsy solution?
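One avenue worth checking (I haven't verified it in this exact pipeline, so treat it as a sketch): the job's own System.AccessToken can often be fed to az devops login in place of a stored PAT, which would remove the expiry problem; the build service identity then needs permission to read the pipelines:
- bash: |
    az devops configure --defaults organization=$(System.CollectionUri)
    az devops configure --defaults project=$(System.TeamProject)
    # use the pipeline's own token instead of a personal access token
    echo "$AZ_DO_TOKEN" | az devops login
  env:
    AZ_DO_TOKEN: $(System.AccessToken)
  name: az_devops_login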
I implemented a bunch of infrastructure checks (PowerShell scripts) that need to be run on Windows Servers (most of them use the Get-WmiObject cmdlet). I put them, along with their Pester tests, on GitLab and am trying to build a pipeline.
I have read creating-your-first-windows-container-with-docker-for-windows and building-a-simple-release-pipeline-in-powershell-using-psake-pester-and-psdeploy but I am very confused. My understanding is that to have the code run on GitLab CI, I will need to build a Windows Server docker image?
The following is my .gitlab-ci.yml file, but it has authentication errors; the image can be found here:
image: ltsc2019

stages:
  - build
  - test
  - deploy

build:
  stage: build
  script:
    # run PowerShell script
    - powershell -File "\Deploy\Build.ps1"

test:
  stage: test
  script:
    - powershell -File "\Deploy\CodeCoverage.ps1"

deploy:
  stage: deploy
  script:
    - powershell -File "\Deploy\Deploy_Local.ps1"
It wouldn't pass the initial build, and here are the errors I got:
# Error 1
ERROR: Job failed: Error response from daemon: pull access denied for ltsc2019, repository does not exist or may require 'docker login' (executor_docker.go:168:3s)
# Error 2 (this happened because I added 'shell: "powershell"'
# after executor in the gitlab-runner config file)
ERROR: Preparation failed: Docker doesn't support shells that require script file
ltsc2019 is one tag of the mcr.microsoft.com/windows/servercore image.
You need to reference this image at the beginning of your .gitlab-ci.yml:
image: mcr.microsoft.com/windows/servercore:ltsc2019
For anyone who struggles to get Docker images working on Docker for Windows: please note that the Docker executor currently doesn't support Docker for Windows. Please check out the executor documentation if you are building a pipeline that needs a container to run it.
I'm currently creating a pipeline for Azure DevOps to validate and apply a Terraform configuration to different subscriptions.
My terraform configuration uses modules, those are "hosted" in other repositories in the same Azure DevOps Project as the terraform configuration.
Sadly, when I try to perform terraform init to fetch those modules, the pipeline task hangs there waiting for credential input.
As recommended in the Pipeline Documentation on Running Git Commands in a script, I tried to add a checkout step with the persistCredentials:true attribute.
From what I can see in the log of the task (see below), the credential information is added specifically to the current repo and is not usable for other repos.
The command performed when adding persistCredentials:true
2018-10-22T14:06:54.4347764Z ##[command]git config http.https://my-org@dev.azure.com/my-org/my-project/_git/my-repo.extraheader "AUTHORIZATION: bearer ***"
The output of terraform init task
2018-10-22T14:09:24.1711473Z terraform init -input=false
2018-10-22T14:09:24.2761016Z Initializing modules...
2018-10-22T14:09:24.2783199Z - module.my-module
2018-10-22T14:09:24.2786455Z Getting source "git::https://my-org@dev.azure.com/my-org/my-project/_git/my-module-repo?ref=1.0.2"
How can I set up the git credentials to work for other repositories?
You have essentially two ways of doing this.
Pre-requisite
Make sure that you read and, depending on your needs, that you apply the Enable scripts to run Git commands section from the "Run Git commands in a script" doc.
Solution #1: dynamically insert the System.AccessToken (or a PAT, but I would not recommend it) at pipeline runtime
You could do this either by:
inserting a replacement token such as __SYSTEM_ACCESSTOKEN__ in your code (as Nilsas suggests) and using some token replacement code or the qetza.replacetokens.replacetokens-task.replacetokens task to insert the value. The disadvantage of this solution is that you would also have to replace the token when you run your terraform locally.
using some code to replace all git::https://dev.azure.com text with git::https://YOUR_ACCESS_TOKEN@dev.azure.com.
I used the second approach by using the following bash task script (it searches terragrunt files but you can adapt to terraform files without much change):
- bash: |
    find $(Build.SourcesDirectory)/ -type f -name 'terragrunt.hcl' -exec sed -i 's~git::https://dev.azure.com~git::https://$(System.AccessToken)@dev.azure.com~g' {} \;
Abu Belai offers a PowerShell script to do something similar.
This type of solution does not work, however, if the modules in your terraform modules git repo themselves call modules in another git repo, which was our case.
Solution #2: globally adding the access token in the extraheader of the URL of your terraform modules' git repos
This way, all the modules' repos, called directly by your code or called indirectly by the called modules' code, will be able to use your access token. I did so by adding the following step before your terraform/terragrunt calls:
- bash: |
    git config --global http.https://dev.azure.com/<your-org>/<your-first-repo-project>/_git/<your-first-repo>.extraheader "AUTHORIZATION: bearer $(System.AccessToken)"
    git config --global http.https://dev.azure.com/<your-org>/<your-second-repo-project>/_git/<your-second-repo>.extraheader "AUTHORIZATION: bearer $(System.AccessToken)"
You will need to set the extraheader for each of the called git repos.
Beware that you might need to unset the extraheader after your terraform calls if your pipeline sets the extraheader several times on the same worker. This is because git can get confused with multiple extraheader declarations. You do this by adding the following step:
- bash: |
    git config --global --unset-all http.https://dev.azure.com/<your-org>/<your-first-repo-project>/_git/<your-first-repo>.extraheader
    git config --global --unset-all http.https://dev.azure.com/<your-org>/<your-second-repo-project>/_git/<your-second-repo>.extraheader
I had the same issue; what I ended up doing is tokenizing SYSTEM_ACCESSTOKEN in the terraform configuration. I used the tokenization task in Azure DevOps, where a __ prefix and suffix are used to identify and replace tokens with actual variables (it is customizable, but I find double underscores best for not interfering with any code that I have).
- task: qetza.replacetokens.replacetokens-task.replacetokens@3
  displayName: 'Replace tokens'
  inputs:
    targetFiles: |
      **/*.tfvars
      **/*.tf
    tokenPrefix: '__'
    tokenSuffix: '__'
Something like find $(Build.SourcesDirectory)/ -type f -name 'main.tf' -exec sed -i 's~__SYSTEM_ACCESSTOKEN__~$(System.AccessToken)~g' {} \; would also work if you do not have the ability to install custom extensions in your DevOps organization.
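Wrapped into a pipeline step, that one-liner could look roughly like this (just a sketch; adjust the -name pattern to wherever your module sources live):
- bash: |
    # replace the __SYSTEM_ACCESSTOKEN__ placeholder in the matched .tf files with the job's access token
    find $(Build.SourcesDirectory)/ -type f -name 'main.tf' -exec sed -i 's~__SYSTEM_ACCESSTOKEN__~$(System.AccessToken)~g' {} \;
  displayName: Inject System.AccessToken into module sources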
My terraform main.tf looks like this:
module "app" {
source = "git::https://token:__SYSTEM_ACCESSTOKEN__#dev.azure.com/actualOrgName/actualProjectName/_git/TerraformModules//azure/app-service?ref=__app-service-module-ver__"
....
}
It's not beautiful, but it gets the job done. Module source (at the time of writing) does not support variable input from terraform. So what we can do is use Terrafile, an open source project that helps with keeping track of the modules, and of different versions of the same module, by keeping a simple YAML file next to your code. It seems it's no longer being actively maintained; however, it just works: https://github.com/coretech/terrafile
my example of Terrafile:
app:
  source: "https://token:__SYSTEM_ACCESSTOKEN__@dev.azure.com/actualOrgName/actualProjectName/_git/TerraformModules"
  version: "feature/handle-twitter"
app-stable:
  source: "https://token:__SYSTEM_ACCESSTOKEN__@dev.azure.com/actualOrgName/actualProjectName/_git/TerraformModules"
  version: "1.0.5"
Terrafile by default downloads your modules to the ./vendor directory, so you can point your module source to something like:
module "app" {
source = "./vendor/modules/app-stable/azure/app_service"
....
}
Now you just have to figure out how to execute the terrafile command in the directory where the Terrafile is present.
My azure.pipelines.yml example:
- script: curl -L https://github.com/coretech/terrafile/releases/download/v0.6/terrafile_0.6_Linux_x86_64.tar.gz | tar xz -C $(Agent.ToolsDirectory)
  displayName: Install Terrafile

- script: |
    cd $(Build.Repository.LocalPath)
    $(Agent.ToolsDirectory)/terrafile
  displayName: Download required modules
I did this
_ado_token.ps1
# used in Azure DevOps to allow terraform to auth with Azure DevOps GIT repos
$tfmodules = Get-ChildItem $PSScriptRoot -Recurse -Filter "*.tf"
foreach ($tfmodule in $tfmodules) {
    $content = [System.IO.File]::ReadAllText($tfmodule.FullName).Replace("git::https://myorg@", "git::https://" + $env:SYSTEM_ACCESSTOKEN + "@")
    [System.IO.File]::WriteAllText($tfmodule.FullName, $content)
}
azure-pipelines.yml
- task: PowerShell@2
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  inputs:
    filePath: '_ado_token.ps1'
    pwsh: true
  displayName: '_ado_token.ps1'
I solved the issue by creating a pipeline template that runs an inline PowerShell script. I then pull in the template as a "resource" whenever I use a terraform module from a different repo.
The script does a recursive search for all the .tf files, then uses a regex to update all the module source URLs.
I chose regex over tokenizing the module URL because this makes sure the modules can be pulled in on a development machine without any changes to the source.
parameters:
- name: terraform_directory
  type: string

steps:
- task: PowerShell@2
  displayName: Tokenize TF-Module Sources
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  inputs:
    targetType: 'inline'
    script: |
      $regex = "https://*(.+)dev.azure.com"
      $tokenized_url = "https://token:$($env:SYSTEM_ACCESSTOKEN)@dev.azure.com"
      Write-Host "Recursive Search in ${{ parameters.terraform_directory }}"
      $tffiles = Get-ChildItem -Path "${{ parameters.terraform_directory }}" -Filter "*main.tf" -Recurse -Force
      Write-Host "Found $($tffiles.Count) files ending with 'main.tf'"
      if ($tffiles) { Write-Host $tffiles }
      $tffiles | % {
          Write-Host "Updating file $($_.FullName)"
          $content = Get-Content $_.FullName
          Write-Host "Replace Strings: $($content | Select-String -Pattern $regex)"
          $content -replace $regex, $tokenized_url | Set-Content $_.FullName -Force
          Write-Host "Updated content"
          Write-Host (Get-Content $_.FullName)
      }
As far as I can see, the best way to do this is exactly the same as with any other Git provider. It is only for Azure DevOps that I have ever come across the extraheader approach. I have always used this, and after not being able to get a satisfactory result with the other suggested approaches, I went back to it:
- script: |
    MY_TOKEN=foobar
    git config --global url."https://${MY_TOKEN}@dev.azure.com".insteadOf "https://dev.azure.com"
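In a pipeline you would presumably swap the hard-coded token for the job's access token, something along these lines (untested sketch, same insteadOf approach as above):
- script: |
    # rewrite every https://dev.azure.com URL so git authenticates with the job's access token
    git config --global url."https://$(System.AccessToken)@dev.azure.com".insteadOf "https://dev.azure.com"
  displayName: Configure git auth for dev.azure.com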
I don't think you can. Usually, you create another build and link to the artifacts from that build to use them in your current definition. That way you don't need to connect to a different Git repository.
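If it helps, that "link to the artifacts from another build" idea can be declared with a pipeline resource, roughly like this (the pipeline and artifact names are placeholders, not taken from the question):
resources:
  pipelines:
  - pipeline: otherBuild                 # local alias used by the download step
    source: 'Name-Of-The-Other-Pipeline'

steps:
- download: otherBuild                   # downloads into $(Pipeline.Workspace)/otherBuild
  artifact: some-artifact-name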