Accessing key vault SAS token in Azure DevOps pipeline

Within Azure DevOps, my pipeline emits the error:
##[error]Get secrets failed. Error: Could not fetch access token for Azure. Verify if the Service Principal used is valid and not expired..
I am fetching other values from the key vault without fault, and I have verified that my service principal has the right permissions to read from the key vault, which is where the SAS token is stored.
In fact, the service principal is created dynamically via Terraform and has no set expiration, so that rules out that root cause.
I also checked within Azure Key Vault and showed the sas-token secret to verify that it is stored correctly.
The code block within my azure-pipelines YAML file reads the token as an environment variable:
# Init
- task: TerraformCLI@0
  displayName: Initialize Terraform
  env:
    # interpolate the SAS secret environment variable pulled from the key vault
    ARM_SAS_TOKEN: $(sas_token)
  inputs:
    command: init
    # working directory points to the context within which the command above is run
    workingDirectory: $(System.DefaultWorkingDirectory)/ned-cloud-tutorial/vnet
    # configure the backend: storage account, container name and key
    commandOptions: -backend-config=storage_account_name=$(storageaccount) -backend-config=container_name=$(container_name) -backend-config=key=$(key)
    backendType: selfConfigured
I am more than happy to share more code if additional information is needed, as this is a very particular question and I have had no luck on Stack Overflow or in general Google queries. Any help would be appreciated.
Thanks in advance!
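For reference, one way the secret could be surfaced to the step above is to fetch it explicitly with the AzureKeyVault task earlier in the job; the service connection and vault names below are hypothetical, not taken from the question:

```yaml
# Hypothetical sketch: fetch the sas-token secret from the vault so it can be
# referenced by later steps. 'my-arm-connection' and 'my-vault' are assumed names.
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-arm-connection'
    KeyVaultName: 'my-vault'
    SecretsFilter: 'sas-token'
```

Note that the fetched pipeline variable takes the secret's own name, and Key Vault secret names cannot contain underscores, so a secret named sas-token is referenced as $(sas-token), not $(sas_token).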

Related

I am getting the errors "Either Value or Key vault must be provided" and "Secret Identifier is not in the correct format"

I have an Azure Key Vault service in which we maintain secrets.
I have to deploy an APIM service using an ARM job in an Azure DevOps release pipeline, so I added this job and configured template.json and parameter.json. How do I pass a Key Vault secret as an override parameter to the ARM job? I tried the option below.
I added the Key Vault variable group in Azure Pipelines, then in the override parameters I called $(keyvaultname/secretname), saved it, and ran the pipeline, but I am getting the issue below.
Please go to Pipelines -> Library -> create a variable group which contains the key vault secrets.
Link the variable group in your pipeline and make sure the secret variable is listed.
In the ARM task, override the parameter with the "$(var)" name.
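In YAML form, a hedged sketch of the same idea (the connection name and secret name are illustrative, not from the question):

```yaml
# Hypothetical sketch: the variable group linked to the Key Vault exposes the
# secret under its own name, e.g. $(apimAdminPassword); reference it that way
# rather than as $(keyvaultname/secretname).
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    azureResourceManagerConnection: 'my-arm-connection'  # assumed name
    csmFile: 'template.json'
    csmParametersFile: 'parameter.json'
    overrideParameters: '-adminPassword "$(apimAdminPassword)"'
```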
I created the variable group and then went back to the release pipeline ARM job and set the override parameter.
(Screenshots: ARM job override parameter; variable group)

Azure DevOps pipeline for Pulumi when using Azure blob storage as backend

Instead of using Pulumi service (managed) backend, I am using Azure blob container as stack state backend. According to the documentation, Pulumi CLI would expect AZURE_STORAGE_KEY (or AZURE_STORAGE_SAS_TOKEN) environment variable in the Pipeline Agent.
When the account key is provided as a pipeline variable, it works. But when the account key is stored in KeyVault as a secret, it does not.
What I did:
Account key in KeyVault as secret
→ (pipeline variable group) Link secrets from an Azure key vault as variables
→ → (Pipeline variables) Link variable group
→ → → Make account key available to Pulumi CLI (in pipeline agent) as environment variable
The problem:
KeyVault secret name can not contain '_'
Secret name for account key is AZURE-STORAGE-KEY (not AZURE_STORAGE_KEY since secret name can not contain '_')
Environment variable in pipeline agent becomes AZURE-STORAGE-KEY
So, the problem is obvious: environment variable name mismatch → Pulumi CLI is not getting what it expects (AZURE_STORAGE_KEY).
FYI,
I am using "classic editor" and "Pulumi Azure Pipelines Task"
I tried creating a pipeline variable with "Name: azure.storage.key, Value: $(AZURE-STORAGE-KEY)", hoping that this variable's value would be set from the KeyVault secret since the secrets are linked to the variable group → did not work
I tried setting the environment variable in a PowerShell task ($env:AZURE_STORAGE_KEY = "$(AZURE-STORAGE-KEY)"); note that this PowerShell task runs before the "Pulumi Azure Pipelines Task" → did not work
The Pulumi documentation (Pulumi Task Extension for Azure Pipelines) and
another StackOverflow question (Pulumi Azure Pipeline task) did not help.
How to solve this problem?
Is there any better approach? (providing account key to Pulumi CLI in pipeline agent securely).
Is there any way to achieve this if YAML (azure-pipelines.yml) is used? how? (Any work around or hint would also help)
You need to set the variable using a separate PowerShell task:
echo "##vso[task.setvariable variable=AZURE_STORAGE_KEY;]$(AZURE-STORAGE-KEY)"
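As a sketch, that logging command would live in a script step placed before the Pulumi task; once set this way, the variable also becomes available as the AZURE_STORAGE_KEY environment variable in subsequent steps:

```yaml
# Sketch: re-publish the Key Vault-sourced secret under the underscore name
# that the Pulumi CLI expects. Must run before the Pulumi task.
- powershell: |
    echo "##vso[task.setvariable variable=AZURE_STORAGE_KEY;]$(AZURE-STORAGE-KEY)"
  displayName: 'Map AZURE-STORAGE-KEY to AZURE_STORAGE_KEY'
```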
I was able to solve the problem:
As mentioned in my question, KeyVault secrets are linked to variable group
Variable group is linked to pipeline variables (so now KeyVault secrets are available as pipeline variables)
According to Microsoft documentation, secret type variables are not injected into the task automatically
Used env for task to inject environment variables (that are expected by Pulumi CLI) by interpolating pipeline variables
azure-pipelines.yml
pool:
  vmImage: 'ubuntu-latest'
variables:
  - group: "pulumi-demo-vg"
jobs:
  - job: PulumiUpJob
    displayName: Pulumi Up Stack Deployment Job
    steps:
      - task: Pulumi@1
        inputs:
          azureSubscription: 'pulumi-demo-sc' # sc -> Service Connection
          command: 'up'
          loginArgs: 'azblob://pulumi-backend-container'
          args: '--yes'
          stack: 'dev'
        env:
          AZURE_STORAGE_ACCOUNT: "xxxsa"
          AZURE_STORAGE_KEY: $(AZURE-STORAGE-KEY)
          AZURE_CLIENT_ID: $(AZURE-CLIENT-ID)
          AZURE_CLIENT_SECRET: $(AZURE-CLIENT-SECRET)
          AZURE_TENANT_ID: $(AZURE-TENANT-ID)

AzureSubscription and azureContainerRegistry connection from Library

I have created a Docker Compose task in my pipeline and Azure DevOps generated the code. The azureSubscription and the azureContainerRegistry connection are very clear.
I tried to replace them with variable from the Library but when the pipeline starts I immediately get an error.
There was a resource authorization issue: "The pipeline is not valid. Job Build: Step DockerCompose1 input azureSubscriptionEndpoint references service connection $(AzureSubscription) which could not be found. The service connection does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz. Job Build: Step DockerCompose2 input azureSubscriptionEndpoint references service connection $(AzureSubscription) which could not be found. The service connection does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz."
Basically, Azure DevOps can't replace the variable with its value for those particular parameters. I don't want to send around those configurations, for obvious reasons.
I saw some old posts where Microsoft said this was an issue in DevOps. Is this issue still there? Is there any way to move those values into the Library or into variables?
This is still an issue. The value has to be a literal or a variable defined in the YAML itself; it cannot be a variable provided via a variable group, for instance. Please check these topics:
How to parametrize azureSubscription in azure devops template task
Azure subscription endpoint ID cannot be provided through a variable in build definition YAML file
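One workaround, sketched below under the assumption that the connection name is known when the pipeline is authored: template parameters are expanded at compile time, so ${{ parameters.x }} works where a runtime $(var) does not (all names here are illustrative):

```yaml
# Hypothetical sketch: pass the service connection name as a compile-time
# template parameter instead of a runtime variable.
parameters:
  - name: azureSubscription
    type: string
    default: 'my-service-connection'   # assumed connection name

steps:
  - task: DockerCompose@0
    inputs:
      azureSubscriptionEndpoint: ${{ parameters.azureSubscription }}
```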

Can the Azure DevOps Terraform task use a Storage Account SAS token to store the remote state file?

I am trying to create an Azure DevOps pipeline to build out a terraform environment in Azure. I wish the tfstate file to be remote in an Azure Storage account. There are lots of simple examples to do this if you wish the storage account to remain publicly accessible.
However I do not. I would like to restrict access to the storage account using a SAS Token.
However I am having a hard time:
Finding reasonable references on this subject.
Finding anything that helps me define the sas token in the pipeline yaml.
My idea was that the SAS Token would be a security pipeline variable or part of a variable group which would then be inserted into the pipeline yaml, and then passed to the underlying terraform.
Attempts with script and TerraformTaskV1 constructs have failed. The latest error I received during the pipeline run of the init command is:
Error: Failed to get existing workspaces: storage: service returned error: StatusCode=403, ErrorCode=AuthorizationFailure, ErrorMessage=This request is not authorized to perform this operation.
I believe this is telling me that the SAS token definition is failing because it is not being applied. I have tested the token manually in a VM in our subscription.
Here is the current attempt:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
  displayName: 'Terraform init'
  inputs:
    provider: 'azurerm'
    command: 'init'
    workingDirectory: '$(System.DefaultWorkingDirectory)/modules/terraform/basic-sastoken'
    backendServiceArm: $(service_connection)
    backendAzureRmResourceGroupName: $(resource_group_name)
    backendAzureRmStorageAccountName: $(storage_account_name)
    backendAzureRmContainerName: $(container_name)
    backendAzureRmKey: $(key)
    commandOptions: -input=false -var "sastoken=$(sas_token)"
Ok, so what are my options?
Is this an impossible task? Is this unsupported outside a narrow Microsoft happy path? Do I need to build my own agent and scale set, and would that even help? Are there any decent references?
Please do not use a SAS token but a service principal to access the storage account. Give the SP Contributor rights on the storage account and you should be good to go!
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
  displayName: 'Terraform : init'
  inputs:
    workingDirectory: '$(System.DefaultWorkingDirectory)/<YOUR TERRAFORM FILES>'
    backendServiceArm: '<SERVICE CONNECTION YOU CREATED>'
    backendAzureRmResourceGroupName: '<RESOURCE GROUP YOUR STATE STORAGE ACCOUNT IS LOCATED IN>'
    backendAzureRmStorageAccountName: <NAME OF STORAGE ACCOUNT WITH STATE>
    backendAzureRmContainerName: <CONTAINER NAME>
    backendAzureRmKey: '<TERRAFORM STATE KEY>'
Replace the <> placeholders with your own values.
Regarding the service connection, you can create it in DevOps :).
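If it helps, here is a hedged Azure CLI sketch of the role assignment behind that advice (all identifiers are placeholders; the narrower "Storage Blob Data Contributor" role is usually enough for state access, rather than full Contributor):

```shell
# Hypothetical sketch: grant the service principal data-plane access to the
# storage account that holds the Terraform state.
az role assignment create \
  --assignee <SERVICE_PRINCIPAL_APP_ID> \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Storage/storageAccounts/<STORAGE_ACCOUNT>"
```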

Azure DevOps Terraform Init - Remote State - Failed to get existing workspaces: containers.Client#ListBlobs: Failure sending request: StatusCode=0

Experiencing terraform for the very first time, I'm following the document from this link to run my terraform files in a release pipeline that I have in Azure DevOps. Everything runs perfectly fine until the step that initializes terraform. It fails with the following error message:
The storage account itself is provisioned and the key of that also is persisted successfully in the environment variables as per the document.
The YAML I have for terraform init in Azure DevOps Release pipeline is:
And the terraform script for the backend service is:
The variables are stored as environment variables inside the release pipeline and there is a replace token task that replaces __ with string empty:
Here is the step in the pipeline that creates the resource group and storage account:
And finally, the PS script that stores the storage key in the env vars:
Also, I can't understand why the GET URL in the error message has env appended to terraform.tfstate.
I'm running out of ideas about why it fails with that exception and what it actually expects.
I've been Googling around but have so far failed to resolve the issue. Appreciate your help/thoughts on this.
Looks like you misspelled storageaccount in your variable, so the value is not substituted. You have sotrageaccount; the t and o are swapped.
While Christian Pearce did answer the immediate question, there is an underlying problem behind this message: something is wrong with your Storage Account settings.
The issue I had was that I had placed path information in the container name:
- task: TerraformTaskV3@3
  displayName: Terraform Init
  inputs:
    provider: azurerm
    command: init
    backendServiceArm: [service connection]
    backendAzureRmResourceGroupName: [resource group]
    backendAzureRmStorageAccountName: [storage account name]
    backendAzureRmContainerName: [container]/subfolder # <-- bad: the subfolder belongs in the key field
    backendAzureRmKey: ${{ parameters.name }}/${{ parameters.environment }}.tfstate
    workingDirectory: $(Pipeline.Workspace)/Infrastructure
FYI: I got the same error when specifying a container name (which had not yet been created in the storage account) with upper-case characters, which turned out not to be allowed in Azure Storage.