As part of automating alert creation in a Log Analytics workspace, we are using Azure DevOps Server and looking for a way to run the az deployment group command in an Azure DevOps pipeline. We have a working pipeline that executes ADO tasks such as AzureCLI and AzureResourceGroupDeployment in Subscription A, using a predefined service connection configured with a service principal (SP-A) that has the required access on Subscription A, and it works as expected.
Now we have a requirement to run the same pipeline against Subscription B, where the currently used service principal SP-A does not have subscription-level access. We are blocked on the further steps because we are only allowed to grant SP-A specific access (such as creating or modifying alert rules in the LAW: Microsoft.Insights/ScheduledQueryRules/[Read, Write, Delete]) to create the alert rules in the Log Analytics workspace of Subscription B.
So we are looking for guidance on the following:
Is it possible to configure a service connection to Subscription B using the same SP-A, which has only the Microsoft.Insights/ScheduledQueryRules/[Read, Write, Delete] permission on the LAW and no access to Subscription B or to the resource group of LAW-B?
If the above will not work, is there any way to run az deployment group commands by passing service principal credentials as parameters on the command line?
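As context for the second option, the generic pattern outside any ADO task would be to log in as the service principal and then deploy. A sketch, assuming the credentials are supplied as secret pipeline variables (all values below are placeholders):

```shell
# Placeholder credentials; in a pipeline these would come from secret variables
SP_APP_ID="<sp-a-app-id>"
SP_SECRET="<sp-a-secret>"
SP_TENANT="<tenant-id>"
RG="<law-b-resource-group>"

# Log in as the service principal, then deploy (requires az CLI and real
# values, so the calls are shown commented out):
# az login --service-principal -u "$SP_APP_ID" -p "$SP_SECRET" --tenant "$SP_TENANT"
# az deployment group create --resource-group "$RG" \
#   --template-file template.json --parameters param.json
```

Note that this only moves the question: SP-A still needs sufficient RBAC permissions on Subscription B for the deployment itself to succeed.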
The working ADO tasks for Subscription A are as follows; we are looking for a way to use the same ones for Subscription B.
- task: AzureCLI@2
  displayName: "verify the deployment changes"
  inputs:
    azureSubscription: ${{ parameters.subscription }}
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az deployment group what-if --resource-group ${{ parameters.resourceGroup }} --template-file $(System.DefaultWorkingDirectory)/template.json --parameters $(System.DefaultWorkingDirectory)/param.json'
- task: AzureResourceGroupDeployment@2
  inputs:
    azureSubscription: ${{ parameters.subscription }}
    action: 'Create Or Update Resource Group'
    resourceGroupName: ${{ parameters.resourceGroup }}
    location: ${{ parameters.region }}
    templateLocation: 'Linked artifact'
    csmFile: '$(System.DefaultWorkingDirectory)/template.json'
    csmParametersFile: '$(System.DefaultWorkingDirectory)/param.json'
    deploymentMode: 'Incremental'
Regarding the first question (whether a service connection to Subscription B can be configured using SP-A with only the Microsoft.Insights/ScheduledQueryRules/[Read, Write, Delete] permission on the LAW and no access to Subscription B or the resource group of LAW-B):
It will not be possible. A custom role with only Microsoft.Insights/ScheduledQueryRules/* is sufficient for managing and creating Scheduled Query Alerts, but for the service principal to find the LAW and deploy the alerts via a resource group deployment, it will at least need read permission on the subscription or resource group, plus Microsoft.Resources/deployments/* to create deployments.
You can also check the built-in roles for Azure Monitor for roles that already include these permissions.
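To illustrate the minimum the answer describes, a custom role for SP-A on Subscription B would need roughly the following actions. This is a sketch; the role name, the extra read actions, and the subscription ID are placeholders/assumptions:

```json
{
  "Name": "LAW Alert Rule Deployer",
  "Description": "Minimal custom role sketch for deploying scheduled query alert rules via ARM deployments",
  "Actions": [
    "Microsoft.Insights/ScheduledQueryRules/Read",
    "Microsoft.Insights/ScheduledQueryRules/Write",
    "Microsoft.Insights/ScheduledQueryRules/Delete",
    "Microsoft.Resources/deployments/*",
    "Microsoft.Resources/subscriptions/resourceGroups/read",
    "Microsoft.OperationalInsights/workspaces/read"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/<subscription-B-id>"
  ]
}
```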
I have an Azure DevOps pipeline with three stages:
- Release_infrastructure_Core
- Release_infrastrcuture_App
- Release_webapps
In the first stage ("Release_infrastructure_Core"), among other tasks, a web app called "Core-wst-api" with a webjob is created.
In the "Release_webapps" stage I need to stop the webapp "Core-wst-api" and execute the webjob through the exposed webhook.
Trying to stop the webapp, I'm getting this error:
##[error]Error: Resource 'Core-wst-api' doesn't exist. Resource should exist before deployment.
Checking in the Azure portal, the webapp had been properly created.
This is the code to stop the webapp:
- task: AzureAppServiceManage@0
  displayName: 'Stop API'
  inputs:
    azureSubscription: ${{ parameters.serviceConnectionName }}
    Action: 'Stop Azure App Service'
    WebAppName: 'Core-wst-api'
I tried executing only the stop task, and it doesn't work.
If I put the webapp publish task and the Stop API task in the same stage, it works fine.
I'm using Azure DevOps to deploy an Azure Function with a timer trigger.
This is the deployment YAML section:
- task: AzureFunctionApp@1
  displayName: 'Deploy Function App initImporter'
  inputs:
    azureSubscription: AzureAutomationSP # Azure service connection
    appType: functionApp
    appName: $(appName)
    package: $(System.ArtifactsDirectory)/**/*.zip
    deploymentMethod: runFromPackage
Everything is OK, but the app deploys directly to /wwwroot of the Function App.
I need to trigger it manually according to this guide - https://learn.microsoft.com/en-us/azure/azure-functions/functions-manually-run-non-http
But I don't have a subfolder /wwwroot/{func name}, so I don't know how to build the URL.
When I deploy the function manually, everything works, but I'm stuck invoking the app deployed to /wwwroot.
az functionapp function show --resource-group MyResourceGroup --name MyFunctionAppName --function-name MyFunctionName also doesn't work, because I don't have a "function-name".
P.S. I found a quite similar unanswered question - How to change azure web app default deployment directory from WWWROOT to WWWROOT/webapps?
There could be two options: deploy to a subfolder, or invoke directly from site/wwwroot.
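For reference, the guide linked above invokes a non-HTTP-triggered function through the function app's admin endpoint. The URL is built from the function app name and the function name, not from a /wwwroot path, so the missing subfolder should not matter. A sketch, where the app and function names are placeholders:

```shell
# Hypothetical names; substitute your own function app and function
APP_NAME="myfuncapp"
FUNC_NAME="initImporter"

# Admin endpoint per the "manually run a non HTTP-triggered function" guide
URL="https://${APP_NAME}.azurewebsites.net/admin/functions/${FUNC_NAME}"
echo "$URL"  # https://myfuncapp.azurewebsites.net/admin/functions/initImporter

# The actual call needs the app's master key (commented out here):
# curl -X POST "$URL" \
#   -H "x-functions-key: <MASTER_KEY>" \
#   -H "Content-Type: application/json" \
#   -d '{"input":""}'
```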
I have a database project that is being deployed to an Azure SQL Database instance. This CI pipeline was working in another environment outside the organization; we lifted and shifted it into this organization. The job that is failing is a deployment job, and the task used is SqlAzureDacpacDeployment@1.
Error message:
##[error]*** An unexpected failure occurred: One or more errors occurred..
##[error]The Azure SQL DACPAC task failed. SqlPackage.exe exited with code 1. Check out how to troubleshoot failures at https://aka.ms/sqlazuredeployreadme#troubleshooting-
Code:
- task: SqlAzureDacpacDeployment@1
  displayName: 'info...'
  inputs:
    azureSubscription: $(ServiceConnection)
    serverName: $(sqlServer)
    databaseName: $(DbName)
    SqlUsername: $(AdminAccount)
    SqlPassword: $(AdminAccountPassword)
    dacpacFile: '$(BuildName)\\db_name\\bin\\Output\\db_name.dacpac'
    publishProfile: '$(BuildName)$(publishProfile)'
The deployment task is using a combination of DACPAC and a publish profile. This is necessary due to extensive usage of SQLCMD variables. The agent is a self-hosted Windows agent. It has been updated. Each time a user defined capability was added the agent service was restarted.
I have validated the account and password by connecting to the target instance with both accounts.
I have tried authenticating with Azure Active Directory principals which are admins on the Azure SQL Database.
I tried using SQL Server authentication.
I have added a user-defined capability for SqlPackage, with compatibility level 150 matching the database compatibility level, to the Windows self-hosted agent.
I tried reducing the database compatibility level from 150 to 130 to match the system-defined capability on the agent.
I verified that the directory structure matches the YAML and that the DACPAC and the publish profile exist.
I verified the values stored in pipe variables outside of the YAML.
I verified that the machine that runs the agent has a firewall rule enabled on the Azure SQL Database instance.
I am now looking for what else to try.
You can use a service principal instead of SQL authentication to deploy the Azure SQL Database.
Refer: https://datasharkx.wordpress.com/2021/03/11/automated-deployment-of-azure-sql-database-azure-sql-data-warehouse-through-azure-devops-via-service-principal-part-1/
https://datasharkx.wordpress.com/2021/03/12/automated-deployment-of-azure-sql-database-azure-sql-data-warehouse-through-azure-devops-via-service-principal-part-2/
Also, remove the publishProfile option and instead provide the project variables in this format:
AdditionalArguments: /v:MyVariable=Y /v:Environment=TST
and this should work.
Your final YAML file should look like this:
- task: SqlAzureDacpacDeployment@1
  displayName: Deploy dacpac
  inputs:
    azureSubscription: $(ServiceConnection)
    ServerName: <server_name>
    DatabaseName: <database_name>
    DacpacFile: $(Pipeline.Workspace)\drop\MyDacpac.dacpac
    AdditionalArguments: /v:ResetStuff=Y /v:Environment=TST
    DeploymentAction: Publish
    AuthenticationType: servicePrincipal
I am trying to create an Azure DevOps pipeline to build out a terraform environment in Azure. I wish the tfstate file to be remote in an Azure Storage account. There are lots of simple examples to do this if you wish the storage account to remain publicly accessible.
However I do not. I would like to restrict access to the storage account using a SAS Token.
However I am having a hard time:
Finding reasonable references on this subject.
Finding anything that helps me define the sas token in the pipeline yaml.
My idea was that the SAS Token would be a security pipeline variable or part of a variable group which would then be inserted into the pipeline yaml, and then passed to the underlying terraform.
Attempts at both script and TerraformTaskV1 constructs have failed. The latest error I received during the pipeline run, for the init command, is:
Error: Failed to get existing workspaces: storage: service returned error: StatusCode=403, ErrorCode=AuthorizationFailure, ErrorMessage=This request is not authorized to perform this operation.
I believe this is telling me that the SAS token definition is failing because it's not being applied. I have tested the token manually in a VM in our subscription.
Here is the current attempt:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
  displayName: 'Terraform init'
  inputs:
    provider: 'azurerm'
    command: 'init'
    workingDirectory: '$(System.DefaultWorkingDirectory)/modules/terraform/basic-sastoken'
    backendServiceArm: $(service_connection)
    backendAzureRmResourceGroupName: $(resource_group_name)
    backendAzureRmStorageAccountName: $(storage_account_name)
    backendAzureRmContainerName: $(container_name)
    backendAzureRmKey: $(key)
    commandOptions: -input=false -var "sastoken=$(sas_token)"
OK, so what are my options?
Is this an impossible task? Is this not supported outside a narrow Microsoft happy path? Do I need to build my own agent and scale set, and would that even help? Are there any decent references?
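For what it's worth, the azurerm backend does document a sas_token configuration value, and backend values are supplied at init time (a -var has no effect on backend configuration, which may explain the 403 above). A sketch, with placeholder names:

```hcl
terraform {
  backend "azurerm" {
    storage_account_name = "mystatestorage"        # placeholder
    container_name       = "tfstate"               # placeholder
    key                  = "basic-sastoken.tfstate"
    # The SAS token is deliberately kept out of source control; supply it at
    # init time instead, e.g.:
    #   terraform init -backend-config="sas_token=$(sas_token)"
  }
}
```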
Please do not use a SAS token but a service principal to access the storage account. Give the SP Contributor rights on the storage account and you should be good to go!
steps:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-release-task.TerraformTaskV1@0
  displayName: 'Terraform : init'
  inputs:
    provider: 'azurerm'
    command: 'init'
    workingDirectory: '$(System.DefaultWorkingDirectory)/<YOUR TERRAFORM FILES>'
    backendServiceArm: '<SERVICE CONNECTION YOU CREATED>'
    backendAzureRmResourceGroupName: '<RESOURCE GROUP YOUR STATE STORAGE ACCOUNT IS LOCATED IN>'
    backendAzureRmStorageAccountName: <NAME OF STORAGE ACCOUNT WITH STATE>
    backendAzureRmContainerName: <CONTAINER NAME>
    backendAzureRmKey: '<TERRAFORM STATE KEY>'
Replace each <...> placeholder with your own value.
Regarding the service connection, you can create it in DevOps :).
I am using the Bot Framework Virtual Assistant template to create and configure a bot in Azure.
For this process I have an ARM template in place for creating the resources, and a deployment PowerShell script (Deploy.ps1) is used to create the knowledge base once the QnA Maker resources are created.
In the current implementation, if I execute the script from a local PowerShell tool, everything works fine:
Creating resources
Creating the knowledge base
Knowledge base configuration
I am stuck at configuring this setup in Azure DevOps. How do I configure the ARM deployment and the PowerShell script execution in a CI/CD pipeline, so that once the resources are created through the ARM deployment, the knowledge base creation is triggered automatically?
Any help is appreciated.
First you need to put the ARM template in a source repository (GitHub or Azure Repos). See the document Create a new Git repo in your project.
Then create the pipeline (YAML or Classic). See the YAML example here; for a Classic UI pipeline, check out this example.
Before you can deploy to your Azure subscription, you need to create an Azure Resource Manager service connection to connect your Azure subscription to Azure DevOps. See this thread for an example.
In your pipeline, use the ARM template deployment task to deploy the ARM template, and use the Azure PowerShell task to execute the deployment PS script. See the example below:
trigger:
- master

pool:
  vmImage: windows-latest

steps:
- task: AzureResourceManagerTemplateDeployment@3
  displayName: 'ARM Template deployment: Resource Group scope'
  inputs:
    azureResourceManagerConnection: 'my-azure-sub'
    resourceGroupName: 'azure resource group'
    location: 'West Europe'
    csmFile: '**/template.json'
    csmParametersFile: '**/parameter.json'
    deploymentMode: Incremental
- task: AzurePowerShell@5
  displayName: 'Azure PowerShell script: FilePath'
  inputs:
    azureSubscription: 'my-azure-sub'
    ScriptPath: '**/Deploy.ps1'
    azurePowerShellVersion: LatestVersion
See this tutorial for more information.