I'm running Azure DevOps where I have a pipeline and a release (running on a self-hosted agent), and the release is set to deploy to an Azure App Service. The deployment works fine; the only issue is that I'd also like to be able to (based on some of my release variables) edit the web.config of the site AFTER it has already been deployed to the Azure website.
I'm using the Azure PowerShell task ( https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-powershell?view=azure-devops ), and I can't find anything in the release variables ( https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azure-devops&tabs=batch ) that shows the directory where the site lives. Looking through Kudu, it's pretty basic, like d:\home\site\wwwroot\ , but using that doesn't work at all.
Is this post-deployment config edit I'm looking for not really possible, or should I be approaching it a different way?
I think you can use a PowerShell task to call the Kudu API to get the deployed web.config and edit it using the Magic Chunks task or the File Transform task in your release pipeline. Then use the Kudu API to upload the changed web.config to the Azure website again.
1. The script below shows how to get the web.config from the Azure website.
$srcResGroupName = "Test"
$srcWebAppName = "tstest12"
$outwebconfig="$(System.DefaultWorkingDirectory)\tempfolder\web.config"
# Get publishing profile for SOURCE application
$srcWebApp = Get-AzWebApp -Name $srcWebAppName -ResourceGroupName $srcResGroupName
[xml]$publishingProfile = Get-AzWebAppPublishingProfile -WebApp $srcWebApp
# Create Base64 authorization header
$username = $publishingProfile.publishData.publishProfile[0].userName
$password = $publishingProfile.publishData.publishProfile[0].userPWD
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
$apiBaseUrl = "https://$($srcWebApp.Name).scm.azurewebsites.net/api/vfs/site/wwwroot/web.config"
# Download the web.config file to $outwebconfig
Invoke-RestMethod -Uri "$apiBaseUrl" `
    -Headers @{Authorization = ("Basic {0}" -f $base64AuthInfo)} `
    -UserAgent "powershell/1.0" `
    -Method GET `
    -OutFile $outwebconfig
The above script downloads the web.config from the Azure website and saves it to $(System.DefaultWorkingDirectory)\tempfolder\web.config, where you can edit it with a transform task later.
The script above retrieves the username and password programmatically; you can also get them from the publish profile by going to the Overview blade of your App Service, clicking ...More at the top of the blade, and then clicking Get publish profile.
2. Then you can add a config transform task to change your web.config accordingly.
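Alternatively, since the goal is to change values based on release variables, you could also edit the XML directly in a PowerShell task instead of a transform task. A minimal sketch, assuming an appSettings key named "MySetting" and a release variable MyValue (both names are assumptions):
# Sketch: edit an appSettings value in the downloaded web.config (key and variable names are assumptions)
$configPath = "$(System.DefaultWorkingDirectory)\tempfolder\web.config"
[xml]$config = Get-Content $configPath
# Find the appSettings entry and overwrite its value with the release variable
$node = $config.configuration.appSettings.add | Where-Object { $_.key -eq "MySetting" }
$node.value = "$(MyValue)"   # pipeline macro, expanded when used in an inline script
$config.Save($configPath)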
3. Finally, add a PowerShell task to upload the changed web.config to the Azure website.
$srcResGroupName = "Test"
$srcWebAppName = "tstest12"
$webconfig="$(System.DefaultWorkingDirectory)\tempfolder\web.config"
# Get publishing profile for SOURCE application
$srcWebApp = Get-AzWebApp -Name $srcWebAppName -ResourceGroupName $srcResGroupName
[xml]$publishingProfile = Get-AzWebAppPublishingProfile -WebApp $srcWebApp
# Create Base64 authorization header
$username = $publishingProfile.publishData.publishProfile[0].userName
$password = $publishingProfile.publishData.publishProfile[0].userPWD
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
$apiUrl = "https://$($srcWebApp.Name).scm.azurewebsites.net/api/vfs/site/wwwroot/web.config";
Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent "powershell/1.0" -Method PUT -InFile $webconfig -ContentType "application/xml";
For more usage of the Kudu API you can check here.
I am trying to access an Azure Storage account via an Azure Windows VM.
I followed this Microsoft document: https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-datalake
I followed almost all steps mentioned in the above document, and the JWT access token is also generated successfully, but my commands for uploading/downloading files are throwing errors.
Error: InvalidAuthenticationInfo: Authentication information is not given in the correct format
Please correct me if I used any wrong commands for downloading/uploading files from the virtual machine through the managed identity.
Commands used for generating the JWT token:
$response = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://adlsrg.blob.core.windows.net/' -Method GET -Headers @{Metadata="true"}
$content = $response.Content | ConvertFrom-Json
$AccessToken = $content.access_token
To access storage accounts, you need to generate an access token for the https://storage.azure.com resource.
I tried to reproduce the same in my environment and got below results:
I created one VM and enabled system-assigned managed identity like below:
Assign the Storage Blob Data Contributor role to the VM under your storage account as below:
Go to Azure Portal -> Storage accounts -> Your account -> Access Control (IAM) -> Add role assignment
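If you prefer to script the role assignment instead of using the portal, a rough Az PowerShell sketch (the resource group, VM, and storage account names here are assumptions):
# Sketch: assign the role to the VM's system-assigned identity (resource names are assumptions)
$vm = Get-AzVM -ResourceGroupName "my-rg" -Name "my-vm"
New-AzRoleAssignment -ObjectId $vm.Identity.PrincipalId `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/sristorageacc5"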
Now connect to the VM and run the below PowerShell commands to get an access token:
$response = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://storage.azure.com' -Method GET -Headers @{Metadata="true"}
$content = $response.Content | ConvertFrom-Json
$AccessToken = $content.access_token
Response:
To upload a file to the storage account, you can use the below script:
$file = "C:\Users\sri\Desktop\hello.txt" #File path
$name = (Get-Item $file).Name
$url="https://sristorageacc5.blob.core.windows.net/sri/$($name)"
$RequestHeader = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$RequestHeader.Add("Authorization", "Bearer $AccessToken")
$RequestHeader.Add("x-ms-version", "2019-02-02")
$RequestHeader.Add("x-ms-blob-type", "BlockBlob")
$result = Invoke-WebRequest -Uri $url -Method Put -Headers $RequestHeader -InFile $file
Response:
When I checked the same in the Portal, the file was uploaded to the container successfully, like below:
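Since the question also mentions downloading, a similar GET request with the same bearer token should work. A sketch, reusing the headers style from the upload script (the blob URL and local output path are assumptions):
# Sketch: download the same blob with a GET request (URL and output path are assumptions)
$downloadUrl = "https://sristorageacc5.blob.core.windows.net/sri/hello.txt"
$RequestHeader = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$RequestHeader.Add("Authorization", "Bearer $AccessToken")
$RequestHeader.Add("x-ms-version", "2019-02-02")
Invoke-WebRequest -Uri $downloadUrl -Method Get -Headers $RequestHeader -OutFile "C:\Users\sri\Desktop\hello-downloaded.txt"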
I'm using the AzureWebApp@1 task to publish the contents of my front-end app; however, it has to be deployed to a folder inside the wwwroot of the App Service. I tried using customDeployFolder but it is not working. Is there a way I can achieve this using YAML? Thanks
Thanks @levi-lu-msft, your answer helped a lot.
You can use the Kudu API to deploy the artifacts to a custom folder outside wwwroot of the Azure App Service. You need to add an Azure PowerShell task to your release pipeline and call the Kudu API. The scripts below are an example.
1. Script to create the directory CustomDomain
$WebApp = Get-AzWebApp -Name '<appname>' -ResourceGroupName '<resourcegroupname>'
[xml]$publishingProfile = Get-AzWebAppPublishingProfile -WebApp $WebApp
# Create Base64 authorization header
$username = $publishingProfile.publishData.publishProfile[0].userName
$password = $publishingProfile.publishData.publishProfile[0].userPWD
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
$bodyToPOST = @{
    command = "md CustomDomain"
    dir     = "D:\home\site"
}
# Splat all parameters together in $param
$param = @{
    # command REST API url
    Uri         = "https://<appname>.scm.azurewebsites.net/api/command"
    Headers     = @{Authorization = ("Basic {0}" -f $base64AuthInfo)}
    UserAgent   = "powershell/1.0"
    Method      = "POST"
    Body        = (ConvertTo-Json $bodyToPOST)
    ContentType = "application/json"
}
# Invoke REST call
Invoke-RestMethod @param
The above script first gets the username and password from your app's publish profile, which are used later as authentication when calling the Kudu API. The command API then runs your self-defined command to make the directory CustomDomain in "d:\home\site".
2. Deploy your app using the Kudu API.
When the CustomDomain directory is created, you can invoke the Kudu API to deploy your app to the CustomDomain directory. Please refer to the example below.
$param = @{
    # zipdeploy api url
    Uri         = "https://<appname>.scm.azurewebsites.net/api/zip/site/CustomDomain"
    Headers     = @{Authorization = ("Basic {0}" -f $base64AuthInfo)}
    UserAgent   = "powershell/1.0"
    Method      = "PUT"
    # Deployment Artifact Path
    InFile      = "$(System.DefaultWorkingDirectory)\<artifacts_alias>\drop\<artifacts_name>.zip"
    ContentType = "multipart/form-data"
}
# Invoke REST call
Invoke-RestMethod @param
The value of InFile should point to the location of the artifact file downloaded by your release pipeline. Usually it is located at "$(System.DefaultWorkingDirectory)\<artifacts_alias>\drop\<artifacts_name>.zip".
Refer here for more info
Looking at the az pipelines documentation, it seems it's not possible to clone a pipeline using the CLI.
I've looked at getting the YAML (az pipelines show -name=x > x_orig.yaml) and then trying to change the JSON and create a pipeline from the modified YAML, but that feels like a lot of work that could break after the next update.
Is there a way to clone a pipeline without going through the Web UI?
Currently, there is indeed no Azure CLI command available that can clone or export/import a pipeline to create a new one.
I also searched and tried the Azure DevOps REST API for Pipelines, but did not find a suitable API.
Ideally, the Azure CLI "az pipelines create" could provide an input parameter that allows users to specify an existing pipeline as a starting point for the new pipeline.
If your projects really need this feature, I recommend reporting a feature request on the "Azure/azure-cli" repository asking to add a parameter like the one mentioned above. That allows you to interact directly with the appropriate engineering team and makes it easier for them to collect and categorize your suggestions.
As a workaround, we could clone the build definition via a PowerShell script that calls the REST API.
Note: We need to change the original build definition name.
REST API
Get build definition:
GET https://dev.azure.com/{organization}/{project}/_apis/build/definitions/{definitionId}?api-version=6.0
Create build definition
POST https://dev.azure.com/{organization}/{project}/_apis/build/definitions?api-version=6.0
PowerShell script
$connectionToken="{pat}"
$base64AuthInfo= [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
$BuildDefinitionInfoURL = "https://dev.azure.com/{org name}/{project name}/_apis/build/definitions/386?api-version=6.0"
$BuildDefinitionInfo = Invoke-RestMethod -Uri $BuildDefinitionInfoURL -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get
Write-Host $BuildDefinitionInfo.name
$BuildDefinitionInfo.name = $BuildDefinitionInfo.name +" clone"
Write-Host $BuildDefinitionInfo.name
$body = $BuildDefinitionInfo | ConvertTo-Json -Depth 99
$createBuildDefinitionURL = "https://dev.azure.com/{org name}/{project name}/_apis/build/definitions?api-version=6.0"
$response = Invoke-RestMethod -Uri $createBuildDefinitionURL -ContentType "application/json" -Body $body -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Method POST
Write-Host $response.id
Result:
I'm just looking for direction here as, possibly, the API already does this and I'm misunderstanding / can't find the right resource.
What I would like to do is to be able to call the Azure DevOps API to create a new build definition for me when I supply it with all the necessary YAML files for each stage.
I expected a create endpoint which would take in a few basic pieces of information to create the build/release definition, then a collection of YAML files to create the tasks.
I've found Create your first pipeline and the API 5.0 BuildDefinition/Create; however, neither of these mentions posting a YAML definition to the API. I was expecting far fewer items in the request body, considering the YAML definitions contain most of the required information.
Does the api support this? Will it ever support this?
There are no docs for the REST API with YAML, but if you try to get an existing YAML definition you'll see the following in the response:
So if you want to edit the process, you have to edit the existing YAML file. If you want to create/clone an existing build definition, you may try to create/clone the YAML file and post a request (Definitions - Create) with the process member (see the sketch after this list):
yamlFilename = path to yaml file in the repository
type = 2
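A minimal sketch of how that process member could be expressed when building the request body in PowerShell (the YAML file path is an assumption; the rest of the definition body, such as repository, name, and queue, still has to be supplied):
# Sketch: the "process" member of a YAML build definition body (path is an assumption)
$process = @{
    yamlFilename = "azure-pipelines.yml"   # path to the YAML file in the repository
    type         = 2                       # 2 = YAML process
}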
This PowerShell example clones a build definition that uses YAML:
$pat = '{personal access token}'
$base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes(":$pat"))
$uri = 'https://dev.azure.com/{organization}/{team_project}/_apis/build/definitions/{definition_id}?api-version=5.0'
$result = Invoke-RestMethod -Method Get -Uri $uri -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -ErrorAction Stop
$body = $result | ConvertTo-Json -Depth 7
$existingyaml = '"yamlFilename": "{path to yaml for existing buildef}"'
$newyaml = '"yamlFilename": "{path to new yaml}"'
$buildname = '"name": "{existing build name}"'
$newbuildname = '"name": "{new build name}"'
$body = $body.Replace($existingyaml, $newyaml)
$body = $body.Replace($buildname, $newbuildname)
$Uri = "https://dev.azure.com/{organization}/{team_project}/_apis/build/definitions?api-version=5.0"
$newBuildDef = Invoke-RestMethod -Uri $Uri -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Method Post -Body $body -ContentType "application/json" -ErrorAction Stop
Yes, you are right; you could do a GET on a build using the API and change the variables, and it should work.
If you only need to modify variables, you could use a variable group to store the values; then you can get the variable group and modify the variable values using the Variablegroups API.
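A rough sketch of that approach (the variable group id, the variable name MyVariable, and the api-version are assumptions; check the Variablegroups REST API reference for your server version):
# Sketch: get a variable group, change one value, and push the whole group back (ids and names are assumptions)
$pat = '{personal access token}'
$base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$pat"))
$groupUrl = "https://dev.azure.com/{organization}/{project}/_apis/distributedtask/variablegroups/{group_id}?api-version=6.0-preview.2"
$group = Invoke-RestMethod -Uri $groupUrl -Headers @{Authorization = "Basic $base64AuthInfo"} -Method Get
$group.variables.MyVariable.value = "new value"   # MyVariable is an assumed variable name
$body = $group | ConvertTo-Json -Depth 10
Invoke-RestMethod -Uri $groupUrl -Headers @{Authorization = "Basic $base64AuthInfo"} -Method Put -Body $body -ContentType "application/json"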
In short, the requirement is to verify that our latest released software can be built and then installed after the latest Windows updates and/or other patches have been applied. So the build server VM(s) will be configured just for this purpose, and the build only needs to run after an update.
Since such updates are usually followed by a restart, I am thinking of a server restart event triggering a build and deployment. Does such an option exist in TFS 2017?
If there is no way to do it through TFS then, I guess, a PowerShell script that runs on startup should work?
There is no such built-in function to achieve that. However, a PowerShell script that runs on startup should work. Just as Jessehouwing said, you can create the script with the REST API to trigger builds.
Create a script to trigger the specific build definition (reference the sample below).
Run the script on startup:
How to run a batch file each time the computer boots
How to schedule a Batch File to run automatically in Windows 10/8/7
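For example, one way to wire this up is to register the script as an at-startup scheduled task. A sketch (the task name and script path are assumptions):
# Sketch: register the trigger script as an at-startup scheduled task (name and path are assumptions)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-ExecutionPolicy Bypass -File C:\scripts\QueueBuild.ps1"
$trigger = New-ScheduledTaskTrigger -AtStartup
Register-ScheduledTask -TaskName "QueueBuildAfterUpdate" -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest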
Param(
[string]$collectionurl = "http://server:8080/tfs/DefaultCollection",
[string]$projectName = "ProjectName",
[string]$keepForever = "true",
[string]$BuildDefinitionId = "34",
[string]$user = "username",
[string]$token = "password"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
function CreateJsonBody
{
    $value = @"
{
    "definition": {
        "id": $BuildDefinitionId
    },
    "parameters": "{\"system.debug\":\"true\",\"BuildConfiguration\":\"debug\",\"BuildPlatform\":\"x64\"}"
}
"@
    return $value
}
$json = CreateJsonBody
$uri = "$($collectionurl)/$($projectName)/_apis/build/builds?api-version=2.0"
$result = Invoke-RestMethod -Uri $uri -Method Post -Body $json -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
There is no existing trigger that handles this, but there is a simple REST API to query and trigger builds.
It would be easy to create an on-startup job in the Task Scheduler, use the REST API to query the list of build definitions based on a certain name or tag, and then queue one (see the sketch after the links below).
List build definitions
Queue a build
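Putting those two calls together, a minimal sketch for TFS 2017 (the collection URL, project, credentials, and the definition name filter are assumptions):
# Sketch: find a build definition by name and queue it (server, project, credentials and name are assumptions)
$collectionUrl  = "http://server:8080/tfs/DefaultCollection"
$projectName    = "ProjectName"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f "username", "password")))
$headers        = @{Authorization = ("Basic {0}" -f $base64AuthInfo)}
# List build definitions filtered by name
$definitions  = Invoke-RestMethod -Uri "$collectionUrl/$projectName/_apis/build/definitions?name=MyNightlyBuild&api-version=2.0" -Headers $headers
$definitionId = $definitions.value[0].id
# Queue a build for the first matching definition
$body = @{ definition = @{ id = $definitionId } } | ConvertTo-Json
Invoke-RestMethod -Uri "$collectionUrl/$projectName/_apis/build/builds?api-version=2.0" -Method Post -Body $body -ContentType "application/json" -Headers $headers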