Pass a list of strings as a value in Azure App Service Settings - azure-devops

I have a list of strings in my appsettings.json.
I am using Azure DevOps to deploy my code to Azure.
I am using the Release pipeline task "Azure App Service Settings".
I want to pass my list of strings to the above task parameter.
I have tried the below:
[
{ "name": "test", "value": "["1", "2"]", "slotSetting": false }
]
The release pipeline is giving me the below error
Error: Application Settings object is not a valid JSON.
How do I pass a list of strings in the Azure App Service Settings task?
Thanks in advance

Your JSON string is not valid. Possibly you meant this, with the double-quotes escaped?
[
{ "name": "test", "value": "[\"1\", \"2\"]", "slotSetting": false }
]
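If you would rather generate that escaped string than type it by hand, here is a minimal PowerShell sketch that lets ConvertTo-Json do the escaping. The appSettingsJson variable name and the idea of feeding it to the task's App settings field as $(appSettingsJson) are assumptions, not part of the original answer:

# Hypothetical sketch: build the App Settings JSON in a pipeline script so the
# nested list is escaped correctly, then expose it as a pipeline variable.
$settings = @(
    [ordered]@{
        name        = "test"
        value       = (ConvertTo-Json -InputObject @("1", "2") -Compress)   # -> ["1","2"]
        slotSetting = $false
    }
)
$appSettings = ConvertTo-Json -InputObject $settings -Compress
# $appSettings is now [{"name":"test","value":"[\"1\",\"2\"]","slotSetting":false}]
Write-Host "##vso[task.setvariable variable=appSettingsJson]$appSettings"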

Related

Read DevOps Variable as bool in ARM parameters

I am trying to read in a pipeline variable as a bool. I have the following parameters defined in json:
"parameters": {
"disableFunkyFunction": {
"value": "#{myVariables.disableFunkyFunction}#",
"type": "bool"
}...
}
In my Azure DevOps release pipeline, I have a pipeline variable called myVariables.disableFunkyFunction and the value for it is set to true. However, whenever I try and run the pipeline, it fails on the "Azure resource group deployment" step: Template parameter 'disableFunkyFunction' was provided an invalid value. Expected a value of type 'Boolean', but received a value of type 'String'. I have tried using a value of 1 instead, but to no avail.
The following works, but ideally I want to read the value from the DevOps pipeline variable, not hard code it in the parameters file:
"parameters": {
"disableFunkyFunction": {
"value": true,
"type": "bool"
}...
}
Any suggestions?
From the error message, it seems that the pipeline variable value hasn't been passed to the ARM template parameters file.
Since you have set the parameter value as #{myVariables.disableFunkyFunction}#, you can try the Replace Tokens task from the Replace Tokens extension.
Here are the steps:
Step 1: Keep the parameter value as #{myVariables.disableFunkyFunction}#.
Step 2: Use the Replace Tokens task to substitute the real value (a script-based alternative is sketched below).
Step 3: Deploy the ARM template.
Note: the token replacement must run before the ARM template deployment step.
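If you would rather not install the extension, a plain PowerShell step before the deployment can do the same substitution. The sketch below is only an assumption (the parameters file path is hypothetical); it also removes the surrounding quotes so the ARM engine receives a real Boolean rather than the string "true":

# Hypothetical inline PowerShell task, placed before the ARM template deployment.
# Replaces the quoted token "#{myVariables.disableFunkyFunction}#" with an
# unquoted true/false so the 'bool' parameter type is satisfied.
$paramsPath = "$(System.DefaultWorkingDirectory)/arm/azuredeploy.parameters.json"   # assumed path
$boolValue  = "$(myVariables.disableFunkyFunction)".ToLower()                        # "true" or "false"

(Get-Content $paramsPath -Raw) -replace '"#\{myVariables\.disableFunkyFunction\}#"', $boolValue |
    Set-Content $paramsPath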

Azure Batch and API Job Add: upload file in wd directory with resourceFiles

I'm using the Job Add API to create one Job with one Task in Azure Batch.
This is my test code:
{
    "id": "20211029-1540",
    "priority": 0,
    "poolInfo": {
        "poolId": "pool-test"
    },
    "jobManagerTask": {
        "id": "task2",
        "commandLine": "cmd /c dir",
        "resourceFiles": [
            {
                "storageContainerUrl": "https://linkToMyStorage/MyProject/StartTask.txt"
            }
        ]
    }
}
To execute the API I'm using Postman and to monitor the result I'm using BatchExplorer.
The job and its task are created correctly, but the automatically generated 'wd' folder is empty.
If I understood correctly, I should see the file linked from storage there, right?
Maybe some other parameter is needed in the JSON body?
Thank you!
A task state of completed does not necessarily indicate success. From your JSON body, you most likely have an error:
"resourceFiles": [
{
"storageContainerUrl": "https://linkToMyStorage/MyProject/StartTask.txt"
}
You've specified storageContainerUrl but pointed it at a single file; that property expects a container URL (a single blob is referenced with httpUrl instead). Also ensure you have provided proper permissions (either via SAS or a user-assigned managed identity).
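For illustration, here is a hedged sketch of what the corrected fragment could look like (the SAS token and filePath values are placeholders): a single blob is referenced via httpUrl, while storageContainerUrl is meant for a whole container. It is built in PowerShell so it can be converted to JSON for the Postman body:

# Hypothetical corrected jobManagerTask fragment for the Job Add body.
$jobManagerTask = @{
    id            = "task2"
    commandLine   = "cmd /c dir"
    resourceFiles = @(
        @{
            # httpUrl points at one blob; append a read SAS token unless the
            # blob is publicly readable or a managed identity is configured.
            httpUrl  = "https://linkToMyStorage/MyProject/StartTask.txt?<sas-token>"
            # filePath controls where the file lands inside the task's wd folder.
            filePath = "StartTask.txt"
        }
    )
}
# Convert to JSON before pasting into / sending as the request body.
$jobManagerTask | ConvertTo-Json -Depth 5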

From within a Build/Release pipeline, can we discover its path?

In Azure DevOps, we can organize our Build/Release definitions into high-level folders:
Example: for every pipeline that resides in the Framework folder, I want to conditionally execute a certain task. The pre-defined Build and Release variables provide a plethora of ways to discover information about the underlying file system, but seemingly nothing for this internal path information.
During a pipeline run, is it possible to determine the folder/path that it resides in?
You can check it with the REST API Builds - Get:
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}?api-version=6.0
In the response you get the definition details including the path:
"definition": {
"drafts": [
],
"id": 13,
"name": "TestBuild",
"url": "https://dev.azure.com/xxxxx/7fcdafd5-b891-4fe5-b2fe-xxxxxxx/_apis/build/Definitions/13?revision=1075",
"uri": "vstfs:///Build/Definition/13",
"path": "\\Test Folder",
"type": "build",
"queueStatus": "enabled",
"revision": 1075,
"project": {
"id": "7fcdafd5-b891-4fe5-b2fe-9b9axxxxx",
"name": "Sample",
"url": "https://dev.azure.com/xxxx/_apis/projects/7fcdafd5-b891-4fe5-b2fe-9xxxxxx",
"state": "wellFormed",
"revision": 97,
"visibility": "private",
"lastUpdateTime": "2021-03-22T10:25:39.33Z"
}
},
So:
Add a simple PowerShell script that invokes the REST API (with the $(Build.BuildId) predefined variable) - a sketch follows below
Check the value of the path property
If it contains the Framework folder, set a new variable with this command:
Write-Host "##vso[task.setvariable variable=isFramework;]true"
Now, in the task add a custom condition:
and(succeeded(), eq(variables['isFramework'], 'true'))

Create Azure DevOps pipeline release including all metadata like buildNumber via REST

I'm aware of https://learn.microsoft.com/en-us/rest/api/azure/devops/release/releases/create?view=azure-devops-rest-6.0#uri-parameters and I can create releases via REST, but ...
Problem:
The issue with those releases is that a release triggered via the REST API lacks some predefined variables like Build.BuildNumber - or at least, they are not available in all scopes.
It seems like Build.BuildNumber is available in the pipeline stage, but is missing when the release name format is computed. This means a release name format Release-$(Build.BuildNumber)($(Release.ReleaseId)) will end up with a blank Build.BuildNumber when created using the payload below.
Details:
My json payload
{
    "definitionId": 9,
    "description": "Test Release",
    "artifacts": [
        {
            "alias": "My Build Artifact",
            "instanceReference": {
                "id": "6989",
                "name": null
            }
        }
    ],
    "isDraft": false,
    "reason": "none",
    "manualEnvironments": null
}
artifacts.instanceReference.id references a valid build (which of course has a buildNumber), so during the release stage Build.BuildNumber is properly populated.
I send the payload via
curl -X POST -u username:redactedPAT -H "Content-Type: application/json" -d #payload.json https://vsrm.dev.azure.com/redactedCompany/redactedProjectId/_apis/release/releases\?api-version\=6.0
Question:
What is the right way to create a release as if it had been created via the GUI, including all the metadata?
Am I misusing the API somehow, or do I need to manually set the environment variables for this to work (or even somehow via variables)?
Since the buildNumber is available in the stage, could this even be a bug?
The release name is evaluated at compile time, which means it is populated before the pipeline tasks are executed. You can check out the workarounds below to populate the release name.
1. The $(Build.BuildNumber) variable you defined in the release name format resolves to the name property you pass to the artifacts instanceReference in the request body. So you can pass the build number as the name value under instanceReference in the request body.
{
    ...
    "artifacts": [
        {
            "alias": "My Build Artifact",
            "instanceReference": {
                "id": "6989",
                "name": "<buildNumber>"
            }
        }
    ],
    ...
}
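For illustration, a hedged PowerShell sketch of this first workaround (the PAT, organization, project and build id are the placeholders from the question); it looks up the build's buildNumber via the Builds API and then passes it as the instanceReference name when creating the release:

# Hypothetical sketch: resolve the buildNumber, then create the release with it.
$pat     = "<redactedPAT>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$org     = "redactedCompany"
$project = "redactedProjectId"
$buildId = "6989"

# 1. Get the build so we know its buildNumber.
$build = Invoke-RestMethod -Method Get -Headers $headers `
    -Uri "https://dev.azure.com/$org/$project/_apis/build/builds/${buildId}?api-version=6.0"

# 2. Create the release, passing the buildNumber as instanceReference.name.
$body = @{
    definitionId = 9
    description  = "Test Release"
    artifacts    = @(
        @{
            alias             = "My Build Artifact"
            instanceReference = @{ id = $buildId; name = $build.buildNumber }
        }
    )
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Headers $headers -ContentType "application/json" -Body $body `
    -Uri "https://vsrm.dev.azure.com/$org/$project/_apis/release/releases?api-version=6.0"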
2. Another workaround is to use logging commands to update the release name in a script task.
You can add a script task in your release pipeline stage and run the logging command below to update the release name during the release pipeline execution.
echo "##vso[release.updatereleasename]Release-$(Build.BuildNumber)($(Release.ReleaseId))"
When the task is executed, it will update the release name with your desired format.

Using the Function App connector in ADF - How to override parameters in CI/CD?

I need a workaround pretty quickly - this was a late surprise in the dev process when we added an Azure Function to our development ADF pipeline.
When you use a function app in ADF V2 and generate the ARM template, it does not parameterize the key references, unlike other linked services. Ugh!
So for CI/CD scenarios, when we deploy, we now have a fixed function app reference. What we'd like to do is the same as for other linked services - override the key parameters to point to the correct Dev/UAT/Production environment versions of the functions.
I can think of dirty hacks using PowerShell to overwrite it (does PowerShell support ADF functions yet? I don't know - in January it didn't).
Any other ideas on how to override function app linked service settings?
The key parameters are under typeProperties (assuming the function key is in Key Vault):
"typeProperties": {
    "functionAppUrl": "https://xxx.azurewebsites.net",
    "functionKey": {
        "store": { "referenceName": "xxxKeyVaultLS" },
        "secretName": "xxxKeyName"
    }
}
Right now these are hard-coded from the UI settings - no parameter and no default.
OK, eventually got back to this.
The solution looks like a lot, but it is pretty simple.
In my DevOps release, I create a PowerShell task after both the Data Factory ARM template has been deployed and the PowerShell task for deployment.ps1 with the "predeployment=$false" setting has run (see the ADF CI/CD documentation).
I have a JSON file for each environment (dev/uat/prod) in my git repo (I actually use a separate "common" repo to store scripts apart from the ADF git repo; its alias in DevOps is "_Common" - you'll see this below in the -File parameter of the script).
The JSON file that replaces the deployed function linked service is a copy of the function linked service JSON in ADF and looks like this for DEV:
(scripts/Powershell/dev.json)
{
    "name": "FuncLinkedServiceName",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "annotations": [],
        "type": "AzureFunction",
        "typeProperties": {
            "functionAppUrl": "https://myDEVfunction.azurewebsites.net",
            "functionKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "MyKeyvault_LS",
                    "type": "LinkedServiceReference"
                },
                "secretName": "MyFunctionKeyInKeyvault"
            }
        },
        "connectVia": {
            "referenceName": "MyintegrationRuntime",
            "type": "IntegrationRuntimeReference"
        }
    }
}
...and the PROD file would be like this:
(scripts/Powershell/prod.json)
{
    "name": "FuncLinkedServiceName",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "annotations": [],
        "type": "AzureFunction",
        "typeProperties": {
            "functionAppUrl": "https://myPRODfunction.azurewebsites.net",
            "functionKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "MyKeyvault_LS",
                    "type": "LinkedServiceReference"
                },
                "secretName": "MyFunctionKeyInKeyvault"
            }
        },
        "connectVia": {
            "referenceName": "MyintegrationRuntime",
            "type": "IntegrationRuntimeReference"
        }
    }
}
Then in the DevOps pipeline, I use a PowerShell script block that looks like this:
Set-AzureRMDataFactoryV2LinkedService -ResourceGroup "$(varRGName)" -DataFactoryName "$(varAdfName)" -Name "$(varFuncLinkedServiceName)" -File "$(System.DefaultWorkingDirectory)/_Common/Scripts/Powershell/$(varEnvironment).json" -Force
or for Az
Set-AzDataFactoryV2LinkedService -ResourceGroupName "$(varRGName)" -DataFactoryName "$(varAdfName)" -Name "$(varFuncLinkedServiceName)" -DefinitionFile "$(System.DefaultWorkingDirectory)/_Common/Scripts/Powershell/Converter/$(varEnvironment).json" -Force
Note:
The $(varXxx) values are defined in my pipeline variables, e.g.
varFuncLinkedServiceName = FuncLinkedServiceName
varEnvironment = "DEV", "UAT", "PROD" depending on the target release
-Force is used because the linked service must already exist from the Data Factory ARM deployment, and we need to force the overwrite of just the function linked service.
Hopefully MSFT will release a function app linked service that uses parameters, but until then this has got us moving with the release pipeline.
HTH. Mark.
Update: added the Az cmdlet version of the AzureRM command and changed to Set- ("New-Az..." worked, but in the new Az module there is only Set- for V2 linked services).