I need to temporarily disable the test run in the Azure pipeline without deleting it from the pipeline completely. How can I do it?
Maybe something like this (right-click the task and choose Disable)?
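If it is a YAML pipeline rather than the classic editor, a minimal sketch (assuming the tests run in a VSTest step; any step type works the same way) is to keep the step in the definition but switch it off with enabled: false:

steps:
- task: VSTest@2            # hypothetical test step; keep your existing inputs
  displayName: 'Run tests'
  enabled: false            # the step stays in the pipeline but is skipped
  # alternatively, condition: false has the same effect

Re-enable it later by setting enabled: true or removing the line.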
I'm trying to update a value in the appsettings.Development.json file while releasing the application using an Azure DevOps release pipeline.
appSettings.Development.json
{
  "Networks": {
    "EnableNetwork": {
      "SomeNetwork": {
        "SomeValue": "Old Value"
      }
    }
  }
}
I have configured the File Transform task as follows:
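Roughly, that corresponds to this YAML (a sketch with assumed default paths; the actual configuration is done in the classic editor):

- task: FileTransform@1
  displayName: 'File Transform: appsettings.Development.json'
  inputs:
    folderPath: '$(System.DefaultWorkingDirectory)/**/*.zip'   # package or folder to transform
    fileType: 'json'
    targetFiles: '**/appsettings.Development.json'
# For JSON variable substitution, the pipeline variable is named after the nested key path, e.g.:
#   Networks.EnableNetwork.SomeNetwork.SomeValue = 'New Value'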
And set the variable with the scope "Release":
The File Transform task runs successfully.
However, it doesn't change the value after deployment. I tried different changes as shown in SO answers such as "How to change Appsettings and Config info in Release Pipeline", but I'm not sure where the issue is.
The way you are using the File Transform task should be correct; the value in appsettings.Development.json can be updated successfully this way.
You can check the log of the File Transform task to confirm this.
For example:
it doesn't change the value after deployment.
From your screenshot, the cause of this issue could be that you are using a Microsoft-hosted agent to deploy the IIS website. That deploys the website to the hosted agent instead of your local machine.
According to your screenshot of the release pipeline definition, Stage 3 should run on a Deployment Group. In that case, it will deploy the package to the local machine.
You need to check the release pipeline definition to make sure the stage is running on the Deployment Group. Then you will see the changes after deployment.
I have an Azure DevOps pipeline which is failing to run because it seems to be using an old connection string.
The pipeline is for a C# project, where a FileTransform task updates an appsettings.json file with variables set on the pipeline.
The variables were recently updated to use a new connection string; however, when printing it with Console.WriteLine before use and viewing the output in the pipeline, it shows an outdated value.
Many updates similar to this have been run in the past without issue.
I've also recently added a PowerShell task to echo the values of the variables loaded while the pipeline is running, and it does display the new value.
I've checked the order of precedence of variables and there shouldn't be any other variables being used.
There is no CacheTask being used in this pipeline.
Does anyone have any advice to remedy this? It seems that the pipeline itself is just ignoring the variables set on the pipeline.
There is a problem with the recent File Transform task version, v1.208.0.
It shows a warning message and does not update the variable value correctly.
Warning example:
Resource file haven't been set, can't find loc string for key: JSONvariableSubstitution
Refer to this ticket: File transform task failing to transform files, emitting "Resource file haven't been set" warnings
The issue comes from the task itself rather than from the pipeline configuration; many users have the same issue.
Workaround:
You can switch to version 2 of the File Transform task to update the appsettings.json file.
Here is an example: remove the content of the XML Transformation rules field and set the JSON file path.
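A rough YAML equivalent of that setup (the folder and file patterns are assumptions; adjust them to your package layout):

- task: FileTransform@2
  displayName: 'File Transform v2: JSON variable substitution'
  inputs:
    folderPath: '$(System.DefaultWorkingDirectory)/**/*.zip'   # package or folder containing the file
    xmlTransformationRules: ''                                 # leave the XML Transformation rules empty
    jsonTargetFiles: '**/appsettings.json'                     # JSON file(s) to substitute variables into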
I want to create a task group where Azure Resource Manager Connection is filled with a parameter:
However, this is not possible to do in the portal, as validation forces you to fill it with a working value. So I tried to export the task group as JSON, modify it, and import it again, but then I got this message when saving the release pipeline:
Is there a way to overcome this? I understand that this is a security check (which, by the way, doesn't work in YAML pipelines, because there you can use an Azure Resource Manager connection even if you are not allowed to). However, this way it limits the task group to a single connection.
EDIT:
Kevin, thank you for your answer. I tried it, but it didn't work for me.
So I have the connection rg-the-code-manual:
I created a variable with it:
But when I tried to use it, I got a validation error:
Based on my test, when I set the variable as the Azure Resource Manager Connection name, I could reproduce the same issue.
For example:
To solve this issue, you need to set the variable value in the release pipeline.
Then you could save the release pipeline successfully.
On the other hand, you could also set a default value for the variable in the task group.
In this case, the task group will use the default value in the release pipeline. The parameter will also still exist on the task group task, so you can select the value directly from the drop-down list.
Note: you need to make sure that the Service connection name is valid.
The release variables below are not getting resolved at run time when used in a custom server task on a release pipeline.
System.JobName
System.JobDisplayName
System.StageDisplayName
System.DefinitionName
I am able to fetch the job ID by using $(system.JobId), but not the other job details (such as the job name) listed above.
Am I missing anything here?
Those are YAML pipeline variables; they will not work in a classic release pipeline.
Use the release-specific equivalents (for example, Release.DefinitionName for the release pipeline name, Release.EnvironmentName for the stage name, and Agent.JobName for the job name): https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azure-devops&tabs=batch#default-variables
Let's say I have these 3 Stages: Dev, QC, Prod.
My requirements are:
Only artifacts from specific branches (release/*) can be deployed to QC/Prod
Artifacts from all branches can be deployed to Dev
I can achieve what I want using artifact filters for "After stage" triggered releases, but I need this for "Manual only" deployments.
Is there a workaround that will let me control/filter which artifacts are available for deployment for specific stages/environments?
Basically, I need the Azure DevOps equivalent of Octopus Channels.
Update
I think I'm close to a solution.
In the "Pre-deployment conditions", I can add a new Deployment Gate which makes a Rest API call.
e.g URL suffix=/Release/releases/76
Now, I just need to correctly parse the ApiResponse because the below Success criteria doesn't work
eq(root['artifacts[0].definitionReference.branch.id'], 'refs/heads/master')
Evaluation of expression 'eq(root['artifacts[0].definitionReference.branch.id'], 'refs/heads/master')' failed.
As you said, you can do this using Deployment gates on your stages.
1. Create a new Generic service connection from Project Settings -> Pipelines -> Service Connections. For the service URL, use something like https://vsrm.dev.azure.com/{OrgName}/{ProjectName}/_apis
2. On your stage, open the Pre-Deployment Conditions.
3. Enable the Gates option.
4. Add a new Invoke REST API gate and set the Delay before evaluation to 0 minutes.
4.1 Set the connection type to Generic.
4.2 Select the service connection you created in step 1.
4.3 Set the method to GET.
4.4 Set the URL suffix to /Release/releases/$(Release.ReleaseId)
4.5 In the Advanced area, set the Completion Event to ApiResponse.
4.6 In the Advanced area, set the Success criteria to (or use startsWith):
eq(root['artifacts'][0]['definitionReference']['branch']['id'],'refs/heads/master')
Now, if you try to deploy an artifact not from the master branch, the deployment will fail
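For reference, the part of the GET /Release/releases/{id} response that this criteria reads looks roughly like this (trimmed; the shape is inferred from the expression above):

artifacts:
- definitionReference:
    branch:
      id: 'refs/heads/master'   # value compared by the gate's success criteria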
There is a workaround:
In the QC/Prod stages, add a custom condition so that the job is executed only when the artifact's source branch is release/*:
startsWith(variables['Release.Artifacts.{Artifacts-Alias}.SourceBranch'], 'refs/heads/release')
Now, when you manually run the QC/Prod stages and the artifacts did not come from a release/* branch, the job will not be executed:
This works:
and(contains(variables['build.sourceBranch'], 'refs/heads/release'), succeeded())