I'm having issues when trying to display all runs for one of my pipelines. I'm running on-premises Azure DevOps Server 2020, and I get a perpetual spinning loading dialog. I don't have this issue in any of my other pipelines. If I filter by run state in the problematic pipeline, I am able to get past this and view some of the runs, but for other states, in my case "Succeeded" and "Succeeded with issues", I continue to get the spinning loading symbol. Any advice?
Here are some suggestions:
Suggestion 1
Clear the cache or load the page in a different browser, and restart the Azure DevOps Server and the SQL Server machine.
Suggestion 2
Create a new pipeline that has the same settings as the affected pipeline.
Suggestion 3
You can use the REST API Builds - List to get all runs of your pipeline.
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds?api-version=6.0
The cause of this issue may be that the runs list is too long. You can use the REST API Builds - Delete to delete some of the runs you don't need and see whether that solves the issue.
DELETE https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}?api-version=6.0
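As a rough illustration (not an exact recipe), here is a minimal PowerShell sketch that lists the runs of one pipeline and deletes a single run; it assumes an on-premises collection URL, a PAT with Build (read & execute) scope, and placeholder definition/build IDs that you fill in yourself:
# Minimal sketch; all placeholders below are assumptions to replace.
$collectionUrl = "https://your-server/DefaultCollection"   # on-prem base URL instead of https://dev.azure.com/{organization}
$project = "YourProject"
$pat = "YOUR_PAT"
$definitionId = 42    # hypothetical pipeline (build definition) id
$headers = @{ Authorization = "Basic " + [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$pat")) }
# Builds - List, filtered to one definition
$runs = Invoke-RestMethod -Uri "$collectionUrl/$project/_apis/build/builds?definitions=$definitionId&api-version=6.0" -Headers $headers -Method Get
$runs.value | Select-Object id, buildNumber, status, result
# Builds - Delete for a single run you no longer need
$buildId = 123        # hypothetical run id
Invoke-RestMethod -Uri "$collectionUrl/$project/_apis/build/builds/${buildId}?api-version=6.0" -Headers $headers -Method Delete
Deleting runs this way is only a mitigation; take care not to remove builds you still need for retention or auditing.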
Suggestion 4
You can refer to a similar question on the Developer Community. Jan Selbach's comment offers a solution.
1. Run the following SQL script to find the LeaseIds of duplicate rows. Please change the collection database name according to yours.
SELECT LeaseId
FROM [AzureDevOps_DefaultCollection].[Build].[tbl_RetentionLease]
WHERE
    PartitionId > 0 AND
    LeaseId NOT IN (
        SELECT MIN(LeaseId) AS LeaseId
        FROM [AzureDevOps_DefaultCollection].[Build].[tbl_RetentionLease]
        WHERE PartitionId > 0
        GROUP BY OwnerId, RunId
    )
2. Concatenate the LeaseIds into a comma-separated list (if you prefer to script this step, see the sketch after the PowerShell script below).
3. Run the following PowerShell script, filling in the org, project, pat, and leaseId by passing in the comma-separated list of LeaseIds (remember, if on-premises, change https://dev.azure.com in the $uri to your on-premises server):
$org = "";
$project = "";
$pat = "";
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes([string]::Format("{0}:{1}", "", $pat)));
$accessToken = "Basic $encoded";
$leaseId = "";
$uri = "https://dev.azure.com/$org/$project/_apis/build/retention/leases?ids=$leaseId&api-version=6.0-preview";
try {
    $response = Invoke-RestMethod -Uri $uri -Method DELETE -Headers @{ Authorization = $accessToken } -ContentType "application/json"
    $response
}
catch {
    $errorDetails = ConvertFrom-Json $_.ErrorDetails
    Write-Host "StatusCode: $($_.Exception.Response.StatusCode)`nExceptionType: $($errorDetails.typeKey)`nExceptionMessage: $($errorDetails.message)"
    # StackTrace: $($errorDetails.stackTrace)
}
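If you prefer to script step 2 rather than concatenating by hand, here is a minimal PowerShell sketch; it assumes the SqlServer module's Invoke-Sqlcmd cmdlet and direct access to the collection database, and it simply reuses the query from step 1:
# Minimal sketch for step 2: run the step-1 query and join the LeaseIds.
# Assumes Invoke-Sqlcmd (SqlServer module) and access to the collection database.
$query = @"
SELECT LeaseId
FROM [AzureDevOps_DefaultCollection].[Build].[tbl_RetentionLease]
WHERE PartitionId > 0 AND
    LeaseId NOT IN (
        SELECT MIN(LeaseId)
        FROM [AzureDevOps_DefaultCollection].[Build].[tbl_RetentionLease]
        WHERE PartitionId > 0
        GROUP BY OwnerId, RunId
    )
"@
$rows = Invoke-Sqlcmd -ServerInstance "YOUR_SQL_SERVER" -Query $query
$leaseId = ($rows | ForEach-Object { $_.LeaseId }) -join ","
$leaseId   # use this value for the $leaseId variable in the deletion script above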
I am working on creating a QnA Maker based chatbot, and for this I have prepared an ARM template for creating and deploying the resources.
A PowerShell script is used to create the knowledge base once all the resources are created.
This process works just fine when I execute the PowerShell script from the local PowerShell tool.
Now I want to set up CI/CD for this deployment process so that deployment can be automated.
Creating resources and deploying to Azure through a pipeline is quite possible with the Azure ARM template deployment task, but I am not sure how to execute the PowerShell script that is responsible for creating the knowledge base on the newly created QnA Maker service once the resources exist.
Any help greatly appreciated.
You can refer to Create a CI/CD pipeline for .NET with the Azure DevOps Project for a step-by-step lab on creating the pipelines.
If you need help with ARM templates (if you are stuck somewhere), you can check out A quick guide on writing your Azure Resource Manager (ARM) Templates to get started quickly.
To clarify: your issue is creating and training the QnA Maker KB after the actual resources have been created by ARM? There are probably multiple ways to do this, but here is what I have done. First of all, I am using the Azure CLI task version 2.*, which simplifies some of the initial steps. You could probably use regular PowerShell for some of the subsequent steps. I have selected PowerShell as the script type and inline scripts for all of them. I do each of these segments as a separate task, but you could probably do them all as one; I like being able to see which step fails if something happens, though.
First, you need to get the key for your cognitive service:
$QNAKEY= & az cognitiveservices account keys list -g "YOUR_RESOURCE_GROUP" --name "YOUR_QNA_RESOURCE_NAME" --query key1 -o tsv
Write-Output("##vso[task.setvariable variable=QNA_KEY;]$QNAKEY")
Next you need to create the KB. I have mine seeded with information via a json file in my project repository. You can do this from wherever, or just create a blank KB (I think). Note I'm checking here to see if the KB exists; KB name does not have to be unique, so if you don't do this you will end up creating a lot of duplicate KBs.
$header = @{
    "Content-Type"="application/json"
    "Ocp-Apim-Subscription-Key"="$(QNA_KEY)"
}
$res = Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases" -Method 'Get' -Headers $header
$kb = $res.knowledgebases | Where-Object name -eq "YOUR_KB_NAME"
Write-Host $res.knowledgebases
Write-Host $kb
if (!$kb) {
    Write-Host "KB does not exist, so creating new KB"
    $body = Get-Content '$(System.DefaultWorkingDirectory)/PATH_TO_MY_MODEL.json' | Out-String
    Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases/create" -Method 'Post' -Body $body -Headers $header
}
Finally, you will likely want to publish your KB. With LUIS I think you need to train it first (via a separate CLI task), but with QnA Maker you should be able to publish directly. I do this as a dependent stage to ensure the KB is created before I try to publish.
$QNAKEY= & az cognitiveservices account keys list -g "YOUR_RESOURCE_GROUP" --name "YOUR_QNA_RESOURCE_NAME" --query key1 -o tsv
$header = @{
    "Content-Type"="application/json"
    "Ocp-Apim-Subscription-Key"="$QNAKEY"
}
$res = Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases" -Method 'Get' -Headers $header
$kb = $res.knowledgebases | Where-Object name -eq "YOUR_KB_NAME"
$qnaId = $kb.id
Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases/$qnaId" -Method 'Post' -Headers $header
And that's it! At this point your QnA Maker knowledgebase should be created, published, and ready to go.
We have an Azure DevOps pipeline which uses self-hosted Windows agents with Azure DevOps Server 2019. The pipeline runs our front-end tests without any problems. However, occasionally our linting step finds problems that it throws as warnings (such as unused variables). This is what we want it to do, but the issue is that these warnings were not being surfaced, so the only way to see them was to look through the build execution logs.
We were able to resolve this by adding the vso formatter to the linting command: npm run nx run-many -- --target="lint" --all --skip-nx-cache=true --parallel --format=vso. So now the warnings are thrown like this:
As shown in the green box, the warnings are displayed properly. However, in the red circles the status of the build, job, and linting task is success. Is there a way I can mark this build, job, and task as a warning so we know to take a further look? Thank you for any help; please let me know if I can provide additional information.
You can add a PowerShell task at the end of the pipeline and then call the REST API (Timeline - Get) to traverse the warning messages from the previous tasks. Finally, you can use a logging command to set the pipeline status.
Here is a PowerShell sample:
$token = "PAT"
$url = "https://{instance}/{collection}/{project}/_apis/build/builds/$(build.buildid)/timeline?api-version=5.0"
$token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($token)"))
$count = 0
$response = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Basic $token"} -Method Get -ContentType application/json
ForEach ($issues in $response.records.issues)
{
    if ($issues.type -eq "warning")
    {
        echo $issues.Message
        $count++
    }
}
echo $count
if ($count -ne 0)
{
    Write-Host "##vso[task.complete result=SucceededWithIssues;]"
}
There is an Azure DevOps extension/task created by Microsoft, "Build Quality Checks".
You can use it to set up an additional option to enforce a specific build quality.
I'm trying to get the status of agents in a deployment pool at release time.
The use case is that I have 2 servers with a shared disk, and I want the release to run on one server only. I have two deployment groups that run based on a custom condition:
eq(variables['DeployGroupSelector'], '1')
A job that runs prior to those determines the value of the DeployGroupSelector variable, essentially a case statement.
In the job that sets the var, I'm trying to reach out to the Azure DevOps REST API:
$headers = @{
    Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
}
$url = "https://dev.azure.com/$($organization)/_apis/distributedtask/pools/$($poolId)/agents?api-version=5.1"
$response = Invoke-RestMethod $url -Headers $headers -Verbose
write-host "Output: $response"
$status = ($response.value | where {$_.name -eq $($env:primaryPoolName)}).status
if($status -eq "online")
{
Write-Output("##vso[task.setvariable variable=DeployGroupSelector;]1")
}
else
{
Write-Output("##vso[task.setvariable variable=DeployGroupSelector;]2")
}
For the group containing the script above, the "Allow scripts access to the OAuth token" box is checked.
When I run this PowerShell locally using a PAT, it returns data. When I run the release in ADO, it hits the service but returns an empty data set:
2019-10-07T14:16:18.8942915Z VERBOSE: GET https://dev.azure.com/xxxxxx/_apis/distributedtask/pools/13/agents?api-version=5.1 with 0-byte payload
2019-10-07T14:16:19.3235204Z VERBOSE: received 22-byte response of content type application/json
2019-10-07T14:16:19.9626359Z VERBOSE: Content encoding: utf-8
2019-10-07T14:16:19.9835101Z Output: #{count=0; value=System.Object[]}
I have tried giving the "Project Collection Build Service Accounts" group read access to pools and groups; I even tried bumping it up to User. I tried adding the build service accounts group to Release Administrators. I even tried using the old URL format, just in case.
Added picture of data returned from powershell:
UPDATE: Just to further rule out an issue with how I was using the token, I added a second PowerShell task to the task group in question. This script hits a different segment of the AzDO REST API (below) and successfully gets a response. So the OAuth token is working; it just doesn't seem to have access to the entire API.
$headers = @{
    Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
}
$url = "https://dev.azure.com/$($organization)/$($project)/_apis/git/repositories?api-version=5.1"
$response = Invoke-RestMethod $url -Headers $headers -Verbose
write-host "Output: $($response)"
Response: Output: #{value=System.Object[]; count=10}
Since you're using the System_AccessToken variable, did you also enable the "Allow scripts to access OAuth token" checkbox in the agent job? https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=classic#systemaccesstoken This link shows where it is in build but you'll find it at the bottom of the Agent Job panel in release. If that's not checked, that may be why you're getting an empty response.
I had exactly the same issue. Two options were considered, the same ones you have been trying:
System.AccessToken
PAT
The problem was solved by putting the PAT into a Key Vault and using it as a basic auth token for the REST API call in the pipeline.
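As a rough sketch of that approach, assuming the PAT is exposed to the job as a secret pipeline variable (here called PatFromKeyVault, e.g. linked via an Azure Key Vault task or a variable group):
# Minimal sketch: build a Basic auth header from a PAT held in Key Vault.
# The variable name PatFromKeyVault is an assumption; adjust to your setup.
$pat = "$(PatFromKeyVault)"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$pat"))
$headers = @{ Authorization = "Basic $encoded" }
$url = "https://dev.azure.com/$($organization)/_apis/distributedtask/pools/$($poolId)/agents?api-version=5.1"
$response = Invoke-RestMethod $url -Headers $headers -Verbose
write-host "Output: $response"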
My suggestion is that this seems to be expected and correct behavior. Why do I think so? From the Azure DevOps point of view there are two levels in our case: the organization level and the project level. You can notice the difference in the URIs that were used:
$url = "https://dev.azure.com/$($organization)/_apis/distributedtask/pools/$($poolId)/agents?api-version=5.1"
$url = "https://dev.azure.com/$($organization)/$($project)/_apis/git/repositories?api-version=5.1"
From a security point of view it is bad practice to let entities from a lower layer (the project, in our case) access and manipulate a higher layer (the organization, in our case).
In conclusion, I would say that the system access token and a PAT have slightly different natures in essence: one is issued for the agent job, and the other for one's personal profile.
I am working for a customer who is running a Dynamics AX project and is using TFS. I do have access to TFS and am able to check all the different work items. However, what I am missing is an overview from which to build metrics, as the only way to get data into a table (Excel) does not allow me to get the history of a work item.
I am hence wondering how I could do this using PowerShell. I am completely new to this, so step-by-step guidance would be highly appreciated.
You can use the TFS REST API.
For example:
$serverUrl = "http://tfsServer:8080/tfs/Collection"
$workItemId = "1"
#Get the Work Item
$workItem = Invoke-RestMethod -Uri "$($serverUrl)/_apis/wit/workitems/$($workItemId)?api-version=3.0" -UseDefaultCredentials -Method Get
#Print the revisions number
Write-Host $workItem.rev
#Get the specific revision details
$revision = Invoke-RestMethod -Uri "$($serverUrl)/_apis/wit/workitems/$($workItemId)/revisions/2?api-version=3.0" -UseDefaultCredentials -Method Get
#Print the Work Item details in the specific revision
Write-Host $revision.fields
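To build a history overview, you can also loop over all revisions of a work item in one go; a minimal sketch (the fields printed here, System.State and System.ChangedDate, are just examples):
#Get all revisions of the Work Item and print a couple of example fields
$revisions = Invoke-RestMethod -Uri "$($serverUrl)/_apis/wit/workitems/$($workItemId)/revisions?api-version=3.0" -UseDefaultCredentials -Method Get
foreach ($rev in $revisions.value) {
    Write-Host ("Rev {0}: State={1}, Changed={2}" -f $rev.rev, $rev.fields.'System.State', $rev.fields.'System.ChangedDate')
}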
TFS 2012 with one 2010 build controller with one 2010 build agent. We also have one 2012 build controller with multiple 2012 build agents.
We have multiple builds for multiple versions of our software. The builds are named according to a convention e.g. Foo_version_1_0 and Foo_version_2_0.
When I run this code on my local machine, all the builds are queued. When I run this code manually on a 2012 build agent, the builds are queued. When I run this code manually on a 2010 build agent, no builds are queued. When the code is executed as part of a triggered build in TFS (either on the 2010 or 2012 controller/agent), it doesn't queue any builds and errors out with my custom exception saying no definitions returned from TFS.
My questions:
Is the $buildServer.QueryBuildDefinitions() function an administrator-only function? I.e., if a non-admin user account (like TFSService) runs it, will it be unable to get the data from the TFS API?
Is the $buildServer.QueryBuildDefinitions() a new function that is only available in 2012?
Is there another way of doing this that will work? Previously, we had all our build names hard coded - this is not a viable way forward for us.
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client")
$serverName = "http://tfs:8080/tfs"
$tfs = [Microsoft.TeamFoundation.Client.TeamFoundationServerFactory]::GetServer($serverName)
$buildServer = $tfs.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
$buildServer.QueryBuildDefinitions("FooProject") | foreach {
    if ($_.Name.EndsWith("version_1_0"))
    {
        echo "Queueing build: $($_.Name)"
        $buildServer.QueueBuild($buildServer.GetBuildDefinition("FooProject", $_.Name))
    }
}
Edit: removed $buildDefinitions = $buildServer.QueryBuildDefinitions("FooProject").Name, replaced it with $buildServer.QueryBuildDefinitions("FooProject") | foreach...
Builds are now queued programmatically.
The API hasn't changed, and I suppose that both agents are using the same account.
The line
$buildDefinitions = $buildServer.QueryBuildDefinitions("FooProject").Name
seems wrong: getting the Name property will raise an exception for an empty result.
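A defensive variant of that lookup, as a rough sketch (just illustrating the point, not the fix the question settled on):
# Sketch: guard against an empty result before touching .Name
$definitions = $buildServer.QueryBuildDefinitions("FooProject")
if (-not $definitions -or $definitions.Count -eq 0) {
    throw "No build definitions returned from TFS for project FooProject"
}
$buildDefinitionNames = $definitions | ForEach-Object { $_.Name }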
You can also utilize one of the built-in REST APIs, to avoid having to download DLL binaries.
The following will work for TFS 2017:
https://github.com/sameer-kumar/adhoc-posh/blob/master/QueueTfsBuild.ps1
$rootTfsUri = "http://myTFS:8080/tfs"
$collectionName = "Default"
$projectName = "Project1"
$tfsUri = $rootTfsUri + "/" + $collectionName + "/" + $projectName
$buildDefinition = "DevCI-vnext"
$buildDefinitionUri = "$tfsUri/_apis/build/definitions?api-version=3.1&name=$buildDefinition"
# first get build definition id
$buildResponse = Invoke-WebRequest -Uri $buildDefinitionUri -UseDefaultCredentials -Method Get -Verbose -UseBasicParsing -ContentType "application/json"
$buildResponseAsJson = $buildResponse.Content | convertfrom-json
$buildDefinitionId = $buildResponseAsJson.value.id
# Now queue this build definition
$requestContentString = @"
{
    "definition": {
        "id" : "$buildDefinitionId"
    }
}
"@
$buildUri = "$tfsUri/_apis/build/builds?api-version=3.1"
$buildResponse = Invoke-WebRequest -Uri $buildUri -UseDefaultCredentials -Method Post -Verbose -UseBasicParsing -ContentType "application/json" -Body $requestContentString
$buildNumber = ($buildResponse.Content | ConvertFrom-Json).buildNumber
TFS 2015 uses a slightly different structure, where the build definition URI is replaced with this:
$buildDefinitionUri = "$tfsUri/_apis/Build/builds?api-version=2.0&name=$buildDefinition"