Need to report on data from Retrospectives - Azure DevOps

We need an automated way (either a REST API or an SDK) to access the data stored by the Retrospectives Azure DevOps extension. Currently there is an option to export CSV, but the process is manual and limited to each retrospective. Any ideas/thoughts?

You can try the following steps:
Run the API to get the information about the project teams in a project.
Request URL
POST https://dev.azure.com/{organization_Name}/_apis/Contribution/HierarchyQuery?api-version=5.0-preview.1
Request Body
{
  "contributionIds": ["ms.vss-admin-web.org-admin-groups-data-provider"],
  "dataProviderContext": {
    "properties": {
      "teamsFlag": true,
      "sourcePage": {
        "url": "https://dev.azure.com/{organization_Name}/{project_Name}/_settings/teams",
        "routeId": "ms.vss-admin-web.project-admin-hub-route",
        "routeValues": {
          "project": "{project_Name}",
          "adminPivot": "teams",
          "controller": "ContributedPage",
          "action": "Execute",
          "serviceHost": "{organization_Id} ({organization_Name})"
        }
      }
    }
  }
}
Run the API to list the retrospectives for a specified project team in the project.
GET https://extmgmt.dev.azure.com/{organization_Name}/_apis/ExtensionManagement/InstalledExtensions/ms-devlabs/team-retrospectives/Data/Scopes/Default/Current/Collections/{projectTeam_identityId}/Documents?api-version=3.1-preview.1
Run the API to get more details about a specified retrospective.
GET https://extmgmt.dev.azure.com/{organization_Name}/_apis/ExtensionManagement/InstalledExtensions/ms-devlabs/team-retrospectives/Data/Scopes/Default/Current/Collections/{retrospective_Id}?api-version=3.1-preview.1
However, there is no available interface (API or CLI) to export the CSV content.
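If you want this in an automated report, here is a minimal sketch that chains the calls above and writes the CSV yourself. Assumptions, not facts from the extension's docs: Python with the requests package, a PAT that has extension-data and project read scopes, placeholder ORG/PROJECT/PAT values, and the documented Core "Get Teams" API substituted for the HierarchyQuery call above purely to keep the sketch short.

import base64, csv, json, requests

ORG, PROJECT, PAT = "my-org", "my-project", "my-pat"  # placeholders, not real values
AUTH = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}

# 1. List the project teams (documented Core API, used instead of the HierarchyQuery call).
teams = requests.get(
    f"https://dev.azure.com/{ORG}/_apis/projects/{PROJECT}/teams?api-version=6.0",
    headers=AUTH).json()["value"]

rows = []
for team in teams:
    # 2. List the retrospective documents the extension stores for this team.
    docs_url = (f"https://extmgmt.dev.azure.com/{ORG}/_apis/ExtensionManagement/"
                f"InstalledExtensions/ms-devlabs/team-retrospectives/Data/Scopes/"
                f"Default/Current/Collections/{team['id']}/Documents?api-version=3.1-preview.1")
    resp = requests.get(docs_url, headers=AUTH)
    if resp.status_code != 200:
        continue  # team may have no retrospective collection yet
    for doc in resp.json().get("value", []):
        rows.append({"team": team["name"], "document": json.dumps(doc)})

# 3. Write the CSV ourselves, since the extension exposes no export API.
with open("retrospectives.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["team", "document"])
    writer.writeheader()
    writer.writerows(rows)

The document fields are dumped as raw JSON here because the extension's document schema is not documented; adjust the columns once you see the actual payload.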

Related

How to add/update approvers for environments through REST API on Azure DevOps

I am trying to create hundreds of environments on Azure DevOps using a PowerShell script and the REST API. I need to include approvers on some of them.
I can already create those environments; I just need to add/update the approvers for them.
How can I do that? I can't seem to find anything in the documentation.
To add/update the approvers for environments, you can use the following REST API:
Add approvers for environment:
REST API:
POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/checks/configurations?api-version=7.1-preview.1
Request Body:
{
  "type": {
    "id": "8C6F20A7-A545-4486-9777-F762FAFE0D4D",
    "name": "Approval"
  },
  "settings": {
    "approvers": [{"displayName": "{UserName}", "id": "{UserID}"}],
    "executionOrder": 1,
    "blockedApprovers": [],
    "minRequiredApprovers": 0,
    "requesterCannotBeApprover": false
  },
  "resource": {
    "type": "environment",
    "id": "{EnvironmentID}",
    "name": "{EnvironmentName}"
  }
}
Update approvers for environment:
REST API:
PATCH https://dev.azure.com/{organization}/{project}/_apis/pipelines/checks/configurations/{Configurationid}?api-version=7.1-preview.1
Request Body:
{
  "type": {
    "id": "8C6F20A7-A545-4486-9777-F762FAFE0D4D",
    "name": "Approval"
  },
  "settings": {
    "approvers": [
      {
        "displayName": "{Username}",
        "id": "{UserID}"
      },
      {
        "displayName": "{Username}",
        "id": "{UserID}"
      }
    ],
    "executionOrder": 1,
    "blockedApprovers": [],
    "minRequiredApprovers": 0,
    "requesterCannotBeApprover": false
  },
  "resource": {
    "type": "environment",
    "id": "{EnvironmentID}",
    "name": "{EnvironmentName}"
  }
}
For the configuration ID, you can get it from the following REST API: Check Configurations - List.
Since there is no official sample for this requirement, you can check the network trace for more detailed info.
For example: manually add/update the approvers and check the Network tab in the browser developer tools.
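To drive this from a script, here is a minimal sketch of the add call above. Assumptions: Python with requests, a PAT authorized to manage the environment's checks, and placeholder organization, project, user and environment values.

import base64, requests

ORG, PROJECT, PAT = "my-org", "my-project", "my-pat"  # placeholders
AUTH = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}

body = {
    "type": {"id": "8C6F20A7-A545-4486-9777-F762FAFE0D4D", "name": "Approval"},
    "settings": {
        "approvers": [{"displayName": "{UserName}", "id": "{UserID}"}],  # replace with real values
        "executionOrder": 1,
        "blockedApprovers": [],
        "minRequiredApprovers": 0,
        "requesterCannotBeApprover": False,
    },
    "resource": {"type": "environment", "id": "{EnvironmentID}", "name": "{EnvironmentName}"},
}

# POST creates the Approval check; send the same body (with the updated approver list)
# as a PATCH to .../configurations/{configurationId} to update it, as shown above.
resp = requests.post(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/checks/configurations"
    "?api-version=7.1-preview.1",
    headers=AUTH, json=body)
resp.raise_for_status()
print("Created check configuration", resp.json().get("id"))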
Have a look at the Approvals and Checks API.

Azure DevOps API to access Multi Repo Sources of a Build

I have a YAML Build Definition that uses multiple repos.
When looking at the Build Results, there is a "Sources" Card that lists the Repos and the commit ID that was used for this particular build.
I tried looking through the API documentation for this info, but I can't seem to find it easily. I took a look at the Get Build call, but that only gives the main repo used.
Does anyone know the API well enough to point me to the call that returns the set of repos/resources and their commit IDs?
Thanks!
You can use the following REST API:
GET https://dev.azure.com/{Organization}/{Project}/_build/results?buildId={Build ID}&__rt=fps&__ver=2
Its response body is quite large; the information about the multiple repositories is under:
fps -> dataProviders -> data -> ms.vss-build-web.run-details-data-provider -> repositoryResources
It shows all source repositories, including their names, IDs, versions, and so on.
This REST API is not documented; I got it from the browser Developer Tools. Most of the information it contains is web page data.
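A rough sketch of how that could be consumed, hedged because the endpoint is undocumented and its JSON shape may change; ORG, PROJECT, BUILD_ID and PAT below are placeholders:

import base64, requests

ORG, PROJECT, BUILD_ID, PAT = "my-org", "my-project", 826, "my-pat"  # placeholders
AUTH = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}

url = (f"https://dev.azure.com/{ORG}/{PROJECT}/_build/results"
       f"?buildId={BUILD_ID}&__rt=fps&__ver=2")
data = requests.get(url, headers=AUTH).json()

# Walk the path mentioned above; .get() guards against shape changes.
repos = (data.get("fps", {})
             .get("dataProviders", {})
             .get("data", {})
             .get("ms.vss-build-web.run-details-data-provider", {})
             .get("repositoryResources", []))
for repo in repos:
    print(repo)  # each entry should carry the repository name, id and version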
It looks like this is not available over the documented REST API (yet). However, this is really strange, because if you hit the Get Build endpoint:
https://dev.azure.com/{{organization}}/{{project}}/_apis/build/builds/826?api-version=6.0-preview.6&expand=all
in the _links section you will find sourceVersionDisplayUri:
"_links": {
"self": {
"href": "https://dev.azure.com/thecodemanual/4fa6b279-3db9-4cb0-aab8-e06c2ad550b2/_apis/build/Builds/826"
},
"web": {
"href": "https://dev.azure.com/thecodemanual/4fa6b279-3db9-4cb0-aab8-e06c2ad550b2/_build/results?buildId=826"
},
"sourceVersionDisplayUri": {
"href": "https://dev.azure.com/thecodemanual/4fa6b279-3db9-4cb0-aab8-e06c2ad550b2/_apis/build/builds/826/sources"
},
"timeline": {
"href": "https://dev.azure.com/thecodemanual/4fa6b279-3db9-4cb0-aab8-e06c2ad550b2/_apis/build/builds/826/Timeline"
},
"badge": {
"href": "https://dev.azure.com/thecodemanual/4fa6b279-3db9-4cb0-aab8-e06c2ad550b2/_apis/build/status/43"
}
},
but that URL redirects you to the commit which actually triggered the build.
It is still possible to get the commits and repos from the logs:
https://dev.azure.com/{{organization}}/{{project}}/_apis/build/builds/826/logs/6?api-version=6.0-preview.2
But for that you need to go through them and parse the responses:
2020-05-08T11:13:04.3354990Z Syncing repository: kmadof/devops-templates (github)
.
.
.
.
2020-05-08T11:13:03.3810662Z ##[command]git checkout --progress --force 24602dc40710a53502d306d1d41d3bca3c9a9b80
2020-05-08T11:13:03.3816465Z Note: switching to '24602dc40710a53502d306d1d41d3bca3c9a9b80'.
2020-05-08T11:13:05.1455197Z ##[command]git checkout --progress --force 0d79e869239bbf0087492ac2cda3a59e9ef11a39
2020-05-08T11:13:05.1458382Z Note: switching to '0d79e869239bbf0087492ac2cda3a59e9ef11a39'.
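A rough sketch of that log-parsing approach. Assumptions: Python with requests, a PAT, and that the checkout log id is already known (e.g. 6, as in the URL above); the regexes are based only on the excerpt shown and are not a stable contract.

import base64, re, requests

ORG, PROJECT, BUILD_ID, LOG_ID, PAT = "my-org", "my-project", 826, 6, "my-pat"  # placeholders
AUTH = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}

log = requests.get(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds/{BUILD_ID}"
    f"/logs/{LOG_ID}?api-version=6.0-preview.2",
    headers=AUTH).text

# Pull the synced repositories and the commit SHAs that were checked out.
repos = re.findall(r"Syncing repository: (\S+)", log)
commits = re.findall(r"git checkout --progress --force ([0-9a-f]{40})", log)
print(repos)
print(commits)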

Deployed Keycloak Script Mapper does not show up in the GUI

I'm using the docker image of Keycloak 10.0.2. I want Keycloak to supply access_tokens that can be used by Hasura. Hasura requires custom claims like this:
{
  "sub": "1234567890",
  "name": "John Doe",
  "admin": true,
  "iat": 1516239022,
  "https://hasura.io/jwt/claims": {
    "x-hasura-allowed-roles": ["editor", "user", "mod"],
    "x-hasura-default-role": "user",
    "x-hasura-user-id": "1234567890",
    "x-hasura-org-id": "123",
    "x-hasura-custom": "custom-value"
  }
}
Following the documentation, and using a script I found online (see this gist), I created a Script Mapper JAR with this script (copied verbatim from the gist) in hasura-mapper.js:
var roles = [];
for each (var role in user.getRoleMappings()) roles.push(role.getName());
token.setOtherClaims("https://hasura.io/jwt/claims", {
  "x-hasura-user-id": user.getId(),
  "x-hasura-allowed-roles": Java.to(roles, "java.lang.String[]"),
  "x-hasura-default-role": "user",
});
and the following keycloak-scripts.json in META-INF/:
{
  "mappers": [
    {
      "name": "Hasura",
      "fileName": "hasura-mapper.js",
      "description": "Create Hasura Namespaces and roles"
    }
  ]
}
Keycloak debug log indicates it found the jar, and successfully deployed it.
But what's the next step? I can't find the deployed mapper anywhere in the GUI, so how do I activate it? I tried creating a protocol Mapper, but the option 'Script Mapper' is not available. And Scopes -> Evaluate generates a standard access token.
How do I activate my deployed protocol mapper?
Of course after you put up a question on SO you still keep searching, and I finally found the answer in this JIRA issue. The scripts feature has been a preview feature since (I think) version 8.
So when starting Keycloak you need to provide:
-Dkeycloak.profile.feature.scripts=enabled
and after that your Script Mapper will show up in the Mapper Type dropdown on the Create Mapper screen, and everything works.

How to create Data Factory's integration runtime in an ARM template

I am trying to deploy a data factory using an ARM template. It is easy to use the exported template to create a deployment pipeline.
However, as the data factory needs to access an on-premises database server, I need to have an integration runtime. The problem is: how can I include the runtime in the ARM template?
The template looks like this, and we can see that it is trying to include the runtime:
{
  "name": "[concat(parameters('factoryName'), '/OnPremisesSqlServer')]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "annotations": [],
    "type": "SqlServer",
    "typeProperties": {
      "connectionString": "[parameters('OnPremisesSqlServer_connectionString')]"
    },
    "connectVia": {
      "referenceName": "OnPremisesSqlServer",
      "type": "IntegrationRuntimeReference"
    }
  },
  "dependsOn": [
    "[concat(variables('factoryId'), '/integrationRuntimes/OnPremisesSqlServer')]"
  ]
},
{
  "name": "[concat(parameters('factoryName'), '/OnPremisesSqlServer')]",
  "type": "Microsoft.DataFactory/factories/integrationRuntimes",
  "apiVersion": "2018-06-01",
  "properties": {
    "type": "SelfHosted",
    "typeProperties": {}
  },
  "dependsOn": []
}
Running this template gives me this error:
\"connectVia\": {\r\n \"referenceName\": \"OnPremisesSqlServer\",\r\n \"type\": \"IntegrationRuntimeReference\"\r\n }\r\n }\r\n} and error is: Failed to encrypted linked service credentials on self-hosted IR 'OnPremisesSqlServer', reason is: NotFound, error message is: No online instance..
The problem is that I will need to type in a key in the integration runtime's UI so it can be registered in Azure. But I can only get that key from my data factory instance's UI. So the above ARM template deployment will always fail at least once. I am wondering if there is a way to create the runtime independently?
It seems that you already know how to create a Self-Hosted IR in the ADF ARM template.
{
  "name": "[concat(parameters('dataFactoryName'), '/integrationRuntime1')]",
  "type": "Microsoft.DataFactory/factories/integrationRuntimes",
  "apiVersion": "2018-06-01",
  "properties": {
    "additionalProperties": {},
    "description": "jaygongIR1",
    "type": "SelfHosted"
  }
}
Your only concern is that the Windows IR tool needs to be configured with an AUTHENTICATION KEY to access the ADF Self-Hosted IR node, so the IR will show an Unavailable status right after it is created. This flow makes sense, I think: the authentication key has to be created first, and then you can use it to configure the on-premises tool. You can't implement everything in one step, because these operations happen on both the Azure side and the on-premises side.
Based on the Self-Hosted IR tool documentation, the registration steps can't be implemented with PowerShell code. So the steps that can be automated in this flow are creating the IR and getting the auth key, not registering the node in the tool.
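As a sketch of the automatable half (create the IR in the ARM template, then fetch its authentication key so you can register the on-premises node manually), assuming Python with requests and azure-identity, contributor rights on the factory, and placeholder names:

import requests
from azure.identity import DefaultAzureCredential  # assumes azure-identity is installed

SUB, RG, FACTORY, IR = "<subscription-id>", "<resource-group>", "<factory-name>", "integrationRuntime1"

# Acquire an ARM token and call the Data Factory "Integration Runtimes - List Auth Keys" endpoint.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
url = (f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
       f"/providers/Microsoft.DataFactory/factories/{FACTORY}"
       f"/integrationRuntimes/{IR}/listAuthKeys?api-version=2018-06-01")

resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print(resp.json())  # expected to contain authKey1/authKey2 to paste into the on-premises IR tool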

Import VSTS task group with dataSourceBindings

I am trying to define dataSourceBindings in the JSON of a task group I intend to reuse in release definitions. My task group basically deploys a web app with some custom actions. For the parameters I expose on the group, I want to populate the picklists with proper values from the AzureRM endpoint, like the available subscriptions, resource groups, web apps, and slots.
To achieve this, I created a first template from the VSTS UI, which I then exported to edit in JSON. Based on multiple posts and examples available in the VSTS tasks repo, I defined my dataSourceBindings as:
"dataSourceBindings": [
{
"target": "ResourceGroupName",
"endpointId": "$(ConnectedServiceName)",
"dataSourceName": "AzureResourceGroups"
},
{
"target": "WebAppName",
"endpointId": "$(ConnectedServiceName)",
"dataSourceName": "AzureRMWebAppNamesByType",
"parameters": {
"WebAppKind": "app"
}
},
{
"target": "SlotName",
"endpointId": "$(ConnectedServiceName)",
"dataSourceName": "AzureRMWebAppSlotsId",
"parameters": {
"WebAppName": "$(WebAppName)",
"ResourceGroupName": "$(ResourceGroupName)"
},
"resultTemplate": "{\"Value\":\"{{{ #extractResource slots}}}\",\"DisplayValue\":\"{{{ #extractResource slots}}}\"}"
}
]
I was able to import my task group into VSTS, but once added to a release definition the picklists were still empty.
Then I tried to export the previously imported group and noticed the dataSourceBindings field was empty.
Is importing a task group with dataSourceBindings supported?
If yes, what could possibly go wrong with mine?