Azure DevOps REST API documentation for policies is missing?

We're building a web-based self-service portal for our employees to speed up and improve everyone's DevOps experience. The self-service portal uses the Azure DevOps REST API, and most of it works fine, but as I'm about to implement branch policies I'm stuck for lack of documentation (or my inability to find it).
I think I have found what documentation is available for creating policy configurations, like this article. There's just a general mention that "settings" needs to be a JObject, and then seven examples for various scenarios, but if there are any reference articles for the 14 supported policy types then I have missed them.
Am I just blind or did Microsoft just not bother with documenting how to form the JObjects for the different kinds of configurations?

In my view, the issue is rather that the documentation does not explicitly spell out the content of the JObject.
When we check the REST API documentation for Configurations - Create, we can see the following:
Indeed, it only states that its type is a JSON object, without specific content or examples.
To see the content of this JSON object, I use the REST API Configurations - Get and look at the response body, which looks like this:
"settings": {
"minimumApproverCount": 2,
"creatorVoteCounts": false,
"allowDownvotes": false,
"resetOnSourcePush": false,
"requireVoteOnLastIteration": false,
"resetRejectionsOnSourcePush": false,
"blockLastPusherVote": false,
"scope": [
{
"refName": "refs/heads/Dev",
"matchKind": "Exact",
"repositoryId": "dcb40ef6-dae0-4e3c-b581-2f71c76e09a6"
}
]
},
So we know the content is indeed a JSON object, and it will differ depending on the policy being configured.
Now, moving back to the samples in that document, we can find several such settings objects, for example:
Approval count policy:
{
  "isEnabled": true,
  "isBlocking": false,
  "type": {
    "id": "fa4e907d-c16b-4a4c-9dfa-4906e5d171dd"
  },
  "settings": {
    "minimumApproverCount": 1,
    "creatorVoteCounts": false,
    "scope": [
      {
        "repositoryId": null,
        "refName": "refs/heads/master",
        "matchKind": "exact"
      }
    ]
  }
}
If we want to set any of the other supported policy types, we can configure the policy manually in the UI and then retrieve the corresponding response body for it.
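As a minimal sketch of that workflow (the organization, project, configuration ID, and PAT environment variable are placeholders, and Node 18+ is assumed for the built-in fetch), a small Node.js script can fetch a configuration created through the UI and print its settings object so you can see the exact shape for that policy type:

var org = 'myorg';
var project = 'myproject';
var configurationId = 1; // ID of a policy created through the UI
var pat = process.env.AZDO_PAT; // personal access token

var url = 'https://dev.azure.com/' + org + '/' + project +
  '/_apis/policy/configurations/' + configurationId + '?api-version=5.1';

fetch(url, {
  headers: {
    // Azure DevOps PATs are sent as basic auth with an empty user name
    Authorization: 'Basic ' + Buffer.from(':' + pat).toString('base64')
  }
})
  .then(function(res) { return res.json(); })
  .then(function(config) {
    // The "settings" object is the part the Create documentation leaves unspecified
    console.log(JSON.stringify(config.settings, null, 2));
  })
  .catch(console.error);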

Related

How to know the structure (body) of an Azure REST API POST request?

I am new to the Azure REST API and I don't know how to get the correct body template for a policy.
For example, I used:
GET https://dev.azure.com/organization/project/_apis/policy/types?api-version=7.0
and the response lists the types of policies I can use, but how do I know how to construct the request body? Like this one:
{
  "isEnabled": true,
  "isBlocking": false,
  "type": {
    "id": "fa4e907d-c16b-4a4c-9dfa-4906e5d171dd"
  },
  "settings": {
    "minimumApproverCount": 4,
    "creatorVoteCounts": false,
    "scope": [
      {
        "repositoryId": "a957e751-90e5-4857-949d-518cf5763394",
        "refName": "refs/heads/master",
        "matchKind": "exact"
      }
    ]
  }
}
Where should I find those request body templates? :(
Resources: https://learn.microsoft.com/en-us/rest/api/azure/devops/policy/configurations/create?view=azure-devops-rest-5.1&tabs=HTTP
Usually, when you can list or get the repository policies correctly, you can use the configuration part of the returned result as the request body when creating the policy with the POST method.
REST API to list the branch policies:
GET https://dev.azure.com/{organization}/{project}/_apis/policy/configurations?api-version=5.1
With optional parameters:
GET https://dev.azure.com/{organization}/{project}/_apis/policy/configurations?scope={scope}&policyType={policyType}&api-version=5.1
You can check the templates below for different configurations in the Policy template examples:
Examples
Approval count policy
Build policy
Example policy
Git case enforcement policy
Git maximum blob size policy
Merge strategy policy
Work item policy
If you still don't know how to compose the request body, you could also share your scenario.
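As a rough sketch (the organization, project, and PAT environment variable are placeholders; Node 18+ is assumed for the built-in fetch), creating a policy is just a POST of one of those templates, here reusing the approval-count body from the question:

var org = 'myorg';
var project = 'myproject';
var pat = process.env.AZDO_PAT;

var body = {
  isEnabled: true,
  isBlocking: false,
  type: { id: 'fa4e907d-c16b-4a4c-9dfa-4906e5d171dd' }, // minimum-reviewers policy type
  settings: {
    minimumApproverCount: 4,
    creatorVoteCounts: false,
    scope: [
      {
        repositoryId: 'a957e751-90e5-4857-949d-518cf5763394',
        refName: 'refs/heads/master',
        matchKind: 'exact'
      }
    ]
  }
};

fetch('https://dev.azure.com/' + org + '/' + project + '/_apis/policy/configurations?api-version=5.1', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Basic ' + Buffer.from(':' + pat).toString('base64')
  },
  body: JSON.stringify(body)
})
  .then(function(res) { return res.json(); })
  .then(function(created) { console.log('Created policy configuration', created.id); })
  .catch(console.error);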
I finally made it work. It was very hard, and I don't understand why Microsoft's documentation is so poor... I had to figure it out by sending random requests and looking at how the elements are named. So bad, so much time spent...

Azure DevOps Extension Actions

I am developing a service hook extension for Azure DevOps that is loosely based on the sample provided at https://learn.microsoft.com/en-us/azure/devops/extend/develop/add-service-hook?view=azure-devops
I cannot find any documentation on the available actions. For example, the sample provides a sample consumer with an action of publishEvent but there is no reference material on this.
Could someone please point me toward any reference documentation that might exist?
Custom service hooks currently only support sending the standard event payload. In the sample, you can see the action defined as below:
"actions": [
{
"id": "performAction",
"name": "Perform action",
"description": "Posts a standard event payload",
"supportedEventTypes": [
"git.push",
"git.pullrequest.created",
"git.pullrequest.updated"
],
"publishEvent": {
"url": "{{{url}}}",
"resourceDetailsToSend": "all",
"messagesToSend": "all",
"detailedMessagesToSend": "all"
}
}
]
With this setting, it will send the full payload of the triggering event to the URL you configured. You can also configure how much information to send:
resourceDetailsToSend - all, minimal, none
messagesToSend - all, text, html, markdown, none
detailedMessagesToSend - all, text, html, markdown, none
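For example, if the receiving service only needs a lightweight notification, the same publishEvent block could be trimmed using the values listed above (this is a hypothetical variant of the sample, not taken from the official docs):

"publishEvent": {
  "url": "{{{url}}}",
  "resourceDetailsToSend": "minimal",
  "messagesToSend": "text",
  "detailedMessagesToSend": "none"
}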

Published property in Data Factory dataset JSON

I have noticed that upon saving a dataset definition in Data Factory via the Azure portal,
"published": false
appears in the definition. I have seen datasets work fine with published: false, but have also seen some that seemingly only start working with published: true; however, that might have been a coincidence.
I've been unable to find any documentation for this property.
{
  "name": "DataLakeDummyXmlInput",
  "properties": {
    "published": false,
    "type": "AzureDataLakeStore",
This property is currently classified as "legacy" as described by a Microsoft employee here:
...this is a legacy element in our object model.
The link also mentions the possibility of "lighting it up as a future feature", which means it may come into use in the future. For now, don't worry about it.

Security of Cloudant query from OpenWhisk

I'm building an Angular SPA with a Cloudant data store on Bluemix.
Since the Bluemix implementation of OpenWhisk doesn't use VCAP services, I see three options for using OpenWhisk as my API provider for Cloudant queries from my Angular app:
Follow the pattern of passing credentials as seen here: https://github.com/IBM-Bluemix/openwhisk-visionapp (very interesting approach btw)
Include the credentials as though I'm running locally as seen here: https://github.com/IBM-Bluemix/nodejs-cloudant/blob/master/app.js
Use the HTTP API as seen here: https://docs.cloudant.com/api.html (which highlights the security problem of passing credentials).
Since my service is not intended for publishing (it's intended for my own app), I'm thinking option 2 is my "least of all evils" choice. Am I missing something? My thinking is that, while fragile to changes, it would be the most secure since credentials aren't passed in the open; the serverless infrastructure would have to be hacked...
Thanks in advance!
(lengthy) Update: (apologies in advance)
I've gotten a little farther along but still no answer - stuck in execution right now.
To clarify, my objective is for the app to flow from Angular Client -> OpenWhisk -> Cloudant.
In this simplest use case, I want to pass a startTime parameter and an endTime parameter, have OpenWhisk fetch all the records in that time range with all fields, and pass back selected fields. In my example, I have USGS earthquake data in a modified GeoJSON format.
Based on the articles below, I've concluded that I can invoke the wsk command-line actions and use the bindings I've set up from within my JavaScript function, and therefore not pass my credentials to the database. This gives me a measure of security (I still question the REST endpoint of my OpenWhisk action), but I figure once I get my sample running I can think through that part of it.
My command line (that works):
wsk action invoke /my#orgname.com_mybluemixspace/mycfAppName/exec-query-find --blocking --result --param dbname perils --param query {\"selector\":{\"_id\":{\"$gt\":0},\"properties.time\":{\"$gt\":1484190609500,\"$lt\":1484190609700}}}
This successfully returns the following:
{
  "docs": [
    {
      "_id": "eq1484190609589",
      "_rev": "1-b4fe3de75d9c5efc0eb05df38f056a65",
      "dbSaveTime": 1.484191201099e+12,
      "fipsalpha": "AK",
      "fipsnumer": "02",
      "geometry": {
        "coordinates": [
          -149.3691,
          62.5456,
          0
        ],
        "type": "Point"
      },
      "id": "ak15062242",
      "properties": {
        "alert": null,
        "cdi": null,
        "code": "15062242",
        "detail": "http://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ak15062242.geojson",
        "dmin": null,
        "felt": null,
        "gap": null,
        "ids": ",ak15062242,",
        "mag": 1.4,
        "magType": "ml",
        "mmi": null,
        "net": "ak",
        "nst": null,
        "place": "45km ENE of Talkeetna, Alaska",
        "rms": 0.5,
        "sig": 30,
        "sources": ",ak,",
        "status": "automatic",
        "time": 1.484190609589e+12,
        "title": "M 1.4 - 45km ENE of Talkeetna, Alaska",
        "tsunami": 0,
        "type": "earthquake",
        "types": ",geoserve,origin,",
        "tz": -540,
        "updated": 1.484191127265e+12,
        "url": "http://earthquake.usgs.gov/earthquakes/eventpage/ak15062242"
      },
      "type": "Feature"
    }
  ]
}
The action I created in OpenWhisk (below) returns an Internal Server Error. I'm passing the input value as:
{
  "startTime": "1484161200000",
  "endTime": "1484190000000"
}
Here's the code for my action:
var openWhisk = require('openwhisk');
var ow = openWhisk({
  api_key: 'im really a host'
});

function main(params) {
  return new Promise(function(resolve, reject) {
    ow.actions.invoke({
      actionName: '/my#orgname.com_mybluemixspace/mycfAppName/exec-query-find',
      blocking: true,
      parameters: {
        dbname: 'perils',
        query: {
          "selector": {
            "_id": {
              "$gt": 0
            },
            "properties.time": {
              "$gt": params.startTime,
              "$lt": params.endTime
            }
          }
        }
      }
    }).then(function(res) {
      //get the raw result
      var raw = res.response.result.rows;
      //lets make a new one
      var result = [];
      raw.forEach(function(c) {
        result.push({id: c.docs._id, time: c.docs.properties.time, title: c.docs.properties.title});
      });
      resolve({result: result});
    });
  });
}
Here are the links to my research:
http://infrastructuredevops.com/08-17-2016/news-openwhisk-uniq.html
Useful because of the exec-query-find and selector syntax, but also handy for the update function I need to build for populating my data!
https://www.raymondcamden.com/2016/12/23/going-serverless-with-openwhisk
The article referenced by @csantanapr
Am I overlooking something?
Thanks!
I'm assuming what you are trying to do is access your Cloudant DB directly from your Angular client-side code in the browser.
If you don't need any business logic, or you can get away with using Cloudant features (design docs, views, map, reduce, etc.) and you are generating Cloudant API keys with specific access (i.e. write vs. read), then you don't need a server or serverless middleware/tier.
But now let's get real: most people need that tier, and if you are looking at OpenWhisk, then you are in luck; this is very easy to do.
OpenWhisk on Bluemix supports VCAP service credentials, but in a different way.
Let's say you have a Bluemix org carlos@example.com and space dev; that would translate to the OpenWhisk namespace carlos@example.com_dev.
If you add a Cloudant service under the space dev in Bluemix, this will generate service key credentials for this Cloudant account. These credentials give you full admin access.
If you want to use these Cloudant credentials in OpenWhisk, you can use the automatic binding generated with the cloudant package.
To do this using the OpenWhisk CLI, run wsk package refresh; this will pull the Cloudant credentials and create a new package with the credentials bound as default parameters for all the Cloudant actions under that package. This is a modified version of option 1 above.
Another alternative is to bind the credentials manually to a package or an action as default parameters. This makes sense when you don't want to use the full admin credentials and you have generated a Cloudant API key for a specific database. This is option 1 above.
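As a sketch of that manual binding (the binding name, host, database name, and credential values below are placeholders for your own generated API key), you can bind the public Cloudant package with the credentials as default parameters:

wsk package bind /whisk.system/cloudant myCloudant \
  --param username "my-generated-api-key" \
  --param password "my-api-key-password" \
  --param host "myaccount.cloudant.com" \
  --param dbname "perils"

Actions invoked under myCloudant then pick up those values automatically, so they never appear in requests from the client.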
I would not recommend putting the credentials in source code (option 2).
For option 3, what's insecure is passing your credentials as part of the URL, like https://username:password@user.cloudant.com; passing the username and password in the Authorization header over HTTPS is secure.
This is because credentials embedded in the URL tend to end up in places like server logs, browser history, and proxies, whereas passing secrets in the body or a header is standard practice, since they are transferred only after the secure connection is established.
Then you create actions that use those credentials as default parameters to build the business logic for your backend.
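A minimal sketch of such an action (assuming the hypothetical myCloudant binding from above; the field names mirror the question's data), keeping all credentials out of the source:

var openwhisk = require('openwhisk');

function main(params) {
  // Inside an action, openwhisk() picks up the API host and key automatically
  var ow = openwhisk();

  return ow.actions.invoke({
    name: 'myCloudant/exec-query-find', // resolved relative to this action's namespace
    blocking: true,
    result: true,
    params: {
      // dbname and the Cloudant credentials come from the binding's default parameters
      query: {
        selector: {
          'properties.time': {
            $gt: parseInt(params.startTime, 10),
            $lt: parseInt(params.endTime, 10)
          }
        }
      }
    }
  }).then(function(res) {
    // exec-query-find returns the matching documents under "docs"
    return {
      result: res.docs.map(function(d) {
        return { id: d._id, time: d.properties.time, title: d.properties.title };
      })
    };
  });
}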
Then, how do you access this backend from the browser? OpenWhisk has an experimental API Gateway feature that allows you to expose your actions as public APIs with CORS enabled.
Only a URL is exposed; your credentials, stored as default parameters, are never exposed.
If you want to see an example, check out Raymond Camden's blog post where he shows an Ionic/Angular app accessing his Cloudant database of cats:
https://www.raymondcamden.com/2016/12/23/going-serverless-with-openwhisk

StrongLoop REST Connector - connecting to non-REST remote resources

We have an existing web application whose API is not based on REST. We'd like to put a REST API in front of it using StrongLoop; however, we're getting lost in the documentation and are not sure whether this can be achieved.
Example:
We want to configure an endpoint in StrongLoop that looks like:
localhost:3000/api/DataObject/Orders?StartDate=01/01/2016&EndDate=31/01/2016
A GET on this endpoint should service the request from our existing web application, where the URL would look like:
localhost:4000/wh?Page=ObjectBuilder&Name=Orders&StartDate=01/01/2016&EndDate=31/01/2016
i.e. take Orders from the API request and insert into the remote URL, along with the remaining parameters.
I could code this using Express.js, but was wondering if this is possible using configuration in StrongLoop?
Thanks!
I think you might be able to use the built-in REST connector even though your legacy API is not REST per se (although you don't get all the benefits of the built-in mapping to find, create, destroy, etc). The connector simply translates URLs into model methods. That said, I think you do need to have the old API spit out JSON... does it do that? If not, then you basically just have to write a full translator.
This is not working code, but might help you get part of the way there.
In your server/datasources.json file:
"old-service": {
"name": "old-service",
"connector": "rest",
"operations": [{
"template": {
"method": "GET",
"url": "http://localhost:4000/wh",
"headers": {
// whatever you might need to send...
},
"query": {
"Page": "ObjectBuilder",
"Name": "{name}",
"StartDate": "{start}",
"EndDate": "{end}"
},
"responsePath": "$.results.theObject" // be sure to custom ize this
},
"functions": {
"buildObject": ["name", "start", "end"]
}
}]
}
In your server/model-config.json, be sure to map your DataObject model to this datasource:
{
  // ...
  "DataObject": {
    "public": true,
    "dataSource": "old-service"
  },
}
And in your model itself (common/models/DataObject.js) you can now call the buildObject() method:
DataObject.buildObject('Order', '01/01/2016', '31/01/2016', function(err, result, response) {
  if (err) { ... }
  // otherwise look at the result or response...
});
Now that you can call this method, you could put it into a remoteMethod or even override the default find method for this model.
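For instance, a sketch of wrapping it in a remote method (the method name "orders" and the argument names are illustrative; they just mirror the buildObject() function defined above):

// common/models/DataObject.js -- expose buildObject() as a custom GET endpoint
module.exports = function(DataObject) {
  DataObject.orders = function(name, start, end, cb) {
    // Delegate to the REST-connector function defined in datasources.json
    DataObject.buildObject(name, start, end, function(err, result) {
      if (err) return cb(err);
      cb(null, result);
    });
  };

  DataObject.remoteMethod('orders', {
    accepts: [
      { arg: 'name', type: 'string', required: true },
      { arg: 'start', type: 'string', required: true },
      { arg: 'end', type: 'string', required: true }
    ],
    returns: { arg: 'data', type: 'object', root: true },
    http: { verb: 'get', path: '/orders' }
  });
};

This would give you something close to the URL shape in the question, e.g. GET /api/DataObjects/orders?name=Orders&start=01/01/2016&end=31/01/2016.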
Good luck, but in many of these cases you simply have to write the "conversion" code yourself. Might be easier to rewrite the API from scratch. ;)