Azure Pipelines Tasks - How to query available versions?

We have a situation where for stability reasons, we are locking some Tasks in our azure pipelines to specific versions.
For example, we might lock S3Upload#1.10.0 (from the AWS Task Toolkit) or PublishTestResults#2.180.0 (built-in task). We don't want these specific versions changing on us without doing a manual test pass on this pipeline.
However, there's no easy way to tell when a new version should be tested. Is there a comprehensive list of the task versions available in your account, or a way to ask via an API what versions of a task are available? Ideally I could make an API call like _apis/tasks/S3Upload/versions and see what versions of that task are available for me to use in a pipeline.
(An API, query, or feed of some kind would allow us to gather up the changed tasks and queue them for a periodic manual test pass.)

Currently you can use the Azure DevOps REST API to get the list of installed extensions (which can include pipeline tasks).
You can find the version in the response, as shown below:
"extensionId": "pr-multi-cherry-pick",
"extensionName": "PR Multi-Cherry-Pick",
"publisherId": "1ESLighthouseEng",
"publisherName": "Microsoft DevLabs",
"version": "1.0.0.64",
"registrationId": "204772b3-5f43-40f4-9c1b-c15c798f1544",

Related

Can I force ADO to disregard a build agent demand?

We are running Azure DevOps Server and we have our own, locally hosted build agents. I'm trying to get a WhiteSource scan to run on one of our build agents. The WhiteSource task "demands" node.js. But none of our projects use node.js at all, so whether the agents have node.js installed is totally irrelevant. I can't identify an option that could be used to stop the WhiteSource task making this demand.
Is there a way to cancel the demand? A way to tell ADO "this task/pipeline is going to demand node.js, but in fact I know better; you may disregard that demand and run the pipeline even on a build agent that doesn't have node.js installed"?
The build pipeline is a YAML pipeline. I would like a YAML-only solution if possible.
I would like to avoid actually installing node.js on our build agents, given that the dependency on it is entirely spurious; it would never actually be used.
Is there a way to cancel the demand?
Some demands are added automatically based on the needs of the task.
In this case, we cannot cancel these demands.
As a workaround, you can manually add a node.js capability under Agent Pools -> (your self-hosted agent) -> Capabilities -> User-defined capabilities.
This satisfies the pipeline's demand without actually installing node.js, as the sketch below illustrates.
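To illustrate the matching (the pool name is assumed, not from the answer above): the WhiteSource task injects an implicit demand equivalent to the explicit one in this minimal YAML sketch, and a user-defined capability named node.js on the agent, with any value, satisfies it:

pool:
  name: SelfHostedPool   # assumed pool name
  demands:
  - node.js   # "exists" demand; satisfied by the user-defined capability
              # even though node.js is not actually installed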

Executing onetime scripts from Azure DevOps Pipelines

I'm looking for some advice on how others might have managed the handling and execution of one-time scripts which need to be executed either pre-deployment or post-deployment.
We are looking at building a solution but I was wondering if there are any tools out there already?
I want the pipeline to find the scripts relevant to the sprint release being deployed and run each script only if it has not been run before. These scripts are often data changes following schema updates. We use CosmosDB; the scripts are written in .NET.
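One common pattern (purely illustrative, not from the question) is a pre-deployment step that invokes a console runner which records each executed script, for example in a CosmosDB container, and skips scripts it has already recorded. The project path and arguments here are hypothetical:

steps:
- script: >
    dotnet run --project tools/OneTimeScriptRunner --
    --phase pre-deployment
    --release $(Build.SourceBranchName)
  displayName: Run pending one-time scripts (runner skips scripts already recorded)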

MSOLEDBSQL on Azure Pipelines

To promote CI/CD for Analysis Services tabular models (SSAS, Azure AS or Power BI datasets), I always recommend people to use Tabular Editor with Azure DevOps. One popular feature of Tabular Editor's CLI, is that it can perform a schema check, in which Tabular Editor connects to the data source defined within the tabular model, in order to validate the partition queries against the actual columns specified in the model metadata.
Microsoft recommends the use of the MSOLEDBSQL provider (1,2) for SQL Server-based data sources. Unfortunately, this provider is not available on Microsoft-hosted build agents in Azure Pipelines (neither vs2017-win2016 nor windows-2019).
Unfortunately, the installer for MSOLEDBSQL requires admin permission, so I don't think that we can install the driver as part of our pipeline.
One workaround is to use Tabular Editor's scripting functionality to temporarily change the data source to use, for example, the SQLNCLI provider when performing the schema check. However, it feels like the missing MSOLEDBSQL driver on the build agents is an oversight on Microsoft's part, especially considering that they're recommending the use of this driver for production purposes.
Is there any way we can have the MSOLEDBSQL driver available on a Microsoft-hosted Windows-based build agent?
Is there any way we can have the MSOLEDBSQL driver available on a Microsoft-hosted Windows-based build agent?
The Virtual Environments repo contains the source used to create the virtual environments for GitHub Actions hosted runners, as well as the VM images of Microsoft-hosted agents used for Azure Pipelines. To file bug reports, or to request that tools be added or updated, please open an issue using the appropriate template.
So you can open a Tool Request there with the given template, and the team will consider and check your feedback.
In addition, as a temporary workaround, you can consider installing a self-hosted agent on a local machine, so that you can run the pipeline in a local environment (with more control to install the dependent software needed for your build and deployment).
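One more avenue worth testing (an assumption on my part, not from the answer above): Microsoft-hosted Windows agents generally run the job with administrative rights and ship with Chocolatey, so a silent install step may succeed despite the installer's admin requirement. msoledbsql here refers to the community Chocolatey package:

steps:
- script: choco install msoledbsql -y --no-progress
  displayName: Install MSOLEDBSQL driver (assumes elevated agent and the choco package)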

SonarQube + Azure DevOps + Pipeline as Code - Is it possible?

The company I work on recently purchased SonarQube Enterprise to improve code quality throughout all repositories. I found out that there is a feature that enables SonarQube to comment automatically on PRs targeting a specific branch, and I successfully managed to try that out.
Thing is:
1. That configuration is not scalable: I would need to manually configure every repo to follow that rule.
2. That configuration needs a build pipeline defined "old school" on Azure DevOps to work, and we are moving into Pipeline as Code, starting of course with CI (where this takes place).
Anyone managed to get the PR commenting working in that scenario? Or, at least, solving the #1 problem?
Cheers
You can use REST APIs to do whatever configuration you need to do across your repositories. Refer to the REST API documentation.
Shouldn't matter, although I haven't tested it. The SonarQube tasks aren't aware of whether the build source is YAML or visual designer/classic/JSON builds. The underlying tasks and job running architecture is the same. As long as the build is hooked up to a branch policy, it should still work.
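For the Pipeline as Code side, the YAML is just the standard tasks from the SonarQube extension wired into the build; PR decoration then works as long as the pipeline runs on pull requests via branch policy. A minimal sketch (the task major versions, the service connection name, and the project key are assumptions):

steps:
- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'SonarQubeConnection'   # assumed service connection name
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'       # placeholder
- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
- task: SonarQubeAnalyze@5
- task: SonarQubePublish@5
  inputs:
    pollingTimeoutSec: '300'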

How to integrate PowerApps with Azure DevOps

I am doing some research on PowerApps integration with Azure DevOps.
However, there is limited information about it.
Is it possible to integrate PowerApps inside a Task for Azure DevOps?
The premise is that we have a .zip file with the PowerApp, and we want to create a Build and Release/Deploy for several environments.
Thank You.
Is it possible to integrate PowerApps inside a Task for Azure DevOps?
Yes it is.
You can leverage the Solution concept of the Microsoft Power Platform and the Power Apps BuildTools (preview) extension for Azure DevOps.
Update 11/2020: This is now GA and called Power Platform Build Tools
I've written a complete step-by-step guide on this topic:
A Continuous Delivery Approach for No-Code Solutions in Microsoft’s Power Platform
Bottom line:
With this build tool, you can automatically check a Solution into source control and deploy it using a continuous delivery approach with the help of Azure DevOps. A sample configuration of the Export and Import Solution tasks is sketched after the list below.
It works for everything you can organize inside a Solution, e.g.:
Power Apps
Power Automate Flows
AI Builder Models
Common Data Service Entities
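A minimal sketch of that Export and Import configuration in YAML, assuming the current Power Platform Build Tools task names; the service connection names and the solution name are placeholders:

steps:
- task: PowerPlatformToolInstaller@2
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'DevEnvironmentConnection'    # placeholder service connection
    SolutionName: 'MySolution'                      # placeholder
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'TestEnvironmentConnection'   # placeholder service connection
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'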
Is it possible to integrate PowerApps inside a Task for Azure DevOps?
I am afraid there is no such task to integrate PowerApps with Azure DevOps at this moment.
If you want to integrate PowerApps with Azure DevOps, you can follow this guide step by step:
Microsoft Teams – Integration with Visual Studio Team Services using PowerApps.
Besides, AFAIK, PowerApps should not be "Build/Deployed" through Azure DevOps.
When you are developing with PowerApps, there is no way to do Source Control. There are no source files. The only artifact you can version control is the .zip file that you can export.
And
In PowerApps, you don’t have to build your code. Any change you make to the application is live for you to test it. In that way it is very productive. To publish the application you just click on the publish button and it is live.
Check this great blog: PowerApps From A DevOps Perspective for some more details.
Hope this helps.
Solutions are a way to package your components in a single zip file, which you can then import into a different environment or tenant using the PowerApps build tools.
It is still an improvement over manually exporting each app or environment variable and then importing it into the target system, but it lacks what we would call automation of deployment.
To give an example, I will explain what I have done, and what still constitutes a manual task:
I created an enterprise-level app using the PowerApps canvas model. My app consumes data from around 20 APIs. These API calls are implemented in Power Automate.
We have 4 environments: dev, sit, uat and prod. I can't keep importing flows into each environment and changing their API URLs to point at the deployed environment, so I used environment variables which store the API URLs for each environment. This can be done under a solution.
Under the same solution, I added my app. So now my solution has 2 things: my app and the environment variables which hold the API URLs.
I then use the PowerApps build tools to move this solution from dev to sit.
Steps: use the build tools tasks to perform the following (a YAML sketch follows the list):
Export solution
Unpack it in git
Pack it
Import the solution.
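Sketched in YAML with the same assumed Power Platform Build Tools task names as above, the unpack and pack steps between the export and the import might look like this; the folder layout is a placeholder:

steps:
- task: PowerPlatformUnpackSolution@2
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)/solution'   # committed to git
- task: PowerPlatformPackSolution@2
  inputs:
    SolutionSourceFolder: '$(Build.SourcesDirectory)/solution'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution_packed.zip'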
This successfully moves my solution to sit.
But the solution environment variable still points to the dev URL, so I have to override the environment variables with the sit URLs.
This manual intervention to edit the environment variables is as good as doing all the tasks manually.
This was the case when PowerApps was first announced; however, this is no longer the case.
While it is technically true that there is no actual code to be managed and deployed with a PowerApp or Flow, that doesn't mean that you cannot use the power of Azure DevOps. Additionally, when creating a PowerApp / Flow you would also be creating entities and even model-driven apps - and these use solutions - which naturally work well to deploy within Azure DevOps.
Microsoft is building out this whole construct to enable all these to deploy...
While the whole incorporation of PowerApps and flows into Solutions is not fully baked yet - they are targeting to have this ready around the October time frame this year.
We have been talking to Microsoft about also enabling PowerApps and flows to follow the same expansion that solutions do so that they can take advantage of the full branching strategy.
So even though you would simply be exporting zip files into your repo - you can still take advantage of the full DevOps pipeline, which is highly recommended.
Use this component; it is still in preview mode but is working fine on my side:
https://marketplace.visualstudio.com/items?itemName=microsoft-IsvExpTools.PowerApps-BuildTools