MSOLEDBSQL on Azure Pipelines

To promote CI/CD for Analysis Services tabular models (SSAS, Azure AS or Power BI datasets), I always recommend using Tabular Editor with Azure DevOps. One popular feature of Tabular Editor's CLI is its schema check, in which Tabular Editor connects to the data source defined within the tabular model in order to validate the partition queries against the columns specified in the model metadata.
Microsoft recommends the MSOLEDBSQL provider (1, 2) for SQL Server-based data sources. Unfortunately, this provider is not available on Microsoft-hosted build agents in Azure Pipelines (neither vs2017-win2016 nor windows-2019).
Moreover, the installer for MSOLEDBSQL requires admin permissions, so I don't think we can install the driver as part of our pipeline.
One workaround is to use Tabular Editor's scripting functionality to temporarily switch the data source to, for example, the SQLNCLI provider when performing the schema check (a rough sketch of such a pipeline step follows below). However, the missing MSOLEDBSQL driver on the build agents feels like an oversight on Microsoft's part, especially considering that they recommend this driver for production use.
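For reference, a hedged sketch of what that pipeline step might look like, assuming the Tabular Editor 2 CLI with its -S (script) and -SC (schema check) switches; the TabularEditor.exe path, Model.bim path and SetProvider.cs script are placeholders:

```yaml
# Hypothetical sketch only: run a Tabular Editor script that swaps the provider,
# then perform the schema check (-SC). Paths and file names are placeholders.
steps:
- script: >
    "C:\Program Files (x86)\Tabular Editor\TabularEditor.exe" Model.bim
    -S SetProvider.cs
    -SC
  displayName: 'Schema check with provider workaround'
```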
Is there any way we can have the MSOLEDBSQL driver available on a Microsoft-hosted Windows-based build agent?

Is there any way we can have the MSOLEDBSQL driver available on a Microsoft-hosted Windows-based build agent?
The virtual-environments repo contains the source used to create the virtual environments for GitHub Actions hosted runners, as well as the VM images of Microsoft-hosted agents used for Azure Pipelines. To file bug reports, or to request that tools be added or updated, please open an issue using the appropriate template.
So you can open a Tool Request there with the given template, and the team will consider and check your feedback.
In addition, as a temporary workaround, you can consider installing a self-hosted agent on your local machine, so that you can run the pipeline in a local environment, with more control over installing the software your build and deployment depend on.
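For example, once such an agent is registered, the pipeline only needs to target its pool instead of a hosted image; the pool name below is an assumption:

```yaml
# Run the job on a self-hosted pool (assumed name 'Default') where drivers
# such as MSOLEDBSQL can be installed once by an administrator.
pool:
  name: Default
```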

Related

How do I SSH/RDP into the ADO agents in order to investigate build issues?

I am unable to find any documentation on the wiki that details the approach for SSH/RDP into the ADO agents. It would be good to know the approach for both BTL/ATL agents.
The Microsoft hosted agents? You don't. You can't. They are assigned when the pipeline is queued and immediately deprovisioned after the pipeline finishes running.
Your own on-premise/self-hosted agents? The same way you'd SSH/RDP into any other machine. If you're having trouble with that, that's something for you to discuss with your networking / infrastructure team.
If you are using a Microsoft-hosted agent, then you cannot, per the documentation. The agent is created as a one-time instance and then torn down as soon as the build is completed.
Microsoft-hosted agents do not offer:
The ability to remotely connect.
The ability to drop artifacts to a UNC file share.
The ability to join machines directly to your corporate network.
The ability to get bigger or more powerful build machines.
The ability to pre-load custom software. You can install software during a pipeline run, such as through tool installer tasks or in a script.
Potential performance advantages that you might get by using self-hosted agents that might start and run builds faster.
The ability to run XAML builds.
If you are using a self-hosted machine running the agent, then you'd just SSH/RDP into the server like any other and check the work folder.
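As an aside (not from the original answer): even without remote access, you can print the agent's folder layout from a pipeline step using predefined variables, which often helps narrow down build issues. A minimal sketch:

```yaml
# Print the agent's work folder and the directories used by this run.
steps:
- script: |
    echo Agent work folder:  $(Agent.WorkFolder)
    echo Build directory:    $(Agent.BuildDirectory)
    echo Sources directory:  $(Build.SourcesDirectory)
  displayName: 'Show agent folder layout'
```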

Azure Pipelines - Clone or Copy Hosted Agent

Is it possible to clone or copy an Azure Pipeline Hosted Agent for use as a Self-Hosted Agent?
I'd like to reuse one of the Hosted Agents simply to enable me to recreate and reuse all of its capabilities, saving me the headache. Is this possible? If so, I'd really appreciate some help.
Thanks
This is not possible, but you can try to reuse what is already available here https://github.com/actions/virtual-environments
This repository contains the source used to create the virtual environments for GitHub Actions hosted runners, as well as the VM images of Microsoft-hosted agents used for Azure Pipelines. To file bug reports, or request that tools be added/updated, please open an issue using the appropriate template. To build a VM machine from this repo's source, see the instructions.
So you can use the same scripts that are used to create the Microsoft-hosted agents.
There is no such built-in feature.
Azure DevOps provides free hosted agents that have a predefined set of tools installed and configured for building and releasing your apps.
There is another option where you set up and manage your own agents. This can be done by simply downloading the agent package and running it either on your local machine or on any other computing platform, even in a Docker container. It also gives you more freedom to install specific dependencies for your build and release.
If you want to build your own agent during the pipeline, I would suggest using a Docker container to handle the process.
Microsoft has already created pre-configured container images on Docker Hub for everybody to use. But they're Linux-based and don't contain any additional applications and/or packages, so you'd probably still need to add those every time you run your build.
This repository contains images for the Visual Studio Team Services (VSTS) agent that runs tasks as part of a build or release. VSTS agent images are tagged according to the base OS, an optional Team Foundation Server (TFS) version, and tools that are installed. When used with VSTS, the agent version is automatically determined and downloaded at container startup based on the account to which the agent is connecting.
For more detailed steps, you can refer to this article: Build your own Azure DevOps agents with pipelines.
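For illustration, a hedged docker-compose sketch of running one of those agent images as a self-hosted agent; the environment variable names follow the old microsoft/vsts-agent image documentation (that image line has since been deprecated in favor of building your own image), and every value is a placeholder:

```yaml
# Hypothetical sketch only: run the (now deprecated) microsoft/vsts-agent image
# as a self-hosted agent. Verify the image and variable names against current docs.
version: "3"
services:
  vsts-agent:
    image: microsoft/vsts-agent
    environment:
      VSTS_ACCOUNT: "your-org-name"   # placeholder: Azure DevOps organization name
      VSTS_TOKEN: "your-pat"          # placeholder: PAT with Agent Pools (read, manage) scope
      VSTS_POOL: "Default"            # placeholder: agent pool to join
```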

Test server for release pipeline in Azure DevOps

Forgive me for asking a stupid question. I am from IT Infrastructure background & have been asked to create CI/CD pipelines based on my recent learnings on DevOps.
We have couple of applications whose source code is currently in TFS 2013 & those apps are written in ASP.NET C# language. Now, requirement is to migrate the source code from TFS to Azure Repos (Azure DevOps services) & further create a CI/CD pipeline.
Now, for demo purposes, the customer is asking us to do the deployment (i.e. the release pipeline) to a test server, which is a plain Windows 2012 OS without any SQL or IIS, for both of these applications. Is that possible, and how could we achieve the results to confirm the release pipeline is functioning properly?
In my opinion, it won't work, as there is no application infrastructure/configuration in place for those applications on that plain test server. I guess we actually need a ready dev/stage environment which is a replica of production to do the testing of the release pipeline for those applications. Am I correct?
Just need expert advice for confirmation so I can communicate the same to the customer.
Azure DevOps Pipelines use an agent to perform the deployments. You can run the agent entirely in the cloud when deploying to Azure resources. You can also install an agent locally. Follow this link and scroll down to read about self-hosted agents. This is how you can deploy to your test instance from the pipeline.
Now, what you deploy there may require additional software be installed. You say it's an application in C#. Cool. Now, what's it do? Is it a windows program? Then just having the server there, with an agent installed, is all you need. Is it a web program? Then, yeah, it's going to need an IIS (or whatever) instance available somewhere to deploy to. Is it a database program? Then, yeah, it's going to need a database instance to deploy to. There's nothing magical about having a VM or a machine somewhere. All the same rules have to apply. There has to be an OS, drives, memory, and yes, supporting services depending on the needs of the application.
However, using a local machine instead of a hosted one works fine. Just follow the instructions in the link above; a rough sketch of such a deployment job is shown below.
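For illustration only, a hedged YAML sketch of a deployment job that runs on an agent installed on the test server; the pool and environment names are placeholders, and the deploy step is just an echo standing in for whatever your application actually needs:

```yaml
# Hypothetical sketch: deploy from the pipeline using the agent registered
# on the Windows 2012 test server. Names below are placeholders.
stages:
- stage: DeployToTest
  jobs:
  - deployment: DeployApp
    pool:
      name: TestServerPool          # pool that contains the test server's agent
    environment: 'Test'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Deploy the artifacts from $(Pipeline.Workspace) here
            displayName: 'Placeholder deploy step'
```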

How to deploy SQL Database in Bitbucket pipeline to Azure

Asking for an opinion or direction on the current problem.
We are using a Bitbucket pipeline to deploy CI/CD web applications to Azure. What remains is the database, which is also hosted on Azure.
From my research, everything on SQL Database Project deployments usually uses Azure DevOps pipelines (connects to a GitHub repo, allows multiple environments, and has a built-in SQL agent that deploys the SQL database to the target server via a DACPAC file). It allows CI with every check-in, every time you push changes. Nice!
But what if we cannot (for some reason) use Azure DevOps and have to use Bitbucket Pipelines instead? Is that possible? How? Via scripting? A tool to call on the command line? Any help is highly appreciated.
It's true that in Azure DevOps it is easier to deploy an (Azure) SQL Database, as Azure DevOps offers many tasks (including 3rd-party custom tasks you can find in the Microsoft Marketplace).
However, no matter what tool you use, you should be able to do the same, as long as you understand how deployment of the specific service works.
I don't know Bitbucket very well, but I bet the product has the capability to execute commands, including PowerShell commands. If so, you need two steps in your pipeline to publish an Azure SQL database:
1) Create the server and an (empty) database. Perhaps Bitbucket offers a task for creating services in Azure (from an ARM template or another way); if not, you can always use the Azure CLI or PowerShell to do so. More info: az cli server.
2) Deploy the database or changes to it. This step always compares a DACPAC file (the compiled version of a SQL Server database project) to the target database on the server. The result is a differential T-SQL script which is executed against the target database. There is only one way to do so: sqlpackage.exe, a tool provided by Microsoft. You can find the whole documentation here, and there are plenty of examples of how to use it on the Internet. A sketch of such a step is shown below.
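A hedged bitbucket-pipelines.yml sketch, assuming SqlPackage is installed as a .NET global tool inside the step; the image, paths, server, database and credential variables are all placeholders:

```yaml
# Hypothetical sketch: publish a DACPAC with sqlpackage from a Bitbucket pipeline.
# Server, database and credential values are placeholders; verify sqlpackage
# arguments against Microsoft's SqlPackage documentation for your version.
pipelines:
  default:
    - step:
        name: Deploy database
        image: mcr.microsoft.com/dotnet/sdk:8.0   # assumption: any image with the .NET SDK
        script:
          # Install SqlPackage as a .NET global tool so the step is self-contained.
          - dotnet tool install -g microsoft.sqlpackage
          - export PATH="$PATH:/root/.dotnet/tools"
          # Compare the DACPAC to the target database and apply the differences.
          - >-
            sqlpackage /Action:Publish
            /SourceFile:build/MyDatabase.dacpac
            /TargetServerName:myserver.database.windows.net
            /TargetDatabaseName:MyDatabase
            /TargetUser:$DB_USER
            /TargetPassword:$DB_PASSWORD
```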
Let me know if that helps.

How to integrate powerapps with azure devops

I am doing some research on Power Apps integration with Azure DevOps.
However, there is limited information about it.
Is it possible to integrate Power Apps inside a task for Azure DevOps?
The premise is that we have a .zip file with the Power App, and we want to create a build and release/deploy pipeline for several environments.
Thank You.
Is it possible to integrate Power Apps inside a task for Azure DevOps?
Yes it is.
You can leverage the Solution concept of the Microsoft Power Platform and the Power Apps BuildTools (preview) extension for Azure DevOps.
Update 11/2020: This is now GA and called Power Platform Build Tools
I've written a complete step-by-step guide on this topic:
A Continuous Delivery Approach for No-Code Solutions in Microsoft’s Power Platform
Bottom line:
With this build tool, you can automatically check a Solution into source control and deploy it using a continuous delivery approach with the help of Azure DevOps, for example via the Export and Import Solution tasks sketched below.
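For illustration, a hedged YAML sketch of those tasks; the task names, versions and input names are as I recall them from the Power Platform Build Tools documentation, so verify them against the extension version you install, and the service connection and solution names are placeholders:

```yaml
# Hypothetical sketch of the Export/Import Solution tasks from the Power Platform
# Build Tools extension. Service connections and solution names are placeholders.
steps:
- task: PowerPlatformToolInstaller@2
- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'SourceEnvironmentSPN'     # service connection to the source environment
    SolutionName: 'MySolution'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'TargetEnvironmentSPN'     # service connection to the target environment
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
```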
It works for everything you can organize inside a Solution, e.g.:
Power Apps
Power Automate Flows
AI Builder Models
Common Data Service Entities
Is it possible to integrate Power Apps inside a task for Azure DevOps?
I am afraid there is no such task to integrate Power Apps with Azure DevOps at this moment.
If you want to integrate Power Apps with Azure DevOps, you can follow this guide step by step:
Microsoft Teams – Integration with Visual Studio Team Services using PowerApps.
Besides, AFAIK, Power Apps should not be "built/deployed" through Azure DevOps.
When you are developing with PowerApps, there is no way to do Source Control. There are no source files. The only artifact you can version control is the .zip file that you can export.
And
In PowerApps, you don’t have to build your code. Any change you make to the application is live for you to test it. In that way it is very productive. To publish the application you just click on the publish button and it is live.
Check this great blog: PowerApps From A DevOps Perspective for some more details.
Hope this helps.
Solutions are a way to package your components in a single zip file and use the Power Apps build tools to import your solution into a different environment or tenant.
It is still an improvement over manually exporting each app or environment variable and then importing it into the target system, but it lacks what we would call automation of deployment.
To give an example, I will explain what I have done, and what still constitutes a manual task:
I created an enterprise-level app using the Power Apps canvas model. My app consumes data from around 20 APIs. These API calls are implemented in Power Automate.
We have 4 environments: dev, sit, uat and prod. I can't keep importing flows into each environment and changing their API URLs to point to the deployed environment, so I used environment variables that store the API URLs for each environment. This can be done under a solution.
Under the same solution, I added my app. So now my solution has 2 things: my app and the environment variable which holds the API URLs.
I then use the Power Apps build tools to move this solution from dev to sit.
Steps: use the build tools tasks to perform the following (a rough sketch of the unpack/pack steps follows the list):
Export solution
Unpack it in git
Pack it
Import the solution.
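A hedged sketch of the unpack and pack steps in YAML (the Export and Import Solution tasks from the same extension handle steps 1 and 4); the task and input names are from memory and should be checked against the extension docs, and all paths are placeholders:

```yaml
# Hypothetical fragment: unpack the exported zip into the repo folder, then pack it
# again before importing. Verify task/input names against the installed extension.
steps:
- task: PowerPlatformUnpackSolution@2
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)/solutions/MySolution'
- task: PowerPlatformPackSolution@2
  inputs:
    SolutionSourceFolder: '$(Build.SourcesDirectory)/solutions/MySolution'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution_packed.zip'
```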
This successfully moves my solution to sit.
But the solution's environment variable still points to the dev URL.
So I have to override the environment variables to store the sit URLs.
This manual intervention to edit the environment variable is as good as doing all the tasks manually.
This was the case when PowerApps was first announced; however, it is no longer true.
While it is technically true that there is no actual code to be managed and deployed with a Power App or flow, that doesn't mean you cannot use the power of Azure DevOps. Additionally, when creating a Power App / flow you would also be creating entities and even model-driven apps, and these use solutions, which naturally work well for deployment within Azure DevOps.
Microsoft is building out this whole construct to enable all of these to be deployed...
While the incorporation of Power Apps and flows into Solutions is not fully baked yet, they are targeting having this ready around the October time frame this year.
We have been talking to Microsoft about also enabling PowerApps and flows to follow the same expansion that solutions do so that they can take advantage of the full branching strategy.
So even though you would simply be exporting zip files into your repo, you can still take advantage of the full DevOps pipeline, which is highly recommended.
Use this component; it is still in preview mode but is working fine on my side:
https://marketplace.visualstudio.com/items?itemName=microsoft-IsvExpTools.PowerApps-BuildTools