Why does Release Management Visual Studio Online only show 4 actions? - azure-devops

After installing the Release Management client and connecting to my Visual Studio Online account, I only see 4 actions.
Online I see that other people have a lot more actions in their toolbox.

What you are seeing is correct, since in VSO you do pretty much everything in PowerShell using the Deploy Using PS/DSC action. I have a blog post series on Continuous Delivery with VSO here that might be of use in understanding how to get going.

As Graham pointed out, what you are seeing is correct. On VSO, RM only supports the DSC/PowerShell (vNext) pipelines, whereas the on-premises product also supports the agent-based approach, which is where you see all the additional actions and can add your own.
If you look at how the agent-based actions are created, most are PowerShell, and you can modify that PowerShell to work in a vNext pipeline. Here is an example of making a script work in both. That would require the PowerShell to be part of your project so RM can execute it during your release. It is not as nice a solution as the agent-based one, but the next version of RM will address both of your concerns: there will be a way to establish a secure connection with Azure and a way for you to add custom actions or “Tasks”.
Keep an eye on DonovanBrown.com or follow me at @DonovanBrown; I have a blog post ready that I can’t post until the new RM is released. Before anyone asks, I don’t have a date.

Related

What are the differences in using GitHub or GitLab for CI/CD?

I have a software project which is currently hosted on Bitbucket. I would like to implement a CI/CD pipeline that would have to run on local agents for build/test/deploy. The runners would also have to be compatible with Windows 7/10 (x86/x64) and Linux (x86/x64/arm64/armv7). I am pretty new to DevOps, but after a thorough search I came up with 2 options: GitHub and GitLab. Can you tell me which one would be better, with some advantages/disadvantages for each? Thanks a lot.
My recommendation would be to go with GitLab, for the following reasons.
GitLab CI has been on the market for much longer than GitHub Actions, which only became generally available in November 2019; you can see some feature comparisons on the GitLab blog here.
When you are getting started, it is much easier to navigate the GitLab GUI to configure all the tools you need for DevOps; GitHub's GUI is somewhat harder to navigate because of the number of other tools available on GitHub.
In addition, GitLab is primarily focused on DevOps, and as a result it has integrated features over time that make the whole DevOps process much smoother than GitHub Actions, which only started out in 2019.
Also, there are plenty of templates available to get you started on GitLab, which is not the case on GitHub. These templates cover a wide range of languages, which I am sure will cover your project requirements.
Ease of access to CI within GitLab: in addition to an easy-to-navigate GUI, GitLab has all the tools necessary for DevOps bundled in one place, so every DevOps feature you need is accessible in a single location. They also have a YAML template available that can help you get started quickly (see the minimal pipeline sketch at the end of this answer).
Finally, there are far more features within GitLab, mainly because it has been on the market since around 2011-2012, compared to GitHub Actions, which dates from 2019.
There are, however, some major similarities that I would also like to point out, which I believe could make your transition easier, or help in case you want to try out both tools and judge for yourself.
Both GitHub Actions and GitLab CI are built-in tools.
Both GitHub and GitLab use the same Git commands, so there will be no learning curve for you in terms of managing and collaborating on changes to your project.
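To give an idea of what the pipeline-as-code side looks like, below is a minimal .gitlab-ci.yml sketch for a build/test/deploy pipeline that runs on self-hosted (local) runners. The job names, runner tags (linux, windows) and echo placeholders are assumptions you would replace with your own build, test and deploy commands.

stages:
  - build
  - test
  - deploy

build-job:
  stage: build
  tags: [linux]              # run only on self-hosted runners registered with this tag
  script:
    - echo "compile the project here"

test-job:
  stage: test
  tags: [linux]
  script:
    - echo "run the test suite here"

deploy-job:
  stage: deploy
  tags: [windows]            # e.g. a self-hosted Windows runner
  when: manual               # require a manual approval before deploying
  script:
    - echo "deploy to the target environment here"

Registering a runner on each Windows or Linux machine and tagging it lets you pin each job to the hardware it needs.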

SonarQube + Azure DevOps + Pipeline as Code - Is it possible?

The company I work for recently purchased SonarQube Enterprise to improve code quality throughout all repositories. I found out that there is a feature that enables SonarQube to comment automatically on PRs targeting a specific branch, and I successfully managed to try that out.
Thing is:
That configuration is not scalable: I would need to manually configure every repo to follow that rule
That configuration needs a build pipeline defined "old school" in Azure DevOps to work, and we are moving to Pipeline as Code, starting of course with CI (where this takes place)
Has anyone managed to get PR commenting working in that scenario? Or, at least, solved problem #1?
Cheers
You can use the REST APIs to do whatever configuration you need across your repositories. Refer to the REST API documentation.
Shouldn't matter, although I haven't tested it. The SonarQube tasks aren't aware of whether the build source is YAML or visual designer/classic/JSON builds. The underlying tasks and job running architecture is the same. As long as the build is hooked up to a branch policy, it should still work.
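For reference, moving the SonarQube analysis into a YAML pipeline generally just means adding the extension's tasks to the YAML definition. The sketch below is a minimal example under assumptions: the service connection name (SonarQubeConnection), the project key, the task major versions and the placeholder build step are all things you would replace with your own values.

trigger:
  branches:
    include: [ main ]

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: SonarQubePrepare@5             # major version depends on the installed extension
    inputs:
      SonarQube: 'SonarQubeConnection'   # assumed name of your SonarQube service connection
      scannerMode: 'CLI'
      configMode: 'manual'
      cliProjectKey: 'my-project-key'    # placeholder project key

  - script: echo "build and test the project here"   # placeholder build step

  - task: SonarQubeAnalyze@5

  - task: SonarQubePublish@5
    inputs:
      pollingTimeoutSec: '300'

As noted above, PR decoration still relies on the pipeline being attached to a branch policy on the target branch, regardless of whether the definition is YAML or classic.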

How to integrate powerapps with azure devops

I am doing some research on PowerApps integration with Azure DevOps.
However, there is limited information about it.
Is it possible to integrate PowerApps inside a task for Azure DevOps?
The context is that we have a .zip file with the PowerApp, and we want to create a build and release/deploy for several environments.
Thank you.
Is it possible to integrate PowerApps inside a task for Azure DevOps?
Yes, it is.
You can leverage the Solution concept of the Microsoft Power Platform and the Power Apps BuildTools (preview) extension for Azure DevOps.
Update 11/2020: This is now GA and called Power Platform Build Tools
I've written a complete step-by-step guide on this topic:
A Continuous Delivery Approach for No-Code Solutions in Microsoft’s Power Platform
Bottom line:
With this build tool, you can automatically check a Solution into source control and deploy it using a continuous delivery approach with the help of Azure DevOps. See the screenshot for a sample configuration of the Export and Import Solution tasks (a YAML sketch follows the list below).
It works for everything you can organize inside a Solution, e.g.:
Power Apps
Power Automate Flows
AI Builder Models
Common Data Service Entities
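As an illustration of those two tasks in a YAML pipeline, here is a rough sketch. The task names come from the Power Platform Build Tools extension, but the task major versions, the service connection names and the solution name are assumptions, so check the extension's current task reference before relying on it.

steps:
  - task: PowerPlatformToolInstaller@2             # installs the tooling the other tasks need

  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'DevEnvironmentConnection'   # assumed service connection to the dev environment
      SolutionName: 'MySolution'                     # placeholder solution name
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'

  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'TestEnvironmentConnection'  # assumed service connection to the target environment
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'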
Is it possible to integrate PowerApps inside a task for Azure DevOps?
I am afraid there is no such task to integrate PowerApps with Azure DevOps at this moment.
If you want to integrate PowerApps with Azure DevOps, you can follow this guide step by step:
Microsoft Teams – Integration with Visual Studio Team Services using PowerApps.
Besides, AFAIK, PowerApps should not be "Build/Deployed" through Azure DevOps.
When you are developing with PowerApps, there is no way to do Source Control. There are no source files. The only artifact you can version control is the .zip file that you can export.
And
In PowerApps, you don’t have to build your code. Any change you make to the application is live for you to test it. In that way it is very productive. To publish the application you just click on the publish button and it is live.
Check this great blog: PowerApps From A DevOps Perspective for some more details.
Hope this helps.
Solutions are a way to package your components in a single zip file and then use the Power Apps build tools to import your solution into a different environment or tenant.
It is still an improvement over manually importing each app or environment variable into the target system, but it lacks what we would call deployment automation.
To give an example, I will explain what I have done and what still constitutes a manual task:
I created an enterprise-level app using the Power Apps canvas model. My app consumes data from around 20 APIs. These API calls are implemented in Power Automate.
We have 4 environments: dev, SIT, UAT and prod. I can't keep importing flows into each environment and changing their API URLs to point to the deployed environment, so I used environment variables that store the API URLs for each environment. This can be done under a solution.
Under the same solution, I added my app. So now my solution has 2 things: my app and the environment variables that hold the API URLs.
I then use the Power Apps build tools to move this solution from dev to SIT.
Steps: use the build tools tasks to perform the following (see the YAML sketch at the end of this answer):
Export solution
Unpack it in git
Pack it
Import the solution.
This successfully moves my solution to SIT.
But the solution's environment variable still points to the dev URL.
So I have to override the environment variables to store the SIT URLs.
This manual intervention to edit the environment variable is as good as doing all the tasks manually.
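For reference, here is a rough YAML sketch of that export / unpack / pack / import flow using the Power Platform Build Tools tasks. The service connection names, solution name and paths are placeholders, and the deployment-settings-file inputs on the import task (shown as a possible way to supply per-environment environment variable values) are an assumption to verify against the current task documentation rather than a confirmed fix for the manual step described above.

steps:
  - task: PowerPlatformToolInstaller@2

  # 1. Export the solution from the dev environment
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'DevConnection'               # assumed service connection
      SolutionName: 'MySolution'                      # placeholder
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'

  # 2. Unpack it into the repository so it can be committed to git
  - task: PowerPlatformUnpackSolution@2
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution.zip'
      SolutionTargetFolder: '$(Build.SourcesDirectory)/solutions/MySolution'

  # 3. Pack it again from source when building the release artifact
  - task: PowerPlatformPackSolution@2
    inputs:
      SolutionSourceFolder: '$(Build.SourcesDirectory)/solutions/MySolution'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MySolution_packed.zip'

  # 4. Import into the target (SIT) environment; the settings-file inputs
  #    below are an assumed option for overriding environment variable values
  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'SitConnection'               # assumed service connection
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MySolution_packed.zip'
      UseDeploymentSettingsFile: true
      DeploymentSettingsFile: '$(Build.SourcesDirectory)/config/sit.settings.json'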
The claim that PowerApps cannot be source controlled was true when PowerApps was first announced; however, this is no longer the case.
While it is technically true that there is no actual code to be managed and deployed with a PowerApp or Flow, that doesn't mean you cannot use the power of Azure DevOps. Additionally, when creating a PowerApp/Flow you would also be creating entities and even model-driven apps - and these use solutions - which naturally deploy well within Azure DevOps.
Microsoft is building out this whole construct to enable all of these to deploy...
While the incorporation of PowerApps and Flows into solutions is not fully baked yet, they are targeting to have this ready around the October time frame this year.
We have been talking to Microsoft about also enabling PowerApps and Flows to follow the same expansion that solutions do, so that they can take advantage of the full branching strategy.
So even though you would simply be exporting zip files into your repo, you can still take advantage of the full DevOps pipeline, which is highly recommended.
Use this component; it is still in preview mode but is working fine on my side:
https://marketplace.visualstudio.com/items?itemName=microsoft-IsvExpTools.PowerApps-BuildTools

Backup or copy Azure Mobile Service scripts?

I am running a mobile service with an increasing amount of scripted functionality. I want to have these scripts somehow stored in a sensible format for version control. I'm having a hard time finding any information on such scenarios. Is it even possible to go Azure -> VS2012 (and TFS), or VS2012 (and TFS) -> Azure?
Currently that is not supported on the portal itself. You can do that by using the Command Line Interface, however, as was shown in this blog post. Basically, you can store the scripts in whichever source control system you want (the post uses Git, but it would work with TFS as well) and use the CLI to update your service whenever a new version of the script is checked in.
You can also vote up the source control feature suggestion on the UserVoice for the system, to have that functionality implemented in the service itself.
You can now link your Azure Mobile scripts to a git repo directly from the Azure Mobile Portal Dashboard, allowing you to edit scripts from VS2012 or another editor.

Source control for server side scripts in Azure Mobile Service

I am using Azure Mobile Services as the backend for my mobile app. Despite my best efforts, my server-side scripts are getting complex now. Is there a way I can keep the insert, update, read and delete scripts for the tables in my service in source control, and maybe have a way to deploy them from within Visual Studio?
Have you checked out the Node-based Azure Command Line Tools? They will likely hold the solution to your problem. These tools allow you to neatly manage your mobile service from your dev machine. The newly added CLI tools for Mobile Services also support downloading your scripts. Just run the following command in your Azure PowerShell:
azure mobile script download <service_name> <script_name>
The script name syntax is as follows:
For tables: table/<table_name>.{insert|read|update|delete}
Apple Push Notification: shared/apnsFeedback
Scheduler: scheduler/<job_name>
Once you have your scripts downloaded and placed on your local filesystem, you could put them in source control with your client that consumes your mobile service, or just throw them in their own git repository. You can't, however, sync your source control repository with your mobile service. In order to upload any changes you've made to your scripts, you'd need to execute the following command in the Azure-CLI again:
azure mobile script upload <service_name> <script_name>
I'm not sure if you can upload multiple scripts at once, though. You could probably use some of the Azure-CLI automation scripts I saw Glenn Block post on GitHub. These could allow you to automate uploading the scripts as part of your build workflow.
Edit:
I found a few more resources that might help you with this:
Getting started with the CLI and backing up your scripts
More CLI – changing your Mobile Services workflow
These are some great resources from Josh Twist. I'm sure they will push you in the right direction.
Since this question was answered, a new feature has been added to Azure Mobile Services: integration with Git source control. Basically, you can enable this feature in the dashboard of your mobile service, and it converts the storage into a Git repository which you can clone/pull from and push updates to.
You can find more information in the tutorial at http://www.windowsazure.com/en-us/develop/mobile/tutorials/store-scripts-in-source-control/.