Our Azure DevOps implementation both deploys an ARM template and runs several PowerShell scripts to fully deploy our solution. Currently it modifies the Azure Storage logging and metrics properties using the Azure PowerShell cmdlets Set-AzureStorageServiceMetricsProperty and Set-AzureStorageServiceLoggingProperty.
While it is perfectly acceptable to continue using these cmdlets, we're considering adding the equivalent JSON to our ARM template. Is this possible? If so, is it documented anywhere? I've looked through the Azure Resource Manager template reference, but that doesn't seem to have what I need. Any pointers or even example JSON would be appreciated.
Yes, that is possible. Here's the article that covers it:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/monitoring-and-diagnostics/monitoring-enable-diagnostic-logs-using-template.md#non-compute-resource-template
In theory the ARM JSON schema could change without notice (well, no one would warn you personally), but breaking changes are extremely rare; my three-year-old ARM templates still work. So there's nothing to fear, really.
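For reference, the approach in that article is to nest a diagnosticSettings resource inside the resource you want to monitor, i.e. in the storage account's own "resources" array. Below is a minimal sketch of that pattern; the parameter names, metric category, retention values and apiVersion are placeholders of mine, so take the exact schema from the article:

```json
{
  "type": "providers/diagnosticSettings",
  "name": "[concat('Microsoft.Insights/', parameters('settingName'))]",
  "apiVersion": "2017-05-01-preview",
  "dependsOn": [
    "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName'))]"
  ],
  "properties": {
    "storageAccountId": "[parameters('diagnosticsStorageAccountId')]",
    "metrics": [
      {
        "category": "Transaction",
        "enabled": true,
        "retentionPolicy": {
          "enabled": true,
          "days": 7
        }
      }
    ]
  }
}
```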
We have created a process template at the enterprise level on the Microsoft Azure DevOps platform. We were looking to export the process template so that it can be imported into another organization; however, we do not find an option to do so. Can anyone help?
The only way I've found so far to export inherited processes to other organizations is to use the process-migrator tool that Microsoft has published on GitHub. There are some wonky things about it that don't totally work, but it should hopefully be a good start:
https://github.com/Microsoft/process-migrator
You download the tool and install its dependencies, then you can run migrate or export/import (I usually do export/import); I've sketched the commands below.
I think it works okay as-is, except that work item rules of type CurrentUserIsMemberOfGroup are a problem and picklists don't export correctly, so you'll want to do some testing of the tooling first. I also found out recently that this tooling uses an old SDK/API version (API v4.1), so hopefully it will be updated soon.
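For what it's worth, here's roughly how I drive it from a PowerShell prompt. The --mode and --config switches and the configuration file name are from my memory of the repo's README, so double-check them there first:

```powershell
# Install the tool globally (requires Node.js/npm).
npm install process-migrator -g

# Fill in a configuration file with the source/target account URLs, PATs and
# the process name, then run an export followed by an import (or a single migrate).
process-migrator --mode=export --config=.\configuration.json
process-migrator --mode=import --config=.\configuration.json
```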
I am not sure about the Azure DevOps UI, but there are methods in the Azure DevOps Services REST API:
Export Process Template REST API Documentation
Import Process Template REST API Documentation
The parameters are pretty straightforward and well explained in the Microsoft documentation.
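If it helps, calling those endpoints from PowerShell is just a PAT-authenticated Invoke-RestMethod. The sketch below is mine, and the URI is deliberately a placeholder; substitute the exact export/import routes and api-version from the documentation linked above:

```powershell
# Personal access token with process read/write scope (placeholder value).
$pat     = "<your-PAT>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Placeholder URI - take the real export route and api-version from the linked docs.
$uri = "https://dev.azure.com/<organization>/_apis/<export-process-route>?api-version=<version>"

# Download the exported process template to a local file.
Invoke-RestMethod -Uri $uri -Method Get -Headers $headers -OutFile ".\process-template.zip"
```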
I have always done manual report publishing to the Power BI workspace and it has worked well. It lets me have better control over:
Publishing
Dashboarding
Collaborating
Securing
I have started a new piece of work, and the requirement is to have all of the above or, failing that, at least to publish the reports using automated scripts.
I have googled and haven't found anything that purely talks about automating this using, say, PowerShell or any other method barring C#.
We have plans to use PowerShell scripts to deploy Azure modules, and I was looking for something similar for Power BI as well.
Would appreciate pointers to any script that I can customize and use or a tutorial that explains this process.
Cheers...
You can either use the Power BI service to automate all of this, which is the preferred way recommended by Microsoft, as discussed at https://community.powerbi.com/t5/Desktop/Automation-for-Power-BI-Desktop/td-p/332560, or you can use this Python script by dubravcik: https://github.com/dubravcik/pbixrefresher-python
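Another PowerShell-only route, which the answer above doesn't mention so treat it as my suggestion, is the MicrosoftPowerBIMgmt module. A minimal sketch for publishing a .pbix to a workspace, assuming that module's cmdlets and placeholder workspace/report names:

```powershell
# Install the Power BI management module once.
Install-Module MicrosoftPowerBIMgmt -Scope CurrentUser

# Sign in (interactive here; a service principal can be used for unattended runs).
Connect-PowerBIServiceAccount

# Resolve the target workspace by name (placeholder) and publish the report,
# overwriting an existing report with the same name.
$workspace = Get-PowerBIWorkspace -Name "Sales Reports"
New-PowerBIReport -Path ".\SalesDashboard.pbix" -WorkspaceId $workspace.Id -ConflictAction CreateOrOverwrite
```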
We've got some legacy on-premises apps that we're evaluating moving off-site, and we are weighing all our options. I understand that Azure Web Sites would be a lot easier to set up, but at this point it looks like we may need some of the additional control that Cloud Services gives us.
However, everything I've read about Cloud Services so far demonstrates how you build an app and then deploy the build to the cloud. Similarly, you can connect to a Visual Studio Online repository, define builds in VSO, and after a commit, a build is performed and the build is deployed to the cloud.
However, in our case some of our pages are Classic ASP pages. In the event that one of these pages changes, I have not been able to figure out a workflow that allows us to deploy the updated files. Remember, Classic ASP files do not have a "build" process; like a PowerShell script, they are interpreted at runtime.
There is no Visual Studio solution or project involved with these apps. It's just a package of files we want to upload. For a "proof of concept" I decided to start with the simplest possible "app," a simple "hello.txt" file, and I have not been able to figure out a way to deploy this without "wrapping" it in a Visual Studio solution.
I was hoping that I could use, e.g., Publish-AzureServiceProject; however, this appears to need a ServiceDefinition.csdef file, and again, I'm not sure how to produce one without setting up a solution in Visual Studio, a solution that wouldn't be used for anything else.
I have a feeling I'm missing something and just need to find the appropriate publish settings file, or proper use of an Azure cmdlet. Is there a straightforward way to publish a package of files to an Azure Cloud Service?
Josh, you will need to package the files into a deployable package. This can be achieved using the cspack command-line tool and a hand-crafted service definition file. Your ASP files would be treated as 'content' in this case.
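Roughly, and with the caveat that the role name, folder layout and SDK path below are placeholders of mine, the hand-crafted-definition route looks something like this from a PowerShell prompt with the Azure SDK installed:

```powershell
# Write a minimal hand-crafted service definition for a single web role whose
# content is just your folder of Classic ASP / static files.
@'
<ServiceDefinition name="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1" vmsize="Small">
    <Sites>
      <Site name="Web" physicalDirectory=".\SiteContent">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </Endpoints>
  </WebRole>
</ServiceDefinition>
'@ | Set-Content .\ServiceDefinition.csdef

# Package the definition plus the site content into a .cspkg with cspack
# (shipped with the Azure SDK; adjust the path to your SDK version).
& "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\<version>\bin\cspack.exe" `
    .\ServiceDefinition.csdef `
    /role:"WebRole1;.\SiteContent" `
    /sites:"WebRole1;Web;.\SiteContent" `
    /out:.\MyService.cspkg
```

The resulting MyService.cspkg, together with a matching ServiceConfiguration.cscfg, is what you then hand to the portal or to the deployment cmdlets.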
The easiest way would be just to create a stub Visual Studio solution and include a 'Cloud Service' project to which you add all the ASP files. This way all your files will be redeployed in the event that your web roles require recycling by the Azure fabric.
While this might seem like a big overhead if you need to tweak just a single file, it is the correct way to manage PaaS deployments in Azure. If this process doesn't work for you then you should consider moving to an IaaS VM you fully manage yourself.
One thing that may be helpful is to realize that the web roles in Cloud Services are just VMs running IIS. For that reason, you can connect to them just like any other server, via RDP, FTP, etc. Our team often bypasses the overhead for simple things, like deploying a new CSS file or an image, by simply copying it in the old-school way.
Again, not sure if this helps you, but old school techniques work just as well. :-)
Newbie to automated Azure deployment here! I have the happy task of automating our deployment to the cloud. I have also done some reading and discovered that the two main tools are MSBuild and PowerShell. Could anyone tell me why I would use one over the other, or indeed whether there are better ways to automate the deployment? Keep in mind that my main concern is performance and I need this deployment to be as fast as possible.
Any insight would be most welcome.
I'm a fan of using PowerShell for deployments. It's pretty quick to set up and the script can be pretty straightforward.
MSBuild can be great too. I use MSBuild from TFS Team Build to kick off a PowerShell script to do the deployment. Works like a champ.
A good starting point would be http://blogs.msdn.com/b/tomholl/archive/2011/12/06/automated-build-and-deployment-with-windows-azure-sdk-1-6.aspx. This blog does a great job of showing you how to build and deploy with Team Build.
If you don't want/need the Team Build and MSBuild part, then just look at his PowerShell script. That covers the basics of getting a deployment from your dev environment to Windows Azure.
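For a flavour of what the PowerShell side of such a script does, here's a minimal sketch using the classic Azure service management cmdlets of that era; the service name, file paths and label are placeholders of mine, and the blog's script is more complete:

```powershell
# Authenticate once with a downloaded .publishsettings file and pick the subscription.
Import-AzurePublishSettingsFile ".\MySubscription.publishsettings"
Select-AzureSubscription -SubscriptionName "MySubscription"

# Push a freshly built package and configuration to the production slot.
New-AzureDeployment -ServiceName "mycloudservice" `
                    -Slot "Production" `
                    -Package ".\MyService.cspkg" `
                    -Configuration ".\ServiceConfiguration.cscfg" `
                    -Label "Automated build $(Get-Date -Format s)"
```

For subsequent updates to an existing deployment, Set-AzureDeployment (with its upgrade switch) is the usual counterpart.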
You should use Web Deploy; it only takes about a minute to deploy a fix. See these links:
http://blogs.msdn.com/b/cloud/archive/2011/04/19/enabling-web-deploy-for-windows-azure-web-roles-with-visual-studio.aspx
http://channel9.msdn.com/Blogs/funkyonex/Speed-Up-Azure-Deployments-with-the-New-Web-Deployment-Feature
At SplendidCRM, we had a similar need to automate deployments to Azure, but as our need was to service our live customers, we had to develop in C#. We had been watching Azure for many years, but it was not until they provided a DNS service that it made sense to make the move. Using the Azure Resource Manager (ARM) libraries, we were able to automate VM creation, SQL database creation and DNS name creation. In addition to the Microsoft documentation for ARM, we found it particularly useful to be able to get the Microsoft source code for the PowerShell scripts that wrap ARM, because the documentation does not always provide a complete set of settings.
In the end, we decided to release the Azure deployment code as part of a new Ultimate edition that combines order and customer management with software deployment.
We've run into an issue with the New-Deployment Azure PowerShell cmdlet timing out; we've filed a bug report with Microsoft. While they gave us an explanation (the path and timeout threshold used when uploading through the cmdlets are different than what's used by the web portal), they don't have a fix for us.
We need to get this running so we can automate our build deployments, so we're looking into developing a custom cmdlet to replace New-Deployment using the Azure SDK, hoping this path will not have the timeout issues the cmdlet did. But before we go down that route: are there any other scriptable tools I can use to replace the New-Deployment functionality? I looked at Cloudberry for Windows Azure, but that doesn't have a scriptable interface yet.
Any constructive input is greatly appreciated.
If you are developing worker roles, then you can reuse the dynamic assembly loading system that comes with the execution framework of Lokad.Cloud.
Basically, you just have to upload a ZIP archive containing all your DLLs to Blob Storage and the framework takes care of the rest. Extra bonus: worker restart takes about 10 seconds (vs. 10+ minutes for a classic deployment).
A similar behavior could be obtained with web roles too, but this part hasn't been implemented in Lokad.Cloud yet.
FYI, we solved this issue by writing a custom tool that uploads the package with a higher timeout tolerance, and by using CSManage (http://code.msdn.microsoft.com/Release/ProjectReleases.aspx?ProjectName=windowsazuresamples&ReleaseId=3233) to create the deployment after uploading.
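If anyone wants to script the same workaround without a custom tool, the upload half can also be done from PowerShell by pushing the .cspkg to blob storage yourself with generous timeouts and then handing the blob URL to CSManage. A rough sketch, using the classic Azure storage cmdlets, with the storage account, key, container and file names all placeholders of mine:

```powershell
# Storage context for the account that will hold the deployment package.
$ctx = New-AzureStorageContext -StorageAccountName "mydeploystorage" -StorageAccountKey "<key>"

# Upload the package with much larger client/server timeouts (in seconds) than the
# default, which is what the stock upload path was tripping over for us.
Set-AzureStorageBlobContent -File ".\MyService.cspkg" `
                            -Container "deployments" `
                            -Blob "MyService.cspkg" `
                            -Context $ctx `
                            -ClientTimeoutPerRequest 3600 `
                            -ServerTimeoutPerRequest 3600

# The resulting blob URL can then be passed to CSManage (or the service management
# API) to create the deployment from the already-uploaded package.
```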