The FAQ of the UsePython task documentation says that in order to use the UsePython task, you have to manually install the required version of Python under a specific folder structure in the Agent.ToolsDirectory folder.
This is pretty daunting. Isn't there a way to point Azure at a specific source (such as an Artifactory instance or something like that) and tell it to install Python from there?
If you use self-hosted agents on VMs, you enter the territory where configuration management tools come in handy. You can use:
PowerShell Desired State Configuration (DSC) - there are quick tutorials on installing Python with it
Ansible
Chef
However, there is no simple way to achieve this. You can do it manually, or build a solution to manage software on your self-hosted machines. You can also reuse the repository Microsoft uses to build its hosted VM images and leverage that.
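For the original question about the UsePython task, the folder layout the agent expects is a "tool cache" under Agent.ToolsDirectory. A minimal PowerShell sketch of laying it out on a self-hosted Windows agent might look like this (the version, architecture, and source folder are placeholders; the .complete marker file is what tells the task the version is available):

    # Assumes $pythonSource already contains an extracted Python build
    # (e.g. from the embeddable zip or the python NuGet package).
    $version  = '3.11.4'    # version the pipeline will request
    $arch     = 'x64'
    # AGENT_TOOLSDIRECTORY is set during pipeline jobs; when provisioning the
    # machine outside a job, point this at the agent's _work\_tool folder instead.
    $toolsDir = $env:AGENT_TOOLSDIRECTORY
    $target   = Join-Path $toolsDir "Python\$version\$arch"

    New-Item -ItemType Directory -Path $target -Force | Out-Null
    Copy-Item -Path (Join-Path $pythonSource '*') -Destination $target -Recurse -Force

    # The task treats the version as installed only once this marker file exists.
    New-Item -ItemType File -Path (Join-Path $toolsDir "Python\$version\$arch.complete") -Force | Out-Null

The same steps could just as well be wrapped in a DSC Script resource or an Ansible/Chef task if you go the configuration management route.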
I have some .exe files which I need to install on VMs in Azure using Azure Automation PowerShell DSC. Is this possible, and how can it be done? The .exe files are currently stored in an Azure file share.
A similar question was asked here. The answer provided was:
If you want to install the EXE remotely, then the VM needs to be configured to allow remote management - you can do this via WinRM. See: https://www.penflip.com/powershellorg/secrets-of-powershell-remoting/blob/master/accessing-remote-computers.txt and http://tarkus.me/post/64761019099/windows-azure-vms-remote-management
Setting this up is non-trivial (you need to ensure it's secure). You could also try adding a VM extension to the VM that will do the copy. Since the VM is already provisioned, I think the only option here would be to use DSC. Though I've never added a DSC extension to a VM post-provisioning, it should work. That would require authoring and staging a DSC script in addition to adding the extension, but it's definitely less complex than enabling remote management.
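As a rough sketch of the DSC piece (the share path, product name, and installer switches below are placeholders, the ProductId has to match what the installer actually registers, and compiling a configuration with a credential also needs PsDscAllowPlainTextPassword or certificate-based encryption in the configuration data):

    Configuration InstallLegacyApp
    {
        param([PSCredential] $FileShareCredential)

        Import-DscResource -ModuleName PSDesiredStateConfiguration

        Node 'localhost'
        {
            Package LegacyApp
            {
                Ensure     = 'Present'
                Name       = 'Legacy App'   # display name the installer registers
                Path       = '\\mystorageacct.file.core.windows.net\installers\setup.exe'
                ProductId  = ''             # empty for EXE installers; use the MSI GUID for MSIs
                Arguments  = '/quiet'       # whatever silent switches the EXE supports
                Credential = $FileShareCredential   # used to reach the file share
            }
        }
    }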
You should also consider Azure Automation, depending on how large your automation projects are. You can find more info and tutorials here.
We've got some legacy on-premises apps that we're looking at moving off-site, and we are evaluating all our options. I understand that Azure Web Sites would be a lot easier to set up, but at this point it looks like we may need some of the additional control that Cloud Services gives us.
However, everything I've read about Cloud Services so far demonstrates how you build an app and then deploy the build to the cloud. Similarly, you can connect to a Visual Studio Online repository, define builds in VSO, and after a commit, a build is performed and the build is deployed to the cloud.
However, in our case some of our pages are Classic ASP pages. In the event that one of these pages changes, I have not been able to figure out a workflow that allows us to deploy the updated files. Remember, classic ASP files do not have a "build" process; they are like PowerShell scripts, interpreted at runtime.
There is no Visual Studio solution or project involved with these apps. It's just a package of files we want to upload. For a "proof of concept" I decided to start with the simplest possible "app," a simple "hello.txt" file, and I have not been able to figure out a way to deploy this without "wrapping" it in a Visual Studio solution.
I was hoping that I could use, e.g., Publish-AzureServiceProject; however, this appears to need a ServiceDefinition.csdef file, and again, I'm not sure how to do this without setting up a solution in Visual Studio--a solution that wouldn't be used for anything.
I have a feeling I'm missing something and just need to find the appropriate publish settings file, or proper use of an Azure cmdlet. Is there a straightforward way to publish a package of files to an Azure Cloud Service?
Josh, you will need to package the files into a deployable package. This can be achieved using the cspack command-line tool and a hand-crafted definition file. Your ASP files would be treated as 'content' in this case.
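As a rough sketch, assuming the Azure SDK (which includes cspack.exe) is on the PATH and your ASP files plus hello.txt sit under a .\content folder; the role, site, and file names are illustrative, and the exact switches can vary by SDK version, so check the cspack documentation:

    # Write a minimal hand-crafted service definition: one web role, one HTTP endpoint.
    $csdef = '<?xml version="1.0" encoding="utf-8"?>
    <ServiceDefinition name="ClassicAspService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
      <WebRole name="WebRole1" vmsize="Small">
        <Sites>
          <Site name="Web">
            <Bindings>
              <Binding name="HttpIn" endpointName="HttpIn" />
            </Bindings>
          </Site>
        </Sites>
        <Endpoints>
          <InputEndpoint name="HttpIn" protocol="http" port="80" />
        </Endpoints>
      </WebRole>
    </ServiceDefinition>'
    Set-Content -Path .\ServiceDefinition.csdef -Value $csdef

    # Package the .\content folder as the site's content.
    & cspack .\ServiceDefinition.csdef `
        '/role:WebRole1;content' `
        '/sites:WebRole1;Web;content' `
        '/out:ClassicAspService.cspkg'

The resulting .cspkg plus a matching .cscfg can then be uploaded through the portal or with the service management cmdlets (e.g. New-AzureDeployment).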
The easiest way would be just to create a stub Visual Studio solution and include a 'Cloud Service' project to which you add all the ASP files. This way all your files will be redeployed in the event that your web roles require recycling by the Azure fabric.
While this might seem like a big overhead if you need to tweak just a single file, it is the correct way to manage PaaS deployments in Azure. If this process doesn't work for you then you should consider moving to an IaaS VM you fully manage yourself.
One thing that may be helpful is to realize that the web roles in Cloud Services are just VMs running IIS. For that reason, you can connect to them just like any other server, via RDP, FTP, etc. Our team often bypasses the overhead for simple things, like deploying a new CSS file, an image, etc., by simply copying it over the old-school way.
Again, not sure if this helps you, but old school techniques work just as well. :-)
I have several ASP.NET applications running under a common IIS site and would like to use a NuGet package to transform their config while copying versioned JavaScript files to the IIS site's root application Content folder.
Can NuGet be used like this? I.e., can it be determined during package installation/updating which IIS site's root application an ASP.NET project is running under (using the preinit PowerShell script, for example)?
The tool you are looking for is Octopus Deploy. It installs NuGet packages on IIS, as Windows services, etc. Moreover, it gives you a really nice GUI for management.
NuGet is a dependency manager, and as such is appropriate for use at development/build, not deployment time.
Take a look at Inedo's BuildMaster. It can take care of the process from source control through production deployment. There's also a free version that will most certainly handle your requirements, and it also has a module to manage your configuration files so you don't need to worry about doing transforms.
(disclaimer: I work for Inedo)
We are currently using Web Deploy for creating packages for our .NET web applications. It has some pros and cons. Now we are going to use NuGet for dependency management, but given the ease of packaging in NuGet, I am debating whether I should still use Web Deploy + the remote service, or try to use NuGet to bundle my web application and use PowerShell or something like Octopus to deploy.
For me, Web Deploy becomes a little complex when I try to do even simple things like include/exclude rules, the GAC, the registry, or IIS config (which, again, is not very flexible).
But on the other hand, it comes with the remote service, and all I need to do is throw the package at the service and I am done.
Please give your input on this comparison.
-raj
NuGet is a dependency manager, and as such is appropriate for use at development/build, not deployment time. From a deployment perspective, it doesn't offer you any more than what a zip file does... except all the overhead of trying to fit NuGet in the process.
Take a look at Inedo's BuildMaster. It can take care of the process from source control through production deployment. There's also a free version that will most certainly handle your requirements, and it also has a module to manage your configuration files so you don't need to worry about doing transforms.
(disclaimer: I work for Inedo)
I am setting up a system that uses NuGet and Web Deploy.
NuGet is used as the repository format, so the build system publishes NuGet-packaged artifacts.
The deployment system uses NuGet to get the RIGHT packaged artifact from the NuGet repository.
The build artifact happens to be a Web Deploy package, for easy installation.
I am still working out whether I am going to use Puppet, Chef, or Octopus to orchestrate the deployment.
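To make that concrete, a sketch of the deployment step (the package ID, feed URL, path inside the package, server address, and credentials are all placeholders):

    # Pull a specific versioned artifact from the NuGet repository.
    $packageId = 'MyWebApp'
    $version   = '1.2.3'
    & nuget.exe install $packageId -Version $version `
        -Source 'https://nuget.example.com/feed' -OutputDirectory .\packages

    # Hand the Web Deploy package inside it to msdeploy for the actual install.
    $zip = ".\packages\$packageId.$version\content\MyWebApp.zip"
    & "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync `
        "-source:package=$zip" `
        "-dest:auto,computerName='https://webserver:8172/msdeploy.axd?site=MyWebApp',userName='deploy',password='secret',authType='Basic'" `
        -allowUntrusted

Whichever of Puppet, Chef, or Octopus ends up orchestrating it, the underlying work is roughly these two calls.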
Newbie to automated Azure deployment here! I have the happy task of automating our deployment to the cloud. I have also done some reading and discovered that the two main tools are MSBuild and PowerShell. Could anyone please tell me why I would use one over the other, or indeed if there are any better ways to automate the deployment? Keep in mind that my main concern is performance, and I need this deployment to be as fast as possible.
Any insight would be most welcome.
I'm a fan of using PowerShell for deployments. It's pretty quick to set up and the script can be pretty straightforward.
MSBuild can be great too. I use MSBuild from TFS Team Build to kick off a PowerShell script to do the deployment. Works like a champ.
A good starting point would be http://blogs.msdn.com/b/tomholl/archive/2011/12/06/automated-build-and-deployment-with-windows-azure-sdk-1-6.aspx. This blog does a great job of showing you how to build and deploy with Team Build.
If you don't want/need the Team Build and MSBuild part, then just look at his PowerShell script. That covers the basics of getting a deployment from your dev environment to Windows Azure.
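For reference, the core of that kind of script with the (classic) service management cmdlets is fairly short. A minimal sketch, assuming you've already imported your publish settings with Import-AzurePublishSettingsFile; the subscription, service, package, and configuration names below are placeholders:

    Import-Module Azure
    Select-AzureSubscription -SubscriptionName 'MySubscription'

    $serviceName = 'mycloudservice'
    $package     = '.\MyApp.cspkg'
    $config      = '.\ServiceConfiguration.Cloud.cscfg'
    $label       = "Automated deploy $(Get-Date -Format s)"

    $existing = Get-AzureDeployment -ServiceName $serviceName -Slot Production -ErrorAction SilentlyContinue
    if ($existing) {
        # Upgrade the running deployment in place
        Set-AzureDeployment -Upgrade -ServiceName $serviceName -Slot Production `
            -Package $package -Configuration $config -Label $label
    } else {
        # First deployment to this slot
        New-AzureDeployment -ServiceName $serviceName -Slot Production `
            -Package $package -Configuration $config -Label $label
    }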
You should use Web Deploy; it only takes about a minute to deploy a fix. See these links:
http://blogs.msdn.com/b/cloud/archive/2011/04/19/enabling-web-deploy-for-windows-azure-web-roles-with-visual-studio.aspx
http://channel9.msdn.com/Blogs/funkyonex/Speed-Up-Azure-Deployments-with-the-New-Web-Deployment-Feature
At SplendidCRM, we had a similar need to automate deployments to Azure, but as our need was to service our live customers, we had to develop using C#. We have been watching Azure for many years, but it was not until they provided a DNS service that it made sense to make the move. Using the Azure Resource Manager (ARM) libraries, we were able to automate VM creation, SQL database creation, and DNS name creation. In addition to the Microsoft documentation for ARM, we found it particularly useful to be able to get the Microsoft source code for the PowerShell scripts that wrap ARM, because the documentation does not always provide a complete set of settings.
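For anyone who would rather stay in PowerShell than C#, the same kind of provisioning can be sketched with the ARM cmdlets; the resource group, server, zone, and record names below are placeholders:

    Login-AzureRmAccount
    New-AzureRmResourceGroup -Name 'crm-rg' -Location 'East US'

    # SQL server and database
    New-AzureRmSqlServer -ResourceGroupName 'crm-rg' -ServerName 'crm-sql-01' `
        -Location 'East US' -SqlAdministratorCredentials (Get-Credential)
    New-AzureRmSqlDatabase -ResourceGroupName 'crm-rg' -ServerName 'crm-sql-01' -DatabaseName 'crm-db'

    # DNS zone and an A record pointing at the new deployment
    New-AzureRmDnsZone -Name 'example.com' -ResourceGroupName 'crm-rg'
    New-AzureRmDnsRecordSet -Name 'app' -RecordType A -ZoneName 'example.com' `
        -ResourceGroupName 'crm-rg' -Ttl 3600 `
        -DnsRecords (New-AzureRmDnsRecordConfig -IPv4Address '10.0.0.4')

    # VM creation follows the same pattern (New-AzureRmVMConfig, Add-AzureRmVMNetworkInterface,
    # New-AzureRmVM) but needs network and storage objects too, so it is omitted here.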
In the end, we decided to release the Azure deployment code as part of a new Ultimate edition that combines order and customer management with software deployment.