What guarantees are there about the Windows Azure runtime components?

What's the minimal guaranteed list of functionality a Windows Azure instance will provide?
For example, I want to use PowerShell from inside my Azure role. Currently, all Windows Azure instances do have PowerShell. What if Microsoft suddenly decides to stop deploying PowerShell by default?
Is there a list of guaranteed components and functionality that I may assume unconditionally available inside Windows Azure instances?

There is no guarantee with respect to the features currently available in Windows Azure. Microsoft, being a late-comer to the cloud game and an underdog, is unlikely to cut features and far more likely to keep adding them, but there is certainly no guarantee.
We do know that Azure instances run Windows Server 2008, and you can lock them to a specific OS version and patch level so that nothing is removed out from under you.

Interesting post. I'm not sure MSFT has published a complete list of what's on offer (I do remember this being talked about in the early days when it went to general release); it would be nice to see one. (*cough* Mark Russinovich!)
Re: PowerShell, it's not going anywhere fast; it's a core tenet of the Windows OS and I could bet my house on it.
Startup tasks are your best option for ensuring you have what you need in Azure. For instance, I install the Java JVM on the box (VM role) during a startup task; it's not there natively, but it is by the time my code runs! A hedged sketch of such a task follows the link below.
http://www.davidaiken.com/2011/01/19/running-azure-startup-tasks-as-a-real-user/
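For illustration, a hedged sketch of such a startup-task script - the blob URL, file names, install path and the '/s' silent switch are all placeholders, not real endpoints; the task itself is declared in ServiceDefinition.csdef with executionContext="elevated":

```powershell
# InstallJre.ps1 - hypothetical startup-task script invoked from a .cmd startup task.
$ErrorActionPreference = 'Stop'

$installerUrl  = 'https://myaccount.blob.core.windows.net/installers/jre-setup.exe'
$installerPath = Join-Path $env:TEMP 'jre-setup.exe'

# Make the task idempotent: startup tasks re-run on every reboot/reimage.
if (-not (Test-Path 'C:\Java\jre\bin\java.exe')) {
    (New-Object System.Net.WebClient).DownloadFile($installerUrl, $installerPath)
    # Run the vendor installer silently; the real silent switch varies by installer.
    Start-Process -FilePath $installerPath -ArgumentList '/s' -Wait
}
```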


What is the benefit and real purpose of program installation?

For all the programs I've written so far, if I want one to work on another workstation, I just copy the executable and the files needed to make it run (e.g. .o files, binary files, ...).
But programs built for commercial use always come with an installer - PC games, for example. So my question is: what are the main benefits of / reasons for doing an installation when we could simply copy the files over to the target workstation?
One of the reasons is probably to prevent piracy, but other than that, I'm sure there are stronger reasons?
The Complexity of Deployment
Only the simplest applications can work with a simple file copy, and even then you need a convenient way to actually download and copy the files to the right location - and this is what a setup is for. The setup is also a marketing tool that can be used for branding and consistency across products, as well as allowing installation of a trial version of the product - a very important part of selling software.
Finally, a setup provides upgrade and patching features for new versions, as well as uninstall and cleanup of the system when the user wants to remove your software. A good setup may also be signed with digital certificates to ensure the files cannot be tampered with in transit, and that the vendor can be verified and is hence serious. All of these things are crucial for a serious product.
It is important to remember that the setup experience is the user's first encounter with the quality of your product. If the setup fails, the product can't be evaluated at all. This would seem to be the most expensive error to make in software development.
Errors in deployment are cumulative in the sense that once you have deployed an error, you generally have no access to the machine in question for debugging - and the fix could easily do more damage. You are managing a delivery process, not just debugging code and binaries. Each delivery adds risk and complexity, and pretty soon you can have an unmaintainable mess on your hands if you are not careful. Furthermore, every machine your setup runs on will almost certainly be in a different state from every other.
Deployment (setups) is therefore the complex process of migrating any computer from one stable state to another. This requires a disciplined approach. The setup should install all required files and settings and ensure the product is configured for first launch, or ready to be configured upon launch without failure. This can be a very complex task. The list of things a setup may need to do is growing all the time, and with every new version of Windows new obstacles seem to be put in place that make deployment harder. Such obstacles include UAC prompts, self-repair lockdown on terminal servers, changed core MSI caching behavior, new folder redirects, virtualization features, new and changed signing features with encryption and digital certificates, ActiveX killbit security lockdown, 64-bit complexities, etc... The list goes on.
Application virtualization is a big issue these days. It essentially decouples computer programs from the underlying operating system on which they are executed. This still involves a deployment package for your application, but a fully virtualized application is not installed in the traditional sense: it behaves at runtime as if it were directly interfacing with the original operating system and all the resources managed by it, but it can be isolated or sandboxed to varying degrees.
An Overview of Deployment Tasks
The tasks and features needed in a setup range from the very fundamental and basic with built-in Windows Installer or third party tool support, to the highly customized ad hoc solutions where you have to actually code something yourself to deal with unique deployment requirements.
Deployment tools really contain most of what you would ever need for any deployment, but certain things are still coded on a case-by-case basis. These ad hoc solutions are implemented as "custom actions" in Windows Installer, and they are without a shadow of a doubt the leading cause of deployment failures. See the "Very Advanced" section for more on custom actions.
Overuse of custom actions and a lot of ad hoc coding tends to indicate flawed application design, but in certain cases you are just dealing with new technology and have to roll your own solution to get your product deployed. This is exactly what custom actions are for. Over time, standardized solutions should be created and preferred, and small changes in application design can often eliminate complicated custom actions. This is a very important fact about software deployment - there are so many variables that one should opt for simplicity whenever possible.
At a basic overview level, deployment must account for:
Setup Fundamentals
All third party tools provide good support for these setup fundamentals, but there are some differences. The installation of prerequisites may be the area where third party tools and free frameworks like WiX differ the most in terms of ease of use - at the time of writing. The support is there, but it can be a little bit challenging to set up.
Check if the system is suitable for installation of the package in question (a hedged script sketch of such checks follows this list):
Disk space.
OS type & version.
Language version.
Computer architecture x86/x64.
Unsuitable platforms: Thin Client / Citrix / Terminal Services
Customized setup required due to custom lockdown.
Maybe even a malware check (I wish - malware can cause mysterious deployment problems).
etc...
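A hedged sketch of such checks in script form - the thresholds (2 GB free, Windows 6.0+) are examples only, and in a real MSI most of these would be launch conditions rather than script:

```powershell
# Pre-flight suitability checks; fail fast with a clear message.
$os   = Get-WmiObject Win32_OperatingSystem
$disk = Get-WmiObject Win32_LogicalDisk -Filter "DeviceID='C:'"

if ([Version]$os.Version -lt [Version]'6.0') {
    throw 'This package requires Windows Vista / Server 2008 or later.'
}
if ($disk.FreeSpace -lt 2GB) {
    throw 'At least 2 GB of free space is required on C:.'
}
# Note: reports x86 when run from a 32-bit process on a 64-bit OS.
if ($env:PROCESSOR_ARCHITECTURE -ne 'AMD64') {
    throw 'This package requires an x64 operating system.'
}
# Language checks, Terminal Services / Citrix detection, etc. would follow here.
```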
Scan for presence and if necessary install prerequisites and runtimes.
Allowing easy deployment of prerequisites and runtimes is a task with extensive support in third party deployment tools; there is limited support for it in Windows Installer itself. The basic feature for runtime distribution in Windows Installer is the merge module - essentially the "include file equivalent" for MSI files, and the standard way to deploy shared files. A merge module is compiled into your MSI at build time - sort of early binding, in developer terms.
Some prerequisites are installed via Windows Installer merge modules. Others are generally installed using their own setup file (various formats).
Examples: ActiveX for games, Crystal Reports, Microsoft Report Viewer Runtime, MySQL, SQL Server Runtime, VB6 Runtime, ASP.NET MVC Runtime, Java Runtime, Silverlight, Microsoft XNA, VC++ Runtime, .NET runtime versions, Visual Studio Tools for Office Runtime, Visual F# Runtime, MSXML Runtime, MS Access Runtime, Apache Tomcat, various Primary Interop Assemblies, PowerShell versions, etc...
Finally, several core Microsoft components, such as Windows Installer versions and PowerShell versions, generally come down via Windows Update and might be better to exclude from your setup (just check for their existence, and tell the user to run Windows Update if a component is missing). Actual practice here varies.
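A hedged sketch of that "detect, don't install" approach:

```powershell
# $PSVersionTable does not exist on PowerShell 1.0, so its absence means v1.
if (-not $PSVersionTable -or $PSVersionTable.PSVersion.Major -lt 2) {
    Write-Warning 'PowerShell 2.0 or later is required - please run Windows Update.'
}

# The Windows Installer engine version can be read from msi.dll itself.
$msi = (Get-Item "$env:SystemRoot\System32\msi.dll").VersionInfo.ProductVersion
Write-Host "Windows Installer engine version: $msi"
```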
Provide a GUI suitable for input of required settings from the user.
It is common practice to enter and validate license keys in a setup.
Personally I think this is better done from the application itself for both practical and security reasons - making piracy more difficult, allowing trial installs, reducing excessive setup support calls (you wouldn't believe it...), etc...
For complex setups a lot of GUI could be required to gather deployment settings - particularly for server setups with IIS, MS SQL, COM+ and other advanced components.
Allow installation in silent mode for corporate use.
Extremely important - all corporate deployment is automatic and silent (no GUI shown during installation), except certain server installs.
Smaller companies may run your setup in GUI mode. In my experience they generally do.
Home users generally always run your setup in GUI mode.
Know your target group, and definitely make sure you support silent running if you target corporate customers. In any case, all setups should work in silent mode, and if you follow MSI design rules and best practice this "comes for free". A minimal silent install command line is sketched below.
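For reference, the standard silent command lines look like this (the property name INSTALLDIR and the product code GUID are placeholders; real public properties vary per package):

```powershell
# Silent install with verbose logging - no GUI shown at all.
msiexec /i "MyProduct.msi" /qn /l*v "$env:TEMP\MyProduct-install.log" INSTALLDIR="C:\Program Files\MyProduct"

# The matching silent uninstall, by product code.
msiexec /x "{00000000-0000-0000-0000-000000000000}" /qn
```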
Adding Basic Stuff
These basic tasks have full support in the Windows Installer engine itself, and all third party tools provide fairly equivalent support for all of them despite variations in GUI features and ease of use.
Install files and registry settings.
Install ODBC, file associations, shortcuts and icons.
Update application and system-wide path settings.
Update and merge text based files such as INI files.
Register COM files and enable .NET COM Interop if need be.
Install .NET assemblies to the GAC, and run custom .NET installer classes.
Install side-by-side Windows assemblies to WinSxS.
Deliver signed and certified files (also applies to the setup file itself).
Install and control Windows services.
Install Control Panel Applets.
Update environment variables.
I won't dwell on these issues or flesh them out with too many details. All of these deployment tasks should be reasonably well supported in all deployment tools and frameworks available. However, many people mess up their deployment by not using the built-in deployment features and instead relying on custom actions for such trivial tasks - pure added risk for no gain whatsoever.
In particular, we often see custom actions used to install Windows services - and this is usually a sign of a very badly designed service, or at other times just ignorance of how to do deployment (both together is also common). Deploying such a service often involves applying custom ACL permissions and modified NT privileges to make the service run with user rights instead of as LocalSystem - which is generally the only correct way to run a Windows service. Running a service with user credentials is a "deployment anti-pattern" worth mentioning in passing (more on this later).
Another common custom action use that is always wrong is installing files to the GAC via a custom action. There is good built-in support for this in Windows Installer, and any excuse to install via a custom action instead is almost certainly hiding a bad design or some generalized madness :-). It is also a fact that many deploy far too many things to the GAC overall, but that is a development issue: When should I deploy my assemblies into the GAC?
Finally, .NET installer classes are intended for developers to test their components during development - they should not be used for deployment. They are essentially just the .NET equivalent of self-registration (which is also not acceptable for an MSI - you need to extract the COM information and add it to the MSI tables - see link for details). An MSI is declarative - it should contain all changes to be applied to the system so that proper rollback and management can be ensured. So the message is that .NET installer classes should only be used for development and testing. Once you build an MSI to deploy your application, you should use MSI constructs to achieve proper deployment with rollback support and intelligent management. We see these .NET installer classes used mostly for service and GAC installs. In an MSI this translates to using the ServiceInstall and ServiceControl tables for services, and simply marking a component for GAC install (it must be a signed assembly) to install it to the GAC. Once you know how, this is easy, and you won't miss the .NET installer classes because MSI works like "automagic" when you do it right. You get reliable rollback for free, with ease.
Adding Advanced Stuff (often server stuff)
Despite support in all deployment tools for most of these issues, I have often found that I needed to implement custom actions and ad hoc solutions to achieve proper deployment in certain cases. This is particularly the case for COM+ and IIS deployment. WiX provides highly customized support for both types of deployment, but I have limited experience using it.
The update and installation of XML files is a task supported by each deployment tool, since there is no built-in support for this in the Windows Installer engine - which is quite amazing at this point.
With regard to database installation, and particularly database updates, I am on the fence: perhaps they should be done from the application, with proper user authentication and interactive use, instead of as a "one shot", impersonated deployment operation that might fail without good exception management or recovery options. In other cases it seems updates should be a managed process involving users raising corporate tickets that are handled by professional DBOs. Some more details below.
Configuration of IIS, Apache, or other web servers.
This is a whole world of its own, particularly with regards to IIS. I have found deployment tools lacking in features to deploy sites as requested by developers and corporate teams.
Though largely untested by me, the WiX framework provides a very flexible implementation of IIS configuration and deployment.
I expect a lot of custom actions are in use to achieve special deployment configurations.
Run SQL server scripts against databases.
Create a db, connect to a db, update a db, run stored procedures, maybe even trigger backups or schedule new tasks, etc... I don't know everything people do here (a hedged sketch of scripted database work follows this list).
Should this be done in the application instead, or by a DBO? That seems much more reliable. A setup is "one shot"; an application can be restarted so you can try again - much better exception handling.
Plus, an MSI setup's GUI is severely limited in the events it can handle due to the overall MSI design (proper Win32 dialogs can be spawned from the limited MSI GUI, but it takes a lot of effort - I have only done it once).
Crucially a setup can run with elevated rights, but that is just on the local machine. Authentication is still needed against the database (unless Windows Authentication is used).
A database update is a transaction of its own that would run as part of the overall Windows Installer transaction. It is not obvious how to handle errors, or what to do in terms of rollback if the installation fails.
Needless to say this can all get very complex to handle in your setup. It is an (enterprise) configuration task in my view, not just a deployment task. Insightful comments very welcome on this issue - I am on the fence with regards to best practice.
If you are delivering a client / server solution to your customers and need a way to set up the (server side?) databases "fresh" with defaults to help your customers "get going" with your solution, then database deployment definitely makes sense to me. But update scripts run as part of installation targeting existing databases would worry me in terms of reliability and management - not to mention safety.
For corporate database updates it would seem a proper process involving a DBO would be more secure. They can run a proper backup before updates are applied and then true rollback is in place if problems are found in UAT.
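If you do run database work from script at deployment time, here is a hedged sketch (server, database and script path are placeholders; note that SqlCommand does not understand the "GO" batch separator, so scripts must be split into batches first):

```powershell
# Run a schema script inside an explicit SQL transaction, outside the MSI transaction.
$ErrorActionPreference = 'Stop'
$connectionString = 'Server=DBSERVER01;Database=MyAppDb;Integrated Security=True'
$sql = [IO.File]::ReadAllText('C:\Deploy\update-schema.sql')

$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$connection.Open()
$transaction = $connection.BeginTransaction()
try {
    $command = $connection.CreateCommand()
    $command.Transaction = $transaction
    $command.CommandText = $sql
    [void]$command.ExecuteNonQuery()
    $transaction.Commit()       # commit only if every statement succeeded
}
catch {
    $transaction.Rollback()     # explicit, verifiable rollback on any failure
    throw
}
finally {
    $connection.Close()
}
```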
Installing ActiveX browser components (certificate based through browser).
Install of signed CAB file downloaded from a Web page (admin only, can be captured as an MSI for mass deployment with elevated rights).
Defaults to install in "C:\Windows\Downloaded Installations".
Complications can arise if the version in the CAB file differs from the version requested by the Web page (this triggers CONFLICT folders to be generated as the install keeps re-running).
Update and merge XML files.
Advanced because it is (amazingly) not natively supported by Windows Installer.
Supported with extensions by both WiX and third party deployment tools.
Configure and control COM+ components.
Tech note: I have failed several times to achieve this properly with several third party tools. There seems to be an overall lack of required features.
I normally end up manually configuring the COM+ application and then exporting an MSI from the Component Services administrative tool that is then used for deployment.
This exported MSI is not good at all - it is fragile if you try to make any modifications. It contains an undocumented .apl file with the application's attributes, and any dependent DLL or data files are not auto-included.
WiX provides support for COM+ (not tested at all by me). I hope it is good :-).
Just for reference: Understanding COM+ Application Installation.
Add custom event logs, set up performance monitors, add firewall rules, and other windows extensions. Supported by most deployment tools these days - including WiX. These features are not natively supported by the Windows Installer engine.
Set up connections to mobile devices and deploy.
Can involve "some strangeness" and weird proprietary solutions.
A custom, native dll might be required to achieve smooth deployment (Pocket PC back in the day - not sure how things work these days).
Install drivers of various kinds.
Much easier and more reliable now for signed drivers than before.
Supported by all third party tools and WiX (using dpinst.exe in the background).
Hooking up the application to advanced server features (deployed separately).
Automatic update systems.
License servers. Floating licenses, or regular licenses.
Online resources of various types. Help, templates, discussions, SDKs, developer tools, etc...
Online stores.
Most of the time this just involves setting a link or registry key to point to the server resources, but sometimes it is more complex.
Adding Very Advanced Stuff (custom actions)
When there is no built-in support for a certain operation or task in Windows Installer itself, or in any of the various third party tools available, you are left having to implement the feature yourself.
When you use Windows Installer, this involves running custom actions of various types (custom actions are Windows Installer's mechanism for running custom, executable installation logic during the installation).
Custom actions are purpose built executables (binaries: dll, exe) and scripts capable of making advanced modifications to the system during installation that are not supported by Windows installer natively or by the deployment tool in use (WiX, Installshield, Advanced Installer, etc...).
Custom actions that make modifications to the system run with elevated rights so that changes can be made to the system even if the logged on user does not have admin rights. There is essentially no limit to what these custom actions can do. They are armed and dangerous.
Custom actions are the leading causes of deployment errors and failure.
Hands down. If an MSI install fails it is most often related to a failing custom action.
Custom actions are difficult to write and debug due to the complexity of Windows Installer. They must be used only when necessary and they must be written with full rollback support so that they are capable of undoing all changes that were applied to the system in case the installer fails and must roll back changes.
This is hard and difficult work and custom actions are a big, complex and error prone issue - a can of worms.
Often minor application design changes can allow custom actions to be replaced by standard MSI features, or various MSI extensions available in third party tools and in WiX.
Executables and scripts that run correctly on their own may fail when run as part of an MSI due to the complex impersonation, elevation and runtime design of Windows Installer. These are not trivial things to get right. An MSI install is an intricate transaction with elevated and impersonated sequences that is very hard to deal with.
Custom action types
Windows Installer supports custom actions implemented as purpose built, native (win32) executables and dlls as well as scripts such as JavaScript or VBScript.
Some even use .NET binaries (C#, VB.NET, DTF, etc...) for custom actions - this is not recommended due to their prerequisite need for the .NET Framework. These binaries are referred to as "managed code" and can't run without the correct .NET Framework installed.
Finally, there are PowerShell custom actions, which are both scripts and managed code combined - and they should not be used either, since they also require the .NET Framework.
In the future, when the .NET Framework may be guaranteed to exist on all Windows computers, managed code might become a viable option for general use, but as of now the consensus seems to be that these actions are too risky and unreliable.
Common sample custom actions (certain custom tasks are frequently implemented as custom actions because they are frequently needed but not natively supported by Windows Installer):
Manage Windows Shares (usually create).
Apply custom ACL permissioning (there is some built-in MSI support for this).
Modify NT privileges.
Configure DCOM.
Manage groups and users.
Configure per-user Office Addins.
Persist installer properties (for repair and reinstall).
Custom and company specific launch conditions.
IP-configuration redirects for IIS.
Encrypt or obfuscate content for data security.
Etc...
Most of the custom functionalities mentioned above are now available in the WiX framework as a custom C++ dll - and other tools have some similar, custom features. You should always prefer these ready-made solutions to your own custom actions since rollback is properly implemented in WiX and the implementation is well tested.
Applying custom ACL permissions and modifying NT privileges are considered "deployment anti-patterns" by most deployment specialists. The requirement to do so indicates poor (lazy) application design.
Custom action summary.
Writing a custom action on your own should be a rare event that is unique and that has not been done (better) before.
Minor application re-design can often eliminate unwise and complex deployment constructs. In fact, almost always.
For example: application configuration should happen on first application launch, and not during the setup.
The setup should prepare the application for first launch, and perform tasks that require elevated rights (only).
User data initialization is a particularly bad thing to use setup scripts to perform. All of this should be done in the application launch sequence.
You should enforce proper rollback support.
This is complex and hard work.
Almost all script custom actions I have seen do not implement rollback at all.
You should write with minimal dependencies.
Preferably use C++ or Installscript or maybe JavaScript (only for internal, corporate deployment in my view). Avoid VB Script, and definitely avoid .NET code in C#/DTF or PowerShell scripts. There is some discussion on the issue of managed code. MSI experts like Chris Painter believes C#/DTF custom actions are ready for prime time, whereas the general consensus seems to be to err on the side of caution and rely on C++ dlls until a proper .NET runtime environment can be guaranteed. Here is a long-winded "discussion" of this issue: Windows Installer fails on Win 10 but not Win 7 using WIX
Robust code is difficult to write in script. Scripts are fragile, hard to debug, lack advanced language features (particularly error handling), and are vulnerable to anti-virus blocking.
The only real advantages of scripts are that they are transparent and inspectable and the whole source is embedded in the MSI file (no version control issues). Corporate teams that hand off work to each other frequently might use JavaScript (there is a lot of legacy VB Script use as well - but that language is very poor for error handling).
Managed code has runtime requirements that can't be guaranteed at the time of writing - and this has been the situation for a very long time now.
PowerShell is both managed code and a script. Avoid it. Installshield supports it as a type of custom action. It remains to be seen how successful it will be. I would never use it unless forced to.
And much more...
Additional Complications for Deployment
There are many additional complications when delivering a professional setup, such as delivering setups in different languages (localization), branding setups for different resellers (OEM), ensuring the setup works on all required operating systems in different language versions, delivering separate setups for x86 and x64 machines, delivering a scaled-down "viewer version" of the application, making combined setups for client and server installations (which can be run on both the server and the client, installing different components - not recommended if you ask me - details), and not to mention deploying to different embedded devices such as phones, Pocket PCs, smartphones, etc...
Certain "deployment anti-patterns" are also problematic to deal with (the linked answer is an "experiment" and I am not too happy with it - a work in progress - but it is intended as a checklist to help developers avoid really common deployment problems). These are bad constructs required in setups to make poorly designed applications run properly. They include things such as applying custom permissions (write access in otherwise locked-down paths, etc...), customizing NT privileges (typically "run as service" for a user account, or much worse), or excessive use of complex custom actions that make unpredictable changes to the system (these can really be anything, and can be very dangerous). Messing up the silent install is also a huge, common problem - it is terrible for corporate use of your setup. Deploying excessive amounts of user-specific data with your setup can also be problematic (hard-to-control complications). And there are many other, more specific problems to deal with.
Here is a post with the overall issue of setup and deployment seen in the larger context of application marketing and sales.
Doing Your Own Deployment
You will need a tool or a framework to deliver your own setups. Here is an answer describing different tools used to create installers: What installation product to use? InstallShield, WiX, Wise, Advanced Installer, etc. All attempts have been made to make the descriptions as objective as possible - describing real world experience with positives and negatives.
The commercial tools described in the link above are most excellent tools - and they tend to speed things up with good GUIs and ready-made solutions for common requirements, but developers should consider trying WiX - the new way to create MSI files. Please read this post for background information:
Windows Installer and the creation of WiX (read this if you are trying to "find your feet with WiX" and want to understand what the technology is all about and where it is coming from).
WiX has a learning curve but is "developer friendly" in many ways. For one it is a project type in Visual Studio (once you install it), and it allows a setup to be defined in XML and compiled to MSI as you would a normal binary. This allows proper source control, branching and collaboration. Plus it is free and open source. I feel it is OK to recommend a free framework, especially since it is well maintained. Expect a learning experience though. Here are some suggestions for a "flying start" with WiX.
Many programs make use of graphics, sound, and other drivers which are supplied and maintained by third parties. In many cases, these drivers may use underlying hardware or other system features in ways that Windows itself knows nothing about. If two programs, each with its own driver and unaware of the other's existence, tried to use the same hardware, they would likely interfere with each other in unpredictable, undesirable ways (e.g. one might overwrite graphical textures loaded by the other). To avoid such problems, Microsoft recommends that applications install drivers in such a way that two programs that need the same driver can share a single driver instance.
The approach Microsoft takes is not the only means of ensuring that multiple programs using the same hardware go through the same driver. A system could also have programs temporarily load drivers when they start, and have drivers automatically unload when they're done. The difficulty with that approach is that if a program which uses an old driver is launched, and while it is running a program which needs a newer version of that driver is launched, the new program would not be able to run unless or until the old program shut down its driver and switched to using the new one. Such a hassle is probably unavoidable, but having to deal with it every time a program is launched would be far more bothersome than dealing with it only once, when the program is installed.
All that having been said, while it may be helpful to be able to install a program once and have any "driver" issues taken care of once and for all, there's also something to be said for being able to simply run a program without having to make "permanent" modifications to the system. There shouldn't be any particular obstacles to programs being able to use either "temporary" or permanent drivers, but I know of no particular efforts to facilitate such designs.
Besides copying the files for you, the installer may also add registry entries needed by the program (if any), add values to environment variables (PATH), and create icons on the desktop, so you don't have to do all this manually.
To quote Wikipedia, "Installation typically involves code being copied/generated from the installation files to new files on the local computer for easier access by the operating system." For simple programs there is no need to install anything, but more complex ones can update themselves, add links, etc. automatically if installed.

Azure - SSMS - PowerShell

I am working through my first Azure HDInsight tutorial. Can I do this without installing Azure Remote PowerShell on my local computer?
Can I use SSMS (2008 R2) to run the PowerShell? My first attempt at that led me down the path of using a database in Azure, but I do not think that is what I want to do (the tutorial describes setting up Storage (not a database) and then an HDInsight instance to interact with that Storage).
I am doing this tutorial: http://www.windowsazure.com/en-us/manage/services/hdinsight/get-started-hdinsight/
Thank you.
While you can use SQL Server and HDInsight together as part of a full pipeline, for the purposes of the getting started tutorial you want to think of them as two very different things.
The Storage referred to is a standard Windows Azure storage account, based on blobs. These then form a backing file system for the HDInsight cluster.
As far as using PowerShell goes, it is definitely the best and easiest way to submit jobs to an HDInsight cluster. I would also recommend using a regular PowerShell console or the PowerShell ISE to work with HDInsight, rather than the console available through SSMS, since the SSMS version won't load all the Azure modules by default. A hedged sketch of a job submission follows.
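For reference, a hedged sketch using the HDInsight cmdlets from the classic Azure PowerShell module (the cluster name and query are placeholders; this assumes you have already imported the Azure module and selected a subscription):

```powershell
$cluster = 'mycluster'   # placeholder HDInsight cluster name

# Define a Hive job against the sample table that ships with the cluster.
$hiveJob = New-AzureHDInsightHiveJobDefinition -Query @"
SELECT country, count(*) FROM hivesampletable GROUP BY country;
"@

$job = Start-AzureHDInsightJob -Cluster $cluster -JobDefinition $hiveJob
Wait-AzureHDInsightJob -Job $job -WaitTimeoutInSeconds 3600

# Print the job's standard output once it completes.
Get-AzureHDInsightJobOutput -Cluster $cluster -JobId $job.JobId -StandardOutput
```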
There are other ways to submit jobs if PowerShell is not your thing (if you are on OS X or Linux, for instance). You can use the REST API provided by WebHCat (documentation). If you're on Windows and prefer C# to PowerShell, you can also use the Windows Azure HDInsight Management Client from the Microsoft Hadoop SDK to submit jobs (available on CodePlex and NuGet). These require you to break out Visual Studio and write a short console program to submit your job, so they may be a bit heavy unless you're doing full-on C# streaming MapReduce and are already there.
If you're after a GUI-based approach to job submission for HDInsight, you're out of luck at the moment, but you might like to check out what my team is working on at Red Gate, which will help you with submitting Hive and Pig jobs.

Automatically uninstall unused applications in SCCM 2012

Is there a way to automatically uninstall unused applications in SCCM 2012?
I thought of a PowerShell script which lists the locally installed applications and compares them to the groups assigned to the user in AD. If there is an installed application without a group assignment, the application should be uninstalled.
I have had very little to do with PowerShell so far, and I want to ask whether this is possible?
Thanks
First of all you need to have software usage metering data - the corresponding feature should be enabled and configured in SCCM. Some usage statistics can probably also be gathered on the client side by enabling logging/metering built into the client OS (not sure about the latter at the moment).
Then, having software usage metering data, you can establish a threshold suitable for you (e.g. software is unused if it was never run at all, or never run within a certain period of time, etc.), then select and uninstall that software by running a script against all your machines/users.
This is just an outline of how it possibly could be done; a hedged sketch is included at the end of this answer.
Further reading (these describe exactly what you are asking about, using Orchestrator Runbook automation, including the ability for a user to opt out of an assigned uninstall):
1) Software Metering Deep Dive and Automation Part 1: Use It Or Lose It - The Basics
2) Software Metering Deep Dive and Automation Part 2: Use It Or Lose It - The Collections
3) Software Metering Deep Dive and Automation Part 3: Use It Or Lose It - The Orchestrator Runbook Automation
By the way, there is a little pitfall here: software metering just tracks runs of an app (and probably the time it has been running), but this does not always equal real application usage (an app may simply be configured to autostart but be ignored by the user).
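To give a hedged sketch of the script idea from the question - the "App-&lt;DisplayName&gt;" AD group naming convention is entirely hypothetical, and this assumes MSI-installed applications plus the RSAT ActiveDirectory module:

```powershell
Import-Module ActiveDirectory

# Applications "assigned" to the user = AD groups following a naming convention.
$assignedApps = Get-ADPrincipalGroupMembership -Identity $env:USERNAME |
    Where-Object { $_.Name -like 'App-*' } |
    ForEach-Object { $_.Name -replace '^App-', '' }

# Enumerate MSI-installed applications from the uninstall registry keys
# (avoids Win32_Product, which triggers an MSI self-repair on every query).
$keys = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
        'HKLM:\SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'

Get-ItemProperty $keys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -and $_.PSChildName -match '^\{.+\}$' } |
    ForEach-Object {
        if ($assignedApps -notcontains $_.DisplayName) {
            # No matching AD group - uninstall silently by product code.
            Start-Process msiexec -ArgumentList "/x $($_.PSChildName) /qn" -Wait
        }
    }
```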

Best practice deploying windows service

I'm looking for best practice in continuous delivery of windows services.
Currently we have a set of PowerShell scripts that uninstall, reboot, and install updates, but error handling is tricky. We are reviewing System Center, but are there any other options available for deploying a Windows service?
We've been using Presto since Dec 2011, and have done over 1,000 deployments. Most of what we deploy are Windows services.
What's nice is that we set up our apps and servers in Presto, then we can repeatedly deploy, to any server (or multiple servers at once), by just hitting a button. Presto will copy our official release binaries, update all of the items in our app config files, create and start the service, etc...
So, if you have an application that has 30 manual steps to deploying it, you can enter these steps in Presto, then it's done automatically for you after that.
It's worth a look: http://presto.codeplex.com/
Your most basic and generally accepted best option comes from this thread, which basically links to a Microsoft support article on creating an installer for the Windows service. A hedged script-based sketch of the redeployment step is below.
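As a hedged sketch of such a step (service name, release share and paths are placeholders; run elevated), with the error handling consolidated into one try/catch so the calling pipeline sees a clean failure:

```powershell
$ErrorActionPreference = 'Stop'
$serviceName = 'MyCompany.MyService'
$installDir  = 'C:\Services\MyService'

try {
    # Stop and remove the existing registration, if any.
    $existing = Get-Service $serviceName -ErrorAction SilentlyContinue
    if ($existing) {
        if ($existing.Status -eq 'Running') { Stop-Service $serviceName -Force }
        & sc.exe delete $serviceName | Out-Null
    }

    # Copy the official release binaries into place.
    Copy-Item '\\buildserver\releases\MyService\*' $installDir -Recurse -Force

    New-Service -Name $serviceName -BinaryPathName "$installDir\MyService.exe" -StartupType Automatic
    Start-Service $serviceName
}
catch {
    Write-Error "Deployment of $serviceName failed: $_"
    exit 1    # non-zero exit code so the deployment tooling registers the failure
}
```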

How to use PowerShell and PowerShell modules in the enterprise

Recently, I've joined the Windows team in my enterprise and with my developer background (Java, .NET and Web in general), I was pretty quickly interested in PowerShell. I can see its value over plain old batch files, VB, ... which is why I'd like to promote its usage and, little by little, push people to favor it over the rest unless there's a reason not to do so.
Deploying PowerShell seems pretty straightforward since we can easily approve the relevant patches in WSUS and configure the execution policy via GPO for the AD integrated servers.
My questions are in fact more about the distribution and usage of PowerShell and PowerShell modules (e.g. PSCX, PowerShellPack, home-made modules, ...).
For those of you who have already deployed PowerShell in your enterprise:
Do you have some sort of standard package for PowerShell with a set of modules that you deploy on each server? If you do, then how do you deploy new versions of the installed modules?
Did you put a central PowerShell repository in place where you store all your PowerShell modules? If so, is that repository accessible globally, or do you also have secondary repositories that you synchronize?
I'm pretty used to tools such as Maven, Ivy and other dependency management software, which is why I'm a bit disappointed by what PowerShell has to offer in this regard.
I've found a very nice article about this subject and will probably go down the same path, as it corresponds to my requirements.
Do you use WinRM? Do you connect directly from workstations or do you have central management servers? Did you limit access to WinRM to those management servers?
Do you use WinRM in a non-managed environment (servers not in an AD domain)? How do you configure WinRM?
We have a network zone in which the servers aren't part of an AD domain, thus I can't rely on the Kerberos authentication for WinRM.
Globally, what is your experience, are you satisfied with the results?
Edit:
Regarding question 2, we've decided to put a central repository in place.
The idea will be to have a main repository which will be under version control (Git) and to which we'll be the only ones with write access.
From that repository, we will copy the modules using an rsync-like tool (in our case that'll be robocopy) to other secondary repositories (which will be read-only copies). Only those repositories will be accessible by the clients (we'll just have to update the PSModulePath on those clients to make sure they can access the repository).
We'll also stage our releases, so the repository will contain multiple versions: Development, Integration and Production. A hedged sketch of the sync step follows.
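A hedged sketch of that sync step (the share paths are placeholders):

```powershell
# Mirror each staged repository to a read-only secondary; /MIR deletes
# files at the destination that were removed from the main repository.
$stages = 'Development', 'Integration', 'Production'

foreach ($stage in $stages) {
    robocopy "\\mainrepo\PSModules\$stage" "\\siterepo01\PSModules\$stage" /MIR /R:2 /W:5 /LOG+:C:\Logs\module-sync.log
    if ($LASTEXITCODE -ge 8) {   # robocopy exit codes of 8 or more indicate failure
        throw "Sync of '$stage' failed with robocopy exit code $LASTEXITCODE"
    }
}
```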
Let's cover each issue by category.
Evangelism
To start off interest in PowerShell to your coworkers, I would suggest starting off with the bread and butter of automation. Find a common pain point that is relatively easy to implement (to get something out there in front of your coworkers quickly) and automate it with PowerShell. Then expand from there.
Another good idea is to start a "Script Club" at your office where you do some training and share ideas or problems about scripting in PowerShell. You can start out with once every few weeks and see how it goes. At my work, we have a book club where we go through various technical books on testing, design, and programming, it works well.
Packaging
Modules - PowerShell modules are the best form of packaging. They are quite easy to use and offer some nice features such as easy deployment and private variables/functions.
Scripts - Scripts are a good idea for work-in-progress or to start off since your coworkers will certainly be comfortable with scripts.
Deployment
There are a few options currently for deployment.
Deploy to every machine - This could avoid some network issues and gives you more flexibility with each machine, but the downside is that updating modules may be more of a pain.
Central repository (a.k.a. a network share) - You could also store all your modules on a central network share. This would avoid the issues of deploying to every machine, and you can make the modules read-only if you want to control modifications. But you would still have to deploy a profile to each machine to add the network share to the $env:PSModulePath variable; it is best to set this in the all-users-all-hosts profile (a short sketch follows this list). From then on you shouldn't need to update it unless the path changes.
NuGet - NuGet is an open source project that is bringing package management to .NET development. What is nice about it is the ability to have public repositories and local/private repositories. There is already an initial version of a PowerShell module that will leverage NuGet for PowerShell module deployment.
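For the central-share option, the all-users-all-hosts profile ($PSHOME\profile.ps1) addition can be as small as this hedged sketch (the share path is a placeholder):

```powershell
# Make modules on the central share discoverable by Import-Module.
$moduleShare = '\\fileserver\PSModules\Production'
if (($env:PSModulePath -split ';') -notcontains $moduleShare) {
    $env:PSModulePath += ";$moduleShare"
}
```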
Remote Access
Being a build engineer, I have relatively few machines and full control over them, so I have remoting enabled. You would have to ask some IT guys for better advice, but a hedged sketch for the non-domain case in the question follows.
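For the workgroup (non-AD) zone mentioned in the question, a certificate-based WinRM setup might look like this sketch - the host name and certificate are placeholders, and New-SelfSignedCertificate requires a newer OS, so use makecert or a CA-issued certificate on older systems:

```powershell
# --- On the target server (elevated) ---
Enable-PSRemoting -Force

# Bind an HTTPS listener to a server-authentication certificate so that
# credentials are protected even without Kerberos.
$cert = New-SelfSignedCertificate -DnsName 'server01.example.local' -CertStoreLocation Cert:\LocalMachine\My
New-Item -Path WSMan:\localhost\Listener -Transport HTTPS -Address * `
         -CertificateThumbPrint $cert.Thumbprint -Force

# --- On the management workstation ---
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'server01.example.local' -Force
Enter-PSSession -ComputerName 'server01.example.local' -UseSSL -Credential (Get-Credential)
```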