Deployments: MSI packages vs scripts

I have been tasked with looking into our deployments and seeing where they can be streamlined. Right now we have 4 different configurations (Debug/Dev, Test, Staging, Release) and 4 *.config files. We have a pre-build task that overwrites app/web.config with the appropriate *.config, based on the active configuration. An MSI is created, and we do a full deployment of the component on release night.
This is not entirely ideal, because if we change something in a config file, or fix the spelling in a specific view, we have to re-deploy the entire thing. Not to mention that the MSI will occasionally require a reboot. One other option that has been brought up is that instead of creating MSIs we could create custom deployment/rollback scripts and gain the ability to do incremental releases.
Has anyone here tried deployments both ways? What are some of the pros/cons you have found? Is there a third way we haven't thought of?
edit: Just to clarify a few things... We don't deploy to customers. All software is deployed to our own servers (a few sites and a lot of Windows services). We never change things in production. We actually use the built-in setup project in VS to create the MSI, so that part isn't terrible. To me it just doesn't make sense to redeploy an entire website if you had to change one view. We also have to deploy to multiple servers; right now that is done by running the MSI on each one.
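For reference, the pre-build step is essentially just a file copy; roughly this (sketched as Python for illustration, with made-up file names; in reality it's a build task):

    import shutil
    import sys

    # Rough sketch of the pre-build config swap; file names are illustrative.
    CONFIG_FOR = {
        "Debug":   "Web.Debug.config",
        "Test":    "Web.Test.config",
        "Staging": "Web.Staging.config",
        "Release": "Web.Release.config",
    }

    def apply_config(active_configuration, project_dir="."):
        """Overwrite Web.config with the file matching the active configuration."""
        source = CONFIG_FOR[active_configuration]
        shutil.copyfile("%s/%s" % (project_dir, source), "%s/Web.config" % project_dir)

    if __name__ == "__main__":
        apply_config(sys.argv[1])   # e.g. apply_config.py Staging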

MSI pros:
Application/service/site gets installed and registered like most other Windows apps, and shows up in Add/Remove programs
Some built-in support for re-installing, upgrading
Has some built-in support for installing Windows services/IIS sites/lower-level Windows features
MSI cons:
Seems really cryptic once you get "under the hood"
Seems more difficult to customize than using a custom script
Script pros:
Easier to customize, although certain steps (working with IIS, lower-level machine administration) might require a lot of cryptic scripting of their own
Don't have to deal with low-level weirdness of MSI
Script cons:
.bat scripting is not the most readable or writable language. (PowerShell is better, but then you have to worry about whether PowerShell is installed on the target machine.)
Low-level operations require a lot of administrative scripting for commit/rollback behavior
No built-in support for installing or rolling back (MSI has some support built in)
One thing I've come across that helps with MSIs is WiX (http://wix.sourceforge.net/), but even WiX seems pretty cryptic in a lot of ways. We use a combination of MSBuild and WiX to do automated builds and deployment/installs, and it works okay for us.
Overall, I'd probably lean more towards doing MSI/WiX (or another installer toolkit) deployments over scripts. MSIs are the standard way of doing installs on Windows, and once you get it working, you usually don't have to change too much. MSBuild or some other build framework (NAnt, etc.) can be useful for setting up the deployment (renaming files, doing string replacements, etc.) before putting together the final MSI package.

Having run a dev company that built web apps for five years, we struggled with this and tried a bunch of solutions. Here are a couple of tips:
Always replace the entire web directory with your code (except if you have content generated by the web site, like a CMS). It's pretty fast to do this and incremental deployments can introduce phantom bugs if files are left around.
Have your build process (NAnt, MSBuild, whatever) modify the .config files for each environment, and build for the environment you're pushing to. Alternatively you can use registry settings so that the .config files stay the same, but that means a dedicated machine for each environment. May or may not be an issue.
Don't make changes in production. If you need to make changes (e.g. spelling errors on the site), make getting those changes into dev a top priority so that you don't overwrite them with the next push.
If you aren't using MSIs, make sure you have a rollback process. Keeping a copy of the site from just before you changed it really helps when something unexplained goes sideways during a rollout (a rough sketch follows these tips).
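To make the first and last tips concrete, a deploy step can snapshot the live site and then replace the whole directory; a rough sketch (Python for illustration; paths are made up, and stopping/starting the site is omitted):

    import shutil
    import time

    # Rough sketch: keep a rollback copy, then replace the site wholesale.
    # Paths are hypothetical; error handling and service stop/start omitted.
    def deploy(build_output, site_dir, backup_root):
        stamp = time.strftime("%Y%m%d-%H%M%S")
        backup_dir = "%s/site-%s" % (backup_root, stamp)
        shutil.copytree(site_dir, backup_dir)    # snapshot for rollback
        shutil.rmtree(site_dir)                  # full replace - no stale files left behind
        shutil.copytree(build_output, site_dir)
        return backup_dir                        # restore this directory to roll back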
I don't know that these tips point to MSI or script; I think it's a matter of which you are most comfortable with. MSIs can be hard to customize, but they are easy to run and manage, and Microsoft has lots of tools for managing roll-outs of MSIs across an organization or farm. Scripts may require custom tooling or a lot of manual work on the production end.
We ran scripts with NAnt and a custom deployment harness. These days (VS2008) building deployment packages is much easier.

Your best option is to get a decent MSI builder to do the job with - I'm talking about InstallShield, etc. (there are a couple, so do look around). While these invariably cost money, they can save you a huge amount of time/money/pain further down the track. Having said that, the pain is not totally eliminated, just reduced :)
Anything tricky you need to do can be done as a custom action within the MSI - and you can even do this with the setup builder that comes with Visual Studio (if you are using VS).
I have a suggestion for your config files - include all four in the MSI, and then have a public property which can be set from the command line. You can then use that public property to install the appropriate config file (and set the default value of the property so that the release config gets installed). That way, your customers just use the MSI and get the correct config file, but your test team can get their config file by changing the value of the public property; the command line they would use to do the install is this:
msiexec /i "MyInstaller.msi" CONFIG=test
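If the install is driven from a deployment script, the property value can be chosen per environment. A rough sketch (Python purely for illustration; the /qn quiet switch and environment argument are just one way to wire it up):

    import subprocess
    import sys

    # Rough sketch: run the MSI silently with the CONFIG property set per environment.
    def install(msi_path, environment):
        subprocess.run(
            ["msiexec", "/i", msi_path, "CONFIG=%s" % environment, "/qn"],
            check=True)   # raise if msiexec reports a failure

    if __name__ == "__main__":
        install("MyInstaller.msi", sys.argv[1])   # e.g. install.py test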
You can do install scripts quite easily, but as already mentioned you also need to script the uninstall. Using install scripts also precludes you from getting Windows certification for your product, should you look at getting that done. But that doesn't mean you shouldn't use install scripts; they may be the perfect fit for your needs. Alternatively you may look at a combined script/MSI approach by having your scripts run as custom actions from within the MSI.


Can I use a WiX installer to just run a couple of custom commands?

I am working on a project where we need to repeat certain steps with PowerShell to deploy stuff. I would like to create a guided process/install (steps supported with a UI) with WiX, but after the MSI has finished I have an entry in Programs and Features. I just need it to execute the PowerShell at the end without registering in Windows. I might be using the wrong tooling or whatever; any suggestions are welcome.
Definitely not recommended, unless you want to track the deployment of these scripts on different systems by checking the entries in ARP (Add/Remove Programs) - and even then it clogs up the Add/Remove view of your computers. Most system administrators hate this approach; it is better to just write to your own registry key and read it back from every machine.
What are the scripts doing? Are you actually installing files?
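For the "write to your own registry key" suggestion above, a minimal sketch (Python via the standard winreg module; the key path and value names are hypothetical):

    import time
    import winreg

    # Rough sketch: record the deployment under a private registry key instead of ARP.
    # The key path and value names are hypothetical.
    def record_deployment(script_name, version):
        key_path = r"SOFTWARE\MyCompany\Deployments\%s" % script_name
        with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            winreg.SetValueEx(key, "Version", 0, winreg.REG_SZ, version)
            winreg.SetValueEx(key, "DeployedAt", 0, winreg.REG_SZ,
                              time.strftime("%Y-%m-%d %H:%M:%S"))

    record_deployment("nightly-deploy", "1.2.0")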

Should we deploy scrrun.dll (Windows Scripting Runtime)?

Our VB6 application relies heavily on scrrun.dll (the Microsoft Scripting Runtime). Until a year ago we used to deploy this DLL with our installer. Since the Scripting Runtime is supposed to be part of Windows, we removed the DLL from the installation package. However, now and then customers surface who have a non-functional scrrun.dll on their system, and we have to help them reinstall or re-register it.
So, should we put scrrun.dll back in the installation package? Should we perform some check on installation? Or should we just live with the fact that we have to provide hands-on support to some of our customers to set their systems right?
Don't try to deploy these libraries as part of a normal setup.
Microsoft Scripting Runtime must be installed through the use of a self-extracting .exe file. For versions of Scripting Runtime mentioned at the beginning of this article, the only way to distribute it is to use the complete self-extracting .exe file located at the following locations...
It is possible that some users employ older anti-malware suites, many of which tried to disable scripting. It is more likely, though, that some users have managed to break their Windows installation, either themselves or by using applications that were improperly packaged to include these libraries - and that blindly remove them from the system on uninstall (cough, cough - Inno).
The libraries involved have been tailored to the specific Windows version for some time. This is why the ancient .CAB file was "recalled" long ago. There is no single copy of them intended to run on any random version of Windows, and there are no redist packs for any modern version of Windows. The correct fix is a system restore or repair install.
While this can't be blamed directly on InnoSetup, since it is the result of poorly authored scripts, it is frustrating enough and common enough that I won't cry when its signature is added to anti-malware suites. There are just too many poorly written examples loose in the wild, copy/pasted by too many people.
I spend plenty of time undoing the damage caused by uninstalls of these applications and have grown quite weary of it. Where possible I use isolated assemblies now in self-defense, which helps a lot. Windows File Protection is getting better about preventing abusive action for system files too.
But in general you are much better off avoiding any dependency on scripting tools in an application. There isn't very much that they can do as well as straight code anyway, though it may take some time to write alternative logic.

Automating Solaris custom software deployment and configuration for multiple nodes

Essentially, the question I'd like to ask is related to the automation of software package deployments on Solaris 10.
Specifically, I have a set of software components in tar files that run as daemon processes after being extracted and configured in the host environment. Pretty much like any server-side software package out there, I need to ensure that a list of prerequisites is met before extracting and running the software. For example (a rough sketch of these checks follows the list):
Checking that certain users exist and that they are associated with one or more user groups. If not, then create them and their group associations.
Checking that target application folders exist and if not, then create them with pre-configured path values defined when the package was assembled.
Checking that such folders have the appropriate access control level and ownership for a certain user. If not, then set them.
Checking that a set of environment variables is defined in /etc/profile, pointing to predefined path locations, added to the general $PATH environment variable, and finally exported into the user's environment. Other files involved include /etc/services and /etc/system.
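In rough terms (a Python sketch, since Python is already on the target boxes; all user, group, and path names here are made up), the first checks amount to something like:

    import grp
    import os
    import pwd

    # Rough sketch of the per-host prerequisite checks; names and paths are made up.
    # Creating missing users/groups would shell out to useradd/groupadd (not shown).
    def user_exists(user):
        try:
            pwd.getpwnam(user)
            return True
        except KeyError:
            return False

    def ensure_dir(path, owner, group, mode=0o750):
        if not os.path.isdir(path):
            os.makedirs(path)
        os.chown(path, pwd.getpwnam(owner).pw_uid, grp.getgrnam(group).gr_gid)
        os.chmod(path, mode)

    if not user_exists("appuser"):                   # hypothetical user
        print("appuser is missing - create it before extracting the package")
    ensure_dir("/opt/myapp", "appuser", "appgrp")    # hypothetical path/owner/group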
Obviously, doing this for many boxes (the goal in question) by hand will certainly be slow and error-prone.
I believe a better alternative is to somehow automate this process. So far I have thought about the following options, and discarded them for one reason or another.
Traditional shell scripts. I've only troubleshot these before and don't really have much experience with them. These would be my last resort.
Python scripts using the pexpect library for analyzing system command output. This was my initial choice, since the target Solaris environments have it installed. However, I want to make sure that I'm not reinventing the wheel again :P.
Ant or Gradle scripts. They may be an option since the boxes also have Java 1.5 enabled, and the fileset abstractions can be very useful. However, they may fall short when dealing with user and folder permissions checking/setting.
It seems obvious to me that I'm not the first person in this situation, but I can't seem to find a utility or framework geared towards this purpose. Please let me know if there's a better way to accomplish this.
I thank you for your time and help.
Most of those steps sound like things handled by use of a packaging system to install your package. On Solaris 10, that would be the SVR4 packaging system included with the OS.

Is MSDeploy "friendly" enough, or can it be wrapped up in an MSI file?

In your opinion, are MSDeploy packages a good option for giving to an end user to install a web application on their system? How does it compare with, say, the experience of using an MSI file to install a web app?
Has anybody tried wrapping up an MSDeploy package inside an MSI package? Would it work?
MSDeploy was described to me as a tool that helps synchronize web sites between machines, much in the way that AppCenter used to replicate a well-configured master to many machines. The Windows Installer (MSI + WiX custom actions for IIS and SQL config) is about applying packages that modify a machine's state within a transaction. It follows the more traditional packaged-software model.
Those are two different approaches to the problem of configuring machines. Each is optimized around a different set of requirements. MSDeploy = replicating machine state. MSI = apply changes in transaction.
Could you throw MSDeploy into an MSI? Probably. Would it work well? Maybe, if you ignore the part about transactions. That, in my mind, is the key difference. In environments where you want to declare the configuration you want distributed and have it either apply fully or not apply at all (i.e. not end up in an intermediate/busted state), package-based installation seems appropriate.
If you have a machine that you configured just right and want to make a bunch of machines look like it (and are willing to take a failed machine out of rotation and repeat the process until it is beaten into submission) then MSDeploy seems appropriate.
There isn't enough information in your question to suggest which works better... but I don't think they go together. <smile/>

Best practices for deploying tools & scripts to production?

I've got a number of batch processes that run behind the scenes for a Linux/PHP website. They are starting to grow in number and complexity, so I want to bring a small amount of process to bear on them.
My source tree has a bunch of cpp files and scripts, organized with development but not deployment in mind. After compiling all the executables, I need to put various scripts and binaries on a cluster of machines. Different machines need different executables, scripts, and config files for their batch processes. I also have a few tools that I've written that belong on every machine. At the moment, this deployment process is manual and error-prone.
I'm guessing I'm just going to end up with a script that runs at the root of the source tree and builds a smaller tree of everything necessary for any of the machines. Then, I'll just rsync that to the appropriate machines. But I'm curious how other people are managing this type of problem. Any ideas?
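Concretely, I was picturing something along these lines (a rough Python sketch; the per-machine manifest, host names, and paths are all invented):

    import os
    import shutil
    import subprocess

    # Rough sketch: build a per-machine staging tree, then rsync it out.
    # The manifest, host names, and destination path are all invented.
    MANIFEST = {
        "batch01.example.com": ["bin/reports", "scripts/nightly.sh", "etc/batch.conf"],
        "batch02.example.com": ["bin/importer", "scripts/hourly.sh"],
    }
    COMMON = ["bin/healthcheck", "scripts/rotate_logs.sh"]   # tools every machine gets

    def stage_and_push(source_root, staging_root):
        for host, files in MANIFEST.items():
            host_dir = os.path.join(staging_root, host)
            for rel in files + COMMON:
                dest = os.path.join(host_dir, rel)
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.copy2(os.path.join(source_root, rel), dest)
            subprocess.run(
                ["rsync", "-az", "--delete", host_dir + "/", host + ":/opt/batch/"],
                check=True)

    stage_and_push(".", "build/staging")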
There are several categories of tools here. Some people use a combination of tools from these categories; I sometimes use, for example, both Puppet and Capistrano. See "Puppet or Capistrano - Use the Right Tool for the Job" for a discussion.
Scripting Tools aimed at Deploying an Application:
The general pattern with tools in this category is that you create a script and/or config file, often with sets of commands similar to a Makefile, and the tool will ssh over to your production box, do a checkout of your source, and run whatever other steps are necessary.
Tools in this area usually have facilities for rolling back to a previous version. They'll check out your source to a new directory under releases/, and if all goes well, create a "current" symbolic link that points to it. If there's a problem, you can revert to the previous version by running a command that repoints "current" at the previous releases/ directory.
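The symlink dance these tools automate looks roughly like this (a hand-rolled Python sketch with made-up paths, just to illustrate the pattern; the real tools also handle checkout, shared files, and cleanup):

    import os

    # Rough sketch of the releases/ + "current" symlink pattern; paths are made up.
    APP_ROOT = "/var/www/myapp"
    RELEASES = os.path.join(APP_ROOT, "releases")
    CURRENT  = os.path.join(APP_ROOT, "current")

    def activate(release_dir):
        """Point the 'current' symlink at release_dir (via rename, so it's atomic)."""
        tmp_link = CURRENT + ".tmp"
        if os.path.lexists(tmp_link):
            os.remove(tmp_link)
        os.symlink(release_dir, tmp_link)
        os.replace(tmp_link, CURRENT)

    def rollback():
        """Repoint 'current' at the newest release other than the active one."""
        active = os.path.realpath(CURRENT)
        releases = sorted(os.path.join(RELEASES, d) for d in os.listdir(RELEASES))
        previous = [d for d in releases if os.path.realpath(d) != active]
        activate(previous[-1])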
Capistrano comes from the Rails community but is general-purpose. Users of Capistrano may be interested in deprec, a set of deployment recipes for Capistrano.
Vlad the Deployer is an alternative to Capistrano, again from the Rails community.
Write your own shell script or Makefile.
Options for getting the files to the production box:
Direct checkout from source. Not always possible if your production boxes lack development tools, specifically source code management tools.
Check out the source locally, then tar/zip it up. Use scp or rsync to copy the tarball over. This is sometimes preferred for something like an Amazon EC2 deployment, where a compressed tarball can save time/bandwidth (a rough sketch follows this list).
Check out the source locally, then rsync it over to the production box.
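For the tarball option, the local packaging step is only a few lines; a rough sketch (Python; the paths and host name are invented):

    import subprocess
    import tarfile
    import time

    # Rough sketch: tar up a local checkout and copy it to the production box.
    # The paths and host name are invented.
    def package_and_copy(checkout_dir, host):
        tarball = "release-%s.tar.gz" % time.strftime("%Y%m%d%H%M%S")
        with tarfile.open(tarball, "w:gz") as tar:
            tar.add(checkout_dir, arcname="release")
        subprocess.run(["scp", tarball, "%s:/tmp/" % host], check=True)

    package_and_copy("./checkout", "web01.example.com")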
Packaging Tools
Use your OS's packaging system to generate packages containing the files for your app. Create a master package that has the other packages you need as dependencies. The RubyWorks system is an example of this, used to deploy a Rails stack and sample application. Then it's a matter of using apt, yum/rpm, a Windows MSI, or whatever to deploy a given version. Rollback involves uninstalling and reinstalling the old version.
General Tools Aimed at Installing Apps/Configs and Maintaining a Set of Systems
These tools do not specifically target the problem of deploying a web app, but rather the more general problem of deploying/maintaining Apps/Configs for a set of servers, or an entire company's workstations. They are aimed more at the system administrator than the web developer, though either can find them useful.
Cfengine is a tool in this category.
Puppet aims to improve on Cfengine. It's got a learning curve but many find it worth the time to figure out how to do the configs. Once you've got it going, each box checks the central server periodically and makes sure everything is up to date. If someone edits a file or changes a permission, this is detected and corrected. So, unlike the deployment tools above, Puppet not only puts files in the right place for you, it ensures they stay that way.
Chef is a little younger than Puppet with a similar approach.
Smartfrog is another tool in this category.
Ansible works with plain YAML files and does not require agents running on the servers it manages.
For a comparison of these and many more tools in this category, see the Wikipedia article, Comparison of open source configuration management software.
Take a look at the cfengine tutorial to see if cfengine looks like the right tool for your situation. It may be a little too complicated for a small website, but if it is going to involve more computers and more configuration in the future, at some point you will end up using cfengine or something like that.
Create your own packages in the format your distribution uses, e.g. Debian packages (.deb). These can either be copied to each machine and installed manually, or you can set up your own repository, and add it to your list of sources.
Your packages should be set up so that the scripts they contain consult a configuration file, which is different on each host, depending on what scripts need to be run on each.
To tie it all together, you can create a meta package that just depends on each of the other packages you create. That way, when you set up a new server, you install that one meta package, and the other packages are brought in as dependencies.
Although this process sounds a bit complicated, if you have many scripts and many hosts to deploy them to, it can really pay off in the long run.
I have to roll out PHP scripts and Apache configurations to several customers on a frequent basis. Since they all run Debian Linux, I've set up a Debian package repository on my server, and all the customer has to do is type apt-get upgrade and they get the latest version.
The first thing to do is get all these scripts into a source control repository (svn or git are good) so that you can track changes to these scripts over time.
If you are interested in Ruby, check out Capistrano; it is well suited to deploying things to multiple machines in a cluster and is fairly easy to set up. It can read files directly from your version control system.
Puppet is another tool that can be used in this situation. It is similar to cfengine - you create a model of the desired deployment and Puppet figures how to get the environment to this state.