Conditional installation with Wix - deployment

Is it possible to have a conditional installation configuration tied to the Visual Studio build configuration?
For example, depending on whether the DEBUG or RELEASE configuration is selected, WiX would include different executables in the built installation.
Basically I need to build different installations from the same projects, but they differ in their components. Some components are built from the same project, but with different preprocessor options.
Of course it is possible to include every required component and then define features to select specific components at installation time, but I don't really want to redistribute some of the executables.
Is building different WiX projects the only solution?

Putting the other two answers and Luca's research together, I came up with this solution, which seems to work (note that the string comparison appears to be case-sensitive, and the lack of quotes appears to be correct; I've tested this with WiX 3.7):
<?if $(var.Configuration) = Debug ?>
  <!-- DEBUG ONLY -->
  [ ... insert debug only XML here ... ]
  <!-- END DEBUG ONLY -->
<?else?>
  <!-- RELEASE ONLY -->
  [ ... insert release only XML here ... ]
  <!-- END RELEASE ONLY -->
<?endif?>
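For instance (component and file names purely illustrative), the debug-only placeholder could hold a component that ships only in Debug builds:
<?if $(var.Configuration) = Debug ?>
  <Component Id="DebugHelper" Directory="INSTALLFOLDER" Guid="*">
    <!-- Illustrative: a helper DLL shipped only when built in the Debug configuration -->
    <File Source="..\myProject\bin\Debug\DebugHelper.dll" />
  </Component>
<?endif?>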

Your WiX scripts have access to build parameters, such as the Configuration ('Debug' or 'Release'). You can therefore conditionally include the correct binaries for the current configuration by referencing $(var.Configuration) in your component declarations:
<Component Id="myProject.dll"
           DiskId="1"
           Guid="*">
  <File Id="myProject.dll"
        Name="myProject.dll"
        Source="..\myProject\bin\$(var.Configuration)\myProject.dll" />
</Component>
When you run the build in release mode, this script will pick up the release version of the binary. Likewise, in debug mode, the debug binary will be picked up. This approach does not require preprocessing - the script makes Configuration-related decisions at build time.

Use the preprocessor, e.g. <?if?>, to conditionally include or exclude components based on the configuration.

Continue the Wix setup after having a service that could not start

We have a setup that installs a service and tries to start it.
For some reason, the service cannot start (a port is already in use). This isn't critical for us and should not stop the setup.
The service is declared like this:
<DirectoryRef Id="BIN">
  <Component Id="MyService" Guid="*" SharedDllRefCount="yes">
    <File Id="MyService.exe" Name="MyService.exe" KeyPath="yes" Vital="no" Compressed="default" DiskId="1" Source="$(var.DirDotfuscated)\MyService.exe" />
    <ServiceControl Id="Install" Name="MyService" Start="install" Stop="install" />
    <ServiceControl Id="Uninstall" Name="MyService" Stop="uninstall" Remove="uninstall" />
    <ServiceInstall Id="NewServiceInstall2" Name="MyService" DisplayName="My Service" Type="ownProcess" Interactive="no" Start="auto" ErrorControl="normal" Description="My service" Vital="no" />
  </Component>
  <Component Id="Xms_HostService_Files" Guid="*" SharedDllRefCount="yes">
    <File Id="MyService.exe.config" Name="MyService.exe.config" Vital="no" Compressed="default" DiskId="1" Source="$(var.DirDotfuscated)\MyService.exe.config" />
    <File Id="MyServiceCommon.dll" Name="MyServiceCommon.dll" Vital="no" Compressed="default" DiskId="1" Source="$(var.DirDotfuscated)\MyServiceCommon.dll" />
    <File KeyPath="yes" Id="MyServiceCore.dll" Name="MyServiceCore.dll" Vital="no" Compressed="default" DiskId="1" Source="$(var.DirDotfuscated)\MyServiceCore.dll" />
  </Component>
</DirectoryRef>
When we execute the setup, we get an error dialog (screenshot not included), and then we only have the option to Retry (which will also fail) or Cancel (which stops the setup).
We tried many things (ServiceInstall only, without ServiceControl, ...), but at some point we always get an error.
How should we manage this?
Attempted Answer (without ability to test):
What happens if you set the ServiceControl element's Wait attribute to "no"? I don't have a service exe to test with at the moment, but I believe that could work as you intend it.
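Applied to the markup from the question, the suggestion would look roughly like this (untested, as noted; Wait is a documented ServiceControl attribute):
<!-- Wait="no" asks Windows Installer not to wait for the service to reach
     its running state, so a failed start should not block the install -->
<ServiceControl Id="Install" Name="MyService" Start="install" Stop="install" Wait="no" />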
Custom actions should generally be avoided for reliability reasons, but on the other hand - if you do need something special - that's what they are there for. Be prepared for most deployment problems to originate from your custom actions though: Why is it a good idea to limit the use of custom actions in my WiX / MSI setups?
Some further advice (which was not asked for :-) ):
You should not install multiple binaries with one component. You should use one file per component for many reasons. Windows Installer best practice specifically dictates having only one binary per component, but in my opinion you should use one component per file in general to make minor upgrades and patching possible, and self-repair more reliable.
To better understand component reference counting: Change my component GUID in wix?
By eliminating hard-coded GUIDs you can take advantage of WiX's advanced auto-GUID creation concept. This will change the component GUID if the absolute installation path changes. This is correct behavior for component reference counting. Auto-magic. You either set Guid="*" or just leave out the Guid attribute entirely. A few installation locations need a hard coded GUID - the WiX compiler will warn you and explain why.
If you do change the component structure (to use one file per component) you should change the installation path to "break the link to past sins" with regards to component reference counting. This is a very complex topic to explain, but changing the installation path will sort all problems for you - if you also enable the auto component GUIDs I mentioned in the previous point. Keep the path stable from then on (until you have a major new version).
You can do it as simply as adding a sub-folder with the application's major (and minor?) version to the main installation folder hierarchy: "Program Files\MyCompany\MySoftware\5" instead of "Program Files\MyCompany\MySoftware".
I would only add the major version to the path and keep the installation path stable throughout your application's lifetime, then increment it when you want to break the link to previous installers for a major new software version (for example if you want to install two versions side-by-side; your application must be built to handle this properly, i.e. not overwriting shared settings in the registry from both versions, etc.).
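For illustration, a minimal sketch of such a versioned path in WiX authoring (directory IDs and names are made up):
<Directory Id="ProgramFilesFolder">
  <Directory Id="CompanyFolder" Name="MyCompany">
    <Directory Id="ProductFolder" Name="MySoftware">
      <!-- Major version as a sub-folder; bump "5" to "6" for the next major
           release to break the component-GUID link to earlier installs -->
      <Directory Id="INSTALLFOLDER" Name="5" />
    </Directory>
  </Directory>
</Directory>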
You might want to consider simplifying your WiX source file by only specifying values for attributes that are non-standard (otherwise rely on defaults). This can substantially simplify your WiX source files. Here is an example: Syntax for guids in WIX?
Just a quick sample inline (same as in link above - check it out), this is all that is required to install a normal file with default attributes / parameters - all other attributes default well - unless you want to override something:
<Component>
  <File Source="..\File.dll" />
</Component>
Some links:
Windows Installer Best Practices (full list).
Windows Installer Best Practices - Organizing Applications into Components (specifically for component creation).
When component reference counting has gone haywire (missing files after upgrades, unexpected removal of shared files on uninstall, etc...): WiX 3.8: Two MSI using the same registry values. How to delete registry values only if both MSI are uninstalled?
Drop the ServiceControl element in favor of a CustomAction element with DllEntry="WixQuietExec", then use a standard means of starting the service, such as net start MyService, and ignore the result. See Quiet Execution Custom Action for details.
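A minimal sketch of that approach, assuming WiX 3.10+ with the WixUtilExtension linked (for the deferred variant of WixQuietExec, the command line is supplied via a property named after the custom action):
<!-- Runs "net start" after the service is registered and ignores the exit
     code, so a failed start cannot abort the setup. The service name and
     scheduling are illustrative; use [SystemFolder] on 32-bit Windows. -->
<CustomAction Id="StartMyService_Cmd"
              Property="StartMyService"
              Value='"[System64Folder]net.exe" start MyService'
              Execute="immediate" />
<CustomAction Id="StartMyService"
              BinaryKey="WixCA"
              DllEntry="WixQuietExec"
              Execute="deferred"
              Impersonate="no"
              Return="ignore" />
<InstallExecuteSequence>
  <Custom Action="StartMyService_Cmd" After="CostFinalize" />
  <Custom Action="StartMyService" After="InstallServices">NOT Installed</Custom>
</InstallExecuteSequence>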

web.config changes via TFS 2015 Release Management

In the past I've used web.config transforms when manually deploying code to set environment-specific setting values and attributes. I am transitioning from environment-specific manual builds to a single TFS 2015 build deployed to multiple environments via Release Management. Environment-specific application settings values configured in the web.config are tokenized: this method essentially inserts tokens into setting values during the build process, and when deployed, the tokens are replaced with matching Release definition configuration values.
This method is insufficient for setting attributes on elements other than settings, however. Examples of the transforms I need include:
<httpCookies requireSSL="true" xdt:Transform="Insert" />
<compilation xdt:Transform="RemoveAttributes(debug)" />
<httpRuntime xdt:Transform="RemoveAttributes(executionTimeout,maxRequestLength,useFullyQualifiedRedirectUrl,minFreeThreads,minLocalRequestFreeThreads,appRequestQueueLimit,enableVersionHeader)"/>
<httpRuntime enableVersionHeader="false" maxRequestLength="12288" xdt:Transform="SetAttributes"/>
<customErrors mode="On" xdt:Transform="SetAttributes"/>
What is the best way to update these attributes during release?
Both Web Deploy's parameters.xml method and transforms can be used with Release Management. Transforms would be triggered from the build, and the replacement of the tokens created by a publish would be triggered by Release Management.
To trigger transforms during the build, you can do this one of two ways:
Add the following MSBuild parameters to force the transformation to happen during the build
/p:UseWPP_CopyWebApplication=true /p:PipelineDependsOnBuild=false
Create a publish profile using the MSDeploy Package option and then trigger the packaging in Build using the following MSBuild parameters:
/p:DeployOnBuild=true /p:PublishProfile=[nameOfProfile]
Either of the above methods will cause normal Web.config XDT's to run. If you need other XML files to be transformed, you'll need to first install SlowCheetah.
Token Replace and Parameters
Now that you have a build artifact with XDT's run, you can use token replacement and the WinRM tasks from Release Management. These will take the Web Deploy package from the build and execute the SetParameters command before deploying it. The trick is to run a token replace on the SetParameters.xml file first, swapping in the Release environment variables, before SetParameters runs.
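For illustration only (parameter names and tokens are hypothetical), a tokenized SetParameters.xml might look like this before the token-replace step swaps in the environment's values:
<?xml version="1.0" encoding="utf-8"?>
<parameters>
  <!-- __SITENAME__ and __CONNECTIONSTRING__ are placeholder tokens that a
       token-replace task substitutes with Release environment variables -->
  <setParameter name="IIS Web Application Name" value="__SITENAME__" />
  <setParameter name="DefaultConnection-Web.config Connection String" value="__CONNECTIONSTRING__" />
</parameters>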
User Sumo gave a proper answer, but I want to record some comments about the what rather than the how.
IMHO there are different categories of settings to consider; let me give examples: the database connection string changes for each environment, while requiring SSL should be turned on in all testing and production environments.
From this perspective, some settings should be applied as early as possible, traditionally at build time (the classic Debug/Release builds), while last-minute, environment-dependent settings, up to runtime settings such as feature toggles, should be applied as late as possible.
So in my view you can use a single tool or multiple tools, but it is important that you properly categorize your settings accordingly.
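A hedged sketch of that distinction in web.config terms (key names invented for illustration):
<appSettings>
  <!-- Environment-dependent: tokenized at build time, replaced at release time -->
  <add key="Database.ConnectionName" value="__DBCONNECTION__" />
  <!-- Runtime setting: a feature toggle that can be flipped without redeploying -->
  <add key="Feature.NewCheckout" value="false" />
</appSettings>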

SpecFlow wrongly using NUnit

I've just (today) tried SpecFlow for the first time. I'm playing about by creating a new class library in VS2010 Pro and adding a SpecFlow Feature Definition file.
Thing is, the integration doesn't appear to be working properly, with a variety of different errors. I've selected MsTest as the test runner, because I can't be bothered with invoking NUnit (I'd like to use NUnit in the long term but at the moment I just want to get some BDD code working). The generated code files however continue to reference NUnit - which is obviously wrong, since I've just told SpecFlow to run using MsTest. I've done everything I can think of to invoke the code generation again, including creating a brand new class library project with the MsTest option selected in Tools > Options > SpecFlow.
If I leave the test runner field set to 'Auto' and right-click a feature file, then select 'Run SpecFlow Scenarios' I get an error message "Could not find matching test runner".
If I instead change the test runner field to MsTest, I get a different error message on doing the same thing - "Object Reference not set to an instance of an object". I'm not surprised at this one since it's still trying to run NUnit tests even though I've explicitly asked for MsTest, though obviously it shouldn't nullref and present that to the user.
What am I doing wrong? The documentation is not helpful, and as far as I can see, there's no FAQ.
edit #1: I've established that the actual setting I'm looking for lives in App.config, via the element <unitTestProvider name="MsTest" />. I can see what's happened: the field in the Visual Studio options menu doesn't modify the project you're currently working on. Thing is, this makes it look like that field doesn't do anything at all. I've now persuaded SpecFlow to generate MsTest classes and run them using the MSTest runner.
So now the question morphs into a slightly different one: What (if anything) does the Tools > Options > SpecFlow > Test Runner Tool field do?
With VS2010 the correct value is MsTest.2010, not MsTest as documented. Change your app.config (for the test assembly) and it will work fine (at least with SpecFlow 1.8):
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <section name="specFlow" type="TechTalk.SpecFlow.Configuration.ConfigurationSectionHandler, TechTalk.SpecFlow" />
  </configSections>
  <specFlow>
    <unitTestProvider name="MsTest.2010" />
    <!-- For additional details on SpecFlow configuration options see https://github.com/techtalk/SpecFlow/wiki/Configuration -->
  </specFlow>
</configuration>
In answer to your latest question: the "Tools > Options > SpecFlow > Test Runner Tool" setting controls what will actually run the tests, not what will generate the test code. If it is set to Auto, I believe it looks at the App.config file, where you have set the unitTestProvider, to determine the best tool to run the tests. An alternative test runner made by the same people as SpecFlow is SpecRun: http://www.specrun.com/
So when you go to run the tests it will use this option. As you have discovered, though, the code generator uses the config file to determine what type of test it should generate (MsTest/NUnit/...).
If you ran the SpecFlow installer ( https://github.com/downloads/techtalk/SpecFlow/SpecFlowSetup_v1.8.1.msi ) to install all the Visual Studio integration components, then when you change the App.config file it normally prompts you to regenerate the features using the new provider. The manual way to do this is to right-click the feature file and select "Run Custom Tool".
In regards to documentation have you found the git hub wiki?
https://github.com/techtalk/SpecFlow/wiki/Documentation
The way I've read this is that the test runner is entirely separate from the code generator, although that doesn't always make sense when the MsTest runner doesn't know about NUnit (I think). Out of the box, the latest version (v2.3.2), even when installed with the SpecFlow.MsTest NuGet package (of the same version), does not configure your machine to generate MsTest-based classes in the background. I am running VS2017 and have ReSharper installed as my test runner, but the main requirement for generating MsTest-based code is a change to the app.config. As per the wiki documentation, you need the following in your app.config; when you save the config you should be prompted to regenerate the files.
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section name="specFlow"
             type="TechTalk.SpecFlow.Configuration.ConfigurationSectionHandler, TechTalk.SpecFlow" />
  </configSections>
  <specFlow>
    <unitTestProvider name="MsTest" />
  </specFlow>
</configuration>
We are using ReSharper as a runner for SpecFlow acceptance tests; it worked well right out of the box. ReSharper is not free, but it's worth every penny...
I was never able to get SpecFlow working right from Visual Studio; I spent some time working on it but never got anywhere. However, I found these instructions on setting up NUnit in Visual Studio 2010, and I use this shortcut to run my SpecFlow tests to good effect.
Overall we use PowerShell to run a lot of tests and I was able to incorporate the NUnit command line runner and SpecFlow report generator into a single script I can run easily.

TFS2010 Build Definition to Deploy to multiple servers?

I've been looking into TFS2010's new build and deployment features with MSDeploy. So far everything is going well (although it's been hard to find information about specific scenarios).
Can I modify my build definition to specify two or more servers to deploy to? What I need is to deploy to multiple servers (I have two in my testing environment, which uses NLB).
What I have now is a Build definition which Builds, runs my tests, and then Deploys to ONE of my testing servers (which has the MsDeployAgentService running on it). It works fine, and each web project is deployed as configured in its project file. The MSBuild Arguments I use are:
* /p:DeployOnBuild=True
* /p:DeployTarget=MsDeployPublish
* /p:MSDeployServiceURL=http://oawww.testserver1.com.au/MsDeployAgentService
* /p:CreatePackageOnPublish=True
* /p:MsDeployPublishMethod=RemoteAgent
* /p:AllowUntrustedCertificate=True
* /p:UserName=myusername
* /p:Password=mypassword
NB: I don't use /p:DeployIISAppPath="xyz" as it doesn't deploy all my projects and overrides my project config.
Can I add another build argument to get it to call more than one MSDeployServiceURL? Something like a second /p:MSDeployServiceURL argument that specifies another server?
Or do I have to look for another solution, such as editing the WF?
I saw an almost identical question posted here 2 months ago: TFS 2010 - Deploy to Multiple Servers After Build , so it doesn't look like I'm the only one trying to solve this.
I also posted on the IIS.NET forums where MSDeploy is discussed: http://forums.iis.net/t/1170741.aspx . It's had quite a lot of views, but again, no answers.
You don't have to build the project twice to deploy to two servers. The build process will build a set of deployment files. You can then use the InvokeProcess to deploy to multiple servers.
First create a variable named ProjectName. Then add an Assign activity to the "Compile the Project" sequence. This is located in the "Try to Compile the Project" sequence. Here are the properties of the Assign:
To: ProjectName
Value: System.IO.Path.GetFileNameWithoutExtension(localProject)
Here are the properties of our InvokeProcess activity that deploys to the test server:
Arguments: "/y /M:<server> /u:<domain>\<user> /p:<password>"
FileName: String.Format("{0}\{1}.deploy.cmd", BuildDetail.DropLocation, ProjectName)
You will need to change <server>, <domain>, <user>, and <password> to the values that reflect your environment.
If you need to manually deploy to a server you can run the command below from your build folder:
deploy.cmd /y /M:<server> /u:<domain>\<user> /p:<password>
I couldn't find the solution I was looking for, but here's what I came up with in the end.
I wanted to keep the solution simple and configurable within the TFS arguments while at the same time staying in line with the already provided MSBuildArguments method which has been promoted a lot. So I created a new Build Template, and added a new TFS WorkFlow Argument called MSBuildArguments2 in the Arguments tab of the WorkFlow.
I searched through the build template workflow for all occurrences of MSBuildArguments (there were two).
The two tasks that use MSBuildArguments are called Run MSBuild for Project. Directly below this task, I added a new "If" block with the condition:
Not String.IsNullOrEmpty(MSBuildArguments2)
I then copied the "Run MSBuild for Project" task and pasted it into the new If's "Then" block, updating its title accordingly. You'll also need to update the new task's CommandLineArguments property to use your new argument.
CommandLineArguments = String.Format("/p:SkipInvalidConfigurations=true {0}", MSBuildArguments2)
After these modifications, the workflow contains the second, conditional MSBuild task (screenshot not included).
Save and Check In the new WorkFlow. Update your Build Definition to use this new WorkFlow, then in the build definition's Process tab you will find a new section called Misc with the new argument ready to be used. Because I'm simply using this new argument for deployment, I copied the exact same arguments I used for MSBuild Arguments and updated the MSDeployServiceURL to my second deployment server.
And that's that. I suppose a more elegant method would be to convert MSBuildArguments into an array of strings and then loop through them during the WorkFlow process. But this suits our 2 server requirements.
Hope this helps!
My solution to this is a new Target that runs after Package. Each project that needs to produce a package includes this targets file, and I chose to make the Include conditional on an externally-set "DoDeployment" property. Additionally each project defines the DeploymentServerGroup property so that the destination server(s) are properly filtered depending on what kind of project it is.
As you can see towards the bottom I'm simply executing the command file with the server list, pretty simple.
<!--
  This targets file allows a project to deploy its package.
  As it is used by all project types, it is conditionally included from the project file.
-->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <UsingTask TaskName="Microsoft.TeamFoundation.Build.Tasks.BuildStep" AssemblyFile="$(TeamBuildRefPath)\Microsoft.TeamFoundation.Build.ProcessComponents.dll" />
  <!-- Each Server needs the Group metadatum: either Webservers, Appservers, or Batch. -->
  <Choose>
    <When Condition="'$(Configuration)' == 'DEV'">
      <ItemGroup>
        <Servers Include="DevWebServer">
          <Group>Webservers</Group>
        </Servers>
        <Servers Include="DevAppServer">
          <Group>Appservers</Group>
        </Servers>
      </ItemGroup>
    </When>
    <When Condition="'$(Configuration)' == 'QA'">
      <ItemGroup>
        <Servers Include="QAWebServer1">
          <Group>Webservers</Group>
        </Servers>
        <Servers Include="QAWebServer2">
          <Group>Webservers</Group>
        </Servers>
        <Servers Include="QAAppServer1">
          <Group>Appservers</Group>
        </Servers>
        <Servers Include="QAAppServer2">
          <Group>Appservers</Group>
        </Servers>
      </ItemGroup>
    </When>
  </Choose>
  <!-- DoDeploy can be set in the build definition -->
  <Target Name="StartDeployment" AfterTargets="Package">
    <PropertyGroup>
      <!-- The _PublishedWebsites area -->
      <PackageLocation>$(WebProjectOutputDir)_Package</PackageLocation>
      <!-- Override for local testing -->
      <PackageLocation Condition="$(WebProjectOutputDirInsideProject)">$(IntermediateOutputPath)Package\</PackageLocation>
    </PropertyGroup>
    <Message Text="Tier servers are @(Servers)" />
    <!-- A filtered list of the servers. DeploymentServerGroup is defined in each project that does deployment. -->
    <ItemGroup>
      <DestinationServers Include="@(Servers)" Condition="'%(Servers.Group)' == '$(DeploymentServerGroup)'" />
    </ItemGroup>
    <Message Text="Dest servers are @(DestinationServers)" />
  </Target>
  <!-- Only perform the deployment if any servers fit the filters -->
  <Target Name="PerformDeployment" AfterTargets="StartDeployment" Condition="'@(DestinationServers)' != ''">
    <Message Text="Deploying $(AssemblyName) to @(DestinationServers)" />
    <!-- Fancy build steps so that they appear nicely in the build explorer -->
    <BuildStep
        TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
        BuildUri="$(BuildUri)"
        Message="Deploying $(AssemblyName) to @(DestinationServers)...">
      <Output TaskParameter="Id" PropertyName="StepId" />
    </BuildStep>
    <!-- The deployment command will be run for each item in the DestinationServers collection. -->
    <Exec Command="$(AssemblyName).deploy.cmd /Y /M:%(DestinationServers.Identity)" WorkingDirectory="$(PackageLocation)" />
    <BuildStep
        TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
        BuildUri="$(BuildUri)"
        Id="$(StepId)"
        Status="Succeeded"
        Message="Deployed $(AssemblyName) to @(DestinationServers)" />
    <OnError ExecuteTargets="MarkDeployStepAsFailed" />
  </Target>
  <Target Name="MarkDeployStepAsFailed">
    <BuildStep
        TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
        BuildUri="$(BuildUri)"
        Id="$(StepId)"
        Status="Failed" />
  </Target>
</Project>
I am the author of the other similar post. I have yet to find a solution. I believe it will require modifying the workflow to add a post-processing MSDeploy sync task. That seems to be the most elegant approach, but I was still hoping to find something a bit less intrusive.
I'm not sure if that could help you with TFS 2010, but I have a blog post for TFS 2012: Multiple web projects deployment from TFS 2012 to NLB enabled environment.

how to do builds with nant

In my org, we are planning to go with NAnt for .NET web applications. Source control is TFS, with Visual Studio 2008. I would like to know how to do builds with NAnt. How do I create an MSI and deploy the application using NAnt? Is a separate build machine required to do builds with NAnt? Somebody please help me out; I need a step-wise process. Thanks in advance.
Thanks
Shanthi
For a step-by-step guide to using NAnt, I suggest referring to the NAnt project documentation for the fundamental concepts. Once you are familiar with its basic usage, I suggest investigating the NAntContrib project to obtain more build tasks.
One part of your question that I would like to address directly here is whether a separate machine is required to use NAnt. NAnt does not strictly require a separate machine, but one might be beneficial if your build process is automated or particularly intensive.
[Update]
In response to comment from OP:
NAnt views the build process as a series of individual tasks to be performed as part of a target. The normal process for building an application is to invoke a compiler on the source files in order to produce a binary; NAnt has a number of tasks that invoke language compilers.
In this example I will invoke the C# language compiler (csc.exe) using the <csc> task in an NAnt build file for a Hello World application that consists of a single source file named HelloWorld.cs.
<?xml version="1.0"?>
<project name="Hello World" default="build" basedir=".">
  <property name="debug" value="true" overwrite="false" />
  <target name="build" description="compiles the source code">
    <csc target="exe" output="HelloWorld.exe" debug="${debug}">
      <sources>
        <includes name="HelloWorld.cs" />
      </sources>
    </csc>
  </target>
</project>
Let's examine this XML:
<project name="Hello World" default="build" basedir=".">
Things to Note:
The value of the default attribute is "build". This means that the target named "build" will be invoked if no other target is specified.
<target name="build" description="compiles the source code">
This is the build target; as its description states, it will compile the source code. To do this the csc task is used. The csc task has a number of options, including:
target: specifies the type of binary the compiler will produce; in this case, an executable
output: specifies the name of the executable file that will be created
debug: takes its value from the debug property (defined above as true), which determines whether the compiler produces an executable containing debugging information
sources & includes: specify the source files that the compiler will parse in order to produce the executable
As you can see, the actions necessary to build the source code are defined in a target. A build file can define many targets, each of which can call many tasks. To produce an MSI file you would invoke a task that produces one; unfortunately, as I don't use NAnt regularly, you will have to do some research to find one, although I have a feeling the NAntContrib project includes one given how common it is to produce an MSI.
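If no ready-made MSI task fits, NAnt's built-in <exec> task can shell out to any packaging toolchain. Here is a minimal, untested sketch using the WiX command-line tools (tool paths and file names are assumptions):
<target name="package" depends="build" description="builds an MSI with the WiX tools">
  <!-- candle.exe compiles the WiX authoring; light.exe links it into an MSI.
       Adjust the tool paths and the .wxs file name for your environment. -->
  <exec program="C:\Tools\WiX\candle.exe">
    <arg value="Product.wxs" />
  </exec>
  <exec program="C:\Tools\WiX\light.exe">
    <arg value="Product.wixobj" />
    <arg value="-out" />
    <arg value="HelloWorld.msi" />
  </exec>
</target>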
I hope this explanation has clarified things for you.
The information in this update has been distilled from this document in the NAnt documentation
A separate build machine is not necessarily required, but it's definitely recommended.
You'll want to look into using the following tools:
CruiseControl .NET
TFS Plugin for CC.NET