I have defined a CCNet queue which is shared by several CI projects (about 10).
The queue fills up and I can see pending jobs in CCTray and in the dashboard, which are correctly ordered based on the priorities I have defined. However, the queue is emptied as soon as the first job completes, and the dashboard activity of all jobs that were "Pending" returns to "Sleeping".
I do not have triggers between projects - the only trigger I am using in any project is for starting a build if the source repository is updated.
<queue name="myQ" duplicates="ApplyForceBuildsReplace" />
<cb:scope ProjectName="My project">
<project category="MyProjects" name="$(ProjectName)" queue="myQ" queuePriority="1" webURL="$(WebUrl)" workingDirectory="c:\my_project\work" artifactDirectory="c:\my_project\log" >
<triggers>
<intervalTrigger name="continuous integration" seconds="120" buildCondition="IfModificationExists"/>
</triggers>
<sourcecontrol type="filtered">
<sourceControlProvider type="vsts">
<server>$(TeamProjectCollectionUrl)</server>
<project>$/MyProject</project>
<workingDirectory>c:\my_project_work_tfs</workingDirectory>
<workspace>my_project_work_tfs</workspace>
<deleteWorkspace>false</deleteWorkspace>
</sourceControlProvider>
</sourcecontrol>
<tasks>
<exec>
[invokes NAnt...]
</exec>
</tasks>
</project>
</cb:scope>
Three of my projects have a priority of 1 and the others have a priority of 2.
I am using CCNet 1.8.3.0.
Can anyone help? Thanks.
From your question I read that you expect the remaining projects to build after the first one is triggered, yes? In that case you need to include a project trigger in every project that should build as soon as its dependency has built.
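For illustration, a project trigger in each dependent project's ccnet.config looks roughly like this (the project name, status and inner trigger settings below are placeholders, not taken from your config):
<triggers>
<!-- fires when the named upstream project finishes a successful build -->
<projectTrigger project="My Upstream Project">
<triggerStatus>Success</triggerStatus>
<innerTrigger>
<intervalTrigger name="dependency check" seconds="30" buildCondition="ForceBuild" />
</innerTrigger>
</projectTrigger>
</triggers>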
I am trying to deploy approval processes to customer Production orgs. It is difficult to create the approvals manually in each customer org. Does anybody know whether this can be done with ANT/Eclipse? Thanks in advance.
If you want to use Ant for deploying SF apps you also need the Ant plugin, ant-salesforce.jar. You can then use the following script in build.xml:
<project name="ExampleProject" default="ExampleProject" basedir="." xmlns:sf="antlib:com.salesforce">
<taskdef resource="com/salesforce/antlib.xml" classPath="ant-salesforce.jar" uri="antlib:com.salesforce"/>
<target name="ExampleProject">
<sf:deploy deployRoot="Application/src" username="your_sf_username" password="your_sf_password_and_token" serverurl="https://login.salesforce.com" runAllTests="true" rollbackOnError="true" />
</target>
</project>
(It's just an example, not a working script.)
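If the goal is specifically to push approval processes, the folder referenced by deployRoot also needs a package.xml that lists them. A rough sketch (the object name, process name and API version below are made up for illustration):
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
<types>
<!-- approval process members are named ObjectName.Approval_Process_Name -->
<members>Opportunity.Example_Approval_Process</members>
<name>ApprovalProcess</name>
</types>
<version>27.0</version>
</Package>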
You can also use Eclipse with the SF plugin, which allows you to deploy SF apps and parts of them (including approval processes) to sandboxes. This plugin also has a lot of other useful functions.
Hi everyone, I am using Eclipse, Subclipse, and ANT. I would like to generate a build manifest with the files that have changed, been added, updated, or deleted in the repo (with the individual version numbers on my current system).
<propertyfile file="${dist.dir}\deploymentManifest.txt"
comment="This file is automatically generated - DO NOT EDIT">
<entry key="buildtime" value="${builtat}"/>
<entry key="build" value="${svnversion}"/>
<entry key="version" value="${version}"/>
<entry key="systemLocation" value="${directory/filename.ext}"/>
</propertyfile>
How do I pull that information from the files in Eclipse? Or how do I use ANT to retrieve this info?
Thanks,
Frank
Well, ${builtat} could be taken from the <tstamp> task in Ant. The others could be parsed by doing an svn log --xml and then reading the resulting XML with an <xmlproperty> task. Right off the top of my head (i.e. no error checking):
<!-- Gets the Time Stamp -->
<tstamp>
<format property="buildtat" pattern="MM/dd/yyyy HH:MM"/>
</tstamp>
<!-- Generates the revision information you need-->
<exec
executable="svn"
output="${svn.log.file}">
<arg line="log --xml -rHEAD/>
</exec>
<!-- Reads that information into a Property -->
<xmlproperty file="${svn.log.file}"/>
<echo message="Subversion Rev: ${log.logentry{revision}}"/>
However, I'd recommend you look at a continuous build system like Jenkins. Whenever you make a change in your Subversion repository, Jenkins picks up the change and automatically does a new build. Not only does this allow you to verify that your changes don't break your build, but Jenkins can do other things too like run JUnit tests. Jenkins then stores your build and the results of your tests and the whole build log in an easy to get to HTML page.
Where Jenkins will work for you is that Jenkins automatically exposes such things as the Subversion Revision as part of the build process. You can fetch the Subversion Revision, the Jenkins build number, the name of the Jenkins project and many other things as environment variables. Then, you could do this:
<property environment="env"/>
<propertyfile file="${dist.dir}\deploymentManifest.txt"
comment="This file is automatically generated - DO NOT EDIT">
<entry key="buildtime" value="${env.BUILD_ID}"/>
<entry key="build" value="${env.SVN_REVISION}"/>
<entry key="version" value="${BUILD_NUMBER}"/>
<entry key="systemLocation" value="${directory/filename.ext}"/>
</propertyfile>
Take a look at Jenkins. It's fairly easy to understand and use.
It should take you about 5 minutes to download and maybe 10 minutes on a Linux system to get up and running. Windows is more complex and might take as long as 15 to 20 minutes to get up and running. You can run it on your desktop system for now, and play around with it.
It should take you maybe another half hour to figure out how to setup a project that can automatically do builds whenever someone does a commit.
Jenkins is web based, but comes with its own lightweight web application engine. All you need is Java 1.6 to run it. (And, if you're using Eclipse, you should already have that.)
I've built a set of generic deployment scripts which work great for the majority of our stuff. We've just however introduced our largest project to the setup and we're now finding times are far too varied and long for our liking.
The project as it stands is 33,226 files at a size of 400 MB plus. Deployments are currently taking between 13 minutes and 55 minutes (the last deployment's time) depending on certain decisions made by ANT (more below).
In terms of the steps, we currently do the following on 2 servers:-
1) ANT exports the project from SVN to both servers (made up of 3 parts).
2) It begins to shut down the Web Services on Server #1.
This was the workaround we put in to stop Windows (2003) file locking from failing the deployment.
3) ANT runs a "move" task on the current version (all parts) into temporary folders & moves the exported new version into its place.
4) Customised deployment code is run - one part being to move permanant features from the temporary folder into the new (i.e system files / Web Server Admin tools).
5) Delete the temporary folder.
6) Bring the Web Services back online
... rinse and repeat for the 2nd server steps 2 -> 6.
7) Save the ANT logs.
The main issue I'm having is that the ANT move task seems to make one of two decisions. It either:
a) Very simply swaps the versions over and moves on - taking a minute or two to handle it, or
b) Goes through some kind of integrity check where it moves every file and folder individually from one place to the other. This floods the logs and takes a fair length of time to complete, hence the 40+ minutes extra added on.
I can't find anything online that explains what causes ANT/OS to make that decision. Option A would be the ideal full-time situation.
I've tried copy and delete separately. I've tried the sync task. All seem to have this slow performance.
So really I'm asking what others with more experience than me do with deployments of this scale. Do you have any hints / tips on how I could improve / speed up this process? Any ideas what the move is doing and if there is maybe a better way of doing all this?
Thanks a lot,
James
Thanks for the input all.
Just to add an answer to this one, I've made the following changes which seem to have knocked a few minutes off it.
The first one is that I've changed how the swapping happens, off the back of the comment I mentioned before. It seems that ANT will try to do the following :-
"If the target directory does not already exist, Ant will do a rename of the directory. But if the target directory exists, it instead does a copy into the directory and delete from the source directory instead."
I think what's happened is that ANT was trying to put the new version in before the old version had been completely removed. So instead of trying to rename over the old version, I now move it into a temporary folder and delete that at the end of the build. That seems to have stabilised things on that front.
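A rough sketch of that ordering, with placeholder paths, in case it helps anyone else:
<!-- Rename the live version aside first; the target doesn't exist, so this is a fast rename -->
<move file="c:\webroot\myapp" tofile="c:\deploy\old_version_tmp"/>
<!-- The live path is now free, so the exported new version can also be renamed into place -->
<move file="c:\deploy\new_version" tofile="c:\webroot\myapp"/>
<!-- ...custom deployment steps run here... -->
<!-- Only delete the old copy at the very end of the build -->
<delete dir="c:\deploy\old_version_tmp"/>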
A few other things I've added to make ANT a bit smart :-)
1) I've set it up so ANT will not deploy any part of the build that is the same as what currently exists. So if part 1 is selected and that's on the Test environment already then it's removed from the build and SVN exports.
2) With the service shutdown / startup I've got ANT reading the response that comes back. If a service says it's already started when called, as sometimes happens when a service relies on others and Windows automatically boots them up, I've told ANT not to hang around and to move on to the next one (there's a rough sketch of that check below).
Little steps like that seem to have improved things by a fair bit. I'd still like to try and get more out of them, but these have certainly been a big step.
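For what it's worth, the service check from point 2 looks something like this (the service name and property names are just examples, and the exact Windows message text may vary):
<!-- Try to start the service and capture what Windows prints back -->
<exec executable="net" outputproperty="svc.start.output" resultproperty="svc.start.result" failonerror="false">
<arg value="start"/>
<arg value="My Web Service"/>
</exec>
<!-- Treat "already been started" as success; otherwise a non-zero result fails the build -->
<fail message="Service did not start: ${svc.start.output}">
<condition>
<and>
<not><equals arg1="${svc.start.result}" arg2="0"/></not>
<not><contains string="${svc.start.output}" substring="already been started"/></not>
</and>
</condition>
</fail>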
Thanks again,
James
We faced this problem of the auto-deployment script taking a long time in our organization too. Initially we had our script running sequentially: cleaning, stopping the Tomcats, updating, starting the Tomcats and making sure that all webapps were properly deployed.
So we did the following:
1. Clean and stop all Tomcats in parallel
2. Do an svn switch
3. Start all Tomcats in parallel
4. Make sure all webapps are deployed properly using JMX
Here is the piece of code:
<target name="all_clean_parallel">
<parallel>
<antcall target="x1_clean"/>
<antcall target="x2_clean"/>
<antcall target="x3_clean"/>
</parallel>
</target>
<target name="all_start_parallel">
<parallel>
<antcall target="x1_start"/>
<antcall target="x2_start"/>
<antcall target="x3_start"/>
</parallel>
</target>
And the piece of code to check whether a webapp is deployed properly or not with the help of JMX (the <if>/<var> tasks come from ant-contrib and <jmx:equals> from a JMX Ant task library such as Tomcat's):
<macrodef name="mStatus">
<attribute name="aModule" />
<attribute name="aHost" default="localhost"/>
<attribute name="aPort" default="9012"/>
<attribute name="aMaxWait" default="240"/>
<attribute name="aTomcat" default=""/>
<attribute name="aState" default="1"/>
<sequential>
<waitfor maxwait="@{aMaxWait}" maxwaitunit="second" timeoutproperty="@{aHost}.@{aTomcat}.@{aModule}.@{aPort}.server.timeout" >
<and>
<jmx:equals
host="@{aHost}"
port="@{aPort}"
ref="@{aHost}.@{aTomcat}.@{aModule}.@{aPort}"
name="Catalina:j2eeType=WebModule,name=//localhost/@{aModule},J2EEApplication=none,J2EEServer=none"
attribute="state"
value="@{aState}"
/>
</and>
</waitfor>
<if>
<equals arg1="${@{aHost}.@{aTomcat}.@{aModule}.@{aPort}.server.timeout}" arg2="true" />
<then>
<var name="failBuild" value="true"/>
<echo message="*************************Host.Tomcat.Module = @{aHost}.@{aTomcat}.@{aModule} is not deployed into the tomcat" />
</then>
<else>
<echo message="@{aHost}.@{aModule} is deployed into the tomcat" />
</else>
</if>
</sequential>
</macrodef>
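Calling the macro for each webapp and then failing the build if anything timed out looks like this (the module, Tomcat and port values are examples):
<mStatus aModule="mywebapp" aTomcat="tomcat1" aHost="localhost" aPort="9012"/>
<!-- failBuild is set by the macro whenever a waitfor times out -->
<fail message="One or more webapps were not deployed properly">
<condition>
<equals arg1="${failBuild}" arg2="true"/>
</condition>
</fail>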
I've been looking into TFS 2010's new build and deployment features with MSDeploy. So far everything is going well (although it's been hard to find information about specific scenarios).
Can I modify my Build Definition to specify 2 or more servers to deploy to? What I need to do is deploy to multiple servers (as I have two in my testing environment which uses a NLB).
What I have now is a Build definition which Builds, runs my tests, and then Deploys to ONE of my testing servers (which has the MsDeployAgentService running on it). It works fine, and each web project is deployed as configured in its project file. The MSBuild Arguments I use are:
* /p:DeployOnBuild=True
* /p:DeployTarget=MsDeployPublish
* /p:MSDeployServiceURL=http://oawww.testserver1.com.au/MsDeployAgentService
* /p:CreatePackageOnPublish=True
* /p:MsDeployPublishMethod=RemoteAgent
* /p:AllowUntrustedCertificate=True
* /p:UserName=myusername
* /p:Password=mypassword
NB: I don't use /p:DeployIISAppPath="xyz" as it doesn't deploy all my projects and it overrides my project config.
Can I add another build argument to get it to call more than one MSDeployServiceURL? Something like a second /p:MSDeployServiceURL argument that specifies another server?
Or do I have to look for another solution, such as editing the WF?
I saw an almost exact same question here posted 2 months ago: TFS 2010 - Deploy to Multiple Servers After Build , so it doesn't look like I'm the only one trying to solve this.
I also posted on the IIS.NET forums where MSDeploy is discussed: http://forums.iis.net/t/1170741.aspx . It's had quite a lot of views, but again, no answers.
You don't have to build the project twice to deploy to two servers. The build process will build a set of deployment files. You can then use the InvokeProcess to deploy to multiple servers.
First create a variable named ProjectName. Then add an Assign activity to the "Compile the Project" sequence. This is located in the "Try to Compile the Project" sequence. Here are the properties of the Assign:
To: ProjectName
Value: System.IO.Path.GetFileNameWithoutExtension(localProject)
Here are the properties of our InvokeProcess activity that deploys to the test server:
Arguments: "/y /M:<server> /u:<domain>\<user> /p:<password>"
FileName: String.Format("{0}\{1}.deploy.cmd", BuildDetail.DropLocation, ProjectName)
You will need to change <server>, <domain>, <user>, and <password> to the values that reflect your environment.
If you need to manually deploy to a server you can run the command below from your build folder:
deploy.cmd /y /M:<server> /u:<domain>\<user> /p:<password>
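For the second NLB node you would simply add another InvokeProcess activity to the workflow with the same FileName and the other machine in the arguments, e.g. (the server name is a placeholder):
Arguments: "/y /M:testserver2 /u:<domain>\<user> /p:<password>"
FileName: String.Format("{0}\{1}.deploy.cmd", BuildDetail.DropLocation, ProjectName)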
I couldn't find the solution I was looking for, but here's what I came up with in the end.
I wanted to keep the solution simple and configurable within the TFS arguments while at the same time staying in line with the already provided MSBuildArguments method which has been promoted a lot. So I created a new Build Template, and added a new TFS WorkFlow Argument called MSBuildArguments2 in the Arguments tab of the WorkFlow.
I searched through the BuildTemplate WorkFlow for all occurrences of MSBuildArguments (there were two).
The two tasks that use MSBuildArguments are called Run MSBuild for Project. Directly below this task, I added a new "If" block with the condition:
Not String.IsNullOrEmpty(MSBuildArguments2)
I then copied the "Run MSBuild for Project" task and pasted it into the new If's "Then" block, updating its title accordingly. You'll also need to update the new task's CommandLineArguments property to use your new argument.
CommandLineArguments = String.Format("/p:SkipInvalidConfigurations=true {0}", MSBuildArguments2)
After these modifications, the WorkFlow looks like this:
Save and Check In the new WorkFlow. Update your Build Definition to use this new WorkFlow, then in the build definition's Process tab you will find a new section called Misc with the new argument ready to be used. Because I'm simply using this new argument for deployment, I copied the exact same arguments I used for MSBuild Arguments and updated the MSDeployServiceURL to my second deployment server.
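For completeness, the value of MSBuildArguments2 ends up being the same argument string with the second server's URL swapped in (the server name below is a placeholder):
/p:DeployOnBuild=True /p:DeployTarget=MsDeployPublish /p:MSDeployServiceURL=http://oawww.testserver2.com.au/MsDeployAgentService /p:CreatePackageOnPublish=True /p:MsDeployPublishMethod=RemoteAgent /p:AllowUntrustedCertificate=True /p:UserName=myusername /p:Password=mypassword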
And that's that. I suppose a more elegant method would be to convert MSBuildArguments into an array of strings and then loop through them during the WorkFlow process. But this suits our 2 server requirements.
Hope this helps!
My solution to this is a new Target that runs after Package. Each project that needs to produce a package includes this targets file, and I chose to make the Include conditional on an externally-set "DoDeployment" property. Additionally each project defines the DeploymentServerGroup property so that the destination server(s) are properly filtered depending on what kind of project it is.
As you can see towards the bottom, I'm simply executing the command file with the server list - pretty simple. (There's a sketch of how the projects and build definition wire this up after the targets file below.)
<!--
This targets file allows a project to deploy its package.
It is used by all project types and is conditionally included from the project file.
-->
<UsingTask TaskName="Microsoft.TeamFoundation.Build.Tasks.BuildStep" AssemblyFile="$(TeamBuildRefPath)\Microsoft.TeamFoundation.Build.ProcessComponents.dll" />
<!-- Each Server needs the Group metadatum, either Webservers, Appservers, or Batch. -->
<Choose>
<When Condition="'$(Configuration)' == 'DEV'">
<ItemGroup>
<Servers Include="DevWebServer">
<Group>Webservers</Group>
</Servers>
<Servers Include="DevAppServer">
<Group>Appservers</Group>
</Servers>
</ItemGroup>
</When>
<When Condition="'$(Configuration)' == 'QA'">
<ItemGroup>
<Servers Include="QAWebServer1">
<Group>Webservers</Group>
</Servers>
<Servers Include="QAWebServer2">
<Group>Webservers</Group>
</Servers>
<Servers Include="QAAppServer1">
<Group>Appservers</Group>
</Servers>
<Servers Include="QAAppServer2">
<Group>Appservers</Group>
</Servers>
</ItemGroup>
</When>
</Choose>
<!-- DoDeployment can be set in the build definition -->
<Target Name="StartDeployment" AfterTargets="Package">
<PropertyGroup>
<!-- The _PublishedWebsites area -->
<PackageLocation>$(WebProjectOutputDir)_Package</PackageLocation>
<!-- Override for local testing -->
<PackageLocation Condition="$(WebProjectOutputDirInsideProject)">$(IntermediateOutputPath)Package\</PackageLocation>
</PropertyGroup>
<Message Text="Tier servers are #(Servers)" />
<!-- A filtered list of the servers. DeploymentServerGroup is defined in each project that does deployment -->
<ItemGroup>
<DestinationServers Include="#(Servers)" Condition="'%(Servers.Group)' == '$(DeploymentServerGroup)'" />
</ItemGroup>
<Message Text="Dest servers are #(DestinationServers)" />
</Target>
<!-- Only perform the deployment if any servers fit the filters -->
<Target Name="PerformDeployment" AfterTargets="StartDeployment" Condition="'#(DestinationServers)' != ''">
<Message Text="Deploying $(AssemblyName) to #(DestinationServers)" />
<!-- Fancy build steps so that they better appear in the build explorer -->
<BuildStep
TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Message="Deploying $(AssemblyName) to #(DestinationServers)...">
<Output TaskParameter="Id" PropertyName="StepId" />
</BuildStep>
<!-- The deployment command will be run for each item in the DestinationServers collection. -->
<Exec Command="$(AssemblyName).deploy.cmd /Y /M:%(DestinationServers.Identity)" WorkingDirectory="$(PackageLocation)" />
<BuildStep
TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Id="$(StepId)"
Status="Succeeded"
Message="Deployed $(AssemblyName) to #(DestinationServers)"/>
<OnError ExecuteTargets="MarkDeployStepAsFailed" />
</Target>
<Target Name="MarkDeployStepAsFailed">
<BuildStep
TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Id="$(StepId)"
Status="Failed" />
</Target>
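To give an idea of the wiring (the file name below is illustrative, not part of the build template): each deploying project declares its group and imports the targets file conditionally, and the build definition switches deployment on with MSBuild arguments such as /p:DoDeployment=true /p:Configuration=QA.
<!-- In each web/app project file; "Deployment.targets" is a placeholder name -->
<PropertyGroup>
<DeploymentServerGroup>Webservers</DeploymentServerGroup>
</PropertyGroup>
<Import Project="Deployment.targets" Condition="'$(DoDeployment)' == 'true'" />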
I am the author of the other similar post. I have yet to find a solution. I believe it is going to be modifying the workflow to add a postprocessing MSBUILD -sync task. That seems to be the most elegant, but was still hoping to find something a bit less intrusive.
I'm not sure if that could help you with TFS 2010, but I have a blog post for TFS 2012: Multiple web projects deployment from TFS 2012 to NLB enabled environment.
In my org, we are planning to go with NAnt for .NET web applications. Source control is TFS, Visual Studio 2008. I would like to know how to do builds with NAnt, how to create an MSI and deploy the application using NAnt, and whether a separate build machine is required to do builds with NAnt. Somebody please help me out; I need a step-wise process. Thanks in advance.
Thanks
Shanthi
For a step-by-step guide to using NAnt I suggest referring to the NAnt project documentation for the fundamental concepts. Once you are familiar with its basic usage I suggest investigating the NAntContrib project to obtain more build tasks.
One part of your question that I would like to address directly here is whether a separate machine is required to use NAnt. NAnt does not strictly require a separate machine; however, a separate machine might be beneficial if your build process is automated or particularly intensive.
[Update]
In response to comment from OP:
NAnt views the build process as a series of individual tasks to be performed as part of a target. The normal process for building an application would be to invoke a compiler on the source files in order to produce a binary; NAnt has a number of tasks that invoke language compilers.
In this example I will invoke the C# language compiler (csc.exe) using the csc task in an NAnt build file for a Hello World application that consists of a single source file named HelloWorld.cs.
<?xml version="1.0"?>
<project name="Hello World" default="build" basedir=".">
<property name="debug" value="true" overwrite="false" />
<target name="build" description="compiles the source code">
<csc target="exe" output="HelloWorld.exe" debug="${debug}">
<sources>
<include name="HelloWorld.cs" />
</sources>
</csc>
</target>
</project>
Let's examine this XML:
<project name="Hello World" default="build" basedir=".">
Things to Note:
The value of the default attribute is "build". This means that the target named "build" will be invoked if no other target is specified.
This is the build target; as its description states, it compiles the source code. To do this the csc task is used. The csc task has a number of options, including:
target: This specifies the type of binary the target will produce. In this case an executable will be produced
output: specifies the name of the executable file that will be created
debug: The value of this option comes from the debug property (defined above as true), which determines whether the compiler produces an executable that contains debugging information
sources & include: specify the source files that the compiler will parse in order to produce the executable
As you can see, the actions necessary to build the source code are defined in a target. A build file can define many targets, each of which can call many tasks. To produce an MSI file you would invoke a task that produces an MSI file; unfortunately, as I don't actually use NAnt regularly, you will have to do some research to find one, although I have a feeling the NAntContrib project includes one given how common it is to produce an MSI.
I hope this explanation has clarified things for you.
The information in this update has been distilled from this document in the NAnt documentation
A separate build machine is not necessarily required, but it's definitely recommended.
You'll want to look into using the following tools (a rough sketch of how they fit together follows the list):
CruiseControl .NET
TFS Plugin for CC.NET
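Once those are installed, a ccnet.config project that watches TFS and hands the build off to a NAnt build file looks roughly like this (the server URL, paths and target names below are placeholders):
<project name="MyWebApp">
<!-- the vsts source control block comes from the TFS plugin -->
<sourcecontrol type="vsts">
<server>http://tfsserver:8080/tfs/DefaultCollection</server>
<project>$/MyWebApp</project>
<workingDirectory>c:\builds\MyWebApp\source</workingDirectory>
</sourcecontrol>
<triggers>
<intervalTrigger seconds="120" buildCondition="IfModificationExists"/>
</triggers>
<tasks>
<!-- CC.NET's nant task runs your build file's targets -->
<nant>
<executable>c:\tools\nant\bin\nant.exe</executable>
<baseDirectory>c:\builds\MyWebApp\source</baseDirectory>
<buildFile>default.build</buildFile>
<targetList>
<target>build</target>
</targetList>
</nant>
</tasks>
</project>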