msbuild not able to write to files - sql-server-2008-r2

I'm trying to create a build process using CruiseControl.NET 1.8.5.0 with TFS 2010, running on a Windows Server 2008 R2 machine. The problem I'm running into is that when MSBuild tries to write to files that were just copied into the project's working area, it gets access denied. When I look at the files they are set to read-only. The account that CCNet runs as is an admin on the box, and everything is reading and writing locally. We have a similar environment set up on Windows Server 2003 and everything works just fine. We've verified that the account running the process is correct, I've set the owner of the projects folder to that same account, and we've disabled UAC. At this point I'm at a loss. If any additional information is needed, let me know.
Thanks
Robert

//When I look at the files they are set to read only, //
You can run the attrib command to remove the read-only flag from the files.
quick example:
<Exec Command="attrib -R $(SolutionRoot)\MyCoolFile.txt" />
There is also a custom task:
<UsingTask AssemblyFile="$(MSBuildCommunityTasksLib)" TaskName="MSBuild.Community.Tasks.Attrib" />
I've not used it, but that would be enough to hunt it down.
I'd guess it would look like this:
<ItemGroup>
<Files Include="$(SolutionRoot)\MySubFolder\**\*.*" />
</ItemGroup>
<Attrib Files="%(Files.Identity)" ReadOnly="false" />
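Put together, a minimal sketch, assuming the MSBuild Community Tasks are installed and that $(SolutionRoot) points at the CCNet working folder (the ClearReadOnly target name is made up):
<UsingTask AssemblyFile="$(MSBuildCommunityTasksLib)" TaskName="MSBuild.Community.Tasks.Attrib" />
<Target Name="ClearReadOnly" BeforeTargets="Build">
  <ItemGroup>
    <ReadOnlyFiles Include="$(SolutionRoot)\**\*.*" />
  </ItemGroup>
  <!-- ReadOnly="false" clears the read-only flag that the TFS copy leaves on the files -->
  <Attrib Files="@(ReadOnlyFiles)" ReadOnly="false" />
</Target>
Alternatively, running attrib -R *.* /S from an Exec task before the compile achieves the same thing without taking the community-tasks dependency.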

Related

Any way to run commands/script after VS2015 Publish wizard has copied files to output?

I have a very simple Windows service project I want to deploy to a server using Visual Studio 2015. I can successfully deploy using the Publish wizard (right-click on project -> Publish and deploy to \\myserver\c$\somepath\), but I need to 1) stop the service before publishing (so that the executable can be replaced), and 2) restart the service after the files have been copied.
I know how to start/stop services from the command line, and this answer provides a way to do it directly in a build action. However, I can't seem to find a way to execute any action after VS has copied the files to the output directory on the server.
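For reference, the build-action version of that stop/start approach might look roughly like the sketch below (MyService and myserver are placeholder names, not from the question):
<Target Name="StopRemoteService">
  <!-- sc.exe can address a remote machine; ContinueOnError covers the case where the service is not currently running -->
  <Exec Command="sc \\myserver stop MyService" ContinueOnError="true" />
</Target>
<Target Name="StartRemoteService">
  <Exec Command="sc \\myserver start MyService" ContinueOnError="true" />
</Target>
The remaining problem is finding a target that actually fires after the files have been copied.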
For example, I have tried adding the following to my .csproj file without luck:
<Target Name="Mytarget" AfterTargets="AfterPublish">
<Warning Text="After AfterPublish" />
</Target>
Mytarget executes before VS actually copies the files to the server, so evidently, I can't hook onto AfterPublish. I've also tried PipelinePreDeployCopyAllFilesToOneFolder, CopyAllFilesToSingleFolderForPackage, and MSDeployPublish without luck (these don't seem to execute at all).
My end goal is to allow more-or-less one-click updating of the service, without having to log on to remote desktop and run a script manually after each update.
Is there any way I can have VS automatically execute an action after publishing a Windows service project to a server?

VS2015: recursively adding external content directories to AppX

I'm trying to add a folder and its subfolders (~4000 files) as content to a C++ Windows Store app (in VS2015).
Here's the scenario:
G:\Game -> is the build directory
D:\data -> holds the original content
I've read there are some methods to declare external content in the .vcxproj file, like this:
<ItemGroup>
<Content Include="D:\**">
<Link>%(RecursiveDir)%(FileName)%(Extension)</Link>
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
<DeploymentContent>true</DeploymentContent>
</Content>
</ItemGroup>
This actually copies the contents of D:\data into the build directory (G:\Game). This is great, since the program can now be run & debugged. BUT: as soon as I deploy the project to the AppX folder (G:\Game\AppX), the data folder doesn't get deployed there.
G:\Game\game.exe
G:\Game\data\...
G:\Game\AppX
G:\Game\AppX\game.exe
(G:\Game\AppX\data\... - missing)
Any clues?
After fiddling around for days, as of now I can state there is no way to do this properly in the Visual C++ IDE (2012 / 2015) (it seemed to work with C# projects, though).
The only ways to achieve what I wanted are:
a post-build event using robocopy to copy/sync the data over to the AppX folder (see the sketch below), or
writing a script for the packaging / signing using MakeAppX.exe, SignTool.exe and 7-zip.
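A minimal sketch of the robocopy route, expressed as an MSBuild target rather than the IDE's post-build event box; the target name, the AfterTargets hook, and the assumption that $(OutDir) is the G:\Game build directory are mine, and IgnoreExitCode is set because robocopy returns exit code 1 when files were copied:
<Target Name="CopyDataToAppX" AfterTargets="Build">
  <!-- /MIR mirrors D:\data into the AppX layout folder -->
  <Exec Command="robocopy &quot;D:\data&quot; &quot;$(OutDir)AppX\data&quot; /MIR" IgnoreExitCode="true" />
</Target>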

Nant script - How to check if it runs on a server or on a dev machine?

I have a NAnt build script.
It may be executed on either a dev machine or the build server.
I want to skip a build step on dev machines but still run it on the server.
How can I check if the script runs on a server or on a dev machine?
It's all about adding something special to the build server. I can think of two options off the top of my head:
Environment variable: just make sure the build server has a specific environment variable and check for its existence in your NAnt script (see the sketch after the include example below)
Special environment.include file: make sure the build server has a special environment.include file in the root of the C drive, for example, and define build-server-specific NAnt properties there, like <property name="is.build.server" value="true" />.
You can include this kind of file like this:
<include buildfile="\Environment.include" if="${file::exists('\Environment.include')}"/>
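For the first option, a minimal NAnt sketch (the BUILD_SERVER variable name and the target name are made up) could look like this:
<property name="is.build.server" value="${environment::variable-exists('BUILD_SERVER')}" />
<!-- this step only runs when the variable is present, i.e. on the build server -->
<target name="server-only-step" if="${is.build.server}">
<echo message="Running the build-server-only step" />
</target>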

How to modify the csdef defined in a cspkg

To deploy to different Azure environments, I modify the csdef as part of the compilation step to change the host headers. Doing so requires building the cspkg once for each environment, instead of being able to reuse the cspkg and just specify different configs for deployment.
I would like to instead modify the csdef file of a cspkg after it has been created, without recompiling. Is that possible, and if so how?
I've done something similar to what you're after to differentiate between test and live environments.
First of all, you need to create a new .csdef file that you want to use for your alternate settings. This needs to be the complete file, as we're just going to swap it out with the original one.
Now we need to add this to the cloud project. Right-click on the cloud project and select Unload Project. Right-click on it again and select Edit [Name of project]. There's a section that looks a bit like this:
<ItemGroup>
<ServiceConfiguration Include="ServiceConfiguration.Test.cscfg" />
<ServiceDefinition Include="ServiceDefinition.csdef" />
<ServiceConfiguration Include="ServiceConfiguration.cscfg" />
</ItemGroup>
Add a new ServiceDefinition item that points to your newly created file. Now find the following line:
<Import Project="$(CloudExtensionsDir)Microsoft.WindowsAzure.targets" />
Then add this code block, editing the TargetProfile check to be the build configuration you want to use for your alternate definition, and ensuring that it points to your new .csdef file:
<Target Name="AfterResolveServiceModel">
<!-- This should be run after it has figured out which definition file to use
but before it's done anything with it. This is all a bit hard coded, but
basically it should remove everything from the SourceServiceDefinition
item and replace it with the one we want if this is a build for test-->
<ItemGroup>
<!-- This is an interesting way of saying remove everything that is in me from me-->
<SourceServiceDefinition Remove="@(SourceServiceDefinition)" />
<TargetServiceDefinition Remove="@(TargetServiceDefinition)" />
</ItemGroup>
<ItemGroup Condition="'$(TargetProfile)' == 'Test'">
<SourceServiceDefinition Include="ServiceDefinition.Test.csdef" />
</ItemGroup>
<ItemGroup Condition="'$(TargetProfile)' != 'Test'">
<SourceServiceDefinition Include="ServiceDefinition.csdef" />
</ItemGroup>
<ItemGroup>
<TargetServiceDefinition Include="@(SourceServiceDefinition->'%(RecursiveDir)%(Filename).build%(Extension)')" />
</ItemGroup>
<Message Text="Source Service Definition Changed To Be: @(SourceServiceDefinition)" />
</Target>
To go back to normal, right-click on the project and select Reload Project. Now when you build your project, depending on which configuration you use, it will use a different .csdef file. It's worth noting that the settings editor in Visual Studio is not aware of your second .csdef file, so if you add any new settings through the GUI you will need to add them manually to this alternate version.
If you just want a different CSDEF, you can do it easily by using CSPACK from the command prompt directly, as below:
Open a command window and locate the folder containing your CSDEF/CSCFG files and the CSX folder related to your Windows Azure project
Create multiple CSDEF files, depending on your minor changes
Be sure to have the Windows Azure SDK on your path so you can launch the CS* commands
Use the CSPACK command and pass parameters to use a different CSDEF and output CSPKG file, something similar to the below:
cspack <ProjectName>\ServiceDefinitionOne.csdef /out:ProjectNameSame.csx /out:ProjectOne.cspkg /_AddMoreParams
cspack <ProjectName>\ServiceDefinitionTwo.csdef /out:ProjectNameSame.csx /out:ProjectTwo.cspkg /_AddMoreParams
More about CSPACK: http://msdn.microsoft.com/en-us/library/windowsazure/gg432988.aspx
As far as I know, you can't easily modify the .cspkg after it is created. I guess you probably could, technically, as the .cspkg is a zip file that follows a certain structure.
The question I'd ask is why? If it is to modify settings like VM role size (since that's defined in the .csdef file), then I think you have a couple of alternative approaches:
Create a separate Windows Azure deployment project (.csproj) for each variation. Yes, I realize this can be a pain, but it does allow the Visual Studio tooling to work well. The minor pain may be worth it for the easier-to-use tool support.
Run a configuration file transformation as part of the build process, similar to a web.config transform (see the sketch below).
Personally, I go with the different .csproj approach, mostly because I'm not a config file transformation ninja . . . yet. ;) It was the path of least resistance and it has worked pretty well so far.
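If you do go the transformation route, a minimal sketch using the TransformXml task (the same XDT engine behind web.config transforms) might look like the following; the assembly path varies by Visual Studio version, and the target name, transform file name, and where you hook it into the packaging pipeline are assumptions:
<UsingTask TaskName="TransformXml" AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v14.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
<Target Name="TransformServiceDefinition">
  <!-- writes the transformed definition to the intermediate folder so the original stays untouched -->
  <TransformXml Source="ServiceDefinition.csdef"
                Transform="ServiceDefinition.$(TargetProfile).transform"
                Destination="$(IntermediateOutputPath)ServiceDefinition.csdef" />
</Target>
You could then point the SourceServiceDefinition swap from the first answer at the transformed file instead of maintaining a full duplicate .csdef.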

FXCop report hosting on Hudson Dashboard Issue

I generated an FxCop analysis report using a NAnt script, but I am unable to host it on the Hudson dashboard.
Using the NAnt script, I am able to generate .xml output. Here is the NAnt target:
<target name="Fxcop">
<echo message="Running Fxcop..." />
<exec program="${fxcop.basedir}\FxCopCmd.exe">
<arg value="/f:Path of my source file" />
<arg value="/out:some path" />
</exec>
</target>
In the Hudson configuration, to display Violations reports, I configured the path (pattern only) of the NAnt output file in the FxCop XML file pattern field.
But Hudson is unable to find it.
I have done the configuration and settings correctly.
Can anyone walk me through where I am going wrong?
Thanks in advance
Most likely the XML is created in a different subfolder, because its relative path is resolved from the current directory. E.g. if your current working directory is %WORKSPACE%\trunk and the relative report path is /out:result\fxcop-result.xml, then it will be created at %WORKSPACE%\trunk\result\fxcop-result.xml.
To fix this, I suggest checking the current directory from which you are executing the FxCop analysis (and also try searching for this XML on the build machine).
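For example, if you stick with the NAnt approach from the question, writing the report to an absolute path under the Hudson workspace sidesteps the working-directory problem; a minimal sketch (the assembly path and the workspace.dir property name are assumptions):
<target name="fxcop">
<property name="workspace.dir" value="${environment::get-variable('WORKSPACE')}" />
<exec program="${fxcop.basedir}\FxCopCmd.exe">
<!-- absolute paths, so the report location no longer depends on the current directory -->
<arg value="/file:${workspace.dir}\bin\MyAssembly.dll" />
<arg value="/out:${workspace.dir}\fxcop-result.xml" />
</exec>
</target>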
The easiest way to implement FxCop analysis in Hudson using a Windows batch command is:
Add an "Execute Windows batch command" build step (this command will be executed from the base workspace folder, e.g. C:\hudson\workspace\FxCopJob)
Specify the command that will execute the analysis, e.g.: "{FxCopDirectory}\fxcopcmd.exe" /file:"%WORKSPACE%\{path to your file}" /directory:"{Assemblies_path}" /rulesetdirectory:"{RuleSetDir}" /out:fxcop-result.xml
Set fxcop-result.xml in the FxCop section of Report Violations (the report will then be created at C:\hudson\workspace\FxCopJob\fxcop-result.xml)
Run the updated Hudson job and verify that the FxCop violations are shown
WBR,
Andrey