As part of a content migration project, I am building content into a CMS on a weekly basis, and I use an Ant script to copy the content files to the build directory. Up until now, we've been wiping the CMS and reloading the whole 17,000-file set every time, which takes about 1.5 hours. But now that the content and the CMS customisations are more stable, we'd like to only upload the content files that have been modified since the previous week.
I can copy files modified since the last time I ran the Ant script using the <modified> selector:
<copy todir="changed" failonerror="no">
    <fileset dir="output" includes="*.*">
        <modified/>
    </fileset>
</copy>
Which works very nicely. However, I would like to be able to load the files that have been modified since the last CMS build that took place on the server. So I was wondering if there was some way of using <modified>'s cache-based approach to only copy the files that have been modified since a given date/time like "17.00 last Thursday" instead of "last time this script was run".
I got the answer I was looking for on the Ant mailing list, where Stefan Bodewig suggested using the update parameter on the modified selector. As I'm using Ant 1.7.1, I had to work around a bug that prevented setting it directly as an attribute; but essentially, by setting it via a property passed on the command line, I can update the cache whenever I do a production build, and leave the cache as it is when I do an intermediate test build.
Here is the code I ended up with (including the workaround for 1.7.1):
<copy todir="\content\Test\" includeEmptyDirs="false" failonerror="no">
    <fileset dir="../Output">
        <modified>
            <param name="update" value="false"/>
        </modified>
    </fileset>
</copy>
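To drive the cache update from the command line as described, one option (the property name here is just an illustration) is to feed the param from a property:

<copy todir="\content\Test\" includeEmptyDirs="false" failonerror="no">
    <fileset dir="../Output">
        <modified>
            <!-- ${modified.cache.update} is set on the command line -->
            <param name="update" value="${modified.cache.update}"/>
        </modified>
    </fileset>
</copy>

Then run ant -Dmodified.cache.update=true for a production build (updating the cache) and -Dmodified.cache.update=false for an intermediate test build (leaving it untouched).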
Would the date selector do the job?
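For reference, the <date> selector can do that for a fixed cut-off, though the timestamp has to be baked into the build file (the datetime below is just an example):

<copy todir="changed" failonerror="no">
    <fileset dir="output" includes="*.*">
        <!-- select only files modified after the given instant -->
        <date datetime="01/15/2009 05:00 PM" when="after"/>
    </fileset>
</copy>

The cache-based <modified> approach has the advantage that the baseline moves automatically each time the cache is updated.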
From what I've gathered, the only change made since the last build in Azure DevOps is the version of this NuGet package.
So either there is a mistake in there (which I am not in a position to investigate) or the problem lies elsewhere in the build task.
[error]f:\WorkB_tool\dotnet\sdk\5.0.102\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.Publish.targets(237,5):
Error MSB3025: The source file "C:\windows\ServiceProfiles\NetworkService.nuget\packages\package\version\staticwebassets\css\open-iconic\FONT-LICENSE" is actually a directory. The "Copy" task does not support copying directories.
The error is clear enough I suppose, but I haven't found a resource on what is causing it or how to fix it.
By adding a file extension (css\open-iconic\FONT-LICENSE.txt) the build could proceed.
However, why this suddenly became an issue still perplexes me.
It seems that there is something wrong with the Copy task in your <packages_id>.props file.
The Copy task works with files rather than folders, so you should use something like this:
<ItemGroup>
    <!-- Collect the individual files instead of handing the Copy task a directory -->
    <File Include="$(MSBuildThisFileDirectory)xxx\staticwebassets\assets\libs\flot-charts\Makefile\*.*" />
</ItemGroup>
<Target Name="xxx" AfterTargets="xxx">
    <Copy SourceFiles="@(File)" DestinationFolder="xxx" />
</Target>
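Note that a *.* glob only matches names containing a dot, so an extension-less entry such as FONT-LICENSE would presumably be skipped by it, which is consistent with the rename to FONT-LICENSE.txt unblocking the build in the question.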
We could also copy the files via the pipeline's Copy Files task.
To deploy to different Azure environments, I modify the .csdef as part of the compilation step to change the host headers. Doing so requires building the .cspkg once for each environment instead of being able to reuse the .cspkg and specify different configs for deployment.
I would instead like to modify the .csdef file of a .cspkg after it has been created, without recompiling. Is that possible, and if so, how?
I've done something similar to what you're after to differentiate between test and live environments. First of all, you need to create a new .csdef file that you want to use for your alternate settings. This needs to be the complete file, as we're just going to swap it out with the original one. Now we need to add this to the cloud project. Right-click on the cloud project and select Unload Project. Right-click on it again and select Edit [Name of project]. There's a section that looks a bit like this:
<ItemGroup>
    <ServiceConfiguration Include="ServiceConfiguration.Test.cscfg" />
    <ServiceDefinition Include="ServiceDefinition.csdef" />
    <ServiceConfiguration Include="ServiceConfiguration.cscfg" />
</ItemGroup>
Add a new ServiceDefinition item that points to your newly created file.
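Assuming the alternate file is named ServiceDefinition.Test.csdef (matching the target below), the section would end up looking something like this:

<ItemGroup>
    <ServiceConfiguration Include="ServiceConfiguration.Test.cscfg" />
    <ServiceDefinition Include="ServiceDefinition.csdef" />
    <!-- the alternate definition added for the Test profile -->
    <ServiceDefinition Include="ServiceDefinition.Test.csdef" />
    <ServiceConfiguration Include="ServiceConfiguration.cscfg" />
</ItemGroup>

Now find the following line: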
<Import Project="$(CloudExtensionsDir)Microsoft.WindowsAzure.targets" />
Then add this code block, editing the TargetProfile check to be the build configuration you want to use for your alternate, and ensuring that it points to your new .csdef file:
<Target Name="AfterResolveServiceModel">
    <!-- This should run after the build has figured out which definition file to use,
         but before it has done anything with it. This is all a bit hard-coded, but
         basically it should remove everything from the SourceServiceDefinition
         item and replace it with the one we want if this is a build for test. -->
    <ItemGroup>
        <!-- This is an interesting way of saying "remove everything that is in me from me" -->
        <SourceServiceDefinition Remove="@(SourceServiceDefinition)" />
        <TargetServiceDefinition Remove="@(TargetServiceDefinition)" />
    </ItemGroup>
    <ItemGroup Condition="'$(TargetProfile)' == 'Test'">
        <SourceServiceDefinition Include="ServiceDefinition.Test.csdef" />
    </ItemGroup>
    <ItemGroup Condition="'$(TargetProfile)' != 'Test'">
        <SourceServiceDefinition Include="ServiceDefinition.csdef" />
    </ItemGroup>
    <ItemGroup>
        <TargetServiceDefinition Include="@(SourceServiceDefinition->'%(RecursiveDir)%(Filename).build%(Extension)')" />
    </ItemGroup>
    <Message Text="Source Service Definition changed to be: @(SourceServiceDefinition)" />
</Target>
To go back to normal, right-click on the project and select Reload Project. Now when you build your project, depending on which configuration you use, it will use different .csdef files. It's worth noting that the settings editor in Visual Studio is not aware of your second .csdef file, so if you add any new settings through the GUI you will need to add them manually to this alternate version.
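For what it's worth, the profile is chosen at packaging time, so a test package would be produced with something along the lines of (project name hypothetical):

msbuild MyCloudProject.ccproj /t:Publish /p:TargetProfile=Test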
If you just want to have a different CSDEF, then you can do it easily by using the CSPack command prompt directly, as below:
Open a command window and locate the folder where you have the CSDEF/CSCFG and CSX folders related to your Windows Azure project.
Create multiple CSDEF files depending on your minor changes.
Be sure to have the Windows Azure SDK in your path to launch the CS* commands.
Use the CSPack command and pass parameters to use a different CSDEF and output CSPKG file, something similar to the below:
cspack <ProjectName>\ServiceDefinitionOne.csdef /out:ProjectNameSame.csx /out:ProjectOne.cspkg /_AddMoreParams
cspack <ProjectName>\ServiceDefinitionTwo.csdef /out:ProjectNameSame.csx /out:ProjectTwo.cspkg /_AddMoreParams
More about CSPACK: http://msdn.microsoft.com/en-us/library/windowsazure/gg432988.aspx
As far as I know, you can't easily modify the .cspkg after it is created. I guess you probably technically could, as the .cspkg is a zip file that follows a certain structure.
The question I'd ask is why? If it is to modify settings like VM role size (since that's defined in the .csdef file), then I think you have a couple of alternative approaches:
Create a separate Windows Azure deployment project (.csproj) for each variation. Yes, I realize this can be a pain, but it does allow the Visual Studio tooling to work well. The minor pain may be worth it for the easier-to-use tool support.
Run a configuration file transformation as part of the build process, similar to a web.config transform (rough sketch below).
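As a rough sketch of that second option (the paths, target hook, and transform file name are all assumptions; TransformXml ships with the Visual Studio web publishing targets):

<UsingTask TaskName="TransformXml"
           AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
<Target Name="TransformServiceDefinition" BeforeTargets="AfterResolveServiceModel">
    <!-- Apply an XDT transform to produce the environment-specific .csdef -->
    <TransformXml Source="ServiceDefinition.csdef"
                  Transform="ServiceDefinition.$(TargetProfile).xdt"
                  Destination="ServiceDefinition.Transformed.csdef" />
</Target>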
Personally, I go with the different .csproj approach. Mostly because I'm not a config file transformation ninja... yet. ;) It was the path of least resistance and it has worked pretty well so far.
At the end of my Clean/Build, I wanted to always automatically copy the project folder into a zip for easy transfer. So I added this to my post-build <target> in build.xml:
<zip zipfile="../project-xyz.zip" basedir=".." includes="project-xyz/**" excludes="*/dir/lib/**"/>
This works great on Windows, but on Linux it leaves out any .hidden folders and all their children. I even tried
<zip zipfile="../project-xyz.zip" basedir=".." includes="project-xyz/**,project-xyz/.hidden/**" excludes="*/dir/lib/**"/>
and it still doesn't work.
What can I do to bring those files into the zip?
I am not opposed to detecting non-Windows environments and using <exec> on the zip command, though I am not sure how I would do that, and I am not sure I really want to, especially if there is a better way!
You can see what gets excluded by default from the zip by adding the following line in Ant:
<defaultexcludes echo="true"/>
And then use
<defaultexcludes add="..."/>
and
<defaultexcludes remove="..."/>
to customize what gets excluded by default.
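For example, if the echoed list turns out to contain a pattern that matches your hidden folders, a minimal sketch would be (the pattern here is illustrative; use whatever the echo actually reports):

<defaultexcludes remove="**/.hidden/**"/>
<zip zipfile="../project-xyz.zip" basedir=".." includes="project-xyz/**" excludes="*/dir/lib/**"/>

Note that the removal is global for the rest of the build, so you may want to add the pattern back afterwards with <defaultexcludes add="..."/>.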
Reference: Ant docs for DefaultExcludes
EDIT
You can also do
<zip defaultexcludes="no" .../>
Reference: Ant docs for Zip
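Applied to the zip task from the question, that would be:

<zip zipfile="../project-xyz.zip" basedir=".." includes="project-xyz/**" excludes="*/dir/lib/**" defaultexcludes="no"/>

This bypasses the default exclude list entirely for this one task instead of editing it globally.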
I generated an FxCop analysis report using an Ant script, but I am unable to host it on the Hudson dashboard.
Using the NAnt script, I am able to generate .xml output. Here is the target:
<target name="Fxcop">
    <echo message="Running FxCop..." />
    <exec command="${fxcop.basedir}\FxCopCmd.exe">
        <arg value="/f:path of my source file" />
        <arg value="/out:some path" />
    </exec>
</target>
In the Hudson configuration, to display Violation reports, I configured the output path from the script (only the pattern) in the FxCop XML file pattern.
But Hudson is unable to find it.
As far as I can tell, I have done the configuration and settings correctly.
Can anyone walk me through where I am going wrong?
Thanks in advance.
Most likely the XML is created in a different subfolder, because its relative path originates from the current working directory. E.g. if your current working directory is %WORKSPACE%\trunk and the relative path for the report is /out:result\fxcop-result.xml, then it will be created in %WORKSPACE%\trunk\result\fxcop-result.xml.
To fix this, I suggest checking the current directory from which you are executing the FxCop analysis (also try searching for this XML on the build machine).
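One way to remove the ambiguity is to make the output path absolute in the build file itself, e.g. (the fxcop.basedir property comes from the question; the output location and Ant's built-in ${basedir} are used here for illustration):

<target name="Fxcop">
    <echo message="Running FxCop..." />
    <exec command="${fxcop.basedir}\FxCopCmd.exe">
        <arg value="/f:path of my source file" />
        <!-- write the report to a fixed, known location -->
        <arg value="/out:${basedir}\fxcop-result.xml" />
    </exec>
</target>

Then point the Hudson violations pattern at fxcop-result.xml relative to the workspace.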
The easiest way to implement FxCop analysis in Hudson using a Windows batch command would be:
Add an "Execute Windows batch command" build step (this command will be executed from the base workspace folder, e.g. C:\hudson\workspace\FxCopJob).
Specify the command that will execute the analysis, e.g.: "{FxCopDirectory}\fxcopcmd.exe" /file:"%WORKSPACE%\{path to your file}" /directory:"{Assemblies_path}" /rulesetdirectory:"{RuleSetDir}" /out:fxcop-result.xml
Set fxcop-result.xml in the FxCop section of Report Violations (e.g. the report will be created in C:\hudson\workspace\FxCopJob\fxcop-result.xml).
Run the updated Hudson job and verify that FxCop violations are shown.
WBR,
Andrey
I have the following target in my NAnt script:
<target name="update" verbose="true">
<copy todir="${dirs.deploy}">
<fileset basedir="${dirs.drop}\_PublishedWebSites\RomanceReminder.Web">
<include name="**/*.*" />
</fileset>
</copy>
</target>
when I run this script manually the following output is visible in the log:
[nant] C:\Projects\RomanceReminder\BuildScripts.Custom\_test_deploy.build
Buildfile: file:///C:/Projects/RomanceReminder/BuildScripts.Custom/_test_deploy.build
Target framework: Microsoft .NET Framework 3.5
Target(s) specified: go
error_check:
stop_w3svc:
cleanup:
[echo] Deleting C:\Webs\Nightly.
update:
[copy] Copying 93 files to 'C:\Webs\Nightly'.
start_w3svc:
go:
BUILD SUCCEEDED
Total time: 2.6 seconds.
As you can see, it moves 93 files into the Webs\Nightly folder.
When this script is run via TeamCity, the copy doesn't happen for some reason. TeamCity is running under an admin account, so it should have all the permissions it needs. The log file from TeamCity shows the exact text above, except the update task shows nothing.
Anyone have ideas on how I can even troubleshoot this?
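For reference, the "bit" flipped in the update below is NAnt's standard verbose task attribute, which logs each file as it is copied:

<copy todir="${dirs.deploy}" verbose="true">
    <fileset basedir="${dirs.drop}\_PublishedWebSites\RomanceReminder.Web">
        <include name="**/*.*" />
    </fileset>
</copy>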
UPDATE: I flipped the bit on the copy task to give verbose logging. and now I see the following in my TeamCity log:
[copy] Copying 0 files to 'C:\Webs\Nightly'.
I am still flummoxed that I can run it from the command line and everything works, but TC doesn't copy files... 8(
User Error User Error User Error
Of course, I was not trusting the tool and assumed it was doing something wrong. The drop directory is only populated in the package step, and this particular script executes before that. TeamCity destroys the build directory every time it runs, including the drop directory. So NAnt was correct: there were no files to copy. I modified my script to use the build output, and all is good with the world.
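For the record, the fix amounted to pointing the fileset at the build output instead of the drop directory, something like (the dirs.build property name is illustrative):

<target name="update" verbose="true">
    <copy todir="${dirs.deploy}">
        <!-- build output is populated before this script runs, unlike the drop dir -->
        <fileset basedir="${dirs.build}\_PublishedWebSites\RomanceReminder.Web">
            <include name="**/*.*" />
        </fileset>
    </copy>
</target>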