When doing a get-latest from TFS, all timestamps are set to the time at which the get operation was executed. When running msdeploy to perform a sync, the timestamps in the source are compared with the timestamps on the target server. Of course, this means that with TFS + msdeploy, every file will be pushed to the target servers after every build, unless:
You use incremental builds, and
You have only a single build agent in the build controller's pool.
If the build definition is set to do Clean builds, or if you want to utilize multiple build agents, then this no longer works.
This topic comes up all the time, and once every couple of years I cast out new lines in case something has changed. This could be fixed in a couple of different ways:
TFS sets timestamps on workspace files to the last checkin time.
TFS sets timestamps on workspace files to the last modified time from the files themselves when they were last checked in.
msdeploy uses some content-based comparison method (e.g. MD5) to compare files, rather than timestamp comparisons.
Something else?
I never know where to go to search for this stuff since both of these teams are pretty opaque--the webdeploy team in particular. Is this a problem that has been solved yet?
The TFS and Visual Studio teams are entirely transparent, and you can submit feature requests through http://visualstudio.uservoice.com and bugs through http://connect.microsoft.com.
However, all files within a server workspace are set to the date the file was last modified on the server. Local workspaces physically compare the file contents to determine changes. You can change from a local workspace to a server workspace in the workspace properties.
In the end, we got around this by writing a PowerShell script to wrap the .cmd file produced by the Web Publishing Pipeline, and passing the -useChecksum flag in the command that invokes the .cmd script. Since the boilerplate .cmd created by WPP allows for passing additional arguments to msdeploy, we were able to accomplish this with a line like the following.
& "MyProject.cmd" /u:agent /p:P#ssw0rd /m:$ComputerName /y -useChecksum
In this way, even though TFS is creating workspaces with timestamps set to the get-latest time, msdeploy is now instructed to use checksums instead.
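For reference, a minimal sketch of such a wrapper script might look like the following. The credentials, server name, and .cmd file name are the same placeholders as in the command above; adjust them to whatever WPP generated for your project.

# Illustrative sketch: wrap the WPP-generated .cmd and force msdeploy to compare by checksum.
param(
    [Parameter(Mandatory = $true)][string]$ComputerName,
    [string]$UserName = "agent",
    [string]$Password = "P#ssw0rd"   # placeholder; read from a secure store in practice
)

# The boilerplate .cmd passes any trailing arguments it does not recognize straight to
# msdeploy, which is how -useChecksum reaches it.
& "$PSScriptRoot\MyProject.cmd" /u:$UserName /p:$Password /m:$ComputerName /y -useChecksum

if ($LASTEXITCODE -ne 0) {
    throw "Deployment to $ComputerName failed with exit code $LASTEXITCODE"
}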
As a DevOps engineer, I want our developers to be able to place SQL scripts in a SQL Server Database Project folder and have the VSTS build agent prepare all scripts committed since the last successful build.
The reason I'm looking for only files since the last successful build is because I only want these scripts to run once. If they are built into a post-deployment script, they will be run every time the database is deployed. Most of these scripts are data changes and not schema changes.
I found this Build last commited [sic] SQL Script VSTS, but the solution applied to a Git repository for latest committed changes instead of a generic solution or TFVC equivalent.
Do I need to look into Visual Studio pre-build events? SQL Server Database Project post-deployment scripts? VSTS build agent task to search for and copy latest files to another location?
If they are built into a post-deployment script, they will be run every time the database is deployed.
The solution to this problem is to make the scripts idempotent. If data needs to be inserted, check if it's there before inserting it.
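For example, a data script can guard each statement so that re-running it is a no-op. A rough sketch follows; the table, key, server, and database names are invented, and it assumes Invoke-Sqlcmd from the SQLPS/SqlServer module is available.

# Illustrative sketch: an idempotent data change that is safe to run on every deployment.
$sql = @"
IF NOT EXISTS (SELECT 1 FROM dbo.AppSetting WHERE SettingKey = 'FeatureXEnabled')
BEGIN
    INSERT INTO dbo.AppSetting (SettingKey, SettingValue)
    VALUES ('FeatureXEnabled', 'true');
END
"@

# Hypothetical server and database names.
Invoke-Sqlcmd -ServerInstance "MyDbServer" -Database "MyDatabase" -Query $sql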
Sounds like what you really want is DbUp; it tracks which scripts have been run and runs them via a console application.
https://dbup.readthedocs.io/en/latest/
https://marketplace.visualstudio.com/items?itemName=johanclasson.UpdateDatabaseWithDbUp
We are using VSTS for CI and CD in my team. We have over 40 repositories, which are separate projects, but all of them have to run the same PowerShell script in one of their build steps.
The PowerShell file is too big to be kept as an inline script, so we need to save it in a file. Obviously, I have a copy of the PowerShell file in each repository.
Problem:
Now whenever I need to update the script, I end up updating it in every repository, which is over 40 at the moment.
I think there should be a better approach. Is there any way I can put my script in a single repo (a repo dedicated to holding the script) and then use it within each build, so that when I need to update it I only need to update it once?
There are a few options.
My general recommendation is to publish the script as a package (NuGet or otherwise) and restore it during your application builds. This allows consumers to stay "pinned" to a known-good, known-working version, and update on a schedule that works for them.
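A rough sketch of that option using nuget.exe follows; the package id, feed URL, and script name are hypothetical, and nuget.exe is assumed to be on the PATH.

# Illustrative sketch: pack the shared script into a NuGet package, then restore a pinned
# version in each application build. Package id, feed, and paths are hypothetical.
@"
<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyTeam.BuildScripts</id>
    <version>1.2.0</version>
    <authors>MyTeam</authors>
    <description>Shared build scripts</description>
  </metadata>
  <files>
    <file src="Invoke-SharedBuildStep.ps1" target="tools" />
  </files>
</package>
"@ | Set-Content MyTeam.BuildScripts.nuspec

nuget pack MyTeam.BuildScripts.nuspec
nuget push MyTeam.BuildScripts.1.2.0.nupkg -Source "https://myfeed.example.com/nuget"

# In each consuming build, restore the pinned version and run the script.
nuget install MyTeam.BuildScripts -Version 1.2.0 -OutputDirectory packages
& ".\packages\MyTeam.BuildScripts.1.2.0\tools\Invoke-SharedBuildStep.ps1"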
Another option is to add a submodule to each repository that requires the script dependency, then initialize the submodule during the build process.
A third option is to turn the shared script into a VSTS build task or extension. This is extensively documented and easily located so I won't belabor the point by including instructions for doing that here.
You can add a Git repository to store your PowerShell file.
Then add a build step to get your file from that repository during the build and use it.
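A rough sketch of such a build step follows; the repository URL and script name are hypothetical, and the build identity is assumed to have read access to that repo.

# Illustrative sketch: fetch the shared script from its dedicated repository and run it.
$scriptsRepo = "https://myaccount.visualstudio.com/DefaultCollection/MyProject/_git/build-scripts"
$target      = Join-Path $env:AGENT_BUILDDIRECTORY "build-scripts"

if (Test-Path $target) { Remove-Item $target -Recurse -Force }
git clone --depth 1 $scriptsRepo $target

# Run the shared script against the current repository's sources.
& (Join-Path $target "Invoke-SharedBuildStep.ps1") -SourcesDirectory $env:BUILD_SOURCESDIRECTORY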
I'm using TFS 2017, and regarding the title, I found a term: incremental build. But I cannot find out where to set it. I tried to add an Incremental parameter in the build parameters (/p:IncrementalBuild=true) but always got an error saying that this is a wrong parameter.
Is it possible to deliver (or build and deliver) only changed files?
Let me describe how I solved this problem (in case anyone has a similar problem):
Since TFS 2017 always delivered all files, changed and unchanged, but I needed only the changed ones, I solved it as follows:
Since files transferred to the artifact keep their timestamps (unchanged files keep their last-edited timestamp, while edited files get a new, current timestamp), I decided to create a FIXED artifact directory (not depending on the build version, but always the same). Then I wrote a PowerShell script (as the first release step) which recursively deletes all files whose timestamp is < (Now - x min), and then deletes all empty directories.
This way the artifact directory contains ONLY CHANGED files (the entire directory structure of the changed files is kept). Now the release will deliver only these files to the destination.
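A rough sketch of that cleanup step follows; the artifact path and the age threshold are placeholders.

# Illustrative sketch: keep only files modified in the last $thresholdMinutes in the fixed
# artifact directory, then remove any directories left empty. Paths are hypothetical.
$artifactDir      = "\\buildserver\drops\MyApp"
$thresholdMinutes = 30
$cutoff           = (Get-Date).AddMinutes(-$thresholdMinutes)

# Delete unchanged files (timestamp older than the cutoff).
Get-ChildItem $artifactDir -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force

# Delete directories that are now empty (deepest first so parents can empty out too).
Get-ChildItem $artifactDir -Recurse -Directory |
    Sort-Object { $_.FullName.Length } -Descending |
    Where-Object { -not (Get-ChildItem $_.FullName -Force) } |
    Remove-Item -Force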
Cheers! :)
If you want to have an incremental build, then when you add a Visual Studio Build / MSBuild task to build the project, just uncheck the Clean option. It will then sync the source and only get the changed files from the second build onward. See Build task Arguments for details.
Clean Option: Set to False if you want to make this an incremental build. This setting might reduce your build time, especially if your codebase is large. This option has no practical effect unless you also set Clean repository to False. Set to True if you want to rebuild all the code in the code projects. This is equivalent to the MSBuild /target:clean argument.
Assuming you want to deliver only the changed files to a specific location, you can add a Copy Files step to copy the changed files to that location.
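If you would rather script that copy instead of using the built-in task, a rough PowerShell equivalent could look like this; the paths and the age filter are placeholders.

# Illustrative sketch: copy only recently changed files to the delivery location,
# preserving their relative folder structure. Paths and threshold are hypothetical.
$source = $env:BUILD_SOURCESDIRECTORY
$dest   = "\\fileserver\delivery\MyApp"
$cutoff = (Get-Date).AddMinutes(-30)

Get-ChildItem $source -Recurse -File |
    Where-Object { $_.LastWriteTime -gt $cutoff } |
    ForEach-Object {
        $relative   = $_.FullName.Substring($source.Length).TrimStart('\')
        $targetPath = Join-Path $dest $relative
        New-Item -ItemType Directory -Path (Split-Path $targetPath) -Force | Out-Null
        Copy-Item $_.FullName $targetPath -Force
    }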
In TFS (2013 Update 4) I am trying to write a PowerShell script to copy modified SQL files that are tied to a build. I can get and copy the appropriate files if I know the changeset number, which will often be enough (I can use the TF_BUILD_SOURCEGETVERSION environment variable when the build is triggered by a merge). However, occasionally there will be a handful of changesets that are associated with the build in TFS.
Using the Build Number, how do I get a list of Changesets?
You need to use your build number to find the previous build number. You will then have both a start changeset (from the previous build) and an end changeset (from the current build).
You can then walk the gap with the API and find all the intervening changesets.
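One rough way to walk that gap from PowerShell, once you have the two changeset numbers, is to ask tf.exe for the history between them. The item path and changeset numbers below are placeholders, and tf.exe is assumed to be on the PATH; extracting the changeset numbers from the builds is left as described above.

# Illustrative sketch: list every changeset between the previous build's changeset and this build's.
$fromChangeset = 1001                      # extracted from the previous build (placeholder)
$toChangeset   = 1042                      # extracted from TF_BUILD_SOURCEGETVERSION (placeholder)
$itemPath      = '$/MyTeamProject/Main'    # hypothetical server path

# /version:C<from>~C<to> walks the gap; /format:brief prints one line per changeset.
tf.exe history $itemPath /recursive /noprompt /version:"C$fromChangeset~C$toChangeset" /format:brief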
So I've done this in my last engagement. In essence, we solved it by doing a get of all SQL-related files on EVERY build and producing a CSV file that contained information about each file: name, version, and, most importantly, an MD5 hash of the file. Then with each deployment we create/update/insert into a special deployment table in our DB all SQL "run" against that DB. Our build script really just produces the CSV file, but our deployment script has the intelligence to check whether anything has changed in the CSV file vs. the target DB and only applies changes (new SQL, or changed SQL with a new MD5). So we essentially use two scripts. I can't share the scripts, but you get the idea. Also, I would look at this article by Alexander,
Automating SQL Server Database Deployments: Scripting Details, where he explains a lot about DB migration.
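A rough sketch of the build-side half (producing the CSV) follows; the folder layout and output file are invented, and Get-FileHash requires PowerShell 4 or later.

# Illustrative sketch: inventory every SQL file with its MD5 hash so the deployment script
# can compare it against what the target database has already run. Paths are hypothetical.
$sqlRoot = Join-Path $env:TF_BUILD_SOURCESDIRECTORY "Database\Scripts"
$outCsv  = Join-Path $env:TF_BUILD_STAGINGDIRECTORY "sql-manifest.csv"

Get-ChildItem $sqlRoot -Recurse -Filter *.sql |
    ForEach-Object {
        [pscustomobject]@{
            Name    = $_.FullName.Substring($sqlRoot.Length).TrimStart('\')
            Version = $env:TF_BUILD_BUILDNUMBER
            MD5     = (Get-FileHash $_.FullName -Algorithm MD5).Hash
        }
    } |
    Export-Csv $outCsv -NoTypeInformation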
We have our build and deployment scripts set up in TFS 2010.
But we are also evaluating Inedo BuildMaster. Has anyone used this before?
Also, in general, for a full .NET house, does it make sense to have another SCM management tool?
Here is the link for Inedo.
I found this while researching Inedo's BuildMaster as well. We're a .NET/TFS shop, and BuildMaster solves all sorts of different problems.
Here's a blog post I found that discusses the differences:
http://blog.inedo.com/2011/06/06/how-does-buildmaster-compare-to-team-foundation-server/
We're using the free version of BuildMaster and may upgrade to enterprise once we use it for other projects.
BuildMaster has a TFS plugin that helps grab builds from TFS Builds. We use gated check-in to ensure the code builds, and BuildMaster to package the build for one-click deployment through the environments. BuildMaster has a fix-forward approach (as in, no rollbacks): you create many builds for a release and each propagates through each environment; when one or more builds exist in, say, QA and have not moved to Staging, they will all be moved to Staging at the same time, but in order, thereby ensuring all artifacts move through every environment.
Prior to BuildMaster, we used an XML-driven PowerShell script that worked well, but BuildMaster agents saved us from remote desktop script execution. Our PowerShell script has one advantage that BuildMaster does not yet have: we used an XML configuration file to hold application configuration file information, including file names, relative paths, and XPath settings to inject values, inject XML fragments, and remove XML nodes from configuration files coming from source control. BuildMaster uses template configuration files stored in BuildMaster, with tag replacement for each environment. This results in high maintenance should anything change in a configuration file, such as additional environment-nonspecific sections being added, which would require creating the template again.
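A rough sketch of that XPath-injection idea follows; the config path, XPath expressions, and values below are invented for illustration.

# Illustrative sketch: apply environment-specific values to a config file via XPath,
# driven by a small settings table. File names, XPaths, and values are hypothetical.
$configPath = ".\MyApp\Web.config"
$settings = @(
    @{ XPath = "/configuration/appSettings/add[@key='ApiUrl']/@value"; Value = "https://qa.example.com/api" }
    @{ XPath = "/configuration/system.web/compilation/@debug";         Value = "false" }
)

[xml]$config = Get-Content $configPath -Raw

foreach ($s in $settings) {
    $node = $config.SelectSingleNode($s.XPath)
    if ($node -ne $null) { $node.Value = $s.Value }   # attribute nodes expose .Value
}

$config.Save((Resolve-Path $configPath).Path)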
BuildMaster does have a custom action that allows you to run executables, so theoretically you can run your own commands to perform functionality that BuildMaster does not have built in, but this is not ideal.