As a DevOps engineer, I want our developers to be able to place SQL scripts in a SQL Server Database Project folder and have the VSTS build agent prepare all scripts committed since the last successful build.
The reason I'm looking only for files committed since the last successful build is that I want these scripts to run only once. If they are built into a post-deployment script, they will be run every time the database is deployed. Most of these scripts are data changes rather than schema changes.
I found this question, Build last commited [sic] SQL Script VSTS, but that solution applies to the latest committed changes in a Git repository rather than offering a generic solution or a TFVC equivalent.
Do I need to look into Visual Studio pre-build events? SQL Server Database Project post-deployment scripts? A VSTS build agent task that searches for the latest files and copies them to another location?
If they are built into a post-deployment script, they will be run every time the database is deployed.
The solution to this problem is to make the scripts idempotent. If data needs to be inserted, check whether it's already there before inserting it.
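For example, a one-time data change can be guarded so that re-running it is a no-op. A minimal sketch (the table, key, and values are hypothetical), executed here via Invoke-Sqlcmd from the SQL Server client tools:

# Hedged sketch: an idempotent data script that is safe to re-run from a
# post-deployment script. Table and values are hypothetical.
$sql = @"
IF NOT EXISTS (SELECT 1 FROM dbo.AppSetting WHERE SettingKey = 'FeatureXEnabled')
    INSERT INTO dbo.AppSetting (SettingKey, SettingValue)
    VALUES ('FeatureXEnabled', 'true');
"@

# Invoke-Sqlcmd ships with the SQL Server client tools / sqlps module.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyAppDb" -Query $sql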
Sounds like what you really want is DbUp; it tracks which scripts have been run and runs them via a console application.
https://dbup.readthedocs.io/en/latest/
https://marketplace.visualstudio.com/items?itemName=johanclasson.UpdateDatabaseWithDbUp
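If you can't adopt DbUp wholesale, the core pattern it implements is small enough to sketch in PowerShell: keep a journal table of applied scripts (DbUp's default is dbo.SchemaVersions) and skip anything already recorded. The server, database, and folder names below are hypothetical:

# Hedged sketch of DbUp's journaling idea: run each script once, recording it
# in a journal table. Server/database/folder names are hypothetical.
$server = "localhost"; $db = "MyAppDb"; $scriptDir = "C:\DeployScripts"

# Ensure the journal table exists (mirrors DbUp's default dbo.SchemaVersions).
Invoke-Sqlcmd -ServerInstance $server -Database $db -Query @"
IF OBJECT_ID('dbo.SchemaVersions') IS NULL
    CREATE TABLE dbo.SchemaVersions (
        Id INT IDENTITY PRIMARY KEY,
        ScriptName NVARCHAR(255) NOT NULL,
        Applied DATETIME NOT NULL);
"@

# Fetch the names of scripts that have already been applied.
$applied = Invoke-Sqlcmd -ServerInstance $server -Database $db `
    -Query "SELECT ScriptName FROM dbo.SchemaVersions" |
    ForEach-Object { $_.ScriptName }

# Run any script not yet journaled, in name order, and record it.
Get-ChildItem -Path $scriptDir -Filter *.sql | Sort-Object Name | ForEach-Object {
    if ($applied -notcontains $_.Name) {
        Invoke-Sqlcmd -ServerInstance $server -Database $db -InputFile $_.FullName
        Invoke-Sqlcmd -ServerInstance $server -Database $db -Query `
            "INSERT INTO dbo.SchemaVersions (ScriptName, Applied) VALUES ('$($_.Name)', GETUTCDATE());"
    }
}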
We are using Redshift as our EDW and we have quite a few tables and views there. At the moment we keep all the DDLs in our organisation's knowledge centre, but this is basically copy and paste and not very smart. Is there a quicker, better option for doing this?
Thanks
Not very sure what you meant by "copy and paste", but you can try putting all the scripts in a GitHub/SVN repository and making sure that all the DDLs actually get fired using the scripts from the repo.
We did this using git and Jenkins (and a little bit of shell scripting to do the code check-ins and check-outs). We blocked all the users from running DDL statements, and the Jenkins job would just pull the latest scripts from the RC (release candidate) branch of the repo and deploy them automatically.
If you need to export the DDL scripts from the system, you can use the script provided by the AWS folks:
https://github.com/awslabs/amazon-redshift-utils/blob/master/src/AdminViews/v_generate_tbl_ddl.sql
If you want to automate the check-in process to some code repository, you can build a wrapper around this code, e.g. in Python.
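As an illustration of that wrapper idea (sketched in PowerShell rather than Python), assuming psql is on the PATH, PGPASSWORD is set, and the AWS admin view is installed in the admin schema; the cluster, user, schema, and repo paths are hypothetical:

# Hedged sketch: export each table's DDL via the AWS v_generate_tbl_ddl admin
# view and commit the snapshot to a git repo.
$repo = "C:\ddl-repo"
$tables = & psql -h my-cluster.example.com -U etl_user -d warehouse -t -A -c `
    "SELECT DISTINCT tablename FROM admin.v_generate_tbl_ddl WHERE schemaname = 'public';"

foreach ($t in $tables) {
    & psql -h my-cluster.example.com -U etl_user -d warehouse -t -A -c `
        "SELECT ddl FROM admin.v_generate_tbl_ddl WHERE schemaname = 'public' AND tablename = '$t' ORDER BY seq;" |
        Set-Content -Path (Join-Path $repo "public.$t.sql")
}

git -C $repo add .
git -C $repo commit -m "Refresh Redshift DDL snapshot"
git -C $repo push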
In TFS (2013 Update 4) I am trying to write a PowerShell script to copy modified SQL files that are tied to a build. I can get and copy the appropriate files if I know the changeset number, which will often be enough (I can use the TF_BUILD_SOURCEGETVERSION environment variable when the build is triggered by a merge). However, occasionally there will be a handful of changesets that are associated with the build in TFS.
Using the Build Number, how do I get a list of Changesets?
You need to use your build number to find the previous build number. You will then have both a start changeset (from the previous build) and an end changeset (from the current build).
You can then walk the gap with the API and find all the intervening changesets.
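As a rough sketch of that walk using tf.exe from inside a mapped workspace (the changeset numbers would come from the two builds' TF_BUILD_SOURCEGETVERSION values, stripped of their 'C' prefix; everything here is illustrative):

# Hedged sketch: list the changesets between the previous successful build
# and the current one. Project path and changeset numbers are hypothetical.
$fromChangeset = 1234   # previous successful build
$toChangeset   = 1250   # current build

& tf.exe history '$/MyTeamProject' /recursive /noprompt /format:brief `
    /version:"C$($fromChangeset + 1)~C$toChangeset"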
So I've done this in my last engagement. In essence we solved it by doing a get of all SQL-related files on EVERY build and producing a CSV file that contained information about each file: name, version, and most importantly an MD5 hash of the file. Then with each deployment we create/update/insert into a special deployment table in our DB a record of all SQL "run" against that DB. So our build script really just produces the CSV file, but our deployment script has the intelligence: it checks whether anything has changed in the CSV file vs. the target DB and only applies the changes (new SQL, or changed SQL with a new MD5). So we essentially use two scripts. I can't share the scripts, but you get the idea.
I would also look at this article by Alexander, Automating SQL Server Database Deployments: Scripting Details, where he explains a lot about db migration.
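The build-side half of that approach might look something like this sketch (the paths are hypothetical, and Get-FileHash needs PowerShell 4.0 or later); the deployment-side script would then compare this manifest against the deployment table in the target DB:

# Hedged sketch: produce a manifest CSV with each SQL file's relative path
# and MD5 hash. Paths are hypothetical.
$sqlRoot  = "C:\Build\Sources\Sql"
$manifest = "C:\Build\Drop\sql-manifest.csv"

Get-ChildItem -Path $sqlRoot -Filter *.sql -Recurse | ForEach-Object {
    [pscustomobject]@{
        Name = $_.FullName.Substring($sqlRoot.Length).TrimStart('\')
        MD5  = (Get-FileHash -Path $_.FullName -Algorithm MD5).Hash
    }
} | Export-Csv -Path $manifest -NoTypeInformation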
When doing a get-latest from TFS, all timestamps are set to the time at which the get operation was executed. When running msdeploy to perform a sync, the timestamps in the source are compared with the timestamps on the target server. Of course, this means that with TFS + msdeploy, every file will be pushed to the target servers after every build, unless:
You use incremental builds
You have only a single build agent in the build controller's pool.
If the build definition is set to do Clean builds, or if you want to utilize multiple build agents, then this no longer works.
This topic comes up all the time, and once every couple of years I cast out new lines in case something has changed. This could be fixed in a couple of different ways:
TFS sets timestamps on workspace files to the last checkin time.
TFS sets timestamps on workspace files to the last modified time from the files themselves when they were last checked in.
msdeploy uses some content-based comparison method (e.g. MD5) to compare files, rather than timestamp comparisons.
Something else?
I never know where to go to search for this stuff, since both of these teams are pretty opaque (the webdeploy team in particular). Is this a problem that has been solved yet?
The TFS and Visual Studio teams are entirely transparent, and you can submit feature requests through http://visualstudio.uservoice.com and bugs through http://connect.microsoft.com.
However, all files within a server workspace are set to the date the file was last modified on the server. Local workspaces physically compare the file contents to determine changes. You can change from a local workspace to a server workspace in the workspace properties.
In the end, we got around this by writing a PowerShell script to wrap the .cmd file produced by the Web Publishing Pipeline, and passing the -useChecksum flag in the command that invokes the .cmd script. Since the boilerplate .cmd created by WPP allows passing additional arguments through to msdeploy, we were able to accomplish this with a line like the following:
& "MyProject.cmd" /u:agent /p:P#ssw0rd /m:$ComputerName /y -useChecksum
In this way, even though TFS is creating workspaces with timestamps set to the get-latest time, msdeploy is now instructed to use checksums instead.
Trying to make my life easier. Currently we have four developers working in Visual Studio 2012, and we are using TFS 2012 for source control. The project we work on is a multi-tenant web application (a single source directory with multiple DBs) that is a mixture of legacy ASP and VB6 COM components, coupled with new C# code. We use TFS for source control and for managing User Stories and Bugs. Because of the way our site works, it cannot be run or debugged locally, only on the server.
Source control is currently set up with a separate branch for each developer, whose working directory is mapped to a shared network path on the dev server with a web site pointed at it in IIS (Dev01-Dev05, etc.). The developers work on projects in their branch, test them using their dev website, then check in changes to their own branch and merge those into the trunk. The trunk's workspace is mapped to the main dev website so that the developers can test their changes against the other customers' dev domains, to test against customizations and variances in functionality based on the specific DBs they are connected to.
A very long explanation, but basically each dev has a branch and a site, which are then merged into the trunk with its own site.
In order to deploy to our staging server:
1. I compile the trunk's website via a bat file on the server.
2. Run a Windows app I built to query TFS for changesets associated with specific WorkItems in a certain status, and copy all the files for those changesets from the publish folder to a deployment folder.
3. Run another bat file on the server to use Red Gate's Deployment Manager to create a package from those new files.
4. Go to the DM site on our network to create and deploy that release (I haven't been able to get the command line tools to work for this, so I have to do it manually).
5. Run any SQL scripts that have been saved off in folders that match ticket numbers against each database (10 or so customer DBs) to support the release (a rough sketch of this step follows below).
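For step 5, the per-ticket, per-database loop might look something like this sketch (the folder layout, server, and database names are all hypothetical):

# Hedged sketch of step 5: run every script saved under a ticket-number
# folder against each customer database.
$scriptRoot  = "C:\Deploy\SqlScripts"        # one folder per ticket number
$tickets     = @("TKT-101", "TKT-107")       # tickets going out in this release
$customerDbs = @("CustomerA", "CustomerB")   # ~10 in practice

foreach ($db in $customerDbs) {
    foreach ($ticket in $tickets) {
        Get-ChildItem -Path (Join-Path $scriptRoot $ticket) -Filter *.sql |
            Sort-Object Name |
            ForEach-Object {
                Invoke-Sqlcmd -ServerInstance "StagingSql" -Database $db -InputFile $_.FullName
            }
    }
}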
I have tried using the TFS automated build features and never really got them to build the website correctly. I also played around with CruiseControl, with little success. Using a mishmash of skunk-works projects to do this is very time consuming and unreliable at best.
My perfect scenario would be:
Gated check-in: attempt a build/publish every time a developer merges into the trunk; reject the check-in and notify the developer if the build fails.
At the end of the day, collect the TFS items in a certain status and deploy the files associated with them to the staging site.
Deploy the SQL scripts for those TFS items across all the customer DBs in staging.
Eventually: run automated regression UI tests, and create new WorkItems or email the devs if they fail.
Update the TFS WorkItems to a new state so QA/customers know their items are ready to test in our staging environment.
Send a report of what items were deployed successfully.
How can I get there, so that I am not spending hours preparing and deploying releases to staging and eventually production? I am pretty open to potential solutions; the thing that would be hard to change is the source control we are using. We can't really switch to Subversion or something else, so we are pretty stuck with TFS.
Thanks
I went back in and started trying to get TFS to build/publish my web solution, and I was able to get a build to complete successfully. Adding the MSBuild argument /p:DeployOnBuild=True and setting the MSBuild platform to x86 seemed to do the trick.
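For reference, the equivalent from a command line would be something like the following sketch: running the 32-bit .NET Framework MSBuild (matching the x86 platform setting) with DeployOnBuild set, relying on whatever publish settings the project already carries. The path and solution name are hypothetical:

# Hedged sketch: 32-bit MSBuild (the x86 platform setting in the build
# definition) with the publish argument. Path/solution name are hypothetical.
& "C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" MyWebApp.sln `
    /p:DeployOnBuild=True /p:Configuration=Release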
Then I found https://github.com/red-gate/deployment-manager-tfs, which gives you a build process template to do the packaging and deployment using the Red Gate tools. After playing with that for a bit, I finally got it to create, package and deploy my build to our staging environment.
Next up will be to modify the template to run some custom scripts that collect only the correct items to deploy, deploy all the SQL files, and then set the WorkItems to the appropriate statuses after completion.
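A sketch of that last step, flipping work items to a new state with the TFS client object model (the collection URL, work item IDs, and state name are hypothetical, and the assemblies are assumed to be in the GAC from a Team Explorer install):

# Hedged sketch: update deployed work items to a new state via the TFS
# client object model. URL, IDs, and state name are hypothetical.
Add-Type -AssemblyName "Microsoft.TeamFoundation.Client"
Add-Type -AssemblyName "Microsoft.TeamFoundation.WorkItemTracking.Client"

$tpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection(
    [uri]"http://tfsserver:8080/tfs/DefaultCollection")
$store = $tpc.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

foreach ($id in @(101, 102, 103)) {
    $wi = $store.GetWorkItem($id)
    $wi.Open()
    $wi.State = "Ready for Test"
    $wi.Save()
}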
Really detailed description of your process. Thanks for sharing!
I believe you can set up TFS to have a gated check-in on a single branch; if you can set that up on the trunk, it would make sure that the merges build successfully. That could trigger MSBuild, if you can get that working, or a custom build job.
If you can get that working, then you'd be able to use that trunk code as the artifact to send to Deployment Manager. That avoids having to assemble the files for deployment from the TFS changesets, as you'd be confident that the trunk could always build.
Are you using Deployment Manager to deploy the database from source control as well as the application?
That could be a way to further automate the process. SQL Source Control and SQL CI allow you to source control the structure of a database, keep a database up to date on each check-in, and run database unit tests. They also produce database packages for Deployment Manager, so you can deploy a release that contains both the application and the database.
If you want to send me the command you're using in step 4 to deploy the release using Deployment Manager I can help out with that. The commands I use are:
DeploymentManager.exe --create-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1
DeploymentManager.exe --deploy-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1 --deployto=CI-Environment-Name
That will create a release version 1.1 using the latest available packages for that project. You can optionally specify the package to be used when creating the release with:
--packageversion=<package name>=<version>
--packageversion="application=1.5"
I have several branches in TFS (dev, test, stage), and when I merge changes into the test branch I want an automated script to find all the updated SQL files, based on the labels, and deploy them to the SQL database.
Currently I manually use Compare in TFS Source Control Explorer to get the files that have changed, and use a custom PowerShell script to deploy them to the database.
I am looking for a script that would copy the changed SQL files to a repository so that my PowerShell script can do the rest.
Any help would be appreciated.
Thanks
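A starting point might be tf diff with label version specs. A hedged sketch, run inside a mapped workspace; the label names, paths, and output parsing are all illustrative and would need adjusting to your setup:

# Hedged sketch: find SQL files changed between two labels and copy them out
# of the workspace for the existing deployment script to pick up.
$serverRoot = '$/MyProject/Sql'       # hypothetical server path
$localRoot  = 'C:\ws\MyProject\Sql'   # its local workspace mapping (hypothetical)
$out        = 'C:\Deploy\ChangedSql'
New-Item -ItemType Directory -Path $out -Force | Out-Null

& tf.exe diff $serverRoot /recursive /noprompt /format:brief /version:"Ltest-prev~Ltest-curr" |
    Where-Object { $_ -match '\.sql$' } |
    ForEach-Object {
        # Brief format prints lines like "edit: $/MyProject/Sql/foo.sql"
        # (a simplified assumption; adjust the parsing to what your tf.exe emits).
        $serverPath = ($_ -split ':\s*', 2)[1].Trim()
        $localPath  = ($serverPath -replace [regex]::Escape($serverRoot), $localRoot) -replace '/', '\'
        Copy-Item -Path $localPath -Destination $out
    }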