Script to compare TFS labels in a folder - PowerShell

I have several branches in TFS (dev, test, stage) and when I merge changes into the test branch I want an automated script to find all the updated SQL files based on the labels and deploy them to the SQL database.
Currently I manually use Compare in TFS Source Control Explorer to get the files that have changed, and use a custom PowerShell script to deploy them to the database.
I am looking for a script that would copy the changed SQL files to a repository so that my PowerShell script can do the rest.
Any help would be appreciated.
Thanks
Nit
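A rough sketch of the compare step, assuming tf.exe (installed with Team Explorer) is on the path; the server path, label names, and output file are all hypothetical, and the exact /format:brief output varies between versions, so the filter is illustrative:
# List the files that differ between the two labels; the deployment
# script then only needs the changed .sql paths.
$changed = tf diff '$/MyProject/Sql' /version:LDevLabel~LTestLabel /recursive /format:brief
$changed | Where-Object { $_ -match '\.sql' } | Set-Content C:\Deploy\changed-sql.txt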

Related

Create SQL script from commits since last build

As a DevOps engineer, I want our developers to be able to place SQL scripts in a SQL Server Database Project folder and have the VSTS build agent prepare all scripts committed since the last successful build.
The reason I'm looking for only files since the last successful build is because I only want these scripts to run once. If they are built into a post-deployment script, they will be run every time the database is deployed. Most of these scripts are data changes and not schema changes.
I found Build last commited [sic] SQL Script VSTS, but that solution applies to the latest committed changes in a Git repository rather than offering a generic or TFVC equivalent.
Do I need to look into Visual Studio pre-build events? SQL Server Database Project post-deployment scripts? VSTS build agent task to search for and copy latest files to another location?
If they are built into a post-deployment script, they will be run every time the database is deployed.
The solution to this problem is to make the scripts idempotent. If data needs to be inserted, check if it's there before inserting it.
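For instance, a data script can guard its own insert so re-running it is a no-op. A minimal sketch, assuming a hypothetical settings table and using Invoke-Sqlcmd from the SQL Server PowerShell tools:
# Idempotent data change: check for the row before inserting it,
# so the script can safely run on every deployment.
$query = @"
IF NOT EXISTS (SELECT 1 FROM dbo.AppSetting WHERE SettingKey = 'FeatureXEnabled')
    INSERT INTO dbo.AppSetting (SettingKey, SettingValue)
    VALUES ('FeatureXEnabled', 'true');
"@
Invoke-Sqlcmd -ServerInstance 'MyDbServer' -Database 'MyAppDb' -Query $query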
Sounds like what you really want is DbUp; it tracks which scripts have been run and runs the new ones via a console application.
https://dbup.readthedocs.io/en/latest/
https://marketplace.visualstudio.com/items?itemName=johanclasson.UpdateDatabaseWithDbUp

Share the same PowerShell script file between multiple repos/builds

We are using VSTS for CI and CD in my team. We have over 40 repositories, which are separate projects, but all of them have to run the same PowerShell script in one of their build steps.
The PowerShell file is too big to be kept as an inline script, so we need to save it in a file. Currently I keep a copy of the PowerShell file in each repository.
Problem:
Whenever I need to update the script, I end up updating it in every repository, which is over 40 at the moment.
I think there should be a better approach. Is there any way I can put the script in one single repo (a repo dedicated to holding the script) and then use it within each build, so that when I need to update it, I only need to update it once?
There are a few options.
My general recommendation is to publish the script as a package (NuGet or otherwise) and restore it during your application builds. This allows consumers to stay "pinned" to a known-good, known-working version, and update on a schedule that works for them.
Another option is to add a submodule to each repository that requires the script dependency, then initialize the submodule during the build process.
A third option is to turn the shared script into a VSTS build task or extension. This is extensively documented and easily located so I won't belabor the point by including instructions for doing that here.
You can add a Git repository to store your PowerShell file.
Then add a build step that gets the file from that repository during the build and uses it.
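A minimal sketch of that fetch step as an inline PowerShell build task; the repo URL and script name are hypothetical, and cloning a tag instead of master would give you the "pinned version" behaviour of the package option:
# Pull the shared script from its dedicated repo at build time,
# instead of keeping 40+ copies in sync by hand.
$toolsDir = Join-Path $env:BUILD_STAGINGDIRECTORY 'build-tools'
git clone --depth 1 --branch master https://myaccount.visualstudio.com/DefaultCollection/_git/BuildTools $toolsDir

# Invoke the shared script exactly as if it were a local file.
& (Join-Path $toolsDir 'Invoke-CommonBuildStep.ps1') -SourcesDirectory $env:BUILD_SOURCESDIRECTORY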

Saving Redshift Metadata

We are using Redshift as our EDW and we have quite a few tables and views there. At the moment we keep all the DDLs in our organisation's knowledge centre, but this is basically copy and paste and not very smart. Is there another option that is quicker and better?
Thanks
Not very sure what you meant by "copy and paste", but you can put all the scripts in a GitHub/SVN repository and make sure that all the DDLs actually get fired from the scripts in the repo.
We did this using Git and Jenkins (and a little bit of shell scripting to do the code check-ins and check-outs). We blocked all the users from running DDL statements, and the Jenkins job would just pull the latest scripts from the repo and deploy them automatically from the RC (release candidate) branch.
If you need to export the DDL scripts out of the system, you can use the script provided by the AWS folks:
https://github.com/awslabs/amazon-redshift-utils/blob/master/src/AdminViews/v_generate_tbl_ddl.sql
If you want to automate the check-in process to a code repository, you can build a wrapper (e.g. in Python) around this script.
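A sketch of that wrapper idea, done here in PowerShell to match the rest of this page; the cluster host, credentials, and repo paths are all hypothetical, and the v_generate_tbl_ddl view must already be created in the admin schema:
# Dump all table DDL through the admin view, then commit the snapshot if it changed.
$env:PGPASSWORD = 'secret'   # hypothetical; prefer a .pgpass file in practice
psql -h my-cluster.abc123.us-east-1.redshift.amazonaws.com -p 5439 -U admin -d edw `
     -t -c 'SELECT ddl FROM admin.v_generate_tbl_ddl ORDER BY schemaname, tablename, seq;' `
     -o C:\repos\edw-ddl\tables.sql

Set-Location C:\repos\edw-ddl
if (git status --porcelain) {   # only commit when the DDL actually changed
    git add tables.sql
    git commit -m "DDL snapshot $(Get-Date -Format yyyy-MM-dd)"
    git push
}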

Get Changesets Associated With Build

In TFS (2013 Update 4) I am trying to write a PowerShell script to copy modified SQL files that are tied to a build. I can get and copy the appropriate files if I know the changeset number, which will often be enough (I can use the TF_BUILD_SOURCEGETVERSION environment variable when the build is triggered by a merge). However, occasionally there will be a handful of changesets that are associated with the build in TFS.
Using the Build Number, how do I get a list of Changesets?
You need to use your build number to find the previous build number. You will then have a range from a start changeset (from the previous build) to the current changeset (from the current build).
You can then walk the gap with the API and find all the intervening changesets, as sketched below.
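A sketch of that walk using the TFS 2013 client object model from PowerShell; the collection URL, project, and build definition names are hypothetical, and the assemblies ship with Team Explorer/Visual Studio:
# Load the TFS 2013 (version 12) client assemblies from the GAC.
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Client')
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Build.Client')
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.VersionControl.Client')

$tpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection('http://tfs:8080/tfs/DefaultCollection')
$buildServer = $tpc.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
$vcs = $tpc.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])

# Grab the two most recent builds of the definition to get the changeset range.
$spec = $buildServer.CreateBuildDetailSpec('MyProject', 'MyBuildDefinition')
$spec.MaxBuildsPerDefinition = 2
$spec.QueryOrder = [Microsoft.TeamFoundation.Build.Client.BuildQueryOrder]::FinishTimeDescending
$builds = $buildServer.QueryBuilds($spec).Builds

# SourceGetVersion looks like 'C12345'; strip the prefix to get changeset ids.
$to   = [int]$builds[0].SourceGetVersion.TrimStart('C')
$from = [int]$builds[1].SourceGetVersion.TrimStart('C') + 1

# Walk the gap: every changeset between the previous build and this one.
$vcs.QueryHistory('$/MyProject',
    [Microsoft.TeamFoundation.VersionControl.Client.VersionSpec]::Latest, 0,
    [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full, $null,
    (New-Object Microsoft.TeamFoundation.VersionControl.Client.ChangesetVersionSpec($from)),
    (New-Object Microsoft.TeamFoundation.VersionControl.Client.ChangesetVersionSpec($to)),
    [int]::MaxValue, $true, $false) |
    ForEach-Object { '{0}: {1}' -f $_.ChangesetId, $_.Comment }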
I did this in my last engagement. In essence, we solved it by doing a get of all SQL-related files on EVERY build and producing a CSV file that contained information about each file: name, version, and most importantly an MD5 hash of the file. Then with each deployment we create/update/insert into a special deployment table in our DB a record of all SQL "run" against that DB. The build script really just produces the CSV file; the deployment script has the intelligence to check whether anything has changed in the CSV file vs. the target DB and applies only the changes (new SQL, or changed SQL with a new MD5). So we essentially use two scripts. I can't share them, but you get the idea. I would also look at this article by Alexander,
Automating SQL Server Database Deployments: Scripting Details, where he explains a lot about DB migration.
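The manifest half of that approach is only a few lines of PowerShell. A sketch, assuming hypothetical paths and PowerShell 4+ for Get-FileHash:
# Hash every SQL file into a CSV manifest; the deployment script diffs this
# against the deployment table in the target DB and applies only the changes.
$root = 'C:\Build\Sql'
Get-ChildItem $root -Filter *.sql -Recurse |
    ForEach-Object {
        [pscustomobject]@{
            Name = $_.FullName.Substring($root.Length + 1)  # path relative to the root
            Hash = (Get-FileHash $_.FullName -Algorithm MD5).Hash
        }
    } |
    Export-Csv C:\Build\sql-manifest.csv -NoTypeInformation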

Automated Building and Release Management VS2012

Trying to make my life easier: currently we have 4 developers working in Visual Studio 2012, and we are using TFS 2012 for source control. The project we work on is a multi-tenant web application (single source directory with multiple DBs) that is a mixture of legacy ASP and VB6 COM components coupled with new C# code. We use TFS for source control and for managing User Stories and Bugs. Because of the way our site works, it cannot be run or debugged locally, only on the server.
Source control is currently set up with a separate branch for each developer, whose working directory is mapped to a shared network path on the dev server with a web site pointed at it in IIS (Dev01-Dev05, etc.). The developers work on projects in their branch, test them using their dev website, then check in changes to their own branch and merge those into the trunk. The trunk's workspace is mapped to the main dev website so that the developers can test their changes against the other customers' dev domains, to check customizations and variances in functionality based on the specific DBs they are connected to.
Very long explanation, but basically each dev has a branch and a site, which are then merged into the trunk with its own site.
In order to deploy to our staging server:
1. Compile the trunk's website via a bat file on the server.
2. Run a Windows app I built to query TFS for changesets associated with specific WorkItems in a certain status, and copy all the files for those changesets from the publish folder to a deployment folder.
3. Run another bat file on the server to use Red Gate's Deployment Manager to create a package from those new files.
4. Go to the DM site on our network to create and deploy that release (I haven't been able to get the command line tools to work for this, so I have to do it manually).
5. Run any SQL scripts that have been saved off in folders that match ticket numbers on each database (10 or so customer DBs) to support the release.
I have tried using TFS's automated build features and never really got them to build the website correctly. I played around with CruiseControl as well, with little success. Using a mishmash of skunkworks projects to do this is very time consuming and unreliable at best.
My perfect scenario would be:
1. Gated check-in: attempt a build/publish every time a developer merges into the trunk, and reject and notify the developer if the build fails.
2. At the end of the day, collect the TFS items of a certain status and deploy the files associated with them to the staging site.
3. Deploy SQL scripts for those TFS items across all the customer DBs in staging.
4. (Eventually) run automated regression UI tests, and create new WorkItems or email the devs if they fail.
5. Update TFS WorkItems to a new state so QA/customers know their items are ready to test in our staging environment (the state change is scriptable; see the sketch after this list).
6. Send a report of which items were deployed successfully.
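The work-item state change in step 5 is a few lines against the TFS client object model; a minimal sketch (the collection URL, work item id, and state name are hypothetical, and the state must be valid for the work item type's workflow):
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Client')
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.WorkItemTracking.Client')

# Connect to the collection and the work item store.
$tpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection('http://tfs:8080/tfs/DefaultCollection')
$store = $tpc.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

# Flip a deployed item to the state QA watches for.
$wi = $store.GetWorkItem(1234)
$wi.State = 'Ready for Test'
$wi.Save()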
How can I get there, so that I am not spending hours preparing and deploying releases to staging and eventually production? I'm pretty open to potential solutions; one thing that would be hard to change is the source control we are using. We can't really switch to Subversion or something else, so we are pretty stuck with TFS.
Thanks
I went back in and started trying to get TFS to build/publish my web solution, and I was able to get a build to complete successfully. Adding the MSBuild argument /p:DeployOnBuild=True and setting the MSBuild platform to x86 seemed to do the trick.
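For reference, a rough command-line equivalent of those settings (the solution name is hypothetical; in a XAML build definition they go in the MSBuild Arguments and MSBuild Platform fields rather than on a command line):
# Publish the web application as part of the build. The x86 choice corresponds
# to the build definition's "MSBuild Platform" setting (32-bit MSBuild process).
msbuild MyWebSolution.sln /p:DeployOnBuild=True /p:Configuration=Release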
Then I found https://github.com/red-gate/deployment-manager-tfs, which gives you a build process template to do the packaging and deployment using the Red Gate tools. After playing with that for a bit, I finally got it to create, package, and deploy my build to our staging environment.
Next up will be to modify the template to run some custom scripts to collect only the correct items to deploy, deploy all the SQL files, and then set the work items to the appropriate statuses after completion.
Really detailed description of your process. Thanks for sharing!
I believe you can set up TFS to have gated check-in on a single branch; if you can set that up on the trunk, it would make sure that the merges built successfully. That could trigger MSBuild, if you can get that working, or a custom build job.
If you can get that working, then you'd be able to use the trunk code as the artifact to send to Deployment Manager. That avoids having to assemble the files for deployment from the TFS changesets, as you'd be confident that the trunk always builds.
Are you using Deployment Manager to deploy the database from source control as well as the application?
That could be a way to further automate the process. SQL Source Control and SQL CI allow you to source control the structure of a database, keep a database up to date on each check-in, and run database unit tests. They also produce database packages for Deployment Manager, so you can deploy a release that contains both the application and the database.
If you want to send me the command you're using in step 4 to deploy the release using Deployment Manager I can help out with that. The commands I use are:
DeploymentManager.exe --create-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1
DeploymentManager.exe --deploy-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1 --deployto=CI-Environment-Name
That will create a release version 1.1 using the latest available packages for that project. You can optionally specify the package to be used when creating the release with
--packageversion=<package name>=<version>
--packageversion="application=1.5