Sending a file to multiple servers

I'm working on a web project (built with the .NET Framework) on a remote Windows server, and this project is connected to a database that I manage through SQL Server Management Studio. The same web project also exists on multiple other remote Windows servers, linked to the same database. When I change a page's code in my project, or add/remove a table or stored procedure in my database, is there a way (or existing software) that will allow me to deploy the changes I made to all the others (or to choose multiple servers if I don't want to deploy the changes to all of them)?

If it were me, I would stand up a Git server somewhere (cloud or local VM), make a branch called something like Prod or Stable, and create a script (PowerShell if the servers are Windows, bash on anything else) that runs as a nightly or hourly job to pull from that branch. Only push to that branch after testing thoroughly. If your code requires compilation, you have the choice of compiling once before committing (in which case you're probably going to commit to releases), or on each endpoint after the pull. I would have the script that does the pull also compile and restart the service (only if there was something new in the pull).
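A minimal sketch of that scheduled pull script in PowerShell, assuming the site lives in C:\apps\mysite, tracks a branch named Stable, and runs under an IIS app pool called MySite (all of those names are placeholders):

# Scheduled pull job: only rebuild/restart when the branch actually moved
Set-Location 'C:\apps\mysite'
git fetch origin
$local  = git rev-parse HEAD
$remote = git rev-parse origin/Stable
if ($local -ne $remote) {
    git pull origin Stable
    # If the project needs compilation, build here, e.g. msbuild .\MySite.sln /p:Configuration=Release
    Import-Module WebAdministration       # provides Restart-WebAppPool
    Restart-WebAppPool -Name 'MySite'
}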

You can probably achieve this with two things:
Create a separate publishing profile for each server.
Use git/vsts branches to keep the code separate. (as suggested by #memtha).
Let's say you have a total of 6 servers and two branches, A and B. You'll have to create 6 publishing profiles, and then you can choose which branch to deploy where; e.g. you can deploy branch B to servers 1, 3 and 4.
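For what it's worth, a publishing profile can also be driven from the command line, so the per-server deploys can be scripted; a rough sketch, where the project and profile names are invented:

msbuild .\MyWebProject.csproj /p:DeployOnBuild=true /p:PublishProfile=Server3 /p:Configuration=Release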

For the codebase you could use Git Hooks.
https://gist.github.com/noelboss/3fe13927025b89757f8fb12e9066f2fa
And for the database, maybe you could use migrations or something similar. You will need to provide more info about your database; for example, do you store your database across multiple servers?

If the same web project is connecting to the same database and the database changes, I suspect you would need to update all the web apps to ensure the database changes don't break any of the apps and to keep all the apps updated to prevent any being left behind.
You should look at using Azure Devops to build and deploy your apps and update the database.
If you use Entity Framework, you can run the migrations on startup and have the application update the database itself, whether it is deployed manually or automatically through DevOps.
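For reference, the day-to-day Code First Migrations workflow runs from the Package Manager Console (which is just PowerShell); the migration name below is only an example:

Enable-Migrations                      # one-time: scaffolds the Migrations folder
Add-Migration AddCustomerEmailColumn   # turn the latest model changes into a migration
Update-Database                        # apply any pending migrations to the target database

To apply pending migrations on startup instead, EF also provides the MigrateDatabaseToLatestVersion database initializer that you can register in code.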

To keep the software updated on multiple servers you could use Git with hooks; the post-receive hook is what you need.
The idea is to use one server as your remote repository and configure the post-receive hook there to update the codebase on that server and on the others.
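Assuming the central repository sits on one of the Windows servers, the post-receive hook could hand off to a small PowerShell script along these lines (the repository path, web root, server names and admin shares are all placeholders):

# Update the working copy on the repository server from the bare repo
git --git-dir='C:\repos\mysite.git' --work-tree='C:\inetpub\mysite' checkout -f master
# Then mirror the same files out to the other servers
foreach ($server in 'web02', 'web03') {
    robocopy 'C:\inetpub\mysite' "\\$server\c$\inetpub\mysite" /MIR /XD .git
}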

Related

Automated Building and Release Management VS2012

Trying to make my life easier. Currently we have 4 developers working in Visual Studio 2012 and we are using TFS 2012 for source control. The project we work on is a multi-tenant web application (single source directory with multiple dbs) that is a mixture of legacy ASP and VB6 COM components coupled with new C# code. We use TFS for source control and for managing User Stories and Bugs. Because of the way our site works it cannot be run or debugged locally, only on the server.
Source control is currently set up with a separate branch for each developer, whose working directory is mapped to a shared network path on the dev server with a web site pointed to it in IIS (Dev01-Dev05, etc.). The developers work on projects in their branch, test them using their dev website, then check in changes to their own branch and merge those into the trunk. The trunk's workspace is mapped to the main dev website so that the developers can test their changes against the other customers' dev domains, to test against customizations and variances in functionality based on the specific dbs they are connected to.
Very long explanation, but basically each dev has a branch and a site, and those are then merged into the trunk, which has its own site.
In order to deploy to our staging server:
1. Compile the trunk's website via a bat file on the server.
2. Run a Windows app I built to query TFS for changesets associated with specific WorkItems in a certain status, and copy all the files for those changesets from the publish folder to a deployment folder.
3. Run another bat file on the server to use Red Gate's Deployment Manager to create a package from those new files.
4. Go to the DM site on our network to create and deploy that release (haven't been able to get the command line tools to work for this, so I have to do it manually).
5. Run any SQL scripts that have been saved off in folders that match ticket numbers on each database (10 or so customer dbs) to support the release.
I have tried using TFS automated build stuff and never really got it to build the website correctly. Played around with Cruise Control also with little success. Using a mishmash of skunk works projects to do this is very time consuming and unreliable at best.
My perfect scenario would be:
Gated check-in: attempt a build/publish every time a developer merges into the trunk, and reject and notify the developer if the build fails.
At the end of the day, collect the TFS items in a certain status and deploy the files associated with them to the staging site
Deploy SQL scripts for those TFS items across all the customer dbs in staging
Eventually*, run automated regression UI tests and create new WorkItems or email the devs if they fail
Update TFS WorkItems to new state so QA/Customers know their items are ready to test in our staging environment
Send report of what items were deployed successfully
How can I get there, so that I am not spending hours preparing and deploying releases to staging and eventually production? I'm pretty open to potential solutions; the thing that would be hard to change is the source control we are using. We can't really switch to Subversion or something else, so we are pretty stuck with TFS.
Thanks
Went back in and started trying to get TFS to build/publish my web solution, and I was able to get a build to complete successfully. Adding the msbuild argument /p:DeployOnBuild=True and setting the msbuild platform to x86 seemed to do the trick on that.
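Roughly, the build arguments amounted to something like this (the solution name is a placeholder):

msbuild MySolution.sln /p:DeployOnBuild=True /p:Platform=x86 /p:Configuration=Release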
Then I found https://github.com/red-gate/deployment-manager-tfs which gives you a build process template to do the package and deployment using the redgate tools. After playing with that for a bit I finally got it to create, package and deploy my build to our staging environment.
Next up will be to modify the template to run some custom scripts to collect only the correct items to deploy, deploy all the SQL files, and then set the work items to the appropriate statuses after completion.
Really detailed description of your process. Thanks for sharing!
I believe you can set up TFS to have gated check-in on a single branch; if you can set that up on the trunk, it would make sure that the merges build successfully. That could trigger msbuild, if you can get that working, or a custom build job.
If you can get that working then you'd be able to use that trunk code as the artifact to send to Deployment Manager. That avoids having to assemble the files for deployment through the TFS change sets, as you'd be confident that the trunk could always build.
Are you using Deployment Manager to deploy the database from source control as well as the application?
That could be a way to further automate the process. SQL Source Control and SQL CI allow you to source control the structure of a database, keep a database up to date on each check-in, and run database unit tests. They also produce database packages for Deployment Manager, so you can deploy a release that contains both the application and the database.
If you want to send me the command you're using in step 4 to deploy the release using Deployment Manager I can help out with that. The commands I use are:
DeploymentManager.exe --create-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1
DeploymentManager.exe --deploy-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1 --deployto=CI-Environment-Name
That will create a release version 1.1 using the latest available packages for that project. You can optionally specify the package to be used when creating the release with
--packageversion=<package name>=<version>
--packageversion="application=1.5"

How to manage database context changes in production / CI

I've spent the past few months developing a webApi solution that I'm ready to push up to Azure and hook into an Azure SQL Database. It was built with EF Code First.
I'm wondering what standard approaches there are to making changes to the database while in production. I've been using database initializers up to this point but they all blow away data and re-seed.
I have a feeling this question is too broad for a concise answer, so I'd like to ask: what terminology / processes / resources should a developer look into when designing a continuous integration workflow for a solution built with EF Code First and ASP.NET WebAPI, hosted as an Azure Service and hooked up to Azure SQL?
On the subject of database migration, there was an interesting article on ASP.NET about this subject: Strategies for Database Development and Deployment.
Also since you are using EF Code First you will be able to use Code First Migrations here for database changes. This will allow you to better manage the changes you make to the database.
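In particular, migrations don't have to run blind against production; from the Package Manager Console you can have Code First Migrations emit a SQL script that a DBA can review and run instead, e.g.:

Update-Database -Script -SourceMigration $InitialDatabase   # script every migration from scratch
Update-Database -Script                                     # or just script the pending ones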
I'm not sure how far you want to go with continuous integration but since you are using Azure it might be worth it to have a look at Continuous delivery to Windows Azure by using Team Foundation Service. Although it relies on TFS in the cloud it's of course also possible to configure it with for example Jenkins. However this does require a bit more work.
I use this technique:
1- Create a clone database for your development environment if it doesn't exist.
2- Make the necessary changes in your dev environment and dev database.
3- Deploy to your staging environment.
4- If you added some static data that should also exist in your prod database, use a tool like SQLDataExaminer to find the data differences and execute the inserts, updates and deletes for the affected rows. Use Schema Compare in VS2012 to find schema differences between your dev and prod environments, selecting dev as the source and prod as the target, and execute the resulting script against prod.
5- Swap the environments.
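If you would rather script that comparison than drive it through the VS GUI, a tool like SqlPackage.exe (from SQL Server Data Tools) can generate the upgrade script; a hypothetical sketch, with every server, database and path name made up:

SqlPackage.exe /Action:Script /SourceServerName:DEVSQL /SourceDatabaseName:MyAppDb /TargetServerName:PRODSQL /TargetDatabaseName:MyAppDb /OutputPath:C:\deploy\upgrade-prod.sql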

is there a deploy tool (or set of tools) that supports rollback of a deployment?

I'm learning FluentMigrator. The thing that I like about FM is that it supports the idea of Forward and Back for migrations (aka Up/Down). I'm finding that it's not ideal about this; there are some holes. Still, it's good.
This leads me to wonder if there are any deployment tools (nant, msbuild or other) that support this idea of rolling forward and back. The scenario that I'm using it in is the deployment of a web app with a related database.
Ideally I'd like to set up my deployment so that, should any part of it fail, it will revert to the previous known working configuration. With FM, this is pretty easy to do (but there are rough spots), so that covers the db. How about the files that make up the web app? Do any deploy tools have support for this?
Deploying to a Windows Server. Assume that I can't make any changes to the server.
I don't know of any Microsoft-centric, automated provisioning/deployment tools like Capistrano. Here are some tools I've heard of, but never used:
MSDeploy, for deploying web applications.
Microsoft Deployment Services, for managing operating system configuration
Microsoft's System Center Configuration Manager
BladeLogic
HP's Operations Center
Up until about three months ago, we did our deployment/provisioning using custom MSBuild scripts. After a server is provisioned, deploys happen automatically using Robocopy to copy files to a share on the application server, updating changed application binaries and markup files. We've never had a need to rollback any of our deployments, but since our scripts are custom, we could write the logic if we needed to.
MSBuild is a terrible deployment/provisioning language. For the past three months, we've been writing all new scripts in, and porting existing ones to, PowerShell. It is wonderful. With version 2, there is support for running commands on remote servers, like SSH. We haven't used that functionality yet, but I'm looking forward to pushing setup scripts to remote servers to provision and deploy at the same time.
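For example, with remoting enabled on the target, a deploy step can be pushed to the application server in a few lines (the server name, drop share and paths below are made up):

Invoke-Command -ComputerName web01 -ScriptBlock {
    # Note: reaching a UNC share from inside the remote session may need CredSSP or explicit credentials
    robocopy '\\buildserver\drops\latest' 'D:\apps\mysite' /MIR
    Import-Module WebAdministration
    Restart-WebAppPool -Name 'MySite'
}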
We have been using Git to do our deploys for the last 6 months.
Here is the whole process:
CI server builds the project
CI server checks it in to a local git repository
CI server pushes the changes to the centralised git repository
User creates an empty repository on the live server
User adds the central git repository to the remotes
User pulls the latest version over https (no need to open any ports)
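A rough sketch of the live-server side of those last three steps (the remote URL and paths are placeholders):

git init D:\apps\mysite                  # empty repository on the live server
Set-Location D:\apps\mysite
git remote add origin https://ci.example.com/git/mysite.git
git pull origin master                   # over https, so no extra ports to open
# rolling back is just moving the working copy to an earlier revision, e.g.
git reset --hard HEAD~1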
It is a lot to set up in the beginning, but once set up it works great. Deploys take seconds as only changed files get copied.
Another great thing about this method is that git keeps history of changes so rolling back is pretty simple. You can also roll back a few revisions and it's done straight on the live server. If something goes wrong reverting is super fast.
Also you can save some time if you use a hosted git service (github) for your central repository.
This is a very brief description but I can give you more info if you want.
Of course! My favorite is Capistrano. This was originally built for Ruby but I've found that it works just as well for other languages.
https://github.com/capistrano/capistrano

What is a typical workflow to put my local MVC3 project on to a "live server"?

I develop on my local machine with VS2010 and SQL Server. Naturally, my web.config points to my local SQL Server and I can debug/development and all is well. Unfortunately, I am not entirely sure on how to go about deploying my code to a live server.
Currently, my live server consists of a virtual machine (my site is accessible from the internet). When I'm ready to put my changes on the live server I publish my app (right click on solution explorer -> publish). Then I go to the directory it publishes to and dump all the files into a network share that goes to my site on the live server. On the initial copy over, I have to manually edit the web.config so that the connection string points to the SQL Server on the live server instead of my local machine. So this is my first stumbling block. How can I easily manage development settings and "live" settings in the web.config?
Now, I also use version control (Kiln). Can I possibly tag a changeset and have it automatically deployed to my live server somehow? Let's say someone submits a bug and I fix it. I push my changeset and now Kiln has the latest version of my code with the bug fix. What's the best way to get these changes on to a live server?
I'm unable to find any documentation that covers the entire workflow, but I feel like there has got to be a better way. Surely something like this can be accomplished without having to manually edit the web.config every time I publish and pray to the computer gods that I didn't miss something in the connection string.
It's just me so I have complete control over all of my environments, including the server and what's accessible via the internet, and anything is possible if only I knew what to do.
How can I easily manage development settings and "live" settings in the web.config?
Re: With VS 2010 web.config transformations, it is quite easy. Please take a look at this blog:
http://blogs.msdn.com/b/webdevtools/archive/2009/05/04/web-deployment-web-config-transformation.aspx
For VS 2008 or older, we used to have multiple config files based on the environment; we would create Debug/Release/DevTest/UAT/PROD release configurations and then, in the post-build event, replace web.config with the config for the selected release configuration. For example, if you build the project using the "Prod" release configuration, the PROD web.config is copied to the publishing folder.
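The post-build step it boils down to is essentially a one-line copy; a hypothetical PowerShell version, with made-up paths:

param([string]$Configuration = 'Prod')
# Overwrite the published web.config with the environment-specific one
Copy-Item ".\configs\web.$Configuration.config" '.\publish\web.config' -Force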
Now, I also use version control (Kiln).
Can I possibly tag a changeset and have it automatically deployed to my live server somehow? Let's say someone submits a bug and I fix it. I push my changeset and now Kiln has the latest version of my code with the bug fix. What's the best way to get these changes on to a live server?
Re: Source control and publishing to the live server are two different things. The first question you are asking here relates to how you manage multiple releases and keep control over bug fixes for each release. The way I would do it is to have a PROD branch in my source control, which will be the first release, and for every major release I will sub-branch it to have more control over e-fixes.
For the other question about how to get it to the live server, it depends on your environment. We do it differently based on how the customer environment is set up. If they have given us FTP access, we use that; otherwise we package the application into an MSI and then deploy it to UAT. Until UAT signoff is done, we keep updating the MSI. Once signoff is received, the MSI goes to PROD.
Hope this helps.

straightforward single developer deployment with mercurial and netbeans?

I am coding a website using the Codeigniter PHP framework.
I am using mercurial for version control.
I have 3 systems I work with. I do my coding on a Windows 7 machine using Netbeans 6.9.1. I am occasionally making commits, and pushing to a repository at Bitbucket.org, purely for the purposes of backup and version control.
I have a "beta" website (on a shared Linux box with its own dedicated IP address) that I upload to using FTP, where I can test that everything is working as intended on an actual site running Linux.
Once I'm happy with that, I upload to my "live" site, which is on its own dedicated server. Again I'm just using FTP to upload the files from my development server.
I realize that this is all kinds of wrong. For one thing I have to go in and change some things on the beta and live machines so that they're referring to the correct domain name, instead of localhost. For another, I'm not making use of mercurial at all to help with this. I assume instead of uploading from FTP, I could be using mercurial to "grab" a particular revision that I've marked as ready to deploy. I also think I could possibly be doing something in Netbeans differently to make the process easier.
What I want is some very smooth way to control all this, and hopefully one that knows how to deal with the issue of a slightly different configuration setup for the beta and live sites compared to localhost.
Is there a standard way to do what I'm looking for? I've seen references to some third party apps for "continuous integration" but I'm not sure I need anything like that.
I'm a little lost as to what would be the SIMPLEST thing for me to do that would make my life easier....any help greatly appreciated :) Thanks!
It depends on how different the setup for each site is, and if there are secrets involved, which should not be visible on a public place (I assume you use a public bitbucket repository).
If the changes are not sensitive, then you can add two additional branches for your test and production servers, where only the configuration changes are applied. Every time you change something in default and deploy it to test, you simply merge default on top of test, and mercurial fills in the different configuration settings in the process. Then the server deployment would be a call to hg archive within the correct branch.
A typical change history would look like this:
O----o-o-o-o-o-o-o-o---o   default
 \          \            \
  T1---------T2-----------T3   test
   \                       \
    P1----------------------P2   production
where in T1 and P1 the parameters for test and production are filled in. You also can use this branch setup to mature the development of your site, where you hack in default, and only propagate stable changes into test and production.
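Concretely, promoting a change and cutting a deployable snapshot would look something like this (branch names match the sketch above; the archive path is made up):

hg update test
hg merge default                             # bring the latest default work into test
hg commit -m "Merge default into test"
hg archive -r test C:\deploy\site-test.zip   # clean snapshot of the test branch to upload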
If the changes are sensitive, you can create an unversioned deploy script (or better, a versioned deploy script plus an unversioned configuration file) which patches the output of hg archive.
You should use deployment scripts anyway; they handle the packaging of the product and deploy it on the target in an automated and standardized way. Within such a script you can also embed information about the source revision into the final archive.
Note that this model works fine for an environment where no changes are made on the server. If you do make changes to the product on the server, you need to copy the files from the server back into your development environment (at the correct revision) to check what was changed there. If you want to make changes on the server as well, you might want to install mercurial there too.