Does anyone have any experience using version control with a production website? Would it be a terrible idea to run a website from a repository? I just found a related article but I would like to hear your thoughts/comments.
IMHO it makes no sense - it's the cheap person's approach.
In larger scenarios you have develop / test / production, so you version control on the develop side, then publish forward to test and production. There is no need to actually version control once things hit production. You do keep one or two backup versions for a fast rollback, but otherwise - no need.
Every production manager will tell you the same thing: a (D)VCS has no place in a production environment.
You can maybe have one "release deployment" server in the production pit, where you do have a VCS allowing you to view the correct delivery, and from that server you copy/rsync it to the right production server.
But on the servers themselves, you only have:
the application itself
a monitoring process to follow and report on the application
some diagnostic tools
The reason is that the more elements you have in your release environment, the more opportunities there are for one of those elements to go wrong.
Adding a VCS in the mix is not worth it.
The way I've always done it is to have the live & test versions be checkouts of the repository. Then my workflow is like this:
make changes on my dev checkout
commit changes.
update test.
make sure everything works
update production.
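For example, if those checkouts were Subversion working copies, a minimal sketch of that cycle might look like this (host and paths are illustrative):

    # on the dev checkout: commit the tested change
    svn commit -m "describe the change"
    # on the server: pull the change into the test checkout and verify it
    ssh user@www.example.com 'svn update /var/www/test'
    # once everything works, update the production checkout
    ssh user@www.example.com 'svn update /var/www/live'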
I am coding a website using the Codeigniter PHP framework.
I am using mercurial for version control.
I have 3 systems I work with. I do my coding on a Windows 7 machine using Netbeans 6.9.1. I am occasionally making commits, and pushing to a repository at Bitbucket.org, purely for the purposes of backup and version control.
I have a "beta" website (on a shared Linux box with it's own dedicated IP address) that I upload to using FTP, where I can test that everything is working as intended on an actual site running Linux.
Once I'm happy with that, I upload to my "live" site, which is on its own dedicated server. Again I'm just using FTP to upload the files from my development machine.
I realize that this is all kinds of wrong. For one thing I have to go in and change some things on the beta and live machines so that they're referring to the correct domain name, instead of localhost. For another, I'm not making use of mercurial at all to help with this. I assume instead of uploading from FTP, I could be using mercurial to "grab" a particular revision that I've marked as ready to deploy. I also think I could possibly be doing something in Netbeans differently to make the process easier.
What I want is some very smooth way to control all this, and hopefully one that knows how to deal with the slightly different configuration setup for the beta and live sites versus localhost.
Is there a standard way to do what I'm looking for? I've seen references to some third party apps for "continuous integration" but I'm not sure I need anything like that.
I'm a little lost as to what would be the SIMPLEST thing for me to do that would make my life easier... any help greatly appreciated :) Thanks!
It depends on how different the setup for each site is, and whether there are secrets involved that should not be visible in a public place (I assume you use a public Bitbucket repository).
If the changes are not sensitive, then you can add two additional branches for your test and production servers, where only the configuration changes are applied. Every time you change something in default and want to deploy it to test, you simply merge default on top of test, and Mercurial fills in the different configuration settings in the process. The server deployment would then be a call to hg archive within the correct branch.
A typical change history would look like this:
O----o-o-o-o-o-o-o-o---o        default
 \        \             \
  T1-------T2------------T3     test
   \                      \
    P1---------------------P2   production
where in T1 and P1 the parameters for test and production are filled in. You can also use this branch setup to mature the development of your site: you hack in default, and only propagate stable changes into test and production.
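A minimal sketch of one such merge-and-deploy cycle, assuming the branch names above (the archive path is illustrative):

    hg update test                             # switch to the test branch
    hg merge default                           # a T2/T3-style merge; test config stays intact
    hg commit -m "merge default into test"
    hg archive -r test /tmp/site-test.tar.gz   # clean export for the test server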
If the changes are sensitive, you can create a non-versioned deploy script (or better, a versioned deployment script and a non-versioned configuration file) which patches the output of hg archive.
You should use deployment scripts anyway; they handle packaging the product and deploying it on the target in an automated and standardized way. Within this script you can also embed information about the source revision into the final archive.
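A rough sketch of such a script (host, branch, and paths are illustrative; note that hg archive already records the revision itself in a .hg_archival.txt file inside the archive):

    #!/bin/sh
    REV=$(hg identify -i -r production)           # revision being shipped
    hg archive -r production "site-$REV.tar.gz"   # package that exact revision
    scp "site-$REV.tar.gz" user@www.example.com:/tmp/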
Note that this model works fine for an environment where no changes are made on the server. If you do change the product on the server, you need to copy the files from the server back into your development environment (at the correct revision) to check what was changed there. If you want to make changes on the server as well, you might want to install Mercurial there too.
I have a CakePhp Website that is currently live. I would like to keep working on the site, without impacting the deployed site.
What is the best way to keep a development version separate from the deployed version, and then merge the two when appropriate?
Currently, I am using Git for version control.
Thanks!
First thing: get to know a version control system. Subversion, Git, Bazaar, and Mercurial are some examples. They are a safety net that can save your bacon because they save EVERY change to EVERY file in your fileset.
Then, typically, I have a local development server and also a subdomain (staging.example.com) on the production server. I do my heavy development on the local development server and use SVN to track all my site changes. Then, using a shell account on the production server, I check out the new version of the software to the staging subdomain. If it works OK there, I can update the live site with a single SVN checkout.
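In shell terms, the promotion step might look roughly like this (URL and paths are illustrative):

    # first deployment to the staging subdomain
    svn checkout http://svn.example.com/repo/trunk /var/www/staging
    # ...test at staging.example.com...
    # the live site is a checkout too, so promoting is a single update
    svn update /var/www/live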
I've also heard of people placing a symbolic link in the location where the site root should be (/var/www/public_html) that points to the live directory (/var/www/site_ver_01234), then setting up the new version in a parallel directory (/var/www/site_ver_23456). Finally, they just recreate the symbolic link to point at the new version's directory. The switch is instantaneous and transparent. I'm sorry I'm not clearer on this method; I read about it a while back but never tried it myself.
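A hedged sketch of that switch, using the directories from the description; on GNU systems, mv -T makes the final swap atomic:

    ln -s /var/www/site_ver_23456 /var/www/public_html.new   # stage the new link
    mv -T /var/www/public_html.new /var/www/public_html      # atomic swap of the symlink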
I've also looked at Bazaar (another version control system); it has a plugin that automatically FTPs any changed files to a given server every time a revision is checked in.
The general idea, first of all, is to use a version control system. Using this, you're developing your site on your local machine or with several people, having a central repository somewhere.
When you're happy with a certain revision and would like to deploy it, you "tag" it. That means you freeze the state of that revision and separate it from the continually evolving "trunk". What that means specifically depends on your version control system.
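For instance, with Subversion or Git, tagging a revision might look like this (tag names are illustrative):

    # Subversion: a tag is a cheap server-side copy
    svn copy ^/trunk ^/tags/release-1.2 -m "tag release 1.2"
    # Git: an annotated tag freezes the current revision
    git tag -a v1.2 -m "release 1.2"
    git push origin v1.2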
You then take that tagged revision and copy it to the live server. You may first copy it to a "staging server" to test it in another environment. This copying can be as simple as overwriting all existing files using FTP, or it can involve automated deployment systems which will take care of the details for you and allow you to roll back an unsuccessful deployment. If a database is involved as well, you're probably also looking at database schema migration scripts that need to be run.
Each of these steps can be done in many different ways, and you'll have to figure out what's the best approach for you. If you're not doing so already, start using a version control system such as SVN or git. Do it now! Then you might want to google or search on SO about different techniques to tag and branch using that system. For serious deployment, start with a keyword like Capistrano or one of its PHP clones.
I have finished developing the core of a web application I have been working on. Since I was the only developer I just developed locally (lamp stack) without using version control (probably stupid but anyway..). Now that it is getting close to production ready, I have a couple other developers working with me so I set up a repository for my code.
This is my question: I still want to be able to test any changes locally before pushing them to production. How do I manage this with a repository without having to maintain two versions of my code that I have to sync up manually? For one, the production code has a few differences here and there (such as database constants, etc.). I'd like to be able to change my code in my local repository, test it on my local Apache server, then check the code directly into production (is this even possible using Eclipse?).
I am using Eclipse and Subversion (PHP code). I know I asked many questions, but hopefully you get the idea of what I am trying to do... and I assume it's rather common. Thanks.
In addition to the excellent answers you've gotten already, I'd like to emphasize that if there are differences between your dev and production code, you're adding risk. You should be using the same, well-tested code in both locations; any difference between the environments should be expressed in configuration files. Any configuration files in source control should be samples only; your deployment script should not push new configuration files to production.
This, in combination with tagged releases and a staging environment that mimics production, should help you promote your code smoothly to the production environment.
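As a hypothetical illustration of the "samples only" rule with Subversion (file names are made up):

    svn add config.sample.php              # only the template is versioned
    svn propset svn:ignore config.php .    # the real config never enters the repo
    cp config.sample.php config.php        # each environment fills in its own values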
I would suggest a few things:
Use tags/branches in SVN. When the code is production ready, tag it with a unique name.
Set up a staging area for integration testing. After a release is tagged for staging, yank it from your VCS and copy it into the staging area (see the sketch after this list). This can be as simple as a different directory tree or a second install of your server.
Put constants into separate files that can be copied/merged over into the staging and deployment directories.
Test the staged version against dev to ensure everything works as it did in your dev environment. I would point staging at the production databases once I am sure it is working and ready to be promoted, and test that it also works against prod.
Once everything works in staging, update the production copy. I would suggest you create a clean deployment directory, then copy that entire deployment over to the production server after copying/merging config settings.
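A rough sketch of steps 1, 2, and 5 in shell form (all names, URLs, and paths are illustrative):

    # 1: tag the production-ready code
    svn copy http://svn.example.com/repo/trunk \
             http://svn.example.com/repo/tags/rel-1.0 -m "production ready"
    # 2: yank the tag into the staging area
    svn export http://svn.example.com/repo/tags/rel-1.0 /srv/staging/site
    cp /srv/config/staging-constants.php /srv/staging/site/constants.php
    # ...test against dev, then prod...
    # 5: push the clean deployment directory to production
    rsync -a --delete /srv/staging/site/ user@prod.example.com:/var/www/site/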
This was my approach in dealing with perl/cgi many years ago, and it worked pretty well. SVN handles tags/branching much better, so it should be easier to deal with. We had very few production problems once we started staging the files before pushing to prod.
It sounds like you haven't created any branches or tags, and probably have a "trunk" that isn't labeled as such. Best practices would dictate that you have a trunk for the current stable code, branches that you develop against, and tags that are actually used on the production site. There is a short description and diagram on Wikipedia.
Of course, that's just best practice. Your project sounds small enough that you could get away with splitting your code into a development/ directory and a production/ directory in your repository. Check code into the development directory, and once a change is fully tested, merge it into the production directory.
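A rough sketch of that directory-based promotion with Subversion (URL is illustrative):

    # one-time setup: create the production copy
    svn copy http://svn.example.com/repo/development \
             http://svn.example.com/repo/production -m "create production copy"
    # later, from a working copy of production/, pull over fully tested changes
    svn merge http://svn.example.com/repo/development .
    svn commit -m "promote tested changes to production"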
Whether you do it the right way or the easy way, it's important to do something to separate your development code from your production code. As you add more developers, it will be increasingly unlikely that the development code base is stable because people are checking in code that hasn't been fully tested, isn't complete, whatever. Spending a little extra time on managing two branches of code will save you a lot of headaches later on.
I have an NSIS installer that we previously built using nAnt scripts that copy some files around and run makensis.exe via an exec task to build the installer exe. After the nAnt script completes, I have the complete structure for our CD and also our download.
I was just doing a get from SourceSafe onto an unused desktop, using it as a build box and compiling there. Sometimes we would have a couple of files checked in that fix something critical. In those cases I would go to the build box and very selectively get only those files, to avoid picking up other changed files that we aren't ready to release yet. Basically, I am able to let development continue and selectively include certain changed files in the installer for release.
Now we no longer have a free box and need to build from our server. So I am setting up CI Factory so that the developers can kick off the build without remoting into the server. The one issue I am struggling with is the best way to continue to allow this selective change control to occur. The default concept of CI that CI Factory implements is fine for the internal development "head". However, I also want to set up a CCNet project that is run only on demand, via a Force Build, for this "public release" type of build.
This is what I've brainstormed so far, without being sure how well this will work, if at all (I'm still figuring out what CCNet and CI Factory are all about). The "public release" CCNet project config/build would be set up so that it would not get latest, and modifications would not trigger a build. Since the other CCNet project uses the default CI methodology (we'll call it the "CI project") of getting latest when changes are detected, the two projects can't share the same working directory. So the "public release" project would need a different working copy, so that its files won't get updated when the CI project's build is triggered. The developer would need to remote into the server, open VSS, selectively do a get into the "public release" working copy, and then force a build through CI Factory.
The disadvantages I see with this are:
1) Having to remote in to selectively do gets.
2) I have no idea how to allow a single CI Factory project to have two different working copies of the Product folder, so that each project configuration block has its own.
3) I'm afraid of what kind of strangeness this might cause. I'm not quite sure yet how to specify a source control block in a CCNet project config block but prevent it from doing a get latest when it builds. I'm still gradually figuring out what is in scripts and can easily be taken out without breaking other things, versus what is not meant to be mucked around with and/or is not configurable.
I would really like to hear about how others deal with this issue of selectively releasing changes, if you have a similar situation. I am constrained to VSS, so my immediate need is to solve this with that in mind, but at the same time I'd be interested in hearing how you manage this with other source control systems. I guess you would probably have a branch that is your latest developments branch, and then merge changes into the trunk whenever you want to release them? I really don't trust VSS for branching/merging, and I think the branching concepts might be a little too much overhead and learning curve for this shop. Like I said though, stories with other source control systems would be useful future knowledge for me.
Thanks in advance.
You need a branching structure in your repository to facilitate this, something like the release branch method. Only select individuals can commit to this branch (or have a release/stable branch for that). Set up your manual CI launches to pull from the release branch; release nightly, and promote to milestone or final from there. I don't like the idea of manually modifying things on your build machine. Set up the changes in version control, in a safe place to prepare your release, and let CI build from there, but manually triggered.
Check out these branching patterns. I'd suggest C3, codeline-per-release, often called release branching.
Here's an article on VSS branching that includes a link to merging.
This question looks similar.
Maybe you could move to another source control system with better support for this kind of thing. Any suggestions from MS people out there?
Currently my workflow is as follows:
Locally, I maintain a git repo for each website I am working on. When the time comes to publish something, I compress the folder and upload this single file to the production server via ssh; then I decompress it, test the changes, move them to the live folder, and get rid of the .git folder.
I was wondering if the use of a git repo on the live server would be a good idea. It seems to be at first, but it could be problematic if a change doesn't look the same on the production server as on the local development machine... this could start a fire...
What about creating a bare repo in some folder on the production server, then cloning from there to the public folder? I would push updates from the local machine to the bare repo, and pull from the bare repo in the public folder of the production server... could anyone please provide some feedback?
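A minimal sketch of that bare-repo idea (host and paths are illustrative):

    # on the server: a bare repo plus a working clone in the public folder
    ssh user@www.example.com 'git init --bare /srv/repos/site.git'
    ssh user@www.example.com 'git clone /srv/repos/site.git /var/www/public'
    # on the dev machine: push finished work to the bare repo
    git remote add production user@www.example.com:/srv/repos/site.git
    git push production master
    # then pull it into the public folder
    ssh user@www.example.com 'cd /var/www/public && git pull'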
Later I read about Capistrano (http://capify.org) but I have no experience with this software...
In your experience what is the best practice/methodology to accomplish a website deployment/updates?
Thanks in advance and for your feedback.
I don't think that our method can be called best practice, but it has served us well.
We have several large databases for our application (20GB+), so maintaining local copies on each developer's computer has never really been an option, and even though we don't develop against the live database, we do need to develop against a database that is as close to the real thing as possible.
As a consequence we use a central web server as well, and keep a development branch of our subversion trunk on it. Generally we don't work on the same part of the system at once, but when we do need to do that, or someone is making a lot of substantial changes, we branch the trunk and create a new vhost on the dev server.
We also have a checkout of the code on the production servers, so after we're finished testing we simply do an svn update on the production servers. We've implemented a script that executes the update command on all servers using ssh. This is extremely convenient, since our code base is large and takes a lot of time to upload; Subversion will only copy the files that actually have been changed, so it's a lot faster.
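Such a script can be as small as a loop over the servers (hostnames and path are illustrative):

    #!/bin/sh
    # update the production checkout on every web server in turn
    for host in web1.example.com web2.example.com web3.example.com; do
        ssh "$host" 'svn update /var/www/site'
    done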
This has worked really well for us, and the only thing to watch out for is making changes on the production servers directly (which of course is a no-no from the beginning) since it might cause conflicts when updating.
I had never thought about having a repository copy on the server. After reading your question, I thought it might be cool... however, updating files directly in the live environment without testing is not a great idea.
You should always update a secondary environment matching exactly the live one (web server + DB version, if any) and test there. If everything goes well, then put the live site under maintenance, update the files, and go live again.
So I wouldn't make the live site a copy of the repository, but you could do so with the test env. You'll save SSH + compressing time, plus you can check out any specific revision you'd like to test.
Capistrano is great. The default recipes assume a Rails app. The documentation is spotty, but the mailing list is active, and getting it set up is pretty easy. Are you running Rails? It has some neat built-in stuff for Rails apps, but it is also used fairly frequently with other types of webapps.
There's also Webistrano, which is based on Capistrano but has a web front-end. Haven't used it myself. Another deployment system that seems to be gaining some traction, at least among Rails users, is Vlad the Deployer.