Deployment with CakePHP

I have a CakePHP website that is currently live. I would like to keep working on the site without impacting the deployed site.
What is the best way to keep a development version separate from the deployed version, and then merge the two when appropriate?
Currently, I am using Git for version control.
Thanks!

First thing, get to know a version control system. Subversion, Git, Bazaar, and Mercurial are some examples. They are a safety net that can save your bacon, because they record EVERY change to EVERY file in your fileset.
Then, typically, I have a local development server and also a subdomain (staging.example.com) on the production server. I do my heavy development on the local development server and use SVN to record all my site changes. Then, using a shell account on the production server, I check out the new version of the software to the staging subdomain. If it works OK there, I can update the live site with a single SVN checkout.
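Something like this, with a made-up repository URL and document-root paths:

    # on the production server, first time: check out into the staging docroot
    svn checkout https://svn.example.com/mysite/trunk /var/www/staging.example.com

    # once staging looks good, a single command brings the live docroot up to date
    # (the live docroot is itself a working copy created the same way)
    svn update /var/www/example.com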
I've also heard of people placing a symbolic link at the location where the site root should be (/var/www/public_html) that points to the live directory (/var/www/site_ver_01234), then setting up the new version in a parallel directory (/var/www/site_ver_23456). Finally, they recreate the symbolic link so it points to the new version's directory. The switch is instantaneous and transparent. I'm sorry I'm not clearer on this method; I read about it a while back but never tried it myself.
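Roughly, with made-up paths, the swap looks like this:

    # current live site
    ln -s /var/www/site_ver_01234 /var/www/public_html

    # prepare the new version alongside it, then repoint the link in one step
    ln -sfn /var/www/site_ver_23456 /var/www/public_html

(The -n flag keeps ln from following the existing link into the old directory; for a strictly atomic swap people usually create a temporary link and mv it over the old one.)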
I've also looked at Bazaar (another version control system), which has a plugin that automatically FTPs any changed files to a given server every time a revision is committed.

The general idea, first of all, is to use a version control system. With one in place, you develop your site on your local machine (possibly with several other people), with a central repository somewhere.
When you're happy with a certain revision and would like to deploy it, you "tag" it. That means you freeze the state of that revision and separate it from the continually evolving "trunk". What that means specifically depends on your version control system.
You then take that tagged revision and copy it to the live server. You may copy it to a "staging server" first to test it in another environment. This copying can be as simple as overwriting all existing files using FTP, or it can involve automated deployment systems which take care of the details for you and allow you to roll back an unsuccessful deployment. If a database is involved as well, you're probably also looking at database schema migration scripts that need to be run.
Each of these steps can be done in many different ways, and you'll have to figure out what's the best approach for you. If you're not doing so already, start using a version control system such as SVN or git. Do it now! Then you might want to google or search on SO about different techniques to tag and branch using that system. For serious deployment, start with a keyword like Capistrano or one of its PHP clones.
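With git, for instance, the tag-and-deploy cycle described above might look roughly like this (tag name, remote, and layout are only examples):

    # on your development machine
    git tag -a v1.2.0 -m "Release 1.2.0"
    git push origin v1.2.0

    # on the staging or live server, where the site is a clone of the repo
    git fetch origin
    git checkout v1.2.0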

Related

is there a deploy tool (or set of tools) that supports rollback of a deployment?

I'm learning FluentMigrator. The thing that I like about FM is that it supports the idea of Forward and Back for migrations (aka Up/Down). I'm finding that it's not ideal about this; there are some holes. Still, it's good.
This leads me to wonder if there are any deployment tools (NAnt, MSBuild, or others) that support this idea of rolling forward and back. The scenario that I'm using it in is the deployment of a web app with a related database.
Ideally I'd like to set up my deployment so that, should any part of it fail, it will revert to the previous known working configuration. With FM, this is pretty easy to do (but there are rough spots), so that covers the db. How about the files that make up the web app? Do any deploy tools have support for this?
Deploying to a Windows Server. Assume that I can't make any changes to the server.
I don't know of any Microsoft-centric, automated provisioning/deployment tools like Capistrano. Here are some tools I've heard of, but never used:
MSDeploy, for deploying web applications.
Microsoft Deployment Services, for managing operating system configuration
Microsoft's System Center Configuration Manager
BladeLogic
HP's Operations Center
Up until about three months ago, we did our deployment/provisioning using custom MSBuild scripts. After a server is provisioned, deploys happen automatically using Robocopy to copy files to a share on the application server, updating changed application binaries and markup files. We've never had a need to roll back any of our deployments, but since our scripts are custom, we could write the logic if we needed to.
MSBuild is a terrible deployment/provisioning language. For the past three months, we've been writing all new scripts in, and porting existing ones to, PowerShell. It is wonderful. With version 2, there is support for running commands on remote servers, like SSH. We haven't used that functionality yet, but I'm looking forward to pushing setup scripts to remote servers to provision and deploy at the same time.
We have been using Git to do our deploys for the last 6 months.
Here is the whole process (a rough command sketch follows the list):
CI server builds the project
CI server checks it into a local git repository
CI server pushes the changes to the centralised git repository
User creates an empty repository on the live server
User adds the central git repository to the remotes
User pulls the latest version over https (no need to open any ports)
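On the live server, steps 4-6 boil down to a few commands (URL and path are placeholders):

    cd /var/www/site
    git init
    git remote add origin https://git.example.com/project.git
    git pull origin master      # later deploys are just another pull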
It is a lot to set up in the beginning, but once set up it works great. Deploys take seconds as only changed files get copied.
Another great thing about this method is that git keeps a history of changes, so rolling back is pretty simple. You can also roll back a few revisions, and it's done straight on the live server. If something goes wrong, reverting is super fast.
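Rolling back is equally short (the commit id here is just an example):

    # on the live server
    git reset --hard HEAD~1     # back one deploy
    # or, to jump to any known-good revision:
    git reset --hard 4f2a9c1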
Also, you can save some time if you use a hosted git service (GitHub) for your central repository.
This is a very brief description but I can give you more info if you want.
Of course! My favorite is Capistrano. This was originally built for Ruby but I've found that it works just as well for other languages.
https://github.com/capistrano/capistrano
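A very rough sketch of the Capistrano 3 workflow (it assumes SSH access to the target; the stage name is just the default example):

    gem install capistrano
    cap install                     # generates Capfile plus config/deploy*.rb
    # edit config/deploy.rb with your repo URL and servers, then:
    cap production deploy
    cap production deploy:rollback  # revert to the previously deployed release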

Can anyone explain the steps to set up a version control system on Linux?

I have been trying a lot and I am not able to work out how version control would work in my scenario.
I have the VPS server where i host php sites. Users have home directories in /home/users.
Currently users edit files via FTP and I have no control over what they do. I want to set up a version control system on the VPS, but I don't know how to start. I mean:
I will explain what I want; I may be wrong, so please correct me.
How can I install a VCS on my VPS server so that all directories in /home/users are version controlled? I don't know if it's possible or not. I want the final saving place (the repo) to be /home/user/public_html, so that when a user commits, my live site changes. I don't know if a VCS works that way or not.
How will my client computers connect to that VCS server?
Is it possible to have version control for just one user, i.e. /home/user1/public_html, and not for others?
Users will still have FTP details; can't they still change files via FTP even if I use a VCS?
Please clear up my doubts; I really want to learn VCS systems.
Yes, it should be feasible. Expect to store some extra data, as the whole history will be kept, plus a separate copy of the current version for each checked-out working copy.
You have to decide which version control system you want to use. The most common options are:
Subversion
Git
Mercurial
Bazaar
If you or your users already have experience with one of them, then it's probably the best choice.
You want to:
Install the version control system of your choice and create a post-commit hook to check out each version into the target directories (a sketch of such a hook follows this list).
Clients will commit into the repository. All the systems support access through restricted ssh (users log in using a public key, and the key is set in .ssh/authorized_keys to only allow one particular command). Some also have an HTTP(S)-based method (a special Apache module for Subversion, CGI scripts for Mercurial, Bazaar and Git).
Yes; the hook script will check out what you tell it to. You can implement it to checkout for all users, listed users, users in a group, whatever you need.
Turn the FTP server off.
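As a simplified illustration for Subversion (paths are hypothetical): keep a working copy in the user's public_html and let the post-commit hook update it.

    # /srv/svn/repo/hooks/post-commit  (must be executable)
    #!/bin/sh
    REPOS="$1"   # standard post-commit arguments
    REV="$2"
    # bring the live working copy up to the revision that was just committed
    svn update --quiet -r "$REV" /home/user1/public_html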
Usually the workflow is that you have a repository with all the revisions and changes. This uses a special format; there is no point in directly accessing these files. The repo is typically accessed through a WebDAV interface (running as an Apache module), or by running a standalone server (with its own protocol).
Users commit their changes to the repo, then they can export the latest revision (or one of their choice) to their publicly accessible public_html directory. This involves them interacting with the VCS and knowing (and caring) about it.
A simpler setup is for public_html to contain a working copy that they interact with through conventional FTP. (You have to make sure that the VCS's files, for example the .svn folders, cannot be accessed by the general public.) This way you can expose the VCS functions (basically commit and rollback) to your users through a web interface (you write a small PHP script that does the commit and update for them).
Incremental backups: a completely different story
As I understood you probably need something more like incremental backups, for example rsync. Each time a user closes an FTP connection you can initialize an rsync backup. It has flexible options, you can have all the changes for the last X days, or last X FTP sessions, so the user could roll back after an accidental upload. (It can be used with a remote or local storage for backups).
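For example (paths are made up): keep a mirror of the site and push whatever each FTP session changed or deleted into a dated directory.

    rsync -a --delete --backup \
          --backup-dir=/backups/user1/$(date +%F-%H%M) \
          /home/user1/public_html/ /backups/user1/current/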
VCS (Version Control System) is just a class of software: You need to select one before you can implement it. In your case you probably want subversion, or one of the DVCS (Distributed Version control system) (git or mercurial).
It sounds like what you want is some kind of automated deployment system for your websites, which is certainly possible.
Disabling FTP is easy: simply stop the FTP server from running. FTP is insecure, and the servers themselves are often dangerous.
Have a look at how Branchable works. They have a specific web framework (ikiwiki), but the underlying principle of keeping the web sites in version control (git) is the same, and all the software they use is open-source, including the scripts that bind it all together, so you can look at how it works.

Straightforward single-developer deployment with Mercurial and NetBeans?

I am coding a website using the Codeigniter PHP framework.
I am using mercurial for version control.
I have 3 systems I work with. I do my coding on a Windows 7 machine using Netbeans 6.9.1. I am occasionally making commits, and pushing to a repository at Bitbucket.org, purely for the purposes of backup and version control.
I have a "beta" website (on a shared Linux box with its own dedicated IP address) that I upload to using FTP, where I can test that everything is working as intended on an actual site running Linux.
Once I'm happy with that, I upload to my "live" site, which is on its own dedicated server. Again, I'm just using FTP to upload the files from my development server.
I realize that this is all kinds of wrong. For one thing I have to go in and change some things on the beta and live machines so that they're referring to the correct domain name, instead of localhost. For another, I'm not making use of mercurial at all to help with this. I assume instead of uploading from FTP, I could be using mercurial to "grab" a particular revision that I've marked as ready to deploy. I also think I could possibly be doing something in Netbeans differently to make the process easier.
What I want to do is have some very smooth way to control all this, and hopefully one that knows how to deal with the slightly different configuration setup for the beta and live sites compared to localhost.
Is there a standard way to do what I'm looking for? I've seen references to some third party apps for "continuous integration" but I'm not sure I need anything like that.
I'm a little lost as to what would be the SIMPLEST thing for me to do that would make my life easier....any help greatly appreciated :) Thanks!
It depends on how different the setup for each site is, and whether there are secrets involved which should not be visible in a public place (I assume you use a public Bitbucket repository).
If the changes are not sensitive, then you can add two additional branches for your test and production servers, where only the configuration changes are applied. Every time you change something in default and deploy it to test, you simply merge default on top of test, and Mercurial fills in the branch-specific configuration settings in the process. The server deployment would then be a call to hg archive within the correct branch.
A typical change history would look like this:
O----o-o-o-o-o-o-o-o---o         default
 \        \             \
  T1-------T2------------T3      test
   \                      \
    P1---------------------P2    production
where in T1 and P1 the parameters for test and production are filled in. You also can use this branch setup to mature the development of your site, where you hack in default, and only propagate stable changes into test and production.
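Under that layout, a routine update of the test site could look like this (branch names as in the diagram; the archive path is just an example):

    hg update test
    hg merge default
    hg commit -m "merge default into test"
    hg archive -r test /tmp/site-test.tar.gz   # unpack this on the test server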
If the changes are sensitive, you can create an unversioned deploy script (or better, a versioned deployment script and an unversioned configuration file) which patches the output of hg archive.
You should use deployment scripts anyway; they handle the packaging of the product and deploy it on the target in an automated and standardized way. Within such a script you can also embed information about the source revision into the final archive.
Note that this model works fine for an environment where no changes are made on the server. If you do change the product on the server, you need to copy the files from the server back into your development environment (at the correct revision) to check what was changed there. If you want to make changes on the server as well, you might want to install Mercurial there too.

Eclipse / Aptana File Sync Solutions

Our development team uses Eclipse + Aptana to do their web development work. Currently, most of them are mapping their Eclipse projects directly to the web server. I'd rather them create a local project and use that to sync to the web server project directory they are working on.
The issue is that there aren't any good solutions for this, which is just appalling given the popularity of the two.
The FileSync plugin for Eclipse is only one-way, meaning that if one developer makes a change to a file on the server, another dev isn't even notified and could overwrite the change.
The File Transfer option in Aptana 2.0 doesn't support any sort of Sync, just manually uploading/downloading files.
The Sync option in Aptana 1.5.1 doesn't allow you to merge files when they are different. You can only update one or the other. It does however allow you to view a diff (but only if you right click and select) and in that diff you can't make any changes.
I did find a way to have files uploaded to their Sync repositories in Aptana using Eclipse Monkey. However, it doesn't work if a user saves multiple files at once ('Save All'). Additionally, there is no notification if a user opens a local file that has an updated copy on the server. I tried to add one using Eclipse Monkey, but I couldn't find any sort of listener in the Eclipse API to do it, and Eclipse Monkey documentation is few and far between.
My only solution at this point is just to let them continue to map directly to the server or ask them to do a manual download before they do any work (but again what if someone uploads a change right after they do that).
Anyone have any ideas?
April 2010
Add EGit to your Eclipse+Aptana setup, and:
let developers push their work to a local bare repo (see also this post)
let your local project be updated by a git pull from that same local bare repo, creating/updating a local working directory with merged/updated sources (or by using a post-update hook as described in my previous SO link; a hook sketch follows this list)
let your local Aptana+Eclipse(+EGit) reference that local working directory, also used by your web server.
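One common way to wire that up (directory names are examples, nothing here is specific to EGit): a bare repo whose post-update hook refreshes the working directory your web server and Aptana project point at.

    # one-time setup on the machine
    git init --bare /srv/git/site.git

    # /srv/git/site.git/hooks/post-update  (chmod +x it)
    #!/bin/sh
    GIT_WORK_TREE=/var/www/site git checkout -f master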
In short, when you are talking about file synchronization + merges, this is a job for a (D)VCS (a centralized or distributed Version Control System).
Oct 2011: as xmedeko mentions in the comments, Aptana3 has its own Git plugin.
And it isn't very compatible with EGit: See bug 1988.
Adding to VonC's answer (which is correct IMHO), what probably lies beneath this scenario is that the process you adopted is not correct in itself, apart from the tools used.
If I understood correctly, you should neither allow nor perform a direct upload from a development version of the project to the web server. Merging is not a job for remote synchronization tools, and it should happen well before the deployment phase (an upload to the web server is practically a deploy).
You should have a dedicated repository taken from some point in the development history (according to your release timeline), a point where the merge has already happened. Then deploy it (by means of file synchronization if you want, but that is not mandatory) to a local/staging web server.
There, perform any tests you run against the actively running web site (i.e. integration and/or functional tests). If bugs need fixing, there are different ways to apply the fixes to the development & staging code repositories. Only after that do you deploy the staging repository to the production web server (again, synchronization tools are one way to do that).

Best practice for updating a website

Currently my workflow is as follows:
Locally, I maintain a git repo for each website I am working on. When the time comes to publish something, I compress the folder and upload this single file to the production server via ssh, then I decompress it, test the changes, move them to the live folder, and get rid of the .git folder.
I was wondering if using a git repo on the live server is a good idea. It seems to be at first, but it can be problematic if a change doesn't look the same on the production server as it does on the local development machine... this could start a fire...
What about creating a bare repo in some folder on the production server, then cloning from there to the public folder: pushing updates from the local machine to the bare repo and pulling from the bare repo in the public folder of the production server... Could anyone please provide some feedback?
Later I read about Capistrano (http://capify.org), but I have no experience with this software...
In your experience what is the best practice/methodology to accomplish a website deployment/updates?
Thanks in advance and for your feedback.
I don't think that our method can be called best practice, but it has served us well.
We have several large databases for our application (20 GB+), so maintaining local copies on each developer's computer has never really been an option, and even though we don't develop against the live database, we do need to develop against a database that is as close to the real thing as possible.
As a consequence we use a central web server as well, and keep a development branch of our subversion trunk on it. Generally we don't work on the same part of the system at once, but when we do need to do that, or someone is making a lot of substantial changes, we branch the trunk and create a new vhost on the dev server.
We also have a checkout of the code on the production servers, so after we're finished testing we simply do an svn update on the production servers. We've implemented a script that executes the update command on all servers using ssh. This is extremely convenient, since our code base is large and takes a lot of time to upload. Subversion will only copy the files that have actually been changed, so it's a lot faster.
This has worked really well for us, and the only thing to watch out for is making changes on the production servers directly (which of course is a no-no from the beginning) since it might cause conflicts when updating.
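The update script mentioned above can be as small as this (host names and path are invented):

    #!/bin/sh
    # run 'svn update' on every production server
    for host in web1.example.com web2.example.com web3.example.com; do
        ssh "$host" "svn update /var/www/site"
    done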
I never thought about having a copy of the repository on the server. After reading this, I thought it might be cool... However, updating the files directly in the live environment without testing is not a great idea.
You should always update a secondary environment matching exactly the live one (webserver + DB version, if any) and test there. If everything goes well, then put the live site under maintenance, update files, and go live again.
So I wouldn't make the live site a copy of the repository, but you could do so with the test env. You'll save SSH + compressing time, plus you can check out any specific revision you'd like to test.
Capistrano is great. The default recipes are geared toward Rails and the documentation is spotty, but the mailing list is active, and getting it set up is pretty easy. Are you running Rails? It has some neat built-in stuff for Rails apps, but it is also used fairly frequently with other types of webapps.
There's also Webistrano, which is based on Capistrano but has a web front-end. Haven't used it myself. Another deployment system that seems to be gaining some traction, at least among Rails users, is Vlad the Deployer.