A deployment tool for PHP

Hi, I have a PHP project and I need a tool to handle deployment. I found Deployer, a deployment tool for PHP. What is your opinion of it and your experience with it?
Regards

Deployer is a good choice: it has a simple API, a lot of recipes for popular frameworks and apps, and it can deploy to servers in parallel. Also, it only requires PHP.
deployer.org
github.com/deployphp/deployer
Here is an example of a small task:
task('my_task', function () {
    // Your task code...
});
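If Deployer is installed through Composer, a task defined like this in deploy.php can then be invoked by name with the dep CLI (my_task is just the example name from above):
vendor/bin/dep my_task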
Deployer also has good code quality.

I use Deployer. It works as advertised, or better. That is my experience and opinion, to answer your question. Let me give some more information:
You get parallel deployment to remote servers, and rolling forward and back is very simple. Your release and rollback procedures become documented in code and repeatable by anyone on your team with the right permissions.
Your directory structure looks like this:
./releases/yyyymmddhhmmss/
./current -> ./releases/currentrelease/
./shared/
The releases folder contains each release that has been made. You should tag these in git yourself too, because Deployer will clean them up when they get old (you configure how many to keep). current is a symlink that points at the current, ready-to-go release. You can move it forward and back to release and roll back; Deployer manages it all for you.
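For reference, the cleanup and the shared folder mentioned above are driven by a few settings in deploy.php. This is only a sketch: the option names come from Deployer's common recipe, but the .env and storage/logs values are made-up examples:
set('keep_releases', 5);              // how many old releases to keep before cleanup
set('shared_files', ['.env']);        // files symlinked from ./shared/ into each release (example value)
set('shared_dirs', ['storage/logs']); // directories shared across releases (example value)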
You set up and configure Deployer by creating your Deployerfile (deploy.php) in your project root. Use Composer to install Deployer.
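As a rough sketch of that setup (host names, the repository URL and paths below are placeholders, and the exact API depends on your Deployer version; this follows the 6.x style), first install Deployer:
composer require deployer/deployer --dev
Then a minimal deploy.php might look like:
<?php
namespace Deployer;

require 'recipe/common.php';

set('application', 'my_app');                          // placeholder name
set('repository', 'git@example.com:acme/my_app.git');  // placeholder repository

host('staging.example.com')                            // placeholder staging host
    ->stage('stage')
    ->set('deploy_path', '/var/www/my_app');

host('prod1.example.com')                              // placeholder production host
    ->stage('production')
    ->set('deploy_path', '/var/www/my_app');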
Then, to deploy to all staging servers, simply run:
vendor/bin/dep deploy stage
And when you're happy, you can push to all production servers in parallel with:
vendor/bin/dep deploy production
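If a release turns out to be bad, the common recipe also ships a rollback task that points the current symlink back at the previous release, e.g.:
vendor/bin/dep rollback production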
The only really "difficult" task was figuring out how to integrate with our database migrations; Deployer gives you many tools, but migrations are of course left up to your database layer. I would also love to see some more resources from Deployer with suggestions on permissions and thoughts on using a dedicated deploy user on the remote hosts.
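One way to wire migrations in is a custom task hooked in before the symlink switch. This is only a sketch: run(), before(), {{release_path}} and deploy:symlink come from Deployer itself, but the artisan command is just an example, so substitute whatever your database layer uses:
task('database:migrate', function () {
    // Run migrations inside the new release before it goes live.
    run('cd {{release_path}} && php artisan migrate --force');
});
before('deploy:symlink', 'database:migrate');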
It will be worth your time to automate your release and rollback procedures so that anyone on your team can handle them. Deployer is a great tool set for doing that.

Related

Automatically Combine/Bundle EF Migration(s)

I'm wondering if it is possible to run some automated tasks either on a Release (web deploy) action, or Branch Merge (TFS) action?
Ideally I would like to set up a process that will automatically combine EF migrations since the last release. I'm still looking into how I would automate this, but I think the first step is hooking into a suitable event.
I haven't set up a build server yet, but I'm guessing that if the above isn't possible, then this would be an option for attaching a custom procedure to the MSBuild task?
Alternatively, if anyone has experience in automating things like this I would be happy to hear it. I am the head of development at a web development company and I would like to streamline our current processes by automating some of our standard procedures, and this is something we do over and over again for each development effort!
I appreciate your time looking at my question, thanks.
VSTS and TFS2015 both support a CI/CD process via their new build and release system. Very flexible and powerful. Check it out!
https://msdn.microsoft.com/Library/vs/alm/Release/getting-started/understand-rm
VS/WebDeploy does support deploying EF migrations with a web application:
https://msdn.microsoft.com/en-us/library/dd394698?f=255&MSPPError=-2147217396#efcfmigrations
This works fine for deploying a small application/system but when you want to deploy a larger system with many components it doesn't work as well. We create MSDeploy packages for each component of the system. For example, this is how we deploy SQL databases:
http://dotnetcatch.chief7.space/2016/02/10/deploying-a-database-project-with-msdeploy/

How can I share deployment code between Lab Management and Release Management

After having just started using Microsoft Release Management, I am more and more convinced that it is not well suited to running integration tests. This might be a false impression, and I'd love to get more input on this. When we first considered it, I intended to run the tests defined in our test plan through its pipeline, but now I'm seeing that we should be running those as frequently as possible. We would like to run integration testing every night, but our release candidates are only defined at the end of sprints, so using Release Management for that seems conflicting.
With the tool out of the equation, we are considering exploring the Lab Template again. We did some very minor tests with it a few months ago in a legacy project but never went too far. My main concern now is that both stages need deployment:
the Release Management pipeline needs to deploy our projects to the QA and production environment
the Lab Template also needs to deploy the project on a few virtual machines to run integration tests on
Release Management uses some very nice abstractions to achieve that. You can define machine scopes and define components based on the drop folder structure to describe each part of the whole application to be deployed. On the other hand, the Lab Management workflow does not support this (or perhaps I'm just missing it). The standard way to make deployment work for lab testing is to write a custom PowerShell script that moves the files from the build drop folder to the correct places, creates the application pools for web projects, and so on, all by hand.
Ideally, I'd like to just share the entire deployment workflow between both tools and, since the Release Management way of doing it seems much simpler, I'd use that. This would make it easier to maintain both pipelines at the same time, which I assume is actually commonplace.
What is the correct approach to share the deployment code as much as possible between the two tools?
I would expect that better integration between RM and MTM/LM will be a future feature. In the interim, you could investigate using Desired State Configuration so that a single script configures environments for you.
DSC support isn't really out-of-the-box in RM Update 2, but RM Update 3 will have built-in support for DSC to both Azure and on-prem VMs. Update 3 CTP 1 is out right now, but it's not production-ready.
You can still use DSC from RM in Update 2, it just requires a bit more work.

How can I set up continuous deployment with TFSBuild for an MVC app?

I have some questions around the best mechanism to deploy MVC web applications to different environments. Previously I used setup projects (.msi's) but as these have been discontinued in VS2012 I am looking to move to an alternative.
Let me explain my current setup. I currently have a CI setup using TFSBuild 2010 with Team Foundation Server for source control.
A number of developers work on their local machines and check in to the TFS Server. We regularly deploy to a single server dev environment and a load balanced qa environment with 2 servers. Our current process includes installing an msi which carries out some of the following custom actions:
brings the current app offline with the app_offline.htm file
runs database scripts (from the database project in the solution)
modifies web.config (different for each QA web server)
labels the code
warms up each deployed page via HTTP request
etc.
This is the current process. Now I would like to make some changes. Firstly, I need an alternative to MSIs. From some research I believe that web deploy via IIS using MSDeploy is the best alternative. I can use web.config transforms for the web.config modifications. Is this correct, and if so, could I get an outline of what I need to do?
Secondly, I want to set up continuous delivery via TFSBuild. I have no idea how this may be achieved; would it be possible to get an outline of how it can be integrated into my current setup? Rather than being check-in driven, I would like it to be user driven following a check-in. Also, would it be possible for this to also run the database scripts from the database project in the solution?
Finally, there is also a production environment, but I would like to manually deploy this - can my process also produce an artifact that I can manually install?
Vishal Joshi has some information on his blog that is reasonably good, http://vishaljoshi.blogspot.com/2010/11/team-build-web-deployment-web-deploy-vs.html. It does have the downside that your deployment password is included in the properties you pass to MSBuild.
Syed Hashimi has also posted some information on this in another question: Team Build: Publish locally using MSDeploy.

Best way for deploying websites?

How do you deploy your websites?
For example: I am developing a site with a PHP framework and have it under version control with git, with all my local configs. When I want to put it on a web server for testing or to update the live application, I have to copy it onto the server, change the config files, delete my test stuff, etc.
So how do you handle these tasks?
I thought about using ant and write a deployment script for this.
Does a common solution for this "problem" already exist? I don't think I'm the only one who needs something like this.
There is quite a bunch of stuff available, but you might like Phing (like Ant for Java).
Questions related to PHP+Phing:
Do you use Phing?
How do you manage your build [using Phing] process?
Setting up a deployment / build / CI cycle for PHP projects
what can Phing do that Ant can't?
Also, this question sounds very interesting: How To Deploy Your PHP Applications Correctly?
A specific question (a possible duplicate of yours) was answered a while ago; take a look at it: Deploy a project using Git push
It seems you are using PHP, so you should be good to go with Capistrano. It is very easy to use Capistrano for deployment with Rails, but it can also be tweaked a bit to work for PHP.
Basically what you do with Capistrano is:
Tell it which is your application server
Tell it your database server
Tell it your web server (in most cases the web server, app server and DB server are the same)
Specify your git repository and the branch you want to deploy from
Once configured, you can deploy with Capistrano with a single command. You can even roll back your deployments to one of the backup releases created by Capistrano.
Now, for some of the repetitive chores, like copying config files such as database configs (which are generally ignored in git), you create tasks that simply create symlinks or copy the files to the appropriate location. These tasks are called via deploy hooks, e.g. the after_symlink hook.
You can find more about capistrano here - https://github.com/capistrano/capistrano/wiki
It comes with very good documentation; after getting an overview, you may search for your framework-specific approach to doing this.

Solutions for automated deployment in developer environments?

I am setting up an automated deployment environment for a number of decoupled services that are in active development. While I am comfortable with the automated deployment/configuration management aspect, I am looking for strategies on how best to structure the deployment environment to make things a bit easier for developers. Some things to take into consideration:
Developers are generally building web applications, web services, and daemons -- all of which talk to one another over HTTP, sockets, etc.
The developers may not have everything running on their local machine, but still need to be able to quickly do end-to-end testing by pointing their machine at the environment
My biggest concern with continuous deployment is that we have a large team and I do not want to be constantly restarting services while developers are working locally against those remote servers. On the flip side, delaying deployments to this development environment makes integration testing much more difficult.
Can you recommend a strategy that you have used in this situation in the past that has worked well?
Continuous integration doesn't have to mean continuous deployment. You can compile/unit test/etc. the code "continuously" throughout the day without deploying it and only deploy at night. This is often a good idea anyway - to deploy at night or on demand - since people may be integration testing during the day and wouldn't want the codebase to change out from under them.
Consider how much of the software developers can test locally. If a lot, they shouldn't need the environment constantly. If not a lot, it would be good to set up mocks/stubs so that much more can be tested on a local server. Then the deployed environment is only needed for true integration testing and doesn't need to be updated constantly throughout the day.
I'd suggest setting up a CI server (Hudson?) and using it to control all deployments to both your QA and production servers. This forces you to automate all aspects of deployment and ensures that there are no ad-hoc restarts of the system by developers.
I'd further suggest that you consider publishing your build output to a repository manager like Nexus, Artifactory or Archiva. That way, deployment scripts could retrieve any version of a previous build. The use of a repository manager would also enable your QA team to certify a release prior to its deployment to production.
Finally, consider one of the emerging deployment automation tools. Tools like Chef, Puppet and ControlTier can be used to further version-control the configuration of your infrastructure.
I agree with Mark's suggestion of using Hudson for build automation. We have seen successful continuous deployment projects that use Nolio ASAP (http://www.noliosoft.com) to automatically deploy the application once the build is ready. As stated, Chef, Puppet and the like are good for middleware installations and configurations, but when you need to continuously release new application versions, a platform such as Nolio ASAP, which is application-centric, is better suited.
You should have your best IT operations folks create and approve the application release processes, and then provide an interface for the developers to run these processes on approved environments.