How do you apply patches to a web project on a production server? - version-control

We recently had a project where we released a beta of a big web app on our client's server. Our client asked us to fix bugs as they come up, and we have tried to do so. Working on our prototype server is far easier: I just issue a simple 'svn up' command, which takes a second.
But in the production environment we do not have any version control tool available. Is it possible to automate the patching work, so that we don't have to log in over FTP and upload each and every file one by one?
It's very difficult to work this way. Since I'm running into this problem, surely some of you have already solved it. Please share your solutions.
Looking forward to your replies... Thanks a lot for reading guys.

Depending on the tools available on the server, you could do an svn diff -r x:y (where x is the revision you last updated to and y is the revision you want to update to, probably the latest revision in your repository) to generate a patch, and then apply it with the patch command.
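A minimal sketch of that workflow, with made-up revision numbers and repository URL, assuming the patch utility is installed on the server:

    # On a machine with Subversion: diff between the deployed and target revisions
    svn diff -r 100:110 http://svn.example.com/myapp/trunk > update.patch

    # Copy update.patch to the server, then apply it from the webroot
    patch -p0 < update.patch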
If rsync is available on the production platform and you can use it (through ssh, for instance), you could set up a production-ready tree, rsync it to the production server, and when an update comes in, svn update your production tree and rsync it again.
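For example (hostnames and paths invented; --delete removes remote files that no longer exist locally, so use it with care):

    # Refresh the local production-ready tree
    svn update /srv/deploy/myapp

    # Mirror it to production over ssh, leaving out Subversion metadata
    rsync -avz --delete --exclude='.svn' /srv/deploy/myapp/ user@prod.example.com:/var/www/myapp/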

What is stopping you from installing a Subversion client on the production server?
[EDIT] So someone doesn't allow you to install the software you need on the server. The question is: what is more important, a stable production server or an arbitrary policy? If that someone doesn't listen to arguments, go to your computer, start MS Word and write this letter:
"I hereby refuse to accept any responsibility for the stability of our production system based on the fact that [insert name here] refuses to equip me with the tools to make sure that the production system contains all the necessary files and data after an installation."
Sign this, have your boss sign it and then send a copy to [insert name here]. All of a sudden, any problem that might arise after an installation will be on his turf. Or to put it more clearly: He will be responsible for any mistake you might make.
Now, all you have to do is wait. :)

It depends on the programming environment you use. In Smalltalk, with a web application server like Aida/Web, we can upgrade live web applications on the fly, without stopping them.
The server is connected to the SCM of choice like Monticello for Squeak Smalltalk or Store for VisualWorks. New versions are then manually or automatically loaded to the server's Smalltalk image.

Related

Domino 8.5.3 - Create an organization extension library / codestore

This is a project I've been working on off and on for months and I feel like I'm pretty close, but I just can't seem to get past the final hurdle.
The goal is to develop an organization extension library that contains both internal and 3rd party code that we frequently rely on.
History
As a test project, I started with Apache Poi because that is already in wide use in our environment. I have a plug-in and feature built just from the Poi .jars that allows me to build our current Poi applications as long as I add the plug-in (from my workspace) to my build path. The apps work on the servers because we have already distributed the Poi .jars by manually copying them.
The next step is taking that plug-in and getting it into an updatesite so that all of the servers and developers can synchronize on one version. I found and followed these two excellent blog articles (that I wish existed when I started this project):
http://www.dalsgaard-data.eu/blog/wrap-an-existing-jar-file-into-a-plug-in/
http://www.dalsgaard-data.eu/blog/deploy-an-eclipse-update-site-to-ibm-domino-and-ibm-domino-designer/
One caveat: the articles are written for Domino 9 and we are running 8.5.3 here, but that only matters in the last (installation) step.
Current
This brings us to the problem. All of the above seems to have worked great up to a point. I can install my feature to my designer client from the eclipse update site and it works great. However, the install is failing when I import that into our updatesite.nsf database. This means that while the developers can all install from the updatesite if I put it on a network drive, that doesn't deploy updates to our servers.
The problem is that when I try to install from the .nsf update site, the Eclipse Updater just hangs. I've let it go for well over an hour and eventually Notes becomes completely unresponsive.
So the question is, is there anything I might have done wrong, either in the development of the plug-in or server configuration that might be causing this issue?
Additional Info
I'm looking at the OSGi console and it is largely unhelpful. I am getting the following error as I try to install: SEVERE Could not access digest on the site: no protocol: 0/5B004DDD5E38F3FF85257CAF004C72C7/$file/digest.zip ::class.method=unknown ::thread=Worker-7 ::loggername=org.eclipse.update.core
I could generate dumps if that would be useful.
Security is also locked down fairly tight here. It could be a security issue - is there a way to troubleshoot that? Once I get to the hang I'm just stuck guessing.
This has been edited for clarity and to update information
I know this post is over 5 years old, but for those who find it while trying to resolve the error
SEVERE Could not access digest on the site: no protocol:
it is caused by the URL of the Domino updatesite.nsf not being added to the Archives tab of the update site project's site.xml.
I found the updatesite.nsf also needs to be anonymously accessible, as no credentials are prompted for or passed through to the Domino server hosting the updatesite.nsf database (at least from DDE; YMMV from Eclipse). So if anonymous connections are blocked on the Domino server, you will be out of luck.
To develop a plug-in you really want to have 3 projects:
the plug-in
the feature
the update site
Of course a feature can contain more than one plug-in (and probably should), and an update site can contain more than one feature (and probably should). Once you have an update site project it features a handy "Build All" button that makes sure the plug-in, feature and update site get compiled in one go. And that button is what you really want.
Using a setting, you can point your Domino Designer (or local Domino server) to the feature directory: add a plain-text .link file to framework/rcp/eclipse/links that contains the path to your install site, and it then picks up the features and plug-ins from there. After a build you need to restart Designer/the server to activate the updated feature.
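For illustration, the .link file is a single line; the file name and path here are invented:

    # e.g. framework/rcp/eclipse/links/myupdatesite.link
    path=C:/work/domino/myupdatesite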
For the Domino server, the approach using an updatesite.nsf and the respective notes.ini setting makes the most sense (to me). An HTTP restart is required. Lazy people script the whole thing.
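If I'm remembering the linked articles correctly, the server side boils down to one notes.ini variable plus an HTTP restart; verify the exact variable name for your Domino version:

    # notes.ini on the Domino server: point the OSGi framework at the update site
    OSGI_HTTP_DYNAMIC_BUNDLES=updatesite.nsf

    # then, from the Domino server console, restart the HTTP task
    restart task http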
I still don't have a great answer for this, but I believe the issue is related to the environment here. I don't have the authority to change the environment, even if I were able to conclusively demonstrate it is the cause of this problem, so it is a moot point. All I can say is that at least one administrator computer had no issue installing from the update site.
For me, the solution for distributing the update site is to put it on a network drive and have everyone install it from there. The server has no problem using it from the updatesite.nsf.

What happens to existing workspaces after upgrading to TFS 2010

I was looking for some insight into what happens to existing workspaces and files that are already checked out by people after an upgrade to TFS 2010. Surprisingly enough, I can not find any satisfactory information on this. (I am talking about upgrading on new hardware, by the way: a fresh TFS instance with upgraded databases.)
I've checked the TFS Installation Guide and searched the web; all I could find were upgrade scenarios for the server side. Nobody even mentions what happens to source control clients.
I've created a virtual machine to test the upgrade process. The upgrade was successful, and all my files and workspaces exist on the new server too. The problem is: the new TFS installation has a new instance ID. When I redirected the clients to the new server, they seemed unable to match files and file states in the workspace with the ones on the new server. This makes me wonder if it will be possible to keep working after the production upgrade.
As I mentioned above I can not find anything on this, it would be great if anyone could point me to some paper or blog post about this.
Thanks in advance...
When you do an upgrade your server ID should stay the same. You may need to change it if you want to clone your environment.
In your test scenario you are creating a clone of the TFS server rather than a straight upgrade.
ChangeServerID
You are probably running into problems because ChangeServerID has been run on your test environment to let it run on the same network as your production TFS server.
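For reference, that command is run with the TfsConfig tool on the TFS application tier; roughly like this (the instance and database names are placeholders, so check the documentation for the exact syntax on your version):

    TfsConfig ChangeServerID /SQLInstance:MySqlInstance /DatabaseName:Tfs_Configuration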
All workspaces and shelvesets remain unchanged, and people will be able to continue working immediately. Even checked-out files are OK and will be picked up correctly.
I would recommend upgrading the server first, and keep the clients as 2008 (using the Forward Compatibility Pack), and then upgrading the clients to 2010 as and when the projects are upgraded.

Deployment strategy for distributing application upgrades across large organization

We will be embarking on an application development project (.NET 3.5) for a large organization. As we started thinking about the upgrades we would be rolling out across the machines, we began looking at options like ClickOnce.
What we need is a push model: as long as the client machine is connected to the network, the server can send updates. I believe ClickOnce is a pull model (although by specifying a minimum version we can kind of push). Also, ClickOnce downloads complete files only; it cannot download just the change (byte difference) between files.
Can anyone point me to a better tool that can be used here? Better strategies, if any, are also welcome; we are at a very early stage of the project.
I don't have a definitive answer on better options, but I've used ClickOnce and can offer some advice.
There are several update options with ClickOnce (before starting, after starting, check every time, check every X Hours/Days/Weeks, etc). You can also throw those out and write code to check for updates. It's not a "push" from the server, but your client could poll for updates which would be the next best thing. Just remember, the application is going to have to restart after the update to see changes.
ClickOnce only downloads changed files. However, the progress dialog always shows the entire size of the application even if it's only downloading a single file. Everyone worries about that, but it's just a bug with the progress dialog.
Finally, I'm a big fan of keeping it simple. It's really easy to over-think these things and create a monstrosity that was never needed. We went through something similar at my company. We were so worried about users downloading unnecessary bytes, we broke our apps up into more, smaller assemblies. It turned into a nightmare; apps were harder to maintain and performed worse on the client. We finally undid it all and wasted weeks just to end up where we started.
I'm not saying you don't need the features you're asking for, I don't know your scenario. Just educate yourself first and know what you're getting yourself into.
We use ClickOnce at my company (a few hundred users for the app, geographically dispersed). By specifying the minimum version we can make sure that every app installation gets updated after deployment automatically. You are right that ClickOnce downloads full files only, but it downloads only the files that have changed since the previous version. If that is still a concern you can break your application into more, smaller assemblies. I think you can also use netmodules, but Visual Studio has no built-in support for that.
In general ClickOnce has worked well for us.
I am just in the process of implementing such a service on top of my distributed application platform. In essence I have developed a "push" model for corporates that follows these basic principles:
Software upgrades are "managed" from the server, NOT from the client, which is in line with the deployment of corporate software as opposed to user software (this is a very important point)
Software upgrades can be customised per client application on the server, i.e. the server can deploy unique configurations to every client if required
Software upgrades can be deployed to clients at different times, or all at the same time, or any combination of the two
The software upgrade version can be specified per client, i.e. different versions can be deployed to different clients as required
All software upgrades for all clients can be "managed" from a single server, i.e. the software upgrading "service" is consistent across any application, and all applications can utilise the software upgrading "service"
Clients can implement a software upgrade policy of automatic (the application restarts as soon as the upgrade has been downloaded and is available at the client), manual (the application needs to be sent a custom "force upgrade" message), or on restart (the application upgrades on shutdown if an upgrade has been downloaded and is available)
All auto-upgrading functionality is transparent to any running applications as this is all performed in autonomous background threads and all inter-process communication and file transfer is handled by my framework
In essence this now allows me (or will allow me when I have tidied a few things up and thoroughly tested the implementation) to manage the version of any application developed by me from a central server after it has been initially installed, without any client intervention.

What's best Drupal deployment strategy? [closed]

I am working on my first Drupal project, on XAMPP on my MacBook. It's a prototype and has received positive feedback from my client.
I am going to deploy the project on a Linux VPS in two weeks. Is there a better way than re-doing everything on the server from scratch?
install Drupal
download modules (CCK, Views, Date, Calendar)
create the Contents
...
Thanks
A couple of tips:
Use source control, NOT FTP etc., for the files. It doesn't matter what you use; we tend to spin up an Unfuddle.com Subversion account for each client so they have a place to log bugs as well, but the critical first step is getting the full source tree of your site into version control. When changes are made on the testing or staging server, you see if they work, you commit, then you update on the live server. Rollbacks and deployment get a lot, lot simpler. For clusters of multiple webheads you can repeat the process, or rsync from a single 'canonical' server.
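In practice the update cycle looks something like this (paths and revision number invented):

    # On the testing/staging server: commit once the change checks out
    svn commit -m "Fix for issue #123"

    # On the live server (a working copy of the same repository):
    svn update /var/www/mysite

    # Rolling back is just an update to an older revision (e.g. r41)
    svn update -r 41 /var/www/mysite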
If you use SVN, though, you can also use CVS checkouts of Drupal and other modules/themes and the SVN/CVS metadata will be able to live beside each other happily.
For bulky folders like the files directory, use a symlink in the 'proper' location to point to a server-side directory outside of the webroot. That lets your source control repo include all the code and a symlink, instead of all the code and all the files users have uploaded.
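For instance (paths invented), moving the uploads out and symlinking back:

    # Move user uploads outside the webroot / repository
    mv /var/www/mysite/sites/default/files /var/drupal-files/mysite

    # Leave a symlink in the 'proper' location for Drupal to use
    ln -s /var/drupal-files/mysite /var/www/mysite/sites/default/files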
Databases are trickier; cleaning up the dev/staging DB and pushing it to live is easiest for the initial rollout but there are a few wrinkles when doing incremental DB updates if users on the live site are also generating content.
I did a presentation on Drupal deployment best practices last year. Feel free to check the slides out.
Features.module is an extremely powerful tool for managing Drupal configuration changes.
Content Types, CCK settings, Views, Drupal Variables, Contexts, Imagecache presets, Menus, Taxonomies, and Permissions can all be rolled into a feature, which can be checked into version control. From there, deploying a new site, or pushing changes to an existing one, is easily managed with the Features UI or Drush.
Make sure you install Strongarm.module for exporting Drupal config that gets stored in your Variables table. You can also export static content/nodes (i.e. about us, FAQs, etc.) into Features by installing uuid_features.module.
Hands down, this is the best way to work with other developers on the same site, and to move your site from Development to Testing to Staging and Production.
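For a sense of the Drush side mentioned above, the day-to-day commands look roughly like this (the feature name is invented; the commands come with the Features module's Drush integration):

    # Re-export a feature module so it captures the site's current configuration
    drush features-update my_feature

    # On the target site, after the code is deployed: make the site's
    # configuration match what the feature module defines
    drush features-revert my_feature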
We've had an extensive discussion on this at my workplace, and the way we finally settled on was pushing code updates (including modules and themes) from development to staging to production. We're using Subversion for this, and it's working well so far.
What's particularly important is that you automate a process for pushing the database back from production, so that your developers can keep their copies of the database as close to production as possible. In a mission-critical environment, you want to be absolutely certain a module update isn't going to hose your database. The process we use is as follows:
1. Install a module on the development server.
2. Take note of whatever changes and updates were necessary. If there are any hitches, revert and do again until you have a solid, error-free process.
3. Test your changes! Repeat your testing process as a normal, logged-in user, and again as an anonymous user.
4. If the update process involved anything other than running update.php, then write a script to do it.
5. Copy the production database to your staging server, and perform the same steps immediately. If it fails, diagnose the failure and return to step 1. Otherwise, continue.
6. Test your changes!
7. BACK UP YOUR PRODUCTION DATABASE and TAKE NOTE OF THE REVISION YOU HAVE CHECKED OUT FROM SVN.
8. Put your production Drupal in maintenance mode, run "svn update" on your production tree, and go through your update process.
9. Take Drupal out of maintenance mode and test everything (as admin, regular user, and anonymous).
And that's it. One thing you can never really expect for a community framework such as Drupal is to be able to move your database from testing to production after you go live. From then on, all database moves are from production to testing, which complicates the deployment process somewhat. Be careful! :)
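A bare-bones script for steps 7-9 might look like this (the paths, database names, and the use of Drush are my assumptions, not part of the process above; site_offline is the Drupal 6 variable name, Drupal 7 calls it maintenance_mode):

    #!/bin/sh
    set -e
    SITE=/var/www/mysite   # Drupal root (made-up path)

    # 7. Back up the production database and record the deployed SVN revision
    mysqldump -u drupal -p drupal_db > /backups/pre_update_$(date +%F).sql
    svn info "$SITE" | grep '^Revision' >> /backups/deploy_log.txt

    # 8. Maintenance mode on, pull the new code, run database updates
    drush -r "$SITE" vset site_offline 1
    svn update "$SITE"
    drush -r "$SITE" updatedb -y

    # 9. Back online; now test as admin, a regular user, and anonymous
    drush -r "$SITE" vset site_offline 0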
We use the Features module extensively to capture features and then install them easily at the production site.
I'm surprised that no one mentioned the Deployment module. Here is an excerpt from its project page:
... designed to allow users to easily stage content from one Drupal site to another. Deploy automatically manages dependencies between entities (like node references). It is designed to have a rich API which can be easily extended to be used in a variety of content staging situations.
I don't work with Drupal, but I do work with Joomla a lot. I deploy by archiving all the files in the web root (tar and gzip in my case, but you could use zip) and then uploading and expanding that archive on the production server. I then take a SQL dump (mysqldump -u user -h host -p databasename > dump.sql), upload that, and use the reverse command to insert the data (mysql -u produser -h prodDBserver -p prodDatabase < dump.sql). If you don't have shell access you can upload the files one at a time and write a PHP script to import dump.sql.
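Put together, the round trip looks like this, using the same placeholder names as the commands above:

    # On the development machine: archive the webroot and dump the database
    tar -czf site.tar.gz -C /var/www/mysite .
    mysqldump -u user -h host -p databasename > dump.sql

    # On the production server: expand the archive and load the data
    tar -xzf site.tar.gz -C /var/www/prodsite
    mysql -u produser -h prodDBserver -p prodDatabase < dump.sql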
Any version control system (GIT, SVN) + Features module to deploy Drupal code + custom settings (content types, custom fields, module dependencies, views etc.).
As the Deploy module is still in development, you may want to use the Node export module in Drupal 7 to deploy your content/nodes.
If you're new to deployment (and/or Drupal), be sure to do everything in one lump.
You have to be quite careful once there are users affecting content while you are working on another copy.
It is possible to leave the tables that relate to actual content, taxonomy, users, etc. alone and push only the ones relating to configuration. However, this adds an order of magnitude of complexity.
Apologies if deployment is old hat to you and this comes across as vaguely insulting.
A good strategy that I have found and am currently implementing is to use a combination of the deploy module to migrate my content, and then drush along with dbscripts to merge and update the core and modules. It takes care of database merging even if you have live content, security and module updates, and I currently have mine set up to work with svn.

Best version control for a one man web app?

I'm just learning how to do things, and want to start using some sort of version control for a web app.
What's most appropriate for deploying a python or php web app on my own? I'm using linux and have a linux server.
Thanks!
SVN, but you need to be able to easily deploy your webapp with SVN.
Since that is not always a simple task, I'll just point out this article, which may be of interest for your project.
General principle:
Configure Apache on your development server so that it picks up your checked out working copies as separate subdomains. Using this, you can simply make a checkout of your project and it will automagically be up and running. No need to touch the Apache configuration. You need a DNS wildcard entry so that all subdomains of dev.example.org go to your development server.
The only problem with using the above Apache configuration locally is the DNS wildcard. Unless your desktop is assigned a hostname by your network's DNS server and you can set the wildcard there, you will have to make do with your localhost address. You can install dnsmasq to act as a local caching DNS server and put the wildcard on your own machine.
Use dnsmasq so you can achieve the same effect on your own development machine. That way you can develop your web applications locally and you won't need a central development server. In my examples I will be assuming you use subversion for your version control, but it works virtually the same with other version control packages, such as git or bazaar.
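As a sketch, the dnsmasq side is a single configuration line (using the article's dev.example.org example):

    # /etc/dnsmasq.conf: resolve dev.example.org and all its subdomains
    # to this machine
    address=/dev.example.org/127.0.0.1

Restart dnsmasq afterwards (e.g. sudo /etc/init.d/dnsmasq restart, depending on your init system), and every *.dev.example.org name will resolve to your own machine.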
Note: (Humor)
This other question on Subversion allowed me to point to this article about publishing its (source-controlled) data into production, containing probably the ugliest diagram I have ever seen on the topic ;-)
If I had not bumped into git, I would've doubtless gone with SVN. Having said that, I would recommend git.
Nowadays, I would certainly go with a distributed version control system. Setup is faster since you don't need to set up a version control server and everything, all you usually need to do is initialize a certain directory within your development box for version control and you're good to go. They also seem like the way to go these days. If it were 2001, I would recommend a centralized system like Subversion. But it's 2008, everyone is moving to distributed systems and user interfaces and supporting tools tend to get better.
Here are some suggestions for you:
Darcs: Easy to learn and has all the features you will usually need
Mercurial
Git: Powerful. May take some time to understand but evolves rapidly
All three of them should be readily available in your Linux-based OS through the usual package management solutions.
SVN is great.
Nowadays the hype is around DVCS.
I prefer Bazaar.
Because of its name, the support, the feature set, and because it works well on my window$ machine too.
I'm using unfuddle.com and I love it. It's free for a one-person web app.
The answer really depends on your way of thinking. I personally had problems switching to Subversion from SourceSafe. If you come from a Microsoft shop, I'd suggest using SourceGear Vault; it is free for up to 2 users. If you come from a non-Microsoft background, then Subversion would be preferable. Also, consider git if you're working on Linux.
HTH, Valve.
Personally I use Monotone; learning a DVCS is definitely the way forward.
For a one-man job, pretty much any revision control system will do the job. It's when you get into multiple people, and past that into multiple repositories, where there start to be differences.
Given that, I'd go with whatever Free Software system your development environment supports best. I see Subversion and Git mentioned and both are fine choices.
SVN would be my first choice. If I had to make a second choice, I would go with CVS.
One of the most popular options out there today is Subversion. It's generally easy to set up and configure and is able to handle multiple platforms.
SVN. If you do not need concurrent access (which is your case), it is VERY easy to set up, as no server is required at all. Definitely your weapon of choice.
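A sketch of that server-less setup (paths made up; file:// URLs must be absolute):

    # Create a repository in your home directory; no svnserve or Apache needed
    svnadmin create /home/me/repos/mywebapp

    # Import the project, then check out a working copy over file://
    svn import /home/me/dev/mywebapp file:///home/me/repos/mywebapp/trunk -m "Initial import"
    svn checkout file:///home/me/repos/mywebapp/trunk /home/me/work/mywebapp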
I wholeheartedly agree with SVN. Command-line SVN is quite easy too.
While I like svn a lot, I've found mercurial handy for having the whole repository locally. (the same goes for git, but its interface is a little less polished in my opinion.)
I'm not able to answer the question as asked, because I don't develop on a Linux server.
But maybe this experience has a counterpart in Linux world.
I use a local-on-my-LAN-only IIS server (actually on an old laptop that no longer travels but works as a little server). I have VSS installed on that server too. There is an integration between the IIS Server, the FrontPage extensions on that server, and the VSS.
The upshot is that I can use FrontPage to build and edit my site and build a development image that is always backed up in VSS, and I can check out, check in, and do all of that from within FrontPage.
Now, the way I publish is to take advantage of the sharing capability of VSS: I have a deployment image that shares with the project that is actually an IIS web site. I have a deployment-image directory that I can transfer the latest checked-in material to (material that has not changed is not updated). I then deploy the deployment image to the hosted, public web site using FTP (again, only transferring new and updated files).
I present all of these details to suggest what might be the use-case of interest, even though a different solution approach is needed with Linux.
If I wasn't using a tool that integrated with the web server and also the source control at the server, I could do something similar by checking the VSS material in and out of a local directory and then pushing the updated VSS project to the IIS server web-pages directory hierarchy. The workflow is a little more clumsy. In this case, I would not edit pages directly on the development web server unless I could lock check-in pages as read-only or something.
Does this suggest anything that might be appealing in the Linux server case?
Mercurial is definitely a good choice: quick, easy to use, perfect for working alone or with multiple other developers, fully cross-platform, and plugin-based. It handles merges, branches, etc. very simply, and there are great tools out there such as nice IDE plugins (notably for NetBeans and Eclipse).
Robust, it works just as you expect such a tool to work, unlike SVN (and I have years of day-to-day SVN use behind me)...
Sun, Xen and Mozilla all host their repos on Mercurial. We're currently moving from SVN to Mercurial after a 6-month daily test, without any regret.
I once used Perforce and was impressed with it. There are GUI and command-line versions, and it supports Windows, Linux, Mac and Unix for both the server and client. It integrates with Eclipse and has APIs for writing your own client applications (C/C++, Ruby, Perl, Python). It only supports two users and five workspaces before you need to buy licenses, though (but that is within the scope of this question).
Subversion is a good choice. For the client, there's TortoiseSVN (http://tortoisesvn.tigris.org/), which integrates with the shell and lets you do things with a right click on a folder. For integration with Visual Studio (I'll assume that's your environment) there are VisualSVN (http://www.visualsvn.com/) and AnkhSVN (http://ankhsvn.open.collab.net/). For the server there's a one-click installer you can find here (http://svn1clicksetup.tigris.org/) that does the setup in a snap. VisualSVN also has a (free) server that you can use, which provides its own web access and security (rather than using Apache) and has an MMC snap-in for managing/creating repositories and users.
CVS - No, I'm not joking. Not that it is better (it is not) or the simplest (it isn't), but it really doesn't matter at the end of the day. The important thing is to get started with ANY version control system even if it is a one-developer shop, even if it is CVS.