Performing a partial export from Nexus - deployment

I haven't worked much with Nexus before, so I'm still trying to work out a product lifecycle that works for us.
I want to be able to export certain sets of repository artifacts from nexus into another nexus container. It looks like the only way to do this, as of now, is to pull artifacts as a set of dependency builds and then deploy them to the new repository. This may be what we have to go with; I was just looking for a better approach.
It looks like mirroring or proxying won't give us the fine-grained control over the export that I need.
I see that I can just copy artifacts out of nexus, but I'm not sure how to tell the new nexus container that it is supposed to manage those files.
What I want to do is be able to put a set of artifacts onto a DVD that can be run as a localized nexus instance for the purpose of installing software at a customer site. Any customer that will allow a connection back to us for software installs can be treated with the same install setup we use for QA. The reason to use a nexus deploy instead of an installer is that we need to be able to roll back a "patch install", as each patch/install set would be maintained as a release version. Right now this is all done in custom code, since there doesn't seem to be an installer that handles rollbacks (with backups) after the install has completed.

This would be a non-standard use case for Nexus, and it strikes me that Nexus would be overkill for your requirements. Any web server can act as a Maven repository once the files are available in the correct format.
For example, why not burn the desired subset of repository files onto a DVD and include a copy of Jetty, which can be launched from the DVD and serve up the local content over HTTP?
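As a sketch (assuming a jetty-runner.jar bundled on the DVD next to a "repository" directory of exported artifacts; exact flags vary by Jetty version):

    # Serve the exported repository files over HTTP on port 8081:
    java -jar jetty-runner.jar --port 8081 ./repository

A Maven client on the customer machine would then point a mirror or repository entry at http://localhost:8081/.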

Related

Continuous delivery with capistrano/chef/puppet: where do you store your artifacts?

I've been reading up on how people do continuous delivery with some of the popular toolsets.
Lots of posts (like this one) seem to indicate that a common way of doing things is to use something like capistrano to push software from your builds to your machines, and then chef or puppet to configure anything related to it.
My question is, do people generally push their software directly into a special git repo for binary assets, or can capistrano fetch it out of a maven repo? The maven approach seems most natural to me, but I don't seem to be able to find much information on it, which is what makes me think it's not the approach people are generally taking.
Basically, I'm slightly confused, as there seems to be a gap between the build output (where one would normally publish to a maven repo) and where the delivery tools expect to find the software you have asked them to deploy (which seems to be a file system or a git repo).
When it comes to artifacts, I leverage the Jenkins plugin to upload to S3. Here's a link to it.
Basically, right now all my CI goes through Jenkins, and when I get a complete build I upload it to a bucket and have chef pull the tarball/war/gem from it and install it from there.
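On the node side, what chef ends up doing is roughly the following (a sketch; the bucket name, artifact name, and install path are all made up):

    # Pull the build artifact down from the S3 bucket and unpack it.
    curl -fsSo /tmp/myapp.tar.gz \
        https://my-artifacts.s3.amazonaws.com/myapp-build-42.tar.gz
    mkdir -p /opt/myapp
    tar -xzf /tmp/myapp.tar.gz -C /opt/myapp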

Output binary files linked to my version-control server without a build system?

I am trying to set up an internal Mercurial HgWeb server on a Windows 2003 server. The HgWeb part is working. I could just share a folder to put released binary files in for each project. But I am wondering whether I could still somehow link the version control system with the binary build output, so that when there is a commit, the build output gets updated automatically as well for a release.
I know I could have a build system on the server end. But for Delphi, C#, and ASP.NET projects with a few third-party libraries, that seems like much more work.
Right now, I am thinking that for each project I will have two repositories: one for development (no binary output), the other for release, which will include everything, including the build result binaries (or would only the build result plus dependencies be a better idea?). But I don't know yet how to make those two synchronize automatically without committing twice manually.
Maybe simply a hook on the Dev repository that fires on every commit to the master branch and makes a corresponding commit to the Release branch?
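Something like this in the Dev repository's hgrc, perhaps (just a sketch; the script name and path are hypothetical, and the script itself would have to do the build and the commit to Release):

    # .hg/hgrc in the Dev repository. 'changegroup' fires once per
    # incoming push ('commit' would fire per local commit); the script
    # checks the branch, builds, and commits the binaries to Release.
    [hooks]
    changegroup = c:\scripts\build-and-commit-release.bat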
You really need a build system like CruiseControl.NET to build your binaries after pushes happen to a remote repository that CC.NET is watching. The binaries built can then just be copied to a standard Web server to be served up for download. CC.NET is not complicated to configure and supports Mercurial out-of-the-box. Using a system like this, you can get the extras like build stats, run unit tests before pushing a build to be downloaded, and lots more.

is there a deploy tool (or set of tools) that supports rollback of a deployment?

I'm learning FluentMigrator. The thing that I like about FM is that it supports the idea of Forward and Back for migrations (aka Up/Down). I'm finding that it's not ideal about this; there are some holes. Still, it's good.
This leads me to wonder if there are any deployment tools (nant, msbuild or other) that support this idea of rolling forward and back. The scenario that I'm using it in is the deployment of a web app with a related database.
Ideally I'd like to set up my deployment so that, should any part of it fail, it will revert to the previous known working configuration. With FM, this is pretty easy to do (but there are rough spots), so that covers the db. How about the files that make up the web app? Do any deploy tools have support for this?
Deploying to a Windows Server. Assume that I can't make any changes to the server.
I don't know of any Microsoft-centric, automated provisioning/deployment tools like Capistrano. Here are some tools I've heard of, but never used:
MSDeploy, for deploying web applications.
Microsoft Deployment Services, for managing operating system configuration
Microsoft's System Center Configuration Manager
BladeLogic
HP's Operations Center
Up until about three months ago, we did our deployment/provisioning using custom MSBuild scripts. After a server is provisioned, deploys happen automatically using Robocopy to copy files to a share on the application server, updating changed application binaries and markup files. We've never had a need to rollback any of our deployments, but since our scripts are custom, we could write the logic if we needed to.
MSBuild is a terrible deployment/provisioning language. For the past three months, we've been writing all new scripts in, and porting existing ones to, PowerShell. It is wonderful. With version 2, there is support for running commands on remote servers, like SSH. We haven't used that functionality yet, but I'm looking forward to pushing setup scripts to remote servers to provision and deploy at the same time.
We have been using Git to do our deploys for the last 6 months.
Here is the whole process:
CI server builds the project
CI server checks it into a local git repository
CI server pushes the changes to the centralised git repository
User creates an empty repository on the live server
User adds the central git repository to the remotes
User pulls the latest version over https (no need to open any ports)
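On the live server, steps 4-6 come down to a few commands (a sketch; the central repository URL is made up):

    # One-time setup on the live server:
    git init
    git remote add origin https://git.example.com/central/project.git
    # Every subsequent deploy is just:
    git pull origin master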
It is a lot to set up in the beginning, but once set up it works great. Deploys take seconds, as only changed files get copied.
Another great thing about this method is that git keeps a history of changes, so rolling back is pretty simple. You can also roll back a few revisions, and it's done straight on the live server. If something goes wrong, reverting is super fast.
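For example, one way to revert on the live server:

    # Step back one revision:
    git reset --hard HEAD~1
    # or jump to any known-good commit or tag:
    git reset --hard <known-good-revision>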
Also you can save some time if you use a hosted git service (github) for your central repository.
This is a very brief description but I can give you more info if you want.
Of course! My favorite is Capistrano. This was originally built for Ruby but I've found that it works just as well for other languages.
https://github.com/capistrano/capistrano
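Capistrano keeps each release in its own timestamped directory on the server and points a "current" symlink at the active one, so rollback is a built-in task:

    # Run from the project directory once config/deploy.rb is set up:
    cap deploy             # push out a new release
    cap deploy:rollback    # repoint "current" at the previous release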

straightforward single developer deployment with mercurial and netbeans?

I am coding a website using the Codeigniter PHP framework.
I am using mercurial for version control.
I have 3 systems I work with. I do my coding on a Windows 7 machine using Netbeans 6.9.1. I am occasionally making commits, and pushing to a repository at Bitbucket.org, purely for the purposes of backup and version control.
I have a "beta" website (on a shared Linux box with it's own dedicated IP address) that I upload to using FTP, where I can test that everything is working as intended on an actual site running Linux.
Once I'm happy with that, I upload to my "live" site, which is on its own dedicated server. Again, I'm just using FTP to upload the files from my development machine.
I realize that this is all kinds of wrong. For one thing, I have to go in and change some things on the beta and live machines so that they're referring to the correct domain name instead of localhost. For another, I'm not making use of mercurial at all to help with this. I assume instead of uploading via FTP, I could be using mercurial to "grab" a particular revision that I've marked as ready to deploy. I also think I could possibly be doing something in Netbeans differently to make the process easier.
What I want is some very smooth way to control all this, ideally one that knows how to deal with the slightly different configuration setup of the beta and live sites versus localhost.
Is there a standard way to do what I'm looking for? I've seen references to some third party apps for "continuous integration" but I'm not sure I need anything like that.
I'm a little lost as to what would be the SIMPLEST thing for me to do that would make my life easier... any help greatly appreciated :) Thanks!
It depends on how different the setup for each site is, and whether there are secrets involved that should not be visible in a public place (I assume you use a public bitbucket repository).
If the changes are not sensitive, then you can add two additional branches for your test and production servers, where only the configuration changes are applied. Every time you change something in default and deploy it to test, you simply merge default on top of test, and mercurial fills in the different configuration settings in the process. The server deployment is then a call to hg archive within the correct branch.
A typical change history would look like this:
o----o----o----o----o----o        default
 \         \              \
  T1--------T2-------------T3     test
    \                        \
     P1----------------------P2   production
where in T1 and P1 the parameters for test and production are filled in. You can also use this branch setup to mature the development of your site: hack in default, and only propagate stable changes into test and production.
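A deploy to test then looks roughly like this (branch names as in the diagram; the export path is made up):

    # Carry the latest development changes over to the test branch:
    hg update test
    hg merge default
    hg commit -m "merge default into test"
    # Export a clean snapshot of the branch for the server:
    hg archive -r test /path/to/test-export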
If the changes are sensitive, you can create an unversioned deploy script (or better, a versioned deploy script plus an unversioned configuration file) that patches the output of hg archive.
You should use deployment scripts anyway; they handle the packaging of the product and deploy it to the target in an automated and standardized way. Within such a script you can also embed information about the source revision into the final archive.
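As a sketch, such a script might do no more than this (the config file name, paths, and host are made up):

    # Export the production branch, patch in the unversioned secrets,
    # and push the result to the live server.
    hg archive -r production /tmp/site-export
    cp /secure/database.php /tmp/site-export/application/config/
    rsync -a /tmp/site-export/ user@live.example.com:/var/www/site/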
Note that this model works fine for an environment where no changes are made on the server. If you do change the product on the server, you need to copy the files from the server back into your development environment (at the correct revision) to check what was changed there. If you want to make changes on the server too, you might want to install mercurial there as well.

Eclipse / Aptana File Sync Solutions

Our development team uses Eclipse + Aptana to do their web development work. Currently, most of them are mapping their Eclipse projects directly to the web server. I'd rather have them create a local project and use that to sync to the web server project directory they are working on.
The issue is that there aren't any good solutions for this, which is just appalling given the popularity of the two.
The FileSync plugin for Eclipse is only one-way. Meaning that if one developer changes a file on the server, the other devs aren't even notified and could overwrite the change.
The File Transfer option in Aptana 2.0 doesn't support any sort of Sync, just manually uploading/downloading files.
The Sync option in Aptana 1.5.1 doesn't allow you to merge files when they are different. You can only update one or the other. It does, however, allow you to view a diff (but only if you right-click and select it), and in that diff you can't make any changes.
I did find a way to get files uploaded to their Sync repositories in Aptana using Eclipse Monkey. However, it doesn't work if a user saves multiple files at once ('Save All'). Additionally, there is no notification if a user opens a local file that has an updated copy on the server. I tried to add one using Eclipse Monkey, but I couldn't find any sort of listener in the Eclipse API to do it, and Eclipse Monkey documentation is few and far between.
My only solution at this point is just to let them continue to map directly to the server, or to ask them to do a manual download before they do any work (but again, what if someone uploads a change right after they do that?).
Anyone have any ideas?
April 2010
Add EGit to your Eclipse+Aptana setup, and:
let developers push their developments to a local bare repo (see also this post)
let your local project be updated by a git pull from that same local bare repo, creating/updating a local working directory with sources merged/updated (or by using a post-update hook as described in my previous SO link)
let your local Aptana+Eclipse(+EGit) reference that local working directory, which is also used by your web server.
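The server-side plumbing for that setup is only a few commands (paths are illustrative):

    # Shared bare repository that developers push to:
    git init --bare /srv/git/site.git
    # Working directory that the web server serves from:
    git clone /srv/git/site.git /var/www/site
    # A post-update hook in /srv/git/site.git/hooks can then run
    # "cd /var/www/site && git pull" after each push.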
In short, when you are speaking of file synchronization plus merges, this is a job for a (D)VCS (a Distributed or Centralized Version Control System).
Oct 2011: as xmedeko mentions in the comments, Aptana 3 has its own Git plugin, and it isn't very compatible with EGit: see bug 1988.
Adding to VonC's answer (which is correct, IMHO), what probably lies beneath this scenario is that the process you have adopted is not correct in itself, apart from the tools used.
If I understood well, you should not allow nor perform a direct upload from a development version of the project to the web server. Merging is not a job for remote synchronization tools; it should happen well before the deployment phase (uploading to the web server is practically a deploy).
You should have a dedicated repository taken from some point in the development history (according to your release timeline), a point where merging has already happened. Then deploy it (by means of file synchronization if you want, but that is not mandatory) onto a local/staging web server.
Run there whatever tests you perform against the actively running web site (i.e., integration and/or functional tests). If there is any bug fixing, there are different ways to apply the fixes to the development and staging code repositories. Only after that do you deploy the staging repository to the production web server (again, synchronization tools are one way to do that).