Automatic branch compilation checking in Eclipse

I have a task to check that the code in one of our branches, let's call it "stable", compiles. Our team mainly works in the "HEAD" branch and occasionally merges changes into the "stable" branch. I've made a local copy of our main workspace, named it "main-workspace-stable", and replaced all of the code with the code from the "stable" branch. Now, at the end of every work day, I open that workspace, update the code to the latest revision, and wait for the compilation result.
This is pretty boring. Is there any way to automate this task?

You might want to look at one of the Continuous Integration tools, for example CruiseControl or Hudson.

As mentioned by ChssPly76, Continuous Integration is the way to implement what you want to do.
Regarding Hudson, the hudson-eclipse plugin lets you monitor a Hudson health icon displayed at the bottom of the Eclipse window: it turns red on build failure and green on success.
You can couple that with automatic publication of your project as a Maven snapshot artifact (also triggered by Hudson),
and have Maven compile your Eclipse project in that dedicated Continuous Integration environment.
The idea is not to rely on a pure Eclipse solution, because Eclipse cannot be up and running in every environment, especially those where you do not actively develop but only compile and run tests.
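If a full CI server is not an option right away, even a small script run nightly from cron or the Windows Task Scheduler can replace the manual routine. Below is a minimal sketch; the workspace path, the SVN update command, and the Maven build command are assumptions to substitute with whatever your SCM and build tool actually use:

    import datetime
    import subprocess
    import sys

    # Nightly "does the stable branch still compile?" check (sketch only).
    # WORKSPACE, UPDATE_CMD and BUILD_CMD are placeholders.
    WORKSPACE = "/path/to/main-workspace-stable"
    UPDATE_CMD = ["svn", "update"]                  # e.g. ["cvs", "update", "-r", "stable"]
    BUILD_CMD = ["mvn", "-B", "clean", "compile"]   # any headless build command works

    def run(cmd):
        result = subprocess.run(cmd, cwd=WORKSPACE, capture_output=True, text=True)
        return result.returncode, result.stdout + result.stderr

    for cmd in (UPDATE_CMD, BUILD_CMD):
        code, output = run(cmd)
        if code != 0:
            print("%s FAILED: %s\n%s" % (datetime.datetime.now(), " ".join(cmd), output))
            sys.exit(1)
    print("%s stable branch compiled cleanly" % datetime.datetime.now())

A CI server does essentially the same thing, plus scheduling, history, and notifications, which is why the answers above point in that direction.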

Jenkins continuous delivery with shared workspace

Background:
We have one Jenkins job (Production) to build a deliverable every night. We have another job (ProductionPush) that pushes out the deliverable over a proprietary protocol to production machines the next day. This is because some production machines are only available during certain hours of the day (it also gives us a chance to fix any last-minute build breaks). ProductionPush needs access to the deliverable built by the Production job (so it needs access to the same workspace). We have multiple nodes and concurrent builds (and thus unpredictable workspaces) and prefer not to tie the jobs to a fixed node/workspace since resources are somewhat limited.
Questions:
How can we make sure both jobs share the same workspace, and ensure that ProductionPush runs at a fixed time the next day only if Production succeeds, without pinning both jobs to the same node/workspace? I know the Parameterized Trigger Plugin might help with some of this, but it does not seem to have a time-delay capability, and 12 hours seems too long for a quiet period.
Is sharing the workspace a bad idea?
Answer 2: Yes, sharing a workspace is a bad idea. There is the possibility of file locks, and there is the issue of the workspace being wiped out. Just don't do it...
Answer 1: What you need is to archive the artifacts of the build. This way, the artifacts for a particular build (identified by build number) will always be available, regardless of whether another build is running or what state the workspaces are in.
To Archive the artifacts
In your build job, under Post-build Actions, select Archive the artifacts
Specify what artifacts to archive (you can use a combination of the patterns below)
a) You can archive all: *.*
b) You can archive a particular file with wildcards: /path/to/file_version*.zip
c) You can ignore the intermediate directories like: **/file_version*.zip
To avoid storage problems with many artifacts, at the top of the configuration you can select Discard Old Builds, click the Advanced button, and play around with Days to keep artifacts and Max # of builds to keep with artifacts. Note that these two settings do not control how long the actual builds are kept (other settings control that)
To access artifacts from Jenkins
In the build history, select any previous build you want.
In addition to SCM changes and revisions data, you will now have a Build Artifacts link, under which you will find all the artifacts for that particular build.
You can also access them with Jenkins' permalinks, for example
http://JENKINS_URL/job/JOB_NAME/lastSuccessfulBuild/artifact/ followed by the name of the artifact.
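For example, a downstream script could pull a deliverable straight from that permalink. This is only a sketch: the job name, the artifact file name, and the lack of authentication are assumptions (a secured Jenkins will need a user and API token):

    import urllib.request

    # Assumed values -- replace with your Jenkins URL, job name and artifact path.
    JENKINS_JOB_URL = "http://JENKINS_URL/job/JOB_NAME"
    ARTIFACT = "deliverable.zip"

    url = "%s/lastSuccessfulBuild/artifact/%s" % (JENKINS_JOB_URL, ARTIFACT)
    with urllib.request.urlopen(url) as response, open(ARTIFACT, "wb") as out:
        out.write(response.read())  # saves the artifact of the last successful build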
To access artifacts from another job
I've extensively explained how to access previous artifacts from another deploy job (in your example, ProductionPush) over here:
How to promote a specific build number from another job in Jenkins?
If your requirement is to always deploy the latest build to Production, you can skip the configuration of promotion in the above link. Just follow the steps for configuring the deploy job. Once you have your deploy job, if it always runs at the same time, just configure its Build periodically parameters. Alternatively, you can have yet another job that triggers the deploy job based on whatever conditions you want.
In either case, if your Default Selector is set to Latest successful build (as explained in the link above), the latest build will be pushed to Production.
Not sure archiving artifacts is really a good idea. A staging repository might be better as it enables cross-functional teams to share artifacts across different builds when required by tweaking the Maven settings.xml file.
You really want a deployable (EAR/WAR) as the thing that gets built and tested, then promoted to production once confidence in the build is high.
Use a build number on your deployable (major.minor.buildnumber). This is the thing you promote to production, provided your tests can be relied upon. Don't use a hyphen to separate the minor number from the build number, as that forces Maven to perform a lexical comparison; a decimal point forces a numeric comparison, which will give you far fewer headaches.
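To see why lexical ordering of build numbers hurts, compare the two orderings directly. This is plain Python, independent of Maven's actual comparator, purely to illustrate the point:

    # Build numbers compared as strings sort in the wrong order once they have
    # different lengths; compared as integers they sort as expected.
    builds = ["9", "10", "187", "1000"]
    print(sorted(builds))            # ['10', '1000', '187', '9']   lexical order
    print(sorted(builds, key=int))   # ['9', '10', '187', '1000']   numeric order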
Also, you didn't mention your target platform, but using the Maven APT/RPM plugin to push an APT/RPM package to an APT/YUM repository that is available to the production box (after successful testing!) would be a good fit, in line with industry standards.

${CHANGES} does not work in the email-ext plugin if the Jenkins job is driven by a bash script

I have set up a Jenkins job to build a project. I'm using the email-ext plugin to send out build notifications, with the intent of showing who did what and the paths to the changed files. But unfortunately I'm not getting anything. I believe the reason is that under "Source Code Management" I'm setting it to "None". The shell script that I'm using to drive the build is responsible for checking out a copy of the code based on a CVS tag and running Maven to do the build. In email-ext I'm using the following syntax:
${CHANGES_SINCE_LAST_SUCCESS, reverse=true, showPaths=true,
format="\n====\nChanges for Build # %n\n%c\n",
changesFormat="\n[%r] %d %a %m %p\n"}
Same thing with CHANGES: ${CHANGES, showPaths=true}
Is there a way of getting CHANGES and CHANGES_SINCE_LAST_SUCCESS to work if the None option is used under Source Code Management?
Thanks for your help folks.
The email-ext plugin gets that info from Jenkins. Since Jenkins has access to that info only via its SCM plugins, the answer is no: you can't do it without specifying the SCM option.
There are two things you can do:
(1) Do it by hand, which with CVS, if I remember correctly, means having a working copy checked out anyway (see the sketch after this list).
(2) Use the SCM checkout/update option, but store the working copy on the side without using it in the build. You'll use twice as much disk space, but nowadays disk space is not a problem.
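For option (1), one rough way to do it by hand is to have the build script collect the change list itself from CVS and include it in the mail body. A minimal sketch, assuming a checked-out working copy and a simple "everything from the last day" window (both are placeholders to adjust):

    import datetime
    import subprocess

    # Ask CVS for log entries newer than a cut-off date and print them, so the
    # build script can feed them into the notification mail (sketch only).
    WORKING_COPY = "/path/to/checked-out/module"
    since = (datetime.datetime.now() - datetime.timedelta(days=1)).strftime("%Y-%m-%d")

    log = subprocess.run(
        ["cvs", "-q", "log", "-N", "-d", ">=%s" % since],
        cwd=WORKING_COPY, capture_output=True, text=True).stdout
    print(log)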
By the way, why are you using CVS? SVN, Git, and Mercurial are all free.

Output binary files linked to my version-control server without a build system?

I am trying to set up an internal Mercurial hgweb server on a Windows 2003 server. The hgweb part is working. I could just share a folder to put released binary files in for each project. But I am wondering whether I could somehow link the version control system with the binary build output, so that when there is a commit, the build output gets updated automatically for a release.
I know I could have a build system on the server end. But for Delphi, C#, and ASP.NET projects with a few third-party libraries, that seems like much more work.
Right now, I am thinking that for each project I will have two repositories: one for development (no output binaries), the other for release, which will include everything, including the build result binaries (or would only the build result plus dependencies be a better idea?). But I don't know yet how to make those two synchronize automatically without committing manually twice.
Maybe simply a hook on the Dev repository that fires on every commit to the master branch and makes another commit to the Release repository?
You really need a build system like CruiseControl.NET to build your binaries after pushes happen to a remote repository that CC.NET is watching. The binaries built can then just be copied to a standard web server to be served up for download. CC.NET is not complicated to configure and supports Mercurial out of the box. Using a system like this, you get extras like build stats, the ability to run unit tests before a build is published for download, and lots more.
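That said, if you want to experiment with the hook idea from the question, it could be wired up as an external changegroup hook: Mercurial exposes the first incoming changeset in the HG_NODE environment variable, which the script can inspect. The repository paths, branch name, and build command below are illustrative assumptions, not a tested setup:

    # release_hook.py -- rough sketch of the hook idea from the question.
    # Enable it in the Dev repository's .hg/hgrc:
    #   [hooks]
    #   changegroup.release = python C:\hooks\release_hook.py
    import os
    import subprocess
    import sys

    DEV_REPO = r"C:\repos\myproject-dev"                # assumption
    RELEASE_REPO = r"C:\repos\myproject-release"        # assumption
    BUILD_CMD = [r"C:\repos\myproject-dev\build.cmd"]   # whatever produces the binaries

    def main():
        node = os.environ.get("HG_NODE")  # first changeset of the incoming group
        if not node:
            return 0
        branch = subprocess.check_output(
            ["hg", "log", "-r", node, "--template", "{branch}"],
            cwd=DEV_REPO).decode().strip()
        if branch != "default":
            return 0  # only react to pushes on the main (default) branch
        subprocess.check_call(BUILD_CMD, cwd=DEV_REPO)
        # Copy the build output into RELEASE_REPO here (e.g. with shutil), then:
        subprocess.check_call(["hg", "addremove"], cwd=RELEASE_REPO)
        subprocess.check_call(["hg", "commit", "-m", "automated build for " + node],
                              cwd=RELEASE_REPO)
        return 0

    if __name__ == "__main__":
        sys.exit(main())

A CI server such as CC.NET gives you the same trigger plus build history, test runs, and notifications, which is why it is usually the better fit.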

TeamCity Projects and Multiple SVN Branches

In the spirit of keeping my SVN trunk clean and ready for deployment, I've been using the following source control model. For the impatient, the basic concept is that you create development branches to do the actual development and leave the trunk clean and ready for deployment at any time (no junk in the trunk).
In addition to this, I am configuring TeamCity for continuous integration. Within TeamCity, I'd like to ensure that all development branches, as well as the deployment-ready branch (the trunk, in my case) build correctly and pass all unit tests.
This might be a stupid question, but not being overly familiar with TeamCity: should I create a new TeamCity project for each branch? The deployment-ready branch, in particular, has a few additional rules compared to the development branches. For example, releases should be saved in versioned directories on the file system (e.g., C:\Projects\MyProject\1.0.187..., C:\Projects\MyProject\1.0.188...) to enable easy access to the binaries at any point in time. On the other hand, saving versioned copies of the assemblies from the development branches is not necessary and would waste disk space.
Within TeamCity, I'd prefer to see only a single project for each software project. In other words, if my company is working on X number of development projects, I'd prefer to see that project listed only once, not X * 2 (assuming each project has only two branches).
You only need to create a single project, but you will need multiple build configurations: one for each branch. As far as I know, you can't customize the artifact folder name on disk (it's an auto-incrementing number), but you can download all artifacts as a ZIP file from the UI in TeamCity 4.5. TeamCity also includes a scheduler that lets you clean up artifacts so they don't consume too much disk space.
TeamCity 2018.1.5
TeamCity doesn't support multiple branches for SVN the way it does for Git, so I solved this with a configuration parameter that holds the active branch to build from; I can switch to another branch by running a custom build or by changing that configuration parameter.
After that, just configure the triggers to start building from the specific branch.
On the project side you can then see the different branches, and you can switch between them by running a Custom Build and changing the branch parameter there.

Controlled Integration of Changes with Continuous Integration

I have an NSIS installer that we previously built using NAnt scripts that copy some files around and run makensis.exe via an exec task to build the installer EXE. After the NAnt script completes, I have the complete structure for our CD and also our download.
I was just doing a get from SourceSafe onto an unused desktop, using it as a build box and compiling there. Sometimes we would have a couple of files checked in that fix something critical. In those cases I would go to the build box and very selectively get only those files, to avoid picking up other changed files that we aren't ready to release yet. Basically, I am able to let development continue while selectively including certain changed files in the installer for release.
Now we no longer have a free box and need to build on our server. So I am setting up CI Factory so that the developers can kick off the build without remoting into the server. The one issue I am struggling with is the best way to keep allowing this selective change control. The default CI concept that CI Factory implements is fine for the internal development "head". However, I also want to set up a CCNet project that is run only on demand, via a Force Build, for this "public release" type of build.
This is what I've brainstormed so far, without being sure how well this will work, if at all (I'm still figuring out what CCNet and CI Factory are all about). The "public release" CCNet project config/build would be set up so that it does not get latest, and modifications would not trigger a build. Since the other CCNet project (we'll call it the "CI project") uses the default CI methodology of getting latest when changes are detected, these two projects can't share the same working directory. So the "public release" project would need a different working copy, so that its files won't get updated when the CI project's build is triggered. The developer would need to remote into the server, open VSS, selectively do a get into the "public release" working copy, and then force a build through CI Factory.
The disadvantages I see with this are:
1) Having to remote in to selectively do gets.
2) I have no idea how to allow a single CI Factory project to have two different working copies of the Product folder, so that each project configuration block has its own.
3) I'm afraid of what kind of strangeness this might cause. I'm not quite sure yet how to specify a source control block in a CCNet project config block but prevent it from doing a get latest when it builds. I'm still gradually figuring out which things live in scripts and can easily be taken out without breaking other things, versus what is not meant to be mucked around with and/or is not configurable.
I would really like to hear how others deal with this issue of selectively releasing changes, if you have a similar situation. I am constrained to VSS, so my immediate need is to solve this with that in mind, but at the same time I'd be interested in hearing how you manage this with other source control systems. I guess you would probably have a branch that is your latest-development branch, and then merge changes into the trunk whenever you want to release them? I really don't trust VSS for branching/merging, and I think the branching concepts might be a little too much overhead and learning curve for this shop. Like I said, though, stories with other source control systems would be useful future knowledge for me.
Thanks in advance.
You need a branching structure in your repository to facilitate this, something like the release-branch method. Only select individuals can commit to this branch (or have a release/stable branch for that). Set up your manual CI launches to pull from the release branch; as you release nightly, promote builds to milestone or final from there. I don't like the idea of manually modifying things on your build machine. Set up the changes in version control, in a safe place to prepare your release, and let CI build from there, but manually triggered.
Check out these branching patterns. I suggest C3, codeline-per-release, often called release branching.
Here's an article on VSS branching that includes a link on merging.
This question looks similar.
Maybe you could move to another source control system with better support for this kind of thing. Any suggestions from MS people out there?