Project Structure In The SCM…

April 29, 2008

As a follow-on to the last post, Self-Monitoring Build System, I figured I’d discuss the way I manage my projects and their dependencies. I believe it’s important to have a self-contained project that keeps “time to login screen” as small as possible. I do this by packaging the source, libraries, tools, basically everything, under a single space within the SCM. This puts the control and responsibility for managing dependencies where I feel it should be: in the IDE so that, ultimately, it’s in the developer’s hands rather than in the SCM itself. Plus, it makes sure that the build server will always have what it needs to build the project without needing extra installs. If you keep the build server clean, you won’t be in a pinch should it take a dirt nap. (You ARE using VMs for your build servers, aren’t you?) Keep in mind that when I say “project”, I’m referring to a project in the sense of a product or system in the enterprise, which probably comprises at least one Visual Studio solution and a few ancillary things.
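To make “everything in one place” concrete, a fresh checkout of a single project root should be all a developer or a build server needs. The layout below is only an illustration; the folder and project names are placeholders, not a prescription:

MyProduct\
    default.build    (NAnt build script for the whole project)
    src\             (Visual Studio solution(s) and source)
    lib\             (third-party assemblies the solution references)
    tools\           (NAnt, NUnit, MbUnit, NCover, ILMerge, Simian, ...)
    docs\            (anything else the project needs)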

The Setup

In an effort to keep things practical and concrete, here’s how I usually set things up. To start with, I have a common project which contains a basic “template” build script and the tools that I like to use – NCover, NUnit, MbUnit, NAnt, ILMerge, Simian, etc. All subsequent projects are then branched from this common project. This lets you manage which version of the tools each project is running, makes it easier to move a project to the latest version, and limits the number of places where you have to check in that new version. Side note: there has been an interesting recent discussion on the altdotnet group regarding svn setup. However, this configuration assumes that you’re using a single repository because…well…that’s the only way I’ve worked thus far. If you’ve got an idea on how to make it work with multiples, I’d like to hear about it.
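For illustration, the “template” build script in the common project doesn’t need to be much more than the sketch below. Everything in it that isn’t from this post is an assumption: the solution name MyProduct.sln, the test assembly MyProduct.Tests.dll, and the exact folder names under tools\ are placeholders for your own layout, and the targets are just the usual clean/compile/test skeleton.

<?xml version="1.0" encoding="utf-8" ?>
<!-- Illustrative template only: solution, assembly, and tool folder names are placeholders. -->
<project name="Common" default="test">

  <!-- Everything the build needs is relative to the project root in the SCM. -->
  <property name="build.dir" value="build" />
  <property name="tools.dir" value="tools" />

  <target name="clean" description="Remove previous build output.">
    <delete dir="${build.dir}" if="${directory::exists(build.dir)}" />
  </target>

  <target name="compile" depends="clean" description="Compile the solution with MSBuild.">
    <exec program="${framework::get-framework-directory(framework::get-target-framework())}\msbuild.exe"
          commandline="src\MyProduct.sln /p:Configuration=Release /p:OutDir=${project::get-base-directory()}\${build.dir}\" />
  </target>

  <target name="test" depends="compile" description="Run the unit tests with the checked-in NUnit console.">
    <exec program="${tools.dir}\nunit\nunit-console.exe"
          commandline="${build.dir}\MyProduct.Tests.dll /xml=${build.dir}\test-results.xml" />
  </target>

</project>

When a new project is branched from the common project (with svn copy inside the single repository, for example), it picks up this script and the checked-in tools as-is, and only the placeholder names need to change.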


David O’Hara is a Senior Consultant with Improving Enterprises in Dallas, Texas.


Self-Monitoring Build System…

April 24, 2008

[Image: ouroboros]
Currently, I’m helping a client get their Continuous Integration system into a little better shape. They have CruiseControl.NET installed and monitoring their VSS repository (I know, I know, don’t get me started – I’m working on getting Subversion adopted). When spinning up a project, I’m a big advocate of having EVERYTHING under source control, and that includes the build system itself. This is a brief guide on how that works.

CruiseControl Configuration

First you need to configure CC.NET for your environment so that it’s monitoring your SCM and ready to execute your build script. Here’s our ccnet.config that accomplishes this:


	
<cruisecontrol>
  <project name="BuildSystem">
    <artifactDirectory>c:\buildlogs\buildsystem\</artifactDirectory>
    <workingDirectory>c:\build\buildsystem\</workingDirectory>
    <webURL>http://buildserver/ccnet/</webURL>
    <triggers>
      <intervalTrigger />
    </triggers>
    <modificationDelaySeconds>60</modificationDelaySeconds>
    <sourcecontrol type="vss">
      <executable>C:\Program Files\Microsoft Visual Studio\VSS\win32\SS.EXE</executable>
      <project>$/Repos/BuildSystem</project>
      <username>build</username>
      <password>buildpw</password>
      <autoGetSource>true</autoGetSource>
    </sourcecontrol>
    <tasks>
      <nant>
        <executable>tools\nant\nant.exe</executable>
        <buildFile>default.build</buildFile>
        <buildTimeoutSeconds>300</buildTimeoutSeconds>
      </nant>
    </tasks>
    <publishers>
      <xmllogger />
    </publishers>
  </project>
</cruisecontrol>

NAnt Script

This file is then checked into a project called “BuildSystem” (to match the configuration above), along with a build script that will copy the edited files over the top of the existing ones and a copy of NAnt to execute it.

<?xml version="1.0" encoding="utf-8" ?>
<project name="AffiliateIntranet" default="deploy" xmlns="http://nant.sf.net/release/0.86-beta1/nant.xsd">
  <target name="deploy">
    <copy file="ccnet.config" todir="D:\Program Files\CruiseControl.NET\server" overwrite="true" />
    <copy file="dashboard.config" todir="D:\Program Files\CruiseControl.NET\webdashboard" overwrite="true" />
  </target>
</project>

Now when changes are made to the files and checked in, the build system will see the changes and copy the new files over the top of the old ones. CC.NET will then see the new configuration file and restart itself in order to pick up the changes, which is why we have to do the copy with NAnt rather than with anything built into CC.NET.
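The NAnt script above also copies dashboard.config, the webdashboard’s own configuration file. The piece of it you normally edit and keep versioned is the list of build servers the dashboard talks to; a minimal sketch, assuming CC.NET’s default remoting port and a generic server name rather than anything specific to this setup, looks like this:

<?xml version="1.0" encoding="utf-8"?>
<dashboard>
  <remoteServices>
    <servers>
      <!-- One entry per build server to report on; "local" and the default
           remoting URL below are examples, not values from this setup. -->
      <server name="local" url="tcp://localhost:21234/CruiseManager.rem" />
    </servers>
  </remoteServices>
  <!-- The plugins section that ships with the CC.NET webdashboard is left as-is. -->
</dashboard>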

Why, Why, Why

While a configuration like this may seem like overkill, it cuts down on how often you need to remote into the build server, and it shortens the time needed to spin up a new build server should the current one decide to take a “dirt nap”.


David O’Hara is a Senior Consultant with Improving Enterprises in Dallas, Texas.