Continuous Integration Basics: Part 2 – Repeatable Builds with Build Automation

Repeatable builds are central to a successful continuous integration strategy. By repeatable builds I mean that a developer should be able to get the source code from a repository and, in a single step, run a set of complex tasks: checking code quality, compiling the source code, running tests, determining test coverage, labelling versions or deploying finished software. That single-step process must produce the same result for every developer on every machine, so that developers can concentrate on writing code, confident that the supporting processes are in place and providing the desired feedback in a centrally managed manner.

There are a number of build automation platforms; NAnt, MSBuild and Rake are a few of the popular systems in the .NET space. There is no right or wrong answer when choosing a build automation system, although I find MSBuild to be the one I use the most, primarily due to the fact that many companies are not comfortable with non-Microsoft products. It also counts in MSBuild’s favour that any machine with the full .NET Framework installed automatically has MSBuild installed. Personally I think Rake, which is Ruby-based, is a fantastic product that provides a very clean way to configure your builds programmatically; however, I will use MSBuild for all the examples in this post.

I will explore a scenario that includes:

  • Using MSBuild to prepare the project for building
  • Using StyleCop for Code Quality Analysis
  • Using MSBuild to label the source code for a particular build version
  • Using MSBuild to build the source code
  • Using NUnit to run Tests
  • Using MSpec to run Context Specifications
  • Using OpenCover to check the code coverage of the tests

The example code can be downloaded from my Bitbucket repository.

Conventions

The first step to ensuring that code can be built on any machine is to make sure that all source code dependencies are part of the source code repository, so that there are no prerequisites for a build to succeed on a particular machine. I follow a set of conventions for every project that makes this as easy as possible and minimises the time to set up a new project.

[Figures: example project folder structures]

On the left above is the root of a new project, containing a folder for source code (called src) and a folder for all external dependencies (called tools). The build script is in the root folder and is named the same as the solution file in the src directory. A reports directory is created during the build process, where all tool output is collected, possibly to be consumed by a CI tool. The tools directory contains the binary dependencies the source code requires and the external tools the build process depends on. In the past I manually downloaded these tools and placed them in this directory; with the advent of package managers such as NuGet and OpenWrap I let these manage as many of them as possible.

While a build process can be used to perform many different operations, a small number of operations cover what most projects commonly require. A single MSBuild file can contain configurations for many of these operations, and when we run the build file we are able to choose which of them to perform; MSBuild terms them targets. A target may depend on other targets: for example, a target responsible for running tests will depend on the target responsible for compiling the source code, as the test runner needs a compiled assembly to test. The generic targets I begin each project with can be seen below and consist of:

  • Clean (remove any artefacts from a previous build)
  • Label (apply a version number to the projects assemblies)
  • CodeQuality (scan the source code for coding convention irregularities)
  • Compile (compile the source code projects)
  • Test (run unit tests)
  • CodeCoverage (run unit tests and analyse their coverage)
  • Specs (run context specification tests)
  • CI (target used by a CI server that pulls many of the above tasks together, excluding code coverage)
  • CICoverage (target used by a CI server that pulls many of the above tasks together, including code coverage)
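The way targets chain together can be sketched as a minimal illustrative skeleton (target names follow the list above; the bodies are omitted here and filled in by the real build file):

```xml
<!-- Illustrative skeleton only: shows how DependsOnTargets chains targets.
     Running the Test target will first run Clean, then Compile, then Test. -->
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="Clean">
    <!-- remove artefacts from a previous build -->
  </Target>
  <Target Name="Compile" DependsOnTargets="Clean">
    <!-- compile the solution -->
  </Target>
  <Target Name="Test" DependsOnTargets="Clean;Compile">
    <!-- run unit tests against the compiled assemblies -->
  </Target>
  <Target Name="Build" DependsOnTargets="Clean;Compile;Test" />
</Project>
```

Each target runs at most once per build, so shared dependencies such as Clean are not repeated even when several targets list them.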

For tests I follow a convention where each test assembly name ends with “.Tests”; likewise, each context specification assembly name ends with “.Specs”. By following these conventions and using no hard-coded values in the MSBuild configuration, I can reuse the same build file in a new project immediately by just creating the directory structure, adding the source code to the appropriate directories and naming the build file the same as the source code solution file. As the entire build is self-contained, we can integrate the results into the CI server of our choice and not have to rely on a particular CI server’s features. The build can be executed by a developer or a CI server by running msbuild someproject.msbuild from the command line, or for example msbuild someproject.msbuild /t:Test to run just the Test target.

MSBuild configuration file details

I’ll give a brief breakdown of what each section of my standard MSBuild build file does. There are plenty of comments in the file, so it should be mostly self-explanatory.

MSBuild configuration global properties section –

In this section I define paths, file names, executable tool locations, tool options and version numbers that are used throughout the configuration. All paths are defined relative to the MSBuild configuration file’s location and names are based on the MSBuild file’s name.

<PropertyGroup>
	<!-- The build target configuration (Debug versus Release) -->
	<Configuration>Debug</Configuration>

	<!-- General Paths -->
	<!-- The root directory containing the build file -->
	<RootPath>$(MSBuildProjectDirectory)</RootPath>
	<!-- The source code directory -->
	<SrcBasePath>$(RootPath)\src</SrcBasePath>
	<!-- The tools directory -->
	<ToolsBasePath>$(RootPath)\tools</ToolsBasePath>
	<!-- The reports directory -->
	<ReportsPath>$(RootPath)\reports</ReportsPath>
	<!-- The Source Code Solution Name, this is conventions based and should be named the same as the build file.
			 e.g. example.sln should have a matching example.msbuild file in the top level directory -->
	<BuildSolutionFile>src\$(MSBuildProjectName).sln</BuildSolutionFile>
	<!-- The tools path for the MSBuild Extension Pack -->
	<MSBuildExtensionsPath>$(ToolsBasePath)\MSBuild\MSBuildExtensionPack</MSBuildExtensionsPath>
	<!-- The tools path for the MSBuild Community Tasks Pack -->
	<MSBuildCommunityExtensionsPath>$(ToolsBasePath)\MSBuild\MSBuildCommunityTasks</MSBuildCommunityExtensionsPath>

	<!-- NUnit -->
	<!-- The tools path for NUnit -->
	<NUnitPath>$(ToolsBasePath)\NUnit.2.5.10.11092\tools</NUnitPath>
	<!-- NUnit report name and location -->
	<NUnitOuputFile>$(ReportsPath)\tests-output.xml</NUnitOuputFile>

	<!-- StyleCop -->
	<!-- The tools path for StyleCop -->
	<StyleCopPath>$(ToolsBasePath)\StyleCop.4.6.3.0</StyleCopPath>
	<!-- The StyleCop report name and location -->
	<StyleCopOutputFile>$(ReportsPath)\stylecop-output.xml</StyleCopOutputFile>
	<!-- The StyleCop max violations count -->
	<StyleCopMaxViolationCount>50</StyleCopMaxViolationCount>
	<!-- StyleCop Force Full Analysis -->
	<StyleCopForceFullAnalysis>true</StyleCopForceFullAnalysis>
	<!-- StyleCop Treat Errors As Warnings -->
	<StyleCopTreatErrorsAsWarnings>true</StyleCopTreatErrorsAsWarnings>
	<!-- StyleCop Cache Results -->
	<StyleCopCacheResults>false</StyleCopCacheResults>

	<!-- MSpec -->
	<!-- The tools path for MSpec -->
	<MSpecPath>$(ToolsBasePath)\Machine.Specifications.0.4.24.0\tools</MSpecPath>
	<MSpecExecutable>mspec-clr4.exe</MSpecExecutable>
	<MSpecPathOutputFile>$(ReportsPath)\specs-output.xml</MSpecPathOutputFile>
	<MSpecSettings></MSpecSettings>

	<!-- OpenCover -->
	<!-- The tools path for OpenCover -->
	<OpenCoverPath>$(ToolsBasePath)\OpenCover.1.0.719</OpenCoverPath>
	<OpenCoverReportGenPath>$(ToolsBasePath)\ReportGenerator.1.2.1.0</OpenCoverReportGenPath>
	<!-- OpenCover report name and location -->
	<OpenCoverOuputFile>$(ReportsPath)\coverage-output.xml</OpenCoverOuputFile>
	<OpenCoverTmpOuputFile>$(ReportsPath)\coverage-tmp-output.xml</OpenCoverTmpOuputFile>

	<!-- Assembly Versioning -->
	<!-- Major -->
	<AssemblyMajorVersion>1</AssemblyMajorVersion>
	<!-- Minor -->
	<AssemblyMinorVersion>0</AssemblyMinorVersion>
	<!-- Build -->
	<AssemblyBuildNumber>0</AssemblyBuildNumber>
	<!-- Revision -->
	<AssemblyRevision>0</AssemblyRevision>
</PropertyGroup>

Imported Tasks Section –

The targets in MSBuild consist of a set of MSBuild tasks. Many tasks are built in, but third-party task collections exist, such as the MSBuildExtensionPack and MSBuildCommunityTasks. Below are some examples of adding third-party tasks to the configuration file; I include these tasks in the tools directory:

<!--****************-->
<!-- Imported Tasks -->
<!--****************-->

<!-- Include the MSBuild Extension Pack NUnit Task -->
 <UsingTask AssemblyFile="$(MSBuildExtensionsPath)\MSBuild.ExtensionPack.dll"
TaskName="MSBuild.ExtensionPack.CodeQuality.NUnit"/>
<!-- Include the MSBuild Extension Pack AssemblyInfo Task -->
<UsingTask AssemblyFile="$(MSBuildExtensionsPath)\MSBuild.ExtensionPack.dll"
TaskName="MSBuild.ExtensionPack.Framework.AssemblyInfo"/>
<!-- Include the StyleCop task -->
<UsingTask AssemblyFile="$(StyleCopPath)\StyleCop.dll"
TaskName="Microsoft.StyleCop.StyleCopTask"/>

Build Targets Section –

The Clean Target:

<!-- The Clean Target -->
 <Target Name="Clean">
	<!-- Remove the reports directory if it already exists from a previous build -->
	<RemoveDir Directories="$(ReportsPath)" Condition = "Exists('$(ReportsPath)')" />
	<!-- Create the reports directory for this builds output -->
	<MakeDir Directories = "$(ReportsPath)"  />
	<!-- Clean the source code projects -->
	<MSBuild Projects="$(BuildSolutionFile)" ContinueOnError="false" Targets="Clean"
	Properties="Configuration=$(Configuration)" />
  </Target>

The Clean target first removes the reporting directory from a previous run, if it exists, using a Condition attribute, before recreating an empty reports directory. The MSBuild task is used to perform operations on a Visual Studio solution file; here we tell MSBuild to execute the built-in “Clean” target on the solution. This is the same as clicking the Build –> Clean Solution menu in Visual Studio.

The Label Target:

 <!-- The Label Target that sets the AssemblyInfo Build Version -->
 <Target Name="Label">
	<!-- Include all AssemblyInfo.cs files, excluding those of the Tests and Specs projects (this is convention based) -->
	<CreateItem Include="**\AssemblyInfo.cs" Exclude="**\*.Tests\Properties\AssemblyInfo.cs;**\*.Specs\Properties\AssemblyInfo.cs">
		<Output TaskParameter="Include" ItemName="AssemblyInfoFiles" />
	</CreateItem>
	<!-- Update the Assembly and File Version -->
	<MSBuild.ExtensionPack.Framework.AssemblyInfo AssemblyInfoFiles="@(AssemblyInfoFiles)" SkipVersioning="false" Condition="'$(CCNetLabel)' != ''"
		AssemblyMajorVersion="$(AssemblyMajorVersion)"
		AssemblyMinorVersion="$(AssemblyMinorVersion)"
		AssemblyBuildNumber="$(AssemblyBuildNumber)"
		AssemblyRevision="$(CCNetLabel)"
		AssemblyFileMajorVersion="$(AssemblyMajorVersion)"
		AssemblyFileMinorVersion="$(AssemblyMinorVersion)"
		AssemblyFileBuildNumber="$(AssemblyBuildNumber)"
		AssemblyFileRevision="$(CCNetLabel)"
		/>
 </Target>

The Label target is mostly used by a CI server to version a particular build. This excerpt is from a CruiseControl.NET CI server setup and uses the global property “CCNetLabel” to set the revision number in the assembly version and assembly file version details; I use the MSBuildExtensionPack AssemblyInfo task to accomplish this. The CreateItem task is used to get a list of AssemblyInfo.cs files from the solution, excluding those of the Tests and Specs assemblies, provided they conform to the conventions of having names ending in .Tests or .Specs.

The Code Quality Target:

 <!-- The Code Quality Target, checks the source code for stylistic compliance via StyleCop -->
<Target Name="CodeQuality">
	<!-- Create a collection of files to scan -->
	<CreateItem Include="$(SrcBasePath)\**\*.cs">
		<Output TaskParameter="Include" ItemName="StyleCopFiles"/>
	</CreateItem>
	<!-- Run the StyleCop MSBuild task -->
	<Microsoft.StyleCop.StyleCopTask
		ProjectFullPath="$(RootPath)"
		SourceFiles="@(StyleCopFiles)"
		ForceFullAnalysis="$(StyleCopForceFullAnalysis)"
		TreatErrorsAsWarnings="$(StyleCopTreatErrorsAsWarnings)"
		CacheResults="$(StyleCopCacheResults)"
		OverrideSettingsFile="$(SrcBasePath)\Settings.StyleCop"
		OutputFile="$(StyleCopOutputFile)"
		MaxViolationCount="$(StyleCopMaxViolationCount)">
	</Microsoft.StyleCop.StyleCopTask>
</Target>

The CodeQuality target uses the StyleCop task to scan the solution’s code files for stylistic compliance. An interesting parameter to take note of is “TreatErrorsAsWarnings”: if you set this to false, your build will fail on stylistic errors. This is desirable on new projects but may be difficult on existing projects until the code base is cleaned up with a tool such as ReSharper or CodeMaid.
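As an illustrative fragment (not part of the build file shown in this post), strict checking for a new project would simply mean flipping the property defined in the global properties section:

```xml
<!-- Illustrative only: fail the build on StyleCop violations (suits a new project) -->
<PropertyGroup>
	<StyleCopTreatErrorsAsWarnings>false</StyleCopTreatErrorsAsWarnings>
</PropertyGroup>
```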

The Compile Target:

<!-- The Compile Target, compiles the source code for the solution -->
<Target Name="Compile" DependsOnTargets="Clean">
	<MSBuild Projects="$(BuildSolutionFile)" ContinueOnError="false" Properties="Configuration=$(Configuration)">
		<Output ItemName="BuildOutput" TaskParameter="TargetOutputs"/>
	</MSBuild>
  </Target>

The compile target is relatively simple and uses the MSBuild task to build the projects in the solution.

The Test Target:

<!-- The Test Target, runs unit tests on the compiled source code via NUnit -->
<Target Name="Test" DependsOnTargets="Clean;Compile">
	<!-- Include all assemblies that end in Tests.dll (This is convention based) -->
	<CreateItem Include="**\Bin\Debug\*Tests*.dll" >
		<Output TaskParameter="Include" ItemName="TestAssemblies" />
	</CreateItem>

	<MSBuild.ExtensionPack.CodeQuality.NUnit Assemblies="@(TestAssemblies)" ToolPath="$(NUnitPath)" OutputXmlFile="$(NUnitOuputFile)">
	</MSBuild.ExtensionPack.CodeQuality.NUnit>
</Target>

The Test target uses the MSBuildExtensionPack NUnit task to execute the NUnit console runner against assemblies that match our test convention, i.e. names ending in .Tests.

The CodeCoverage Target:

 <!-- The Code Coverage Target, checks code coverage using opencover and NUnit, the
			task generates both a coverage report and the test report -->
<Target Name="CodeCoverage" DependsOnTargets="Clean;Compile">
	<!-- Include all assemblies that end in Tests.dll (This is convention based) -->
	<CreateItem Include="**\Bin\Debug\*Tests*.dll" >
		<Output TaskParameter="Include" ItemName="TestAssemblies" />
	</CreateItem>

	<!-- Execute opencover -->
	<Exec Command="$(OpenCoverPath)\OpenCover.Console.exe -register:user -target:&quot;$(NUnitPath)\nunit-console.exe&quot; -targetargs:&quot;/noshadow @(TestAssemblies) /domain:single /xml:$(NUnitOuputFile)&quot; -filter:&quot;-[$(MSBuildProjectName)*.Tests]* +[$(MSBuildProjectName)*]$(MSBuildProjectName).*&quot; -output:$(OpenCoverTmpOuputFile)" />
	<!-- Use ReportGenerator Tool to build an xml Summary -->
	<Exec Command="$(OpenCoverReportGenPath)\ReportGenerator.exe &quot;$(OpenCoverTmpOuputFile)&quot; &quot;$(ReportsPath)&quot; XmlSummary" />
	<!-- Report Generator has no way to name the output file so rename it by copying and deleting the original file -->
	<Copy SourceFiles="$(ReportsPath)\Summary.xml" DestinationFiles="$(OpenCoverOuputFile)"></Copy>
	<Delete Files="$(ReportsPath)\Summary.xml"></Delete>
	<!-- Delete the original opencover output before it was transformed by ReportGenerator -->
	<Delete Files="$(OpenCoverTmpOuputFile)" />
</Target>

The CodeCoverage target is perhaps the most complex, mainly because I am using an open source coverage framework called OpenCover that is relatively young and does not yet have a dedicated MSBuild task. If you are willing to pay for a licence, NCover has better MSBuild support. Points to take note of are the use of &quot; to escape quotes in the command line arguments passed to the Exec MSBuild task, and the use of the ReportGenerator tool after execution to transform the output from OpenCover.
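To illustrate the escaping in isolation (the tool path here is hypothetical), the XML entity &amp;quot; in an attribute value becomes a literal double quote by the time the command is executed, which is what keeps paths containing spaces intact:

```xml
<!-- Illustrative only: &quot; becomes " on the command line, so the
     quoted path survives even though it contains spaces -->
<Exec Command="sometool.exe -target:&quot;C:\path with spaces\nunit-console.exe&quot;" />
```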

The Specifications Target:

 <!-- The Specs Target, runs the context specifications on the compiled source code via MSpec -->
<Target Name="Specs" DependsOnTargets="Compile">
	<!-- Include all assemblies that end in Specs.dll (This is convention based) -->
	<CreateItem Include="**\Bin\Debug\*Specs*.dll" >
		<Output TaskParameter="Include" ItemName="SpecsAssemblies" />
	</CreateItem>

 	<PropertyGroup>
		<MSpecCommand>
			$(MSpecPath)\$(MSpecExecutable) $(MSpecSettings) @(SpecsAssemblies) --xml $(MSpecPathOutputFile)
		</MSpecCommand>
	</PropertyGroup>
	<Message Importance="high" Text="Running Specs with this command: $(MSpecCommand)"/>
	<Exec Command="$(MSpecCommand)" />
 </Target>

The Specs target uses the MSBuild Exec task to execute the Machine.Specifications console runner against assemblies that match our convention, i.e. names ending in .Specs.

The Build Target:

 <!-- The default build task that pulls the other tasks together, usually executed by a developer -->
<Target Name="Build" DependsOnTargets="Clean;CodeQuality;Compile;Test;Specs">
</Target>

The Build target is the default target (specified in the root Project node at the top of the file). This target just lists the targets it depends on, and these are executed in order. It would be used by a developer to execute the build on a local machine; it does not label the project assemblies.

The CI Target:

 <!-- The CI build task that pulls the other tasks together and includes assembly labelling that pulls in the CI servers build number -->
<Target Name="CI" DependsOnTargets="Clean;CodeQuality;Label;Compile;Test;Specs">
</Target>

The CI target is the same as the default Build target, with the addition of labelling the project assemblies with the next build number. This target would be specified on the command line by a CI server.

The CICoverage Target:

 <!-- The CI build task that pulls the other tasks together, tests are run including codecoverage and includes assembly labelling that pulls in the CI servers build number -->
<Target Name="CICoverage" DependsOnTargets="Clean;CodeQuality;Label;Compile;CodeCoverage;Specs">
</Target>

The CICoverage target is the same as the CI target, with the addition of code coverage information.

This simple set of targets, when used with project conventions, makes it possible to produce valuable repeatable builds with minimal effort: it is literally a case of creating the directory structure, adding the source code and naming the build file to match the source code solution, and you are ready to build. There is obviously a LOT more one can do, and I hope to explore some complex real world problems in future posts.


Continuous Integration Basics: Part 1 – Introduction

Setting up a Continuous Integration server is one of the first steps I perform when starting a new project; it’s become a development practice that I can’t live without. If you are working in a team with even a few members, the benefits are simply immense. In this series of posts I will explore the basics of CI and then move on to some real world examples of using it. I hope to cover areas such as automated testing, enforcing source code formatting, documentation generation, database change script generation, installation package creation and automated deployment in relation to CI.

In its simplest form CI is an automated way of frequently checking the state (in terms of quality) of a software project. In order for the state of the project as a whole to be checked, the work of each team member needs to be assessed as an integrated part of the project. This inherently means that frequent integration of each individual’s tasks is essential to the process. While many teams unaccustomed to CI initially feel this is not possible, experience has shown that it is not just possible but that integration concerns very quickly become “non-events” in practice, with team members usually integrating with no ill effect at least once a day.

CI, however, should not stop at being just a source code integration tool; when utilised to its fullest, a CI server can provide a confidence in your product from development to production deployment that cannot be rivalled by manual processes.

[Figure: CI server overview]

There are a myriad of CI servers available and I will discuss some of them in detail in later posts however the basics stay the same. Most offer a web front-end to manage and control the build process, modules to manage the server and per project configuration as well as security and access control. The basic operation of a CI server is to watch a versioned source code repository for changes, usually at a set time interval, and if changes are detected to execute the tasks associated with the project. The task execution engines are generally very configurable and have the ability to check in/out source code, build documentation, label versions, compile source code, run tests and deploy projects.

As a rule I don’t suggest using the CI server’s built-in capabilities to execute a source code compilation task, but rather a dedicated build automation platform like MSBuild or NAnt, which most CI servers will also support. It is of utmost importance that a source code build is self-sufficient and can be compiled on any machine the code base is checked out to, in exactly the same manner as on the CI server, without the need for environmental dependencies. Using a build automation platform also ensures that if you do change your CI server, the source code build will still run as expected.

[Figure: build automation]

The build automation platform can be used to automate a wide variety of quality control and administrative tasks as part of the CI process:

  • Most importantly the source code compilation
  • Checking for adherence to source code formatting guidelines (StyleCop, FxCop)
  • Running automated tests, probably the biggest benefit to business when correctly implemented (NUnit, MSpec)
  • Assessment of test coverage (NCover, PartCover)
  • Generation of technical documentation (SandCastle)
  • Auto generation of change scripts for deployment to a production environment (Redgate SQL Tools)
  • Where applicable installation packages can automatically be created
  • For advanced scenarios an automated deployment procedure can be created and possibly result in Continuous Deployment

CI is a simple process and can be evolved over time; even the first step of having the server check out and build your project’s source code from version control a few times a day will provide immediate benefits and a platform for spending more time on better software practices.


An ASP.NET MVC 2 & JQueryUI example: Part 2 – Jumping in

Since this series is primarily about applying the JQueryUI themes to an ASP.NET application, we will start by looking at how the theme is stored for a user and how this translates into a CSS file applied to the website for each individual user.

Theme storage and retrieval:

The first thing I did was add a user accessible property to the Profile Provider configuration in the web.config as shown below.

Listing 1:

<profile>
  <providers>
    <clear/>
    <add name="AspNetSqlProfileProvider" type="System.Web.Profile.SqlProfileProvider" connectionStringName="ApplicationServices" applicationName="/" />
  </providers>
  <properties>
    <add name="Theme"
      type="System.String"  />
   </properties>
</profile>

As you can see in Listing 1, I am using the standard AspNetSqlProfileProvider and have added a property called “Theme” of type string. This allows us to store the chosen theme individually for each user registered by the Membership Provider. I modified the default AccountService, which is created by the ASP.NET MVC 2 Project Wizard in the “AccountModels.cs” file, by adding the line “ProfileBase.Create(userName, true)” (Listing 2 line 10).

Listing 2:

public MembershipCreateStatus CreateUser(string userName, string password, string email)
{
    if (String.IsNullOrEmpty(userName)) throw new ArgumentException("Value cannot be null or empty.", "userName");
    if (String.IsNullOrEmpty(password)) throw new ArgumentException("Value cannot be null or empty.", "password");
    if (String.IsNullOrEmpty(email)) throw new ArgumentException("Value cannot be null or empty.", "email");

    MembershipCreateStatus status;
    _provider.CreateUser(userName, password, email, null, null, true, null, out status);

    ProfileBase.Create(userName, true);

    return status;
}

When a user registers on the website, a user profile will be created automatically; however, it will not have any data in it yet. Setting profile properties during the registration process will NOT work, as the request is not in fact authenticated until the user logs in. We get around this by checking for an empty profile when retrieving the user’s theme choice and setting a default theme if one does not exist.

In order to set the correct CSS file for a user in the Site.Master file, we will use a helper method to retrieve the CSS file name.

Listing 3:

<head runat="server">
    <link href="<%: Url.Content("~/Content/default.css")%>" rel="stylesheet" />
    <link href="<%: Url.Content("~/Content/themes/" + Html.GetTheme() +  "/jquery.ui.all.css")%>" rel="stylesheet" />

    <script type="text/javascript" src="<%:Url.Content("~/Scripts/jquery-1.4.2.min.js") %>"></script>
    <script type="text/javascript" src="<%:Url.Content("~/Scripts/jquery-ui.1.8.2.min.js") %>"></script>
    <script type="text/javascript" src="<%:Url.Content("~/Scripts/jquery-getCSS.min.js") %>"></script>
    <script type="text/javascript" src="<%:Url.Content("~/Scripts/jquery.loadImages.1.0.1.min.js") %>"></script>
    <script type="text/javascript" src="<%:Url.Content("~/Scripts/mvcjqueryuiexample.js") %>"></script>
    <title>
        <asp:ContentPlaceHolder ID="TitleContent" runat="server" />
    </title>
    <script type="text/javascript">
        $(document).ready(initialise);
    </script>
    <asp:ContentPlaceHolder ID="HeaderContent" runat="server" />
</head>

As you can see, we use “Html.GetTheme()” (Listing 3 line 3) to retrieve the theme name; the “GetTheme()” method is shown below.

Listing 4:

public static MvcHtmlString GetTheme(this HtmlHelper helper)
{
    string baseTheme = "ui-lightness";
    string theme = baseTheme;
    if (helper.ViewContext.HttpContext.User.Identity.IsAuthenticated)
    {
        theme = helper.ViewContext.HttpContext.Profile.GetPropertyValue("Theme").ToString();
        if (string.IsNullOrEmpty(theme))
        {
            helper.ViewContext.HttpContext.Profile.SetPropertyValue("Theme", baseTheme);
            theme = baseTheme;
        }
    }
    return MvcHtmlString.Create(theme);
}

The method first checks whether the current request is authenticated; if not, it just returns the default theme name. If the request is authenticated, it uses the profile provider to retrieve the “Theme” profile property; if the property does not exist we set it to the default theme, otherwise we return the user’s chosen theme.

The theme name stored in the profile corresponds directly to the name of a theme directory in the /Content/themes directory. These are the standard themes included in the JQueryUI themes package and can be downloaded from the JQueryUI blog.

Listing 3 has another very important bit of code: lines 13-15 show a piece of JQuery which hooks the document ready event and calls the initialise function. This function resides in the “mvcjqueryuiexample.js” file, which I will discuss in detail later; for now just take note that we are initialising some client-side scripts on each page load.

A quick aside – the User Menu Area:

Being able to store a user’s chosen theme against their profile is all well and good, but there needs to be a way to change the theme. The gateway to the profile editing pages is the user menu area, which I will discuss briefly before talking about how the theme itself is selected.

In the Site.Master file I use the Html.RenderPartial helper method to render a usercontrol, in this case “LogOnUserControl.ascx”. Because it lives in the Site.Master master page, it is rendered on every page and so needs to differentiate between authenticated and anonymous users.

Listing 5:

<%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<dynamic>" %>
<% if (Request.IsAuthenticated)
   {
%>
<ul>
    <li>
        <img alt="avatar" src="<%:Html.GetGravatarUrl(24)%>" width="24px" height="24px" />
    </li>
    <li>
        <a class="ui-button ui-state-default ui-corner-all ui-button-text-only" href="<%: Url.Content("~/account/editprofile/") %>">
            <%: Page.User.Identity.Name %>
        </a>
    </li>
    <li>
        <a class="ui-button ui-state-default ui-corner-all ui-button-text-only" href="<%: Url.Content("~/account/logoff/") %>">
        Log Out
        </a>
    </li>
</ul>
<% }
   else
   {
%>
<ul>
    <li></li>
    <li>
        <a class="ui-state-default ui-button-text-only ui-corner-all ui-button" href='<%: Url.Content("~/account/logon/") %>'>
            Log On
        </a>
    </li>
</ul>
<%} %>

The user control just uses a simple conditional statement to render different content based on whether the request is authenticated or not. For an authenticated request it uses another HtmlHelper to render a gravatar, a button carrying the user’s username that links to the profile editor, and a log off button; for an anonymous request only a log on button is rendered. The gravatar URL is created by generating an MD5 hash of the user’s email address and appending this, along with the requested image size, to the gravatar service URL. The code is shown below.

Listing 6:

public static string GetGravatarUrl(this HtmlHelper helper, int imageSize)
{
	string result = string.Empty;

	MembershipUser user = Membership.GetUser();
	if (user != null)
	{
		// Hash the user's email address with MD5 and format it as a
		// lower-case hex string, as required by the gravatar service.
		using (var md5 = new System.Security.Cryptography.MD5CryptoServiceProvider())
		{
			byte[] hashBytes = md5.ComputeHash(System.Text.Encoding.UTF8.GetBytes(user.Email));
			var hash = new System.Text.StringBuilder();
			foreach (byte b in hashBytes)
			{
				hash.Append(b.ToString("x2"));
			}

			result = string.Format("http://www.gravatar.com/avatar/{1}.png?s={0}", imageSize, hash);
		}
	}

	return result;
}

Theme Selection:

In the previous section I described how the user menu area displays a button with the currently logged in user’s name on it as a means to edit the user’s profile. I’ll now show how the user is able to preview and select a theme from the available themes. For simplicity’s sake, on application startup I retrieve a list of subdirectories in the /Content/themes directory and store these in the application state as shown below.

Listing 7:

protected void Application_Start()
{
	// build the list of themes
	string physicalPath = Server.MapPath("~/content/themes");
	string[] themeDirs = Directory.GetDirectories(physicalPath);
	IList<string> themes = new List<string>();
	foreach (string themeDir in themeDirs)
	{
		string theme = themeDir.Split(new char[] { '\\' }).Last();
		if (theme != "base")
			themes.Add(theme);
	}

	Application.Add("themes", themes);

	AreaRegistration.RegisterAllAreas();

	RegisterRoutes(RouteTable.Routes);
}

The actual theme selection happens on the edit profile view. This view contains two tabs, for editing basic user details and profile settings, as shown below.

image

image

This view is managed by the account controller using three methods: “EditProfile”, which returns the view and the data for both tabs; “EditProfileBasic”, which saves the data from the basic tab; and “EditProfileDetails”, which saves the data from the profile tab. Notice the use of the “Authorize” attribute to ensure only authenticated users can edit their profiles, as well as the “HttpPost” attribute on the two save methods, which only allows them to be called from a page post. I also redirect back to the “EditProfile” action at the end of the two post methods because the data posted to each method is only for the current tab; if I just returned the view with the current model one of the tabs would have no data, so the data model needs to be refreshed.

Listing 8:

[Authorize]
public ActionResult EditProfile()
{
	MembershipUser user = Membership.GetUser();
	return View(new ProfileModel(User.Identity.Name, user.Email, user.CreationDate.ToShortDateString(), this.HttpContext.Profile.GetPropertyValue("Theme").ToString()));
}

[Authorize]
[HttpPost]
public ActionResult EditProfileBasic(ProfileModel model)
{
	if(ModelState.IsValid)
	{
		if (string.IsNullOrWhiteSpace(model.EmailAddress))
		{
			ModelState.AddModelError("", "Email address may not be empty.");
		}
		else
		if (MembershipService.ChangeEmail(User.Identity.Name, model.EmailAddress) == false)
		{
			ModelState.AddModelError("", "New email address is not valid.");
		}
	}
	return RedirectToAction("EditProfile");
}

[Authorize]
[HttpPost]
public ActionResult EditProfileDetails(ProfileModel model)
{
	if (ModelState.IsValid)
	{
		if (string.IsNullOrWhiteSpace(model.Theme))
		{
			ModelState.AddModelError("", "A theme must be selected.");
		}
		else
		{
			this.HttpContext.Profile.SetPropertyValue("Theme", model.Theme);
		}
	}
	return RedirectToAction("EditProfile");
}

In the EditProfileDetails method in Listing 8 above I save the chosen theme back to the current user’s profile, which in turn changes which CSS file will be returned the next time the user requests a page.

The tabs are created using the JQueryUI library and the view includes the script below in its header to initialise the tabs:

Listing 9:

<script type="text/javascript">
    $(document).ready(initialiseSettings);
</script>

The javascript function initialiseSettings is contained in the mvcjqueryuiexample.js file. This function sets up the tabs and hooks the change event of the dropdown box so that the theme can be previewed.

Listing 10:

function initialiseSettings() {
    /// <summary>
    /// Called from the edit profile page. Sets up the tabs to be JQueryUI tabs, initialises the themes select box and preloads the busy image.
    /// </summary>

    // Setup the tabs
    $("#editprofile").tabs();

    // initialise the themes select options
    initialiseThemes();

    // preload the busy image
    $.loadImages('/Content/ajax-loader.gif', preloadDone); // pass the callback itself, not the result of calling it
}

JQuery is used to select the div with id “editprofile”, and the JQueryUI tabs() method is then called on it to create the tabs. Next the initialiseThemes() method is called, which hooks the dropdown change event. Finally I preload the busy spinner image that will be shown while a new theme is applied, so that there is no delay in showing the animation; the preloading is done via a JQuery plugin (loadImages) which is included in the scripts directory.

As the theme preview is rather complicated I will talk about this in part three of this series.

Posted in enterprise-examples | 19 Responses

An ASP.NET MVC 2 & JQueryUI example : Part 1 – Introduction

MVCJQueryUIExample is an ASP.NET MVC example application that uses the JQueryUI CSS theme framework to create a fully themeable user interface which allows designers to use JQueryUI’s themeroller application to create new application themes. The demo source code shows my personal take on creating themeable CSS frameworks and some cute JQuery to allow previews of themes on the fly.

Below are some examples of the application using some of the standard themes available, all applied with no change to the code base:

image

image

image

Requirements:

  • You will need Visual Studio 2010 or Visual Web Developer 2010 Express installed to build the source.
  • You will need ASP.NET MVC 2 installed; I used the Microsoft Web Platform Installer to install it.
  • You will need Microsoft SQL Server 2008 Express installed, as the project uses the SQL ASP.NET membership provider to store the selected theme against a user’s profile in the Visual Studio generated ASPNETDB.

I am assuming that you already have a basic understanding of ASP.NET MVC, are at least familiar with the ASP.NET membership system and have a reasonable knowledge of JavaScript.

The source code is available for download here.

Basic Concept:

The demo website is fairly simple, consisting of a Header Area (which contains a User Menu Area and a Logo Area), a Content Area and a Footer Area.

Outline

User Menu Area:
The User Menu Area’s content changes based on whether the current user is logged on or not.

When logged on, the User Menu Area contains the user’s personal gravatar if they have one, their username, which when clicked takes them to a page where they can edit their profile, including changing their preferred application theme, and a log off button.

UserMenuLogoff

When not logged on the User Menu Area only displays the log on button.

UserMenuLogon

Content Area:
The Content Area holds two types of content: page views such as Home, Products, Blog, About and Contact, and administrative views such as Logon, Registration and Profile Editing.

When displaying page content the Content Area uses a two column layout with tabs representing the pages available. Clicking a tab results in a browser request, and only the content for the selected tab is loaded at any one time. For this demo the individual content pages just display portions of Lorem ipsum as fake content and the sidebar data is static.

PageContent

When displaying administrative views the Content Area uses a single column layout and always displays a link to the Home page.

AdminContent

The administrative pages in the application are:

  • A logon page
  • A new user registration page
  • A user details and profile editor page

ProfileEdit

The user details and profile editor page has tabs that do not cause a request to the server when switching tabs; the data for both tabs is available without a postback.

The Details:

Now that you know what the website is designed to do we can get down to the details and show how it is done.

The CSS theme structure:

If you open the project in Visual Studio you will see a project layout as below. In the content directory there is a themes subdirectory and then a further sub directory for each theme.

image

At the top level of the content directory the following CSS files are present:

  • reset.css (a file taken from the Yahoo User Interface framework that resets element styles to common values across all browsers as each browser has different default values).
  • reset-fonts.css (also taken from the Yahoo User Interface framework and similar in that it resets all font styles for elements to common values across browsers).
  • typography.css (a CSS file where we place any font based styles for our application that are not theme specific)
  • layout.css (a CSS file where we place all styles that affect the layout of our application, these are not theme specific but based on how we want to structure the site layout. It should not contain any styles that set colors or fonts).
  • default.css (a CSS file that includes all the above files so that we can just include one file in our application).

The themes directory contains all the JQueryUI themes from the 1.8.2 build. Each theme subdirectory contains a CSS file named “jquery.ui.all.css”, so changing a theme simply means linking to the same file name in a different directory; every “jquery.ui.all.css” file contains exactly the same CSS classes, just with different colors, fonts and so on for that theme. You can make a new theme with the themeroller application and place it in the themes directory to make it available to your application.
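Because every theme ships an identically named stylesheet, the href for a given theme can be computed mechanically. A minimal sketch (the helper name is hypothetical, not from the demo code):

```javascript
// Hypothetical helper: build the stylesheet href for a theme. Every theme
// folder under /Content/themes contains the same "jquery.ui.all.css", so
// switching themes is only a change of path.
function themeStylesheetHref(themeName) {
  return '/Content/themes/' + themeName + '/jquery.ui.all.css';
}
```

Swapping the href of the theme link element to this value is all that a theme change requires on the client.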

JQueryUI:

The JQueryUI widgets have great documentation so I will just discuss how I used their CSS framework in the demo app.
JQueryUI is basically a set of UI widgets such as accordion, button, datepicker, dialog and so on. In order to keep a consistent styling across these widgets, the JQueryUI team developed a set of CSS classes that can be reused and composed together to set common styles. These styles fall into the following areas:

  • Widget Containers – styles pertaining to containers, container content and headers.
  • Interaction States – styles pertaining to clickable elements interactions including a default style, hover style, active style and focus style.
  • Interaction Cues – styles pertaining to elements that are highlighted, disabled or in an error state.
  • Icons – styles pertaining to how images and icons are depicted; the JQueryUI themes contain a set of default icons whose colors are set for each theme.
  • Miscellaneous Visuals – corner styles and overlay styles.

By applying these styles to the elements of our application we can leverage the same framework as the JQueryUI widgets and get a themeable application. For example, if we apply the widget container styles to our broad container areas (Header, User Menu, Content, Footer) and the interaction styles to our links and tabs, we are immediately able to use any of the JQueryUI standard themes and have the whole application respond to a theme change. I’ll dive into the code a little later to show you exactly how I did this.
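The demo’s button-styled links combine a handful of these framework classes (ui-button, ui-state-default, ui-corner-all, ui-button-text-only). As a purely illustrative sketch (this helper is hypothetical, not part of the demo), composing such a class attribute is just string assembly:

```javascript
// Hypothetical helper: compose the JQueryUI framework classes used to make
// a plain link look like a themed button, optionally adding extra classes
// (e.g. a state class such as "ui-state-hover").
function uiButtonClasses(extra) {
  var classes = ['ui-button', 'ui-state-default', 'ui-corner-all', 'ui-button-text-only'];
  return classes.concat(extra || []).join(' ');
}
```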

ASP.NET MVC 2:

The ASP.NET MVC part of the project has very few changes from the default project created by the project wizard. There are only two controllers, the generated AccountController and the HomeController; the HomeController has a method for each page view and just returns an appropriate ViewResult.

The first real change in the ASP.NET MVC project is in how I set up the master pages. I have one Site.Master that includes all the appropriate scripts and CSS files; it also renders the header and footer areas and has a content placeholder for the Content Area. There are two nested master pages, OneColumn.master and TwoColumn.master, which can plug into the Site.Master content placeholder; as mentioned earlier, the page views use the TwoColumn master page while administrative views use the OneColumn master page.

The second change is that I have set up the Profile Provider in the Web.config file with a property called “Theme” that allows us to store the user’s chosen theme against their ASP.NET membership profile.

image

That’s the end of the introduction to the demo; hopefully it gives you an idea of the intent and basic concepts. In part 2 we will dive into the code.

Posted in enterprise-examples | 1 Response

Source Control – Part 3 – Installing a local Mercurial Server

If you are going to work as part of a development team that requires its own private repository you will need to install the Mercurial server components. There are a few options for this, but my personal choice for a Windows development team is to serve the Mercurial repository via an IIS web server. This guide assumes you already have IIS installed.

As Mercurial is written in Python you will first need to install Python on your server. To see which version of Python you need, take a look at the available Windows download packages at http://bitbucket.org/tortoisehg/thg-winbuild/downloads/. The Windows builds come in a number of flavours; some bundle a standalone version of Python, which won’t work here as we need to install a number of dependent Python site packages for IIS to serve Mercurial.

image

Most of the installation information below was found on Matt Hawley’s blog here and is by no means my own, I have added some alternate steps and information at the end as to how I configured my setup.

    1. Look for the latest Mercurial download ending in win32-pyX.X.exe, in this case mercurial-1.5.4.win32-py2.6.exe, meaning Python 2.6 is required. Download the appropriate Mercurial package to use later.
    2. Now download the appropriate Python version from http://www.python.org/download/releases/, in this case Python 2.6.5.
    3. Install Python by running the installer and leaving the default directory of c:\Python26\. Make sure you add c:\Python26 to your path.
    4. Download PyWin32, a package required to call Win32 functions from Python, from SourceForge. Run the installer and let it decide the default installation paths for you.
    5. Install the mercurial-1.5.4.win32-py2.6.exe package we downloaded in the first step and again let the installer decide on the appropriate install paths.
    6. Download the isapi-wsgi package from http://code.google.com/p/isapi-wsgi/; the appropriate download at this time is isapi_wsgi-0.4.2.win32.exe. Install by running the installer and letting it once again decide the installation paths.
    7. Next you need the Python script hgwebdir_wsgi.py, which is only available in the source package of Mercurial. The simplest way to get it is to point your web browser at http://selenic.com/repo/hg-stable/, click browse in the left menu, then choose the contrib directory followed by the win32 directory. You should be at a page as below.
      image
      Right-click the hgwebdir_wsgi.py file and save it to your local machine.
    8. Create a directory which will host your Mercurial website in IIS, such as c:\inetpub\wwwroot\hg.
    9. Copy hgwebdir_wsgi.py into this directory.
    10. Create a directory where you will store your Mercurial code repositories, such as c:\repositories\.

Create a file called hgweb.config in a text editor with the following content:

[collections]
c:\repositories\ = c:\repositories\

and save it in your Mercurial website directory (c:\inetpub\wwwroot\hg in the example).
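The [collections] entry maps a filesystem root to itself, which publishes every repository found under c:\repositories\. If you would rather publish repositories individually, hgweb also supports a [paths] section mapping a virtual path to a single repository (the repository name below is illustrative):

```ini
[paths]
hg-stable = c:\repositories\hg-stable
```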

  • Open the hgwebdir_wsgi.py file in a text editor and find the following line, substituting your own path:
    # Configuration file location
    hgweb_config = r'C:\inetpub\wwwroot\hg\hgweb.config'
  • Open a command prompt, change to your Mercurial website directory and run the command python hgwebdir_wsgi.py, which will generate a DLL shim called _hgwebdir_wsgi.dll.
  • Open the IIS management console and create a new application pool called Mercurial. Set the .NET Framework Version to “No Managed Code” and leave the pipeline as Integrated Mode. If you are running a 64-bit operating system, be sure to open the advanced properties and set Enable 32-Bit Applications to True.
  • Browse to your hg directory (or whatever you called your Mercurial website) under the Default Web Site node, right-click and choose Convert to Application. Be sure to select your newly created Mercurial application pool for the application.
    image
  • Open the Handler Mappings for your new application (not the Default Web Site’s mappings!).
    image
  • Add a new “Wildcard Script Map” with the executable location pointing to the _hgwebdir_wsgi.dll in your Mercurial web directory, calling it something like Mercurial-ISAPI.
    image
  • Click OK and, when it prompts you to allow this ISAPI extension, click “Yes”.
    image
  • If you now browse to your Mercurial website at http://localhost/hg you should see the following.
    image
  • Now let’s add a Mercurial repository; we will add the Mercurial source itself as an example using TortoiseHg. Using Windows Explorer browse to c:\repositories (or whatever you chose as your repository location in step 10), right-click and choose TortoiseHg –> Clone. Enter the source and destination paths as below and click Clone.
    image
  • Once done, browse to your Mercurial website at http://localhost/hg and you should see the hg-stable repository, which you can browse deeper into by clicking the hg-stable link.
    image

 

While being able to browse your repositories visually is pretty cool, the main reason for setting up the local Mercurial server was to allow pushing and pulling code to a central local repository. At this stage you will be able to pull code over HTTP from your repository by specifying http://localhost/hg/hg-stable in the source path of the TortoiseHg Clone dialog, but you will not yet be able to push changes back. We will cover the configuration steps required for this in part 4 of this article.

As I mentioned before most of these steps are shown in Matt Hawley’s blog post with a few modifications that I made along the way to suit my own preferences.

Posted in enterprise-basics | 2 Responses

Source Control – Part 2 – Installing a Mercurial Client

The Mercurial project provides dedicated Windows installers, so installing Mercurial on Windows is pretty easy. If you have used Subversion you will more than likely be familiar with TortoiseSVN, the integrated Subversion client for Windows Explorer. TortoiseHg is the Mercurial equivalent of TortoiseSVN and can be downloaded directly from the Mercurial download website for 32-bit or 64-bit platforms.

image

The file you download is a standard windows installer that will setup TortoiseHg for you.

image

Once installed you will get the Mercurial menu options in the Windows Explorer context menu, shown above running side by side with TortoiseSVN. Don’t forget to add your TortoiseHg directory to your path so that you can use Mercurial from the command line. Also remember to turn off Windows indexing for your development folders, as performance can be greatly affected: right-click on your top level development folder, and on the General tab click Advanced and uncheck “Allow files in this folder to have contents indexed…”.

image

If you are going to be part of a team that needs to use their own local private Mercurial repository be sure to take a look at Part 3 – Installing a Mercurial Server.

You can sign up for a free bitbucket account at http://www.bitbucket.org, a free Mercurial hosting service that will allow you to share your code and follow the progress of other Mercurial projects.

Posted in enterprise-basics | 6 Responses

Source Control – Part 1 – Introduction

Source control is a non-negotiable facet of enterprise software development, not just for its use as a code repository but for the possibilities it enables in terms of continuous integration, distributed teams, code reviews and overall peace of mind.

There are a large number of source control systems out there with the newer systems being based on a distributed rather than a centralised revision control paradigm. While I have used subversion for many years, the enhancements provided by DVCS products like GIT and Mercurial (Hg) have completely won me over for all projects going forward, as Joel Spolsky said “If you are using Subversion, stop it. Just stop”.

As I develop software mostly on Windows and the .NET platform I find that Mercurial is substantially easier to install and use than Git when you need to have a local server (which is pretty common for an enterprise team). However, I strongly suggest you install and become familiar with both, as there is a significant amount of .NET open source software available on GitHub (http://www.github.com) that you will really want to be able to use, much more so than on Bitbucket (http://www.bitbucket.org), the Mercurial equivalent.

I would also highly recommend reading Joel Spolsky’s tutorial on using Mercurial; it’s a fantastic introduction to using a DVCS, especially if you have used Subversion in the past. Also take a look at the Git book.

Posted in enterprise-basics | Tagged , | Series: | 1 Response