
Test Framework Philosophy

My development team is working to implement and enforce more formal development processes than we have used in the past.  Part of this effort involves deciding which unit test framework to use going forward.  Traditionally we have used NUnit, and it has worked well for our needs, but now that we’re implementing Visual Studio Team System we also have MSTest available.  This has sparked a bit of a debate as to whether we should stick with NUnit or migrate to MSTest.  As we examine the capabilities of each framework and weigh their advantages and disadvantages, I’ve come to realize that the decision is largely a philosophical matter.

MSTest has a bit of a bad reputation.  The general consensus seems to be that MSTest sucks.  A few weeks ago I would have thoroughly agreed with that assessment, but recently I’ve come to reconsider that position.  The problem isn’t that MSTest sucks; it’s that MSTest follows a different paradigm than some other frameworks as to what a test framework should provide.

My favorite feature of NUnit is its rich, expressive syntax.  I especially like NUnit’s constraint-based assertion model.  By comparison, MSTest’s assertion model is limited, even restrictive if you’re used to the rich model offered by NUnit.  Consider the following “classic” assertions from both frameworks:

                      NUnit                          MSTest
Equality/Inequality   Assert.AreEqual(e, a)          Assert.AreEqual(e, a)
                      Assert.AreNotEqual(e, a)       Assert.AreNotEqual(e, a)
                      Assert.Greater(a, e)           Assert.IsTrue(a > e)
                      Assert.LessOrEqual(a, e)       Assert.IsTrue(a <= e)
Boolean Values        Assert.IsTrue(a)               Assert.IsTrue(a)
                      Assert.IsFalse(a)              Assert.IsFalse(a)
Reference             Assert.AreSame(e, a)           Assert.AreSame(e, a)
                      Assert.AreNotSame(e, a)        Assert.AreNotSame(e, a)
Null                  Assert.IsNull(a)               Assert.IsNull(a)
                      Assert.IsNotNull(a)            Assert.IsNotNull(a)

e – expected value
a – actual value

They’re similar, aren’t they?  Each of the assertions listed is functionally equivalent to its counterpart, but notice how the Greater and LessOrEqual assertions are handled in MSTest.  MSTest doesn’t provide assertion methods for these cases but instead relies on evaluating expressions to define the condition.  This difference, above all else, defines the divergence in philosophy between the two frameworks.  So why is this important?

Readability

Unit tests should be readable.  In unit tests we often break established conventions and/or violate the coding standards we use in our product code.  We sacrifice brevity in naming with Really_Long_Snake_Case_Names_So_They_Can_Be_Read_In_The_Test_Runner_By_Non_Developers.  We sacrifice DRY to keep code together.  All of these things are done in the name of readability.

The Readability Debate

Argument 1: A rich assertion model can unnecessarily complicate a suite of tests particularly when multiple developers are involved.

Rich assertion models make it possible to assert the same condition in a variety of ways, resulting in a lack of consistency.  Readability naturally falls out of a weak assertion model because the guesswork of figuring out which form of an assertion is being used is removed.

Argument 2: With a rich model there is no guesswork because assertions are literally spelled out as explicitly as they can be.

Assert.Greater(a, e) doesn’t require a mental context shift from English to parsing an expression.  The spelled-out statement of intent is naturally more readable for developers and non-developers alike.

My Position

I strongly agree with argument 2.  When I’m reading code I derive as much meaning from the method name as I can before examining the arguments.  “Greater” conveys more contextual information than “IsTrue.”  When I see “IsTrue” I immediately need to ask “What’s true?” then delve into an argument which could be anything that returns a boolean value.  In any case I still need to think about what condition is supposed to be true.
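To make the difference concrete, here is a hypothetical bound check written both ways.  The order object and minimum value are invented for illustration, and the two assertions would of course live in separate NUnit and MSTest test projects:

// NUnit: the method name carries the comparison
Assert.Greater(order.Total, minimumOrderAmount);

// MSTest: the comparison is buried inside a boolean expression
Assert.IsTrue(order.Total > minimumOrderAmount);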

NUnit takes expressiveness to another level with its constraint-based assertions.  The table below lists the same assertions as the table above when written as constraint-based assertions.

Equality/Inequality   Assert.That(a, Is.EqualTo(e))
                      Assert.That(a, Is.Not.EqualTo(e))
                      Assert.That(a, Is.GreaterThan(e))
                      Assert.That(a, Is.LessThanOrEqualTo(e))
Boolean Values        Assert.That(a, Is.True)
                      Assert.That(a, Is.False)
Reference             Assert.That(a, Is.SameAs(e))
                      Assert.That(a, Is.Not.SameAs(e))
Null                  Assert.That(a, Is.Null)
                      Assert.That(a, Is.Not.Null)

e – expected value
a – actual value

Constraint-based assertions are virtually indistinguishable from English.  To me this is about as readable as code can be.
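As a quick illustration, here is what a hypothetical test might look like using the constraint model.  The Customer class and its members are invented for the example; only the assertion style matters:

[Test]
public void New_Customer_Should_Have_No_Orders()
{
  // Arrange/Act: hypothetical domain code for illustration only
  var customer = new Customer("Jane Doe");

  // Assert: each constraint reads almost like an English sentence
  Assert.That(customer.Name, Is.EqualTo("Jane Doe"));
  Assert.That(customer.Orders, Is.Not.Null);
  Assert.That(customer.Orders.Count, Is.EqualTo(0));
  Assert.That(customer.IsActive, Is.True);
}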

Even the frameworks with a weak assertion model provide multiple ways of accomplishing the same task.  Is it not true that Assert.AreEqual(e, a) is functionally equivalent to Assert.IsTrue(e == a)?  Is it not also true that Assert.AreNotEqual(e, a) is functionally equivalent to Assert.IsTrue(e != a)?  Since virtually all assertions ultimately boil down to ensuring that some condition is true and throwing an exception when that condition is not true, shouldn’t weak assertion models be limited to little more than Assert.IsTrue(a)?

Clearly there are other considerations beyond readability when deciding upon a unit test framework, but given that much of the power of a framework comes from its assertion model, readability is among the most important.  To me, an expressive assertion model is just as important as the tools associated with the framework.

Your thoughts?


LINQ: IEnumerable to DataTable

Over the past several months I’ve been promoting LINQ pretty heavily at work.  Several of my coworkers have jumped on the bandwagon and are realizing how much power is available to them.

This week two of my coworkers were working on unrelated projects but both needed to convert a list of simple objects to a DataTable and asked me for an easy way to do it.  LINQ to DataSet provides wonderful functionality for exposing DataTables to LINQ expressions and converting the data into another structure but it doesn’t have anything for turning a collection of objects into a DataTable.  Lucky for us LINQ makes this task really easy.

First we need to use reflection to get the properties for the type we’re converting to a DataTable.

var props = typeof(MyClass).GetProperties();

Once we have our property list we build the structure of the DataTable by converting the PropertyInfo[] into DataColumn[].  We can add each DataColumn to the DataTable at one time with the AddRange method.

var dt = new DataTable();
dt.Columns.AddRange(
  props.Select(p => new DataColumn(p.Name, p.PropertyType)).ToArray()
);

Now that the structure is defined all that’s left is to populate the DataTable.  This is also trivial since the Add method on the Rows collection has an overload that accepts params object[] as an argument.  With LINQ we can easily build a list of property values for each object, convert that list to an array, and pass it to the Add method.

source.ToList().ForEach(
  i => dt.Rows.Add(props.Select(p => p.GetValue(i, null)).ToArray())
);

That’s all there is to it for collections of simple objects.  Those familiar with LINQ to DataSet might note that the example doesn’t use the CopyToDataTable extension method.  The main reason for adding the rows directly to the DataTable instead of using CopyToDataTable is that we’d be doing extra work.  CopyToDataTable accepts an IEnumerable<T> but constrains T to DataRow.  In order to make use of the extension method (or its overloads) we would still have to iterate over the source collection to convert each item into a DataRow, add each row to a collection, and then call CopyToDataTable with that collection.  By adding the rows directly to the DataTable we avoid the extra step altogether.
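For comparison, here is a rough sketch of the extra work CopyToDataTable would force on us, assuming the same props array and source collection from the snippets above:

// Sketch only: CopyToDataTable still needs a table to define the schema
// and to create the detached rows that will then be copied.
var template = new DataTable();
template.Columns.AddRange(
  props.Select(p => new DataColumn(p.Name, p.PropertyType)).ToArray()
);

// Build a detached DataRow for each source item...
var rows = source.Select(i =>
{
  var row = template.NewRow();
  row.ItemArray = props.Select(p => p.GetValue(i, null)).ToArray();
  return row;
});

// ...then copy those rows into yet another DataTable.
var dt = rows.CopyToDataTable();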

We can now bring the above code together into a functional example. To run this example open LINQPad, change the language selection to C# Program, and paste the code into the snippet editor.

class MyClass
{
  public Guid ID { get; set; }
  public int ItemNumber { get; set; }
  public string Name { get; set; }
  public bool Active { get; set; }
}

IEnumerable<MyClass> BuildList(int count)
{
  return Enumerable
    .Range(1, count)
    .Select(
      i =>
      new MyClass()
      {
        ID = Guid.NewGuid(),
        ItemNumber = i,
        Name = String.Format("Item {0}", i),
        Active = (i % 2 == 0)
      }
    );
}

DataTable ConvertToDataTable<TSource>(IEnumerable<TSource> source)
{
  var props = typeof(TSource).GetProperties();

  var dt = new DataTable();
  dt.Columns.AddRange(
    props.Select(p => new DataColumn(p.Name, p.PropertyType)).ToArray()
  );

  source.ToList().ForEach(
    i => dt.Rows.Add(props.Select(p => p.GetValue(i, null)).ToArray())
  );

  return dt;
}

void Main()
{
  var dt = ConvertToDataTable(
    BuildList(100)
  );

  // NOTE: The Dump() method below is a LINQPad extension method.
  //       To run this example outside of LINQPad this method
  //       will need to be revised.

  Console.WriteLine(dt.GetType().FullName);
  dt.Dump();
}

Of course there are other ways to accomplish this, and the full example has some holes, but it’s pretty easy to expand.  An obvious enhancement would be to rename the ConvertToDataTable method and change it to handle child collections and return a full DataSet.
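A very rough sketch of that enhancement might start out like the code below.  The ConvertToDataSet name and the child-table handling are purely hypothetical and would need real work (key columns, DataRelation setup, nesting depth) before being usable:

DataSet ConvertToDataSet<TSource>(IEnumerable<TSource> source)
{
  var ds = new DataSet();

  // Reuse the existing conversion for the parent table.
  var parent = ConvertToDataTable(source);
  parent.TableName = typeof(TSource).Name;
  ds.Tables.Add(parent);

  // Hypothetical next step: inspect typeof(TSource) for properties that
  // are themselves collections, convert each of those to its own
  // DataTable, and add a DataRelation tying child rows to parent rows.

  return ds;
}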

They Write the Right Stuff

A few days ago someone on Reddit linked to this Fast Company article about the team responsible for building the space shuttle’s on-board software. The main focus of the article is how this team of 260 people consistently releases virtually bug-free software.

This article was really timely for me given some of the Code Camp sessions I attended last weekend. Many of the key points from those sessions were reiterated for me.

Although most of us don’t write software that is used in, let alone controls, life-and-death situations, we as practitioners of a maturing industry could really benefit from studying and incorporating their practices. The article is a bit long but well worth the read.

Adobe Camera RAW vs Nikon Capture NX2

I’ve been using Adobe Camera RAW (ACR) ever since I started shooting RAW with my D40 in mid-2007.  I’ve always been pretty happy with the results, particularly after bouncing into Photoshop CS3 and making some additional adjustments such as changes in LAB mode, but recently I’ve been wondering what other software is available for manipulating the NEFs that come off of the D300.  Without too much effort I found a ton of sites talking about how Capture NX2 from Nikon is hands down the best editor for NEFs.  Nikon even offers a 60-day free trial of the software, so I decided to give it a shot.

I’ve spent a few hours each night for the past few days experimenting with NX2 and found myself seriously disappointed with the software each time.  The problem isn’t the quality of the output.  After seeing the results of the various adjustments such as white balance, noise reduction, Active D-Lighting, and a ton of other features, I dare not question the capabilities of the software.  It really is great at adjusting NEFs.  Where it really gets me is that it lengthens my workflow, it eats up more disk space, and it’s REALLY SLOW!

With few exceptions I always load the processed NEFs into Photoshop so at a minimum I can add a copyright watermark and a border treatment.  In order to fit NX2 into my workflow I’d need to do the processing in NX2, save the image as a TIFF, open the TIFF in Photoshop, do the appropriate processing, save the PSD, and then export the JPEG that will end up on flickr or a CD/DVD.  With my current workflow I just open ACR via Adobe Bridge, do my processing, let ACR generate a 5-10 KB XMP sidecar file, and proceed into Photoshop.  Generally speaking, the results with this process are (IMHO) fantastic and I don’t have a 70+ MB TIFF sitting alongside a 90+ MB PSD.  Granted, I could delete the TIFF when I’m done with it, but that would be adding yet another step to the process.  The real deal breaker for me, though, is how insanely slow NX2 really is!

I’ve seen some posts that discuss how NX2’s UI is a bit cumbersome.  I really didn’t think the UI was the problem.  After a bit of poking around I found most of the basic adjustments to be fairly intuitive.  The UI wasn’t what slowed me down.  What really slowed me down was how long it took NX2 to complete ANY operation.  Changing white balance?  Wait a few minutes.  Setting the black point?  Wait a few minutes.  Zooming in?  Wait a few minutes.  Applying noise reduction?  Go watch TV.

Maybe the slowness of this application would be more tolerable to me if I wasn’t already used to the speed of ACR.  I’ll admit that my laptop is a few years old but these same adjustments in ACR are nearly instantaneous!  I obviously don’t know what’s going on under the hood of these two apps but if Adobe’s generic RAW editor can be as good as it is I would think that Nikon could create a specialized NEF editor that would be much better.

My experience this past week with Capture NX2 has left me thinking that Nikon needs to release the full details of the NEF format, get out of the desktop software market, and let companies like Adobe that have proven themselves handle the desktop utilities.  It was bad enough when I opened the box for my D300 and pulled out the software suite CD.  I paid $1700 for a D300 and all Nikon is going to give me are View NX and Kodak EasyShare?  How am I supposed to do anything with 14-bit NEFs with those???  And then they want $180 for software that would eat more of my time and storage space?  W…T…F???

In the meantime, I think I’m going to download the trial version of Adobe Lightroom 2.0.  I’ve seen demos for Lightroom 1 and have liked everything I’ve seen.  I’m thinking it could streamline my workflow a bit.  That, and I hate Bridge too…but that’s another topic for another time.

T-Bird Tail Light

Today was another good day for my photography.  One of my photos won the June 2008 Assignment: Indiana contest!  The topic for June was Vintage Vehicles and I snapped a shot of the tail light of a 1956 Ford Thunderbird while Esther and I were at the 1st annual Carmel Artomobilia event in Carmel’s Arts and Design District.

Thanks to everyone that voted for this shot in June.  I’m looking forward to the July contest.

T-Bird Tail Light

This shot was taken at f/8 for 1/160 sec.  Only the usual color adjustments were applied in Photoshop.

Streaks in the Sky

For the first time in a long time I tried my hand at capturing some lightning.  I’ve tried very unsuccessfully in the past to get a lightning shot that I could be proud of but tonight was another story.

We had yet another not so insignificant storm pass through the area tonight.  At first I tried shooting through one of the west windows in our sunroom but by the time I got my camera set up most of the lightning had moved to the other side of the house.  I moved my gear to an east facing dining room window but the screen and trees were causing other troubles.  I was about to put everything away when I decided to move into the garage (also east facing).

At first I wasn’t having much luck.  I had been trying to limit my scope to one of the houses across the street (and the sky above it obviously) since I had been seeing a fair amount of activity in that general direction.  After several unsuccessful attempts I changed my approach and zoomed back out to 18mm and widened my field to nearly the entire cul-de-sac.  Immediately after repositioning and opening the shutter I was presented with a perfect flash that I knew was right across the top third of my frame.  I left the shutter open for a few more seconds before closing it.  When the review came up I took one look at it and said to myself “this is the shot I’ve been waiting for,” packed up my gear, and headed inside satisfied that I FINALLY got the lightning shot I’ve wanted since I got my D40.

Streaks in the Sky

For anyone interested in the technical information about this shot, it was taken at f/8 for 23 seconds.

And The Winner Is…

I entered four photos into the novice division of the September 2007 Photo Venture Camera Club competition. My two on-topic (self-portrait) entries didn’t fare too well since the judge didn’t like the inclusion of the camera in the shot but I swept the off-topic color category. Here are my winning entries:

Leafscape – Second Place

Leafscape

Bored Dog – First Place

Bored Dog