Adventures in Remoting

I was recently given a “special” project that involved .NET Remoting.  One of our customers requires that security be enabled on the remoting channel between a DMZ server and an internal server.  Normally this wouldn’t be a problem: just edit the config files to enable security on the client and server channels and include some credentials on the client channel.  Unfortunately it’s never that simple.  Our customer has a company-wide policy requiring that any credentials stored in a configuration file be encrypted.
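For context, the straightforward (but policy-violating) client-side configuration looks roughly like this. This is a sketch with placeholder names, not our actual config; the plain-text password is exactly what the customer’s policy forbids:

```xml
<!-- Hypothetical client App.config; account name and password are placeholders -->
<configuration>
  <system.runtime.remoting>
    <application>
      <channels>
        <!-- secure="true" enables channel security; credentials sit in plain text -->
        <channel ref="tcp" secure="true"
                 username="svc_remoting" password="PlainTextSecret" />
      </channels>
    </application>
  </system.runtime.remoting>
</configuration>
```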

My first thought was to encrypt the credentials using aspnet_regiis or aspnet_setreg.  It turns out that neither of these utilities supports encrypting the <system.runtime.remoting> section of configuration files!  This seems odd since other sections that can contain credentials can be encrypted with these utilities, but OK, what other options do I have?

Up until this point my experience with remoting has been limited to… OK, you got me.  I have no remoting experience beyond reading the configuration information for a few well-known objects.  I quickly realize that I’m going to have to get deeper into remoting than I ever cared to.  I’ve already determined that the encrypted credentials are going to need to be managed by one of our configuration utilities and somehow programmatically set on the channel.  The problem is how?

My first idea was that since all of the remoted objects derive either directly or indirectly from MarshalByRefObject, I could create a utility method that would read the encrypted credentials from our configuration utility, decrypt them, and then apply them to the channel sink properties retrieved from ChannelServices.GetChannelSinkProperties for that MarshalByRefObject instance.  This method would be called from the constructor of each class deriving from MarshalByRefObject.  The effort soon proved futile: little did I know, the constructor isn’t actually called locally, so the credentials would never be applied to the channel and the remoting request would still fail.

At this point I feel like I’m running out of options but keep searching and discover that I can create a custom channel sink provider and a custom channel sink.  The documentation I’ve found is scattered but I think I have enough to scrape up something that will work.  One particularly important tidbit I found read:

“On the client side, custom channel sinks are inserted into the chain of objects between the formatter sink and the last transport sink.”

What this meant to me was that I could count on the last channel sink always being the transport sink, which is where the credentials needed to be set.  If I inserted a custom sink provider into the chain via the configuration file, I could use my provider to walk to the end of the channel sink chain and set the appropriate properties!  After about 20-25 minutes I have a class implementing IClientChannelSinkProvider inserted into the provider sink chain and setting credentials exactly as our customer requires.
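Reconstructed from memory, the provider looked roughly like this. It is a sketch, not our production code: the decryption call is a placeholder for whatever configuration utility you use, and the "username"/"password" property names assume the TCP transport sink.

```csharp
using System;
using System.Collections;
using System.Runtime.Remoting.Channels;

// Registered in the client config's <clientProviders> element, e.g.:
//   <provider type="CredentialSinkProvider, MyAssembly" />
public class CredentialSinkProvider : IClientChannelSinkProvider
{
    private IClientChannelSinkProvider _next;

    // Remoting calls this constructor when the provider is declared in config.
    public CredentialSinkProvider(IDictionary properties, ICollection providerData) { }

    public IClientChannelSinkProvider Next
    {
        get { return _next; }
        set { _next = value; }
    }

    public IClientChannelSink CreateSink(IChannelSender channel, string url,
                                         object remoteChannelData)
    {
        // Let the rest of the provider chain build its sinks first.
        IClientChannelSink chain = _next.CreateSink(channel, url, remoteChannelData);

        // Walk to the last sink in the chain -- per the docs, the transport sink.
        IClientChannelSink last = chain;
        while (last.NextChannelSink != null)
            last = last.NextChannelSink;

        // Decrypt the stored credentials and apply them to the transport sink.
        last.Properties["username"] = GetDecryptedCredential("username");
        last.Properties["password"] = GetDecryptedCredential("password");

        return chain;
    }

    // Placeholder: read the encrypted value from your configuration
    // store and decrypt it here.
    private static string GetDecryptedCredential(string name)
    {
        throw new NotImplementedException();
    }
}
```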

I’m really happy with the simplicity of this solution.  I only wish that I had found it sooner rather than spinning my wheels on updating constructors that would never be called on the client.  Not that I can do anything about it now but for future reference was there a better way?

They Write the Right Stuff

A few days ago someone on Reddit linked to this Fast Company article about the team responsible for building the space shuttle’s on-board software. The main focus of the article is how this team of 260 people consistently releases virtually bug-free software.

This article was really timely for me given some of the Code Camp sessions I attended last weekend. Many of the key points from those sessions were reiterated for me.

Although most of us don’t write software that is not only used in but also controls life-and-death situations, we as practitioners of a maturing industry could really benefit from studying and incorporating their practices. The article is a bit long but well worth the read.

Indy Code Camp Notes, Part 4

This is part of a four part series of notes I took at Indy Code Camp on May 16, 2009. This year’s Code Camp consisted of five tracks each with five sessions. Track topics were SharePoint, Silverlight and WPF, Data Access, MIX highlights, and best practices. These notes are from the sessions I attended.

These notes are in no way intended to replace attending one of these talks if you have the chance.

Test Driven is Driving Me Insane!

Presented By: Dennis Burton

Dennis’s talk focused on road-blocks to effective Test Driven Development and some common, practical test patterns. My notes are a little sparse on this one since it was the last session of the day and my mind was wandering a bit ;) He had a lot of good examples that I wish I could remember a bit better.

Road-blocks To Effective Up-front Testing

  • Management push-back
    • “Double the code? Are you kidding?!”
    • Project timeline impact due to code changes taking longer while tests are updated
  • Test design issues
    • Long-running tests – ideally each test should run in less than a second
    • Long-setup time
    • Fragile tests – more work is required to keep tests running when changes are made
    • Data dependency issues – tests rely heavily on specific data, often from a production environment

Common Patterns

  • Dummy Pattern – use when an object is needed only to help construct another object
  • Stub Pattern – use when an object is needed to help construct another object and its state must be verified
  • Mock Pattern – use when testing functionality instead of specific data
  • Spy Pattern – use to add validation to an object that does not implement the required validation. This is typically implemented as a wrapper class that passes values into the type being tested
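As a rough illustration of the spy idea (my own hypothetical example, not one from the talk), a spy wraps a dependency that offers no verification hooks and records what passes through it so the test can assert on it afterward:

```csharp
using System;
using System.Collections.Generic;

// Dependency with no built-in way to verify how it was used.
public interface IMessageSender
{
    void Send(string message);
}

// Spy: stands in for the real sender and records calls for later assertions.
public class MessageSenderSpy : IMessageSender
{
    public readonly List<string> SentMessages = new List<string>();

    public void Send(string message)
    {
        SentMessages.Add(message);
    }
}

// Code under test: we want to verify what it sends.
public class AlertService
{
    private readonly IMessageSender _sender;
    public AlertService(IMessageSender sender) { _sender = sender; }

    public void RaiseAlert(string detail)
    {
        _sender.Send("ALERT: " + detail);
    }
}

public static class Program
{
    public static void Main()
    {
        var spy = new MessageSenderSpy();
        new AlertService(spy).RaiseAlert("disk full");

        // The spy exposes behavior the interface itself never did.
        Console.WriteLine(spy.SentMessages[0]); // prints "ALERT: disk full"
    }
}
```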

Recommendations

Indy Code Camp Notes, Part 3

This is part of a four part series of notes I took at Indy Code Camp on May 16, 2009. This year’s Code Camp consisted of five tracks each with five sessions. Track topics were SharePoint, Silverlight and WPF, Data Access, MIX highlights, and best practices. These notes are from the sessions I attended.

These notes are in no way intended to replace attending one of these talks if you have the chance.

Being More Than a Code Monkey: Practicing Beautiful Code

Presented By: Michael Wood – Strategic Data Systems

In this session Michael described some common traits of “ugly” code and presented some guidelines as to what makes code beautiful. Clean Code by Bob Martin is recommended reading for more information on this topic.

Beautiful Code is NOT

  • Hard to follow
  • Hard to extend
  • Something you have to slog through
  • Done with brute force (If it’s really hard to do you’re probably doing it wrong)
  • Code that people run from

Why Should I Care?

  • Working with ugly code can negatively impact project timelines. If code is difficult to understand it will take subsequent developers longer to get up to speed before they can maintain it. It can also have cross-project impact if previous developers need to be brought in to help explain the code.
  • Code that is difficult to extend will likely require additional refactoring which can possibly seep into additional areas.
  • Like broken windows, bad code tends to beget bad code. High quality standards need to be maintained consistently across the application in order to be effective.

Traits of Beautiful Code

  • Self documenting – types and their members should be named according to their purpose. Comments are nice but need to be maintained to be effective as code is changed. By writing self-documenting code the need for comments is significantly reduced if not eliminated.
  • Avoids magic numbers – anything that could be a magic number should be defined as a constant or an enum member. Similarly, consider using an abstract class with constant strings to simulate an enum and avoid hard-coded strings inline.
  • Exceptions carry additional information – the Exception class has a Message property and an InnerException property…USE THEM!
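A quick sketch of the string-constants idea (type and member names are my own, purely illustrative):

```csharp
using System;

// Simulates an enum of strings: callers write Status.Approved
// instead of scattering the literal "APPROVED" through the code.
public abstract class Status
{
    public const string Pending  = "PENDING";
    public const string Approved = "APPROVED";
    public const string Rejected = "REJECTED";

    private Status() { }  // abstract + private ctor: never instantiated
}

public static class Program
{
    public static void Main()
    {
        string current = Status.Approved;
        Console.WriteLine(current); // prints "APPROVED"
    }
}
```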

Code Monkey Song by Jonathan Coulton

Michael mentioned this song and video in his presentation so I thought I’d include it. Enjoy!

Indy Code Camp Notes, Part 2

This is part of a four part series of notes I took at Indy Code Camp on May 16, 2009. This year’s Code Camp consisted of five tracks each with five sessions. Track topics were SharePoint, Silverlight and WPF, Data Access, MIX highlights, and best practices. These notes are from the sessions I attended.

These notes are in no way intended to replace attending one of these talks if you have the chance.

Improving Our Craft: A Discussion on Software Estimation

Presented By: Michael Eaton – Validus Solutions

Michael started his talk by tossing out some statistics about project success rates and defining some terms. He then moved on to discussing why we’re so bad at giving good estimates and wrapped up with some ideas for how to improve the estimates we do give. Unfortunately I wasn’t able to capture the exact stats, but they were eye-opening to say the least: in 2006, only 32% of projects were considered successful.

Terms

  • Estimate
    • Tentative evaluation or rough calculation
    • Preliminary calculation of cost
    • Initial judgement
  • Target
    • Statement of desirable objective
  • Commitment
    • A promise to deliver

Understanding Estimates

  • The challenge is to determine whether we are supposed to be estimating or projecting to hit a target. If we’re projecting to hit a target do we produce what was expected?
  • Estimates will, by their nature, be wrong. We’re providing estimates, not exactimates.
  • A good estimate is one that provides a clear enough view of the project reality to allow the project leadership to make good decisions regarding targets.

Why Are We So Bad?

  • Unknown problem domain
  • Vague, missing, or large requirements
  • We forget stuff
  • Overconfidence
  • Insufficient education or practice making estimates
  • Never given a chance to succeed (set-up for failure)

The Cone of Uncertainty

The Cone of Uncertainty states that the actual duration of a project can be as much as four times as long, or as little as one quarter as long, as the initial estimate.

What Can We Do?

  • Stop giving off-the-cuff estimates
  • Communicate
  • Incorporate Agile practices
    • Iterations
    • Daily Stand-ups
  • Decompose the problem into smaller chunks and estimate the chunks
  • Let developers estimate their own tasks rather than having a manager or lead developer provide estimates for the team
  • Estimate end-to-end tasks
  • Learn from your mistakes
  • Keep a log of estimates and periodically review it

Indy Code Camp Notes, Part 1

This is part of a four part series of notes I took at Indy Code Camp on May 16, 2009. This year’s Code Camp consisted of five tracks each with five sessions. Track topics were SharePoint, Silverlight and WPF, Data Access, MIX highlights, and best practices. These notes are from the sessions I attended.

These notes are in no way intended to replace attending one of these talks if you have the chance.

Care About Your Craft: Adventures in the Art of Software Development

Presented by: Tim Wingfield – Quick Solutions

Tim’s talk focused on how to get better results out of each phase of the SDLC. Included here are key points from each section.

Design

  • TDD should not be thought of only as Test Driven Development but also as Test Driven Design
  • Design with the SOLID Principles in mind
    • Single Responsibility Principle
    • Open/Closed Principle
    • Liskov Substitution Principle
    • Interface Segregation Principle
    • Dependency Inversion Principle
  • Design by Contract – TDD can actually force us to do this
  • Avoid programming by coincidence – Understand how and why something works
  • Don’t be a plumber – Chances are good that someone else has had to solve the same technical problem, so there might already be a framework you can use, letting you focus on the actual business problem at hand
  • Justify technology use – Don’t use a technology just because it’s new, use it because it better solves the problem
  • You Ain’t Gonna Need It (YAGNI)
    • Don’t waste time building features that might be used

Development

  • Quality is not an accident – build it into everything starting with requirements
  • Don’t Repeat Yourself (DRY)
    • Reuse code and avoid copy/paste programming
  • Automate wherever possible
    • Build servers are great for automation
    • Use code generators to speed up development by automating common coding tasks such as creating properties, creating method stubs, and refactoring.
  • Avoid finger pointing by taking collective ownership – We’re all in it together
    • OUR build broke
    • OUR defect
  • Don’t live with broken windows
    • Bad code begets bad code
    • Fix problems when they’re found (time permitting) rather than letting them propagate
  • Know when enough is enough
    • Perfect is nice but software has to ship sometime

Debugging

  • Someone (QA or a customer) will find a defect
  • When a defect is found, don’t panic
  • Think about the solution before rushing to action
  • Avoid placing the blame and work toward the best solution

Continuous Improvement

  • Always keep learning
  • Maintain a positive attitude
  • Be a mentor
  • Create a culture geared toward continuous improvement
    • Schedule lunch ‘n’ learns
    • Have book reviews

Adobe Camera RAW vs Nikon Capture NX2

I’ve been using Adobe Camera RAW (ACR) ever since I started shooting RAW with my D40 in mid-2007.  I’ve always been pretty happy with the results, particularly after bouncing into Photoshop CS3 for some additional adjustments such as changes in LAB mode, but recently I’ve been wondering what other software is available for manipulating the NEFs that come off of the D300.  Without too much effort I found a ton of sites talking about how Capture NX2 from Nikon is hands down the best editor for NEFs.  Nikon even offers a 60-day free trial of the software, so I decided to give it a shot.

I’ve spent a few hours each night for the past few days experimenting with NX2 and found myself seriously disappointed with the software each time.  The problem isn’t the quality of the output.  After seeing the results of the various adjustments such as white balance, noise reduction, Active D-Lighting, and a ton of other features, I dare not question the capabilities of the software.  It really is great at adjusting NEFs.  Where it really gets me is that it lengthens my workflow, it eats more disk space, and it’s REALLY SLOW!

With few exceptions I always load the processed NEFs into Photoshop so that at a minimum I can add a copyright watermark and a border treatment.  In order to fit NX2 into my workflow I’d need to do the processing in NX2, save the image as a TIFF, open the TIFF in Photoshop, do the appropriate processing, save the PSD, and then export the JPEG that will end up on flickr or a CD/DVD.  With my current workflow I just open ACR via Adobe Bridge, do my processing, let ACR generate a 5-10K XMP sidecar file, and proceed into Photoshop.  Generally speaking, the results with this process are (IMHO) fantastic and I don’t have a 70+MB TIFF sitting alongside a 90+MB PSD.  Granted, I could delete the TIFF when I’m done with it, but that would add yet another step to the process.  The real deal breaker for me, though, is how insanely slow NX2 really is!

I’ve seen some posts that discuss how NX2’s UI is a bit cumbersome.  I really didn’t think the UI was the problem.  After a bit of poking around I found most of the basic adjustments to be fairly intuitive.  The UI wasn’t what slowed me down.  What really slowed me down was how long it took NX2 to complete ANY operation.  Changing white balance?  Wait a few minutes.  Setting the black point?  Wait a few minutes.  Zooming in?  Wait a few minutes.  Applying noise reduction?  Go watch TV.

Maybe the slowness of this application would be more tolerable to me if I wasn’t already used to the speed of ACR.  I’ll admit that my laptop is a few years old but these same adjustments in ACR are nearly instantaneous!  I obviously don’t know what’s going on under the hood of these two apps but if Adobe’s generic RAW editor can be as good as it is I would think that Nikon could create a specialized NEF editor that would be much better.

My experience this past week with Capture NX2 has left me thinking that Nikon needs to release the full details of the NEF format, get out of the desktop software market, and let companies like Adobe that have proven themselves handle the desktop utilities.  It was bad enough when I opened the box for my D300 and pulled out the software suite CD.  I paid $1700 for a D300 and all Nikon is going to give me are View NX and Kodak EasyShare?  How am I supposed to do anything with 14-bit NEFs with those???  And then they want $180 for software that would eat more of my time and storage space?  W…T…F???

In the meantime, I think I’m going to download the trial version of Adobe Lightroom 2.0.  I’ve seen demos of Lightroom 1 and liked everything I saw.  I’m thinking it could streamline my workflow a bit.  That, and I hate Bridge too…but that’s another topic for another time.

T-Bird Tail Light

Today was another good day for my photography.  One of my photos won the June 2008 Assignment: Indiana contest!  The topic for June was Vintage Vehicles and I snapped a shot of the tail light of a 1956 Ford Thunderbird while Esther and I were at the 1st annual Carmel Artomobilia event in Carmel’s Arts and Design District.

Thanks to everyone that voted for this shot in June.  I’m looking forward to the July contest.

T-Bird Tail Light

This shot was taken at f/8 for 1/160 sec.  Only the usual color adjustments were applied in Photoshop.

Streaks in the Sky

For the first time in a long time I tried my hand at capturing some lightning.  I’ve tried very unsuccessfully in the past to get a lightning shot that I could be proud of but tonight was another story.

We had yet another not so insignificant storm pass through the area tonight.  At first I tried shooting through one of the west windows in our sunroom but by the time I got my camera set up most of the lightning had moved to the other side of the house.  I moved my gear to an east facing dining room window but the screen and trees were causing other troubles.  I was about to put everything away when I decided to move into the garage (also east facing).

At first I wasn’t having much luck.  I had been trying to limit my scope to one of the houses across the street (and the sky above it obviously) since I had been seeing a fair amount of activity in that general direction.  After several unsuccessful attempts I changed my approach and zoomed back out to 18mm and widened my field to nearly the entire cul-de-sac.  Immediately after repositioning and opening the shutter I was presented with a perfect flash that I knew was right across the top third of my frame.  I left the shutter open for a few more seconds before closing it.  When the review came up I took one look at it and said to myself “this is the shot I’ve been waiting for,” packed up my gear, and headed inside satisfied that I FINALLY got the lightning shot I’ve wanted since I got my D40.

Streaks in the Sky

For anyone interested in the technical information about this shot, it was taken at f/8 for 23 seconds.

Eclipse Photos

Wow! First off, I didn’t realize that it has been since the end of September that I posted anything here. So much for keeping this updated!

As many people know, there was a full lunar eclipse last night. For some reason I decided to brave the cold to watch it. I set up the tripod and managed to snap off several great shots of various stages of the eclipse. What really amazed me though was how quickly the photos were hit on flickr after I posted them!

In all of the time I have been posting images to flickr I have never seen a response like this. Up until last night my most viewed photo was a shot I titled ‘Round and ‘Round We Go that I uploaded back in August. That shot had 67 views as of last night. Four of the eclipse shots approached or exceeded that number in less than 24 hours!

The four eclipse shots that gave my view counts such a great boost are included here for your viewing pleasure. Each photo is linked to the flickr page if you’d like to see a higher resolution version or comment on any of them.

Blood Moon (46 views)
Blood Moon

Take a Bite out of the Moon (48 views)
Take a Bite out of the Moon

Flared Moon (49 views)
Flared Moon

Eclipse in HDR (76 views)
Eclipse in HDR