A trying post

ctodx
03 May 2006

I was looking through some code recently that did this with a Hashtable:

 if (documentCache.ContainsKey(key))
   doc = documentCache[key] as XmlDocument;
 else {
   doc = LoadDocument(key); // via long winded call across a network
   documentCache.Add(key, doc);
 }
What's the problem here? 

For me, the thing that leaps out is that we search twice for the key. We search for the key the first time when we call ContainsKey, and we search for it again when this returns true and we read the value for the key using the array access syntax.

Now, this is a hash table. The search algorithm used is O(1), a constant time operation. So it doesn't really matter how many items are in the hash table (modulo some hand-wavy talk about memory and the hardware cache): we're going to find the key in roughly constant time. But we are still doing twice the work.

Better perhaps to do this:

 doc = documentCache[key] as XmlDocument;
 if (doc == null) {
   doc = LoadDocument(key); // via long winded call across a network
   documentCache.Add(key, doc);
 }

Here we only do the one search for the key value. The nice thing about the Hashtable class is that if the key is not found the array access operator will return null for us. So we've improved the code's efficiency a little bit.

Unfortunately, if documentCache were an instance of the generic Dictionary class instead -- we've perhaps upgraded our code to .NET 2.0 for better efficiency -- this code would break. The reason is that if the key is not found, the indexer's getter throws an exception rather than returning null. We could write this, I suppose:

 try {
   doc = documentCache[key] as XmlDocument;
 }
 catch {
   doc = LoadDocument(key); // via long winded call across a network
   documentCache.Add(key, doc);
 }

But that gives me the complete shivers. An open catch like that? Brrr. We could catch the instance of the proper exception class that's thrown and make it slightly better, but even so. It's pretty nasty to write this kind of code where we can imagine the catch is going to be thoroughly exercised.
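Catching the specific exception would look something like this (a sketch that assumes documentCache is now a generic Dictionary<string, XmlDocument>, whose indexer throws KeyNotFoundException when the key is missing and which also makes the as cast unnecessary):

 try {
   doc = documentCache[key];
 }
 catch (KeyNotFoundException) {
   doc = LoadDocument(key); // via long winded call across a network
   documentCache.Add(key, doc);
 }

It's better than the bare catch, but we're still using an exception to handle what is really just normal control flow.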

So it looks as if we should revert to the previous version, check that the key exists before we try to retrieve its value, and suck up the inefficiency of the double search.

Not so fast. The BCL team invented a nice design pattern for .NET 2.0 and applied it in a lot of places in the Framework. The pattern, for want of a better phrase, is the Try pattern. Essentially, the Try pattern applies to the "try this and catch if it didn't work" situation by removing the need for the try..catch.

Our code becomes this:

 if (!documentCache.TryGetValue(key, out doc)) {
   doc = LoadDocument(key); // via long winded call across a network
   documentCache.Add(key, doc);
 }

And we remove the need for the exception handling completely.
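Pulled together, a minimal sketch of the whole caching routine might look like the following (assuming the cache is a Dictionary<string, XmlDocument> and that LoadDocument merely stands in for the real long-winded network call):

 using System.Collections.Generic;
 using System.Xml;

 class DocumentCache {
   private readonly Dictionary<string, XmlDocument> documentCache =
     new Dictionary<string, XmlDocument>();

   public XmlDocument GetDocument(string key) {
     XmlDocument doc;
     // one search only: TryGetValue reports success and hands back the value
     if (!documentCache.TryGetValue(key, out doc)) {
       doc = LoadDocument(key); // via long winded call across a network
       documentCache.Add(key, doc);
     }
     return doc;
   }

   private XmlDocument LoadDocument(string key) {
     // stand-in for the expensive call across the network
     XmlDocument doc = new XmlDocument();
     doc.LoadXml("<empty/>");
     return doc;
   }
 }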

Another great example: the primitive types now have TryParse methods that try to convert a string to the relevant type. You no longer have to wrap your conversions from strings to integers in try..catch blocks, which makes the failure case far more efficient than it was in .NET 1.0 and 1.1 because no exception has to be thrown and caught.

In other words, instead of writing something like this in the old days:

  DateTime result;
  try {
    result = DateTime.Parse(inputString);
  }
  catch {
    result = defaultValue;
  }
 
You can write something like this instead:

  DateTime result;
  if (!DateTime.TryParse(inputString, out result))
    result = defaultValue;
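
The same shape works for the integer types too; for example (the fallback value of zero here is just for illustration):

 int count;
 if (!int.TryParse(inputString, out count))
   count = 0;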
