
Debugging vs. Logging

posted Jan 29, 2011, 2:42 AM by Michael Schollmeyer
Among the pros and cons of debugging vs. logging, you might consider that debugging should be avoided because it constantly breaks DRY. This article explains why.

A debugger is a very powerful tool to find out what is really going on in your application. 

When your application is hosted by a virtual machine, you can stop any thread at any source code line and start inspecting the system.
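To give a rough idea, a session with jdb, the command-line debugger that ships with the JDK, might look like this (the class name and line number are made up for illustration):

    $ jdb com.example.PrintApp
    > stop at com.example.PrintApp:42
    > run
    main[1] locals
    main[1] print title
    main[1] cont

After the breakpoint is hit, you can inspect local variables, evaluate expressions, and resume execution. Note that none of these commands are stored anywhere; they live only in that session.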

A quite different approach to tracing what an application does at runtime is logging. There are several powerful logging frameworks available.
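As a minimal sketch of the logging approach, using java.util.logging from the standard library; PrintService and spool() are made-up names for illustration, not taken from any real product:

    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class PrintService {

        private static final Logger LOG = Logger.getLogger(PrintService.class.getName());

        public void print(String title) {
            LOG.fine("printing " + title);      // this statement stays in the code base
            try {
                spool(title);
                LOG.fine("printed " + title);
            } catch (RuntimeException e) {
                // The stack trace ends up in the log file, even on a machine
                // where no debugger can be attached.
                LOG.log(Level.SEVERE, "printing failed for " + title, e);
                throw e;
            }
        }

        // Stands in for the real printing code.
        private void spool(String title) {
            // ...
        }
    }

Unlike breakpoints and watch expressions, these statements are part of the source and travel with it.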

Obviously, there are some constraints beyond your control: for example, you might need to find a bug that only occurs on a customer's computer where you cannot start a debugger. On the other hand, your target machine might be embedded and unable to write a log file.

As both approaches tackle the same problem, how do you decide which one to use? I recently worked on a product that has been on the market for a couple of years. A customer complained that printing stopped working at some point. We started analyzing the problem, and after some hours of debugging I was able to find the root cause. We had had a similar problem some years back, and I had spent about the same amount of time solving it then.

Because I was using a debugger, I had no way to store the breakpoints and watch windows from that earlier session and reuse them now, and I cannot preserve them this time either. All the work I spend setting up an environment to make a bug visible silently vanishes the moment I close my IDE.

This is a waste of time and effort, and it also breaks one of the general rules of development: DRY, Don't Repeat Yourself. If, on the other hand, you set up logging, all your log statements will still be in your code. If you also write a unit test to reproduce the erroneous behavior, you will have everything in your code base. Nothing will be lost.
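A minimal sketch of such a test, using JUnit 4 and the hypothetical PrintService from above; the failing input is invented for illustration:

    import org.junit.Test;

    public class PrintServiceTest {

        // Reproduces the reported behavior; together with the log statements
        // in PrintService, the whole diagnostic setup survives in the code base.
        @Test
        public void printsDocumentWithEmptyTitle() {
            new PrintService().print("");   // hypothetical input that triggered the bug
        }
    }

The next time a similar report comes in, the test and the log output are the starting point rather than an empty debugger session.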