How long does it take to compile? We don’t ask that; it would be absurd. The fact that people ask how long unit testing takes means they see it as an optional cost to be incurred. What I want to know is why they don’t ask for a similar accounting of the cost of NOT writing unit tests!
I think the “cost” of unit testing falls into three categories:
1) The early days/training/learning
2) The ongoing cost
3) The cost of not testing/cost of errors
The early days/training/learning
The first time you do something, it takes longer. This is why we pay experienced people more. It is expected there is a training or learning cost. If an activity is worthwhile, one recoups this cost quickly. It is called an investment.
The problem I see is that some teams don’t get past this point. They see unit testing taking longer the first time and imagine it will take that long forever. I wonder how these people ever learned Java or regular expressions or anything else. The difference, I suppose, is that they wanted to learn that hard tech. If one believes the current buggy state of affairs works, or can’t imagine a better way, staying motivated to get over the learning hump is difficult. This is why having a mentor or coach promoting unit testing is helpful.
The ongoing cost
When writing unit tests as part of the task, it is difficult to measure the amount of time they take. I really can’t tell you what percentage of development time I spend “writing tests” because it occurs at the same time as the other parts of the task. I also couldn’t account for the percentage of time I spend typing, thinking, compiling, etc. These things occur simultaneously.
Many of the people who complain that the ongoing cost of testing is too high are treating it as a separate activity. They write the code, manually test it, debug, repeat, and then write a unit test. Done that way, the entire cost of writing the test is extra, and often resented, because they really are “done” before they start writing the tests. In this case, the unit tests only have value as regression protection, not as part of the development process. While I suppose it is better than nothing, I find this a way to make writing tests more costly than it needs to be.
I’m not saying one has to use TDD (test driven development) to see a benefit. Using the unit tests as a replacement for that manual testing turns them into a cost you need to incur anyway. Granted, it takes a little longer to write a test than to test by hand. But the difference is minimal, and it is offset by far fewer regression issues and the ability to test error conditions easily. And if you write tests in close proximity to the code, it is even faster.
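To make the manual-versus-automated comparison concrete, here is a minimal sketch in plain Java. The method and class names are hypothetical, and a real project would use a framework like JUnit rather than hand-rolled checks; the point is only that the error-condition check, which is tedious to reproduce by hand after every change, runs automatically on every build:

```java
// Hypothetical example: a small parsing method plus the checks that
// replace the manual "run it, type input, eyeball the output" loop.
public class AgeParserTest {

    // Hypothetical production method under test.
    static int parseAge(String input) {
        int age = Integer.parseInt(input.trim());
        if (age < 0 || age > 150) {
            throw new IllegalArgumentException("age out of range: " + age);
        }
        return age;
    }

    public static void main(String[] args) {
        // Happy path: the one check a manual tester usually performs.
        if (parseAge(" 42 ") != 42) {
            throw new AssertionError("expected 42");
        }

        // Error condition: trivial to automate, easy to skip by hand.
        boolean threw = false;
        try {
            parseAge("-5");
        } catch (IllegalArgumentException expected) {
            threw = true;
        }
        if (!threw) {
            throw new AssertionError("negative age should be rejected");
        }

        System.out.println("all checks passed");
    }
}
```

In a real test framework the try/catch boilerplate shrinks to a one-liner (e.g. JUnit’s assertThrows), which is part of why automated error-condition testing ends up cheaper than it looks.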
There’s another measurement issue going on here. Yes, writing a test takes longer than testing manually ONCE. If you are writing a script that will only be run once, it isn’t worth it. But how many times have you had an application that was written once and then never touched again?
The cost of not testing/cost of errors
This is the part that really bothers me. There is a cost to not doing an activity. It tends to be swept under the rug and treated as a cost of doing business. Opportunity cost is hard to see, but it still exists. The two biggest costs I see are finding errors late and regression errors in future releases.
- We’ve all seen the curve showing that fixing errors late is much more expensive than finding them in development. Unfortunately, developers get to claim they are “done” when they have really just moved the errors to later. The errors become more expensive, but the developer gets to claim the task was finished in X days. Managers need to stop allowing this.
- The future release problem is even harder. I think much of the value of unit tests comes from the future. Some developers claim they don’t need unit testing because they produce high-quality code without it. Sometimes this is even true. However, what happens when that developer looks at the code in a year, or leaves the team? The unit tests are code and live on. My development velocity on maintenance/enhancement tasks is much higher with unit tests because I don’t have to reconstruct what I was thinking a year ago when I wrote the code.
What if my management still needs a cost?
This blog post was inspired by someone asking me this question: “What would you say is the average percentage of development time devoted to creating the unit test scripts?” While I take issue with the question, I think he is still going to have it. So here are links to three articles/webpages that put numbers on it.
- Misko Hevery comes up with a figure of 10%. He calls it a 10% tax and points out the benefits that come with a tax. Note that he writes tests as an integral part of his process and is fluent in doing so. He also actively dispels the myth that testing takes twice as long.
- Brian Johnston discusses costs. I like that he covers the cost of not testing too. While he picks extreme figures (based on the worst-case myth), he does cover risks and things to take into account in your own shop. He also discusses the hardest 15% of tests; the tests written earlier are the easier ones and cost less.
- A variety of opinions wiki’d. While numbers are mentioned, the real value of the page is the caveats for dealing with such numbers.
Conclusion
When talking to management about a cost, make sure they also know about the associated benefits, and about the cost of the other options. Doing nothing is an option, and it has a definite cost!
HenryK posted via Google Buzz and re-quoted with permission:
“Value of car you buy includes crash tests fee. You cannot buy car from the company that doesn’t test its vehicles. The same thing is with the software. Unit tests code is an integral part of the delivered product.”
I think this is an excellent analogy.