The Metrics: Number of Ignored Tests

The number of ignored automated tests can tell you how to improve your team's relationship with tests

Time for another metric. This time it's about ignorance, and not the blissful kind.

Ever since we (or Kent Beck) invented test frameworks, we've wanted to ignore tests. For all kinds of reasons, mostly along the lines of: don't bother me now, I've got better things to do. Like spending time debugging.

But that is a temporary inconvenience, and as quickly as we can, we remove the ignor-ation, skipp-ation or disabled-ation of these automated tests and get them working. Right?

Right?

Funny, I seem to be hearing crickets, which is uncommon in an urban, closed and air-conditioned room.

What is it?

The number of skipped / ignored / disabled tests, or whatever your test framework uses to designate a test that is identified, but not run. Sadly, it is not feasible to also count fully commented-out tests, but if it were, I would recommend tracking those too.
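As a concrete example, here's a minimal sketch using Python's built-in unittest (the skip mechanism varies by framework, and `apply_discount` is a made-up function just for illustration). The skipped test is still discovered and reported, but never executed:

```python
import unittest

def apply_discount(price, rate):
    """A made-up function under test, here only for illustration."""
    return price * (1 - rate)

class CheckoutTests(unittest.TestCase):
    @unittest.skip("Flaky on CI - revisit after the release")
    def test_discount_applied(self):
        self.assertEqual(apply_discount(100, 0.1), 90)

    def test_full_price(self):
        self.assertEqual(apply_discount(100, 0), 100)

# The framework still identifies the skipped test and reports it:
suite = unittest.TestLoader().loadTestsFromTestCase(CheckoutTests)
result = unittest.TestResult()
suite.run(result)
print(len(result.skipped))  # 1 skipped test, waiting to be fixed or deleted
```

Every framework has its own flavor of this (`@Disabled` in JUnit 5, `@pytest.mark.skip` in pytest), but the effect is the same: the test shows up in the report as "not run".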

Should I track it?

Yes, I would not spend my Saturday morning telling you otherwise.

Why should I track it?

Ignored tests are a solution to an immediate pain. And that's OK, as long as they don't stay in that state. Ideally, ignored automated tests wouldn't be pushed (or even committed) into our source repository.

The real problem is that they just stay there, drifting out of touch with the real code. When you get back to them, months or years later, you start wondering:

  • What does this test do?
  • Why is it ignored?
  • Do I already have a similar test? Does it cover everything this one does?
  • Why is it failing? Is the test out of date, or the code?
  • What about the comments and the test's name? Do they describe the desired behavior, the current behavior, or something else?
  • How long has it been lying there?
  • Is it flaky? It may be passing now, but I'm not sure I can trust it.

And other time wasters we don't want hampering us. It's just not worth it.
The funny thing is how easy the problem is to fix: make the test work, or throw it away.

What does the metric really mean?

Once again, the metric may point you to problematic tests, but what it really shows is how much the team cares about the testing effort, and how much they understand the value of automated tests. If you see this number climb above zero and stay there, it's a sign that re-education and focus are needed. If you've inherited a code base with 500 ignored automated tests – delete them. Don't worry, source control remembers everything.

How to track it?

That's easy – all test frameworks report them. Just plot the metric across consecutive builds and you've got fodder for your retrospectives.
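For instance, many CI tools emit JUnit-style XML reports, which carry a `skipped` attribute on each test suite. A small sketch (assuming that report format; the sample report below is made up) that sums them for one build:

```python
import xml.etree.ElementTree as ET

def count_skipped(junit_xml: str) -> int:
    """Sum the 'skipped' attribute across all <testsuite> elements."""
    root = ET.fromstring(junit_xml)
    # iter() also matches the root element itself if it is a <testsuite>
    return sum(int(suite.get("skipped", 0)) for suite in root.iter("testsuite"))

# A made-up report, in the common JUnit XML shape:
report = """<testsuites>
  <testsuite name="unit" tests="42" skipped="3"/>
  <testsuite name="integration" tests="10" skipped="2"/>
</testsuites>"""
print(count_skipped(report))  # 5
```

Run this per build, store the number next to the build ID, and the trend line draws itself.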

If you want to know more about how simple metric tracking can impact team behavior, start with this one. It brings a lot of interesting discussions to the table.

And, for helping your team get on track, contact me. I can help with these things, you know.
