Yesterday I stumbled upon a discussion about unit testing private functions. Many of the responses were very prescriptive (never do this, just say no). In other testing discussions I've heard many advocate 100% coverage via automated testing, an attempt to test everything. In general I find a lot of prescriptive guidance around testing.
Whenever I experience prescriptive guidance, I can't help but defiantly ask why?
Why do we test?
For me, it comes down to verifying behavior and building confidence in it. Those obligations are what drive me to test. Testing, to me, is a means to those ends.
When I first got into automated testing, I jumped on the bandwagon of the test-everything mentality. I actually recommend trying this if you haven't experienced it: nothing helps you see testing as optional faster than hitting cases where it truly just gets in the way and adds no value.
Once I realized it can be a burden, I started to think of automated testing as a tool (an option, not an obligation). I started to keep track of the situations where I felt it added value and the ones where it just got in the way.
I found extreme value in automated testing of complex calculations, component logic, projections, aggregate behavior, and interactions with remote systems. Other areas, especially integration layers like MVC controllers or service boundaries, were often far more work to test than they were worth. In those cases, I found that focusing on simplicity and keeping logic out of those layers was a better approach.
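As a concrete illustration of the "complex calculation" category, here is a minimal sketch (the function and its tiers are hypothetical, not from the original discussion): a pure function with branching rules is exactly the kind of code where a few boundary-focused tests cost almost nothing and pay off immediately.

```python
def shipping_cost_cents(weight_grams: int) -> int:
    """Return a flat-rate shipping cost in cents (illustrative tiers)."""
    if weight_grams <= 0:
        raise ValueError("weight must be positive")
    if weight_grams <= 500:
        return 499
    if weight_grams <= 2000:
        return 899
    return 1499

# A handful of assertions pinning the tier boundaries covers
# the interesting behavior of this calculation.
assert shipping_cost_cents(1) == 499
assert shipping_cost_cents(500) == 499    # top of the first tier
assert shipping_cost_cents(501) == 899    # first gram of the second tier
assert shipping_cost_cents(2000) == 899
assert shipping_cost_cents(2001) == 1499
```

Contrast that with a thin controller that merely forwards a request to this function: a test of the controller mostly re-tests the framework's plumbing, which is the "more work than it's worth" case described above.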
The reality is there are other equally valid tools for the job. Reading code is another tool to verify behavior and to build confidence. So is manual testing. Sometimes these are just as effective as automated testing. Sometimes they aren't. Sometimes they are very complementary.
When treated as a tool, I quickly realized that many forms of automated testing gave me a new level of confidence I'd never had before. Some tests were saving me hours of time versus manually verifying behavior, even on the first iterations of development! And some tests were just not worth the hassle. Testing became a no-brainer for me in many situations.
Over time, I've developed an almost innate ability to pick up and put down the "automated testing tool," instinctively knowing when it's adding value and when it's not.
Once I saw the "light," I wanted to spread the good word. Unfortunately, in my haste, I fell prey to being prescriptive. Though I restricted my prescription to areas where I felt a lack of confidence or a lack of verification, I failed to communicate it as such. As a result, I often noticed a lack of quality in the resulting tests: missing scenarios, rushed implementations, copy/paste duplication, readability issues, etc. And when testing wasn't explicitly requested, there rarely was any.
In situations where I conveyed my concerns about confidence and verification, the quality of the tests dramatically increased.
Instead of asking for tests of X, Y, and Z, if we ask "How do we verify X?" or "Are we confident in Y?" or "In what situations does Z apply?" we can weigh the pros and cons of using automated testing versus other tools to answer these questions.
If we look at automated testing as a tool, a tool whose usage we need to study, practice, and perfect, I think people will pick up on the good word faster. If automated testing is optional, but people know the value of it, I bet we'd see more of it as a result; after all, it is a very powerful tool.