Mission Creep

“Testing was going so well,” said the tester, “at least on the first release.”

“We had a clear mandate: find as many important bugs as we could, as quickly as we could, and report them to the developers. And we found lots: the PM was happy because we were giving her a good feel for quality, the developers were happy because we were giving them rapid feedback, and the test team was happy because we felt like we were doing our jobs well.”

“I suppose it was during acceptance testing that things started to change. The UAT team hadn’t had enough exposure to the app during development, and struggled to figure out how to run their tests. In the end, the PM asked us to help out. We were only too happy to: we started designing and maintaining scripted walkthroughs of most of the key features and requirements, as well as authoring the associated test data. This took a fair amount of work, but we were up for it: it’s a team effort after all.”

“The initial release went in pretty smoothly. I mean, some bugs crept through, but we helped out there too: a lot of what support were getting hit with, we were able to find workarounds for; anything else we were at least able to repro and isolate for the devs. We still do a lot of that now: it helps to keep up a good relationship with the support team.”

“The latest release was a lot hairier; a fair few of the devs have changed. The new guys struggled to understand why a lot of the unit tests were failing, and ended up commenting them out: this meant we started seeing a lot more regression bugs. Added to that, they’re offshore: now we’ve got developers on three continents. Communications don’t seem to be hanging together and somewhere along the line config management got messed up. We ended up doing a lot more regression testing this time around.”

“Got to go, the post-mortem starts in five minutes. Release 2 went in last week, and the PM is on the warpath: she can’t understand how we missed so many major bugs.”

What happened here?

In the story above, the testers started out with a clear mission: find bugs.

…then they began to provide scripted demonstrations for acceptance testing.

…then they started to figure out workarounds and do isolation for the support team.

…then they added black box regression tests to mitigate regression risks.

…and then they started to fail in their initial mission.

After their initial success, they allowed their mission to expand beyond its original goals, and fell foul of mission creep.

Mission creep brings a number of risks:

  • Loss of effectiveness. There are many possible missions for testing, for example: finding bugs, investigating quality, reducing support costs, mitigating risks, reducing liability, conforming with standards or regulations¹. Whilst any of these is potentially valid, some practices are more suitable for some missions than others. If changes are not recognised, and practices not changed accordingly, a test team can find itself working in a way that runs counter to its mission.
  • Loss of focus. Different goals can be blended, but this requires practices to be blended too. This adds complexity. If you try to do too many things, you may not be able to do any of them well.
  • Overextension. Like its project management cousin, scope creep, increasing the number of goals often requires additional effort. Without a corresponding increase in time or resources, mission creep means that less effort can be allocated to each goal, making the success of any one goal less likely.

How can you address mission creep? Here are some suggestions:

  • Make the mission explicit. Consult with your stakeholders and ensure that the mission is both understood and agreed. If appropriate (for example: if you are an external contractor or testing vendor), consider making the mission a part of your formal scope.
  • Keep an eye open for your mission changing or expanding. Don’t let change surprise you. Review regularly, and engage stakeholders in this process. Are all needs being satisfied? Does the project need anything new out of testing?
  • Make adjustments to mission and supporting practices. Don’t let your mission and practices diverge, but consult with your stakeholders about the tradeoffs. What are the time and cost implications? Do any goals conflict? Consider whether it is possible to refocus: are any goals now redundant?

Testing missions evolve. Unrecognised, mission creep can cause significant damage to your ability to deliver what your project needs and expects of you. If you are testing the same software for any length of time, mission creep is almost inevitable: it needs to be managed.


¹ See Lessons Learned in Software Testing (Kaner, Bach & Pettichord) for a discussion of missions in testing.
