Testing

Wax Philosophical with Context (-Driven Testing)

It’s a snow day in Portland, OR. Like, a legitimate snow day. We’ve had about three of these so far this season, but this is the only one of substance: 8 inches of snow in most areas. This is problematic because people of the Pacific Northwest don’t typically know how to drive in the snow for some reason, roads aren’t salted, and snow plows are few and far between. Because of Snowpocalypse Episode IV, I’m unable to be in the office testing a location-based technology project. Very unfortunate, since it’s a pretty neat learning experience, but the upside is that you guys get to hear about the next part of my testing story: developing a QA philosophy and strategy. Please, try to keep your excitement contained.

One of the many gaps I found when researching QA practices was guidance on how a digital agency should approach testing. As I alluded to in my previous post, I had an idea of what to test (Does it look right in all browsers? Does it respond on mobile? Does it work in older browser and mobile versions?) but didn’t know what standards to establish as the bedrock of testing. From this, I figured that instead of narrowly focusing on standards, I should take a step back and look at a general approach. I searched high and low and found many approaches that would work for product-focused development, but that wouldn’t work for our team: since we build a variety of web- and app-based projects, a single rigid process is too limiting. What I found most applicable to my company’s development cycle is Context-Driven Testing.
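
To make those basics a bit more concrete, here’s a minimal sketch of the kind of responsive check I have in mind, written with Selenium in Python. The URL, breakpoints, and CSS selector are hypothetical placeholders; in a context-driven world, the actual checks would change with every project.

```python
# Minimal sketch: confirm the main navigation is visible at a few common
# viewport sizes. The URL, breakpoints, and selector are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By

BREAKPOINTS = {
    "mobile": (375, 667),
    "tablet": (768, 1024),
    "desktop": (1440, 900),
}

driver = webdriver.Chrome()  # swap in other browser drivers for cross-browser runs
try:
    for name, (width, height) in BREAKPOINTS.items():
        driver.set_window_size(width, height)
        driver.get("https://example.com")  # placeholder URL
        nav = driver.find_element(By.CSS_SELECTOR, "nav")
        assert nav.is_displayed(), f"Navigation hidden at the {name} breakpoint"
        print(f"{name}: navigation visible at {width}x{height}")
finally:
    driver.quit()
```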

When I came across Context-Driven Testing (CDT), I saw the clouds part, the sun emerge, and angels start singing from the heavens. While it’s not a defined approach that prescribes standards and such, it’s flexible enough to be applied to all of my company’s projects. Per CDT’s home page, “Ultimately, context-driven testing is about doing the best we can with what we get. Rather than trying to apply ‘best practices,’ we accept that very different practices (even different definitions of common testing terms) will work best under different circumstances.” I found this very important because of the different types of products we produce; for example, campaign, static HTML, and CMS websites all require different considerations. With new languages, technologies, and use cases being developed every day, the context will keep evolving. CDT allows our testing approach to stay responsive to a project’s needs while still applying the knowledge and techniques learned from past experience.

For your own edification, here are the seven basic principles of the Context-Driven school:

  1. The value of any practice depends on its context.
  2. There are good practices in context, but there are no best practices.
  3. People, working together, are the most important part of any project’s context.
  4. Projects unfold over time in ways that are often not predictable.
  5. The product is a solution. If the problem isn’t solved, the product doesn’t work.
  6. Good software testing is a challenging intellectual process.
  7. Only through judgment and skill, exercised cooperatively throughout the entire project, are we able to do the right things at the right times to effectively test our products.

One of my biggest takeaways from Context-Driven Testing is its rejection of the notion of best practices and its insistence that good testing is a matter of skill, not procedure. By rejecting best practices, a tester isn’t limited to the narrow focus of a procedure; they can start with a broad view and then narrow the scope to what works for a given test case. Essentially, a skilled tester will know what to look for based on experience and intuition. CDT says there are good practices, but no best practices. Best practices may work for a product whose test cases remain relatively consistent over time, but across the varied projects and implementations of a digital agency, they’re not nearly as applicable.

I’m also a big fan of the idea that a tester’s skill set and interactions matter more than any process or the tools by which feedback is transmitted. Because a tester is aware of the context in which they’re testing, their feedback carries more weight than the output of a purely process-based approach: it paints a better picture of why an issue matters. I see it as the difference between saying “this is what is happening” and “this is what is happening, and it’s important because…” So, more details, more context.

Another thing I like about CDT is that it values working software over comprehensive documentation. This touches on a few things: the need to produce (or not produce) testing documentation, framing what’s really important (a working website), and, in the case of my company, the idea that a one-and-done project doesn’t require a test plan to reference in the future. All three are woven together because a) there are a finite number of hours available for testing, b) I’d rather spend that time testing than writing test plans and revising them as new considerations are discovered, and c) since the majority of projects won’t require much more documentation than a style guide (created by the development team), a completed test plan won’t provide any additional value to the project down the road. When I first started doing QA formally, I created test plans and cases because I thought that’s what you do. As time went on, I determined that the real value is a product free of defects. In my opinion, the time is best spent testing, though I’m open to generating test plans for larger projects that will remain in-house on a retainer, will be iterated on and require test cases for regression testing, and, more or less, are requested from the beginning.
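
For those larger retainer projects where regression testing would justify written test cases, I picture something as lightweight as the sketch below, using pytest and requests in Python. The page URLs and expected strings are made-up placeholders, not a real test plan.

```python
# Minimal regression-check sketch: every key page should still load and still
# contain its core copy. URLs and expected text are hypothetical placeholders.
import pytest
import requests

PAGES = [
    ("https://example.com/", "Welcome"),
    ("https://example.com/contact", "Get in touch"),
]

@pytest.mark.parametrize("url,expected_text", PAGES)
def test_page_still_renders(url, expected_text):
    response = requests.get(url, timeout=10)
    assert response.status_code == 200
    assert expected_text in response.text
```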

[Image: Kermit “but that’s none of my business” meme]

So, Context-Driven Testing is what anchors my philosophy of testing. Of course, this may change over time, but it’s currently what works best for the company. I should also add that CDT is only one part of the overall philosophy, the part that covers testing itself. I also firmly believe in prevention before things reach development (QC!), like being active in design reviews and development kickoffs, but perhaps that can be explored in a future edition.

Now it’s time for me to turn things over to you guys. What do you think of Context-Driven Testing? Is it something you know much about? Is it something you’ve practiced? Are there any processes you feel are better suited to a web development/digital agency environment? I’d love to hear from others so I’m not limiting myself to one ideology just because it seems to make sense.

That’s it for now, folks. In the meantime, I feel obligated to teach some of the Portlanders how to drive in the snow, but that’s none of my business…
