Beyond scripts – transcripts

“Can you show me your test scripts?”
“Will your test scripts be part of the deliverable?”
“This role involves writing and executing test scripts.”

There is a sector of the software development community that believes, no, accepts unquestioningly as a truth, that testing is writing test scripts and then executing them. This leads to a vicious cycle: managers and clients ask for test scripts, testers deliver test scripts because they were asked for them, thus reinforcing the requests, and so on ad infinitum.

I’m not going to describe how wasteful scripting before testing is, or even the oxymoron of putting scripted and testing together in a sentence. If you want to read further, Michael Bolton does a great job here and here. I make an attempt here.

Instead I’m going to put forward some propositions:
1) At the beginning of a project, we know the least about it
2) Good testing requires sapience
3) Skilled exploratory testers perform hundreds of what could be considered discrete tests in a session. Few are worth repeating.
4) Most tests performed are informed by the results of the previous test.
5) The usefulness of a test is often not known until it has been performed. You don’t know if a rock is worth lifting until you’ve lifted it to see what’s underneath.
6) Testers, when following a script, will deviate from the script.
7) Different testers interpret scripted instructions differently resulting in differences between testers even when following the same script.

Therefore:
8) Test Scripts don’t tell you what you may think they are telling you.

Instead, let’s delve a little deeper into what’s being asked of us when others request test scripts. Could it be that that’s the only artifact they know of that will deliver what it is they want? Why do people want test scripts?

* They want to know what you did
* They want to be able to verify coverage
* They want some kind of repeatability

If you want evidence of what happened, for whatever reason, then how about thinking beyond scripts, and into transcripts. Like scripts, but written after the fact.

What’s written down in a script and what a tester actually does are often two separate activities anyway, so scripting, even when done with the best of intentions, is already behind. But if we transcript our tests, we can write down what we actually did.

And I’m not talking about recording EVERYthing we do. But when we perform some testing activity whose particulars we judge to be very important, perhaps for some future formal testing, perhaps as evidence of some claim we make, we transcript those details. It could look like a traditional script, or it could take some other form, but it’s a win-win for everyone:

* The tester is free to investigate the software as they see fit at the time of testing using their judgement
* Clients and managers get evidence of what was performed
* Test activities that are judged to be worth repeating in a particular way in future are recorded
* Time isn’t wasted writing down testing activities in minute detail at exactly the point in the project where we know the least about it.
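To make the idea concrete, here is one hypothetical shape a transcript artifact could take. This is a sketch, not a prescription: every name in it (`TestTranscript`, `record`, `worth_repeating`, and so on) is invented for illustration, and a real transcript could just as easily be plain session notes.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TranscriptEntry:
    """One test activity, recorded after it was performed."""
    action: str                    # what the tester actually did
    observation: str               # what actually happened
    worth_repeating: bool = False  # flag activities judged worth formalising later

@dataclass
class TestTranscript:
    charter: str  # the session's mission, e.g. "explore login error handling"
    entries: List[TranscriptEntry] = field(default_factory=list)

    def record(self, action: str, observation: str,
               worth_repeating: bool = False) -> None:
        """Append an entry describing what was just done, after the fact."""
        self.entries.append(TranscriptEntry(action, observation, worth_repeating))

    def repeatable_steps(self) -> List[TranscriptEntry]:
        """The subset judged worth turning into a formal, repeatable procedure."""
        return [e for e in self.entries if e.worth_repeating]

# Example session: the tester explores freely, then records what mattered.
t = TestTranscript(charter="Explore login error handling")
t.record("Submitted the form with an empty password",
         "Got a clear validation message")
t.record("Pasted a 10,000-character password",
         "Server returned a 500 error", worth_repeating=True)
print(len(t.repeatable_steps()))  # → 1
```

The point of the structure is the last method: of the many discrete tests performed in a session, only the few judged worth repeating get promoted into a formal procedure, and that judgement happens after the testing, not before.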

In conclusion, I’m asking for a subtle shift in the language we use to bring about a more fundamental shift in the activities and artifacts that are judged to be useful by others.

Let’s move from Test Scripts to Test Transcripts. Perhaps if we start offering transcripts as an alternative, and provide good value from them, people may begin asking for transcripts and we can break the cycle.

Alternative Test Artifacts may be the topic of a future post.

Author: Aaron Hodder

7 thoughts on “Beyond scripts – transcripts”

  1. Good post. You might be interested in checking out Rapid Reporter, an elegant free tool that is used to record “test transcripts.” It is created by Shmuel Gershon, an Exploratory Tester who works at Intel (who also happens to be, like Jon Bach, one of the nicest testers you could ever hope to meet).

    Justin

  2. Hiya Aaron,

    Great idea, I think I’ll play with this one and see how it’s received. Good list of propositions too, highlights the futility of writing scripts up front in an attempt to prophesy how the testing will pan out.

    Mark.

  3. Great post.

    I have a theory that I think contradicts your statement “There is a sector of the software development community that believes, no, accepts unquestionably as a truth, that testing is writing test scripts then executing them. This leads to a vicious cycle of managers and clients asking for test scripts, and testers delivering test scripts because they were asked for them, thus reinforcing the requests and so on ad infinitum.”

    My theory (based upon my own observation) is basically that more often than not it is the managers themselves who perpetuate the ‘need’ for test scripts etc. and not the testers.

    In short, managers (seem to) like metrics, and test scripts provide them with what is at first glance a simple process by which those metrics can be attained, even though as you rightly point out those metrics will at best only ever be misleading.

    I think testing without scripts seems like voodoo (or simply playing around) to them, and even if the results are good I think they don’t feel they can measure or report on what’s being done… so this reinforces their desire for scripts… basically it’s possum management :p

    I don’t see ‘testing = test scripts’ as something being perpetuated by testers (for the most part) … however many testers too suffer from being caught in the headlights, unsure of what to do when their managers / ‘the process’ call for test scripts and invariably they (the testers) just sit down and get on with the job.

    I think the greatest benefit can be attained by reaching out to managers and providing them with ways to avert their eyes when the headlights beckon 🙂 Test Transcripts may be one such way.

  4. I think you missed some of the managers’ and colleagues’ desires:
    Mostly, instead of “They want to know what you did”, it’s actually “They want to know what you WILL DO”.
    Why is that? Because at that stage they can still influence things (just as we aim to review requirements before coding (note that I don’t say how long before)).
    Now, we shouldn’t get insulted and say – well, what do they know? (We hate it when devs do that to us when we report bugs – right?)
    My experience shows that a good review can yield at least 30% additional test ideas the tester has missed – let’s face it, we are not perfect either – these are our “bugs”.
    Why do we write in advance? To give feedback before the software is coded, which has proved cheaper than rework.
    Also, to think before we dive in.
    How deep should we write? As much as helps us with the targets above.
    I hate writing full details, but my reviewers seem able to contribute ideas ONLY if they can visualize my tests – so I am considering presenting one detailed test case for each group, and displaying the rest of the test scope as a dimensions table, to enjoy the best of both worlds…
    Are these all the tests we need? No – we can’t think of it all in advance – that’s why we must leave space for thinking while testing, allow detours, and plan for exploratory testing IN ADDITION to “scripted”.

    • Great reply, and I’d love to respond, but really busy at the moment. Give me a couple of days, please 🙂

  5. Since this post came out, there has been some progress in supporting tools – like QTrace and QAWizard’s tool – but we miss a major point by focusing on reporting what we did or plan to do.

    We actually have to focus on the purpose of the test – what we aim to test, more than how we accomplished it.

    @halperinko – Kobi Halperin
