A late report from our workshop last year. I stumbled across it again in my preparations for KWST (Kiwi Workshop on Software Testing) 2015. It was supposed to be published through our gracious sponsor, The Association for Software Testing (AST), but it never eventuated. So I thought I’d post it here. Better late than never.
So here goes….
For the fourth year in a row, Wellington (New Zealand) has successfully hosted the Kiwi Workshop on Software Testing. The two-day intensive testing workshop is one of the key drivers of the Context-Driven Testing (CDT) community Down Under.
From its beginnings, the aim was to give experienced and senior community members a platform to drive innovation and exchange ideas. Over these past years, KWST has had far-reaching effects on the community in New Zealand as well as Australia.
Workshops, conferences, and magazines have emerged since, which have lifted the game right across the board. KWST 2014 was specifically aimed at involving new faces in the community and not drawing as much on the established KWST crowd.
The topic this year was:
“How to speed up testing? – and why we shouldn’t”
This topic left the discussions open enough to reach from technical and automation aspects right through to process and methodology. I am glad to say that the experience reports (ERs) covered a lot of ground and kept it quite diverse.
For the last few years, KWST has been held at the premises of Software Education, who also helped with general organisation and support during the workshop. AST kindly funded our lunches through their Grant Program (http://www.associationforsoftwaretesting.org/programs/ast-grant-program).
We are very grateful to both organisations as this allows us to run KWST for free, so that money is not an inhibitor to attendance. And it means we have great premises at our disposal, really yummy food, and a well organised event. 😀
All participants are pictured below. From left to right: Katrina Clokie, Chris Rolls, Aaron Hodder, Parvathy Muraleedharan, Joshua Raine, Adam Howard, Richard Robinson, Andrew Robins, Oliver Erlewein, Viktoriia Kuznetcova, Thomas Recker, Rachel Carson, James Hailstone, Ben Cooney, Till Neunast, Nigel Charman & Sean Cresswell.
So, at 9:30AM on a Friday the KWST workshop kicked off with introductions all around and a quick and concise explanation of the discussion procedures and topic. Richard Robinson then took the helm as facilitator and brought us on our merry way.
ER1 – Sean Cresswell, Risk Based Testing, and Adrenaline Junkies Speed Fix
Sean went into a tale of how his testers (and the business) see Risk Based Testing (RBT) as faster testing. If testing "isn't going fast enough", he would get asked "Can you teach them how to do risk based testing?". On the surface it's a good thing and easy to communicate: "This thing is risky, so I want to take more time on it."
This is obviously never as simple and straightforward as that, and Sean portrayed the different types of RBT that he encountered. The inconsistent quality of the results has made him doubt the fit and benefit to his organisation, and he has now removed the risk based approach as a standard practice. This does not mean he wouldn’t use it. It just means that it should not be a prescribed process that is the first port of call for speeding up testing.
The Open Season discussion that followed (and this continued for both days and every ER) was lively, to say the least. Lots of cards, lots of discussion from KWST greenhorns and KWST seniors alike.
The format was well understood, and there was a strong focus on results and on making the best of the time available.
Among the discussion threads that came up were:
- Backlog ordering and prioritisation
- Agile definition of done and acceptance tests
- Knowledge of risk changes over time
- Using risk assessment to train and understand/learn
ER2 – Chris Rolls, Automation Speeds Up Testing
Chris stipulated that a commonly held, and promoted, belief is that “test automation” is the “best method” of “speeding up testing”.
Even if that were true, and we could achieve the same quality of testing at a much faster pace with no negative effects, we would still have to accept the premise that testing always needs to be fast. While sometimes software testing absolutely must be fast, there are many other cases where time isn't the major factor and other factors weigh in.
He firmly believes that test automation can be both a help and a hindrance, regardless of the timeframe available.
He shared four specific experiences: a situation where testing needed to be fast and automation helped, and another in which it was a hindrance; a situation where time was not a pressing concern and a focus on automation improved the quality of testing, and another where it decreased the quality of testing.
ER3 – Thomas Recker, How Do You Know if You Need to Automate?
Thomas has worked on several automation projects over the years, and his talk was about the challenges and benefits he could see. Generally, he was brought on late in the game, which is never ideal. If you want to have automated validation/testing, the automation project should start right from the beginning, and it needs the full support from management.
He also contended that automation doesn't stop at testing either. To be successful, the whole development process needs to be adjusted to incorporate testing into the release process (CI/CD).
Thomas’ talk brought forth a technical discussion which highlighted where test automation and test automators still had to mature, like connecting to CI and CD, code management, and other areas that are more a development domain.
ER4 – Andrew Robins, CDT at Tait Communications
Andrew talked about the changes he implemented over the years at Tait in Christchurch: How his team slowly became CDT, and how it affected the ways they tested and the speed at which they tested. He also highlighted the challenges that he had with stakeholders and how he overcame them.
This was the end of day one. All attendees were tired, and a local testing services provider (Assurity) was kind enough to offer us food and drink and their rooms to mingle and network as the evening progressed. There were – of course – the usual testing games being played and some good heated after-discussions happening.
ER5 – Viktoriia Kuznetcova, Testing Mobile Applications With Maps
Viktoriia told us about her experiences in testing mobile applications for Yandex. The challenges she faced had to do with timely testing, and high stakes due to high market visibility and a huge user base. She described the methods by which she dealt with these.
The methods can be summarized as “be prepared”. She talked about the type of questions she needed to ask, information she required, and structuring the test documentation around it so that testing stayed flexible and lightweight.
The after discussion included topics like the perception of time and levels of uncertainty in testing.
ER6 – Rachel Carson, Changing the Mindset Around Testing
Rachel’s workplace recently made some changes to their development procedures and development makeup. For her, not only was it a change in team structure, but also a change in testing mindset.
While day-to-day testing activities did not change, they were completing testing much faster and at a higher quality than before. She filled us in on how exactly a change in procedures and thinking helped make her testing faster.
ER7 – Adam Howard, When Things Go Wrong
Adam's hypothesis was that when things start to go wrong, our instinct is often to try and fix the problems we face by going faster. Because we have hit trouble and lost time, we think that working harder and faster is the best way to recover. When we think about it a bit more clearly, though, we may realise that change is the answer: reviewing what has led us into trouble and altering our course slightly. This is an attempt to turn at speed, and anyone who has driven a car or ridden a bike knows there's danger in that.
Adam showed us his experiences in trying to help a giant, lumbering behemoth to turn at speed – implementing a CDT-inspired approach (based around Rapid Software Testing) to speed up testing, ultimately enabling delivery of a project that had previously failed while remaining resolutely factory-school in its testing methods.
Adam's ER wrapped up the second day. We asked everyone what they thought and felt about this year's workshop. The feedback was unanimously good (which I never see as a given).
These two days filled me with sheer amazement and pride at how far the community and testing have come. The idea of challenging something, coming up with new ideas, and confidently going into the unknown with just our toolbox of test magic felt natural to all attendees. The Wellington and NZ testing market is changing quickly, and the direction is definitely towards better quality, achieved in the right way with the appropriate tools.
Everyone got along, whether they knew each other or not. Usually such harmony at KWST would make me sceptical, but it was obvious that it grew out of everyone pulling in the same direction to move forward. In my mind that is a wonderful sign of a maturing testing community.
I'd like to extend my heartfelt gratitude to the AST and SoftEd for sponsoring, Richard Robinson for the excellent facilitation, all attendees for a great time, and Brian Osman for the honour of letting me take over the reins while he is away in Australia.
Last but not least, I am very excited to see what 2015 will bring! We live in interesting times.
by Oliver Erlewein