Friday 13 November 2015

Using strong-style pairing and a coding dojo for test automation training

At work we're implementing a brand new automation suite for one of our internet banking applications. This is the first framework that I've introduced from a coaching perspective as opposed to being the tester implementing automation day-to-day within a delivery team.

Aside from choosing tools and developing a strategy for automation, I've discovered that a large proportion of the coaching work required is to train the testers within the teams in how to install, use and extend the new suite.

I've done a lot of classroom training and workshops before, but I felt that these formats weren't well suited to teaching automation. Instead I've used two practices that are traditionally associated with software development rather than testing: strong-style pairing and a coding dojo.

I've been surprised at how well these practices have worked for our test automation training and thought I would share my experience.

Strong-style pairing

After a series of introductory meetings to explain the intent of the new suite and give a high-level overview of its architecture, each tester worked independently using the instructions on our organisation wiki to get the tests running on their local environment.

As the testers were completing their installations, I worked in parallel to create skeleton tests with simple assertions in different areas of the application, one area per tester. To keep the training as simple as possible I wanted to split out distinct areas of focus for individual learning and reduce the potential for merge conflicts in our source code.
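
To make that concrete, a skeleton test might look something like the sketch below. It's only an illustration: I've assumed a Selenium WebDriver suite driven by pytest, and the URL, locator and expected heading are placeholders rather than details of our actual suite.

    # A skeleton test with a single simple assertion for one area of the
    # application. The tool choice, URL and locators are illustrative only.
    import pytest
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    @pytest.fixture
    def browser():
        driver = webdriver.Firefox()
        yield driver
        driver.quit()

    def test_payments_page_loads(browser):
        browser.get("https://test.example-bank.example/payments")  # placeholder URL
        heading = browser.find_element(By.TAG_NAME, "h1").text
        assert heading == "Payments"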

As they were ready, I introduced an area to each tester via individual one-hour pairing sessions using strong-style pairing. The golden rule of strong-style pairing, as Llewellyn Falco describes it, is:

"for an idea to go from your head into the computer it MUST go through someone else's hands"

For these sessions I acted as the navigator and the tester who I was training acted as the driver. As the testers were completely unfamiliar with the new automation suite, strong-style pairing was a relatively comfortable format. I did a lot of talking, while the testers themselves worked hands-on, and together we expanded the tests in their particular area of the application.

As the navigator, I prepared for each pairing session by thinking up a series of objectives of varying difficulty to accommodate different skill levels. My overarching goal was to finish the hour with a commit back to the repository that included some small change to the suite, which was achieved in two-thirds of the sessions.
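
The sort of small change that came out of a session might be a single extra check in the tester's area. The sketch below builds on the skeleton example above and, again, the locators and expected behaviour are made up for illustration.

    # One possible commit-sized change: an extra test in the same area, reusing
    # the browser fixture from the skeleton sketch above. The locators and
    # expected behaviour are hypothetical.
    from selenium.webdriver.common.by import By

    def test_payments_page_lists_accounts(browser):
        browser.get("https://test.example-bank.example/payments")
        accounts = browser.find_elements(By.CSS_SELECTOR, ".account-summary .account")
        assert len(accounts) > 0  # the test customer should have at least one account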

As a coach, I found these sessions really useful for judging how much support the testers will require as we progress from the prototype stage and attempt to fulfil the vision for this suite. I now have a much more granular view of where people have strengths and where they may require some help.

I had a lot of positive feedback from the testers themselves. For me the success was that many were able to continue independently immediately following the session and make updates to the tests on their own.

Coding dojo

At this point everyone had installed the suite individually, then had their pairing session to get a basic understanding of how to extend an existing test. The next step was to learn how to implement a new test within the framework.

I felt that a second round of individual pairing would involve a lot of needless repetition on my part, explaining the same things over and over again. Ultimately I wanted the testers in the team to start pairing with each other to learn collaboratively as part of our long-running pairing experiment.

I found a "how do you put on a coding dojo?" video and decided to try it out.

I planned the dojo as a two-hour session for six testers. I decided to allow 90 minutes for coding, with 15 minutes either side for introduction and closing activities. Within the 90 minutes, each of the six testers would have 15 minutes in the navigator/co-pilot role and 15 minutes at the keyboard in the driver/pilot role.

I thought carefully about the order in which to ask people to act in these roles. I wanted to start with a confident pilot who would put us on the right course. I also wanted the testers to work in the pairs that they would work in immediately following the session to tackle their next task. So I created a small timetable, illustrated below with fictitious testers.
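
As a rough sketch of the shape of that timetable, with fictitious names and assumed times (coding running from 1:15pm, after the introduction), each tester drives for one 15 minute turn and navigates for another, staying with the partner they'll pair with after the session:

    # A sketch of the dojo rotation: three pairs of fictitious testers, each
    # person driving for 15 minutes and navigating for 15 minutes with their
    # upcoming pairing partner. The first driver is the confident pilot who
    # sets us on the right course. Times are assumed, not taken from the
    # actual session.
    from datetime import datetime, timedelta

    pairs = [("Amy", "Ben"), ("Cath", "Dan"), ("Eve", "Finn")]  # fictitious names

    slot = datetime.strptime("13:15", "%H:%M")
    for first, second in pairs:
        for driver, navigator in ((first, second), (second, first)):
            print(f"{slot:%H:%M}  driver: {driver:<5}  navigator: {navigator}")
            slot += timedelta(minutes=15)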

On the morning of the session I sent an email out to all the participants that reiterated our objective, shared the timetable, and explained that they would not need their own laptops to participate.

We started the session at 1pm. I had my laptop prepared, with only the relevant applications open and all forms of communication with the outside world (email, instant messaging, etc.) switched off. The laptop was connected to a projector and we had a large flipchart with markers to use as a shared notes space.

I reiterated the content of the morning email and shared our three rules:

  • The facilitator asks questions and doesn't give answers
  • Everyone must participate in the code being written
  • Everyone must take a turn at the keyboard

Then I sat back and watched the team work together to create a new test!

Though I found it quite challenging to keep quiet at times, I could see that the absence of a single authority was getting the group to work together. It was really interesting to see the approach taken, which differed from how I thought they might tackle the problem. I also learned a lot more about the personalities and social dynamics within the team by watching the way they interacted.

It took almost exactly 90 minutes to write a new test that executed successfully and commit it back to the repository. Each tester had the opportunity to contribute and there was a nice moment when the test passed for the first time and the room collectively celebrated!

I felt that the session achieved the broader objective of teaching all the testers how to implement a new test, and provided enough training so that they can now work in their own pairs to repeat the exercise for another area of the application.

I intend to continue to use both strong-style pairing and coding dojos to teach test automation.

2 comments:

  1. Thanks for sharing, Katrina! Great to see that strong-style pairing and mob format for teaching are as useful to you as they've been for me.

  2. Thank you for this Katrina. Using your talk from Test Bash Brighton and this to try to implement similar where I work. Even the non-software side of the company are interested in it!
