Tuesday, September 4, 2018

Testing Tour Stop #19: Session-Based Pair Testing with Guna

Guna Petrova had listened to my testing tour talk at a local meetup at the end of July. Right afterwards, on the same evening, she scheduled a pair testing session with me. What wonderful feedback! We had a great time testing together.

When it came to selecting our topic, we had lots of ideas and areas we wanted to dive into or improve our skills in. For our session we agreed to try an approach we had known about for a long time but never actually tried out ourselves: session-based testing.

Prepare and Align

As neither of us was really familiar with session-based testing or session-based test management, we first had to read up on the approach. Several resources on session-based testing helped us prepare as well as guide us during the session.
We agreed to try this approach to support our exploration, so we needed a test target. We decided to go for the Hours time tracking web application. The domain was not unfamiliar to us, but the product itself was still unknown.

We also needed to decide where and how to take notes. As far as we could see, a simple text file is usually used, which can easily be parsed for automated reporting. For our session we did not care about the automation part, so we decided to have the sample session reports guide us, but not limit us: we wanted to create a mind map instead.
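
Just to illustrate what that automated reporting could build on, here is a minimal Python sketch that parses such a plain-text session sheet into its sections. The header names are my own assumptions, loosely modeled on the sample session reports rather than the exact format; as said, we skipped this part in our session.

from collections import defaultdict

# Section headers are assumptions loosely modeled on the sample session
# reports; adjust them to whatever headers your session sheets actually use.
SECTION_HEADERS = {"CHARTER", "TEST NOTES", "BUGS", "ISSUES"}

def parse_session_sheet(text):
    """Split a plain-text session sheet into its tagged sections."""
    sections = defaultdict(list)
    current = None
    for line in text.splitlines():
        stripped = line.strip()
        if stripped in SECTION_HEADERS:
            current = stripped  # a new section starts here
        elif current and stripped:
            sections[current].append(stripped)
    return dict(sections)

sample = """CHARTER
First interaction with the system

BUGS
Home page tagline overlaps the signup button
"""
print(parse_session_sheet(sample))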

Taking the template and making it our own, in a visual way, was a great exercise in itself. We had to discuss and align on which information to take over, where to put it, and how to structure it at the beginning, knowing we would adapt it as needed during our session. We agreed to use child nodes along with notes on them to store more detailed information. Later during the testing session we also decided to use icons to mark issues so we could quickly get an overview of our findings.

While creating the mind map and aligning on everything, we agreed on our mission and our first charter to focus on.
  • Mission: Evaluate whether the web app is suitable for a first-time user
  • Charter: First interaction with the system, namely the
    • Web home page
    • Registration
    • App landing page & content
We noted down areas to test and session metrics. We planned to report issues and opportunities for future charters, as well as test data and tools used.

We agreed to time-box our testing session to 45 minutes. We also wanted to pair the strong-style way, frequently switching the navigator and driver roles, with a mob timer supporting us. Finally, all agreements were made and we felt ready to start our session.
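
For the timer, any tool that announces the switch works. The rotation logic behind it is simple enough to sketch in a few lines of Python; the 4-minute interval and the placeholder names below are assumptions for illustration, as we used an existing mob timer tool rather than a script of our own.

import itertools

def rotation_schedule(pair, session_minutes=45, interval_minutes=4):
    """Yield (start_minute, driver, navigator) for each rotation."""
    roles = itertools.cycle([(pair[0], pair[1]), (pair[1], pair[0])])
    for start in range(0, session_minutes, interval_minutes):
        driver, navigator = next(roles)
        yield start, driver, navigator

for start, driver, navigator in rotation_schedule(("Tester A", "Tester B")):
    print(f"minute {start:2d}: {driver} drives, {navigator} navigates")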

Time Flies

We started testing, tackling the very first contact point for most first-time users of Hours: the web home page. If we wanted to evaluate the product, this is where we would start to learn more about the application and create an account.

We found several issues already on this home page. We frowned at lots of things we would not have expected. We also found things to praise! We took note of all these findings in the report section of our mind map. Besides that, we also stored details on what we tested in the notes section of our charter. I wondered whether I would really put down so many notes myself (I normally don't). The benefit I saw was that it would definitely help tell the testing story in the end when discussing findings. Guna added that the notes were also proof of what we did; it felt safe to have them, especially in situations where you need to show evidence of your testing.

Our mind map evolved quite naturally. We discussed and aligned ourselves. Suddenly we realized that not much time was left and we were still on the web home page, not even close to the other parts we wanted to look at. We decided to leave it at that, and still we went over time: we tested for 55 minutes, exceeding our time box by 10 minutes.

Debrief!

In session-based testing, the session is debriefed right afterwards. We agreed not to use the questions from the provided debrief checklist, but to try Jon Bach's PROOF approach instead.
  • Past. What happened during the session?
  • Results. What was achieved during the session?
  • Obstacles. What got in the way of good testing?
  • Outlook. What still needs to be done?
  • Feelings. How does the tester feel about all this?
This felt fast, straightforward, and indeed helpful. Mental note to myself: do debriefs more often for your own testing; you might learn a thing or two.

Looking Back

In retrospect, both Guna and I liked our session. It was great to try something new, and to try it together; it's way easier this way. As Guna put it: she wouldn't do session-based testing alone. Cassandra did it alone and struggled. But together it provided value. It's not a bad format.

What I liked about our approach was that we took the format and template, tried to understand their essence, and then transferred them to a visual mind map, thus making the approach our own and adapting it to a setting that might be useful in our everyday work. Guna said that documenting testing and findings like this seemed to be a neutral way of not forcing your own style on the other person. It worked great for multiple people working on the same thing, and it was a good reminder that there's more to consider for testing sessions.

Regarding our strong-style pairing, Guna offered interesting thoughts as well. We had to communicate and align ourselves. Whenever we felt we were not aligned, we explained why we wanted to do this or that. We had to hold back sometimes, especially when we wanted to bring something up but it wasn't our turn yet. These cases were the only times she noticed the pairing aspect; apart from that, we still had a natural conversation and flow. Her conclusion: we should do more pairing. I agree!
