Chris Dabnor

14/04/20


Pair testing with your subconscious

When I was studying literature, my tutor would refer to “Oh!” moments: those moments where you read something and it jumps out at you, but you don’t know why, or what it is. Those moments, she said, you should listen to. You should try to figure out what it was that made you sit up and pay attention. It’s the same with testing software: as people who test it, we should listen to our subconscious minds and investigate those “Oh!” moments, rather than dismiss them.

The reason I’m writing this is that, during this lockdown, my seven-year-old son (seven years and three days as I write this, to be exact - his first ever birthday in a pandemic lockdown was three days ago) and I are often in the same space. He looked over my shoulder, pointed at the piece of software I was testing and asked, “Should there be two of those?” I hadn’t yet noticed it, but he had. I posted this as an amusing anecdote on Ministry of Testing, and it turned into a conversation about children and the way they view things. Children haven’t had their view of the world imposed on them to quite the extent we adults have, and haven’t built up our extensive catalogue of experience, so their view has the potential to be less tainted and biased. That got me thinking further about how we build up whole catalogues of biases over the years, how those biases can influence the testing process, and what we can do to mitigate them.

During the development process, a lot of conscious thought is put into things: into what they do, how they do it, and what it looks like when they actually do do it. A lot of thought by a lot of people with a lot of experience, intelligence and ability (or that’s certainly the case where I work) - but it’s conscious thought, and it therefore comes with its own baggage. Then there is the testing, which again can be a largely analytical, conscious process with its own added baggage. Our experience of using software becomes ingrained. Our existing experiences of using software (often software designed by the companies we work for and the people we have worked with) influence our design decisions, and when we come to test the software, we are further influenced by the fact that we were involved in those design decisions. Consciously or otherwise, we are invested in those decisions being the “right” ones. So there are many ways we build up our catalogue of bias, and many ways those biases become reinforced.

When we test the software, we click on button “x” because we’ve always clicked on button “x”, or because we’ve designed button “x” to do a particular thing. But what about our users? They may never have used button “x”. They may have used a button that looks like button “x”, or that sits where we’ve put button “x”, to do something entirely different. The button “x” we have put before them may well be one they are seeing for the first time. It’s very difficult to put yourself in the shoes of someone who is facing that screen for the first time.

We can mitigate this to an extent by using things like personas, but once you know something, well, you know it. You can use various forms of usability testing to see how untrained and unbiased users behave when presented with the software. Where these forms of usability testing are not available or practical, however (and it will be interesting to see how methods such as guerrilla and lab usability testing adapt to the lockdown), it is even more important that those involved in testing act as the user’s advocate and attempt to overcome the bias.

Sometimes, in the course of your testing, you might make a mistake - click on the wrong button, enter something in the wrong field. Often, our reaction is to go back, correct our mistake and rejoin the happy path, seeing ourselves as the source of the problem. Before you do this, though, stop and think. Why did you make the mistake? If it’s because you didn’t know how to use the software, might it be that it’s not as intuitive as it could be? Which is correct here: the conceptual model, or your subconscious mental model? You may have unhelpful mental muscle memory or ingrained cognitive biases, but you should also consider that you might, without realising it, be taking the actions that users unfamiliar with your software would take, or simply the action that makes most sense. Question the assumption that either the software is right or you are right, and take a moment to consider which it is, or what discussions should take place to improve usability.

One of my mantras in testing is “question everything”. Don’t assume that just because the software was designed and developed by a brilliant team, you cannot add value by asking questions - the solution may be a fantastic one, but conversation might lead to an even better one. Even if no consensus is reached, these discussions can be used to form hypotheses and areas of interest for usability testing. Your sample users may come down on one side or the other of any debate, or surprise you by taking another direction altogether.

You may get even more ambiguous, intangible feelings of “wrongness” (or at least that there is an area for improvement). There may be something about a journey or an interface that you can’t quite put your finger on, but it just feels like it’s not quite “there”. Don’t dismiss it. Talk to someone about it. Just ask, “Is there something that feels off about this?” They may be able to see, or help you crystallise, what it is that you are feeling. No matter how hard we try, it is practically impossible to behave, and to experience the journey, as a user would, but there are ways to get closer. At the end of the day, what I’m trying to say is: don’t dismiss your feelings, and don’t ignore your missteps. Listen to them. Explore them. They might be telling you something that your active brain hasn’t seen. They just might not be able to articulate it very well.
