Cognitive biases help us to think faster, but they also make us less rational than we think we are. Being able to recognize and overcome biases can prevent problems and increase the performance of software teams.
João Proença, a principal quality engineer, and Michael Kutz, a quality engineer, spoke about the impact of cognitive biases on software development at Agile Testing Days 2021.
Groupthink is a tendency to agree with the rest of the group, no matter the consequences, Kutz mentioned. This way, the group avoids conflict and preserves harmony, but reaches suboptimal, sometimes even catastrophic, decisions. Individual concerns are not raised, and the group as a whole becomes deaf to criticism from outside the group:
Once, we tried to create a new test strategy for a large number of teams. Clearly we were the only ones thinking about this on that level, so it was easy to ignore outside opinions as uninformed or incompetent. We felt like we were only thinking of the greater good, and hence our morality felt unquestionable. In the beginning, we had a lot of heated and unfruitful discussions, but at some point that stopped. I thought it was because our ideas had matured. Ultimately, nobody objected to our ideas, but the strategy failed anyway.
In hindsight, the strategy was really not good, as Kutz explained:
We tried to mix in all the ideas of our group, reaching a compromise that just didn’t work.
Proença mentioned that there are many misconceptions in the software industry around cognitive biases. He gave an example of affinity bias and diversity:
In tech, too many times I’ve seen leaders making statements like "I want to promote diversity because it’s the right thing to do, but at the end of the day I have a business to run". In my opinion, this is an entirely wrong way of looking at the issue. Diversity is not only the right thing to do, but it’s also good for business, as diverse teams have a higher chance of becoming high performers.
InfoQ interviewed João Proença and Michael Kutz about dealing with biases in software development.
InfoQ: What are cognitive biases?
Michael Kutz: Cognitive biases are systematic tendencies in human thinking. They occur mostly when we think fast, and less when we make conscious, well-weighed decisions. Most likely the evolutionary process came up with these shortcuts to give us an edge while forming successful hunter-gatherer parties. Today these shortcuts can still be useful, but they often lead to very suboptimal decisions and cause huge social problems.
João Proença: Other reasons why we as humans probably evolved to have "two systems of thinking" (as Daniel Kahneman calls them) are speed (fast vs. slow thinking) and the energy spent on cognitive processes.
InfoQ: Can you give some examples of biases and the effect that they can have on our professional life?
Proença: Affinity Bias is a tendency to gravitate toward people like ourselves in appearance, beliefs, and background. It may even make us (subconsciously) avoid or dislike people who are different from us. It is usually the reason why you'll find teams inside organizations that are not diverse at all. This has several implications for team performance and also for inequality.
Also interesting are the "statistics biases": the Conjunction Fallacy, the Anchoring Effect, and the Availability Bias. All of them tell me that we are really bad at estimating probability, size, or time when we are not rooting those estimates in real, objective data.
Kutz: Another great example is the present bias. It makes us prefer smaller short-term rewards over bigger long-term ones. You might know the term procrastination. That behaviour is a direct result of present bias.
For example, I found spending two hours cleaning up the test environment became strangely attractive when the alternative was reading a 500-page book on cognitive biases.
InfoQ: What impact do biases have on the software industry and how does it deal with biases?
Kutz: Well, biases like those described above naturally have a strong effect on software development. Groupthink especially influences product planning and other pre-development processes; when the planning group starts to ignore out-group objections, really bad assumptions flourish and become the foundation of the product.
During development, we suffer from the IKEA effect, which makes us stick with bad frameworks we assembled ourselves.
In the end, we may fall prey to confirmation bias, ignoring negative feedback from the market.
The industry as a whole is not very aware of these biases. There are some good practices that can mitigate their effects. For example, planning poker, when done right, can minimize the anchoring effect during estimation. The 1-2-4-all liberating structure addresses the dangers of groupthink: individuals do their own thinking first, then share their insights with one other member of the group, then each pair shares its merged insights with another pair, and only then does the whole group discuss. That way, individual concerns cannot be hidden in silent agreement.
InfoQ: What’s your advice for recognizing and overcoming biases?
Proença: I believe that a lot of misconceptions around cognitive biases should be clarified in the industry. One misconception around Affinity Bias in organizations is that the way to address it is to set up quotas (gender, racial, etc.), which are highly controversial; you don't necessarily need them. There are many other effective things you can do around the way you hire people into teams or leadership positions, like targeting ads for open positions at people from underrepresented groups, or outlining clear and objective criteria for evaluating candidates beforehand.
Kutz: Getting to know the biases and effects certainly helps a lot. They are there anyway. Everybody is influenced by these biases. It helps to give things a name and to know specifically which measures may help to mitigate that specific influence.
There’s no overcoming biases. Mitigation is the best we can possibly do unless we leave all our decisions to an AI (but who should choose the training data for that – oh my).
Understanding a set of biases helped me personally to recognize problematic decision processes and understand why certain techniques are helpful, while others make processes just more complicated.
For example, I knew several liberating structures. I tried them out and they felt good. Still, facilitating them can be an effort, so I didn't suggest them often. Now that I know about groupthink, the bandwagon effect, and other biases, I might see the need for such a measure earlier and apply them more purposefully, without bloating the process.
Proença: One of the things we have tried doing is not only "telling" people about cognitive biases, but rather making them experience the biases first-hand, as in the workshop that we did at Agile Tour Vilnius 2021. This helps them understand each bias a bit more deeply, and at the same time makes it clearer that most of us "suffer from them" and that it's not something to be ashamed of, but rather something to manage personally. As Michael says, I feel I've become much more "experienced" at noticing a bias happening in front of me, and usually that's the hardest part of overcoming it!