Design better by avoiding your cognitive biases

Good UI design is all about guiding attention to what’s important. If you want to make the right thing for the user also the easy and obvious thing, you can’t ignore cognitive biases. After all, these biases are mental shortcuts that let us make decisions and react to our environment quickly and effortlessly.

A lot has been published about what designers should know about the cognitive biases of their users. It’s curious, though, that so much human-centeredness rarely acknowledges that we designers are mere humans too. And we’re subject to exactly the same biases as our users!

In this post I will present seven cognitive biases, show how they can mess up your design work, and suggest what you can do to avoid that.

The Ikea effect

Trong Nguyen gave me the idea to write this post. He writes:

The more time you spend on your design, the more attached you grow to it. So it becomes difficult for you to receive feedback and change your design.

This is the Ikea effect: the bias that makes people value things they created themselves disproportionately.

What you can do about it

Trong’s solution:

Share your work with other designers and your team members earlier; you will get early feedback from them. That way you can change your design early, without throwing away a whole month of effort.

I’ll add that in my experience, much of the best design work is done when designers facilitate a process where many people contribute to finding user-centered solutions. When you see facilitating that process as your main objective, the output feels less like your own creation to begin with. You still need to be open to others offering suggestions to improve that process though!

The Dunning-Kruger effect

This one is my favorite. Every venture I’ve supported at my job at BCG Digital Ventures has been very different from the previous: different product, different industry, different target group.

Do you know that overwhelming feeling of this is so much, I don’t know where to start? Then you probably also know the feeling a few weeks later, when you’re like, this is awesome, I’ve learned so much. Now I understand the German healthcare system/European freight forwarding/industrial material selection/[insert the context of your latest project here]! That’s why this is my favorite bias. Consciously experiencing it is one of the best things about being a designer. But only a couple of user observation sessions later, you’ll find out that you actually know less than you thought you knew even at the start.

That is the Dunning-Kruger effect. It describes how people with only a little bit of knowledge about a topic vastly overestimate their own abilities.

Graph with a curve showing perceived knowledge increasing much faster than actually gained knowledge. After a quickly reached peak, perceived knowledge decreases rapidly until it steadily builds up again, closer to actually gained knowledge.
Maybe I’m getting myself Dunning-Krugered into thinking I’m equipped to write about cognitive biases in the workplace?

What you can do about it

Plan enough time to research users and the product context. Validate ideas with your users while they’re still sketches; don’t waste time adding details to a concept that’s wrong to begin with.

If someone who reports to you wants more responsibility, make sure you have information that backs up that they’re ready for it. After all, after reading a couple of management books one could believe they’ve figured everything out and are ready to lead a team. I certainly did!

Not Invented Here

Not Invented Here isn’t a well-researched bias just yet, but it is often cited as a reason why organizations reject an existing solution that would be optimal for them, and subsequently create something new (and often sub-par) instead.

I think I may personally have had some NIH when I wasn’t interested in articles published by companies I was competing with. And when (years ago!) I wasn’t keen on using design component libraries from third parties.

One of the reasons I’m in design to begin with is that I like coming up with new solutions and doing weird experiments that nobody has done before. Arguably, that attitude prevents me from picking the most obvious and effective solutions to some problems.

What you can do about it

Start the design process by looking for cheap and easy-to-build ideas. That should force you to look at existing solutions, perhaps even those created by competitors. They’re not necessarily bad! Always consider technical feasibility and economic viability when comparing solutions.

Not Invented Here syndrome is most often observed in organisations. Ask yourself: why do your peers, reports and managers not propose to do what competitors do? Is it possible they believe you wouldn’t like it, because they didn’t come up with it themselves? If it’s common in your industry to have disdain for others, NIH may be even more important. The movie Moneyball is all about that.

If you’re working for a leading organization, it may suffer from the handicap of a head start. All your competitors are looking at you and can quickly copy the features, methods and tactics that took you a long time to develop. But they may tweak and improve them. So don’t discard copycats as inferior. Look at them closely: what did they change, and why? Perhaps you can take that a step further and make your project open source. Some big contributions may come from people outside your organisation, but that doesn’t make them any less valuable.

Inattentional blindness

I don’t think there’s a more convincing way to explain this phenomenon than this video. Please watch it, it’s less than a minute:


Know it already? Here’s another brilliant one.

What you can do about it

Get users to test your work. They pay attention to different things than you do. I know, it’s the kind of advice you’ve heard a thousand times. But inattentional blindness makes clear that even if you think your design is as obvious as anything, it’s possible you literally don’t see what is wrong with it.

Survivorship bias

In the Second World War, coming back in one piece after flying over enemy territory was about as likely as winning a coin toss. Of course, all sides involved wanted more of their crews and airplanes to survive bombing runs. Because plating a plane completely with armor would make it too heavy, armor was added only to the most important places. To find out what those important places were, the American navy analyzed where planes returning from their missions had suffered the most damage. Just before it was decided to armor the wing tips and the central part of the fuselage, statistician Abraham Wald stepped in. Do you see the mistake the navy almost made? Wald pointed out that only airplanes that returned to base were included in the analysis: the planes suffering the most damage had already crashed before that!

This historical anecdote comes from You Are Not So Smart, where you’ll find a lot more about survivorship bias.
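For the statistically minded, Wald’s insight can be made concrete with a toy simulation. Everything below (the section names, hit counts and down-probabilities) is invented purely for illustration, not historical data:

```python
import random

# Toy simulation of survivorship bias (illustrative numbers only).
# Hits to the "engine" are far more likely to bring a plane down
# than hits to other sections.
random.seed(42)

SECTIONS = ["engine", "fuselage", "wings", "tail"]
DOWN_PROBABILITY = {"engine": 0.8, "fuselage": 0.1, "wings": 0.1, "tail": 0.2}

all_hits = {s: 0 for s in SECTIONS}       # damage on every plane that flew
returned_hits = {s: 0 for s in SECTIONS}  # damage visible on returning planes

for _ in range(10_000):
    hits = random.choices(SECTIONS, k=3)  # each plane takes three hits
    survived = all(random.random() > DOWN_PROBABILITY[s] for s in hits)
    for s in hits:
        all_hits[s] += 1
        if survived:
            returned_hits[s] += 1

# Among *returning* planes, engine damage looks rare. Not because engines
# are rarely hit, but because engine hits rarely come home to be counted.
for s in SECTIONS:
    print(f"{s}: {all_hits[s]} hits total, {returned_hits[s]} seen on returners")
```

Counting only the returning planes, the engine looks like the safest place to leave unarmored, which is exactly backwards. That is the trap the analysis almost fell into.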

What you can do about it

Most of us aren’t designing bombers—luckily. But we do try to learn from our environment. We look to successful people and successful products for information on how to succeed.

Success in the market is all about not failing. Just like with the WWII bombers, we can often learn more from failed products than from successful ones. Take the iPhone. Why was it successful? One could believe it was the touchscreen. Or the App Store. The integration with the music store? But Windows phones had all of that too, and we don’t hear much from them anymore.

So when analyzing a market, include companies and offerings that no longer exist. Why did they fail? As a designer, be careful assuming it was because of inferior design and that you would do a better job. How can you be sure that the design was so bad that it stopped people from buying?

Information bias

Ever found yourself stuck on a hard design problem and tried to solve it by getting more inspiration or doing more desk research? That may be information bias at work: the tendency to seek information even when it cannot affect action. I certainly had some of it while working from home alone.

What you can do about it

Take research seriously and use the scientific method. Make your hypotheses explicit, be critical of your testing methods, separate gathering data from analysis.

Form habits to frequently do ideation sessions. From the moment you start your research, set regular time apart for making quick sketches. This can get you out of the information seeking mode into creation mode.

Put breaks in your planning (and take them!). It’s okay to be in research mode after hours of web searches and reading articles. It’s okay that ideas don’t come immediately when you take a blank sheet of paper after staring at a screen for a while. Go for a walk, let the information sink in, chat with someone. Your brain is not a computer, so don’t use it like one.

Confirmation bias

Confirmation bias is The Mother Of All Biases. Well, maybe not all of them, but if you believe that and look for supporting evidence, you’ll likely find it. And that’s exactly the problem with it. Confirmation bias makes us see what we already know or believe and makes us ignore and discard evidence against it. Confirmation bias is with me and you, every day, at all times. In the design process its ugliest effect may be in user research.

Say you’re interviewing users after they tested some design concepts. People rarely speak in full sentences without a script. They gesture and refer to things they see. That makes their feedback subject to your interpretation. And that interpretation may very well be what you want to hear instead of what they really mean. The stronger your belief in the quality of one of the solutions in the test, the more likely your observations are skewed towards it. And of course, the opposite can happen too!

Or say you’re preparing a presentation forecasting how your new product is going to be super successful. You look up some supporting facts online. Oops, now you’ve stopped doing good research. Now you’re ignoring the information that would tell you something different: information you could learn from to avoid perhaps the biggest risks for your project. You may be doing this to get an okay from your manager, but in the meantime you’re changing the perception of reality for you and your team.

Confirmation bias can make a project snowball out of control, where initial wrong assumptions aren’t challenged but reinforced with wrong or incomplete data. This is one of the reasons startups fail even after years of work and several rounds of funding.

What you can do about it

Because confirmation bias is the basis for so many biases, there’s really no way to avoid it completely. But there are some things you could do to avoid important mistakes.

Avoid doing user testing on your own. Another person in the session can correct you when they notice your interpretation of user feedback is skewed, or when you’re ignoring relevant observations. If your team is big enough, you can agree to always let user testing be run by people who didn’t work on the solutions, so that they have weaker preconceptions to begin with. Set evaluation criteria for all solutions before executing the test: not all qualitative research has to be exploratory research.

When it comes to desk research and forecasts, it can be useful to think in options and ranges instead of roadmaps and single values. Be upfront about risks: when you present to investors, that’s likely what they care about most anyway. After all, the trick of building a successful venture is not being ‘better’ than the competition. It’s not failing before the competition fails. Avoiding stupidity is easier than seeking brilliance (sidenote: credits to Farnam Street and Charlie Munger).

Trust your design process and be proud of finding results that go against your intuition. It’s awesome to be able to say to your product manager: I had the wrong intuition before, but now I have reliable facts that show us the right answer to our problem. And remember: experienced designers may be good at coming up with user-friendly solutions, but they’re often wrong when predicting which of their solutions will perform best.

Can you really outsmart your brain?

Cognitive biases are hardwired and skew our perception of reality. Just knowing about cognitive biases doesn’t make you immune to them. They’re like visual illusions. Knowing that in the image below square A is just as bright as square B doesn’t help you assess that correctly the next time you see a similar image in a different context.

A visual illusion showing a checker board with squares that seem to be of different lightness because of the context in which they appear.
The checker shadow illusion

The other nasty thing about biases is that they can be easy to spot in other people’s reasoning, but much harder to spot in your own. And because you don’t feel you’re prone to them, it’s hard to accept when somebody points out that your brain is playing tricks on you. You can challenge the people you work with, but be gentle with them. Consider that you, too, may be wrong.

The same image as before, with the two equally colored squares connected with the same color.

The most effective way to reduce the effect of biases is to change what you perceive, not how you process it. That means changing the environment and the people around you. The solutions I propose do exactly that: they’re methods and habits that should help you get better perspectives.

References