
5 Books to Help Us Understand Our Internal Biases

There's a damaging and pervasive idea that's been floating around for too long: the idea that we should never change our minds, that we should be steadfast in our beliefs and never second-guess ourselves. This, apparently, is a great virtue. There's just one problem. By subscribing to this belief, we're also resigning ourselves to being wrong – a lot.

And no one wants to be wrong. It's uncomfortable. Most of the time we'd rather cling to old arguments than admit we made a mistake and update our thinking in light of new evidence or information.

Writer Julia Galef gave a TED talk on this very topic in 2016, titled "Why you think you're right, even if you're wrong." In it she identified two types of people: soldiers and scouts. Soldiers will defend their beliefs at all costs, spurred by the desire to be right and win the argument, while scouts will stay open to new information, spurred by curiosity and the desire to see as objectively as they can.

In a piece on learning to be okay with changing your mind, Galef writes that not wanting to be wrong is "so automatic that it's hard to notice it coloring your judgment unless you really pay attention, but once you do, you realize how frequently it makes you grasp for a fallacious argument just so you don't have to admit to yourself that you were wrong."

Her theory is that if we want to actually be right more often, we need to get comfortable with being wrong: to keep ourselves open to new information and to be proud of changing our minds when we do.

Here are five books that can help.

Mistakes Were Made (But Not by Me)

Admitting we were part of the problem is never comfortable. Our brains aren't wired to work that way. Tavris and Aronson explain how we are wired in exactly the opposite way: for self-justification. Imagine you were part of a group project that went disastrously wrong. Most people are unlikely to sit back and think critically about their own role in it, the mistakes they made, and where they could have performed better – but they could easily compile a list of reasons why Simon or Jane or Andy was wrong and how they contributed to the failure. This book is a study of self-deception and how to overcome it.

You Are Not So Smart

We think we're rational thinkers. We usually believe our decisions are based on logic. We think we are independent thinkers who know when we are being influenced. Not so fast. McRaney has some cold, hard reality for you: you're just as delusional as everyone else. Sounds harsh, but it's liberating if you let it be. In a witty and engaging way, You Are Not So Smart helps us understand why we think the way we do, why we assume the best of ourselves, and how to recognize when we're fooling ourselves.

The Invisible Gorilla

You've probably heard of the gorilla experiment. A group of people, some in white shirts and some in black shirts, pass basketballs around in a room. Viewers are asked to count how many passes the people in the white shirts make during the clip. Half of them, focused on counting the passes, fail to notice that a person dressed as a gorilla walks across the screen during the video. It's a test of selective attention, and it reveals how often we miss things going on around us, even when they're staring us in the face. Chabris and Simons examine the everyday illusions that prevent us from seeing clearly and explain why we shouldn't assume we ever have the full picture. The book will make you feel less sure of yourself, the authors say, and that's a good thing.

The Believing Brain

Which comes first, the belief or the reasoning? After thirty years of research, Michael Shermer, a psychologist and historian of science, believes it's the former: the belief comes first and the explanation follows. In other words, once our brain forms a belief out of the information we receive and the patterns we find in the world around us, we start looking for evidence with which we can explain and support that belief. It's basic confirmation bias. We look for information that supports and confirms our existing opinions. Our brains go to great lengths, almost obsessively, to reinforce our beliefs as correct, which ultimately creates a loop of belief confirmation and cuts us off from discovery.

Think Twice

People make major decisions every day. Huge, important decisions that affect lives. And yet so often they turn out to be wrong. According to Mauboussin, we fall victim to simplified mental routines that lead to seriously bad judgment, even when the stakes are high. Simplified thinking prevents us from seeing the true complexity of a problem. Whether it's groupthink, over-reliance on expert opinion, failure to consider enough alternatives, or misunderstanding cause and effect, there are so many ways we can go wrong without having any idea. This book helps us identify when it's time to think twice and reexamine our decision-making process.


People are often made to feel stupid, uncommitted, flippant, or flaky for changing their minds – but never changing your mind isn't something to be proud of. It just means you're wrong more often than you think you are.

If we can understand our own internal biases, even when it's difficult and uncomfortable, it can help us understand each other a little better, even when we disagree. Especially when we disagree. And, given the acrimonious times we live in, it might be particularly helpful to learn this skill.

One of my favorite pieces of writing on this topic is by Dr David Robert Grimes, a science writer and physicist at Oxford University. He argues that we should feel no shame in changing our minds in light of new information, and that we should be just as committed to spotting flaws in our own reasoning as we are to picking holes in the arguments of others.

But I'll leave the final word to Fyodor Dostoyevsky: "Above all, avoid falsehood, every kind of falsehood, especially falseness to yourself. Watch over your own deceitfulness and look into it every hour, every minute."

Danielle Ryan is a​ ​freelance journalist based in Budapest, Hungary. She writes about media, geopolitics, and books.​​ Follow her on Twitter: @DanielleRyanJ