
Brain Bias: A Yale Psychologist Examines Common ‘Thinking Problems’


The sometimes counter-intuitive ways our brains work can raise big questions. Why do we procrastinate, dragging our feet even when we know we’ll regret it later? Why are misunderstandings and miscommunications so common? And why do people often turn a blind eye to evidence that contradicts their beliefs?

For Woo-kyoung Ahn, the John Hay Whitney Professor of Psychology at Yale, a better understanding of these kinds of questions is crucial.

In her new book, “Thinking 101: How to Reason Better to Live Better,” Ahn explores the ins and outs of so-called “reasoning errors” – or, as she describes them, “thinking problems” – and how they affect our lives, from how we perceive time to why we stereotype and profile others. Her work examines the complexities inherent in the science of how we think and ultimately offers solutions to overcome these errors in thinking.

The book is based on the lessons Ahn teaches in her undergraduate course “Thinking,” which has become one of Yale’s most popular courses.

In an interview with Yale News, Ahn discusses the nuances of these “thinking problems” and what steps we can take to reduce bias. The interview has been edited and condensed.

The book is based on your popular course, “Thinking.” What inspired you to create the course itself and condense a semester-long course into a book?

Woo-kyoung Ahn: I’ve been teaching for over 30 years and have always covered some of this material in various courses; I also taught a seminar and a graduate psychology course on thinking. But in 2016, I felt it was time to share this material more widely with those who don’t major in psychology.

One inspiration is this: it is now well known that people make a variety of reasoning errors, but these have mostly been discussed in the context of behavioral economics – that is, psychology in an economic context. For example, people have talked about “negativity bias” – the bias of overweighting negative [information] over positive information – in terms of loss aversion in transactions or the endowment effect. But I believe these irrational behaviors influence us in many other situations in our daily lives. And I wanted to teach that rationality matters not only in money management [and other material things] but much more broadly.

Another inspiration is that there hasn’t been much discussion about what to do once we notice we are committing these thinking biases. Recognizing them is not enough – just as recognizing that you suffer from insomnia is not enough to cure the insomnia. So I provided as many actionable strategies as possible.

What concepts did you choose to highlight in the book – and how did you decide which ones to focus on?

Ahn: In the course I cover many topics, such as creativity, moral judgments, and the effects of language on thought, among many others – which I may or may not cover in a book at another time. But to be honest, I really don’t have a good answer as to why these were chosen; I could have written a completely different book with the same title! One thing I tried was not to cover too much in one book. There are now websites with titles like “61 cognitive biases.” I just didn’t want to overwhelm readers. I also address more technical issues in the course, [like] models of causal learning or mathematical proofs of irrational choices, which I didn’t think were necessary for a book aimed at the general public.

A valuable insight that became a recurring theme in the book is that these thinking “problems” aren’t really about what’s wrong with us. I sometimes mention this in my course, but I became more convinced of it while writing the book.

How would you define or diagnose these “thinking problems”?

Ahn: The kind of flawed thinking that interests me most is thinking that is unfair. We can be unfair to ourselves and to others when we are inconsistent, biased, overconfident, or underconfident.

For example, I don’t think the well-known confirmation bias is necessarily a thinking problem.

It refers to a tendency to confirm what we already think or believe. In other words, we tend to look for evidence that supports what we believe, or to interpret evidence in light of what we know. It sounds pretty bad, but it’s actually quite an adaptive mechanism. For example, you go to a Stop & Shop three miles from your house and find that their apples are good. The next time you want apples, you could try another supermarket, but if your goal is just to get good apples, you might as well go back to the same one. In other words, when our goal is to survive, it is better to stick with what we know than to explore other possibilities.

Confirmation bias becomes a thinking problem when it causes us to draw conclusions that are unfair to ourselves or others. For example, say the CEO of a company hires only white people for leadership positions. They all do a reasonably good job, so the CEO comes to believe that race matters and continues to hire only white people. But this CEO never checked what would have happened if non-white people had been hired for those positions.

Confirmation bias can even hurt those who commit it. An experiment I conducted illustrates this. Participants took a saliva test. Half were told the results indicated they were at high genetic risk for major depressive disorder; the other half were told they were at no risk. We then asked all participants about the symptoms of depression they had experienced in the previous two weeks. Participants were randomly assigned to the two conditions, so there is no reason one group should have been more depressed over those two weeks than the other.

But those who were told they carried genetic risk for major depression reported being significantly more depressed, and their average score was higher than what might clinically be considered mild depression.

A recurring theme in the book is how we might better communicate ideas, especially given evidence that metrics – data, statistics, and the like – can be ineffective. How can public messaging campaigns best incorporate some of these ideas and concepts?

Ahn: I think public messaging campaigns should recognize how we are wired. For example, when a charity asks for donations, it should not use only statistics or abstract data, but also an anecdote about someone suffering from the problem it addresses. In a recent study, we presented participants with very disturbing images of COVID-19: images of COVID toes, burials of those who died of COVID, and so on. As we know, politically conservative people were generally less willing than politically liberal people to follow [Centers for Disease Control and Prevention] guidelines, at least at the time of the study. But when presented with these stark examples of people suffering from COVID-19, they became much more willing to comply with the guidelines.

You seem to be a big proponent of more dialogue between people, as that can help combat a lot of these cognitive biases that we have.

Ahn: Again, I tried to emphasize that those who commit cognitive biases are not bad people; these errors are part of our highly adaptive cognitive mechanisms. Let’s go back to confirmation bias to illustrate this once more. My favorite example is something that happened when my son was four. He asked me why a yellow light is called a “yellow” light. I didn’t understand the question, but I patiently told him it’s called a yellow light because it’s yellow. Then he told me it was [not yellow, but] orange. I said no. He insisted that I look at it, so I did. And it is orange.

I had no ulterior motive in calling it a yellow light or in interpreting the amber color as yellow, but because everyone called it a yellow light, I saw it as yellow all my life until he told me otherwise. What I committed was confirmation bias; I interpreted the color of a traffic light in light of what I already believed. We do this every moment of our lives. So we should not think that those who disagree with our views are a different kind of people; they just see the world from their own point of view.

Of course, there are people with ulterior motives and people convinced of their own rightness, but if we want to start a dialogue, we must first recognize [the biases] we all share.

Moreover, because these biases are essential elements of our survival, they are not easy to counter. So sometimes I offer actionable strategies – don’t guess what others might like, ask! – but in some cases we just need to focus on addressing the problems at hand rather than trying to change others.