The First-Instinct Fallacy

Grant defines this term in the Prologue of Think Again and returns to it several times throughout the book. The first-instinct fallacy is the common misconception that revising an original answer to a question is more likely to make it wrong. Students are frequently advised not to change their answers on tests in the mistaken belief that first-instinct answers are more likely to be correct. However, research demonstrates the opposite: a comprehensive review of the evidence across multiple studies shows that most answer revisions are from wrong to right.

The Scientist Mode

According to Grant in Chapter 1, being a scientist is not just a profession but a frame of mind. Individuals move into scientist mode when they search for the truth by rethinking what they know, running experiments in their daily lives to test hypotheses and discover knowledge. In scientist mode, the idea is not to start with answers or solutions but to lead with questions and answer them with evidence rather than instinct. The key to being able to rethink, says Grant, is to approach issues like a scientist: to search for reasons why one’s ideas might be wrong and to revise old views based on new learning.

The Preacher, Prosecutor, and Politician Modes

In contrast to scientist mode (see above), people often find themselves playing the role of the preacher, the prosecutor, or the politician, each of which hinders rethinking. Grant introduces these concepts in Chapter 1 and discusses them throughout the book. Preachers avidly defend their entrenched beliefs, even when the evidence doesn’t support them. Prosecutors go on the offensive, setting out to prove others wrong. Politicians focus on persuading others to share their perspective. While scientists prioritize curiosity, humility, and experimentation, preachers, prosecutors, and politicians cling more tightly to their established ways of thinking.

Desirability Bias

Introduced in Chapter 1 and discussed throughout the book, desirability bias is people’s tendency to see only what they want to see, letting their preferences cloud their judgment and preventing them from applying their intelligence to certain issues. It keeps even math geniuses from analyzing patterns that contradict their preexisting views. Grant regards desirability bias as the enemy of rethinking.

The Dunning-Kruger Effect

In a study by psychologists David Dunning and Justin Kruger, people who scored lowest on tests of logical reasoning, grammar, and sense of humor had the most inflated opinions of their skills. According to what is now termed the Dunning-Kruger effect, discussed primarily in Chapter 2, people who lack competence in a particular domain are more likely to be overconfident about their abilities in that domain. Grant believes this tendency matters because it compromises self-awareness and hinders rethinking.

Confident Humility

In Chapter 2, Grant defines confident humility as a mindset that allows people to have faith in their capability while appreciating that they may not have the right solution or even be addressing the right problem. This gives people enough doubt to reexamine old knowledge and enough confidence to pursue new insights. Learning to adopt a mindset of confident humility doesn’t just open people’s minds to rethinking. Grant believes it also improves the quality of that rethinking.

Motivational Interviewing

Discussed in Chapter 7, motivational interviewing is a practice designed to encourage change by helping people find their own internal motivation. It was developed by clinical psychologists Bill Miller and Stephen Rollnick. Simply telling people what to do is seldom a successful way to motivate them, Grant points out. Motivational interviewing instead helps people see new possibilities by prompting them to rethink their own views. The technique involves asking open-ended questions, engaging in reflective listening, and affirming the person’s desire and ability to change.

Binary Bias

Discussed in Chapter 8, binary bias is the tendency to seek clarity and closure on an issue by collapsing a complex continuum into two categories. Grant believes that presenting only the two extreme positions on an issue is unhelpful because it contributes to polarization. He suggests that the best way to overcome binary bias is to become aware of the wide range of perspectives across a given spectrum. As an example, he points to climate change: polls suggest that at least six schools of thought exist on the issue, yet only two camps are popularly discussed, believers and nonbelievers. One should guard against binary bias, says Grant, because it fosters an us-versus-them mentality that hinders rethinking.

Performance Culture

Discussed in Chapter 10, performance culture refers to a culture, usually in an organization, in which excellence of execution is the paramount value. While this sounds like a positive attribute, a performance culture can discourage rethinking because individuals fear being punished for questioning long-established best practices. In performance cultures, people tend to censor themselves in the presence of experts or bosses. Grant uses NASA as an example of an organization whose performance culture led to catastrophic mistakes. A performance culture is the opposite of a learning culture.

Learning Culture

Discussed in Chapter 10, learning culture refers to a culture, usually in an organization, where growth is the core value and where cycles of rethinking are routine. In learning cultures, the norm is for people to know what they don’t know, doubt their existing practices, and stay curious about trying new routines. Workers are not afraid of reprisal from leadership for taking risks. Evidence shows that in learning cultures, organizations innovate more and make fewer mistakes. Grant uses Amazon as an example of a learning culture where workers are encouraged to experiment. A learning culture is the opposite of a performance culture.

Psychological Safety

In the context of Think Again, as introduced in Chapter 10, psychological safety refers to employees’ ability to take risks in the workplace without fear of being punished. Grant points out that although the term has become a buzzword in many workplaces, leaders often misunderstand what it means. Psychological safety is not a matter of relaxing standards and giving unconditional praise; it means fostering a climate of respect, trust, and openness in which people can raise concerns and suggestions without fear of reprisal.

Escalation of Commitment

Introduced in Chapter 11, escalation of commitment describes the tendency to double down and sink more resources into a plan even when it isn’t going well. Although sunk costs play a role, Grant suggests that the main causes of escalation of commitment are psychological rather than economic: because humans tend to rationalize, they constantly search for ways to justify their prior beliefs and validate their past decisions. Escalation of commitment is a refusal to rethink and a major factor in preventable failures.