Economists have long assumed individuals are rational. Behavioral economics undermined this assumption. Most economists now accept that humans are often beset by various forms of irrationality, although the normative implications of this conclusion are still widely debated (see Mario J. Rizzo and Glen Whitman, Escaping Paternalism: Rationality, Behavioral Economics, and Public Policy, New York: Cambridge University Press, 2020). That individuals exhibit significant lapses of rationality is often viewed as regrettable. Stuart Vyse disagrees. The central thesis of The Uses of Delusion: Why It’s Not Always Rational to Be Rational is that in some cases “our irrationalities are features, not bugs” (p. 7). Sometimes, it is rational to be irrational.

Vyse’s thesis might sound contradictory. It is not. His thesis relies on a distinction between epistemic and practical rationality. Practical rationality is a standard for evaluating actions. An agent is practically rational if she chooses actions that serve her goals. Epistemic rationality is a standard for evaluating beliefs. An agent is epistemically rational if she undergoes a reasonable search for evidence before forming beliefs; furthermore, her beliefs must be grounded in this evidence. Vyse argues that epistemic irrationality can be practically rational. That is, by sometimes not searching for or ignoring available evidence, agents are better off in terms of achieving their goals than they would be if they did perform a more thorough search for evidence or made better use of this evidence when forming beliefs.

After an introductory chapter where Vyse presents his thesis, each chapter of the book highlights a different kind of epistemic irrationality and how it can be practically rational. Let me walk through a few examples. In chapter two, Vyse presents a bevy of empirical studies demonstrating that individuals tend to overestimate their abilities. For instance, 87.5 percent of Americans believe they are above-average drivers; most college students believe they have more friends than average (p. 20). As two behavioral economists summarize these sorts of findings, “on almost any desirable human trait, from kindness to trustworthiness to the ability to get along with others, the average person consistently rates him- or herself above average” (Nicholas Epley and Thomas Gilovich, “The Mechanics of Motivated Reasoning,” Journal of Economic Perspectives 30 [2016]: 135). While these sorts of beliefs are clear cases of epistemic irrationality (not everyone can be above average!), Vyse argues they can sometimes be practically rational, in that they help individuals achieve their goals. For instance, overconfident athletes will perform better in competitive matches than athletes with a realistic sense of their abilities (p. 28). Moreover, “confident entrepreneurs are more resilient during setbacks and more likely to take on subsequent ventures after an initial failure” (p. 31).

As another example, chapter four examines rituals and superstitions. Many of us indulge in such foibles: we might wear our lucky jersey on gameday or always eat scrambled eggs on sourdough toast before giving a public lecture. Like overestimation of our abilities, such epistemic irrationalities—the belief that a lucky jersey or certain breakfast will make a difference—can help individuals better achieve their goals. For instance, telling college students in a putting contest that their golf ball is lucky may increase their accuracy (p. 62). Rituals help us control our anxiety. In Vyse’s words: “anxiety is often felt as a loss of control. The successful performance of a rigid sequence of actions restores a sense of order and mastery over the physical world” (p. 66).

As a final example, chapter nine examines illusions of control. In many cases, we believe we are in control when we are not (e.g., when hitting the crosswalk button at a busy intersection). In other cases, we believe we are not in control when we really are (e.g., Ouija boards). Like the prior examples, illusions of control can help agents better achieve their goals. For instance, feeling that we are in control even if we are not “helps us gain a sense of achievement and autonomy” (p. 161). Moreover, to assign credit or blame to someone for their conduct requires us to believe that they are in control of their conduct.

Vyse convincingly demonstrates that, in some cases, epistemic irrationality is practically rational. What I am left wondering is what to do with this information. One cannot simply embrace epistemic irrationality as a way of life. This is because, as Vyse notes, epistemic irrationality can often be practically irrational, in that it can often lead agents to pursue actions that do not serve their ends very well. Consider again overestimation of our abilities. Just as it can lead to good outcomes, it can also lead to bad outcomes. Overconfidence can lead bankers to make risky investments and politicians to launch foolish wars. In fact, Vyse notes the conditions under which overconfidence is likely to lead to poor outcomes (pp. 36–37). Overconfidence is good in cases that involve moderate risk, require persistence and skill, have short-term cumulative impact, and depend on known factors. Overconfidence is bad when there are substantial downside risks, when decisions are irreversible, when long-term obligations may be incurred, and when there are many unknowns.

So, one cannot simply embrace epistemic irrationality and expect one’s life to go well. Perhaps we should only embrace it when doing so will lead to good outcomes. If epistemic irrationality would benefit you in your current circumstances, then you should choose to be epistemically irrational; if epistemic irrationality would not benefit you in your current situation, then you should choose epistemic rationality. Is this actually possible, though? Can a person consciously choose to be epistemically irrational? I have doubts. To be clear: I do not doubt that in many cases we are epistemically irrational—this seems undeniable. Rather, I doubt we can pick and choose when epistemic irrationality befalls us.

To see why, let us break the requirements of epistemic rationality into two components: first, an agent must make a reasonable search for available evidence; second, her beliefs must be grounded in this evidence. Consider the latter component. I do not think agents can choose to ignore evidence already in their possession. For instance, suppose I want to be overconfident heading into a basketball game. Before the game I receive a report from a scout that contains an evaluation of my athletic performance. The report is a sobering analysis of my many faults as an athlete. How can I possibly ignore this information? How can I unsee what I have seen? To say I can ignore the evidence and form beliefs contrary to it is to embrace what philosophers call doxastic voluntarism, the view that individuals can choose their beliefs at will. That view has few defenders, because brief reflection reveals that we have little direct control over our beliefs. I cannot make myself believe there is a purple elephant dancing across my living room. Likewise, I cannot make myself believe that I am a good basketball player after reading an expert's report that says I am not.

While individuals cannot ignore evidence already in their possession, they can choose not to seek out evidence they suspect might hurt them. Suppose the scouting report is emailed to me right before I leave for the game. Because the email might contain information that lowers my confidence, I choose not to open it. By leaving it unopened, I choose epistemic irrationality. Yet while not seeking out evidence is possible in ways that ignoring evidence already in one's possession is not, consciously choosing to do so likely counteracts whatever positive effects epistemic irrationality might have. If I decline to open the email so I can be overconfident for the game, then in the back of my mind I will know that the email may contain damning evidence about my athletic abilities. This knowledge will likely undermine my confidence, and hence my athletic performance.

Because epistemic irrationality is not always good, and because we cannot pick and choose when to be epistemically irrational, I am not sure what the takeaway of Vyse’s book is. Perhaps the takeaway is to simply not fret so much if you do exhibit various forms of epistemic irrationality; after all, such foibles may help you lead a better life, even if this is not immediately obvious to you. That is a message worth embracing. Because of Vyse’s book, next time I wear my lucky gameday jersey or perform my pretalk ritual, I will not feel so silly.

Brian Kogelmann
West Virginia University