In “The Knowledge Illusion,” the cognitive scientists Steven Sloman and Philip Fernbach hammer another nail into the coffin of the rational individual. From the 17th century to the 20th century, Western thought depicted individual human beings as independent rational agents, and consequently made these mythical creatures the basis of modern society. Democracy is founded on the idea that the voter knows best, free market capitalism believes the customer is always right, and modern education tries to teach students to think for themselves.
Over the last few decades, the ideal of the rational individual has been attacked from all sides. Postcolonial and feminist thinkers have challenged it as a chauvinistic Western fantasy, glorifying the autonomy and power of white men. Behavioral economists and evolutionary psychologists have demonstrated that most human decisions are based on emotional reactions and heuristic shortcuts rather than rational analysis, and that while our emotions and heuristics were perhaps suitable for dealing with the African savanna in the Stone Age, they are woefully inadequate for dealing with the urban jungle of the silicon age.
Sloman and Fernbach take this argument further, positing that not just rationality but the very idea of individual thinking is a myth. Humans rarely think for themselves. Rather, we think in groups. Just as it takes a tribe to raise a child, it also takes a tribe to invent a tool, solve a conflict or cure a disease. No individual knows everything it takes to build a cathedral, an atom bomb or an aircraft. What gave Homo sapiens an edge over all other animals and turned us into the masters of the planet was not our individual rationality, but our unparalleled ability to think together in large groups.
As Sloman and Fernbach demonstrate in some of the most interesting and unsettling parts of the book, individual humans know embarrassingly little about the world, and as history progressed, they came to know less and less. A hunter-gatherer in the Stone Age knew how to produce her own clothes, how to start a fire from scratch, how to hunt rabbits and how to escape lions. We today think we know far more, but as individuals we actually know far less. We rely on the expertise of others for almost all our needs. In one humbling experiment, people were asked to evaluate how well they understood how a zipper works. Most people confidently replied that they understood it very well — after all, they use zippers all the time. They were then asked to explain how a zipper works, describing in as much detail as possible all the steps involved in the zipper’s operation. Most had no idea. This is the knowledge illusion. We think we know a lot, even though individually we know very little, because we treat knowledge in the minds of others as if it were our own.
This is not necessarily bad, though. Our reliance on groupthink has made us masters of the world, and the knowledge illusion enables us to go through life without being caught in an impossible effort to understand everything ourselves. From an evolutionary perspective, trusting in the knowledge of others has worked extremely well for humans.
Yet like many other human traits that made sense in past ages but cause trouble in the modern age, the knowledge illusion has its downside. The world is becoming ever more complex, and people fail to realize just how ignorant they are of what's going on. Consequently, some who know next to nothing about meteorology or biology nevertheless conduct fierce debates about climate change and genetically modified crops, while others hold extremely strong views about what should be done in Iraq or Ukraine without being able to locate those countries on a map. People rarely appreciate their ignorance, because they lock themselves inside an echo chamber of like-minded friends and self-confirming newsfeeds, where their beliefs are constantly reinforced and seldom challenged.
According to Sloman (a professor at Brown and editor of the journal Cognition) and Fernbach (a professor at the University of Colorado’s Leeds School of Business), providing people with more and better information is unlikely to improve matters. Scientists hope to dispel antiscience prejudices by better science education, and pundits hope to sway public opinion on issues like Obamacare or global warming by presenting the public with accurate facts and expert reports. Such hopes are grounded in a misunderstanding of how humans actually think. Most of our views are shaped by communal groupthink rather than individual rationality, and we cling to these views because of group loyalty. Bombarding people with facts and exposing their individual ignorance is likely to backfire. Most people don’t like too many facts, and they certainly don’t like to feel stupid. If you think that you can convince Donald Trump of the truth of global warming by presenting him with the relevant facts — think again.
Indeed, scientists who believe that facts can change public opinion may themselves be the victims of scientific groupthink. The scientific community believes in the efficacy of facts, hence those loyal to that community continue to believe they can win public debates by marshaling the right facts, despite much empirical evidence to the contrary. Similarly, the traditional belief in individual rationality may itself be the product of groupthink rather than of empirical evidence. In one of the climactic moments of Monty Python’s “Life of Brian,” a huge crowd of starry-eyed followers mistakes Brian for the Messiah. Caught in a corner, Brian tells his disciples: “You don’t need to follow me, you don’t need to follow anybody! You’ve got to think for yourselves! You’re all individuals!” The enthusiastic crowd then chants in unison: “Yes! We’re all individuals!” Monty Python was parodying the counterculture orthodoxy of the 1960s, but the point may be true of the belief in rational individualism in other ages too.
In the coming decades, the world will probably become far more complex than it is today. Individual humans will consequently know even less about the technological gadgets, the economic currents and the political dynamics that shape the world. How could we then vest authority in voters and customers who are so ignorant and susceptible to manipulation? If Sloman and Fernbach are correct, providing future voters and customers with more and better facts would hardly solve the problem. So what’s the alternative? Sloman and Fernbach don’t have a solution. They suggest a few remedies like offering people simple rules of thumb (“Save 15 percent of your income,” say), educating people on a just-in-time basis (teaching them how to handle unemployment immediately when they are laid off) and encouraging people to be more realistic about their ignorance. This will hardly be enough, of course. True to their own advice, Sloman and Fernbach are well aware of the limits of their own understanding, and they know they don’t know the answer. In all likelihood, nobody knows.