Dogmatic Delusions: Can Neuroscience and Social Psychology Help Us Understand How America Has Lost Its Mind?

When was the last time you employed reasonable arguments to try to convince an anti-vax friend to vaccinate their children? Or the last time you tried to convince your crazy uncle that the CIA wasn’t spying on him? We’ve all felt exhausted from trying to reason with people who hold dogmatically to delusions about important issues. The recent election highlighted tens of millions of people espousing bizarre QAnon conspiracy theories, delusions that could change the destiny of the country. Reason is a core value of humanism, and right now, reason seems to be taking some punches.

Can applied reason, in the form of science, help us to understand these dogmatic delusions? The short answer is not yet, at least not fully. At present there’s no scientific consensus about what’s going on in the brains of people engaged by dogma or delusion. What evidence we have comes from a few studies with small samples that found modest effects. So we should be cautious about drawing conclusions and skeptical of one-size-fits-all explanations.

On the other hand, these small studies put together tell a coherent and consistent story about the psychological processes of dogmatic delusion, and much of the evidence makes sense in terms of what we know about human brain function and evolution. This story may not be appealing to those of us who value reason, but it does both explain why most of our efforts at reasoning with people in the thrall of delusional beliefs have been futile and point toward alternative (admittedly harder) approaches that may work better.

Dogmatic People

Dogmatism is a personality characteristic that describes an intolerant and inflexible conviction in one’s own beliefs. These beliefs may be about health regimens, child-rearing, or best business practices. They can be ideological, religious, or political. Although many Humanist readers can easily identify people with these characteristics in religious institutions or on the right wing of American politics, this description of dogmatism fits some secular progressives as well.

Acute observers through the ages have noted how often rigid thinking or intolerance plays a key role in various human tragedies, but it was only after the horrors of World War II that psychologists attempted to define and measure these traits. In the late 1940s psychologists at the University of California, Berkeley, including Theodor Adorno, wrote one of the most famous books in American psychology, The Authoritarian Personality. Adorno devised an “F (for Fascism) scale” to measure authoritarianism. While this scale seemed to provide a retrospective explanation for some WWII criminals, it didn’t predict much about how ordinary people would behave. Although many people still use “authoritarian” casually to describe objectionable people (e.g., a bad boss) to their friends, the trait has been hard to define clearly, and the F scale is not used much in psychological research now. In 1960 Milton Rokeach attempted to adapt the F scale to peacetime society with his “D (Dogmatism) scale.” In 1996 Bob Altemeyer proposed to define dogmatism as an unchangeable and unjustified certainty in one’s beliefs, which he measured with a new DOG scale. Unfortunately, statisticians found as little correlation among items on the DOG questionnaire as on previous scales. So, although most of us feel confident that we can recognize it when we see it, “dogmatism” is harder to identify objectively.

Psychologists have also tried to find other personality traits that can explain dogmatism. One study found that people who had an anxious attachment style (i.e., “clingy” close personal relationships) seemed to be somewhat more dogmatic in their opinions. People who were overconfident in their initial judgments were also rated slightly more dogmatic. People with poor working memory seem more likely to be dogmatic, which makes sense if you think that nuanced thinking requires holding several distinct ideas or items of information in mind. However, none of these studies found large effects, and none provided an underlying explanation for dogmatism.

Some useful results have come from brain imaging. In an early study, Sam Harris read statements about religious beliefs and commonly agreed-upon facts (e.g., there are forty-eight states in the contiguous US) to observant Christians and to atheists while recording their brain activity using functional MRI. He reported several conclusions, including that disbelief seems to make most of the brain work harder than belief, with one important exception: believers seem to engage a brain region known as the ventromedial prefrontal cortex when making affirmations of faith. Jonathon Howlett and Martin Paulus published a similar study in 2015 contrasting brain activity in response to testable versus non-testable statements, and Harris did a follow-up study on political (instead of religious) beliefs in 2016. These studies, too, support the finding that the ventromedial prefrontal cortex is critical for belief but not for reasoned assent to facts.

So, what is the ventromedial prefrontal cortex? It’s a large brain area about two inches behind the bridge of your nose that is strongly activated by feelings about close social relationships: feeling guilty if you’ve hurt somebody you care for, for example, or feeling grief at the loss of a loved one. This area is also active if you’re faced with a moral dilemma involving harm you might do. And it’s overactive in many depressed people. A plausible interpretation of the imaging results about belief, described in the last paragraph, might be that expressions of faith activate the same brain region required for maintaining close social networks.

Delusions

A delusion is a persistent, often bizarre, false belief maintained despite compelling and readily available evidence to the contrary. We often hear of delusions as symptoms of psychosis or sequelae of certain neurological disorders.

We don’t have a clear understanding of the brain processes leading to delusions in mental illness. Numerous studies point to excess dopamine, but the evidence is conflicting: many deluded people have dopamine levels clearly higher than normal, but many don’t. And we don’t know what causes the high dopamine levels in those who do; dopamine levels are only very weakly related to genes. We often hear about dopamine in popular science articles as a reward signal, but in fact your brain also produces dopamine to focus your attention in aversive situations. Dopamine is the neurotransmitter most important for the feeling that something is really worth paying attention to.

Many psychologists, following Karl Jaspers, now think that unusual feelings precede delusions: a person starts to feel that many unremarkable events or coincidences are somehow significant or suspicious. As the person tries to make sense of their agitated feelings, they come to explain their experiences as indicators of a wider (fictitious) pattern, which becomes a full-blown delusion later.

Although most of us don’t experience these extreme kinds of delusions, many of us have been in the grip of delusion at some point, particularly about romantic partners; and most of us hold persistent delusions about events in our past. These garden-variety delusions shed light on some general mechanisms. Have you ever reminisced about old times with college buddies you haven’t spoken with for a decade or more? We often find that our memories disagree substantially with those of others who were there. Even more disturbing is when something that we know happened to us is recounted by someone else as if it happened to them. These kinds of things are quite common.

Research into how such false memories form finds several mechanisms consistently at work. First, we generally recall things in a way that puts us in a favorable light. Second, we like versions of events that can be seen as foreshadowing a theme important to our broader life narrative. Third, as we tell our stories to others, we often shade or even distort them in ways that galvanize our audience and draw their attention—and a common factor in many distorted memories is retelling the story to others who weren’t present at the original event. As we reminisce, our brain connections are malleable so that some of those shadings work their way into our own synapses and become part of our own memories, leaving us with a firm conviction in our version of events. But one of the best-replicated findings of cognitive psychology is that human memories, even for very significant events, are often faulty.

Shared Dogmatic Delusions

Up to this point I’ve discussed mostly dogmatic personalities and individual delusions. These may hurt no one, or only a few people. The danger comes when groups of people are dogmatic about a shared delusion. Why do conspiracy theories such as QAnon spread? And what can we do about them?

It’s tempting for us to seek one villain behind the proliferation of misinformation—a Dr. No of Delusion. We might take that villain to be the Koch brothers, neoconservatives, Fox News or, more generally, social media or the religious right. Somebody must be behind all this, right? In fact, it’s a misleading cognitive heuristic, common to many conspiracy theories, that a deplorable outcome must have a proportionately malevolent cause. We humanists should eschew simple all-encompassing explanations of complex social trends, however tempting. People get drawn into conspiracy theories and other dogmatic delusions from a variety of feelings and individual motivations.

By analogy with Jaspers’s idea about individual delusions, I suggest that many people in the United States today feel manipulated, demeaned, and ignored by their bosses and by the corporations and other large organizations that make decisions affecting them without inviting their participation. Humiliation is a powerful emotion, and these aversive experiences focus attention with a surge of dopamine. These people feel there’s a pattern but don’t know how to think about their situation. Then they come across groups promoting ideas of elite conspiracies against ordinary people. Those ideas make sense of their feelings of humiliation and help them craft a more appealing life narrative in which their humiliation isn’t their fault. Furthermore, the groups tell these people that they’re being initiated into a secret community of knowledge and becoming part of a resistance to massive evil. It’s a ready community, sympathetic to their humiliations and supportive of their efforts to resist.

There is historical precedent for delusions becoming widespread in a stable, prosperous society. The mystery religions (e.g., Mithraism, the cults of Cybele, Dionysus, Isis, and a hundred others) that flourished throughout the later Roman Empire fulfilled a similar role in that efficient but brutal realm. These religions told people that their hemmed-in daily lives were not the ultimate reality; behind what they were experiencing in the here and now was a hidden cosmic drama in which they could participate. Christianity was seen by the Roman elite as another such cult, but it had more social cohesion and eventually took over the state.

In our time the process of finding a story to make sense of one’s experience has become dramatically easier—“frictionless” in Facebook jargon—through online connection in chat rooms and social media. Even the mentally ill can connect with each other: one outcome is that there are now large online communities of people discussing how to block the signals from the secret spy radios embedded in their walls. So it’s no surprise that many ordinary people who feel repeatedly humiliated and disenfranchised should flock to groups retelling versions of events in which they, rather than the professionals who set the bounds of their everyday lives, are at the center of the stories.

Perspective and Prospects

Let’s turn to evolutionary theory to gain some perspective. For more than a million years human survival has depended more on cooperation and learning from others than on exceptional individual abilities. Social relationships and status are so critical to human life that the brain regions important for them are larger in humans than in apes. The human medial prefrontal brain regions, which serve to manage interpersonal relations, send messages to and influence the function of most other brain regions, much more so than do the corresponding regions in monkeys or apes. So our social feelings broadly shape our thinking and even our perceptions of the world.

Yuval Noah Harari argues that, since human beings began gathering in larger settlements, shared imagination, or shared delusion, has become the most effective way to coordinate human activities on the scale needed for large cooperative enterprises. We humanists may dislike the implication that individual reason may not be enough, but even a short history of American humanism, or a survey of the many short-lived anarchist communes, shows the difficulty of coordinating activities among people who prize independent thinking. It isn’t too much of a stretch to say that we’ve been selected for capacities to share imagined worlds with others, and that shared delusions are a feature built into the human mind.

Science is still some way from a complete account of dogmatic delusion, and the discoveries sketched here don’t give us a recipe for curing the ideological sickness of America. However, they do explain why our default humanist strategy of rational argument is failing to stem the tide of delusion. We humanists often shy away from discussing issues of feeling. But understanding how social relationships, rather than reason or evidence, necessarily shape human thought processes can lead us to other strategies. When thinking about how to address the deluge of dogmatic delusion today, we need to consider the conditions under which people acquire their delusions.

Many Americans are facing bleak prospects amid stark and humiliating economic inequality. We humanists are asking many people to give up illusory stories that endow them with a special place in the world. If we want people to engage in reasonable discussion, we need to engage with them based on common values that offer a realistic prospect for a better life, including security and dignity. The Renaissance humanists offered a vision of dignity for all, not only for the nobles and professional elite. Modern humanists can and should do the same.