Uncertainty in Science: It’s a Feature, Not a Bug
For all that Americans don’t know about science, one thing we do know is that we’re in favor of it. At least, we think we are. Public opinion polls rank scientists behind only teachers and the military in public esteem, and the majority of Americans agree that science is a beneficent force in society and that scientists are trustworthy and non-partisan.
So if we put that much trust in science, what accounts for the traction of beliefs that scientists have conclusively refuted? The first annual Northeast Conference on Science and Skepticism (NECSS) met in New York this September to discuss, among other things, why so many Americans are convinced that a link between vaccines and autism exists, despite extensive studies showing none; and that evolution and global warming are unproven theories, despite the overwhelming evidence supporting them.
In fact, polls show that only 39 percent of Americans accept the theory of evolution; 38 percent believe no link exists between vaccines and autism; and just 57 percent say there is solid evidence for global warming.
The media drew plenty of censure at the conference for its role in fueling these opinions by giving a platform to anti-science voices and playing up conflicts over what are essentially uncontested points among scientists. But insights from speakers at NECSS and other experts on science journalism also point to a deeper root cause: a tangle of misconceptions about the nature of science itself, how it progresses, and what we mean when we say that scientists “know” something. Perhaps even more than ignorance of specific facts, these fundamental misconceptions make the public vulnerable to the arguments of global warming deniers, anti-evolutionists, and anti-vaccination groups alike.
People tend to think of scientific progress as always advancing in a straight line, with new facts being added permanently to our body of knowledge as they are discovered. “They do not understand that, instead, research is an ungainly mechanism that moves in fits and starts and that its ever-expanding path of knowledge is complicated by blind alleys and fruitless detours,” writes New York Times science reporter Cornelia Dean in her book, Am I Making Myself Clear? A Scientist’s Guide to Talking to the Public (2009). As a result, Dean says, revisions to a scientific consensus make people think that scientists don’t know what they’re talking about. NECSS panelist Dr. Massimo Pigliucci, chair of the philosophy department at City University of New York-Lehman College, has a favorite example of this mindset. In response to an editorial he penned on the science of evolution, one reader replied in a letter to the editor, “I don’t understand why people want to believe in science—science changes all the time.” Yet this, of course, is its strength; science adjusts its claims in response to new information.
The climate-change-denial, anti-vaccine, and anti-evolution cases also rely on the mistaken belief that gaps in a scientific theory represent fatal flaws. Through that lens, missing parts of the fossil record are “holes” in the theory of evolution that weaken its credibility. So is science’s uncertainty over why we seem to find sporadic bursts of rapid evolutionary change throughout history. “There are missing parts in the fossil record. It’ll never be one hundred percent clear what the intermediate steps are between certain animals,” emphasizes NPR science correspondent Joe Palca. “The idea that there’s uncertainty and gaps in the theory doesn’t mean it’s wrong—it just means the theory hasn’t been fully articulated yet.”
That misconception is reinforced by the fact that scientists spend far more time talking about the gaps than about the settled ground, creating a skewed impression of how much contention actually exists. People primed to see controversy are liable to mistake scientists’ disagreement over a theory’s finer points for disagreement over the theory itself. “Readers will read a story about hurricanes, see some scientific disagreement reflected in that story, and then they’ll mistakenly assume that the same level of uncertainty applies to other questions of climate science,” laments Curtis Brainard, editor of The Observatory, the Columbia Journalism Review’s online critique of scientific, environmental, and medical reporting. In an ideal world, Brainard jokes, science writers would preface such stories with the disclaimer, “Warning: this article reflects legitimate scientific uncertainty about hurricanes, but that does not detract from the consensus that the atmosphere is getting warmer.”
Reporters may not be able to get away with that, but are there other ways of explaining scientific theories that avoid such misunderstanding? NECSS speaker and science writer Carl Zimmer suggests a concrete analogy to relate the issue to a more familiar subject. “Sometimes creationists will make a big deal about ‘missing links,’” says Zimmer. “But think about forensic scientists investigating a crime scene. They weren’t there when the crime happened, and they don’t have every piece of evidence. But they can still do a pretty good job of piecing together from the available evidence what happened.” Of course, analogies are tricky because they can be used by both sides. “Creationists will say, if there’s a crime scene and if there’s one piece that seems not to fit in, then that’s the really hot piece of evidence,” warns NECSS panelist Dr. Richard Wiseman, professor of psychology at the University of Hertfordshire, UK.
Experts can and do disagree about the amount of evidence we need to be confident in a particular theory. But laypeople often adopt the position that, unless you can disprove a theory, you have to take it seriously. “I don’t know how many people have said, ‘You do agree there’s a possibility that the face on Mars was built by an alien civilization,’” sighs Charles Petit, head tracker for the Knight Science Journalism program at MIT. Similarly, the existence of a possibility that global warming is not caused by human industry doesn’t imply that we should assume it isn’t. In fact, the argument quickly appears absurd in other contexts. During the Cold War, Petit notes, we never thought a Soviet attack was one hundred percent certain, but we still thought that the possibility of catastrophe warranted trillions of dollars in defense spending. “We haven’t encountered anything quite like global warming before,” says Petit, “but wars, we understand.”
Nor is disproving the other side’s claim always even possible in principle. As many people have already argued, the hypothesis of intelligent design can’t be tested, so it can’t be considered a scientific theory. By contrast, the link between vaccines and autism may be a scientific claim, but it’s not one that science can ever disprove, at least not according to the standards of proof demanded by the anti-vaccine voices. After all, absence of evidence can never be evidence of absence. “They dismiss scientific studies by saying either not enough have been done, or they aren’t sensitive enough,” says Paul Offit, MD, panelist at NECSS and the author of Autism’s False Prophets (2008). It’s true that if a vaccine causes only a small number of cases of autism, then a study of hundreds of thousands of children won’t be able to distinguish that signal from the noise. Offit compares the anti-vaccination case to Bertrand Russell’s celestial teapot: what if someone claims that a teapot is orbiting the sun, too small to be detected by any of our instruments? The claim can’t be disproven—but that doesn’t entitle it to be taken seriously.
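To make the signal-and-noise point concrete, here is a minimal back-of-the-envelope power calculation in Python. The numbers are assumed and purely illustrative, not figures from any study cited here: a baseline autism rate of about 1 percent, a hypothetical excess risk of one additional case per 100,000 vaccinated children, and 200,000 children in each arm of a study.

    # Back-of-the-envelope power calculation (illustrative numbers only).
    # Question: if a vaccine raised autism risk by a tiny amount, could a very
    # large study tell that increase apart from chance variation?
    from math import sqrt, erf

    def normal_cdf(x):
        # Standard normal cumulative distribution function.
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    # Assumed (hypothetical) inputs -- not figures from the article:
    baseline_rate = 0.01      # ~1 in 100 children diagnosed with autism
    excess_risk   = 0.00001   # hypothetical 1 extra case per 100,000 vaccinated
    n_per_group   = 200_000   # children in each arm of the study

    p_unvaccinated = baseline_rate
    p_vaccinated   = baseline_rate + excess_risk

    # Standard error of the difference between the two observed rates.
    se = sqrt(p_unvaccinated * (1 - p_unvaccinated) / n_per_group +
              p_vaccinated * (1 - p_vaccinated) / n_per_group)

    # Approximate power of a one-sided test at the 5% significance level:
    # the probability the study detects the excess risk if it is real.
    z_alpha = 1.645
    power = 1.0 - normal_cdf(z_alpha - excess_risk / se)

    print(f"Standard error of rate difference: {se:.6f}")
    print(f"Hypothetical excess risk:          {excess_risk:.6f}")
    print(f"Chance the study detects it:       {power:.1%}")

Under those assumed numbers, the chance of detecting the effect comes out to roughly 5 percent, barely above the test’s own false-positive rate—which is the sense in which even a very large study cannot distinguish so small a signal from noise.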
Given the fundamental misconceptions about how science evaluates theories and develops its body of knowledge, it’s little wonder that fringe movements tend to craft their strategy around playing up the uncertainty inherent to science. In the case of the global warming deniers and the anti-evolutionists, that crafting bears a professional pedigree. An influential memo written by Republican consultant Frank Luntz in 2002 recommended, based on extensive polling, that Republicans adopt a strategy of emphasizing the “scientific uncertainty” surrounding climate change. The strategy’s roots can be traced back deeper still, to the tobacco industry’s reaction to the growing evidence that smoking damages health. A 1969 memo from tobacco company Brown & Williamson encapsulates the industry approach in the now-infamous words, “Doubt is our product since it is the best means of competing with the ‘body of fact’ that exists in the mind of the general public.”
The “play up the uncertainty” strategy of the intelligent design movement was born out of a partnership between the Seattle-based, anti-evolutionist think tank Discovery Institute and a public relations firm called Creative Response Concepts (CRC), best known for the Swift Boat Veterans for Truth campaign. “One of the two talking points that they found resonated best in their research was ‘Evolution is just a theory,’” says Dr. Dietram Scheufele, Professor of Life Sciences Communication at the University of Wisconsin. CRC’s strategy has been invaluable to the intelligent design movement, boosted by the ambiguity of the word “theory.” In the context of science, it means “well-supported explanation”—but to a layperson, it sounds like “speculation.”
Dubious scientific claims also get a boost from an attitude that scientific theories merit the same pluralistic treatment as personal beliefs. America’s respect for diverse opinions and value systems is one of our core democratic principles. But science isn’t democratic. It has right answers, and it has wrong ones. “Maybe it’s the logical extension of the American ideal of wanting to be open-minded and fair. The instinct is good, it just doesn’t work in science,” says Offit. American populism and pride in autonomy have made the CRC’s second brainchild, “Teach the Controversy,” another wildly successful sound bite for creationism. The implication is: “Let us make up our own mind, we don’t want somebody in an ivory tower telling us what to think,” says Scheufele. And just as the ambiguity of the word “theory” helps the anti-evolutionists’ case, so does the ambiguity of the word “belief.” Whether unthinkingly or in an effort to be extra-judicious, journalists have been known to refer to people “believing in” evolution (as opposed to accepting it), adding more fuel to the fallacy that science is a matter of personal opinion.
That misguided pluralism in science coverage plays right into the media’s natural love of conflict. “The problem on the global warming story is that the science just keeps confirming that we’re in a tough situation and it’s getting worse, and that news does not lend itself to the kind of reporting that the media likes to do,” says Dr. Joseph Romm, editor of the blog Climate Progress. So in the name of “balance” and an interesting story, the media turns clear-cut scientific issues into he-said, she-said stories. “Frankly, it’s intellectually lazy,” Offit opines. Just like the instinct to treat all views equally, seeking a compromise may be a fine way of accommodating different preferences in a democracy. But it’s a misplaced impulse in science, where a “compromise” between a right answer and a wrong answer still yields a wrong answer. Elizabeth Culotta, contributing news editor at Science magazine, recalls, “I was once misquoted by a local reporter on intelligent design and called him to complain, and he apologized, then said, ‘But I was looking for some sort of middle ground.’”
That doesn’t mean the media can’t shape up. Journalists have become more aware of the danger of referring to the “theory of evolution.” “We now try to avoid using ‘theory’ in that way,” says Culotta. “The two meanings of that word have caused a lot of confusion.” More generally, Scheufele sees the media becoming more attuned to the machinations behind the framing of science news. “I think journalists have begun to treat scientific issues like political issues—they’re looking out for key players, funders, and interest groups—and they’ve become much more savvy in how they cover issues as a result,” he notes. At least regarding coverage of the alleged vaccine-autism connection, Offit agrees. “I think there has been a very positive trend in the mainstream media,” he said. “I think they’ve really been convinced by the scientific evidence, and by the harm that’s been done.” Nevertheless, the dire straits of the journalism world are working against accurate science reporting. With more and more specialized science writers getting cut, their stories are taken up by freelancers or reporters pulled off another beat, who don’t have the experience to recognize the telltale talking points or already-discredited claims.
It may also simply be unrealistic to expect the media to communicate the fundamental principles of the scientific method and epistemology to a public that didn’t learn them in school. “If the question is, how do you get the mainstream scientific position across, then you have to say, ‘These are the arguments for and against,’ and hope they understand how science works… There’s no getting around that,” argues Wiseman. “The answer is improve your science education. If you want to be fair about it, that’s the only way forward.” Palca agrees: “Without the base of a better education system, there is a feeling of pushing this boulder up the hill and then pushing it back up all over again,” he sighs. “I wish I could say, well, I told people correlation doesn’t equal causation back in 1989, so I don’t have to say it again.”