Scientific Literacy and Democracy: Bob Reuter on the 2025 Luxembourg Humanist Conference

Photo by Cedric Letsch on Unsplash

Bob Reuter is a cognitive psychologist and an assistant professor in educational technology at the University of Luxembourg. He serves as president of the Alliance of Humanists, Atheists & Agnostics in Luxembourg (AHA Luxembourg), promoting secularism, human rights, and scientific literacy. Reuter is known for his advocacy of open societies and is the host of the 2025 Humanists International Conference in Luxembourg City.

Reuter discusses the vital connection between scientific literacy and open societies. He explores how critical thinking, evidence-based reasoning, and democratic engagement are interdependent. Reuter highlights the ethical challenges of AI, the evolving nature of science, and the importance of shared factual foundations in liberal democracies.


Scott Douglas Jacobsen: So, the theme for this year’s conference, held in Luxembourg City, Luxembourg, is open societies and scientific literacy. We touched on the rationale behind this choice in our last interview. Why do you think scientific literacy is critical? Why do you believe it is essential to link it with dialogue about open societies?

Bob Reuter: There are, of course, the well-known justifications often advanced by policymakers for promoting scientific literacy—chiefly, economic competitiveness. The argument is that citizens need to participate productively in the workforce, secure meaningful employment, and contribute to technological and societal innovation. However, the relationship between scientific literacy—indeed, scientific culture more broadly—and democratic governance runs much deeper than economic utility alone.

It is not a coincidence that the rise of modern democratic ideals and the Enlightenment occurred during the same historical period—the 17th and 18th centuries. Both were rooted in the critical questioning of traditional authorities, including the political powers of monarchy and aristocracy, as well as the religious orthodoxy of the Church. Once individuals began to challenge dogmatic rule—both governmental and theological—they also began to reimagine their relationship to knowledge and, by extension, to power. This cultural shift laid the foundations for open societies where rational inquiry and civic participation became not only possible but also necessary.

Today, we live in a hyper-connected, globalized world—far more complex than the social arrangements of our ancestors, who primarily operated within localized or national frameworks. Our lives are now shaped by technologies that are difficult to understand without specialized knowledge. mRNA vaccines, for example, represent a fundamental advancement in biotechnology based on decades of research. Climate science relies on complex models and interdisciplinary data, making it a challenge for many laypeople to interpret.

However, to participate meaningfully in democratic discourse, citizens must engage with these topics. A functioning liberal democracy presupposes a public that can critically evaluate information. If that foundation is missing, decisions—whether in conversation or at the ballot box—can easily be based on misinformation or emotionally appealing but factually false narratives. In this sense, the demand for scientific literacy is more pressing today than it was in the past when political life and technological processes were simpler.

We now inhabit a world where political, societal, and economic outcomes are shaped by systems—such as artificial intelligence, global trade, or genetic engineering—that few people fully comprehend. Consider AI: Even when framed as a mere productivity tool, it raises profound ethical and civic questions.

Jacobsen: So how can such tools be developed ethically in open societies to enhance—not undermine—scientific literacy and public trust? Take Khan Academy as an illustrative case. Founded in 2008, it had been providing free, high-quality online education long before the rise of large language models and generative AI. Today, Khan Academy is exploring AI-based tutoring systems, but its mission remains rooted in expanding equitable access to knowledge.

Reuter: This kind of application shows that AI can serve as a tool for the public good—but we must remember that these technologies are not neutral. Their design reflects the political, ethical, and epistemological values of those who create them. They are shaped by choices—what is prioritized, who is included, how transparency is handled—and those choices impact society. So no, they are never just tools for passive consumption; they are expressions of intent and ideology and must be critically examined as such.

They are also tools for those who put them on the market. Thus, there are specific biases in the input-output data that went into these systems. That is, to some extent, inevitable—but it is essential to be aware that these biases exist.

However, if these biases become embedded in systems that some people begin to regard as oracles—or treat with the reverence once reserved for divine or authoritative figures—then we are no longer simply dealing with a tool. We are confronting something that has the potential to reshape our worldview. In that sense, it has a political impact. It influences how we perceive reality.

Moreover, of course, once you start changing how people perceive reality, you also begin to shape their judgments and actions. When people use these tools to seek “truth” or a version of it, it becomes dangerous—especially if they do not understand how the tools function or where their limitations lie.

Therefore, scientific literacy also involves understanding how different types of artificial intelligence systems function. Otherwise, people risk becoming the victims of actors whose goals are not aligned with their own. That is why we need public policies and regulations to ensure that the development and deployment of these technologies are conducted ethically.

If the companies investing in these tools are allowed to pursue their interests unchecked, we can be sure they will do so very efficiently—and these tools will primarily serve as mechanisms for profit generation. However, that does not mean they will serve the common good.

Jacobsen: A lot of scientific literacy and open society ideals seem to be, in a way, like the skim from the whole milk—they are the refined product of values in practice—maybe not an abundance of values, but at least a few core ones that lead to them.

Values like respecting another person’s rights—though, of course, you would need to define and enumerate those rights. However, at a high level, you can distill them down to codified expressions of the principle of universalism in how we treat each other—with dignity, for example. I think humans have a more sophisticated understanding of this today. However, at its core, there is what you might call an ethical switch. The cow is already there—you have to milk it.

Moreover, it is the same with scientific literacy. It depends on a value for evidence, logical consistency, and sound reasoning. All of that flows from a foundation of values. Moreover, it also includes a willingness to revise our theories.

I see, quite often in public discourse, people criticizing science whenever it tries to go beyond established theories. However, that is not a weakness; it is a strength. Unfortunately, it is rarely perceived in this way.

Reuter: Take, for instance, the COVID-19 pandemic. Initially, we knew very little about how the virus functioned, how it spread, and what the most effective public health responses would be. The early recommendations—like washing your hands—were based on the best available knowledge at the time from immunologists and pandemic researchers.

Handwashing is a good behaviour to reduce contamination. However, in the case of COVID-19, we later discovered that airborne transmission played a much larger role than initially thought. Wearing masks turned out to be far more effective than hand hygiene in reducing the transmission of the virus. We were navigating a situation of profound uncertainty.

I think many people—at least from my perspective—seem to have a kind of pseudo-religious conception of science. They assume that if scientists are speaking with authority—on TV, in classrooms, or policy discussions—then their knowledge must be absolute and unchanging. So, when scientific recommendations change, many interpret that as a failure or even a contradiction.

Jacobsen: Maybe that is also partly the fault of the scientific community, right? There has not always been enough emphasis on science as a work in progress.

Reuter: Yes, when science “changes its mind,” as people often say, it is not because scientists were clueless. It is because science is constantly evolving in light of new evidence. That is its strength, not its weakness.

Take nutrition science as another example. We get new dietary recommendations every ten years or so. The general principle—calories in versus calories out—remains the same, but the details evolve. If a substance once thought to be safe is later found to be carcinogenic, then, of course, the recommendations must change accordingly.

Moreover, this is a crucial part of scientific literacy. Because if people do not understand that scientific knowledge is provisional, they will lose trust the moment that knowledge evolves. They might think science is unreliable when, in fact, it is doing precisely what it should—adjusting as we learn more.

Jacobsen: A statement from a well-trained scientist who is actively working in their field represents the best provisional knowledge in that discipline at that moment. However, when scientific literacy declines, you start seeing confusion. People can no longer distinguish between a trained professional hypothesizing based on years of research and someone with no expertise making claims without grounding in the field at all.

Reuter: Those individuals—often lacking proper training—can still get a platform. They go on podcasts, write books, and sell ideas that resemble a kind of modern gospel, but they are not based on vetted or peer-reviewed knowledge.

Jacobsen: If you shake together just a few of those factors—scientific illiteracy, media amplification, charismatic storytelling—it is easy to see how people can get very confused. I mean, I am not a fully trained scientist myself. I know how to ask questions, synthesize information, and build narratives through interviews. However, I also have to double-check myself constantly. It is a matter of humility and ongoing self-correction.

Reuter: As a humanist community worldwide, we have a double-edged relationship with the phrase “think for yourself.” On the one hand, I do believe we need to learn how to think for ourselves—because, in the end, we are responsible for what we believe to be true, good, and beautiful. No one else can take that responsibility from us.

If I decide to defer to an authority who tells me what is true, I am still responsible for choosing to rely on that authority. So, the responsibility never leaves me.

However, thinking for yourself—that is not easy. Thinking scientifically is even more difficult. As an evolutionary psychologist, I know that our brains and minds did not evolve to be truth-generating or science-producing “machines.” They evolved to help us survive and reproduce. That was their purpose in our ancestral environments.

Now, if we couple those minds with technology and collaborative, collective intelligence—working together—then we can surpass individual cognitive limitations and construct knowledge at a much higher level. However, on the individual level, it is not easy. We all have personal, emotional, social, philosophical, and political biases.

Moreover, we have cognitive biases—well-documented for over a hundred years. Knowing about those biases does not make them disappear. It is hard work to think clearly. Thinking for yourself means thinking against your intuition. Moreover, that is not easy—and it is rarely pleasant.

However, that is what scientific thinking is. That is why it often works better as a community effort—where different people take on various roles, including playing devil’s advocate—to test ideas, challenge assumptions, and subject claims to reality checks.

Jacobsen: That is an excellent example of how scientific thinking works in practice—asking questions, getting feedback, adjusting behaviour. Just say “thank you” and move on.

Reuter: Exactly. That is science in action. I mean—you need to ask questions because knowledge does not just sit there waiting to be uncovered. It is an active process. It involves probing the world, trying things out, engaging with uncertainty, and ultimately discovering new things.

Jacobsen: Why is that process—not just the knowledge it produces, but the process itself—important for open societies?

Reuter: I think in an open society—in a liberal democracy—we value the idea that society is a collective project. It is not governed by a single person or a small group, such as a monarch or a bunch of oligarchs, deciding what is suitable for everyone. Instead, we co-decide—together—what we want to achieve and how we want to achieve it.

However, all of that needs to be grounded in some shared understanding of the world—not just the physical world but also the social and psychological world. That understanding needs to have some connection to reality.

Now, I struggle here because I am not a positivist. I do not believe we can open our eyes and see reality as it is. Theoretical assumptions continually shape our perceptions and conclusions. However, even so, in a democracy, it is meaningful to share the project of trying to base our political decisions—however much we disagree—on something like a common factual foundation.

We have different values or political goals. We might disagree about the best way to achieve even shared aims. However, there still needs to be a set of agreed-upon facts. For instance, if you let go of an object, it falls to the ground—that kind of basic consensus about reality.

In a dictatorship, by contrast, you can make people believe anything—even absurdities. Moreover, once they believe absurdities, they can be manipulated into committing atrocities.

Jacobsen: That reminds me of that famous quote from Voltaire. “If people can be made to believe absurdities, they can be made to commit atrocities.”

Reuter: That is the core of the connection between democracy and scientific literacy. If people lose the ability to tell what is true—if they lose that shared compass—then truth becomes subjective or strategic. That is what Orwell captured in 1984: doublethink, historical revisionism, blurring the lines between reality and lie.

That is what we are seeing now in what is often called the “post-factual” era—especially in international politics. We hear phrases like “you have your facts and I have mine” and the idea that we do not even need to agree on a shared factual basis. At that point, it is no longer about debating what is true. It becomes a game of power—who has the most political, economic, or military force to impose their vision of the world?

The only tool we have developed as a civilization to counter this reliably is scientific inquiry. It is our way of finding out what is true. Of asking questions. Of challenging what we think we know. Moreover, we continually refine our understanding of the world, ourselves, and society.

Jacobsen: I appreciate the depth of the conversation. I will see you in early July.

Reuter: Yes, excellent. Thank you for your time, as always.