How Opportunities Emerge from Being Wrong

By Dr. Thibault Guicherd

Reflections on what science teaches us: that progress depends on our willingness to be wrong

Honoré Daumier, The Two Attorneys (ca. 1855–57).

Bottom Line

  • Certainty earns trust, but real progress depends on being willing to be wrong.
  • Science advances by seeking disproof, not confirmation.
  • Ignoring failures, from survivorship bias to pseudo-cures, distorts truth.
  • Our craving for quick, permanent answers blinds us to better solutions.
  • Normalizing error builds stronger knowledge, better choices, and healthier societies.

Introduction: The Value of Being Wrong

As a scientist, I’m often asked for my opinion on various topics, especially when it comes to economic news. But the way I respond, or even the way I formulate my thoughts, can be unsettling to those unfamiliar with the scientific method: taking the time to define terms, framing the question precisely, and laying out careful reasoning. These precautions can come across as time-wasting detours, or worse, as the arrogance of someone who likes to hear themselves talk.

At times, the scientific perspective seems so far removed from everyday life that it feels out of touch, even irrelevant, to the concerns of ordinary people. And to some extent, that’s a fair criticism. Still, science has shown—particularly through its extraordinary technical achievements—that it’s a powerful tool for understanding and describing the world.

So what are we to make of this? Should we believe that only science, or perhaps philosophy, can offer meaningful insight into reality? Isn’t there some principle we can inherit from them, something that might apply directly to our daily lives?

The Discipline of Doubt

There is more than one such principle, of course. But to my mind, the most important one, essential to any serious endeavor, scientific or otherwise, is the ability to give ourselves every possible chance to be wrong.

To be clear, I don’t mean that we should act on false beliefs or try to be wrong on purpose. That would be intellectual self-sabotage. Rather, I’m talking about maintaining beliefs and taking action while imposing on ourselves a constant discipline: testing whether we might, in fact, be mistaken.

This idea can seem deeply counterintuitive. As children, we’re taught to give the correct answer when called upon in class. As adults, we’d rather not vote for someone who admits they’ve been wrong most of the time. Nor would we feel comfortable trusting a doctor who apologizes for making a mistaken diagnosis. It seems only natural to place our trust in those who are often right, as a sign of competence. And yet…

Few people know the name Abraham Wald, though many have heard the famous story about World War II bombers returning from missions over enemy territory. The military’s plan was simple: examine the surviving planes, see where they were riddled with bullet holes, and reinforce those areas to improve survival rates. But Wald pointed out a critical flaw in this reasoning. If those planes made it back with damage in certain areas, those areas couldn’t have been fatal to the aircraft. The real vulnerabilities were likely the untouched spots: when those areas were hit, the plane never made it home.
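To make the mechanism concrete, here is a minimal Python sketch of the bias. The zones, hit counts, and the choice of which zones are critical are all invented for illustration: planes take hits in random locations, any hit to a critical zone downs the plane, and so the damage tallied on survivors piles up everywhere except those zones.

```python
import random

# All zones and parameters below are invented for illustration.
ZONES = ["engine", "cockpit", "fuselage", "wings", "tail"]
CRITICAL = {"engine", "cockpit"}  # assumed: a hit here downs the plane

random.seed(42)

def fly_mission(num_hits=5):
    """One plane takes num_hits uniformly random hits; returns (survived, hits)."""
    hits = [random.choice(ZONES) for _ in range(num_hits)]
    survived = not any(zone in CRITICAL for zone in hits)
    return survived, hits

survivor_damage = {zone: 0 for zone in ZONES}
survivors = 0
for _ in range(10_000):
    survived, hits = fly_mission()
    if survived:
        survivors += 1
        for zone in hits:
            survivor_damage[zone] += 1

# Tally damage on returning planes only. Critical zones show zero hits,
# precisely because every plane hit there failed to come back.
print(f"survivors: {survivors}")
for zone, count in sorted(survivor_damage.items(), key=lambda kv: -kv[1]):
    print(f"{zone:10s} {count}")
```

An analyst who reinforces the most-damaged zones on returning aircraft would armor exactly the places that least need it; Wald’s insight was to read the zeros.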

This is a classic example of what we now call survivorship bias. It’s a mistake we also see in alternative medicine. A charlatan promotes a miraculous cure for cancer. It seems to work: there are hundreds, maybe thousands, of testimonials from people claiming they’ve been cured. No one questions the sincerity of those voices. But the logic behind the conclusion (that the treatment is effective) is flawed for at least two reasons.

First, those who recovered likely didn’t use only the miracle treatment; they may also have undergone conventional medical care or taken other steps that contributed to their recovery. Second, and more subtly: if those thousand success stories represent just 10% of all who tried the product, then it failed for the other 90%, or worse. To truly assess its value, we must also consider the voices of those for whom it didn’t work, provided they are willing or able to speak. Indeed, it is harmful to the community as a whole when the majority heed the siren call of those convinced they are right.
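The arithmetic is worth spelling out. Here is a minimal sketch in Python; the figures (1,000 testimonials out of 10,000 users) are invented for illustration, and the denominator is precisely the number that testimonials never reveal.

```python
# Invented figures for illustration only.
testimonials = 1_000   # recovered users who spoke up
total_users = 10_000   # everyone who tried the remedy (usually unknown)

success_rate = testimonials / total_users
failure_rate = 1 - success_rate

print(f"Success stories heard: {testimonials}")
print(f"Actual success rate:   {success_rate:.0%}")  # 10%
print(f"Actual failure rate:   {failure_rate:.0%}")  # 90%
```

A thousand glowing reports are compatible with almost any success rate; without the denominator, they carry very little evidential weight.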

Falsifiability and the Strength of Knowledge

This principle is well known to serious scientists. The most epistemologically rigorous among them are familiar with Karl Popper’s idea of falsifiability. For a statement to be considered scientific, it must be possible to conceive and test a situation in which it could be proven false. This concept is more nuanced than it may appear and is often misunderstood.

It’s not simply about finding a counterexample. Take the well-known case of the black swan. The claim “all swans are white” is falsifiable: finding a single black swan is enough to disprove it. After that, a more accurate claim would be “most swans are white,” and the task becomes one of measuring just how many—95%, 99%, or less.
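Once the universal claim falls, the measurement task is ordinary statistics. Here is a minimal sketch, assuming a hypothetical field count of swans (the numbers are invented), using a simple normal-approximation confidence interval for a proportion.

```python
import math

# Hypothetical field counts, invented for illustration.
white_swans = 970
total_swans = 1_000

p_hat = white_swans / total_swans                  # point estimate
se = math.sqrt(p_hat * (1 - p_hat) / total_swans)  # standard error
z = 1.96                                           # ~95% normal quantile
low, high = p_hat - z * se, p_hat + z * se

print(f"Estimated share of white swans: {p_hat:.1%}")
print(f"Approximate 95% CI: [{low:.1%}, {high:.1%}]")
```

The revised claim is now quantitative and, crucially, still falsifiable: a larger or better sample can always narrow or overturn the estimate.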

But what about statements that haven’t yet been disproven? Are they pseudoscientific? If we claim that the Earth is roughly spherical, does our inability to disprove it make the claim unscientific? Not at all. According to Popper, it’s enough that we can imagine an experiment that would refute the claim, even if it never actually happens.

This brings us to the heart of the matter: scientific knowledge is knowledge that has resisted repeated attempts at refutation. The pursuit of rigorous knowledge is therefore a form of methodical self-destruction, one that challenges the comfortable soil in which our confirmation biases grow.

Everyday Life and the Discomfort of Error

This idea, though rooted in the philosophy of science, can be extended to everyday life. In a time when personal growth and self-awareness are widely promoted, cultivating a habit of self-criticism, not to see how often we’re right but how others might reasonably show us we’re wrong, can open up opportunities for growth and, perhaps more importantly, for collective understanding.

It’s natural to seek out leaders—people who seem to have answers—to guide us through uncertainty. That’s why charismatic figures—pundits, analysts—rise to prominence. In such an anxious, fast-moving world, phrases like “I don’t know” or “I was wrong” feel intolerable. Yet during the COVID crisis, those phrases might have been far healthier than many of the overconfident proclamations we heard.

Back in the 1990s, psychologists Arie Kruglanski and Donna Webster identified and studied a fascinating phenomenon: the need for cognitive closure—the psychological desire to have a clear answer to a question. This need varies from person to person and topic to topic, depending on personal history and current concerns. But two tendencies appear consistently: urgency and permanence.

In other words, we all tend to seek three things: answers that confirm what we already believe, that arrive quickly, and that we can hold onto for good. Meanwhile, the world behaves in chaotic, unpredictable, and often counterintuitive ways. That doesn’t mean we can’t find patterns or principles that work, but all knowledge has a domain where it applies. When we step beyond it, our decisions grow shaky, even dangerous.

A Civic Duty to Embrace Error

Today, many self-help gurus market their personal success as a universal recipe. Vulnerable people—especially those facing hardship—are easily lured by these ready-made formulas, sold at a high price by pseudo-experts who promise a “life hack” for winning. In such a climate, it becomes a civic duty to promote the art of being wrong—and, more importantly, of realizing it.

We live in a world that prizes flawless winners—those who never make mistakes. This produces systems led by people who don’t understand error or humility. They double down on strategies that once worked for them, regardless of context.

To build a better society, we must normalize error, doubt, and the exploration of failure. Even science, for all its strengths, is carried out by humans prone to ego and ambition. We still reward discovery and success far more than careful correction. But behind every breakthrough stands a mountain of failed attempts, made by others whose contributions go unseen.

Giving ourselves the opportunity to be wrong is giving ourselves—and each other—the freedom to choose better. Never correcting someone is denying them the chance to improve. Admitting our mistakes is also a way of making room for others to be right. In a time when leaders silence nuance in favor of bold, immediate action, creating a culture that values the possibility of error may be the most hopeful path forward.


About the Author

Dr. Thibault Guicherd is an economist and professor. He earned his PhD in Economics from the University of Lyon, with research focused on the history of economic thought and the methodological foundations of the discipline. His teaching emphasizes critical thinking and methodological rigor as essential tools for inquiry and progress.