A Citizen’s Guide to Navigating Science

…and the education needed to use it

by Robert Rue

“Bias in Science. People disproportionately search for (55), share (56), and remember (even falsely) preferred information (57). In addition, people are selectively skeptical of discordant information (58) and more negatively evaluate scientific methods when results are undesirable (59, 60). Similar patterns occur among scientists (emphasis added). For example, peer reviewers evaluate research more favorably when findings support their prior beliefs, theoretical orientations, and political views (61–63). Scientific papers describe ideological outgroup members more negatively than ingroup members (64). Scholars are likelier to reject papers ostensibly written by little-known authors than identical papers ostensibly written by prominent authors (65). In an analysis of scientific papers, 96% of statistical errors directionally supported scientists’ hypotheses, suggesting credulity among scholars toward favorable outcomes (66).”

From “Prosocial Motives Underlie Scientific Censorship by Scientists: A Perspective and Research Agenda” (3)

It’s pretty clear what scientists should do with this exposé: Re-commit to integrity, self-awareness and the scientific method.

But how should we non-scientists respond to the fact that claims purported to be science sometimes aren’t?

The answer has existential implications. We have never needed the insights of clear-eyed science more than we need them now. But when claims made in the name of science are in fact shoddy science, the public’s trust rightfully erodes, and the result can be disastrous: we non-scientists start to doubt many scientific claims and feel the need to decide with our guts what’s true and what’s not. Sometimes those decisions have global impact, because societal consensus, largely formed by non-experts, can be even more powerful than a logical proof.

I would ask if we want to live in that world, but we are already living in it.

Events like the pandemic and debates about things like biological sex or the role of genes in behavior have sent me on a non-scientist’s intellectual journey. When you add the study quoted above to the recent replication crisis in multiple scientific fields and the p-hacking scandals (the selective analysis of experimental data until it confirms a desired conclusion), it is clear that we non-scientists cannot afford a naive view of scientific conclusions.

But, again, what do we non-experts do?

I share here some ideas:

What we should NOT do:

  • Assume that science is a set of unerring—and therefore un-revisable—conclusions.
  • Assume that the scientific method itself is biased just because scientists can be.
  • Substitute our non-expert judgment about science for the judgment of experts.
  • Assume that because we’re not scientists we cannot become more informed.
  • Blindly trust all scientists.
  • Blindly trust all scientists whose claims support your political ideology.
  • Assume that the mainstream view must be correct and that expert challengers of that view must be crackpots or conspiracy theorists.

What we SHOULD do:

  • Understand that science is a logical process, not a set of timeless conclusions, and that it demands revision when evidence requires it.
  • Understand that there is a difference between science (its methods) and scientists (the flawed wielders—and sometimes manipulators—of the method).
  • Understand that honest, skillful, rational scientists will sometimes disagree.
  • Understand that sometimes the scientists who disagree with the consensus are the ones who are right and that progress depends on giving them a voice.
  • Be skeptical, especially when science is being deployed in service of ideology.
  • Acknowledge uncertainty and get comfortable thinking probabilistically.
  • When a scientific issue is controversial, and your stance on it will lead to behavior that matters, do some work. The best method I know is to explore the issue in an open-minded way. Avoid the temptation to “prove” what you already suspect is true. Use the web to get to know the scientists who disagree (a long-form podcast or video conversation is best for this), and here is my key and most challenging suggestion: trust more the scientists who demonstrate the intellectual virtues listed below.

(Much, but not all, of this articulation of intellectual virtues is borrowed from the Intellectual Virtues Academy of Long Beach.)

  • Curiosity—The desire to know more, not just what is present or obvious.
  • Intellectual Humility—Awareness that you can be, and sometimes are, wrong.
  • Intellectual Autonomy—The desire to think a subject through on your own, even when there appears to be consensus or unanimity of opinion about it.
  • Attentiveness—The practice of being fully engaged in a learning process.
  • Intellectual Carefulness—Awareness of logical fallacies and cognitive biases and a desire to mitigate them.
  • Intellectual Thoroughness—Unwillingness to settle for appearances or surface-level analyses.
  • Open-mindedness—A desire to listen to others and to actively seek views that are not your own.
  • Intellectual Courage—A willingness to stay part of the conversation even when you fear embarrassment or negative judgment from others. Also, the bravery required to change your mind.
  • Intellectual Tenacity—A willingness to embrace struggle and what is sometimes the long road toward learning.
  • Epistemic Integrity—An unwavering desire to find and articulate the truth and a rejection of the temptation to “win” at the expense of the truth. This quality often manifests itself in admitting and correcting errors and in acknowledging facts that are not easily explained by your conclusion.

But aren’t these judgments difficult to make, and won’t non-scientists make mistakes in evaluating the thinking habits of scientists? Isn’t this a terrible alternative to a science that we can always trust?

Yes.

But belief in a science that will always be accurate is belief in a fairy tale. In fact, it is a misunderstanding of what science is. Science is a process, a way of thinking—the best way of thinking about the world that humans have ever invented—but it is still not a set of immutable conclusions.

And that’s why scientists will disagree, and not always about technicalities that seem irrelevant to our lives. Sometimes they will disagree about the mRNA in a vaccine and the spike proteins it instructs our cells to produce, about aspects of climate change, or about the efficacy of particular surgeries. In those cases, we have no choice but to align our behavior with one scientific view or another. Most of us will never grasp the details of those fields, even with a good high school or liberal arts college education. But trusting scientists who manifest intellectual virtues is a far better proxy for expertise than ideologies or gut feelings.

I believe that the vast majority of us can see the value of the intellectual virtues and can become at least proficient in recognizing them. But imagine how much better off a society would be if its schools dedicated their academic programs to teaching these habits of mind. Non-experts would be smarter. Experts would be smarter. Public conversation would be more sane.

This is a place we should aspire to live in, and schools are the only societal institutions that can get us there. Can we afford to set a course for any other destination?