I am, as far as I can tell, a sentient human being.
I am somewhere in the middle of my total lifespan. A lifespan that will be over in the blink of an eye.
During that brief life, I have to somehow cram in learning, making friends, building community, caring for family, creating new artifacts, pursuing hobbies, passing on knowledge, seeking enlightenment, having fun, and a thousand other time-consuming occupations.
Along the way, I have to make decisions. Lots of them. Some decisions have no effect. Some affect only myself. Some affect my children and family. And some decisions, pooled together with the decisions of millions of other people, will contribute to long-term effects on society at large.
Making decisions means relying on knowledge. But how do I know that the things I “know” are true?
I do not have time to personally recreate the entirety of human experience up till this point on my own. I don’t even have time to learn more than an infinitesimal fraction of the entire corpus of things we think we know as a species.
And even if I could, how much of that knowledge could I rely on?
Consider an individual healthy human being. They are blissfully unaware of much of the data streaming past them at every moment. Even their most fundamental senses, senses that they rely on implicitly, are constantly lying to them. And as for what they do with what little information they are able to take in: their brain is a disastrous mess of biases, innumeracy, and outright defects.
Even if we assume that, as a group, we can correct for some of these systemic flaws, our entire species is still peering at the universe from one single, insignificant speck.
And then there is the small complication that apparently, when we look hard enough at the most basic building blocks of existence, the slippery little bastards change just because we are looking at them.
When I put all this together, I am left with the conclusion that there may be no such thing as objective truth, and even if there is, I will never, ever know even a fraction of it.
So how to make decisions, given this total lack of foundation?
The short answer is that I lean heavily on heuristics. And that I don’t even bother looking for objective truth. Instead, I try to rely on ideas that are viable.
What is a viable idea? It’s an idea that, when employed, bears good fruit.
The best technique that humanity has yet come up with for evolving viable ideas is the scientific method. Which means my best bet for making solid decisions is to base them, whenever possible, on the existing scientific consensus, if one exists.
Of course, science is far from perfect. It is flawed and messy and political, like all human endeavors. What sets science apart is not what it “knows”, but the process. The process of science is uniquely set up to be self-correcting, and self-improving.
This property of science is often far from obvious at any given moment. All human institutions evolve to be self-serving, and scientific research is no exception. Young scientists find it easiest and most rewarding to do research that supports the assumptions of the professors and experts who presently dominate the field. Then, after decades, ambitious newcomers decide to question the conventional wisdom. As a result science improves, but the improvement moves at a generational pace.
Sources of funding also bias what research is done, and what is ignored. Philip Morris dumped millions of dollars into trying to get science that would downplay the harm of cigarette smoking. Some of it undoubtedly worked in the short term. But the truth eventually emerged. Here again, the only salvation is in the slow self-correction of science over decades.
At any given time, for any branch of science, there is a consensus about what is known, as well as a fragile penumbra of research that disagrees or questions the consensus. A tiny fraction of scientists in that fringe may eventually turn out to have been more correct than their peers. As the slow wheel of scientific opinion turns, these few may be heralded as early visionaries. Meanwhile, the majority of other scientists on the fringe, who turn out to be mistaken, will be forgotten.
Periodically a scientist on the outskirts of the present consensus, or a fan of such a scientist, will become convinced that their minority opinion is not only correct, but vitally important. They will begin spending a great deal of energy, not on further research, but on popularizing the minority opinion.
In most cases these popularizers are not opportunists; they are true believers. They believe they have come upon “hidden” knowledge that the scientific community is either intentionally or ignorantly ignoring or suppressing. They believe that the world needs to know what they know.
People like this often have very compelling stories to tell. And it’s tempting to ascribe extra weight to what they say simply because why would someone embark on such a quixotic campaign, and risk ridicule, unless they had hit upon essential knowledge?
A quick survey of the whole field of “hidden truth” dissemination serves to dispel this notion, however. The fact is, at any given time there are people vigorously promoting every conceivable variety of hidden truth. From fad diets to alien abductions, there are earnest true believers everywhere. They aren’t all on to something. In fact, only a tiny fraction will ever see their ideas vindicated.
In order to understand the proliferation of these fringe voices, we don’t have to insist that they must either be cynical frauds, insane, or on to something important. The truth is, being in a “hidden knowledge” niche is rewarding and self-reinforcing in its own way.
Here’s a concrete example. For years I’ve followed the work of Mark Sisson. Sisson is a relatively benign, mainstream example of the hidden knowledge guru. He promotes something he calls the “primal blueprint”.
I like a lot of what he has to say, and find him both engaging and convincing. He quotes a lot of current science to promote his worldview, and he has lots of success stories to tout.
But let’s be real here. The success stories are a classic case of confirmation bias. The vast majority of people who read Mark’s books and see no results are just going to fade away and move on to other things. They are never going to take the trouble to write in and say “it didn’t work for me!”.
Meanwhile, like most diets/lifestyle plans, it does work for some people. Does it work because it’s “right”? Or because they happened to be genetically predisposed? Or simply because picking up Mark’s book was the moment they finally decided to stick to a diet and fitness plan for real? Who knows! What is certain is that these are the people who are going to write in, excitedly sharing their success. And Mark will be rewarded and confirmed, not just by their money, but by their stories.
And what of the science? Well, Mark happens to be riding a scientific wave that has been steadily building behind low-processed-carb, high-fat diets for years. He was lucky to have seized early on certain ideas that have since been borne out. I can guarantee you that there were thousands of other lifestyle gurus who seized on other ideas and weren’t so lucky. And as a result, you’re less likely to have heard of them or seen their books at Barnes & Noble.
Even for people much further out on the “fringe”, promoting hidden knowledge has its rewards. For someone with the right personality, you can build a very comfortable, meaningful life selling and promoting healing-through-magnets, or the truth about lizard people in government, all while sincerely believing in it. People are drawn to hidden knowledge. If you persist, you will find others who want to believe. They will seek you out and cluster around you. They will support you and tell you to carry on fighting the good fight.
How can I know who is right? How do I know if a fringe knowledge promoter is actually on to something real?
The short answer is: I can’t. If I want to determine whether to follow a particular diet based on an obscure study, I can’t dedicate decades of my life to going back to school, getting a PhD in nutrition, catching up on thousands of books’ worth of the state of the art, then planning, funding, and executing studies to reproduce the results in the study.
In fact, I don’t even have the time or scientific training to read all of the relevant research and evaluate its quality. The best I can do is to follow the reports from science journalists who have the appropriate skills to both a) evaluate the worth of studies and their findings; and b) explain those results in layman’s terms. I’m stuck with getting my information from two removes away. But that’s still the best chance I have of getting at viable ideas. It’s still far more likely to give me a robust foundation for making decisions than if I were to pretend that I’m actually capable of independently evaluating fringe results and determining that they are more plausible than the broad scientific consensus as reported in the mainstream science press.
It is also worth noting that even when they turn out to have been right (for certain values of “right”) people who promote fringe interpretations are rarely if ever the ones who actually bring about an improved scientific consensus. Dr. Atkins might have at least been partially onto something, but when the New York Times writes about the evolving scientific attitudes towards fat, he isn’t the scientist they quote as having paved the way. The scientists who are actually moving understanding forward are, pretty much by definition, not the ones talking about it in the popular press. After decades of successful research they may write a pop-science book to explain their findings to the average person. What they don’t do is drop out of the research establishment early and start selling self-published books and going on radio shows the moment they stumble across a surprising result. This kind of single-minded devotion to promoting a minority idea, even if it started out with a scientific finding, should not be viewed as giving weight to the idea. If anything, it should render the idea suspect.
Does this mean that I “do whatever mainstream science tells me”? Not quite. There are two major caveats, and one minor caveat, that I observe when dealing with scientific consensus.
First, science is only as good as the questions that are asked. If I have direct personal experience in an area where I’m able to identify a blind spot in the questions that scientists are asking, I may put less weight than usual on their results. For instance, there are lots of scientists asking questions like “do students retain more knowledge if they have math class before or after lunch?” There are far fewer scientists asking questions like “do children who have near-total autonomy over their learning experience go on to have more overall life satisfaction?” Here, assumptions that most people, including most scientists, take for granted about education are constraining the set of questions that are being asked.
Second, contrary to what some believe, I don’t think science has anything to tell us about what is best in life. It may be able to tell us what makes people happy, statistically speaking. It cannot, however, tell us if it is important to pursue personal happiness.
I mentioned that there is also a minor caveat, and it is this: some decisions have higher stakes than others. When choosing a personal diet or exercise program, the worst thing that is likely to happen is that I feel like crap for a while. On the other hand, the stakes are higher when voting for political representatives. And even higher, from my personal perspective, when making medical decisions for my children.
When the stakes are low, I feel free to play around with fringe ideas, even while recognizing that I have no special insight into why they should be right. When the stakes are high, it behooves me to defer to the current scientific consensus as best I understand it. That consensus may turn out to be wrong or flawed in a generation. But if it has to do with something outside my sphere of expertise, it is literally the best chance I have of making a good decision.