On Trusting the Experts
Introducing a new class of fallacies that will help you avoid being fooled by partisans, prognosticators, and propagandists.
One of the throughlines of this publication will be that the urges to control and to submit to control originate in fear. When humans cede control en masse, it opens the door to authoritarians.
There were evolutionary advantages to this programming in the distant past, and some of those advantages persist. Our forebears frequently found themselves in dangerous situations. The tribe could quickly reduce its coordination costs if everyone had an instinct to rally behind a strong leader. Because that strategy worked in those ancient contexts, the submission germline survived. We're still mostly cavepeople, after all. So we inherited those instincts.
But the world around us has changed considerably. Our evolved (instinctive) strategies are only sometimes appropriate in a modern context.
In many cases, experts are helping set policies for hundreds of millions of people. And that's where illusions of expertise are the most dangerous. Let's go through some ways that fear can give way to fallacy regarding experts.
The Fallacy of Adjacent Expertise
It's tempting to believe that people who are effective in one area will also succeed in an adjacent domain. When we are so tempted, we treat their expertise as if it were transferable across domains.
He is a good CEO, so I trust his opinion on economics.
He is a doctor, so I trust his opinion on healthcare policy.
She won the most votes, so she must be good at making important decisions.
Expertise in one field does not make someone an expert in a loosely connected one. It's easy to fall into this trap, though, especially when we are desperate to believe.
The Signal Fallacy
Sometimes the credential signal creates the illusion. If some affiliation or designation confers an air of validity, it's easier and lower cost to let that signal stand in for confirming the individual's expertise. Some examples include:
She is a professor at Harvard, so she knows better what's good for people.
They've been studying this stuff for years; who are we to question?
X percent of experts polled believe Y. Therefore you should believe Y.
Not only are human beings naturally inclined to accept such status indicators, but we are also inclined to put faith in scholars’ specialization. But a laurel is not itself evidence of efficacy, fact, or predictive capability. Nor does it mean that a scholar has skin in the game. At best, the credential signal is a kind of status peacocking.
Indeed, far from signaling a sound opinion, one's position at an Ivy League university can make one less likely to understand the surrounding society. Not only do university faculty operate with very little market feedback, but their departments are also sequestered from the wider economy upon which they are parasitic. Most universities depend on federally subsidized student loans, state budgets, grants, and endowments. Such a system leaves less room for truth tracking. Instead, academics have strong incentives to play the publication game. If anyone could figure out how many published journal articles are ever actually read, we might learn just how wasteful a game it is.
So controversy abounds.
One 2007 Indiana University study concluded that as many as half of papers are never read by anyone other than their authors, referees, and journal editors. The claim sounded plausible at the time, given that scaling laws probably operate across large bodies of papers, as with other phenomena. But the Indiana paper has since been taken down, and its sources have disappeared, grown dated, or been discredited.
Fifteen years later, no one has been able to paint a definitive picture of the state of American scholarship. If taxpayers, parents, and students are required to pay for research, how much is that research being engaged or cited? It would seem no one in the academic cartel has any real incentive to ferret out that particular truth—especially not grant recipients.
Replication is supposed to be the gold standard of research in science and social science. If one cannot reproduce at least similar results, the research is dubious. But the news isn't good for the state of research replication: metascience studies have found that only about half of psychology findings could be reproduced, with similar problems plaguing other social sciences.
The twin crises of peer review and research replication have raised serious doubts about these institutions' value. With so much rot in higher education, we have good reason to be suspicious of its experts.
The Study Fallacy
Speaking of studies, one of the stealthiest fallacies is the careless (or deliberate) use of "study" or "report" in a headline. Few people take the time to find the original study and evaluate its data or methods. Someone else has checked the work, the reader assumes, if he thinks about it at all. Most people simply take a report as valid and trust that the journalist looked into it. In some cases, journalists and researchers even coordinate in advance of a study's release, as if its conclusions were foregone.
Science writer Richard Harris explains the different forms of bad science that pass for research. Far from being a definitive gotcha, the phrases "a study finds…" and "science says" should always be greeted with suspicion.
"Each year, about a million biomedical studies are published in the scientific literature," writes Harris.
And many of them are simply wrong. Set aside the voice-of-God prose, the fancy statistics, and the peer-review process, which is supposed to weed out the weak and errant. Lots of this stuff just doesn't stand up to scrutiny.
Sometimes researchers unconsciously will the data to tell a story that isn't true. Other times, laments Harris, there's outright fraud. Either way, a large share of what gets published needs to be corrected.
Yet people eat it up.
Because they eat it up, journalists have learned they can invoke our blind faith by appealing to the enterprise of science itself. Science appears as an omniscient being. Of course, 'science' doesn't say anything. Scientists do. Each scientist is a fallible human with biases and shortcomings. Some stake their entire careers on a single line of inquiry. But as Philip W. Magness reminds us in a letter to the Wall Street Journal:
If your politics align with the press’s, it is now possible to bypass the inconveniences of peer review by taking your work directly to complicit cheerleaders in the newsroom.
Whenever writers tell us to "listen to the science" or report what "studies show," they invoke a dangerous power on behalf of an agenda. But there is no god named Science. There are only the sciences and their human practitioners, each with its unique uses, limitations, and language games. When we abstract "science" from all the vicissitudes of knowledge production, we are not truth tracking. Our judgment suffers when we turn science into an oracle whose priests can be identified by their frocks or impressive letters. As Ivan Illich suggests, we no longer respond to what seems right to our own sense of how things are down here on the ground. Instead, we parrot what "the science" says.
The NPR Fallacy
The National Public Radio (NPR) fallacy occurs when, over time, listeners or viewers switch off their critical thinking because they have come to believe a source no longer requires scrutiny. NPR is just one example; it can be exhausting to be skeptical day in and day out. During its quarterly listener fund drives, NPR says, "You want news and commentary you can trust." That is another way of saying, "Turn off your brain."
Here are a few ways NPR and similar 'trusted' news sources distort reality.
Write the story and read the news as if the listener agrees with you and accepts your assumptions.
Omit information or perspectives that would show the issue in an entirely different light.
Play it straight or semi-skeptical as the writer or host, but invite a guest expert or quote a partisan.
If ever there were thirty-seventh-level wizards of distortion like those discussed above, they would be NPR journalists. To them, fallacies are strategies. And those strategies are masterfully executed and flawlessly delivered, which makes them easy to digest on your way to work.
However, I will say that since NPR went all in on social justice, the wizards have lost their Obi-Wan-like mastery of these techniques.
The Government Fallacy
One of the oldest and most intractable fallacies is that government research, data, or expertise is unassailable. People mindlessly trust the government because, supposedly, its mission is to deliver accurate information without bias. Undue influence and corrupting incentives come from other sectors, not from dedicated public servants!
Or so the story goes.
No organization is immune to bias and corrupting incentives. None. Some might argue that, by degree, government experts are more trustworthy than corporate or university experts. And yet such notions are so much blind faith. It's comforting to think that there is an authority out there charged with looking out for our interests, but experts in bureaucracies have their agendas, too. They operate within different sets of incentives, but the incentives are there. Remember Shirky's Principle?
"Institutions will try to preserve the problem to which they are the solution."
Operating in the background, some force like this must have been responsible for the experts who testified that:
There was hard proof of WMDs in the lead-up to the Iraq War;
We should eat a diet like that pictured in the USDA Food Pyramid;
Taxpayer dollars to corporations "stimulate" the economy and ease boom-bust cycles;
Inflation will be “transitory;”
Masks don’t work well. Masks do work well. Masks don’t work well. (…)
COVID is a pandemic of the unvaccinated; get a vaccine to stop the spread.
Can we count on data from the Fed, the Bureau of Labor Statistics, or the Census Bureau? It's hard to say. The questions they ask and the data they release are only part of the story. Yet these statistics become the paint pundits use to color our reality.
The Amateur Fallacy
Any number of the above fallacies can leave one with the sense that experts enjoy a privileged status, one that supposedly justifies excluding amateurs from discourse and decision-making. But just as we should be skeptical of the credentialed class and its claims, we should not ignore the amateur's.
Economist Art DeVany was a principal founder of evolutionary fitness and the paleo diet despite holding no degrees in diet or nutrition. DeVany and other amateurs, such as Mark Sisson and Gary Taubes, helped overturn decades of nutrition orthodoxy about the proper roles of carbohydrates, fats, and sugar.
Biohackers, gene editors, and amateur CRISPR researchers work out of their garages and learn from each other online. Their work sometimes involves cracking proprietary formulae and gene therapies to make them more accessible for the least advantaged.
In 2012, a group of amateur astronomers discovered 42 planets, while in 2013, Michael Sidonio sighted a new dwarf galaxy, NGC 253-dw2. And in 2016, two amateur astronomers, John McKeon from Ireland and Gerrit Kernbauer of Mödling, Austria, filmed an asteroid impact on Jupiter.
Citizen stringers and civic journalists routinely film, blog, and report first-hand information in ways that allow stories to break. Famous examples include footage of police abuse and video shot by the infamous right-wing provocateur James O'Keefe, whose work has exposed voter fraud and toppled media executives.
Teen software developers are notorious for becoming millionaires after working on applications in their bedrooms and basements. Tech giants now routinely hire coders without advanced degrees. (Recall this from Michael Gibson, who has invested millions in entrepreneurs under 20.)
As we continue to connect with others worldwide, we have the opportunity not to see ourselves as merely passive consumers or sharers of information but also as active participants in creating a great, ever-shifting mosaic of understanding.
But if we are ever going to improve our collective intelligence and get better at tracking the truth, we have work to do.
Thanks Max,
I think one of the more powerful fallacies pushed over the last couple of years was "you are not an epidemiologist, so how can you claim to know better than them?" It was quite obvious that, beyond the lies pushed through skewed statistics, many of the studies on viruses and vaccines appeared to rely on very unscientific methods. It is the same in my old discipline, economics. You don't have to be an economist to see that the methods are unscientific. People literally tweak the analysis until they get the expected results, or to win grants, fame, or expert income.
In the Western tradition, the study of philosophy sits above the study of epidemiology, economics, and the like; it includes such things as metaphysics, logic, and epistemology. A basic understanding of logic can indeed override "expert" claims. Anyone can do that; anyone can question the epidemiologist or the economist on a scientific basis.
I also want to add this on the topic of believing:
https://substack.com/@paxians/note/c-21474165?utm_source=notes-share-action&r=1vez4c
Great collection of fallacies, but you accidentally include one in your preamble: a variation on the teleological fallacy, whereby you presume to know the circumstances responsible for the emergence of specific biological or psychological features. We don't have access to these prehistories and can only speculate on thin evidence, so it is always 'false expertise' that says "we're like this because it used to be like this." Case in point: social behaviour (including submissive behaviour) appears in mammals long before even proto-humans arrived. Invoking specific evolutionary backstories is almost always political rhetoric in disguise, so be on the watch for it!