Two Cheers for Motivated Reasoning
How individual biases in truth-tracking can create swarms of diverse seekers that better track the truth.
Cognitive faculties are a distinguishing feature of humanity—lifting humankind out of caves and enabling language, arts, and sciences. Nevertheless, they are also rooted in and subject to influence, or bias, by emotions and instincts.
One of the most significant ways information processing and decision-making becomes warped is through motivated reasoning, when biased reasoning leads to a particular conclusion or decision, a process that often occurs outside of conscious awareness.
—Psychology Today
To my knowledge, there is no membership card for the Rationality Community. You only have to be generally committed to what I have called truth tracking. To track truth, you must endeavor to be less wrong as an error-prone being scrabbling about on the earth.
While truth tracking requires acknowledging one's limitations, it also involves a general commitment to using one's cognitive powers to the end of discovery. Sometimes, our baggage—whether in our ideological priors or inborn biases—gets in the way of understanding. We’re human. So, we should minimize cognitive biases to understand the world around us.
But our evolved dispositions can be both features and bugs.
I realize some in the Rationality Community will bristle at the following suggestion. But having argued elsewhere that the expert class is breaking down, I want to offer two cheers for motivated reasoning.
To Rationalize is Human
If you've never heard of it, motivated reasoning is, in isolation, a kind of fallacious reasoning in which people form beliefs, make arguments, or investigate claims in the hope of reaching a particular conclusion. This kind of reasoning would seem to be at odds with truth tracking. Moral psychologist Jonathan Haidt calls it post hoc rationalization.
In a New York Times review of Haidt’s The Righteous Mind, the reviewer describes the matter thus:
The problem isn’t that people don’t reason. They do reason. But their arguments aim to support their conclusions, not yours. Reason doesn’t work like a judge or teacher, impartially weighing evidence or guiding us to wisdom. It works more like a lawyer or press secretary, justifying our acts and judgments to others. Haidt shows, for example, how subjects relentlessly marshal arguments for the incest taboo, no matter how thoroughly an interrogator demolishes these arguments.
That means you start with what your intuition says is right or what you wish to be true, and then you get better at arguing for that wish.
But rationality requires us to be less biased. And in this sense, the Rationality Community isn’t wrong: We should never let wishes father lies. And we should never let our biases blind us to feedback from the world.
But if we're honest with ourselves—that is, meta-rational—we must admit that we'll never completely rid ourselves of such baggage. Our species' psychosocial furniture includes biases, ideological priors, and inborn proclivities. While Enlightenment thinkers pushed us headlong toward the elegant, value-free methodologies of pure reason, we can no longer deny the “elephant” (our emotional and intuitive centers) that our “rider” (our logical and reasoning faculties) sits on. (Haidt’s metaphor.)
Let me be clear: I am not arguing that the Rationality Community is filled with elephant deniers. I suspect that other community members are, shall we say, more neurodivergent than you or I, which means a hell of a lot smarter. Some are probably on the spectrum. But by comparison, you and I are probably just a little more in touch with our inner elephants, which confers gifts beyond those restricted to the tips of our frontal lobes.
Motivated Swarms and the Republic of Science
Another way of looking at motivated reasoning concerns the idea of a whole hive of truth trackers. Yes, we want them to be rational in their pursuits. But, frankly, what sort of hive would we want to unleash upon the most important questions? Would we want them to be utterly devoid of motivated reasoning?
I say no. We need motivation in small doses.
Michael Polanyi was a polymath philosopher of science who defected to Britain from Hungary in the middle of the twentieth century. He disliked the centralization of science under illiberal regimes. He saw scientists as seekers and considered their independent activities sacrosanct.
In following their curiosity and judgments about what problems to investigate, Polanyi thought scientists were cooperating as a greater human community. Each scientist is engaged in seeking his or her answers. However, each must adjust frequently based on the findings of the other members of that community. Coordination by mutual adjustment within the same system brings about an overall result that is neither premeditated nor planned. Scientists, driven by their thirst for understanding, participate in the enterprise.
Polanyi called this the “Republic of Science.”
Wouldn't we want each truth tracker to shed as much baggage as possible before launching their investigations? To some degree, of course. But a little motivated reasoning in the Republic of Science can be healthy within limits. When we follow our curiosity and judgments about what problems to investigate, bias is unavoidable. Each is impelled to look in different places and different ways. As long as the resulting methodological diversity has a mechanism for achieving some measure of consilience, we might well track the truth.
For example, suppose someone—call her 'the skeptic'—is suspicious of some healthcare policy's predicted outcomes. Let's also assume the policy is fundamentally at odds with her ideological priors. The trouble with motivated reasoning is that the skeptic will tend to overlook data that supports her opposition's narrative about the policy's effects, searching only for data that fits her counter-narrative. The mix of self-delusion and intellectual dishonesty in seeking only the data that gives her position more firepower is unfortunate. But she might also be hellbent on turning over rocks, looking for truths her opponents are unmotivated to find.
Precisely because she is hellbent, she might make a critical discovery about the policy. It just hasn't occurred to anyone else to investigate the way she does, or for the reasons that motivate her.
Trust “The Science”
Check out this doozy from Psychology Today’s basic article on, I shit you not, motivated reasoning:
Studies by political psychologists highlight denial of climate change or discrediting its science as important examples of motivated reasoning; people process scientific information about climate shifts to conform to pre-existing feelings and beliefs. After all, accepting that climate change is real portends unpleasant environmental consequences and would require most people to head them off by making significant changes in lifestyle.
I chuckled as I considered the *reasoning* of such a claim.
A. As climate scientist Judith Curry has shown, “accepting that climate change is real” does not automatically portend “unpleasant environmental consequences.”
B. It certainly does not follow that such consequences “would require most people to head them off by making significant lifestyle changes,” as I have argued here. It might require embracing wealth creation and technological adaptation.
Then I wondered who’s really worried about a lifestyle change:
Now, that’s motivation.
Pluralism and Adjudication
In an important sense, all investigation is both value-laden and theory-laden. Of course, we must discipline ourselves not to look away from whatever feedback the world gives us. But when a million truth trackers are loosed in the Republic of Science, we will want to tolerate some motivated reasoning.
In aggregate, diverse motivations give us a pluralism of perspectives, a diversity of places to investigate, and paths that others might think not worth taking. In accepting our humanity, we embrace our cognitive pluralism.
We want to keep our biases in check to minimize contributing fictions or falsehoods to the Republic of Science. But we don't exactly want to turn ourselves into soulless automata. Otherwise, we'll miss the curiosity, passion, and instinct that impel us to set off on a particular journey of discovery.
We want swarms of truth trackers with different motivations and an ever-improving process for adjudicating among perspectives and datasets.
What's missing, then, is not a system that mutes motivated reasoning entirely but a mechanism for adjudicating among rival claims. Without one, too many of us will remain mired in epistemic tribalism.
Motivated reasoning is only a serious problem when it belongs to people with centralized control and the coercive power to shut down disagreement. Besides, if everyone were "perfectly reasonable" (whatever that would mean), they would abandon their views at the first challenge instead of pushing to make sense of them. Many possible views would die early, even though one of them might be right.