Put Up or Shut Up
How more skin in the game can improve our ability to track truth and set better policy.
The less skin in the game, the more one ought to keep his opinions to himself. In other words, when it comes to public truth tracking, it's put up or shut up. And if the question is of utmost importance, then there should be mechanisms that allow people to put skin in the game.
"If you give an opinion, and someone follows it," writes Nicolas Nassim Taleb in Skin in the Game, "you are morally obligated to be, yourself, exposed to its consequences."
Whether we're talking about central bankers determining the supply of money or bureaucrats deciding to launch proxy wars against other countries, the more distance there is between a person and the consequences of her decisions, the more likely we are to see poor choices and catastrophic results.
Rational irrationality describes situations in which individuals rationally choose to hold factually incorrect or seemingly irrational beliefs because the psychological benefits of those beliefs outweigh the personal costs of being wrong, especially where those costs are low. Without skin in the game, rational irrationality is far more likely to drive one's decisions. Little wonder, then, that American life is infected with it, from politics to personal opinions.
The pattern arises wherever people have belief preferences: one's preferred opinion confers some benefit, whether group affinity or the self-anointing of virtue, while the consequences of holding it are negligible. Some beliefs are simply more appealing than others, and because the marginal cost to an individual of holding an erroneous (or irrational) belief is so low, that benefit frequently leads her to keep the false belief.
Here are a few examples:
Fearful people were credulous about, and compliant with, the authorities' claims about COVID-19 mRNA vaccine efficacy in stopping virus transmission because they wanted to feel less afraid and to be seen as doing the right thing or protecting others.
Mansueto Ventures, the company that owns Fast Company magazine, routinely publishes articles whose authors malign the very idea of companies and profits. Customers pay for these articles, yet the company could not run them without a corporate structure and the ability to earn revenues above costs. Bizarrely, Mansueto Ventures makes money publishing them.
Voters advocate healthcare policy X because doing so marks them as members of a virtuous in-group, even though the policy drives up premiums, making health insurance less affordable and less accessible. (This subtype of rational irrationality is also referred to as luxury beliefs.)
Sports fans (partisans) often maintain unrealistic optimism about their team's (party's) chances of winning, even in the face of poor performance statistics. The emotional satisfaction and social connections they gain from passionate fandom outweigh the costs of being wrong about game (policy) predictions.
Of course, having rationally ignorant and rationally irrational views is legal. The problem is that we live in a system in which people are not only insulated from the direct consequences of their false beliefs but also able to pass the costs on to everyone else; those costs, though diffuse, mount over time. However, if we can devise systems that force people to confront their false beliefs and bear the cost more directly, fewer people will hold such beliefs.
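To see that calculus in miniature, here is a toy sketch in Python. Everything in it (the function, the payoff numbers) is an illustrative assumption, not a measurement: a belief survives whenever its psychic payoff exceeds the believer's expected cost of being wrong.

```python
# Toy model of rational irrationality: keep the comforting belief
# whenever its psychological payoff exceeds the expected *personal*
# cost of being wrong. All names and numbers are illustrative.

def keeps_false_belief(psychic_benefit: float,
                       p_wrong: float,
                       personal_cost_if_wrong: float) -> bool:
    """True if holding the belief is individually 'rational'."""
    expected_cost = p_wrong * personal_cost_if_wrong
    return psychic_benefit > expected_cost

# Diffuse-cost world: the believer bears almost none of the damage.
print(keeps_false_belief(10.0, p_wrong=0.9, personal_cost_if_wrong=1.0))    # True

# Skin-in-the-game world: the same error now lands on the believer.
print(keeps_false_belief(10.0, p_wrong=0.9, personal_cost_if_wrong=100.0))  # False
```

Skin-in-the-game mechanisms work on that third parameter: they raise the personal cost of error until discarding the false belief becomes the individually rational choice.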
So, how might we put skin in the proverbial game?
First, let’s take another look at the nature of science.
Conjecture and Refutation
Philosopher Karl Popper taught us that the process of science is evolutionary: our knowledge is ever-changing, imperfect, and undergoing constant revision. As he put it:
It is easy to obtain confirmations, or verifications, for nearly every theory—if we look for confirmations…. (But) a theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice. Every genuine test of a theory is an attempt to falsify it, or refute it.
Too many people fall in love with their conjectures and conveniently forget that their highest responsibility is to try to refute them. As we alluded to, though, most people are not truth-tracking automata. We seek to confirm our biases, and in aggregate there may be some benefit in this. Happily, we can tolerate some motivated reasoning in The Republic of Science if the system is set up so that more checkers can participate. Most importantly, we can improve these open networks if no one is in charge and each participant has something to lose when they're wrong.
Vote Values, Bet Beliefs
Let's imagine a future America that looks more or less like what we have now, only with one change. Economist Robin Hanson has proposed that we "vote on values, but bet on beliefs." He calls this system Futarchy: a form of governance that includes a prediction market with some legal force. Hanson does not intend Futarchy to be ideological or even idealistic. Instead, he seeks to eliminate policies rooted in rational irrationality by increasing the payoff for getting things right.
Elsewhere, we demonstrated that expertise can easily fail. Experts frequently get things wrong because they have no skin in the game. Failed policies informed by mere expertise are not just dumb in retrospect; typically, some people waved their hands and yelled "stop" because they had good reasons to doubt a policy's efficacy before it was implemented. So, Hanson argues that fewer dumb policies would be adopted if there were more reasons to anticipate perverse consequences.
Currently, we rely on policymakers' good faith and the idea that experts inform them. Officials cluster in their ideological tribes and ignore experts who aren't members, even though outsiders often have the perspicacity to call bullshit. The Great Temptation whispers, "We just need to get better experts into positions of influence." But putting experts into positions of power changes their incentives: they are rewarded not for tracking truth but for playing politics. That shift changes experts' behavior and blunts their expertise. In other words, it's unclear what reshuffling the experts to find the 'best ones' would accomplish, whatever their values or expertise.
With Futarchy, democratically elected officials would continue to let people express their values in the voting booth, as they do today. But betting markets would determine which policies get adopted to advance those values.
"That is," writes Hanson,
[E]lected representatives would formally define and manage an after-the-fact measurement of national welfare, while market speculators would say which policies they expect to raise national welfare. The basic rule of government would be: When a betting market clearly estimates that a proposed policy would increase expected national welfare, that proposal becomes law.
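Here is a minimal sketch of that basic rule, assuming (purely for illustration) two conditional prediction markets per proposal, one pricing expected national welfare if the policy passes and one if it fails, plus a hypothetical margin standing in for "clearly estimates":

```python
# Sketch of the Futarchy decision rule. The market structure and the
# clarity margin are illustrative assumptions, not Hanson's exact spec.

def adopt_policy(welfare_if_adopted: float,
                 welfare_if_rejected: float,
                 clarity_margin: float = 0.02) -> bool:
    """Adopt only when markets *clearly* estimate a welfare gain."""
    return welfare_if_adopted > welfare_if_rejected * (1.0 + clarity_margin)

# Speculators price expected national welfare under each branch.
print(adopt_policy(105.0, 100.0))  # True: a clear estimated gain
print(adopt_policy(100.5, 100.0))  # False: inside the noise margin
```

Real conditional markets are subtler, since trades in the branch that never happens must be called off, but the decision rule itself reduces to this comparison.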
I'm troubled by the feature that elected representatives would define and manage national welfare measures, especially given their incentives to lie or mislead. One independent group could be responsible for defining the measures and another for measuring the results. Still, any such centralized system will probably fail because it will be gamed.
Just look at the Bureau of Labor Statistics (BLS) data. Do you trust it?
The system would need to guard against any political legerdemain.
Hanson designed Futarchy to be "ideologically neutral." In time, the outcome could be anything from extreme socialism to extreme minarchism, depending on what voters say they want and what speculators believe a given policy would deliver.
Despite my deference to Foxes, the modest Hedgehog in me thinks that policies designed to create long-term improvements in social welfare will have to respect the Law of Flow and leave room for experimentation and self-organization. So even if all we got after the collapse was an upgrade of this type, I suspect Futarchy's discovery process would start to eliminate unworkable ideological positions over time through selection pressure.
Over what time scale?
That's hard to say, and there would still be problems. However, those problems could be addressed through careful attention to local experiments and a meta-science of tweaking these prediction markets to eliminate distortion. Even after experiments help the markets develop better methods, Futarchy will never be perfect. There's no such thing as perfect. We're always looking for something better than the next-best alternative.
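One concrete mechanism such experiments could tune is the logarithmic market scoring rule (LMSR), which Hanson himself proposed for thin prediction markets. A minimal Python sketch follows; the liquidity parameter b and the share quantities are illustrative assumptions.

```python
import math

def lmsr_cost(quantities: list[float], b: float = 100.0) -> float:
    """LMSR cost function: C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def lmsr_prices(quantities: list[float], b: float = 100.0) -> list[float]:
    """Instantaneous outcome prices; they sum to 1 and read as the
    market's current probability estimate for each outcome."""
    weights = [math.exp(q / b) for q in quantities]
    total = sum(weights)
    return [w / total for w in weights]

# Two-outcome market: "policy raises welfare" vs. "it does not".
q = [30.0, 10.0]                       # outcome shares sold so far
print(lmsr_prices(q))                  # ~[0.55, 0.45]

# Cost for a trader to buy 20 more "raises welfare" shares:
print(lmsr_cost([50.0, 10.0]) - lmsr_cost(q))  # ~11.5
```

Because prices move deterministically with outstanding shares, an LMSR market maker always quotes a price, so even lightly traded policy markets stay liquid enough to aggregate information; tuning a parameter like b is exactly the kind of tweak the meta-science above would evaluate.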
Futarchy would likely be a significant improvement over the status quo.
In this way, having some skin in the policy game would still be better than having no skin, or worse, venal skin. Legalizing and expanding prediction markets would significantly improve collective intelligence even if we stopped short of implementing Futarchy. These ideas are worth considering, not because they confer omniscience or yield perfection, but because they beat living in a hall of mirrors. When we ask the all-important question "compared to what?", the 'what' in this case is a system driving us toward collapse.
And I'm willing to bet on that.
Comments
Loved this. Thank you! And thanks for mentioning Popper!
To me, tracking truth is about holding the folks making the decisions personally accountable.
A few years ago I was contacted by the city manager of a small Western city. The city council had set up a year of one-day retreats on practical topics to improve their productivity and their positive impact on citizens. The city manager said they were interested in a program on accountability since, in their words, "We don't know what accountability is."
So, I came up with a sample agenda with topics and exercises, and a bibliography. (I am definitely old school when it comes to education. Read a book or five, dang it.) Sent it off. Heard nothing. Finally took the initiative to call. The city manager was embarrassed. Personally, she loved my proposal, but apparently I scared the elected officials, and the accountability segment of their program was cancelled.
I have noticed that many people - and not just elected and appointed government officials - find the idea of accountability, including having to justify their actions and be evaluated by their stakeholders and third-party specialists, untenable. They consider themselves privileged, having achieved an elite level where people should just trust them without any kind of oversight, and they are actually insulted if someone suggests otherwise.
By the way, if you want to see a solid model of local government oversight, Denver's elected city auditor, Tim O'Brien and his team are awesome.
https://denvergov.org/Government/Agencies-Departments-Offices/Agencies-Departments-Offices-Directory/Auditors-Office/Audit-Services
Betting preferable to guessing? Hardly! The people who run betting always win. People bet in hopes of beating the odds, which in reality means hoping that everyone else who bets loses. Only a minority can possibly win, and that minority is always the few who run the game.
That is why almost everyone in government either is a crook (most of the time) or becomes one. People who believe that democracy is best in the long run are hopelessly deluded. Honest people are always a minority, if they exist at all. The only sure check on government is regicide, IF the rebels are honest. And who will trust someone who is willing to kill? By the way, the mark of an effective general is his willingness to kill.