If you are anywhere near as strange a person as me, you spend a lot of time thinking. And not only thinking, but thinking about thinking (whether any good ideas actually come out of this process is a discussion for another time). Over the years I’ve become more and more interested in epistemology. Is there such a thing as Truth with a capital T? If so, how would we know if we found it? How can we better manage the Bayesian updating of our priors?
Personally, as far as epistemology is concerned, I come down on the side of fallibilism. Whether fallibilism is, or should be, applicable to moral questions lies beyond the scope of what I write about here. But when it comes to our beliefs about economics, geopolitics, and investing, I think fallibilism is an eminently sensible position.
Now, it’s important to distinguish between fallibilism and nihilism.
Nihilism is extreme skepticism about the existence of Truth.
Fallibilism is extreme skepticism about the provability of Truth, and about the methods we use to arrive at Truth. (See also: The Problem of Induction; The Trouble With Truth)
For a fallibilist, acquiring knowledge is a relentless, grinding process of formulating conjectures, challenging them, adjusting them, discarding them, and so on. It never ends. By definition, it can’t end. So if you’re bought in on fallibilism, you need to seek out people and ideas that challenge your priors.
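If you like seeing mechanics spelled out, here is a minimal Python sketch of the Bayesian updating mentioned above, which is one formal version of that conjecture-and-revision loop. All the likelihood numbers are hypothetical, chosen purely to show how evidence moves a prior up and down:

```python
def update(prior: float, p_evidence_if_true: float,
           p_evidence_if_false: float) -> float:
    """Bayes' rule: return P(hypothesis | evidence)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Start at a toss-up, then update on three pieces of evidence.
# The third piece cuts against the hypothesis and drags confidence back down.
confidence = 0.50
for p_if_true, p_if_false in [(0.7, 0.4), (0.6, 0.5), (0.3, 0.6)]:
    confidence = update(confidence, p_if_true, p_if_false)
    print(f"confidence is now {confidence:.2f}")
# confidence is now 0.64
# confidence is now 0.68
# confidence is now 0.51
```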
Seeking out challenges to your priors is not fun. In fact, we as humans pretty much evolved to do the opposite. For most of our history, if you were the oddball in the tribe, you risked being exiled from the group to make your way in a harsh and unforgiving world, where you would likely die miserable and alone (albeit rather quickly), without the consolation of having passed along your genetic material.
The Confidence Meter is a little mental trick I use to mitigate my evolved distaste for challenging my priors, and for being wrong. It’s something I think about when I want to judge how tightly to grip an idea (such as an investment idea). It also helps interrupt emotional thought patterns around certain ideas. For fans of Kahneman, I use it to interrupt System 1 and activate System 2.
The Confidence Meter (A Stylized Example)
At 0% confidence, I shouldn’t even be making a conjecture. I should just be gathering information and soaking it all in, without an agenda. (Not always easy.)
At a toss-up (50% confidence), I could make a conjecture and support it with evidence, but I wouldn’t put anything at risk.
At 75% confidence and greater, a willingness to bet money on the outcome implies a sound grasp of both the theory underlying my idea and the empirical evidence for it, as well as the arguments and empirical data challenging my position.
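For the programmatically inclined, the whole scale reduces to something like the sketch below. The thresholds simply mirror the stylized example above; they are illustrative guidelines, not hard rules:

```python
def confidence_meter(confidence: float) -> str:
    """Map a confidence level in [0, 1] to a stance, following the
    stylized example above. Thresholds are illustrative, not rigorous."""
    if confidence == 0.0:
        return "No conjecture yet: just gather information, without an agenda."
    if confidence < 0.75:
        return "Make the conjecture and marshal the evidence, but risk nothing."
    return ("Willing to bet money: I can state the theory, the evidence, "
            "and the best counterarguments against my position.")

print(confidence_meter(0.50))  # toss-up: argue it, don't bet it
```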
Using this framework, how many of your beliefs do you suppose merit a >=75% confidence level?
For me, it’s a very small number. To the extent I’m >=75% confident of anything I believe, it’s elementary, almost tautological stuff, like how you make money investing.
The empirical data around the impact of the minimum wage on unemployment? Meh.
The relationship between marginal tax rates and economic growth? Meh.
That doesn’t mean I don’t have beliefs about these things. I’m just leaving an allowance for additional dimensions of nuance and complexity, particularly when we’re inclined, for political reasons, to look at relationships in linear, univariate terms. The world is a more complex place than that admittedly powerful little regression model, Y = α + βX + ε, would lead us to believe.
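To make that concrete with an entirely synthetic example: generate data where Y really depends on two variables, regress it on only one, and the tidy univariate model will confidently report a wildly wrong slope. Everything below (variable names, coefficients) is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data-generating process: Y depends on x and on a
# confounder z that moves together with x. The true effect of x is 0.5.
n = 500
x = rng.normal(size=n)
z = 0.8 * x + rng.normal(scale=0.6, size=n)
y = 1.0 + 0.5 * x + 2.0 * z + rng.normal(scale=0.5, size=n)

# The naive univariate fit, Y = α + βX + ε, leaves z out...
X = np.column_stack([np.ones(n), x])
a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"univariate slope on x: {b_hat:.2f}")  # ~2.1, not 0.5

# ...while including the omitted variable recovers the true effect.
X2 = np.column_stack([np.ones(n), x, z])
_, b_x, b_z = np.linalg.lstsq(X2, y, rcond=None)[0]
print(f"slope on x with z included: {b_x:.2f}")  # ~0.5
```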
The Confidence Meter helps me keep that in perspective.