The last two posts might sound like I’m just stating the obvious: people fall victim to confirmation bias when choosing their information sources. But hopefully I’m making a slightly more subtle and useful point. To tackle uncertainty, people are rationally searching for the smartest voice in the room, but using a flawed process to do so. That process handicaps our ability to update beliefs away from their starting point. We’re all anchored to our default beliefs, as if by a ball and chain.
How can we do better? We can start by recognizing what type of foundation our beliefs are built on. Don’t be fooled into thinking you actually understand an issue unless you really, truly have a deep understanding of it. Just because you’ve read a hundred interviews with climate scientists/skeptics doesn’t mean you understand global warming. After hearing so much evidence, it’s easy to start thinking you do, but be careful. More likely, you have a very surface-level understanding of the evidence, combined with a strong trust in the credibility of your sources. That might be fine – but the point is that the foundation of your belief is your trust in your experts. The line between these two underpinnings for a belief is blurry, and I suspect that most of us often feel like our beliefs are reasoned out from the evidence itself. Once we get better at recognizing that distinction, we can start adjusting our confidence levels based on the systematic bias in the process.
Next, we can attempt to rewrite the mental algorithm we use when selecting our sages. When evaluating a source, the trick is to look at his process, not his beliefs. Is he conducting an investigation or a defense? Is he willing to acknowledge tough tradeoffs and inconvenient facts? Does he allow for the genuine possibility of being wrong, or merely pay lip service to the idea? Does he have an unwavering dedication to being truth-seeking and open-minded? Does he do a cursory knock-down of opposing views, or put real work into testing them? People who pass these criteria are about as common as unicorns, and finding them takes a ton of work. But if we can shift our attention towards the types of people who at least approach this ideal, we can hopefully start reducing our errors.
It would be nice if we didn’t have to rely on others at all, and we could form our own judgments on every issue, based solely on rock-solid primary research that we’ve read for ourselves. But of course, the world is too complex for that. We must rely on others. Most of our beliefs will form as an amalgamation of hundreds of opinions from those we’ve chosen to listen to. With effort, hopefully we can shift our attention to the voices that cluster a bit closer to the truth.