When encountering information that conflicts with what we think we know, it can be tempting—bordering on reflexive—to pull away from that information, dismiss it, and immediately seek out alternative information (even information of dubious quality) to salve our aching brains and egos.
This is something we all deal with to some degree, though some of us are better at separating ourselves (and our egos) from our understanding of the world and the frameworks we use as shorthand for that understanding: our conception of how things work and the value-judgements arising from that conception.
I’m not one of the (I would argue) fortunate few to whom outmaneuvering cognitive dissonance comes naturally.
Because I lack that knack, I’ve found it useful to consciously catch myself when I’m automatically dismissive of an idea or perspective, and to then run that triggering notion or point-of-view through a somewhat more thorough, but still relatively quick, assessment-gauntlet.
If someone presents a claim I find faulty based on my existing heuristics, my understanding of the world, and my perception of their credibility, I’ll generally do my best to resist the temptation to brush their claim aside and instead ask myself why I’m tempted to do so.
In some cases, my initial inclination will be the right one: this person doesn’t know enough to realize they don’t know much about the topic they’re discussing, and I can safely nudge their claims into the “probably harmless, but also probably not true” box.
In other cases, my initial inclination will be somewhat right, but there will be a seed of something in what they’re saying that leads me to a novel (to me) pocket of heretofore unexplored ideas, perspectives, and data.
This is a tricky business because the person making the claim might be right for the wrong reasons, or sort of right but not entirely.
If you approach the situation purposefully and from a posture of shared-discovery (rather than presumed superiority) you can sometimes learn something new alongside that person—but this tends to depend on things beyond your control, like their state of mind and intended conversational outcome (an ideological proselytizer will be unlikely to want to learn alongside you if that means deviating from what they’re proselytizing, for instance).
In still other cases, that knee-jerk reaction will be dead wrong.
This is often the most difficult outcome to deal with, because it requires a fair bit of epistemic humility: being willing to admit you were not just wrong about this one thing, but also maybe a great number of things. And that means a possible reassessment of not just the fact or idea in question, but a whole tangle of assumptions and worldviews predicated on this thing that has now been deemed incorrect or incomplete.
I have a great deal of respect for people who can face this kind of realization relatively unfazed and just move on with life, taking this now-necessary reassessment of all the things they thought they knew in stride.
This process tends to be a little cumbersome and ponderous for me, but it becomes less taxing the more I work at it.
Also worthy of consideration is how we respond to people with whom we don’t agree even after we fully understand their point of view.
I personally tackle such discordances by telling myself it’s natural that different people will see the world—and the countless facets of reality—differently. We’re all different and can arrive at divergent conclusions even if we all have perfect information.
Sometimes I’ll even tell myself, “This is a smart person with whom I disagree, and that’s okay,” or “This person is seeing the world from a different angle than I am, and that’s okay.”
In my experience, actually vocalizing such reminders can make these types of disagreements more tolerable and ease the frustration I might otherwise feel about someone failing to see what’s so obvious and rational and correct from my perspective.
Another phrase I sometimes find myself thinking or saying is, “It’s not my responsibility or right to convert anyone.”
The logic here is that although I do think it’s generally a good thing to share information and perspectives, I don’t think it’s an ethical requirement—or even a good idea—to decide any one of us has the complete and total truth, and thus should spread and enforce said truth.
However we approach it, maintaining an intellectual stance informed by epistemic humility isn’t easy (for many of us), but it can help us reinforce our learner’s mindset while also helping us stay malleable in circumstances where we might otherwise succumb to rigid, monofocal thinking.