Why Facts Don’t Change Our Minds
Melanie Trecek-King
February 11, 2024
Melanie Trecek-King, an Associate Professor of Biology at Massasoit College, spoke about “Why Facts Don’t Change Our Minds.”
She began with a 1954 cult story, the “Seekers,” led by one Dorothy Martin, involving Jesus, aliens in flying saucers, and predictions of Earth’s imminent destruction on December 21. Believers prepared to be saved, jettisoning all metal, including tooth fillings, which would have interfered with saucer navigation.
Trecek-King focused on what people believe, and why. The meaning of “belief” is actually a tricky question; she defined beliefs simply as “things you think are true” and act upon (whether true or not). She posited two basic sources for beliefs: 1) personal experience, as with feeling better after a homeopathic treatment (i.e., no actual treatment, but still a placebo effect), and 2) received wisdom, accepting the beliefs of the people around us.
Trecek-King noted we are social primates, programmed by evolution to trust the group. We do tend to trust our tribe-mates, but they are not necessarily dependable. Placing trust wisely, she said, is incredibly important.
She cited philosopher W.V. Quine who characterized beliefs as forming a web, with “core beliefs” at the center, most closely linked to one’s personal identity, influencing the rest, and being hardest to change. We’ll hold to a core belief, even if incorrect, if it serves to make everything else make sense.
So again, beliefs are most tenacious when tied to one’s sense of identity; Trecek-King said that “social death is worse than physical death.” If that sounds like hyperbole, she cited the example of people clinging to Covid denialism even while dying of Covid themselves.
There’s also a dichotomy between falsifiable beliefs, capable of being refuted by evidence, and others that cannot be. The latter are subjective beliefs — for example, “abortion is murder” — that actually tend to be the ones held most strongly. And for those we make excuses, to fend off any seeming contrary evidence.
Confirmation bias is, more generally, the embrace of information supporting pre-existing beliefs, while dismissing discordant information. Smarter people are actually more prone to this, being better at coming up with rationales for rejecting the latter.
Trecek-King invoked psychologist Jonathan Haidt’s metaphor of the elephant and rider. The elephant is one’s unconscious, and is much bigger and more powerful, yet the rider — one’s conscious self — thinks they’re directing it. Mostly the rider is just coming up with justifications for the elephant’s choices.
But overconfidence is the real enemy. There was discussion of the Dunning-Kruger effect: people with limited competence tend to overestimate their abilities. Or: their dumbness makes them think they’re smart.
Her take-home message was: train your elephant. Be caring, skeptical, humble. She also suggested an exercise, applied to some belief you hold: How sure are you it’s true? Why? How would you feel if you were wrong? What evidence would change your mind?
But in today’s America we have a problem of people inhabiting different realities, with indeed different understandings of how reality works.
Returning to the story of the “Seekers,” when December 21 arrived, nothing happened. At first this threw them for a loop. But then Dorothy said she’d received a message: in fact, it was their steadfastness that saved the world from its otherwise certain doom. So they sallied forth with redoubled faith in their crazy belief system.