I didn't know this until today, but that essay (originally from LiveJournal) is by the Slate Star Codex guy. (Yeah, the same Slate Star Codex that I said I didn't really like a few days ago, even as I quoted an anecdote from it that I really liked.)
(*) mug tree: that thing in your kitchen that you hang mugs on and reach for every morning as you're making your morning beverage.
I don't dare try to fit epistemic learned helplessness in a toot—it's a subtle and surprising reason for you to *not* take logical arguments and hot new studies seriously.
'If Osama comes up to him with a really good argument for terrorism, he thinks "Oh, there's a good argument for terrorism. I guess I should become a terrorist," as opposed to "Arguments? You can prove anything with arguments. I'll just stay right here and not do something that will get me ostracized and probably killed."'
Some of you might know that there's a site, lesswrong.com, and an accompanying movement—is it too much to call it a "movement"?—calling for more rationality: awareness of cognitive biases, and scientific methods and randomized experiments as the only ways toward truth.
One big deal for these CfAR (Center for Applied Rationality) people is *take ideas seriously*—if you accept an argument, live it.
Scott argues in the above that this is a terrible idea for the vast majority of people.
If you're incredulous that I (or Scott Alexander) would say something's wrong with less cognitive bias, more science, more experiments, then good—read the essay, it transmits a tricky argument nicely:
The profligate explosion of conspiracy theories in the last few years, as well as their gaining legitimacy (or at least space in the public sphere) since the Trump candidacy, is a direct result of a lapse in epistemic learned helplessness.
Reread that anecdote about Osama.
We desperately need more epistemic learned helplessness. (Or take away everyone's internet.)
We need more people to ‘know that a false argument sounds just as convincing as a true argument’ (or at least very close).
To say that ‘unless I am an expert in that particular field… it's hard to take them seriously’ (I removed a typo there).
To acknowledge that it's terrifying that the ‘establishment can be very very wrong’ and to lack ‘good standards by which to decide whether to dismiss it’.
If there are any Pyrrhonian skeptics in the house, y'all know what I'm talking about. These people found so many pairs of diametrically-opposed but equally-strong arguments, for and against sundry things, that they suspended judgement on everything and found their lives got better. They don't say that the truth doesn't exist—just that they haven't found it yet, and continue seeking.
You don't have to worry that epistemic learned helplessness or Pyrrhonism is unscientific. They aren't.
Scientists (and anyone training to be a scientist): none of this is relevant to your professional work!
Even if your paper may be read by laypeople, or misconstrued into clickbait for ad revenue, you *have* to do literally everything you can to prove or disprove your findings to *yourself*, first and foremost—if there are *any* doubts you can imagine, you have to address them. Your audience is your peers—you must convince them to replicate your findings.
“Plato’s #Socrates bequeathed at least two compelling ideals to the Western philosophical tradition. On the one hand, there is the ideal of following the argument where it leads. On the other, there is the ideal of appreciating the extent of one’s own ignorance, the respects in which one’s current knowledge and understanding are subject to profound limitations. These two ideals can interact in interesting ways.”
—Thomas Kelly in https://www.princeton.edu/~tkelly/ftawil.pdf (via Pinboard’s tweet about ELH)
“an awareness of one’s current ignorance and lack of understanding should leave one open to changing one’s mind in response to novel arguments, including in relatively dramatic ways. However, with respect to the rationality of radical belief change, … Particularly when one initially has at least some reason for believing as one does, an awareness of one’s own cognitive limitations might legitimately give rise to some measure of skepticism about arguments that attempt to undermine those beliefs.”
Epistemic learned helplessness is an important tool for us because scientists often lack humility or perspective, and act more like the Atlantis pseudohistorians or ad-revenue conspiracy writers from the original essay—
You’ve published a paper. You’ve tried to address all the critiques, real and potential, that you can imagine. But hopefully through experience, or from mentors, or through study of history, you expect devastating and unexpected holes to be revealed.
That is, hopefully you’ve learned humility from having been wrong often before (finding holes in your own approach), from your seniors teaching you to cultivate humility, or from studying how holes were discovered in past works, and which blind spots past investigators failed to address (séance–alchemy Newton and epigenetic Lamarck are faves).
Humility isn’t some moral nice-to-have but a crucial component in the minimization of inevitable mistakes without letting ego interfere with science.
But then you have this: a study that showed evidence that gender dysphoria (being trans), like anorexia, seems to have social-contagion components, and the rebuttal paper by a trans scholar at the same school.
I’d have invoked epistemic learned helplessness in response to the first paper, purporting to show something surprising, especially after religious and politically conservative transphobic crowds embraced the study.
I’d invoke epistemic learned helplessness even before learning about its weak methods: soliciting surveys from parents of trans kids on trans-unsupportive websites, with a disclaimer containing pre-biasing language.
But I’m disappointed in the original study’s authors’ responses to their own peers, who are the true audience of their paper. If they’re failing to convince their peers of the merit of their findings, they should address those concerns, not just stoically “stand by” their work.
This is just one case study of how to apply epistemic learned helplessness as a layperson and how to use it as motivation for being a better scientist. (Hopefully the conspiracy theory peddlers getting ad revenue from their fake news posts don’t try to defeat epistemic learned helplessness.)
I use the tool every day. I wish I didn’t have to but there it is.
Also. Recall Scott reached for epistemic learned helplessness because:
"crackpots … all presented insurmountable evidence for their theories [but] all had MUTUALLY EXCLUSIVE ideas. After all, Noah's Flood couldn't have been a cultural memory both of the fall of Atlantis and of a change in the Earth's orbit, let alone of a lost Ice Age civilization or of megatsunamis from a meteor strike."
But today, you can be anti-vax flat-earth lizard truther birther … all at once 😫
At its core, a person's proclivity to embrace hidden knowledge stems from a desire to feel important.
We all have ways of dealing with the fact that we're not important but desperately need to feel important.
Some meditate (great).
Some code. Or edit Wikipedia. Or build guilds. (Pretty good.)
Some embrace conspiracy theories and hidden knowledge. (Bad.)
Some embrace extreme ideology. (Also bad.)
I think we should be prepared for our lives being dominated by conspiracy theories. These things don't just disappear.
About that link:
Clay Shirky's concept of *cognitive heatsinking* has been a critical part of my mental mug tree(*) since 2008. His post didn't address conspiracy theories, but it describes the ecosystem of techniques we use to expend the cognitive surplus we've been gaining since the Industrial Revolution.
Ten years ago, Clay was innocently excited about Wikipedia edit wars over Pluto and Warcraft guilds as elements of that ecosystem. Today we're more aware of its darker elements.
The social network of the future: No ads, no corporate surveillance, ethical design, and decentralization! Own your data with Mastodon!