Research on mental immune systems goes back to the early 1960s. In a series of important but little-known experiments, psychologist William McGuire showed that exposure to a weakened form of a persuasive argument confers a kind of resistance to stronger forms of the same argument. McGuire was struck by the analogy with inoculation. (Immunologists inoculate our bodies by exposing them to weakened forms of dangerous pathogens, and our bodies respond by developing immunity to stronger versions of those pathogens.) He’d uncovered the first hard evidence of the mind’s immune system. He labeled his findings “inoculation theory.”
McGuire's theory implies that ancient rhetorical tricks can hijack this mechanism. A "straw man" argument, for instance (a weakened, distorted version of an opposing view), can inoculate a mind against the genuine argument, even though the straw man itself is invalid. Put differently, bad actors can use such inoculations to "hack" mental immune systems. And this is exactly what ideologues, aspiring demagogues, cult leaders, and conspiracy theorists do.
In the 2000s, a new generation of inoculation theorists began asking different questions: How do we inoculate minds against misinformation? Can we prevent people from becoming science deniers or conspiracy theorists? If so, how? Experimentalists like Sander van der Linden, John Cook, and Stephan Lewandowsky have made important discoveries in this area. We now know that misinformed belief tends to be resistant to change. In fact, a mind’s immune system will mobilize to protect misbelief—sometimes by “attacking” the better information that threatens to replace it.
The good news is that it's possible to inoculate minds against bad ideas. If good information gets there first, it can make a mind more resistant to bad information that arrives later. Studies show that raising awareness of the motives of those who peddle misinformation can help inoculate people against it. This process may also involve: (1) stressing that there's a scientific consensus on, say, climate change, (2) exposing flawed argumentation, or (3) helping people understand that cherry-picked information can be used to make almost anything look plausible.
A research team led by Gordon Pennycook has shown that the "metabelief" that beliefs should change in response to evidence is highly correlated with mental immune health. More precisely, Pennycook's team has shown that when people lose this metabelief, they become more susceptible to extremist ideologies, conspiracy thinking, and science denial. In his 2021 book, Mental Immunity, Andy Norman argues that this metabelief is the linchpin of the mind's immune system. Norman's "damaged fulcrum model" posits that attacks on the norms of accountable talk can profoundly compromise cognitive immune systems, leading to destructive outbreaks of unreason.