Global Engagement Center

GEC Counter-Disinformation Dispatches #3 March 25, 2020

The Myth that Debunking Doesn’t Work

Exposing the unsavory aspects of disinformation is the best way to discredit the brand of the disinformers, while debunking is essential to correct false claims, as explained in GEC Counter-Disinformation Dispatches #2.

This is not to say that every instance of disinformation needs to be debunked or exposed.  Doing so would be neither necessary, wise, nor practical given the tsunami of false stories in circulation.

Many instances of disinformation can be safely ignored if there are indications they will not spread widely.

The Lithuanians ignore disinformation “if it is in a small blog, no one is republishing it, there is no traction on social media, and its impact is minimal.  But if it’s in a more significant publication, or is gaining traction on social media,” they believe it needs to be addressed, as noted in GEC Counter-Disinformation Dispatches #1.

If disinformation unfortunately does spread widely, the sooner it is exposed and debunked, the better.

 

Misperceptions about Debunking

For years, those working to counter disinformation have often had to contend with the perception that debunking doesn’t work, which became widespread about 12 years ago.  At that time, the conventional wisdom held that one should not refute a lie, but only talk about what is true.

While debunking must be done carefully, as we will examine in the next issue of GEC Counter-Disinformation Dispatches, it is not accurate to say that debunking doesn’t work.  As explained in GEC Counter-Disinformation Dispatches #2, exposing Soviet disinformation worked spectacularly well in the late 1980s, causing the Soviets to abandon crude, anti-American disinformation in their media.

Recently, academic studies based on extensive testing have also shown that debunking causes people to correct their views about 50 percent of the time, on average.  Of course, certain fringe and other audiences are very unlikely to accept corrections to cherished, incorrect views.  But mainstream public opinion tends to be more willing to listen. 

How did the idea that debunking doesn’t work become so popular?  We can’t be 100% certain, but media stories and academic studies helped encourage this point of view.

In September 2007, an article in The Washington Post stated that “peer-reviewed laboratory experiments” had found that “denials and clarifications, for all their intuitive appeal, can paradoxically contribute to the resiliency of popular myths.”

The article quoted a researcher who advised, “rather than deny a false claim, it is better to make a completely new assertion that makes no reference to the original myth.”

These two propositions became the new conventional wisdom with startling speed and are still believed by many people, even though evidence clearly shows that debunking works a large part of the time.

 

The "Backfire Effect"

Cartoon of a man putting his fingers in his ears

At about the same time, an academic study that would prove enormously influential came to similar conclusions.  “When Corrections Fail: The Persistence of Political Misperceptions” found that, in certain circumstances, trying to correct mistaken beliefs made some people believe them even more strongly.  The experimenters called this the “backfire effect.”

The study was first presented at a conference in 2006 and published in 2010, in the journal Political Behavior.  It became the most widely cited article in the journal’s recent history, by far.  As of April 2019, the article had been cited 1,132 times.  By comparison, the journal’s second most-cited article in recent times was cited 235 times, as noted in a recent study.

In the “backfire” study, the social scientists ran four experiments.  In two of them, they found a backfire effect; in two, they did not – instead, debunking worked.

One topic – whether people believed in 2005 and 2006 that Iraq had large stockpiles of weapons of mass destruction just before the United States invaded in 2003 (a comprehensive U.S. government study in 2004 showed it did not) – was tested twice.  The first time, corrections to the mistaken belief produced a backfire effect among some; the second time, with simpler wording, they did not.

In all cases, the “backfire effect” only occurred among people who were strongly committed to a mistaken belief.  Others accepted the corrections more readily.

 

"Man Bites Dog"

Cartoon of a man biting a dog

Despite the ambiguity of the results, which the authors of the study carefully noted, subsequent media reporting emphasized the study’s most sensational aspects.

This was a “man-bites-dog” story, which both the media and readers love.  The “backfire effect” was the exact opposite of what people expected, making it particularly remarkable and memorable.  People readily believed it.  And, after all, the surprising finding was based on “science.”

(Image credit: Shutterstock)

 

But are the Results Reliable?

A wonderful thing about science is that experimental results are not accepted – or should not be – until they have been independently replicated by others.  Quite often, further experiments do NOT confirm the original findings, especially in psychology, as noted in a 2018 article:

Over the past few years, an international team of almost 200 psychologists has been trying to repeat a set of previously published experiments from its field, to see if it can get the same results. Despite its best efforts, the project, called Many Labs 2, has only succeeded in 14 out of 28 cases.

… it has become painfully clear that psychology is facing a “reproducibility crisis,” in which even famous, long-established phenomena—the stuff of textbooks and TED Talks—might not be real.

… Ironically enough, it seems that one of the most reliable findings in psychology is that only half of psychological studies can be successfully repeated.

Interestingly, replication studies followed a pattern.  One expert noted, “if one of the participating teams successfully replicated a study, others did, too. If a study failed to replicate, it tended to fail everywhere.”

 

Further Research Shows Debunking Works

Photo of Ethan Porter

After the “backfire” paper was published, other researchers were eager to see if its results could be replicated.

“A pair of political science graduate students at the University of Chicago, Tom Wood and Ethan Porter, found the [backfire] study dazzling,” according to a long article examining this issue.  They decided to test for the backfire effect, eventually testing 52 contentious, mistaken beliefs among 10,100 people. 

To their surprise, no matter how hard they tried, they found no evidence of the backfire effect.  They concluded:

evidence of factual backfire is far more tenuous than prior research suggests.  By and large, citizens heed factual information, even when such information challenges their ideological commitments.

(Photo of Ethan Porter; credit: The George Washington University)

 

Photo of Thomas Wood

Wood and Porter found, across all their studies, “facts almost doubled the share of accurate beliefs.” In other words, debunking worked – to varying degrees, of course, depending on the views of the audience, the issue, and other factors. 

Porter and Wood conducted their experiments from 2015 through 2019.  Their first study was published online in 2018 and they expanded upon their results in a monograph in 2019.

The authors of the “backfire effect” study, Brendan Nyhan and Jason Reifler, welcomed the new information and collaborated with Porter and Wood on a further study, “Taking Fact-Checks Literally But Not Seriously?  The Effects of Journalistic Fact-Checking on Factual Beliefs and Candidate Favorability.”

Their study concluded that “journalistic fact-checking had a pronounced effect on factual belief” – in other words, debunking works in many instances.  (Photo of Thomas Wood; credit: The Ohio State University)

In 2017, Nyhan complained that popular perceptions had not kept pace with the latest research findings, tweeting, “That [the ‘backfire effect’] finding got tons of press but not what we or others find in most studies,” adding, “why aren't encouraging results translating at macro level, where misperceptions often persist?” 

In other words, social scientists studying this issue had come to understand that the backfire effect was much less common than had originally been thought, but mainstream opinion was not yet aware of this. 

Most recently, three large survey studies found “no evidence of backlash, even under theoretically favorable conditions.”  The authors conclude, “these experiments show that when people are exposed to information, they update their views in the expected or ‘correct’ direction, on average.”  They also noted, “while a casual reading of the literature on information processing suggests that backlash is rampant, these results indicate that it is much rarer than commonly supposed.”

The bottom line is that debunking works – not perfectly or always, but in many cases.

In the next issue, we’ll examine the best ways to debunk false claims.

 

For more, see:

Next issue: “What Works in Debunking”

Past issues:

To contact us, email: GECDisinfoDispatches@state.gov