Trading Truth for Stigma

Research

You skim a flyer you picked up in college health services or at a doctor’s office. It lists common myths and facts about mental illness. In the myths column: “The unpredictability of mental illness makes mentally ill people dangerous.” In the facts column: “Mental illness is not a good indicator of the risk for future violence.” What sticks in your mind as you walk away?

The “myth and fact” strategy for reducing the stigma of various illnesses has been employed for decades. Keith Dobson and Savannah Rose sought to explore a possible backfire effect (do readers misremember myths as facts?) in a campaign to reduce mental illness stigma. They showed 359 undergraduate students a pamphlet with myths and facts aimed at reducing perceptions of mentally ill people as dangerous, responsible for their illness, and generally to be avoided.

Students read the pamphlet and then took a survey about its contents either immediately afterward or up to seven days later. Responses were scored by degree of endorsement of the negative “myths”; a higher score meant the reader held more stigmatizing opinions. Participants who answered the survey several days after viewing the pamphlet were more likely to perceive people suffering from mental illness as dangerous than those who responded right away, indicating a “backfire effect” that emerges after a few days.

This finding is consistent with other research on the backfire effect: the effect can appear within minutes of exposure and strengthens after about five days. The “myth and fact” strategy could inadvertently reinforce the very myths it intends to dispel.

Some researchers think the backfire effect rests on two processes. First, repetition makes a statement more believable: people are more inclined to believe something they have read or heard before. Second, memory for contextual details fades quickly, so a few days after viewing a campaign the myth may solidify in memory as a fact.

With the proliferation of social media, myths are frequently presented as facts. Misinformation abounds. Behavioral research shows that the best strategy against misinformation is often to draw a vivid, clear portrait of the truth rather than draw attention to the false information. But if a falsehood must be addressed, the material should briefly explain why it is false instead of simply labeling it as “discredited.” The accurate information should also be presented as accessibly as possible; visual and auditory cues like pictures and rhymes are helpful.

Science communicators have struggled with effective public health communication during the COVID-19 pandemic. Some have called the overabundance of information an “infodemic” within the larger pandemic. In a nationally representative study, 47% of Americans reported being exposed to news that COVID-19 was “completely made up.” People with less information or more misinformation are less likely to follow official health advice. There is evidence that positive message campaigns with print and audio-visual aids helped reduce stigma around COVID-19 precautions.

Dobson and Rose’s findings challenge the common notion that “myth and fact” campaigns are effective at battling mental illness stigma, and they may have implications for COVID-19 myths as well. The researchers suggest public health campaigns should focus on the facts and avoid any attempt at myth-busting.

