In an article titled “A Meta-analysis of Correction Effects in Science-Relevant Misinformation,” published in the journal Nature Human Behaviour, University of Pennsylvania social psychologists and communication researchers Man-pui Sally Chan and Dolores Albarracín explain the conditions under which corrections of science misinformation are most likely to work or fail, as well as the characteristics of the corrections most likely to succeed.
The authors conducted a meta-analysis, a quantitative synthesis of previous studies, involving 60,000 participants in 74 trials. In each experiment, either belief in science misinformation was assessed or science misinformation was presented as accurate, and then corrections of that misinformation were presented.
Although the corrections did not achieve their goals on average, they worked better when the issue in the correction was more emotionally positive than the misinformation, the correction matched the recipients’ ideology, the issue was not politically polarized, and the correction provided a lot of information about why the previous statements were false.
Can science-related misinformation be corrected on average?
The researchers found that “attempts to debunk science-related misinformation were, on average, unsuccessful,” said Chan, lead author and research associate at the University of Pennsylvania’s Annenberg School for Communication.
“That’s why most misinformation relevant to science goes uncorrected, even if a denial is made. People believe the misinformation as much after a denial as before it. This is quite remarkable, because corrections in other areas, such as reports of accidents or political events, perform reasonably well, as previous studies show. However, this does not happen in the field of science misinformation.”
The researchers conducted their study with two goals in mind. The first was to assess whether it is possible to correct misinformation at all; the second was to determine which types of corrections work better than others.
Is it easier to correct good or bad news?
To achieve these goals, the team began by determining whether positive or negative misinformation was easier to correct. Their study found that positive misinformation, which makes people “feel good about themselves, their future, or the world in general,” is more difficult to correct than negative misinformation.
“We humans like to have our rose-colored glasses on, and we’re resistant to debunking feel-good pseudoscience,” said Albarracín, the Alexandra Heyman Nash University Professor at the University of Pennsylvania and director of the Science of Science Communication division of the Annenberg Public Policy Center. “It’s much easier to correct hype about a chemical spill that didn’t happen than it is about deforestation that is happening. The reason is that it’s more pleasant to move from pessimism to optimistic news rather than the other way around.” Good news corrects negative misinformation more easily than bad news corrects positive misinformation, she said.
The researchers also asked which corrective messages are most successful. They found that when a correction offers a detailed explanation, audiences are more likely to be receptive and misinformation is more likely to be dismissed. The process by which this happens involves two stages.
First, the information and details in the correction offer the recipient a new model for understanding the event described in the misinformation. Then this new representation of what produced the event replaces the original model created by the misinformation.
Alignment of the correction with the ideology of the recipient
Chan and Albarracín also examined whether a person’s attitudes or beliefs “influence the effectiveness of correcting science-related misinformation.” They found that when the disclosure conflicts with people’s ideologies, recipients are more likely to reject the correction and strengthen their support for the misinformation.
So, for example, a person with a left-leaning ideology is more willing to accept a correction of statements that deny climate change. But when a correction clashes with the recipient’s ideology, the opposite happens: the correction is rejected and belief in the misinformation hardens.
When the topic is politically polarized, and the ways to succeed
Another important factor is the political polarization around the scientific issue being discussed. The study found that when a topic is polarized, such as the COVID-19 vaccination, correction often fails. “It is more than twice as difficult to refute polarized misinformation as it is to correct non-polarized misinformation,” Albarracín said.
However, there are ways to correct misinformation. When obstacles can be identified, they can be worked through. Chan recommended “using corrections that are thorough, increase knowledge of the topic among the audience, and make the discussion about science rather than politics, to depolarize it. But if the topic is already politically polarized, then the correction must be written in a way that aligns with the recipient’s politics.”
Chan is part of a research team led by Albarracín focused on finding ways to curb the effects of scientific misinformation. Other recently published research by her team showed that without having to confront misinformation about an issue, its effects can be circumvented or “bypassed” by strengthening attitudes that increase support for socially beneficial policies.
Man-pui Sally Chan et al., A Meta-Analysis of Correction Effects in Science-Relevant Misinformation, Nature Human Behaviour (2023). DOI: 10.1038/s41562-023-01623-8
Provided by the Annenberg Public Policy Center at the University of Pennsylvania