Fake news 'vaccine' could stop spread of false information
It might be possible to prevent people from falling prey to fake news by "inoculating" them with warnings that false information is out there, new research suggests.
In an online study, scientists warned people about the type of misinformation they might encounter in a subsequent statement. This warning prevented the false information from taking hold, something that simply presenting the correct facts after a false statement did not accomplish, the researchers reported Jan. 23 in the journal Global Challenges.
False information can be difficult to dislodge for many reasons, including that people may be motivated by political factors or issues of identity to want to believe things that the evidence doesn't support. People with vested interests may also intentionally induce confusion by claiming that there is scientific doubt about a particular conclusion, as has happened with climate change, wrote Sander van der Linden, a social psychologist at the University of Cambridge, and colleagues in the new paper.
"Misinformation can be sticky, spreading and replicating like a virus," van der Linden said in a statement.
Consensus and conflict
Even when misinformation is corrected after it is presented, it can persist. A 2010 study in the journal Political Behavior found that issuing a correction after presenting false information didn't correct people's impressions of the facts. Some people even became more convinced that the original misinformation was correct after reading a correction that said it wasn't true.
But research on persuasion has also found that people do tend to believe facts more when they're told that there is a scientific consensus backing them up. In their new research, van der Linden and his colleagues presented their study participants with information — and sometimes also with misinformation — about the number of climate scientists who agree that climate change is happening. The researchers collected data on how people's opinions about climate science changed over the course of the study.
The researchers chose this subject because it has real-world implications. Studies of active climate scientists have found that between 82 percent and 97 percent of them agree that climate change is happening and is human-caused. However, there have been many attempts to undermine that consensus, including a website called "The Oregon Global Warming Petition Project," which claims to have more than 31,000 science-trained signatories who don't believe in climate change.
Misinformation inoculation
For the new study, van der Linden and his colleagues recruited 2,167 people through Amazon's Mechanical Turk, an online marketplace where people sign up to complete surveys and other tasks for pay. In some cases, the participants in van der Linden's survey were simply told that 97 percent of climate scientists agree that climate change is happening. In others, participants were told that there was no consensus among scientists, using language from The Oregon Global Warming Petition Project. Both messages shifted opinions: Before reading the 97 percent figure, 70 percent of participants in that condition thought there was a scientific consensus on climate change; afterward, 90 percent thought so. In contrast, reading the misinformation dropped the percentage of believers in the scientific consensus from 72 percent to 63 percent.
Then, things got a little more complicated. Some participants first read the statement about the 97 percent consensus, and then read a statement that there was no consensus. Presented with contradictory information, people stuck to their guns: Their belief in the climate consensus was essentially unchanged after reading the two messages. That's bad news for anyone hoping to combat falsehoods by reciting facts.
"A lot of people's attitudes toward climate change aren't very firm," van der Linden said. "They are aware there is a debate going on, but aren't necessarily sure what to believe. Conflicting messages can leave them feeling back at square one."
So the researchers tried two other approaches. Both involved inserting a warning about possible falsehoods in between the true and false statements, as a way of preempting the falsehoods before readers even saw them. In one approach, people first read the statement about 97 percent consensus and then read a general warning that "some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists." Then, the researchers showed those participants the false information about scientific disagreement.
Another group of participants read the 97 percent consensus statement, and then got a very specific warning about the information they were about to see next, explaining, for example, that the 31,000-person petition includes fraudulent signatures and that less than 1 percent of the signatories have a background in climate science. Then, that group read the false information.
This "inoculation" approach hit pay dirt: Adding a general warning between the true statement and the false information nudged people to accept the true information over the false. In that condition, 73 percent of participants started out believing in the scientific consensus on climate change, and 79 percent ended as believers.
The specific warning was even more effective. In that condition, 71 percent of participants came into the study believing in the scientific consensus. By the end of the experiment, 84 percent were believers despite having read misinformation during the study.
Preemptively "warning people about politically motivated attempts to spread misinformation helps promote and protect ('inoculate') public attitudes about the scientific consensus," the researchers concluded.
Original article on Live Science.