An intriguing new study released last week in Psychological Science in the Public Interest reveals why people are more apt to believe false information being fed to them by the media and politicians.
According to the team of psychological scientists working on the study, led by Stephan Lewandowsky of the University of Western Australia, the main reason people are likely to believe false information (for example, that climate change is a hoax) is that accepting a statement as true takes less mental effort than scrutinizing and rejecting it as false. Finding the truth takes time and effort that people often aren’t willing to spend on issues that aren’t of immediate concern to them.
A few excerpts from the report:
The main reason that misinformation is sticky, according to the researchers, is that rejecting information actually requires cognitive effort. Weighing the plausibility and the source of a message is cognitively more difficult than simply accepting that the message is true – it requires additional motivational and cognitive resources. If the topic isn’t very important to you or you have other things on your mind, misinformation is more likely to take hold.
And when we do take the time to thoughtfully evaluate incoming information, there are only a few features that we are likely to pay attention to: Does the information fit with other things I believe in? Does it make a coherent story with what I already know? Does it come from a credible source? Do others believe it?
Misinformation is especially sticky when it conforms to our preexisting political, religious, or social point of view. Because of this, ideology and personal worldviews can be especially difficult obstacles to overcome.
Even worse, efforts to retract misinformation often backfire, paradoxically amplifying the effect of the erroneous belief.
In the United States, we’ve seen several major misinformation campaigns over the years, perpetrated by both the media and politicians. Some of the most prominent include attempts to convince Americans that climate change is a hoax, that Saddam Hussein was somehow involved in the attacks of 9/11, and that President Barack Obama wasn’t born in America. Refuting all of these claims from “leaders” would take time and research that individuals rarely invest. As the report explains, this is how such misinformation campaigns become successful.
And the success of misinformation is clearly reflected in public opinion polls. Acceptance of climate change has fluctuated wildly over the last few years, reaching 71% of the population in November 2008, falling to 52% in 2010, and then climbing back to 66% this year.
As for the “Saddam Hussein was involved with the 9/11 attacks” talking point, polls showed that 70% of people believed that statement in 2003, and even last year 38% of Americans still held that incorrect view.
As for President Obama’s birthplace, 30% of registered Republicans still believe that the president was not born in America, and 20% of the general population holds that incorrect belief as well.
As the new study points out, all of these wrongly held beliefs can have a clear, negative effect on societies:
The processes by which people form their opinions and beliefs are therefore of obvious public interest, particularly if major streams of beliefs persist that are in opposition to established facts. If a majority believes in something that is factually incorrect, the misinformation may form the basis for political and societal decisions that run counter to a society’s best interest; if individuals are misinformed, they may likewise make decisions for themselves and their families that are not in their best interest and can have serious consequences.
Reliance on misinformation differs from ignorance, which we define as the absence of relevant knowledge. Ignorance, too, can have obvious detrimental effects on decision making, but, perhaps surprisingly, those effects may be less severe than those arising from reliance on misinformation. Ignorance may be a lesser evil because in the self-acknowledged absence of knowledge, people often turn to simple heuristics when making decisions.
But not all misinformation is deliberate, according to the study. While certain groups actively attempt to mislead the public – for a prime example, see DeSmogBlog’s treasure trove of information on the Heartland Institute’s attempts to mislead the public on climate change – the study tells us that misinformation is sometimes spread accidentally. In an attempt to be the first to break a story, media outlets will often rush to air without all of the facts, or with “facts” that aren’t true, and the misinformation spreads like wildfire.
The study lays out the most common sources of misinformation as follows:
Rumors and fiction. Societies have struggled with the misinformation-spreading effects of rumors for centuries, if not millennia; what is perhaps less obvious is that even works of fiction can give rise to lasting misconceptions of the facts.
Governments and politicians. Governments and politicians can be powerful sources of misinformation, whether inadvertently or by design.
Vested interests. Corporate interests have a long and well-documented history of seeking to influence public debate by promulgating incorrect information. At least on some recent occasions, such systematic campaigns have also been directed against corporate interests, by nongovernmental interest groups.
The media. Though the media are by definition seeking to inform the public, it is notable that they are particularly prone to spreading misinformation for systemic reasons that are worthy of analysis and exposure. With regard to new media, the Internet has placed immense quantities of information at our fingertips, but it has also contributed to the spread of misinformation. The growing use of social networks may foster the quick and wide dissemination of misinformation. The fractionation of the information landscape by new media is an important contributor to misinformation’s particular resilience to correction.
So the big question is this: how do we combat the mental apathy that helps reinforce misinformation? The study tells us that we can do the following:
Provide people with a narrative that replaces the gap left by false information
Focus on the facts you want to highlight, rather than the myths
Make sure that the information you want people to take away is simple and brief
Consider your audience and the beliefs they are likely to hold
Strengthen your message through repetition
It may not be possible to completely undo or prevent all of the potential damage that can be caused by misleading and false information, but education appears to be the key to slowing it down. An informed public can fight false information from vested interests, and the benefits of doing so could easily have positive global ramifications.