Why do people believe things that science has proved untrue?

At least this dinosaur -- stationed in Arizona on Route 66 -- isn't fooling anybody.
Alan Copson/Photolibrary/Getty Images

Nearly half of Americans are sure that life began no more than 10,000 years ago [source: Diethelm]. That timeline would have humans and dinosaurs co-existing, carbon dating exposed as a fraud and any evidence of evolution dismissed outright.

Creationists are not alone. About one-fifth of Americans believe vaccines can cause autism, even after the discovery that the study data used to make the connection was faked [sources: Gross, CNN]. A 2010 Gallup poll found that half of the U.S. population thinks human actions have nothing to do with climate change, despite the countless studies linking it to CO2 emissions [source: Rettig].

Don't forget these, either: Smoking does not cause cancer; sex positions can help you conceive a baby of the gender of your choice; raw milk can't really do any harm.

The thinking might be rational in people who don't buy science at all -- no germs leading to illness, no evolution or genetic code, no "heat-retention" nonsense. But in those who do believe in the principles of science, in the scientific method and in most of its conclusions, how does this happen?

Psychologists call it "belief perseverance," and it's a widely studied phenomenon. All of us fall prey to it to some extent, but some people are more prone to it than others.

What exactly is at work here? To put it very simply, the human mind will go to great lengths to keep the peace.

Cognitive Dissonance

The world was going to end on Dec. 21, 1954, in a flood. But the cult members had no fear. They had faith, so they would be saved -- rescued by a spaceship and whisked away from God's wrath.

On Dec. 22, 1954, some of those cult members felt pretty foolish. But, to the shock of psychologist Leon Festinger, who had been studying the cult, others went the opposite way: They believed even more strongly than they had before the prophecy failed. In fact, to these true believers, the prophecy had not failed at all. They, the cult members, had managed to stop the flood with the power of their faith [source: Mooney]. That there was no flood was proof that they were right to believe.

In 1957, Festinger coined the term cognitive dissonance to describe what he had seen. It refers to the mental discomfort of facing inconsistency in one's thoughts, beliefs, perceptions and/or behaviors [source: McLeod]. He theorized that in this state of stress, the mind will tend to seek a way to remove the conflict and restore cognitive harmony.

Most of us indulge this tendency to some degree. We all want to feel comfortable with our thoughts and actions, and it's a whole lot harder to change than it is to stay the same. We can see cognitive-dissonance theory at work in everyday life. A parent who believes her child to be brilliant decides the test he failed was poorly written, even though the rest of the class did fine on it. A man who catches his disheveled wife leaving a hotel with another man believes nothing happened -- they just talked.

When an otherwise rational person holds an irrational belief in the face of significant evidence against it, cognitive dissonance is usually involved. How the mind facilitates this is a study in self-preservation, and it typically involves a mental tendency known as confirmation bias.

Confirmation Bias

When there was no flood, no spaceship, no death and destruction, the cult members were faced with two possible "facts." Option No. 1: They were wrong to believe. Option No. 2: They were right to believe, because their faith stopped the flood.

In fact, for the most sincere believers, option No. 1 probably didn't enter the picture. And if it did, they probably would have immediately forgotten it.

Confirmation bias can explain everything from unbudging stereotypes to increasing political polarization. The theory goes: We are more likely to believe (or seek or remember or even notice) the "facts" that support our current viewpoints, and less likely to believe the ones that would require mental adjustment. The more deeply ingrained or self-defining or consequential the current viewpoint, the further the mind might go to ignore the new evidence that would disprove it [source: Arnold]. Attempts to debunk an irrational belief will tend instead to reinforce it, as the believer may have come to see his or her perseverance as heroic, as standing up to the "establishment" [source: Arnold].

While belief perseverance is not limited to the realm of science, when that new, threatening evidence takes the form of overwhelming scientific data, there are some approaches that work particularly well to keep the conflict at bay.

Denialism

Consider a parent who refused to vaccinate because of the supposed autism link, who spoke out publicly on the topic and even criticized his friends for following the vaccination schedule. Faced with overwhelming evidence that there is no link at all, he may decide that "evidence" is the product of a far-reaching medical, governmental and corporate conspiracy to maintain high pharmaceutical-industry profits.

Invoking a conspiracy is one of the easiest ways to reject evidence. Conspiracy theories are by nature irrefutable: It's all happening in secret. Anybody can be in on it. The data is faked. The photos are retouched. The corporate-funded media will say anything.

It's not the only way, though, to validate a threatened belief. The collection of techniques that enable what has come to be called denialism is a varied bag of tricks.

In the case of Festinger's cult, the technique was "reinterpreting the evidence." This involves analyzing any new facts in such a way as to support the original belief. Before Dec. 21, the truth of their faith would be proved by the flood; after Dec. 21, the truth of their faith was proved by the absence of the flood. Similarly, when Andrew Wakefield's study linking vaccination to autism was published in 1998, its presence in the prestigious medical journal The Lancet was proof of its legitimacy. When The Lancet retracted the study in 2010 and the British Medical Journal declared it an "elaborate fraud" in 2011, that, too, was proof of its legitimacy -- obviously, the pharmaceutical industry was frightened enough by the truth of the study to start throwing its weight around [source: CNN]. (Conspiracy can work in tandem with most other denial techniques.)

One can create standards of proof that science can't possibly meet, such as, "I'll believe that climate change is a result of human actions when I see proof that Earth has never undergone a temperature increase before."

One can seek out "experts" who support the irrational belief through pseudoscience, misinterpretations, misrepresentations and logical fallacies, as in "If smoking really caused lung cancer, everyone who smokes cigarettes would get lung cancer."

Pretty effective on their own, these (and all) belief-perseverance techniques have received a tremendous boost with the advent of the Internet. Those looking to maintain an irrational viewpoint need only perform a simple search to locate fellow believers, entire communities of them, and the "experts" who back it all up with appropriate jargon.

In the end, it's not about science at all. It's about avoiding the stress of unlearning, the possibility of regret or the shame of having been wrong. And so, in the interest of cognitive harmony, otherwise reasonable individuals believe vaccines can cause autism, human actions have nothing to do with climate change, smoking doesn't cause cancer, and the test, obviously, was wrong.

For more information on belief perseverance, denialism and other theories of modern psychology, check out the links below.

Lots More Information

Author's Note: Why do people believe things that science has proved untrue?

When I began addressing this topic, I was reminded of a conversation I had in college with a guy I would now call an extreme relativist. The discussion, which didn't produce much, did plant the seeds of what my mind would turn into this: A person can believe, absolutely, in science and still (at least tentatively) acknowledge that this belief may not be so very different from a belief in God. It's a potentially problematic position from which to write an article about the "irrational" rejection of scientific evidence, and my attempt to resolve the conflict is in the introduction:

The thinking might be rational in people who don't buy science at all -- no germs leading to illness, no evolution or genetic code, no "heat-retention" nonsense. But in those who do believe in the principles of science, in the scientific method and in most of its conclusions, how does this happen?

I hope my readers feel this did the trick and that I carried the distinction through to the end of the article -- that it is not the rejection of scientific evidence that is pathological but instead the inability to hear (let alone integrate into one's belief system) any new or conflicting evidence.

To read about a heartening example of the opposite, check out this Guardian article. That's courage (and good science).

Sources

  • Arnold, Carrie. "Diss Information: Is There a Way to Stop Popular Falsehoods from Morphing into 'Facts'?" Scientific American. (Oct. 31, 2012) http://www.scientificamerican.com/article.cfm?id=how-to-stop-misinformation-from-becoming-popular-belief
  • Castillo, Michelle. "CDC: US Whooping cough cases rising at epidemic rate." CBS News. July 19, 2012. (Nov. 2, 2012) http://www.cbsnews.com/8301-504763_162-57475858-10391704/cdc-us-whooping-cough-cases-rising-at-epidemic-rate/
  • Diethelm, Pascal and Martin McKee. "Denialism: what is it and how should scientists respond?" European Journal of Public Health. Vol. 19, Issue 1. pp. 2-4. 2009. via EuroPub. (Oct. 31, 2012) http://eurpub.oxfordjournals.org/content/19/1/2.full
  • Dugger, Celia W. "Study Cites Toll of AIDS Policy in South Africa." The New York Times. Nov. 25, 2008. (Nov. 2, 2012) http://www.nytimes.com/2008/11/26/world/africa/26aids.html?pagewanted=all
  • The Flat Earth Society. (Nov. 1, 2012) http://theflatearthsociety.org/cms/index.php
  • Gross, Liza. "Doubt and Denialism: Vaccine Myths Persist in the Face of Science." QUEST. Aug. 8, 2012. (Oct. 31, 2012) http://science.kqed.org/quest/2012/08/08/doubt-and-denialism-vaccine-myths-persist-in-the-face-of-science/
  • Lehrer, Jonah. "Why We Don't Believe in Science." The New Yorker. June 7, 2012. (Oct. 31, 2012) http://www.newyorker.com/online/blogs/frontal-cortex/2012/06/brain-experiments-why-we-dont-believe-science.html
  • McLeod, Saul. "Cognitive Dissonance." Simply Psychology. 2008. (Oct. 31, 2012) http://www.simplypsychology.org/cognitive-dissonance.html
  • Mooney, Chris. "Made-up minds." The Week. May 13, 2011. (Nov. 2, 2012) http://theweek.com/article/index/215257/made-up-minds
  • "Retracted autism study an 'elaborate fraud,' British journal finds." CNN Health. Jan. 5, 2011. (Nov. 2, 2012) http://www.cnn.com/2011/HEALTH/01/05/autism.vaccines/index.html
  • Rettig, Jessica. "Fewer Americans see climate change a threat, caused by humans." U.S. News & World Report. Aug. 26, 2011. (Nov. 2, 2012) http://www.usnews.com/opinion/blogs/on-energy/2011/08/26/fewer-americans-see-climate-change-a-threat-caused-by-humans
  • Strickland, Jonathan. "Top 10 Space Conspiracy Theories." HowStuffWorks. (Nov. 2, 2012) https://science.howstuffworks.com/space-conspiracy-theory.htm
