
Ostrich with its head in the sand (Photo credit: Ripleys)
The 2016 presidential election and its aftermath have once again brought science to the forefront of political debate. The most notable example is the Trump administration’s placement of former Oklahoma state senator and attorney general Scott Pruitt at the head of the Environmental Protection Agency. Pruitt—a self-described leader in challenging the EPA’s “activist” stance on not killing the planet—promptly began phasing out the use of the phrase “climate change,” and as of January 2018, the agency’s homepage is still being updated “to reflect EPA’s priorities under the leadership of President Trump and Administrator Pruitt.” Continued climate change skepticism—which, in this case, takes the form of scrubbing references to it from the government’s website—is one symptom of the much larger, ongoing problem of science denial.
Historically, religion and science have been placed in opposition to each other, with faith and facts too often considered mutually exclusive. In the last few decades, science has also become inextricably tied to American politics. The most notable anti-science presidency until Trump’s was the George W. Bush administration, which regularly placed ideology above evidence: In a July 2007 New York Times article, Bush’s former surgeon general, Richard H. Carmona, said that he was not permitted to speak about stem cells, emergency contraception, or sex education—issues that the religious right finds especially reprehensible. This attitude cemented Democrats as the party of science… well, sort of. (More on this later.)
I endorse being critical of the science we encounter. But the key difference between science criticism and science denial is that the former engages with data—even if it’s contrary to our beliefs—while the latter doesn’t. Denial says that climate change isn’t actually a problem, while criticism says that there isn’t a consensus on how much the Earth’s temperature will rise. Criticism compares and questions findings across studies; denial insists that the data doesn’t exist or is completely unreliable. And denial is especially dangerous when it perpetuates harmful and disproven beliefs.
Let’s look at an example near to my heart: the notorious, but nonexistent, link between vaccines and autism. (My brother, who has classical autism, is the reason I became a scientist.) In a now-retracted 1998 paper published in the medical journal The Lancet, former gastroenterologist Andrew Wakefield suggested that there was a connection between a common childhood vaccine (Measles-Mumps-Rubella, or MMR) and the development of autism. This single study, in concert with internet message boards and a handful of high-profile celebrity true believers, fueled an anti-vaccine movement that’s still going strong.
I’ve read through Wakefield’s original article—and it stuns me that the paper ever made it past peer review. Its experimental design is shoddy at best, and the data presented is inconclusive. Most of this data has to do with gastrointestinal issues, not vaccines; there’s one paragraph discussing a possible—but unproven—link to autism, which reads, “We did not prove an association between measles, mumps, and rubella vaccine and the syndrome described.” Wakefield, the study’s lead investigator, also failed to disclose a significant financial conflict—he was taking money from a lawyer who was suing the maker of MMR vaccines.
The study’s coauthors eventually disavowed Wakefield’s summary of the data; dozens of papers discredited the study after its results couldn’t be replicated; and Wakefield’s medical license was revoked. And yet the stigma around vaccines—all vaccines, not just MMR—has become so powerful that measles, once considered eliminated in the United States, has made a deadly comeback. (For a good summary of the Wakefield paper and subsequent fallout, I recommend this April 2018 article in Vox.)
One reason the fearmongering, ableist belief that vaccines cause autism still looms large in American culture is that social media has become, among other things, a hub of anti-science rhetoric. Most people who see science posts on social media don’t find them trustworthy, and yet only a small fraction go on to seek out more specialized information. But really, sharing science news on social media has never been about expanding human knowledge so much as it’s been about political score-settling, shaming, and being in the right. The crux of science denial is ignoring and distrusting published data in order to get leverage over the opposing side. And the more science is weaponized in this way, even casually, the more we’ll see a rise in science denial—which will, in turn, shape the kind of research that gets funded, skewing it toward whatever makes the other side look worse.
Social media is not peer-reviewed; a poll on Twitter isn’t a representative sample of the population. Anyone can write a Facebook post with a graphic they found on Google and claim it as fact, and even people who fancy themselves intellectuals and critical thinkers are not immune to this sort of propaganda, regardless of its intention. We know that viral fake posts travel faster than real news, and that’s increasingly dangerous in a time when there’s constant temptation to share articles with damning headlines to prove a point.
And though we tend to associate science denialism with angry members of the religious right demanding that public schools teach Creationism alongside evolution and genetics, some studies show that Democrats are just as likely as Republicans to misuse science, ignoring evidence that opposes their beliefs in the same way that climate-change deniers or other conservative skeptics do. Before Breitbart starts courting me as the liberal scientist who supports its spin: I’m not saying that it happens to the same degree, or about the same issues. The point is that no one thinks they’re wrong, and it’s always easier to ignore evidence that contradicts your beliefs than to engage with it.
The right hasn’t done itself any favors by painting “science” and “facts” as fripperies for hoity-toity blue-state elites, but those who consider themselves progressives need to examine the science behind their own hobbyhorses. A February 2013 article in Scientific American put it especially well: “Whereas conservatives obsess over the purity and sanctity of sex, the left’s sacred values seem fixated on the environment, leading to an almost religious fervor over the purity and sanctity of air, water and especially food.” Take the left’s fixation on GMOs: As with vaccines and autism, dozens of papers, including a comprehensive study from the National Academies of Sciences, Engineering, and Medicine, have shown genetically engineered crops to be safe to eat—yet those who have bought into the narrative are as intractable as any climate-change denier.
Neither side of the political spectrum is immune to science denialism. No matter our political background, in order to become more scientifically literate, it’s crucial to interrogate the reigning party lines, particularly when it comes to social media. You don’t need to have an advanced degree to recognize when a headline disagrees with the abstract of the paper it’s reporting on or to consider whether the article you’re about to share on Twitter is legit. Bitch has a handy five-step guide to deconstructing how popular media often warps complex studies; some academic journals even offer their own “TL;DR” breakdowns of new papers, presenting them in digestible—and apolitical—ways.
It’s become way too easy to replicate clickbait culture, and actual science will suffer for it. If we aren’t careful, we curate our spaces to reaffirm what we already believe instead of challenging it, and we end up repeating “science-based” rhetoric with the science stripped out. Find original sources, read instead of skimming, and talk to people who might disagree—not to advance your own political agenda, but to be critical together. Science literacy can’t exist in an echo chamber, and neither should we.