
On the late afternoon of October 6, as a Category 4 hurricane lumbered toward the southeast coast of Florida, conservative political commentator Matt Drudge sent a pair of messages to the 414,000 people who follow him on the social media app Twitter: “The deplorables are starting to wonder if govt has been lying to them about Hurricane Matthew intensity to make exaggerated point on climate” and “Hurricane Center has monopoly on data. No way of verifying claims. Nassau ground observations DID NOT match statements! 165 mph gusts? WHERE?”

Earlier that afternoon, radio host Rush Limbaugh presaged the Drudge Report founder’s comments when he announced to his listening audience, “The National Hurricane Center is part of the National Weather Service, which is part of the Commerce Department, which is part of the Obama administration, which by definition has been tainted just like the [Department of Justice] has…With hurricane tracking and hurricane forecasting, I’ve been able to spot where I think they might be playing games because it’s in the interests of the left to have destructive hurricanes because then they can blame it on climate change, which they desperately continue trying to sell.”

Limbaugh allowed that Matthew was a “serious storm,” and he was right. By the time the hurricane’s posttropical remains had been absorbed by a front that was moving across eastern Canada, it had been responsible for an estimated $5–7 billion in damage ($4–6 billion in the U.S. alone)—and 1,044 fatalities.

An estimated one thousand residents of Haiti died after the storm ravaged that island country—a day before Drudge implied that the American government was intentionally exaggerating the hurricane’s strength to score political points. And 38 Americans lost their lives as a result of the storm’s impact on coastal communities in the days that followed Matthew’s initial assault on the U.S. coastline, which began in the early morning hours of October 7.

Now, is Matt Drudge or Rush Limbaugh or anyone else who might have suggested that Hurricane Matthew’s strength was exaggerated by a government agency—for partisan political purposes—responsible for the deaths of American citizens? That’s both virtually impossible to know and just as dangerous to suggest as the initial comments themselves.

What is not in doubt, though, is that the American public and American scientists have drifted far apart in their perception of vital scientific issues, and this disconnect poses a clear and present danger to an educated and engaged citizenry. And it’s important to note that this disconnect exists across the ideological spectrum; it is neither a conservative nor liberal, a Republican nor Democratic “war on science.”

We’re all complicit. Even the scientists themselves.

***

I hope you’re still reading this story. That is, I hope you didn’t stop because you saw this piece as an attack on conservative thought and beliefs. I hope you didn’t drop the magazine in disgust, decrying yet another example of a liberal bashing Republicans. But here’s the thing. You’d have a valid point. I chose to lead this story with an anecdote that conformed to my worldview—basically that people who don’t believe in anthropogenic climate change (or worse, people who actively seek to mislead the public) are inherently dangerous to mankind. I can obviously cite scientific consensus on the issue not only to bolster my point but to justify my decision to begin the story this way. But you know what? I could have also cited scientific consensus on another issue—whether it’s safe to eat genetically modified food—and chosen an anecdote involving GMOs, but I opted not to do so. I’d like to think that’s because, as a professional writer, I understand how to write a strong opening, and that when a recent event involving mass destruction and death sits within the context of the story topic, then focusing on that anecdote is an obvious solution. But you should know something. The GMO thing? My personal beliefs are in conflict with scientific consensus. That surprised the hell out of me, and, quite frankly, prompted me to be even more curious about why we, as a populace, see things differently than scientists do. (And that curiosity is a good thing, I would learn; more on that later.)

So let’s see if you’re as surprised as I was.

Like most U.S. adults, I believe that genetically modified foods are unsafe to eat; scientists believe otherwise. In a 2015 study conducted by the Pew Research Center in collaboration with the American Association for the Advancement of Science (AAAS), just 37 percent of the general public said that it is safe to eat genetically modified foods. By contrast, 88 percent of AAAS scientists say that such foods are safe. And that 51-point gap? It’s the largest opinion difference between the public and scientists on any issue surveyed. It’s larger than the differences in opinions on whether humans have evolved over time (98–65 percent); whether childhood vaccines should be required (86–68 percent); whether climate change is mostly due to human activity (87–50 percent). (In all of these cases, scientists represent the higher numbers.)

So, you tell me: Should I have led with an anecdote about genetically modified food, since on no issue are scientists and the public further apart?

I guess that’s to be debated.

What really isn’t up for debate is the main takeaway from the Pew report, which is that “citizens and scientists often see science-related issues through different sets of eyes.”

I wanted to know why, so I turned to a psychologist, a philosopher, a political scientist, and a physicist to shed light on this issue.

***

Barbara Hofer seems to be a relatively laid-back person—until she starts talking about a topic that she cares deeply about; then, she practically radiates energy. I met the psychology professor for coffee one morning in the Davis Library’s Wilson Café, and about halfway through our conversation about the public’s understanding of science—and its global implications—she stopped mid-sentence and declared: “I care about this so passionately.”

That’s why I was there talking to her, having read a journal article that she recently cowrote, in which she and her coauthor presented research on why the public struggles to understand science, why that matters, and what can be done about it.

We had started by talking about scientific literacy, what I had—somewhat erroneously, it turned out—thought of as simply being well-read about scientific issues.

“I don’t think anyone would argue that there is a need for improved science literacy,” Hofer told me. (On this issue, a vast majority of those surveyed by Pew seemed to agree. Nearly 80 percent of the public said science has made life easier for people, yet both the public and scientists were highly critical of science education in America. Just 29 percent of adults said it was above average, a figure that drops to 16 percent for scientists.) “But we need to be very careful about how we rely on this literacy and even how we define ‘literacy.’”

Hofer brought up a view that psychologists refer to as the “knowledge deficit”; that is, the notion that if you simply acquire knowledge about an issue, you’ll understand it better. (What I understood as being well-read.) “Then why aren’t we seeing greater acceptance of evolution and climate change?” she asked. On these issues, the public remains far removed from scientific consensus. While 97 percent of scientists believe that the earth is warming (and have produced studies showing this to be the case), a quarter of the public says there is no solid evidence. On evolution, 98 percent of scientists say that humans have evolved over time, while one-third of U.S. adults say we have existed in our present form since the beginning of time.

“Literacy can’t just be content,” she said. “It’s a fallacy to believe that if we just impart more facts then we’ll have done our jobs. The definition of scientific literacy needs to be thought of as an understanding of the very nature of science itself and how it is conducted.”

She added: “So much of what we are encountering is a failure to understand the epistemology of the issue.”

(A quick note: If you’ve been out of the classroom for a while, it’s possible that the word epistemology rings a bell, but its definition escapes you. It means the study of the nature of knowledge; an epistemologist is one who studies how we know what we know. Both Hofer and her colleague in philosophy, Heidi Grasswick, whom I interviewed for this piece, speak often about epistemology, so I thought it best to offer this refresher.)

“One of the fundamental tenets of the scientific method is that knowledge is always open to revision. That’s how you produce solid science, science that is durable,” she said. Indeed, she makes this very point in her journal article, writing that “scientists work toward increasingly accurate approximations to describe phenomena in the world and revise them as new information becomes available, usually through modification.”

And people can have a problem with an absence of absolute certainty. Hofer talks about epistemic cognition, basically how people think about reason and knowledge. The absolutist stance, where one holds a dualistic view that you are either right or wrong based on knowledge that is certain, is perhaps the most problematic dimension when it comes to scientific understanding, Hofer said. (For instance, I’ve spoken to someone who told me he was withholding judgment on climate change until scientists had reached 100 percent consensus.) This might explain why, according to the Pew report, at least a third of the populace believes that scientists do not agree that the Earth is getting warmer or that humans evolved over time, despite the fact that 97 and 98 percent of scientists, respectively, do.

There is also a multiplistic stance in epistemic cognition, in which knowledge is based on interpretation and belief without clear criteria “for ascertaining the truth value of a claim.” About five years ago, Hofer conducted a study with Middlebury first-year students, gauging their attitudes toward evolution. She was stunned to learn that one-third of those students applied the colloquial definition of theory to scientific theory, stating that it meant one person’s opinion. Further, “a surprising number of students thought we should teach intelligent design right alongside evolution—even if they believed in evolution—so that people could ‘make up their own minds’ on the issue. This floored me. Science is not a belief system, it’s a method of investigation,” she said in describing an extreme instance of multiplistic cognition as applied to scientific understanding.

And then there is the evaluativist view, what Hofer described as an integration of objectivity and subjectivity, an appreciation for the relative nature of certainty, and a recognition that knowledge is contingent and contextual. 

“But even then you need to be epistemically vigilant,” she said. “Students and the public need to understand where the biases are. They need to understand how to critically evaluate claims and studies.”

They need to know whom and how to trust. And when it comes to epistemic trust, there are few, if any, people on the Middlebury campus who have thought more about this than philosophy professor Heidi Grasswick.

***

“I am an epistemologist, first and foremost,” Grasswick told me one day over lunch. “And I love thinking about not just what counts as knowledge, and what we do as individuals, as knowers, but how the circulation of knowledge is itself a philosophical issue. We’re dependent on others for knowledge, and not just experts, but us, here, talking.”

(As an example, she asked me what my birthday was. When I told her, she asked how I knew. “You don’t actually know that on your own,” she said, smiling slyly. “You’re depending on other people to tell you something as personal and individual as when you were born.”)

Grasswick said that testimony has become a more prevalent topic in epistemology during the past few decades, which drew her toward the epistemology of trust. “For us to depend on other people,” she said, “we’re going to need to have some sort of grounding in trust, and not just trust in information, but also trust in a relationship.”

Grasswick is the George Nye and Anne Walker Boardman Professor of Mental and Moral Science at Middlebury, and she says that philosophical reflections on “the repercussions of how society thinks about itself, how people think about themselves, and how any shift in knowledge might lead to a shift in practice” have always fascinated her.

Last January, she gave her inaugural lecture in that professorship, a talk titled “In Science We Trust!—Or Not? Developing a Situated Account of Responsible Trust in Scientific Experts.” (It was this talk that initially turned me on to this subject as a potential story.)

“Scientists are often surprised or dismayed when their work is met with distrust or rejection by members of the general public,” she said then. “As far as they are concerned, they are engaged in the most robust form of knowledge generation available. They are the experts on their topics, and it seems to follow that nonexperts should follow what they have to say. Furthermore, since sound policy making needs to be based on sound science, it’s deeply worrisome that trust in science is not widespread.”

“It is worrisome,” Grasswick told me, when I asked her about this statement. “But it’s not as simple as just saying, ‘Trust me.’”

To begin with, she said, there are legitimate, contemporary reasons why people may distrust scientists. Scientists have been wrong, she said, citing the devastating effects of thalidomide use among pregnant women in the 1950s; and they have behaved unethically, even criminally, as with the 40-year Tuskegee study in which the U.S. government tracked the progression of untreated syphilis in African American men in the rural South—withholding a known cure for thirty years after the efficacy of penicillin was proven, all under the guise of providing free health care.

“Entire communities, understandably, lose trust in the institution,” she said. “And there are two levels at which this impacts knowledge. The most obvious is that if I don’t have a reason to trust, then I’m going to miss out on that knowledge. And then there is the impact on knowledge generation, itself. If you have a group of scientists who have no input from those who are socially situated differently, you run a far greater risk of being influenced by biases.

“It’s the idea that you need to diversify your scientific community in order to be able to see some of the holes or the blind spots in your thinking,” she said. “No matter how good a scientist you are, you must start with an assumption; that’s part of the scientific method. But you also need people who see things differently. And then the scientists can work it out, and maybe some of the theories live and some die.”

I asked her about scientific literacy, and Grasswick echoed Hofer nearly word for word. “Knowing some basic facts that are understood as scientific facts is not going to help you all that much. If you are going to be literate in science, you need to have an essential understanding of how science works. And then you can discern what makes for a robust application of science versus a less robust application, and this builds trust.”

With this in mind, I asked Grasswick about the increased privatization and corporatization of scientific research and how one could be epistemically vigilant, as Hofer prescribes, in order to build trust in these institutions and, therefore, their results.

“I think it comes down to what we want to demand of these institutions, these companies, in order for us to say, ‘OK, we’ll give you our trust.’ I touched on this in my talk when I said that trust can come from a history of that party willingly circulating knowledge rather than hiding it from you,” she said. “And as soon as we find out that there has been information hidden or manipulated, then that itself takes away from our trust, as it should as reasonable beings.”

But what if we can’t be reasonable?

***

I sat down with political science professor Matt Dickinson the day after presidential candidates Hillary Clinton and Donald Trump met in their second debate, a clash the New York Times described as “unremittingly hostile,” and one that seemed to end with the populace agreeing on only one thing—at least democracy itself did not go up in flames on that autumn evening.

I hauled out my now dog-eared copy of the Pew report and asked him about the findings showing that Democrats are more likely than either Independents or Republicans to say there is solid evidence of global warming, and that more Democrats than Republicans disagree with scientific findings on the safety of genetically modified foods. He offered a wan smile.

“The party sorting that has increasingly matched party labels with ideology has not helped the discourse,” he said. “It’s made it too easy for people to think that the opposing party is increasingly out of step with what one believes is right. And I think part of what’s happening is when a scientific consensus dovetails with a policy objective that resonates with one party more than the other, then that doesn’t help people appreciate the science.”

I told him that I know every generation likes to think of itself in extremes—things are either better or worse than they’ve ever been—but that it sure seems like we’re seeing extreme views right now.

“The liberal Republican and conservative Democrat have become extinct,” he confirmed. “Before, you wouldn’t necessarily dismiss what a Democrat said or what a Republican said by virtue of their partisan affiliation, because that wasn’t an automatic indication of what their beliefs were. That’s not the case anymore. And what we’re seeing is that when you have ideologically active partisans presented with conflicting evidence, they double down on their initial inclinations. The people with the most well-developed worldviews are the ones who are most resistant to accepting disconfirming evidence.”

We touched on the subject of trust, and Dickinson said that when we view our governing institutions as out of touch with our concerns, as a significant portion of the electorate does, “we increasingly are willing to discount what they tell us is the truth. And if you don’t trust the government, why should you trust the National Science Foundation or the National Institutes of Health?” The populist movement that has aligned itself with Donald Trump on the right and with Bernie Sanders on the left has further exacerbated these inclinations, Dickinson said. “One of the hallmarks of populism is a distrust of elites, and that seems particularly pronounced in this election cycle. And science can be a part of that.”

***

I could have ended the story there, but that would have been depressing—plus I promised you a physicist, and I think you’ll be glad that I did.

Rich Wolfson is the Benjamin F. Wissler Professor of Physics, and he’s taught at Middlebury since 1976. Like any other Middlebury professor’s, his office bookshelves creak under the weight of their load, which, in his case, includes Collapse: How Societies Choose to Fail or Succeed by Jared Diamond and Cod: A Biography of the Fish that Changed the World by Mark Kurlansky, along with the dozens of physics texts that have titles too long to include here.

Wolfson has authored a number of books himself, including Simply Einstein: Relativity Demystified and the texts Physics for Scientists and Engineers, Essential University Physics, and Energy, Environment, and Climate. The last book is about to reach its third edition, a milestone that Wolfson seems particularly proud of. Before completing his PhD in physics, he earned a master’s degree in environmental studies and focused his thesis on environmental ethics.

He is active in outreach communications to what he calls “the non-science public,” something he has been doing for decades, “long enough that I have seen scientists move from looking down their noses at folks like me who reached out to lay people to recognizing that ‘hey, this might not be a bad idea.’”

Wolfson has taught courses on climate change since the 1990s and a specific course titled The Science of Climate Change since 2002. Designed for nonmajors, the class addresses the following questions: “Why do human activities affect climate? What future climatic changes can we expect? And what will be their impacts?”

He says that the course always fills—anywhere from 24 to 36 students. Half tend to be environmental studies majors, though not those already in the science track. The rest include religion majors, econ majors, history majors. (Similarly, Grasswick reports that her course on Science and Society draws not only philosophy majors and other humanities students, but also neuroscience majors and biochem majors. “I’ve had students tell me that it’s so great to also think about science in addition to practicing it.”)

On the day that I visited Wolfson in his sunny Bi Hall office, his Science of Climate Change students were taking an exam. Sporadically, they would filter into his office, seeking clarification on one question or another. Most queries were focused on one specific part of the exam, a classically Wolfsonian entreaty to analyze a climate system for a fictional planet named Zorq. For weeks they’d been studying Earth’s energy flows, Wolfson explained, and this particular task was a simpler subset of that work.

As the top of the hour neared, students began to spill into the office, dropping off their exams. Wolfson asked each one, “Did you get Zorq?” Responses ranged from the confidently affirmative to a shakier “I think so?” As I prepared to leave, I thanked Wolfson for his time and added, “I hope they all get Zorq.”

“They won’t,” he replied. “But that’s not entirely the point, is it?”

I smiled, and thought about something Barbara Hofer had told me. Those first-year students who had failed to understand the meaning of scientific theory, who had wanted creationism to be taught alongside evolution to ensure a “balanced debate”? A longitudinal follow-up to that study showed who had changed their views and why. Those who had exhibited “scientific curiosity” by indicating they intended to take further courses in the sciences (whether they had actually done so or not) had changed the way they thought about the issue.

I don’t think I have to tell you what they thought.