The results of a new poll, released last week by the Pew Research Center, suggest that the American public's scientific literacy is — to use a technical term — so-so.
A nationally representative sample of 3,278 adults responded to 12 multiple-choice questions designed to assess their knowledge of science. (You can test yourself here.) For some questions, performance was quite good: 86 percent of respondents correctly identified the earth's core as its hottest layer. For other questions, performance was not so good: Only 34 percent knew that water boils at a lower temperature at high altitudes. Overall, the mean score was 7.9 out of 12, with only 6 percent of respondents achieving a perfect score.
In a recent 13.7 post, Barbara also raised the topic of scientific literacy, prompted by an 11-question test developed by Jon Miller, the director of the International Center for the Advancement of Scientific Literacy at the University of Michigan. (You can take Miller's test here.) That test revealed equally mixed performance in a group of 2,500 respondents polled in 2008. For instance, 86 percent knew that light travels faster than sound. But only 37 percent agreed with the (true) claim that "human beings, as we know them today, developed from earlier species of animals," and a full 53 percent agreed with the (false) claim that "the earliest humans lived at the same time as the dinosaurs."
What do these findings really tell us? Is knowledge of science trivia a genuine sign of scientific literacy? Is accepting evolution?
Of course, the answers to these questions depend on what we mean by scientific literacy. Miller, whom Barbara interviewed for her post, writes that "civic scientific literacy represents the level of reading and comprehension skills needed to read the science section of the Tuesday New York Times or to watch an episode of Nova on public television."
The National Science Education Standards offer a slightly different take:
"Scientific literacy is the knowledge and understanding of scientific concepts and processes required for personal decision making, participation in civic and cultural affairs, and economic productivity."
A National Academies book explaining the term goes on to identify important scientific content associated with scientific literacy, but also key abilities. For instance, they write:
"Scientific literacy means that a person can ask, find, or determine answers to questions derived from curiosity about everyday experiences. It means that a person has the ability to describe, explain, and predict natural phenomena."
Miller's definition, too, is ultimately about reading and comprehension skills, not knowledge of specific scientific content per se. We might worry, then, that these simple tests — purportedly of scientific literacy — don't tell us much at all: They mostly test knowledge of content, not abilities or skills.
Despite these worries, I think people's acceptance of basic scientific facts does tell us something about scientific literacy. But it might not be what you think.
Consider research by psychologist Dan Kahan and colleagues. In an influential 2012 paper published in Nature Climate Change, they found that beliefs about climate change risk weren't positively associated with scientific literacy, nor with a measure of "numeracy," which assessed people's ability to comprehend and use quantitative information. Instead, the authors argued that beliefs about climate change — a kind of scientific content — are largely determined by the values of the communities that people identify with. Consistent with this hypothesis, they found that people with more hierarchical and individualistic worldviews rated climate risk significantly lower than those who were more egalitarian and communitarian.
For Kahan, this research casts doubt on the idea that knowledge of specific scientific content reflects something deep about scientific literacy. In an interview in The Atlantic, Kahan argued that questions about whether humans evolved (like those used by Miller and by Pew in other polls) aren't a reflection of scientific literacy at all: "It measures whether you're religious," he claimed. "It's just an expression of identity."
So, one take is this: Tests like Pew's and Miller's do tell us something, but it isn't about scientific literacy; it's about personal and cultural identity.
But, here's another take: If scientific literacy is an ability that reflects the role of science in an individual's everyday life and decision making, then the role of science in that individual's identity could be crucial. In particular, an individual's identity and associated values could determine whether science is seen as a relevant authority when it comes to certain questions — like those concerning human origins or climate change. And that could be crucial to whether or not she uses her "understanding of scientific concepts and processes" in "personal decision making, participation in civic and cultural affairs, and economic productivity."
So, even if Kahan is right, tests like Pew's and Miller's might still tell us something important about scientific literacy. It's just that scientific literacy isn't isolated from a range of other beliefs: political, religious, epistemic and beyond. It's these beliefs in conjunction, not scientific beliefs in isolation, that shape our actual decisions and our engagement in civic life.
So, here's a two-question test for readers: How should we assess scientific literacy? And, much more importantly, how should it be fostered?
Tania Lombrozo is a psychology professor at the University of California, Berkeley. She writes about psychology, cognitive science and philosophy, with occasional forays into parenting and veganism. You can keep up with more of what she is thinking on Twitter: @TaniaLombrozo