It seems like scientists love to bemoan the American public’s apparent scientific ignorance. Everything from public education to the media to pure laziness is held responsible for this phenomenon, which is blamed for the many divergences between public opinion and scientific consensus. Amidst these diverse accusations, one thing remains constant: the calls for more and better science education.
This oft-touted line fits in with the narrative (usually from the Left) that Americans are parochial, poorly educated, ideologically driven ignoramuses. Better education would indeed be welcome. The call for science education seems like a safe solution to propose.
But is it a good one? And will it do what scientists want it to? The tradeoffs inherent in large-scale attempts to improve education in America are almost never examined in popular debate, and constant calls for more education may blind us to more effective interventions. Answering these vital questions would require examining how science literacy is taught, what resources it requires and what effects a science literacy intervention has on its subjects. All of that is beyond the scope of this piece. We seek merely to challenge the idea that science literacy and science education are likely to produce the desired result: an American public that trusts the scientific consensus on vital issues like the causes of climate change, the medical dangers of not vaccinating children and nutrition.
Greater science literacy and science education among non-scientists don’t always lead to greater public agreement with the findings of the scientific community. Conservatives who understand climate science better are actually less likely to believe climate change is an important problem than their less-informed conservative counterparts. Highly educated parents are more likely to refuse vaccination for their children than less educated ones.
The reality is that public mistrust of the scientific community cannot and should not be attributed to simple ignorance. The highly public scientific back-and-forth on complex issues like nutrition (for example, the recent about-face on the health effects of cholesterol) may have contributed to the erosion of the public’s faith in science headlines. The widespread discussion of positive bias (the tendency to publish only results that show an effect, or a stronger effect, while leaving results with less interesting implications unpublished) in many different sciences has caught the public eye and, perhaps, shaped public opinion, as have other high-profile articles challenging scientific institutions’ claims to objectivity. Stanford’s own Dr. John Ioannidis highlighted many of these issues in his widely cited article “Why Most Published Research Findings Are False” and in the discussion that followed. Figuring out when to trust the scientific consensus (and when not to) is far more challenging than learning a few facts about evolution or fossil fuels. Although scientific advances have immensely improved our lives and changed our world, these flaws matter because they provide an easy out for wannabe science critics and because they genuinely slow and undermine scientific progress.
However, there may be an even more significant force than biases in the scientific community driving the wedge between scientists and the public: differing value frameworks. To think that information alone will reliably change deeply held spiritual beliefs would be a naive assessment of human psychology. Where scientists and the public are most at odds, it is because scientific information contradicts (or seems to contradict) the values of some segment of the public. In other words, the public does not disbelieve information at random; there is a pattern to the skepticism. People of all political affiliations distrust science when scientific findings conflict with their core values. This is clear in areas where science intersects with religious beliefs, like evolution skepticism. We see disbelief stem from the “just world” convictions of climate change skeptics, from the faith in natural remedies and fear of impure chemicals among anti-vaxxers, and from the unfounded mistrust of new technology among GMO protesters.
To change their minds, these groups would have to not just accept new information but also leave behind core beliefs about the way the world works. Pleas for science education are an easy way to evade the ethical challenges that arise when science and morality intersect. Opposing ignorance is uncontroversial; opposing cultural and spiritual beliefs that are common but lead people astray, less so. Saying people just need more information is certainly more comforting and politically palatable than saying they have a faulty perspective or framework that leads them to wrong answers, of which their incorrect beliefs are merely a symptom. But just like the people who disbelieve science so selectively, we should beware our own motivated reasoning about how to fix problems.
Our point is not that these value frameworks are equally beneficial for society, or equally responsive to new information (they aren’t). Nor is it that the scientific consensus is certainly right and the public wrong on all these issues (it may not be). Rather, it’s that a sprinkling of science education should not be expected to fix these deep problems. Teaching people the skills to critically examine their beliefs, showing them different ethical systems, and training them not to flinch away from information that contradicts what they hold to be true are all far more likely to benefit the public than squeezing more information into an already over-full, too-early curriculum. And each of us should be especially careful to do our research when we notice a scientific finding that contradicts our values; we are subconsciously biased against the science, making us more likely to be wrong and to stay wrong. Instead of just encouraging a better-educated population of science enthusiasts, we need to give people the training to engage with science in a productive, critical manner while holding scientific institutions to the standards of excellence we require.
Contact Claire Zabel at czabel ‘at’ stanford.edu and Joseph (Joey) Zabel at joezabel ‘at’ stanford.edu.