Even in a college science course, some doubt the science.
Show of hands, please. In the freshman biology class taught by Mindy Walker at Rockhurst University, how many believe that genetically modified foods are really as safe as the scientific community insists?
Thirty-four hands went up. But a couple dozen students weren’t so sure.
There was disagreement as well on whether Earth is warming because of human activity: 46 yea, 10 nay, a few unsure.
Should childhood vaccines — with measles the current hot button — be required? Almost one-fifth of the room thought no. And this class was for nursing majors.
The same questions were presented, and similar uncertainties expressed, in a recent national report by the Pew Research Center. Its surveys reveal that while huge majorities of scientists say one thing, in many cases regular people believe another.
Walker encounters it often. Even in senior-level biology courses filled with smart adults with years of exposure to the concept of human evolution, “I’ll always have a few students who just aren’t buying it.”
Academics and others cite plenty of reasons for Americans’ distrust of science. Religion and partisan politics come into play, of course, but so do the Internet, suspicions of large institutions, a lack of understanding of how scientists work and the often convoluted or combative ways in which they communicate to the masses.
“Scientists know how to speak to each other. But most have no idea how to speak to people” outside their group, said Alan I. Leshner, CEO of the American Association for the Advancement of Science.
More than 3,700 association members were included in Pew’s surveys, which polled them and 2,002 other adults representing the overall U.S. population. On issues ranging from medical care to food safety to the environment, wide credibility gaps emerged between public belief and near-universal scientific consensus:
▪ Asked whether it’s safe to eat genetically modified foods, 88 percent of AAAS scientists said yes, while only 37 percent of U.S. adults agreed.
▪ While 87 percent of scientists attributed climate change to human activity, Americans in general were 50/50 on the subject.
▪ On evolution, 98 percent of scientists believed that humans evolved over time, compared with 65 percent of U.S. adults.
▪ More than eight in 10 scientists surveyed said world population growth will be a major problem, compared with just 59 percent of U.S. adults.
▪ On the increasingly political issue of vaccinations, the gap was smaller, though still telling. Eighty-six percent of scientists said childhood vaccinations should be required, while 68 percent of the population sample said the same.
Pew’s findings have a margin of error of plus or minus 3.1 percentage points.
For health care providers, much blame for the scientific divide rests with a usual scapegoat, the Web.
Peter S. Holt didn’t hear many challenges from his patients when he started his career a quarter century ago. But today it’s common for people to come in with their own diagnoses, usually wrong, and suggestions on treatments they think would work.
Typically they’ve learned it online.
“It is rare anymore that a patient comes into the office without going on the Internet,” said Holt, who practices internal and geriatric medicine for Saint Luke’s Health System. His relationship with patients “used to be paternalistic, more like, ‘Do what I say,’ and people would say, ‘Fine.’”
Much as the digital age has brought useful information to our fingertips, he said, the Internet and social media can also frighten.
People tend to fear the worst with every ache. Google searches lead them to a self-diagnosis of cancer or a remedy not proven to work. That can instill mistrust of practitioners who offer different answers, Holt said.
Scientists call it “confirmation bias.” You harbor a worry that childhood vaccines cause autism, so you check online, where you discover that other parents blame vaccines for their kids’ disorders.
Venture deep enough into Web portals and blogs and your worries about vaccines (and the non-organic foods you buy) are confirmed, even if research says otherwise.
“Confirmation bias is a huge, huge factor” in the distrust of science, said Paul Fidalgo of the nonprofit Center for Inquiry, an advocacy group dedicated to secularism and the sciences.
Then again, scientists don’t always get it right.
Just last week, news broke of a federal advisory panel proposing to ease guidelines about avoiding egg yolks and other high-cholesterol foods. While acknowledging a continued risk of heart attacks from having the “bad” kind of cholesterol naturally produced in blood, cardiologists agreed that consumers shouldn’t fret over most cholesterol in food.
“We got the dietary guidelines wrong,” the Cleveland Clinic’s Steven Nissen told USA Today. “They’ve been wrong for decades.”
Then there’s aspirin. Does it help prevent heart attacks?
For some, experts say yes. But a study published in January’s Journal of the American College of Cardiology found that 11.6 percent of patients being treated by cardiologists were taking too much aspirin — enough perhaps to kill them.
And that report came during a flu season made worse by a less-than-effective vaccine.
In 2009, public trust in the scientific consensus on global warming took a dive with a scandal called Climategate, which centered on emails exchanged by climate researchers at a British university.
But several polls in recent years show upward ticks regarding Americans’ belief that humans are causing climate change.
With that and some other scientific topics, experts suspect politics are shaping views.
According to Pew, slightly more than seven of every 10 Democrats and left-leaning independents think Earth is warming primarily due to man-made causes. In comparison, only 27 percent of Republicans and right-leaning independents say so.
A couple of likely presidential contenders, Gov. Chris Christie of New Jersey and U.S. Sen. Rand Paul of Kentucky, recently weighed in with reservations about states and school boards requiring childhood vaccinations.
Both are Republicans, but Pew’s studies found the same share of Democrats as Republicans saying parents should decide.
Not commenting can be radioactive too.
On a visit last week to a prestigious London think tank, Gov. Scott Walker of Wisconsin, another Republican weighing a 2016 presidential run, dodged when asked if he believed in the theory of evolution.
“I’m going to punt on that one,” Walker said, lighting up social media. “That’s a question a politician shouldn’t be involved in one way or the other.”
Few would deny that politics have sullied science and vice versa.
“There are partisan differences about everything,” said Karlyn Bowman, an American Enterprise Institute senior fellow who tracks public opinion. She said that in surveys about science, people may be uninformed about the research and simply echoing “Barack Obama’s position or the Republican position.”
Facts vs. values
Bowman also suspects many people don’t size up issues the way trained scientists do: “The public doesn’t necessarily approach these questions with fact-based evidence in mind. They start with their values.”
As for the scientists, why would topics ranging from food safety to climate change elicit such strong agreement from all disciplines, from chemists to ecologists to astronomers?
“If research is published in well-respected, peer-reviewed journals, we believe it,” said AAAS’s Leshner. “We depend on the reliability of the (scientific) method. It’s a very rigorous process, a learned skill.”
Lottie Lawlor gets all that.
But as general manager of Nature’s Own Health Market in Kansas City, Lawlor still doesn’t believe it’s OK for agribusiness to mess with the genes of crops and livestock so they can grow faster and be disease-resistant.
Like others sticking to organic diets, Lawlor was influenced by the processed-food writings of Michael Pollan — a journalist, not a scientist — and disturbing documentaries such as “Food, Inc.” and “Genetic Roulette,” available on Netflix.
“I believe the scientists when they say the food is safe,” she allowed. “I just don’t like it. I don’t agree with that industry and the politics behind it.”
Londa Nwadike, a food safety specialist for K-State Extension Research Service and the University of Missouri, said the industry is partly to blame.
When Monsanto and other companies began developing “genetically modified organisms,” or GMOs, “they got so excited and talked a lot with farmers” about the breakthroughs, Nwadike said. “But they didn’t talk much to the public initially.”
Talking may be a scientist’s weakest skill. The research community knows it needs to do a better job explaining itself to win over the public, policymakers and funders.
It’s why many doctoral candidates in the hard sciences around the country are now required to take courses in science communication.
At Stony Brook University in New York, budding scientists are doing improvisational theater and signing up for classes such as “Writing to Be Understood.” It happens at the school’s Alan Alda Center for Communicating Science, named for the actor (and nonscientist) who was host of PBS’ popular “Scientific American Frontiers.”
“We tell students to know their audience as much as possible,” said the center’s director, Elizabeth Bass. “When do their eyes glaze over? When do they laugh? React to those things. You don’t want a one-way conversation.”
Students also are encouraged to tell stories of real-life individuals helped by scientific discovery rather than speak of larger, faceless population groups.
Easy-to-read white papers and good storytelling might help restore in Holt’s patients a stronger faith in science.
But pollsters have been charting Americans’ dwindling faith in most institutions for 30 years.
“People don’t believe the CDC (Centers for Disease Control) because it’s Big Government,” Holt said. “They don’t believe Big Pharma because they’re just out to make money. … Anything that smacks of bureaucracy is fodder for skepticism.
“It really is a matter of trust.”