Strategies for Managing Complex Scientific Interactions

A 2009 Pew Research Center poll on the state of science and its impact on society revealed that the high regard Americans have for scientists is unrequited. The profession, according to the poll, thinks the public is scientifically illiterate and disapproves of the media’s dumbed-down, lackluster coverage of scientific issues.

This schism is perpetuated by the prevailing mode of scientific communication, the top-down deficit model, in which scientists force-feed the public a prescribed dose of simple fact, on the theory that the public suffers from a lack of scientific knowledge and that scientists know what’s best.

But scientific communication researchers and other observers see a change, contending that outreach communications are in a transitional phase that recognizes the complexity of social and scientific interactions and attempts to address the problem of assigning responsibility.

Bruce Lewenstein, professor of science communication at Cornell University, sees “clearly a changing culture” wherein “being engaged with the public at some level is just part of what it means to be a scientist.”

When he began teaching a graduate-level course in public communication in 2007, Lewenstein asked students to raise their hands if they were “afraid their PhD supervisor would find out they were taking the class, because there was this attitude you shouldn’t be communicating with the public.” The class would laugh, says Lewenstein, and then about half would raise their hands. But this year, “not a single hand went up.”

The surge of interest in public engagement is driven by a confluence of factors—outreach snafus, research confirming the failings of the deficit model, and the ascendancy of new media and social technologies.

To extract more social value from funded research, funding agencies increasingly mandate outreach and educational components in grant proposals, criteria “designed to get scientists out of their ivory towers and connect them to society,” says Arden Bement, director of the National Science Foundation. But more fundamentally, this communication groundswell is a bulwark against the erosion of public trust and an effort to preserve the image of scientific integrity—the sine qua non of science.

“We live in an age when most policy debates relevant to science…are collectively decided at the intersection of politics, values and expert knowledge,” says Dietram Scheufele, professor and John E. Ross Chair in Science Communication at the University of Wisconsin–Madison and a leading voice for change.

Too often, the intersection is an accident waiting to happen for scientists on the firing line—whether in their dealings with restive publics, getting out front on revolutionary technologies, or mired in scientific controversy of numbing complexity and uncertainty.

The sum of scientific missteps has cast a pall over the profession—scientists portrayed as political and corporate errand boys, ethically challenged, spinners, prevaricators and proselytizers. The microscope has been turned on science.

Now for the good news: Science has accrued a wealth of what Scheufele calls “perceptual capital,” the public goodwill manifest in the Pew poll. The trick, says Scheufele, is spending it wisely when controversy arises—using a more collaborative, consensus-driven approach to public engagement based on careful research that avoids communication muddles. At the present burn rate, that surplus of perceptual capital will drain away, eroding public support and threatening funding.

“Scientists Behaving Badly” screamed a 2005 headline in the journal Nature. Lead author Brian Martinson decried the “striking level and breadth of misbehavior” of 3,000 government-funded scientists whose voluntary responses to an ethical conduct survey formed a spectacle of ethical transgressions. A parade of scientific perpetrators admitted to misdeeds ranging from the minor to the egregious, including 15.5 percent of respondents who said they changed how they conducted experiments, or their results, under pressure from funding sources—the most commonly cited misbehavior. Martinson’s study, thundered The Boston Globe, “threaten(s) the fundamental working of science.” The Wall Street Journal weighed in to denounce the “brazen culture of lawlessness.”

Nicholas Steneck, director of the Research Ethics and Integrity Program of the Michigan Institute for Clinical and Health Research, suggested during a 2006 American Association for the Advancement of Science (AAAS) science policy forum that “questionable research practices” could approach 50 percent of all research behaviors. (The outliers on Steneck’s bell curve quantifying the ethical landscape were falsification, fabrication and plagiarism at one end and “responsible conduct of research” at the other, each with between 0.1 and 1.0 percent of all behaviors.)

“The climate change issue really focused attention on how scientists try to handle their relationships with the greater public,” says Mark Frankel, director of the Scientific Freedom, Responsibility and Law program at the AAAS. It’s challenging enough to communicate about an issue of such magnitude and uncertainty, he says, but worse yet were instances of scientists trespassing into the policy orbit.

“You basically had some scientists buttonholing politicians, saying, ‘Look. Here’s what the data shows, it’s a real problem, we got to act and here are some suggestions.’ Some did that willingly,” says Frankel, “and some were naïve.” Either way, “they wound up saying things they would never say in a journal getting peer reviewed.” The Bipartisan Policy Center, a think tank co-chaired by former House Science Chairman Sherwood Boehlert (R-N.Y.) and Donald Kennedy, former editor of Science, urged the Obama administration to “establish procedures for keeping politics from clouding science in regulatory decisions,” warning that the “politicization of science” degrades policy debate and undermines public faith in science. The use of ideological criteria to select members of federal scientific advisory committees has been scrutinized by the Government Accountability Office.

Global warming continues to wreak havoc with the scientific community’s reputation. In late 2009, hackers released e-mails purporting to show data manipulation by global warming proponent/climatologists in England. The scientists were subsequently exonerated, but the exoneration was not the stuff of front-page news. The incident, dubbed “Climategate,” occasioned a New York Times editorial deploring “diversionary controversies.”

Oft-contentious stem cell research is another arena where scientists have engaged in what Frankel calls “exaggerated advocacy, really going out on a limb making claims” about cures. “This overstepping really hurt us. The other side comes back and says, ‘OK, so where are all these cures?’ Then you have the scientists who developed the California policies regarding stem cell research. They wrote their 10-year expectations into their documents before Phase 3 trials. That was responsible.”

But Lewenstein, a self-described “historian who focuses on issues of public understanding of science,” takes issue with the idealized concept of purity—of science separate from values—maintaining that a social and political drumbeat has always accompanied the march of science.

“When Galileo first [turned] the telescope [to the sky], he was looking at [Jupiter] and he found three moons for [Jupiter]. Problem was, his financial support came from the Medicis, and there were four Medici brothers, and he did not publicly announce his findings until he’d found a fourth moon to satisfy his patrons. That’s no different from scientists who look for the most commercially viable version of their work….” And some of the greats played fast and loose ethically—great “precisely because they follow their intuition,” says Lewenstein, who cites HIV pioneer Robert Gallo, “famous for jumping, and sometimes that means his lab work is a little messy and everything is not quite documented. There have been some major accusations of fraud against him, which in the end have not been supported” but arise “because of the way he works.”

Scheufele and other agents of change maintain that scientists are frequently their own worst enemies regarding good communications— more inclined to trust their intuition than take advantage of the growing body of interdisciplinary data, more accustomed to lecturing than listening.

The deficit model has been shown to have little impact on public perceptions or policy and may aggravate conflict, says Scheufele. More sophisticated techniques like framing and deliberative forums prevent polarization and encourage dialogue but risk drifting into “selling” science instead of the longer-term goal of engagement through participation and trust. Rule No. 1 for science communicators, says Scheufele, is knowledge of the value systems of target publics, since inputs are filtered through belief systems.

A recently concluded American Academy of Arts and Sciences multiyear workshop, “Improving the Scientific Community’s Understanding of Public Concerns About Science and Technology,” recommends democratizing science-related issues by moving discussion upstream to give the public greater ownership. Amy Gutmann, President Obama’s newly appointed chair of the Presidential Commission for the Study of Bioethical Issues, announced that she would embark on a course of “deliberative democracy” to find common ground on controversial issues.

The deficit model represents “first-order thinking” about science/society relations, says Alan Irwin, dean of research at Copenhagen Business School, while the American Academy of Arts and Sciences proposal—with its bottom-up thrust toward building dialogue, trust and ultimately consensus—is of the second order. Irwin argues that third-order thinking—characterized by full consideration of issues and policy by all stakeholders and informed by science—is still lacking.

The media are event-driven; science is a process. Communicating the constancy of scientific uncertainty is an enormous challenge. A National Academies report on climate change recommends engagement—“iterative dialogues” between scientists and stakeholders—as optimal. “Perhaps the best role for scientists is to think of themselves not as communicators but as conveners and facilitators,” says Matthew Nisbet, associate professor in American University’s School of Communication.

With the fragmentation of established media and the ascent of digital and social technologies, media literacy becomes paramount for scientific communicators. One observer likened the state of scientific public discussion to “waves in a shallow pan” with “a lot of sloshing and not much depth.” As print media recedes, scientists can bypass former intermediaries and use the Internet, blogs and social media sites to communicate directly with various publics. Global warming “dismissives” are a small minority but wield disproportionate influence through adept use of new technologies. By striking early and digitally, scientists can set the narrative agenda for others to follow. The shrinking resources of most news organizations preclude serious enterprise reporting.

The onus is on science to be more anticipatory. When science spills over into the public space, organizations must proactively communicate messages stressing personal relevance and common shared values or risk ceding the stage to rivals and competing interests that may couch developments in terms of conflict, complexity or uncertainty.

Lewenstein has two tips for lab managers:

  1. Collaborate and pool resources to hire a professional communicator to work with local schools, museums and media. “Every lab says we’ll spend our money to build a good website. The world doesn’t need more websites.”  
  2. “Don’t fight the idea that you are in a social world. Recognize it and work with it.”