GES Center faculty Jason Delborne and Jennifer Kuzma are featured in the October issue of Scientific American:
By Brooke Borel | Scientific American October 2017 Issue
In 1999 Robert Shapiro, then head of Monsanto, gave a stunning mea culpa at a Greenpeace conference in London. Monsanto’s first genetically engineered (GE) crops had been on the market for only three years, but they were facing fierce public backlash. After a botched rollout marred by lack of transparency, the company, Shapiro said, had responded with debate instead of dialogue. “Our confidence in this technology … has widely been seen, and understandably so, as condescension or indeed arrogance,” he said. “Because we thought it was our job to persuade, too often we’ve forgotten to listen.”
The damage was already done. Fifteen years later only 37 percent of Americans thought that GE foods were safe to eat, compared with 88 percent of scientists, according to the Pew Research Center. Regulatory bodies in the U.S. fought for years over whether and how to label GE foods. In 2015 more than half of the European Union banned the crops entirely.
Science doesn’t happen in a vacuum. But historically, many researchers haven’t done a great job of confronting—or even acknowledging—the entangled relation between their work and how it is perceived once it leaves the lab. “The dismal experience we had with genetically engineered foods was an object lesson in what happens when there’s a failure to engage the public with accurate information and give them an opportunity to think through trade-offs for themselves,” says R. Alta Charo, a bioethicist and professor of law at the University of Wisconsin–Madison. When communication breaks down between science and the society it serves, the resulting confusion and distrust muddy everything from research to industry investment to regulation.
In the emerging era of CRISPR and gene drives, scientists don’t want to repeat the same mistakes. These new tools give researchers an unprecedented ability to edit the DNA of any living thing—and, in the case of gene drives, to alter the DNA of wild populations. The breakthroughs could address big global problems, curtailing health menaces such as malaria and breeding crops that better withstand climate change. But even if expectations for CRISPR and gene drives do come to fruition—and the relevant products prove safe for both people and the environment—what good is the most promising technology if the public rejects it?
“Without transparency, we might see a kind of hyperpolarization,” says Jason Delborne, a professor of science, policy and society at North Carolina State University. Concerned groups will feel marginalized, and advocates won’t receive critical feedback needed to improve design and safety. “This puts the technology at risk of a knee-jerk moratorium at the first sign of difficulty,” he notes.
To avoid that outcome, some researchers are taking a new tack. Rather than dropping fully formed technology on the public, they are proactively seeking comments and reactions, sometimes before research even starts. That doesn’t mean political and social conflict will go away entirely, Delborne says, “but it does contribute to a context for more democratic innovation.” By opening an early dialogue with regulators, environmental groups and communities where the tools may be deployed, scientists are actually tweaking their research plans while wresting more control over the narrative of their work.
Continued…
Read the full article at Scientific American