
Science & Technology

July 30, 2007
Volume 85, Number 31
pp. 51-55

Experiments Of Concern

Well-intentioned research, in the wrong hands, can become dangerous

Ivan Amato

FIVE YEARS AGO, when molecular biologist Eckard Wimmer of the State University of New York, Stony Brook, and his colleagues chronicled in Science magazine how they had chemically synthesized full-fledged poliovirus particles, waves of anxious "uh-ohs" crisscrossed the globe. The reverberations continue.

Courtesy of Jean-Yves Sgro/U Wisconsin
dangerous chemistry Poliovirus is a beauty of a beast that researchers synthesized chemically in 2002. Crystallographic rendition of 2PLV is shown.

When the paper came out, it had been only months since the Sept. 11, 2001, jetliner attacks and the subsequent anthrax-by-mail attacks. And here was a respected group of scientists telling the world that they—and now presumably others—could resurrect a nearly eradicated pathogen without relying on cells. Their revelation's timing bolstered the public's then rapidly growing sense that some people on this planet are hell-bent on procuring and applying powerful technologies for the purpose of inflicting injury and death on massive scales.

That the viral synthesis wasn't all that difficult magnified the point. Wimmer's group started with the Internet-accessible RNA sequence that provided the full genetic characterization of poliovirus, an RNA virus. From that information, the researchers specified corresponding strings of DNA, which they mail-ordered from a biotechnology company. Then, with a commercially available enzyme, they transcribed the resulting DNA construct back to the viral RNA. When they mixed this RNA with a soup of readily available biochemicals, authentic poliovirus particles formed. "It built itself," Wimmer said during the press fury that followed the report.

To drive home the fact that they had made the poliovirus particles chemically, not biologically, the researchers included in their paper an empirical formula for the virus: C332,652H492,388N98,245O131,196P7,501S2,340. The following year, at a symposium at the New York Academy of Sciences, Wimmer acknowledged the security concerns that come with a recasting of viruses as chemical constructions within reach of scientists in laboratories. But he also said, "We have to deal with this new reality. We cannot hide from it."

Roald Hoffmann couldn't agree more. The Cornell University chemist is taking decidedly theatric steps to encourage his research brethren to confront head on the reality that well-intentioned research that holds promise to cure disease, clean water, and otherwise improve the conditions of life can also be commandeered for sinister purposes. For Hoffmann, one of modern chemistry's intellectual fixtures and a Holocaust survivor whose own family history was tragically shaped in part by Hitler's murderous use of chemicals, denying the potential nefarious applications of well-intentioned research—the so-called dual-use conundrum—is not a viable option.

More on Roald Hoffmann's play and the playwright

Read the full script of chemist Roald Hoffmann's play about the dual-use conundrum (PDF).

"Should've" synopsis (PDF)

Website: Roald Hoffmann Website

"Chemists have not worried enough about the consequences of the molecules that they make," Hoffmann says. "We can reproduce nature; we can make the unnatural. Yet with creation of any kind, there is a question to be asked and that is, 'Will that synthesis, will that painting or that poem, will it hurt people?' "

On Aug. 5 in Turin, Italy, at the biennial meeting of the International Union of Pure & Applied Chemistry (IUPAC), Hoffmann, a theoretical chemist and cowinner of the 1981 Nobel Prize in Chemistry, will be asking a chemist-heavy audience to personally consider these uncomfortable questions in the premiere staging of his new play, "Should've."

THE PLAY opens in the immediate aftermath of the suicide of veteran synthetic chemist Friedrich Wertheim. The three main characters are Wertheim's second wife, Julia, herself a former chemist; his gung-ho molecular-biologist daughter, Katie, who is on the verge of discovering and disclosing a key molecular detail underlying the unusually deadly pathogenicity of the 1918 flu virus; and Katie's lover, Stefan, an artist who uses his work to make political and moral statements.

As the characters interact, the audience learns that before his death, Wertheim had published a synthesis for a molecule related to saxitoxin, the deadly and structurally fascinating chemical produced by red tide algae. The audience also learns that a clan chief in Uzbekistan exploited that new synthetic knowledge to procure enough saxitoxin to kill more than 600 men, women, and children of a rival clan by lacing a dessert at a wedding with the synthetic poison.

"The moral situation for Friedrich Wertheim was very stark," Hoffmann says. "The sin Friedrich imagines as his own came from research done in good faith." And it is just one of many nightmare scenarios that dual-use science makes easy to conjure, especially in these times when the leader of al Qaeda in Iraq, Abu Ayyub al-Masri, distributes an audio message, as he did last September, trying to recruit chemists, physicists, and other technical experts to develop "nontraditional weapons." Said al-Masri, "We are in dire need of you."

At first in the play, Hoffmann has his audience believe that the tragic use of Wertheim's chemical innovation is what motivated his suicide. In a letter mailed to his daughter before he poisoned himself with cyanide, he wrote, "I put that weapon in the killers' hands. I can't live with that." As the play unfolds, the cause of the suicide becomes ambiguous. Meanwhile, each of the three characters is transformed in a different way by the suicide and by revelations about Wertheim's life and relationships.

Katie debates with Julia and Stefan about whether she and other scientists who are pursuing knowledge with nothing but good intentions ought to consider limitations when it comes to knowledge that could, in principle, be used to kill people. Hoffmann does not answer this question. "A play is not a political or ethical tract," he says. But a minimal mandate does seem to come through, that scientists must ask the question: What bad things can come of the questions I am asking and the data I am publishing?

As Hoffmann often has said, "There are no bad molecules, only evil human beings." So it is a moral responsibility, he says, for chemists to imagine what those evil human beings might do with the new, public revelations about how nature works. Even good guys are capable of dark thoughts. After all, in writing "Should've," Hoffmann, a soft-spoken scholar, a poet and playwright, and a bridge between the arts and sciences, had to think like a terrorist.

Knowledge generated through scientific research and disseminated by publications can now rapidly diffuse worldwide, notes John D. Steinbruner, director of the Center for International & Security Studies at Maryland (CISSM), located at the University of Maryland. That is why "everyone has a stake in making judgments about what you are doing," he says, referring particularly to scientists working in laboratories. Steinbruner and his colleagues have been developing ideas and initial proposals about a comprehensive new mandatory oversight infrastructure that would provide independent scrutiny "on research activities that might plausibly generate massively destructive or at least highly dangerous consequences," as one of their recent reports puts it.


There are precedents for setting limitations on research. The horrific use of chemical weapons on the battlefields of World War I led first to the 1925 Geneva Protocol and ultimately to the Chemical Weapons Convention, along with the worldwide adoption of community mores to not pursue chemical weapons research and development. The advent of nuclear weapons in the mid-20th century led to a strong community awareness of the dangers of nuclear phenomena and the need to create formal mechanisms for limiting proliferation. Most recently, the life sciences community has taken steps to raise awareness among its members that their quest for fundamental understanding about life, health, and disease could be exploited by terrorists and military organizations that view these advances merely as novel means for realizing their ends.

In the past few years, for example, prestigious national and international organizations, including the National Academy of Sciences (NAS) and the British Royal Society, have convened blue-ribbon panels and established advisory boards charged with getting seriously paranoid about catastrophes that well-intentioned scientific achievements might make possible in the hands of the ill-intentioned.

Among the products of these efforts so far have been fat, sobering documents, such as the 2004 NAS tome "Biotechnology in an Age of Terrorism: Confronting the Dual-Use Dilemma" (also known as the Fink Report) and a follow-on report in 2006 titled "Globalization, Biosecurity, and the Future of the Life Sciences." In 2004, the Department of Health & Human Services, as part of the federal biosecurity initiative, established the National Science Advisory Board for Biosecurity (NSABB). Common to these documents and panels is talk of potential advisory and regulatory bodies that would work within well-defined protocols to assess the level of risk attached to dual-use research and exercise "protective oversight" commensurate with the risk.

Steinbruner is quick to point out, as are Hoffmann and most others contacted by C&EN for this article, that he is not a champion of research moratoria. "We have tried to find an experiment that should not be done, but I have yet to hear of a candidate that survived a half hour of discussion," Steinbruner says, referring to talks with his colleagues at CISSM and elsewhere. "You have to handle the consequences of discovery rather than deny yourself the knowledge. The answer we are suggesting is that we need to establish systematic oversight procedures such that whoever is proposing to do something in these inherently dangerous times is subjected to oversight."

UNFETTERED CURIOSITY has always been a mainstay of the scientific enterprise. Steinbruner and others pushing for what they consider to be prudent oversight of that curiosity are under no illusion that scientists will welcome their ideas.

Creative Eye-mages Photography
Staged Worry Actors perform a reading of "Should've" last year at King's University College for chemist and playwright Hoffmann.

At the moment, the chemistry community is gently opening doors to the discussion by way of developing codes of conduct and incorporating ethics into undergraduate and graduate instruction. The American Chemical Society's Committee on Professional Training, for example, recommends that students be trained in ethics, says Mary Kirchhoff, director of the society's Education Division. "I would like to see a course in ethics be part of every graduate curriculum in the country," Hoffmann adds.

Brian P. Coppola of the University of Michigan is one of the few chemistry professors who already include ethics discussions in his undergraduate organic chemistry classes. "We get the students to write case studies based on their own experiences" and ask them to participate in role-playing exercises, he says. Compared with age-old ethical issues such as cheating, fabrication of data, and acknowledging credit, the dual-use conundrum is new, and Coppola is just beginning to bring it into his teaching. "If at the end of the year I have students who understand that their actions have impact beyond them, then I have done something," Coppola says.

As for whether the chemistry community needs more formal, binding, and authoritative structures for managing dual-use research proposals and projects, Coppola calls for caution. Like Steinbruner, he is wary of letting imagined evil uses of chemical discovery and know-how inform regulatory action. "If you extrapolate that to its natural conclusion, you go back to bare skin and stone clubs," because no one would feel ethically permitted to pursue their research interests, he says.

"On the level of basic science, there should be no restriction on research or publication," adds C. Kumar N. Patel, a longtime researcher and manager formerly with Bell Laboratories and a member of the committee that produced the 2006 NAS "Globalization" report. But if investigators do find that there are easy pathways by which research can be exploited to do harm, especially on large scales, then "we must think seriously about how and when to publish," says Patel, who now runs Pranalytica, a company in Santa Monica, Calif., that designs and sells devices capable of measuring trace components in gases such as breath and the air around battlefields. "This is where societal interest should override personal interest."

For Randall S. Murch, formerly a Federal Bureau of Investigation forensics investigator focusing on terrorism and another panel member of the "Globalization" report, scientists can no longer assume the right to as much freedom as they have enjoyed, because, he says, the world has changed. "For certain groups, the gloves are off," says Murch, who is now a research program development official for Virginia Polytechnic Institute & State University. "When these groups are looking at weaponry, the means available to them to do harm, they now not only can look at guns and bombs, but they are thinking about nuclear, biological, and chemical agents as well."

In the 1980s, he notes, drug cartels recruited chemists to embed cocaine in products made of polyvinyl chloride, and the drugs would be extracted from the products after legitimate shipping. Says Murch: "Given the fact that there are money, power, and influence in the world of terrorism, one can imagine without much of a stretch that people with chemical training are thinking about ways of using certain kinds of illicit and dangerous compounds, whether weapons or other hazardous materials."

Professional societies are the loci where awareness of these possibilities and the sense of responsibility to minimize risks need to be nurtured, Murch says. And in line with his earlier role as an FBI investigator, he says, "Ideally, every scientist who might encounter work that goes over the line, and every student who is thinking about social responsibility, needs to report such activities when they see them."

"The international scientific community is wrestling with how to bring ethical issues into its way of thinking and in professional training, particularly in educational contexts," remarks chemist Peter Mahaffy of King's University College, in Edmonton, Alberta. Last year, Mahaffy, chair of IUPAC's Committee on Chemical Education, arranged for a reading of an early version of "Should've" at his own college, and he has been central in the follow-on production next week in Turin. There, Hoffmann will also be giving a plenary lecture on the necessary marriage of science and ethics.

Along with these Turin events, IUPAC is developing educational materials to raise awareness of underlying ethical issues related to "the multiple uses of chemicals and materials," says Mahaffy, who is writing a chemistry textbook due out in 2008 that stresses the connection between, in his words, "chemical reactivity and human activity."

The widespread adoption of more aggressive self-policing, new oversight structures, and a constant mindfulness of the quivering line between social responsibility and personal research freedom would require a lot of culture change, one of the most difficult types of change to realize. To those who are calling for the change, there couldn't be more at stake. ACS's vision statement is "Improving people's lives through the transforming power of chemistry." It will be up to every individual chemist, Hoffmann says, to back up this statement. "If not I, then who?" he asks.

Chemical & Engineering News
ISSN 0009-2347
Copyright © 2011 American Chemical Society
