Keeping Good Research from Going Bad
The White House Office of Science and Technology Policy released yesterday the second installment of a policy initiative to address research that, while being done for the right reasons, could be used to cause significant harm.
Dual Use Research of Concern—or DURC, as such research has come to be known—has risen quickly on the agenda of the Obama administration as life scientists have become increasingly capable of manipulating genetic material in microbes. This ability can speed the development of medicines and vaccines, but it can also be used to create particularly dangerous pathogens. Release of the draft policy—which is open for public comment for the next 60 days—marks the latest effort by the government in a delicate process of figuring out how to balance safety and security interests with concerns about stifling scientific freedom and technological progress. This new policy, which is focused on federally funded studies, could also serve as a harbinger for future government actions pertaining to this type of privately funded research.
The newly proposed policy has its roots in the fall of 2011, when the world was surprised to learn that the editors of two major science journals (Nature and Science) were poised to publish two controversial studies. Researchers at two separate institutions had—with funding from the National Institutes of Health—created strains of potentially deadly H5N1 (bird flu) virus with a dangerously enhanced ability to spread. Other scientists had taken similar steps. In 2002, for example, researchers in New York used commercially available genetic material to create infectious polioviruses from scratch. And in 2005 researchers at the Centers for Disease Control recreated the deadly 1918 influenza virus, which caused one of the most devastating pandemics in human history.
Although federally funded, the work done in these influenza studies apparently took the government by surprise, causing the Obama administration’s health officials and scientists to confront the issue of how to respond to scientists’ growing ability to create deadly diseases with relative ease.
Policies currently standing
Some relevant protections are already in place. Since 1997 the federal government has maintained a list of “Select Agents”—biological agents and toxins declared by the Department of Health and Human Services or the Department of Agriculture to pose a serious risk to public health and safety. The Centers for Disease Control runs the “Select Agent Program,” which regulates labs that possess, use, or transfer the agents within the United States. But the authors of the Select Agent Program did not anticipate the possibility that scientists would gain the ability to create such agents—or even more dangerous ones—from relatively benign organisms or from easily obtainable genetic components.
To address this gap after the flu-research issue arose, the government brought together experts from various health- and security-related agencies to consider how to handle risks posed by DURC, which they defined as life science research that could reasonably be anticipated to provide knowledge or technologies that could be misused to pose a significant risk to public health and safety, agriculture, the environment, or materiel or national security. In March 2012 the government released a multistep program that specifies how funding agencies should deal with potential DURC.
The first step of this program requires that federal agencies identify research—both newly proposed and ongoing—that specifically involves any of the 15 “Tier 1,” or most dangerous, agents on the Select Agent List. The second step is to identify any research involving those agents that could be expected to result in any of the following:
- Enhancement of the harmful consequences of the agent
- Disruption to the immunity conferred by, or the effectiveness of, an immunization against the agent
- Conferral of drug resistance to the agent, or any other change that makes the agent harder to detect
- Increase in the stability or transmissibility of the agent
- Alteration to the host range of the agent—an increase, for example, in the number of vulnerable species
- Enhancement of the vulnerability of a potential victim population of the agent
- Reconstitution of an agent that no longer exists in the natural world
Under this policy, any research fulfilling these criteria will lose funding unless the researchers craft an acceptable plan mitigating potential risks.
The March 2012 policy took a top-down approach, requiring federal agencies to assess for potential DURC research under their umbrella that was already underway and work with researchers to create plans for risk mitigation. The policy released Thursday applies the same philosophy from the bottom up: Instead of federal agencies bankrolling projects before assessing them for DURC, in order to receive funding the researchers and institutions must now follow the same protocol for risk identification and mitigation required of the agencies in their reviews under the March 2012 policy. If the research qualifies as DURC, risk mitigation will be written into the contract under which researchers and institutions receive federal grants.
If implemented, the draft policy will affect huge numbers of researchers and institutions, and it remains to be seen how well they will respond to the proposal. Institutions may not appreciate the increased responsibility of reviewing research proposals for DURC, and scientists may chafe at the new guidelines; no one likes to feel as though they are not trusted, and many researchers have long viewed it as their right to pursue basic knowledge without much oversight. In this view, extra layers of oversight could stifle scientific progress. Just how burdensome this policy is will likely depend on what degree and form of risk mitigation is deemed acceptable by those granting funding.
From the perspective of the Obama administration—which has felt pressure from members of Congress who worry that certain federally funded research could threaten national security—the policy as proposed is much less burdensome than it could be. The government could have required mitigation plans for research involving any select agents instead of just Tier 1 agents, or it could have even called for classification or other restrictions on communicating the results of DURC, which would have had the potential to slow the information-sharing among scientists that is so important for scientific momentum. Furthermore, by introducing the policy in draft form with a 60-day comment period, the Obama administration appears to be signaling its willingness to be flexible and open to changes. The government also plans to gather data on how well the policy is working over the next several years and—ultimately—incorporate lessons learned.
Though this latest policy effectively addresses DURC where government funding is involved, a comprehensive approach to handling DURC in the private sector appears to be some time away, in part because it would require congressional action—a long shot, considering current gridlock. An intensive lobbying campaign could reasonably be expected. Since such a policy would address situations where the government does not provide the money—and could not therefore threaten to pull funding as an incentive for compliance or as a punishment for noncompliance—there would have to be some independent enforcement mechanism. The Select Agent Program offers one approach, where violators potentially face fines and jail time. But industry representatives would undoubtedly see such a threat as having the potential to stifle American innovation in the life sciences—and if such a federal policy is not drawn clearly, they could very well be right.
What to do in the meantime
While it will be some time before the policy on federally funded DURC is finalized—and significantly longer before any such policy is likely to be enacted for the private sector—there are immediate steps that can be taken by the government to address the biosafety and biosecurity concerns raised by DURC. The most concrete and least burdensome way to do this involves raising awareness within the life science community and working with existing institutions to encourage best practices.
Lessons can be learned from the physics community. During the Cold War, much of the research done by physicists into nuclear technology was classified because of the imminent risk such knowledge posed. No one is calling for widespread classification in the field of genetic engineering—for one thing, the tools are far simpler and more accessible than those required for nuclear bomb-making—but life scientists would do well to take some tips from the physics community, which has become accustomed to thinking about the national security implications of its work. This consciousness is still underdeveloped within the life science community, despite the growing capacity of the life sciences to wreak havoc on a large scale.
Awareness of safety and security concerns could be encouraged in many ways—for example, through short online tutorials like the ones currently used to educate health care workers about patient protections under the Health Insurance Portability and Accountability Act, or HIPAA, and to reduce the risks posed by blood-borne pathogens. Such tutorials might not tell scientists anything they do not already know, but they could keep the threat of DURC present in their day-to-day consciousness in much the same way that the “See Something, Say Something” campaign implemented by the Department of Homeland Security has made people more likely to notice other potential risks.
Increasing awareness will only help so much, though, in cases where institutions have no reporting mechanism for researchers to register concerns. Institutions should have mechanisms in place where researchers can raise questions about their own or others’ work in a nonthreatening way. The goal must be to assess risks and address them without triggering an overly burdensome response that could delay the development of defenses against real biothreats.
Another topic to confront immediately within the government and the scientific community is how best to mitigate the risk when DURC is identified—something for which the new policy provides little guidance. As the government was reminded in the H5N1 influenza situation that brought this discussion to the fore, it is difficult to know where limits on scientific knowledge should be placed. The two previously mentioned journals, for example, published virtually all of the H5N1 research, leaving out a few details to make the work more difficult to replicate for purposes of bioterror. If full-blown classification is not warranted, should large chunks of studies be redacted when published—and if so, wouldn’t this serve as an impediment to researchers? If America goes down that road, what authority will decide who has access to certain knowledge and who does not?
It’s widely appreciated that a major reason U.S. science has been so successful is that research in this country is relatively free of unwarranted restrictions. It is therefore imperative that unnecessary roadblocks not be put in the way of research—even as the life sciences make inroads into potentially worrisome terrain. Certain scientific advancements, though, do carry great potential to cause harm, which indicates a necessity for at least some minimal guidelines. Rightly or wrongly, public opinion is also a factor here, and as the intense media interest in the H5N1 research showed, some research is bound to appear to the general public as a cause for alarm. Scientists who fail to be sensitive to this fact risk triggering a crackdown on scientific freedoms by Congress or the Obama administration.
The Obama presidency has prided itself from the start on its respect for the vast importance of science—which comes as a relief, after science suffered years of neglect under the Bush administration. Moving forward, it will be incumbent upon the Obama administration—as well as on researchers and their institutions—to strike the right balance between safety and security concerns and the research freedoms that fuel scientific progress.
Oliver Kendall has worked for numerous political campaigns and organizations and studies political science at Macalester College in St. Paul, Minnesota.