Some horror movie tropes are so ridiculous and overused that they just come off as unbelievable. Like, “Girl who falls down for no apparent reason while being chased by a killer.” Or, “Group of friends that decides to split up when it’s obvious being alone will get you murdered.” And then there’s this one: “Science laboratory creates horrible disease that will inevitably escape and kill all of humanity,” which might be the most unbelievable of all, since it defies both logic and actual laws. Or rather, it did until Tuesday, when the U.S. government announced it was lifting a three-year ban on federal funding for experiments that alter viruses to make them even deadlier.
“Gain-of-function” research, in which scientists make pathogens more powerful or easily transmissible, is aimed at preventing disease outbreaks by better understanding how they might occur. The studies allow scientists, working in a highly controlled environment, to learn how a flu virus might mutate into a superbug capable of killing millions—a sort of game of wits played to gain insight into nature’s unpredictability. The ultimate goal is to proactively create vaccines, medications and other solutions to stop contagion in its tracks.
The new National Institutes of Health policy reverses a 2014 Obama administration funding ban on gain-of-function research projects specifically involving all forms of the influenza virus, Middle East respiratory syndrome (MERS), and severe acute respiratory syndrome (SARS). The new rules would extend beyond those viruses, “apply[ing] to any pathogen that could potentially cause a pandemic,” according to the New York Times. “For example, they would apply to a request to create an Ebola virus transmissible through the air.”
Possibly aware that this sounds like the prologue to a very hacky horror movie, the NIH accompanied its announcement with a list of criteria that proposals must meet before funding will be granted. According to those terms, a panel will only greenlight projects if the work promises to yield practical solutions, such as an effective antiviral treatment; the research benefits must sufficiently outweigh the risks; and researchers must prove their experiment outcomes cannot be obtained using safer methodologies. Applicants will also have to prove their researchers and facilities “have the capacity to do the work safely and securely and to respond rapidly if there are any accidents, protocol lapses, or security breaches.”
“We have a responsibility to ensure that research with infectious agents is conducted responsibly, and that we consider the potential biosafety and biosecurity risks associated with such research,” NIH director Francis S. Collins said in a statement. “I am confident that the thoughtful review process...will help to facilitate the safe, secure, and responsible conduct of this type of research in a manner that maximizes the benefits to public health.”
Despite those reassurances, critics continue to express concern about potential mishaps. There’s some precedent for this. In 2014, CNN reported that dozens of workers at the CDC had been accidentally exposed to anthrax, while others had mishandled samples of the bacteria. After prolonged monitoring, no staff were found to have been infected. A Vice Motherboard report notes that between “2003 and 2009, there were 395 events reported that could have resulted in exposure to toxic agents, although this resulted in just seven infections.”
Harvard epidemiologist Marc Lipsitch offered tepid support, telling the Times the approval panels are "a small step forward,” but cautioning that gain-of-function experiments “have given us some modest scientific knowledge and done almost nothing to improve our preparedness for pandemics, and yet risked creating an accidental pandemic.”
Conversely, Stony Brook University president and biomedical researcher Samuel Stanley worries that the NIH decision, after three years of funding prohibition in this area, may be too little and just a wee bit too late.
"There has been increased scrutiny of laboratories working in this area, which can lead to an even more robust culture of safety,” Stanley told NPR. “But I also fear that the moratorium may have delayed vital research. That could have long lasting effects on the field. I believe nature is the ultimate bioterrorist and we need to do all we can to stay one step ahead."