    Engineered Pandemics? A Memo To Bird Flu Virologists
    By Robert Cooper | February 14th 2012 10:55 AM | 2 comments

    A note to bird flu virologists: Not all of you have been approaching this whole engineered flu pandemic controversy quite optimally.  It’s understandable that you weren’t prepared for all the attention.  After all, you were only answering calls from both the NIH and the World Health Organization to better understand the deadly H5N1 bird flu.

    Ultimately, by showing how bird flu can more easily infect mammals, you are trying to prevent a natural flu disaster.  So of course it was surprising and offensive when the media branded you as out-of-control mad scientists, and when the government decided it might censor your work.  That unprecedented step must have worried you greatly because it would be completely anathema to scientific progress and the free flow of information.

    But we must recognize that the public has legitimate reason to be worried as well.  Engineering two potentially deadly new viruses is a scary result.  The concern isn’t just because of poor timing relative to Contagion’s theater run – a real danger exists.  The response must be more than simply, “Don’t worry guys, we got this.”  We need to acknowledge the legitimacy of public fear, and figure out how to seriously address the implications of this work going forward.

    Yes, these studies represent valuable work and they should be published, but some of your arguments in that direction miss the point.  It is not enough to argue: Don’t worry, these viruses might not actually be dangerous.  While true enough, that doesn’t address the flip side of the coin: the viruses may be very lethal indeed.  There’s only one way to know for sure, and that experiment is neither on the drawing board nor likely to get past an ethics board.  Nor is it convincing to argue that nobody with ill intent is smart enough to replicate your work, or that they wouldn’t want to anyway.  While not trivial, genetic engineering is getting easier every day.  Nor is it even enough to fall back on the argument that discoveries must always be shared completely, regardless of implications.  In the space of all possible discoveries, that can’t always be true.

    You have much better arguments for publishing this particular research.  This information has an immediate public health benefit by alerting us to what dangerous mutations to look for in wild flu strains.  While man-made pandemics are still science fiction, nature has been relentless in throwing new diseases our way, and to head off the next natural pandemic we will need all hands both on deck and fully informed.  Furthermore, the new research was not an earth-shattering breakthrough.  Like water flowing downhill or an iPhone prototype escaping in a bar, this new information is sure to come out eventually through natural scientific progress, no matter how hard we try to stop it.

    However, there does remain a risk of virus escape or misuse, no matter how remote.  While the papers in question should eventually be published in full, it would be helpful to delay the most sensitive details.  Such a compromise would avoid censorship and allow discoveries to benefit humanity to their fullest potential.  At the same time, the delay would provide time to develop vaccines against the new strains and allay public fears.  This controversy will not simply go away; it will require some action.

    Likewise, genetic engineering itself will not go away.  As synthetic biology advances ever further, engineering potentially dangerous pathogens will only become easier.  The uproar over the two new bird flu viruses must serve as a call to action; we need a new mechanism to ensure the safety of future studies.  At the Asilomar conference of 1975, your predecessors established ground rules for experiments using recombinant DNA.  This reassured the public about the safety of their work, and it allowed huge advances in science and biotechnology.  Those advances have now reached the point where technology has far outgrown the Asilomar recommendations.  Asilomar was sparked by the then-revolutionary ability to splice two pieces of DNA; we now have the ability to create an entire working synthetic genome from scratch.

    It is high time to update the Asilomar code of responsible research.  If we don’t, we could see censorship of science, public mistrust, stalled progress toward important cures, and possibly even a disaster, however unlikely.  But if we do address genetic engineering proactively, science could continue to progress just as it did after Asilomar, and in another 37 years we may be further toward vanquishing disease than we can now imagine.

    Comments

    Gerhard Adam
    I still don't see how any of this changes anything.  Does anyone really believe that all research that could result in weapons is published?  Does anyone really believe that there aren't governments and even individuals that are more than willing to finance such research secretly?

    Like it or not, our technology isn't something that can be controlled.  The genie is out of the bottle and no amount of "control" is going to put it back in.  While these steps may be necessary to establish procedures and protocols to avoid mistakes, it isn't likely to be much of a deterrent in any other way.

    After all, just like our legal system, laws are intended for those that are inclined to obey them anyway.  In all other respects, they offer no protection beyond describing the steps the state may take in retribution or enforcement. 

    So bear in mind that all the rules, procedures, and methods that are put in place will exist for those that are inclined to take them seriously.  To all others, nothing will have changed.
    Mundus vult decipi (“the world wants to be deceived”)
    car2nwallaby
    I agree: nothing can completely stop this technology from being used for malevolent purposes if someone is determined enough.  But an agreed-upon code would make sure that at least most research is safe, especially if it’s tied into the grant review process.  Hard to do research with no funding.

    Most scientists do want to follow ethical guidelines and to help, not harm, through their research.  The problem here was that these researchers were playing by all the established rules and got blasted for it anyway.  And not just by the media, but also by a federal biosafety board.  That suggests we need some new rules to help the vast majority of responsible scientists make sure their work is safe and won’t generate a huge backlash.

    To steal your analogy, we can't stop everybody from breaking laws, but that doesn't mean we should just get rid of them all.