Scientists urge revamped regulations for genetic engineering

Are new techniques for genetic modification being used to escape some of the regulatory oversight imposed on conventional genetic engineering techniques for the past several decades?

That’s the suggestion in a recent New York Times piece by Andrew Pollack that focused chiefly on a genetically modified grass being developed by the Scotts Miracle-Gro Company. It quoted Scotts chief executive Jim Hagedorn as telling investors explicitly: “If you take genetic material from a plant and it’s not considered a pest, and you don’t use a transformation technology that would sort of violate the rules, there’s a bunch of stuff you can do that at least technically is unregulated.”

Scotts’ relationship with regulators has not always been so smooth. It had a calamitous earlier experience with conventional genetic modification in 2003, when a genetically engineered grass it was then developing escaped from a test plot, a violation of the regulations governing traditional biotechnology methods for genetic modification.

That grass was engineered to resist the herbicide glyphosate, and the fear was that resistance could be transferred to other plants, especially weeds. The USDA’s Animal and Plant Health Inspection Service (APHIS) fined the company $500,000.

Scotts’ newer grass also resists glyphosate but is not subject to regulation because its new genetic material comes from other plants and is not inserted by bacteria, as in conventional forms of genetic engineering. It thus contains no “foreign” genes that would make it subject to GE regulations.

I have written a number of times here at GLP about the possibly salubrious regulatory future for new genetic modification techniques that are not transgenic, meaning they do not involve insertion of “foreign” genes from other species. For example, a new wheat developed in China resists the disease powdery mildew, yet was achieved by techniques resembling natural mutations. A researcher expressed the hope that the Chinese government, which has lately been leery of genetically modified organisms (GMOs), would regard the new wheat as nontransgenic.

Researchers using gene-editing techniques like CRISPR/Cas, based on a disease-fighting mechanism that bacteria developed nearly 3 billion (yes, billion) years ago, have expressed hope that the techniques would be regarded as benign by regulatory authorities. But they have embraced the new methods largely because they are simpler, more versatile, and cheaper, a big technological advance. CRISPR, which I explained here at GLP nearly a year ago, was a runner-up for Science magazine’s Breakthrough of the Year in 2013.

But Scotts’ Hagedorn has made it clear that escaping a time-consuming and expensive regulatory apparatus is a chief selling point for the new genetic modification techniques.

Scientists urge rethinking oversight rules

Does that mean that technological advances will make regulation of GMOs a thing of the past? Seems unlikely. For one thing, in the past year there have been calls for regulation of the new methods, and they are not coming only from anti-GMO activists.

Last November, plant scientists at the University of California-Davis took to the pages of Nature Biotechnology to complain of “Genetically engineered crops that fly under the US regulatory radar.” They say an increase in the number of requests for exemption from regulations, especially from public institutions and smaller companies, suggests that adoption of new GM techniques “may be a deliberate strategy for smaller entities to navigate the US GE crop regulatory framework.”

But the Davis scientists are not complaining only about underregulation of some newer forms of genetic modification. They also object to “overregulating GE crops and technologies with proven track records of safety.” It is high time, they say, for a reevaluation of US GE regulations.

The problem, they say, is that the regulatory apparatus focuses on the process that produces a GMO, not on the GMO product itself. New techniques like gene editing fall outside the regulated processes, such as the insertion of transgenes.

The Davis researchers call for building “a system that is based on science, with enough flexibility to evolve with accumulating scientific knowledge and technologies and, importantly, that allows the participation of small companies and public sector institutions.” Not much to argue with there.

Predicting harm

The point that regulation should be about individual GMO products rather than the way they are made was echoed recently on a private ag biotech listserv, although the commenter also urged a focus on products with actual risks. The commenter argued that refocusing regulation of genetic engineering on the product and its risk of harm would do wonders for the regulatory system.

Still, it’s not always possible to know in advance what harms might arise. Back in the last century, at the dawn of genetic engineering, the fact that the risks, if any, were unknown was exactly the problem that occupied the scientists who gathered at the legendary Asilomar conference in 1975. In those days the idea of inserting genes from one organism into another was brand new and the potential consequences unknowable. The fact that the scientists doing the work were the ones raising the alarm was one factor that led eventually to government regulation of GMOs.

Now, decades later, scientists have done a lot of work and know a lot more–about biotechnology specifically and about genetics in general. Just one example: last October regulators working at Canada’s Plant and Biotechnology Risk Assessment Unit reviewed potential “insertional effects” that might give rise to unintended traits when a new gene is inserted in a plant.

They argue that insertional effects from genetic engineering are similar to many other genetic changes that occur in plants, sometimes spontaneously (i.e., “natural” changes) and sometimes due to conventional breeding processes. One example from nature, which I’ve written about here at GLP, is the transposable elements that hop around in a genome, sometimes landing in the middle of a gene and disrupting it. Roughly half of the human genome is composed of transposons.

Insertional effects due to genetic engineering, the Canadian regulators say, should present a level of risk similar to that of DNA insertions from natural processes and conventional breeding–and this information should be incorporated into comparative pre-market assessment of GE plants, foods and feeds. Moreover, it’s now clear that insertional effects do not always lead to unintended consequences. They cite the 2013 study by William Price and Lynne Underhill showing that detailed reviews of the nutritional composition of more than 100 GE plants in the United States and Canada failed to identify a single adverse effect of genetic engineering.

Price, now retired from the U.S. Food and Drug Administration, is co-author of another 2013 study arguing that 20 years of scientific investigation has shown that “suspect unintended compositional effects that could be caused by genetic modification [of crops] have not materialized.” The paper concludes: “Hence, compositional equivalence studies uniquely required for GM crops may no longer be justified on the basis of scientific uncertainty.”

The Canadian regulators point out that GMOs involve disrupting DNA a lot less than the genetic manipulation indulged in every day by people using conventional plant and animal breeding techniques. Of course that’s not necessarily reassuring. Conventional breeding has given us thousands of dogs beset by the miseries of hip dysplasia. On the other hand, it has also given us the new English roses that resist diseases, bloom repeatedly, and smell glorious. And then there are evolution’s breeding techniques, the experiments performed on entirely random mutations by natural selection. Which have given us, well, us. Warts and all.

Listening to the scientists’ POV

Where all this is headed, heaven knows, but the fact that scientists are among those complaining that existing regulations can’t take account of the current–and future–realities of genetic modification should help authorities take the need for regulatory reform seriously. And while much of the agitation surrounds agricultural biotechnology, note that concerns about genetic modification go far beyond agriculture.

That’s because new techniques like gene editing have direct human applications. Some of them are in medicine, such as an experimental approach to HIV therapy. Some are science fiction, although plausible science fiction, like direct genetic engineering of human eggs, sperm, and zygotes.

These developments have prompted scientists’ concerns too. More than a year ago, the Institute of Medicine, an arm of the National Academy of Sciences, recommended that the government replace the moribund Recombinant DNA Advisory Committee with a new structure for oversight of risky clinical research. The Harvard scientists who proposed the “gene-drive” methodology for genetically modifying entire mosquito species to prevent malaria likewise urged public discussion of this mind-bogglingly massive plan. They also said new regulatory structures might be called for.

So once again, as at Asilomar in 1975, scientists themselves are pointing out that we are unprepared for the genetic futures now in the pipeline. Present regulatory structures are not just outmoded, they say; they are counterproductive and inadequate to protect us from possible harm. Let’s hope they will be listened to.

Tabitha M. Powledge is a long-time science journalist and a contributing columnist for the Genetic Literacy Project. She also writes On Science Blogs for the PLOS Blogs Network. Follow her @tamfecit.


3 thoughts on “Scientists urge revamped regulations for genetic engineering”

  1. After discussions with a weed scientist, I am starting to think that we should adopt regulation focused on traits. Something as powerful as Bt technology or glyphosate resistance might be lost in a few years if the appropriate refugia or standards are not established. It is a tragedy-of-the-commons scenario wherein the communal benefit must outweigh individual gain. It is largely irrelevant to me whether the trait got there by biolistics, Agrobacterium, mutagenesis, conventional breeding, CRISPR, or TALENs.

    With the advent of gene editing, technology has leapt over the constant din of people opposed to 30-year-old technology. In my mind, they have argued themselves into a corner; if one can accept chemical or radiation mutagenesis, how can one argue against site-directed mutagenesis? Hopefully, public acceptance of new traits developed by gene editing will defuse some of the rancor so that appropriate regulation can be developed. Instead of wasting all that time and energy arguing over completely meaningless topics like labeling sugar, maybe we could have a real conversation about how to take advantage of new technologies while minimizing realistic potential risks.

    • After discussions with a weed scientist

      Good gig if you can get it ... oops, I was thinking of the other “weed.”
      The rest of your points are excellent.

  2. Instead of speculation and guesswork about off-target effects, unknown genetic modifications, and other such concerns, let us use technology to settle these questions. I propose that any new plant variety, regardless of the method by which it is obtained, must have its genome, and the genome of the plant from which it was derived, completely sequenced with the current technology that provides the highest confidence level over the entire genome. Those genomes must be recorded in a publicly available database. All differences in the two genomes must be identified (a toy sketch of this comparison step appears at the end of this comment), and the effects of those differences must be investigated, understood, and reported by at least three independent research labs. Regulatory decisions would be required to be based on those reports. The only evidence that could be considered by the regulatory authority would be the reports produced by the process described in the next paragraphs.

    This approach to regulation would ensure that all proposed new plants receive exactly the same regulatory treatment, would provide an opportunity to advance our knowledge of plant genetics, and (hopefully) would increase public confidence in the regulatory process. All reports submitted as part of the regulatory process would be subjected to an extensive peer-review process, for which the regulatory agency would compensate the reviewers to increase the likelihood of a thorough review. Some care would need to be exercised to ensure that no one lab or school would dominate any part of the process, and that reviewers would be blind (as much as reasonably possible) to the identity of the lab producing the reports.

    The funding for this work should be handled a little differently from the usual research grant process. Any lab that has accepted funding from any commercial or private interest would be disqualified (perhaps with a time limit). Outside of that, any lab could submit an application. The application would require only that the lab provide evidence that it currently has the capability to identify gene function and gene interactions, and the lab must submit papers it has published that were based on the lab’s work in those areas. The papers submitted would be reviewed by some specified number of compensated reviewers, who would pass judgment on the completeness and thoroughness of the work reported in each paper. Labs that cannot show current capability, or that submit papers found questionable by the reviewers, would be disqualified. All applicants meeting the requirements of current capability and acceptable scientific work would be placed in a lottery, and the needed number of labs would be selected at random.

    Yes, this system would require the involvement of a large number of research scientists, but I would recommend that the process be used as a training exercise involving a large number of post-docs to do the reviews of the papers submitted with applications and the papers submitted at the completion of the investigations. A few senior research scientists might be retained to provide “quality assurance” by reviewing the methodology used by the reviewers, as well as the quality and consistency of the analyses done by the post-docs. All participating reviewers would be vetted for potential conflicts of interest.

    An aggregation of the summary analyses done on the final research reports would inform the regulatory decision in a prescribed manner such that the combined scientific judgment of the review process could not be negated by the regulatory agency.

    Direct funding for this process would be provided by the NIH; the NIH would charge a commercial or private interest seeking regulatory approval a fee equal to the actual cost of the process (no more and no less); the review of products developed under funding by U.S. government agencies would be funded by the sponsoring agency.
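
    A minimal sketch of the genome-comparison step described above, assuming (unrealistically, for brevity) that the parent and derived sequences are already aligned and of equal length; a real pipeline would rely on whole-genome alignment and variant calling, and every name and sequence below is purely illustrative:

        # Toy sketch: list the point differences between a parent genome
        # and a derived variety. Assumes pre-aligned, equal-length
        # sequences; real comparisons require read alignment and
        # variant calling across the whole genome.
        def list_differences(parent: str, derived: str):
            """Yield (position, parent base, derived base) for each mismatch."""
            if len(parent) != len(derived):
                raise ValueError("sketch assumes pre-aligned, equal-length sequences")
            for pos, (p, d) in enumerate(zip(parent, derived)):
                if p != d:
                    yield pos, p, d

        # Hypothetical fragments of a parent line and its derived variety.
        parent_seq = "ATGGCATTAGCCTGA"
        derived_seq = "ATGGCGTTAGCTTGA"
        for pos, p, d in list_differences(parent_seq, derived_seq):
            print(f"position {pos}: {p} -> {d}")  # e.g. "position 5: A -> G"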

