Genetic engineering: the making of monsters?
Bernard D. Davis

In 1973 scientists integrated a number of esoteric techniques in microbial and molecular biology, making possible the directed molecular recombination of DNA. By this method, fragments of DNA from any source could be spliced in the test tube and cloned in host organisms. Scientists soon devised other ingenious techniques for manipulating DNA, including improved methods for isolating genes and determining their sequences. These developments have had a major impact on research in virtually every branch of the biomedical sciences. They have also created a burgeoning biotechnology industry that encompasses medicine, agriculture, and pollution control.
But despite the outstanding achievements and promise of this genetic revolution, the public has been ambivalent. People are eager for the benefits but fear the possible dangers. By now, after twenty years of expanding experience with biotechnology and no detectable harm to humans or to the environment, the anxiety has abated a good deal. Nevertheless, the development of safety regulations for bioengineering is still plagued by confusion, controversy, and continuing public apprehension. Why has there been such concern over essentially hypothetical dangers?
Setting the rules
As biotechnology began to take off in the early 1970s, molecular biologists themselves were deeply concerned over the potential dangers of their emerging field. Indeed, they contributed to public anxiety by publicizing their uncertainty--responsibly, and more openly than had been customary in the initial stages of a scientific controversy. One reason for this unusual candor was the overwhelming novelty and magnitude of these scientists' new powers. Many scientists no doubt also took pride in being responsive to student pressure for increased public participation in the control of technology. But these scientists overestimated their ability to identify the relevant risks of biotechnology, and they underestimated the apprehensive public response to their early and open discussion of potential problems.
Molecular biologists had pioneered their new field with great intelligence and experimental skill; they understandably viewed assessing the risks involved as a problem that they could and should solve in the same way. But risk assessment was more properly a problem for evolutionary biologists and epidemiologists than for molecular geneticists. The first two groups, unfortunately, were only sparsely represented at a major 1975 conference at Asilomar, California, which laid the groundwork for the initial regulations governing biotechnology. Moreover, in the early years of this new discipline virtually everyone assumed (erroneously, we now know) that the possible range of novel organisms was unlimited. Thus, few scientists felt confident in making predictions about such a radically new world.
Accordingly, excessive caution prevailed at the conference. In a tense and rushed atmosphere, some 200 scientists attempted to thrash out various complex issues in the presence of the press. This was quite a departure from scientific tradition, whereby a committee of experts first explores the problems created by new discoveries and then presents its evaluations to the public. At Asilomar separate working groups offered different assessments of various risks; conference participants recommended that the "worst-case" scenarios become the foundation for new regulations governing biotechnology. These rules were then formulated and administered by the National Institutes of Health (NIH), guided by a Recombinant DNA Advisory Committee (RAC).
Second thoughts about risk
The conclusions reached at Asilomar skewed the debate from the start. Though the initial guidelines may have reassured the public, they were, in retrospect, excessively restrictive, treating even normal human DNA as dangerous. Moreover, the guidelines classified several levels of risk for different groups of organisms on a basis that appeared to be much more scientific than it really was. Within a few months scientists close to the problem realized that they had exaggerated the dangers. But it took a number of years for the RAC to relax its guidelines for those recombinants that were almost certainly harmless.
Although much initial concern focused on the process of DNA recombination, by now virtually all bioengineering experts agree that the evaluation of a novel organism should be based solely on its properties rather than on the technique used to create it. Moreover, the regulatory definitions were soon outgrown as researchers developed new techniques for inserting DNA into cells that do not involve recombination in the test tube, such as applying an electrical potential or bombarding cells with DNA-coated microscopic metal particles. Nevertheless, regulators and the public have not so readily abandoned the view that recombinant DNA presents a special, dangerous case. What accounts for this persistent dissonance?
I would first note a unique feature of this controversy: its prolonged focus on hypothetical risks, rather than on the more usual exaggeration of demonstrated ones. With newly recognized bona fide sources of harm, such as asbestos or radon, we ordinarily react slowly and then overreact after a lag. But with recombinant DNA we reacted explosively and we continue to debate the issues vigorously--even though the basis for predicting future harm from recombinants has become exceedingly tenuous (with the exception of products of organisms that are themselves pathogens, that is, sources of disease).
Concern about genetic engineering has been further intensified by an underlying uneasiness over the future impact of gene therapy and genetic screening in human beings. Here, there are clearly serious concerns. These include the potential tendency of therapeutic purposes to blur into eugenic ones, the likelihood that knowledge of individual susceptibilities to future disease will often generate more anxiety than benefit, and the certainty that such knowledge will greatly increase problems of privacy. These concerns have been most influential in those countries where distortions of genetics contributed to the Holocaust: The Green political party has impeded progress in genetic engineering in Germany even more than anti-technology activist Jeremy Rifkin has been able to do in the United States. Concern for environmental deterioration has led to similar reactions in other countries, and in Switzerland it has driven out some biotechnology-based industries.
Still another, more general reason for uneasiness over genetic engineering has been an extrapolation from the model of the physical technologies. A few decades ago the advances in these technologies seemed to be providing us with a virtually free lunch; but disillusion set in as we encountered unanticipated costs to our environment and our security. As a result, some fear that manipulating the cell nucleus might, like manipulating the atomic nucleus, have unforeseeable costs.
Natural adaptation
The assumption behind these concerns is that we have no basis for estimating future dangers from biotechnology. In fact, we do have a basis for predicting the effects of biotechnology--our historical experience with domestication. Domestication began when our ancestors learned to tame certain animals, plants, and fermentation microbes to serve human needs, then discovered how to select empirically for varieties strengthened in certain valuable traits. The benefits have been strikingly free of social costs for thousands of years, in contrast to the more mixed bag yielded by the physical technologies.
Furthermore, the products of past domestication have not "taken over" via spontaneous spread, as is feared for the products of the new biotechnology. They have spread only to the extent that cultivation by humans has caused them to displace the earlier occupants of the same territory. Since the products of the new biotechnology are based on an extension and refinement of the same principles that govern domestication, they should be subject to the same limitations on their spread.
These limitations arise from the nature and scale of the evolutionary process. Evolution has been continuously experimenting with genetic novelties for three billion years. It has been extraordinarily effective in filling each ecological niche with organisms exquisitely adapted to that environment, from the Alaskan tundra to hot vents in the depths of the ocean. Moreover, in the microbial world the scale of natural adaptation is enormous: The average shovelful of soil contains as many individual creatures as the total human population. By comparison, our genetic experiments in the laboratory are puny.
Accordingly, the likelihood that we can further improve on the adaptation of an organism to its natural environment is virtually nil. (We may be able to improve the adaptation of an organism to an artificial environment, such as a farm. But this advantage is limited to the boundaries of that environment.) When we breed for "improvement" in an organism--an increase in a property that serves us--both theory and empirical evidence point to a decrease in its adaptation to the environment from which the parental strain was taken.
This decrease reflects the costs of the genetic changes that we introduce, for these changes lead to a less efficient or less balanced synthesis of what the organism needs. These costs may be large enough so that the organism becomes dependent on our care for survival; or they may be small enough that it can still become feral again and survive in nature (as with wild horses, dogs, and cats). But the important point is that no domesticated strain has been shown to be better adapted than its parental wild type to the original environment, and hence to displace the wild type there. The same can be expected for engineered variants.
There is an exception to the prediction that we cannot improve on nature: When the environment is changed we can sometimes predictably improve adaptation by introducing appropriate genetic changes. An example is an environment changed by the widespread use of antibiotics. This environment will select for naturally occurring resistant microbial strains, which arise by spontaneous mutation or via spontaneous genetic recombination. Under these circumstances, if we introduced genes for resistance into otherwise already well-adapted bacteria we could accelerate this population shift. One of the aims of proper regulation of recombinant bacteria is to avoid the spread of such resistance genes to pathogenic organisms.
Limits to genetic novelty and spread
But if we can now remake organisms at will, is there not a qualitative difference between modern genetic engineering and classical domestication? With an unlimited range of products, might not some inadvertently spread beyond our control?
Several arguments should allay this concern. First, even though we can indeed manipulate DNA in the test tube at will, it does not follow that we can modify organisms at will. In order for an organism to develop and function effectively its parts must interact in a coordinated manner, fitting each other like the parts of a smoothly functioning machine. Hence, only those new variants that have a sufficiently coherent, balanced set of genes can survive.
Furthermore, even if a radically altered organism is nominally viable, it suffers significant disadvantages in evolutionary competition, as noted above. Recombination of ill-matched genes from distant sources will yield poorly adapted organisms, not the vaguely conceived, dangerous monsters of current science fiction.
As with engineered bacteria, the main new concern with engineered plants--which will ultimately have a much greater variety and a much greater economic and social impact--has been that movable genes will spread from new plants to other species and will create novel, harmful weeds. But principles similar to those described above for bacteria would also mitigate this risk. For such dangers to arise, several conditions would have to be met: cross-pollination to a wild relative, survival and germination of the resulting hybrid embryo, fertility of the resulting plant, its competitive survival and establishment in the environment, and its creation of environmental problems. The many recombinants that have been examined have all failed to overcome these barriers.
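Since each of these conditions must be satisfied in sequence, the overall chance of harm is roughly the product of several individually small probabilities. The expression below is only a schematic way of seeing why the compounded risk becomes vanishingly small; the step probabilities are hypothetical labels for the conditions just listed, not measured values:

$$P(\text{harmful weed}) \approx p_{\text{cross}} \times p_{\text{germination}} \times p_{\text{fertility}} \times p_{\text{establishment}} \times p_{\text{harm}}, \qquad \text{each } p_i \ll 1.$$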
But I suspect that these rather theoretical arguments did not have much to do with the gradual realization that genetic engineering is not so dangerous after all. Most important was the simple experience of expanding the work into thousands of laboratories without harm. Another factor was the eventual recognition that organisms containing foreign DNA arise in nature and hence are not as utterly novel as was initially assumed. The initial assumption of great novelty was understandable, not only because the technical advance was an enormous one, but because molecular biologists must have felt a Promethean pride in having apparently created combinations of genes that could never before have existed on earth. But scientists did not create de novo the several steps that they used in developing DNA splicing in the test tube. These steps all occur in nature, and the key discoveries simply improved their efficiency. So while these discoveries were essential for producing recombinant bacteria in the laboratory in great variety and in usable quantities, it is difficult to avoid the conclusion that in nature bacteria must also take up DNA from foreign sources and produce recombinants, though at a very low rate and with a very low survival value.
In humans, for example, bacteria must occasionally come into contact with DNA released from cells that die continually at the intestinal surface. Viruses could provide another mechanism, since they can introduce genes from a previous host, as well as their own genes, to a new host cell. Our failure to find human genes today in E. coli, a bacterium found in the human intestines, does not mean that they were never there: We would expect most such hybrids to be at an evolutionary disadvantage and hence to disappear in subsequent generations.
Indeed, sequences of DNA from distant organisms are sometimes strikingly similar, which provides evidence for this occasional crossing of the barriers between species. The whole body of DNA in the world may therefore be connected horizontally through lateral transfer of small blocks, as well as vertically by descent through normal reproduction. Thus, the new recombinants seem increasingly less radical.
One other feature of the microbial world provides further reassurance: the stringent requirements for pathogenicity. The vast majority of bacteria are not pathogens--they do not cause disease. They are found in soil and bodies of water, where they convert organic matter to simple degradation products (carbon dioxide, ammonia, etc.), which recycle into other microbes or into plants. One does not easily make an effective pathogen out of such harmless bacteria. For with pathogens, as with benign microorganisms, evolutionary success is not ensured by any single, powerful gene; it depends on an effectively interacting ensemble of genes.
Consider the example of diphtheria toxin. The gene that codes for this potent toxin is found in nature only in the diphtheria bacterium, where it is accompanied by other genes that help make the organism an effective pathogen. Diphtheria toxin is not found in any other bacterium. Hybrids that formed the toxin must surely have arisen in nature from time to time but did not survive.
Man-made epidemics?
With increasing recognition of these arguments against a special danger in recombinant bacteria, the guidelines for the use of such organisms, in research and in industry, were progressively relaxed. The issue seemed to be pretty well settled. But in 1984 a new wave of concern arose, as scientists began developing potential applications of biotechnology that would involve the deliberate introduction of engineered organisms into the environment. Examples include the use of such organisms to reduce the need for nitrogen fertilizer, to replace toxic chemical pesticides, to digest toxic organic pollutants (such as spilled oil), or to prevent frost damage on crops.
This latest alarm over biotechnology has centered largely on two existing models. The first is of damage to the environment from toxic chemicals. The extent of this damage depends on the scale of the introduction. There is an important difference, however, between this model and that of bioengineered bacteria. With chemicals the harm is created directly by the introduced material, while with bacteria the harm would depend on the uncontrolled multiplication of the progeny. Such spread in turn would depend on the ability of the introduced organism to compete, in a Darwinian world, with those organisms that are already present. And the effect of scale on that competition is clear. If an introduced soil organism is not competitive, even huge numbers can have only a transient and local effect before dying out. Conversely, if it should be more competitive than the native organisms (though that would not be expected, for reasons presented above), even a small, accidental escape from the laboratory could start a spread, just as a single import of smallpox can start an epidemic in a susceptible population.
Thus, scale of introduction is not decisive for competing bacteria. Of course, one can imagine that on a huge scale the "transient and local" effect could be significant, even though it would not result in spread. But that problem would not come as a surprise, and we should be able to control it.
The second major basis for concern over introduced organisms is the harmful spread of certain "exotic" organisms after their importation from distant regions. This analogy has had widespread appeal. The unexpected and costly spread of certain imports, such as starlings and kudzu vine in the United States, or rabbits in Australia, has legitimately caused great concern to ecologists. But the parallel to engineered organisms is weak, and perhaps even specious, because of a key difference: One process moves an unchanged organism to a new environment, while the other changes the organism and then returns it to the original environment.
This distinction has large consequences. Specifically, non-engineered exotic transplants have already been well adapted by evolution to their native environment, where their population density has been limited by various physical and biological factors. In a new environment that lacks these factors the organisms will proliferate excessively. Engineered organisms, in contrast, are ordinarily returned to the original environment--and as we have already noted, they are highly likely to be less well adapted than the parental strain to that environment. (Of course, the altered organisms may be transplanted to a new environment; but then we would be dealing not with a special problem of recombinants but with the old problem of exotic pests.) Recognizing these considerations, many ecologists have stopped stressing the analogy to transplanted species. Its public appeal, however, persists.
There is still another source of reassurance: the extensive experience already accumulated through the use of genetically modified microbes in agriculture. Such organisms, obtained by traditional genetic methods, were introduced long before recombinants became available. They include strains of Bacillus thuringiensis (used to kill insect larvae) and nitrogen-fixing bacteria (used to spare the need for nitrogen in fertilizer). Their regulation was straightforward, and no harm has been detected. Clearly, commercial use of recombinant bacteria, as of any other bacteria, will require similar measures. But the purpose will be primarily to avoid toxicity for humans and animals, rather than to anticipate uncontrollable, harmful spread. (A parallel experience in medicine has been regulation of the use of live, attenuated viruses as vaccines, including smallpox, polio virus, mumps, measles, and rubella.)
Regulatory responses
The regulatory agencies in the United States have responded in disparate ways to the conundrums created by the new genetics. A February 1991 report on national biotechnology policy by the President's Council on Competitiveness was a watershed. Government policy, the report said, should seek "to eliminate unneeded regulatory burdens for all phases of developing new biotechnology products--laboratory and field experiments, product development, and eventually sale and use."
The report further noted that "existing regulatory structures for plants, animals, pharmaceuticals, chemicals and toxic substances provide an adequate framework for the regulation of biotechnology in those instances where private markets fail to provide adequate incentives to avoid unreasonable risks to health and the environment." This was a tacit admission that, although it is somewhat artificial to view living organisms as chemical substances in order to apply existing legislation, new legislation would likely cause even greater problems by treating recombinants as a special case.
The report also echoed earlier recommendations in reports from the National Academy of Sciences and its National Research Council: Regulation should focus on the characteristics and risks of each biotechnology product, and not on the process by which it is created. With a view to the future, the report further declared that "[r]egulatory programs should be designed to accommodate the rapid advances in biotechnology. Performance-based standards are, therefore, generally preferred over design standards." These principles were reinforced and further expanded in a February 1992 report of the Council on Competitiveness, which defined in greater detail the recommended scope of regulation.
Nevertheless, the requirements of the assorted regulatory agencies remain in various stages of readiness and of compatibility with stated policy. The NIH has been closest to that policy. However, its Recombinant DNA Advisory Committee has by now essentially withdrawn from all areas of genetic engineering except gene therapy in humans. The Food and Drug Administration (FDA) has long been responsive to the same concerns that were expressed by the Competitiveness Council, and it has not imposed any special procedures or requirements for products made with the new biotechnology. Moreover, in a May 1992 policy statement on regulation of foods from new varieties of plants, the FDA emphasized that its regulations would be based on objective characteristics of the food, and not on the use of particular genetic techniques. It also suggested that, with certain well-defined exceptions, novel varieties could safely be exempted from regulation.
The Department of Agriculture has been somewhat ambivalent. Policies at its Animal and Plant Health Inspection Service have remained essentially unchanged since 1987--though the federal policy on "scope" would seem to dictate some refinement. Another part of the department is engaged in developing regulations for research; current drafts seem incompatible with the federal "scope" policy.
The Environmental Protection Agency has been especially favorable to apprehensive, risk-averse approaches to biotechnology. Since the EPA is charged with safeguarding the environment it has understandably relied heavily on the advice of ecologists, whose professional interests encourage conservatism toward changes in the environment. Indeed, the Ecological Society of America has formally recommended that no proposed introduction of recombinants be approved until it has been examined by ecologists. Since 1984 the EPA has made proposal after proposal focusing specifically on organisms obtained by recombinant DNA techniques. All of these drafts have been incompatible with the key principles of the Council on Competitiveness--and they have not solved the immense practical problems that will arise if every field experiment by a graduate student in an agricultural school must receive specific approval from Washington.
The Federal Coordinating Council for Science, Engineering, and Technology is supposed to integrate the policies of the different agencies, but its efforts, and other ad hoc discussions, have not been notably effective. Problems have arisen not only from differences in the rules of various agencies but also from overlaps in their jurisdiction. Unfortunately, large corporations, which might be expected to be deeply concerned, in general have not vigorously opposed regulations that involve extensive bureaucratic procedures. This may be because these companies are bothered less by expense, which can be passed on, than by unpredictability, which makes planning difficult. In any case, the outcome has been hard on academic researchers and small companies, often the best sources of imaginative products.
Prospects for the future
It is easy to arouse public suspicion of microbes, which are most familiar as "germs" that cause disease. Past costs and errors in the exploitation of new technologies have generated a cadre of political activists who have become skillful in using the courts and the media to promote their position. Moreover, as noted before, the professional concerns of ecologists encourage conservatism about changes in the environment, while various lay environmentalist organizations are even more conservative--and emotional. The genetic revolution thus calls for a great deal of education, both in our schools and among the adult public, on the beneficent, essential role of most of the world's microbes. (Only a tiny fraction of all microbes are pathogens, attacking organic matter while it is alive instead of waiting until it is dead.)
In this essay I have emphasized evolutionary principles, as the broadest base for prediction in biology. But this reliance on theory is unlikely to be fully accepted even by many biologists--let alone by a public that is skeptical about evolution. Yet if we ignore evolutionary principles, trying instead to assess the risks only on the basis of empirical tests, we encounter other limitations. For we cannot hope to duplicate in our experiments all the conditions that might be encountered in nature, or all the variants that might arise.
Instead, as in all of science, we should feel confident in building on principles after we have examined a reasonable number of concrete variables. For most classes of bacterial recombinants currently attracting interest, I believe we have met that requirement. Nevertheless, recombinants are still subject to special regulation, in the futile and expensive quest for the virtually absolute safety that the public has been led to expect. The cost of such regulations includes the loss of substantial benefits from potential new products and, in some cases, a decrease in safety, since these restrictions impede the use of engineered organisms to replace more toxic chemicals.
As time passes without visible harm from biotechnology it seems inevitable that anxiety will continue to abate, and that regulations will eventually become more uniform and sensible. But meanwhile the success of demagogic appeals to public anxiety about the unknown has been discouraging, and it has led some administrative agencies to be more vigorous in responding to "perceptions of public perceptions" than in trying to educate the public on the scientific evidence. The resulting delays in testing have been burdensome for those scientists whose morale and momentum--not to mention fulfillment of the obligations of their grants--depend on tests of their products in the field.
A conspicuous example involved an ingenious use of a recombinant bacterium to decrease frost damage to certain crops. This organism, a derivative of Pseudomonas syringae called ice-minus, had been deprived of the gene for a component of the bacterial surface that initiates the formation of damaging ice crystals on the plant leaves. This modification makes the organism comparable to the attenuated pathogens used as vaccines in humans. Moreover, similar ice-minus derivatives are found quite widely in nature (in smaller numbers than the ice-forming strain). For these reasons the ice-minus organism, used to displace its ice-plus parent, appears to be as safe a recombinant as one can imagine.
Nevertheless, environmental activists, encouraged by Jeremy Rifkin, succeeded in alarming citizens living in the neighborhood of the proposed tests. The resulting protests caused the tests to be forbidden successively in two growing seasons. When the tests were finally conducted, with stringent precautions, knowledgeable observers could be either disgusted or amused at the sight of the experimenters--forced to wear "moon-suits" while spraying the plants--standing a few feet from photographers plying their trade without protection from the presumptively dangerous spray.
This history illustrates the difficulties encountered when participatory democracy is applied to science. It is one thing for citizens to weigh in on policy decisions about conflicting goals, but quite another for the public to participate in technical analyses of risks and benefits. Unfortunately, the courts and administrative agencies have often shared or responded to scientifically unsound, distorted public perceptions. It is to be hoped that, with experience and familiarity, they will learn to apply common sense to the increasingly important science of genetic engineering.
This article is based on a chapter in G. E. Gaull and R. A. Goldberg, eds., The New Global Food System (John Wiley and Sons, forthcoming).