[MUSIC] So I had just asked you a question about what the conversations were like early on in your tenure on the RAC. And you started talking about two kinds of containment, and you went through physical containment. So I assume biological containment was next. >> Right, right. Biological containment had to do with what kinds of host-vector systems would be used. And Roy Curtiss was on the RAC and was working very hard on developing host-vector systems that were very feeble, that couldn't escape the lab and couldn't populate the colons of people and cause them problems. So there was hope that biological containment would make recombinant DNA research very safe, in that not many institutions would have to renovate their research facilities and make them into P3 laboratories. >> Right, because that was one of the major concerns early on, right? Using E. coli. >> Yes, right. >> Which is resident in all of our guts, and would we really be making it antibiotic resistant, for example. >> So there was a very feeble strain of E. coli called E. coli K-12 >> Mm-hm. >> that Roy Curtiss developed and that came to be used in many of the recombinant DNA experiments. And it does seem to have stayed in the laboratory as it was supposed to do. >> Mm-hm. >> What we didn't know about at the time, at least in the '76, '77 era, was that the two researchers who had developed the technique of using plasmids to combine pieces of DNA that were not normally together had applied for a patent and had received a patent. And this was all in the background; it wasn't announced that they had secured the patent until 1980, when the debate was basically over. Because I think that would have complicated the matter from the standpoint of the researchers. They would have worried that people who were anti-business, like Jeremy Rifkin >> Mm-hm. >> would have seized on that as one more piece of evidence that this field really needed to be regulated carefully. >> Mm-hm. 
>> So that was in the background until the debate, the public policy debate, was essentially finished. And if I may just comment on what, >> Please. >> Stanford and the University of California did, I think that they adopted a very reasonable licensing policy with their patent. They charged a $10,000 fee up front for nonprofit organizations, including universities, and a $50,000 fee for private, for-profit companies. And then they requested, insisted on, a share of the profit from any inventions that came out of the work. It was a non-exclusive license. The technique was very widely licensed; nonprofits could afford this. And so, unlike the examples later on of Myriad Genetics tying up the test for BRCA1 and BRCA2 and charging what seemed like exorbitant fees, Stanford and the University of California licensed broadly. And I think there just was not any resistance by the research community to the policies that Stanford and the University of California adopted. >> So I wanted to go back to when you were talking about biological containment and sort of an evolutionary perspective on the risks. Were there any evolutionary biologists on the RAC at that time? >> No, we didn't have evolutionary biologists. So any comment about that kind of perspective would have come from the outside, from someone like Bernard Davis, for example, who did think a lot in terms of evolutionary biology. I would say the big division in terms of the basic science was between the microbiologists on the one hand and the molecular biologists on the other. The microbiologists were very familiar with working with infectious agents. >> Mm-hm. >> I think the molecular biologists had had a different kind of training, and so in some ways this was more scary to the molecular biologists than it was to the microbiologists. >> Interesting. And did that difference play out on the committee or just in the broader debate? >> It's a way of looking back at the discussion 
>> Okay. >> and asking yourself, who were the ones who raised the concerns? What was the training of Maxine Singer, for example? Was she trained as a microbiologist or a molecular biologist? I think it was the second. And I think that probably the microbiologists would have been less concerned, because in many cases they worked with dangerous pathogens in the lab, whereas the molecular biologists were just biochemists, basically. So this was just a whole unknown area for them, and maybe it seemed more scary to them. >> Interesting. So the scientists on the RAC presumably had bought in, at least to some degree, to this idea of regulating this area of science. How did the broader community of scientists feel about the RAC? Do you have any sense of that? >> Well, you used the word regulating. I don't think that they would have thought of it in those terms. What they thought of was, what kinds of experiments can go ahead right away >> Mm-hm. >> with no trouble, and are there any kinds of experiments that we're less sure about, that ought to be deferred >> Mm-hm. >> if not indefinitely, at least until we have a better database? >> Mm-hm. >> And so certain kinds of experiments were identified that really shouldn't be done right now, until we know more about this field. And so what gradually developed was a classification system that said that some experiments are so risky they shouldn't be done now; others are less risky, but we're still not sure about them, so we need some biological and physical containment >> Mm-hm. >> and then others are probably safe, and they can go ahead with sort of standard laboratory procedures. But standard laboratory procedures varied a lot from one lab to another. >> Yes. >> And some people engaged in mouth pipetting, for example. >> Yes. >> And others were kind of cavalier about eating lunches on research counters. >> Mm-hm. 
>> And so I think there was a realization that if you weren't sure about the risks, you ought to use the very best laboratory safety techniques. >> Yeah. >> I think what gradually occurred to the group is that what we thought of as rules for people who were getting grants from NIH were of interest to the Congress. And also that any rules we developed for recipients of NIH grants did not apply to private companies that didn't get any NIH funding. So what were our guidelines supposed to do? Were they just rules for NIH-funded researchers? Or did we want them to apply to private industry as well? And if so, what business did NIH have saying what Eli Lilly could do in Indianapolis? And that's, I think, where congressional interest really started to kick in. >> Mm-hm. >> That a funding agency should not be trying to regulate what could be a public health hazard. At the very least, it ought to be moved upstairs to the Department of Health, Education, and Welfare, which did have a public health mandate. But a funding agency should not be trying to set rules for research for anyone other than NIH grant recipients. >> Mm-hm. So at Asilomar, I mean, one of the roles of the lawyers, right, was to say, if you don't take care of this, Congress will. [LAUGH] But then Congress got involved anyway, ultimately. >> Right, yes. Well, I would say that '76 was the key year in terms of developing the guidelines and reviewing them publicly, publishing them. There was only one hearing, as far as I recall, just for one day in '76, from Senator Kennedy. But there was a delayed reaction, and by '77 there were at least 11 bills to try to regulate recombinant DNA research in one way or another. And I think at that point one of the biggest issues was, how do you apply the guidelines that NIH has worked out through the RAC to other spheres of the nation or the economy, including private industry? And that became, then, a big issue that had to be worked out between NIH and the Congress. 
At a certain point, I think Donald Fredrickson became determined to keep control of the recombinant DNA research field and its guidelines within NIH. He definitely did not want Senator Kennedy's proposal for a Nuclear Regulatory Commission-type body for this field of research, even if that commission was a part of the Department of Health, Education, and Welfare. It was going to have presidential appointments to the regulatory commission for recombinant DNA research. He did not want that. The research community did not want that. >> Yeah. >> But even in the area of how to get the guidelines to apply to private industry, what evolved was a proposal to invite companies to adopt the guidelines formally, on the basis that this was a good standard of practice. And it was in their interest, both in terms of safety and in terms of public relations, to adhere to the guidelines and to formally commit to following them. >> Mm-hm. And did that resolve the tension between NIH and Congress, then? >> After a while, it did. And what was happening in '77 in particular, while Congress was proposing bill after bill to regulate the field, was that more studies were being done that seemed to show that the risks from recombinant DNA research were not as great as initially feared. So you had two things going on simultaneously. You had Congress taking more of an interest, but the researchers pushing back and saying, look at this most recent evidence, members of Congress; this is not so frightening. You had the interesting phenomenon: in '73, people at the Gordon Conference were concerned about this fear, [INAUDIBLE] Maxine Singer and Dieter Söll sent a letter to Science expressing their concern. In '77, at the same Gordon Conference on nucleic acids, a majority of the researchers attending the meeting sent a letter to Congress saying, you're in danger of overregulating this field. 
And if you overregulate it, you'll stifle the field, and maybe other countries will move ahead while we're held back. >> This has become the standard argument. >> Yes. >> [LAUGH] >> That's right, that's true. There's always somewhere else, somewhere in the world, that will do something if we regulate this field too tightly. By '78, there was very little interest in any legislation on the topic. And in '79, I think there might've been one bill. But '77 was the pivotal year in terms of public policy. And I would say that Donald Fredrickson and NIH won. But in the process, they basically turned NIH into a regulatory agency for this one sphere of research. >> Right. >> And it was a strange new role for NIH, except that NIH had been really pushed into becoming a regulatory agency on research involving human subjects and research involving animal subjects. >> Yeah. >> And again, in that area they were not keen to be the regulators, but they would prefer to be the regulators rather than have someone else do it. And there was a long struggle there. And finally, the part involving research with human subjects was moved from NIH to the level of the Department of Health and Human Services. And I think, in some ways, that's the more appropriate level of oversight. It's kind of a conflict of interest to be funding and regulating at the same time. >> Although that's what they do with recombinant DNA research as well. >> That's right. That's what they were doing. And so the question is, is the conflict of interest so severe that the people funding the research aren't seeing real dangers? And I think, as it turned out, there weren't real dangers from the research. So it worked out. But I think in the process, and for that period of time, NIH did become a kind of regulatory agency for recombinant DNA research. >> But they play that role for stem cell research now. For embryonic stem cell research, right? They developed the guidelines for embryonic stem cell research. 
In the same, so, I mean, they don't review proposals the way the RAC did. >> Right. >> They review funding proposals, but they're the ones who developed the policy for what you can do with federal dollars with regard to embryonic stem cell research. >> Right, that's true, that's very true. And I think NIH has continued to have jurisdiction over animal welfare and animal research. So there, again, there's a mixture of roles. I do know from discussions with my friend Charles McCarthy that there were times, when he was head of the office that protected both human research subjects and animal research subjects, that he told the director of NIH and the Office of the Director that he was going to have to investigate a researcher or a group. And the people who were funding the science said, I hope you won't go after that person or that group, because they've got lots of grants from us. And he would say, I'm sorry, but they're violating the rules for protecting research subjects, and I'm going to have to do it. So you could see the conflict occurring at times. I don't think the same kind of conflict occurred with respect to recombinant DNA research. Very few people were interested in going beyond the guidelines for recombinant DNA research, if only because of liability concerns; I mean, if anything catastrophic had happened, they would have really been in deep trouble. On the other hand, there was a kind of conflict of interest in terms of either scientists in general or RAC members in general lobbying Congress to adopt a certain type of public policy. Because if you were receiving NIH grants, you certainly didn't want to go against the direction that the NIH director wanted to go. And you probably didn't want to be critical of a field of research if you were hoping to get additional funding >> Right. >> for that area of research. So there was a potential for conflict of interest. 
For someone like myself, trained in ethics, the question was, are you being co-opted by the research community? Are you maintaining your critical distance, are you trying to be a person of integrity, or are you just going along because you too have a research grant from the National Library of Medicine? >> [LAUGH] Right.