Spring 2016 Bulletin

Consensus & Controversy in Science

On February 1, 2016, Randy W. Schekman (University Professor; Professor of Cell and Developmental Biology, University of California, Berkeley; Investigator, Howard Hughes Medical Institute) moderated a panel discussion on consensus and controversy in science with Jennifer Doudna (Li Ka Shing Chancellor’s Chair in Biomedical and Health Sciences; Professor, Departments of Molecular & Cell Biology and Chemistry, University of California, Berkeley; Investigator, Howard Hughes Medical Institute), Richard A. Muller (Professor of Physics, University of California, Berkeley; Faculty Senior Scientist, Lawrence Berkeley National Laboratory), and Pamela Ronald (Professor in the Department of Plant Pathology and the Genome Center, University of California, Davis; Director, Grass Genetics, The Joint Bioenergy Institute). The program, the Morton L. Mandel Public Lecture, served as the Academy’s 2031st Stated Meeting and included a welcome from Nicholas B. Dirks (Chancellor, University of California, Berkeley) and Jonathan F. Fanton (President, American Academy of Arts and Sciences). The following is an edited transcript of the presentations.


Randy W. Schekman

Randy W. Schekman is University Professor and Professor of Cell and Developmental Biology in the Department of Molecular and Cell Biology at the University of California, Berkeley. He has been a Howard Hughes Medical Institute Investigator since 1991. He was elected to the American Academy of Arts and Sciences in 2000.

I would like to start our program this evening with a discussion of the public’s acceptance of science, and in particular, the challenge of the reproducibility of the work that scientists publish. There have been quite dramatic examples, over recent years, of papers that presented manipulated data or that involved outright fraud and misrepresentation, in crucial areas of research such as cancer biology and psychology. Individuals feel significant pressure to publish their most important work in a very select few venues that cater to exclusivity. That exclusivity is sustained by the small number of papers these journals publish and by the journal impact factor, a number that every postdoctoral fellow knows to three significant figures.
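
The impact factor itself is a simple ratio. In rough form, the standard two-year version for a journal in year $Y$ counts the citations received that year by everything the journal published in the previous two years, divided by the number of citable items it published in those years:

$$
\mathrm{JIF}_Y \;=\; \frac{\text{citations in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{citable items published in years } Y-1 \text{ and } Y-2}.
$$

A single number of this kind says nothing about the quality or reproducibility of any individual paper that contributes to it.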

I will use Nature as my example. Nature is a very important journal that has published some outstanding discoveries. In 1953, the discovery of the structure of DNA was published in Nature. However, there are also a lot of papers that do not make it into Nature. Of the ones that do, some report results that cannot be reproduced, or that turn out to be the product of manipulation. Two years ago, we had a striking example of this: two papers were published in Nature amid great fanfare, claiming that mature mouse cells could be converted into embryonic-like stem cells simply by exposing them to low pH. These results immediately became a sensation, with coverage in The New York Times, The Wall Street Journal, and elsewhere. But within weeks, we learned that the results of this work were not reproducible; the data had been manipulated, and there was evidence of plagiarism. The first author, who had returned to Japan as a conquering hero, was soon humiliated by the experience. One of her senior co-authors was so embarrassed by the episode that he committed suicide. Some of this tragic fallout, I submit, is a result of the pressure that scholars feel to publish their work in these exclusive venues.

In another example, we can look at the situation in China. Consider award notices, published in bulletins by the Chinese Academy of Sciences, which claim (and I have had them translated by my Chinese postdoctoral students) that if you publish your work in prestigious journals such as Cell, Nature, and Science, you will receive the equivalent of U.S. $33,000 as a personal cash reward. There are some very successful Chinese scholars who have earned over half of their income from this award system; it does not matter what you have published, only that you, as a scholar, made it into the rare pages of these few select journals. This creates a severe distortion of the reward system that many of us grew up with in our scholarship. The reward for scholarship is now effectively reduced to the impact factor and the number of papers appearing in prestigious journals, the only papers that seem to count.

Inevitably, I think this has led to an increase in irreproducibility, particularly in papers covering vital topics. Several years ago, the pharmaceutical company Amgen conducted a study of fifty of the most important papers in cancer biology, papers it would have used as starting points for drug discovery in cancer chemotherapy. They found that many of these papers, published in very prominent journals, were irreproducible. In short, the published results could not be used to justify investment in the development of new drugs to attack cancer. Dr. Glenn Begley, the former head of Oncology Research at Amgen, interviewed some of the senior authors of the papers. In one report, Begley said, “I met for breakfast at a cancer conference with the lead scientist of one of the problematic studies. We went through the paper line by line, figure by figure. I explained that we redid their experiments fifty times and never got their results. [The lead scientist] said they had done it six times and got the result [published in the problematic study] only once. They decided to put it in the paper because it made the best story.”
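
To put that admission in perspective: even if an effect is entirely absent, and each attempt carries the conventional five percent chance of a false positive, the probability of seeing at least one nominally “significant” result in six independent tries is already

$$
1 - (1 - 0.05)^{6} \approx 0.26,
$$

so selecting the single success for publication, because it made the best story, is a choice about storytelling rather than evidence.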

Statistical anomalies, then, are not the dominant cause of irreproducibility. Irreproducibility is instead a measure of the character and influence of the very selective venues in which young scholars feel they must publish in order to succeed. I believe that this leads to a public mistrust of science. If the average non-scientist reads in the newspaper that the National Institutes of Health is investing $30 billion to create drugs to treat what ails us, but that these studies cannot be reproduced, we have a big problem.

Some of us are trying to tackle this issue directly. For example, I am personally involved in the Cancer Reproducibility Project. I am an editor of eLife, an online open-access journal; we have partnered with an organization called the Center for Open Science, which has secured a donation from the Arnold Foundation to commission and contract laboratories to repeat the key experiments published in fifty of these very prominent studies. Unfortunately, the Amgen study that I referred to was not published because, according to one of the principals at Amgen, the company did not see a particular benefit in publishing its negative results. It is also very difficult, of course, to get the very high-profile journals to accept and publish data and results that challenge papers they have published.

But at eLife, we felt very strongly about commissioning, reviewing, and publishing papers that attempt to reproduce key experiments in these fifty papers. This turns out to be a very expensive, very labor-intensive effort. Ultimately, in order to capture the confidence of the public in the scientific enterprise, we must be willing, as scientists, to look very carefully at ourselves. We must be willing, as well, to challenge the work of others, and to find journals that are willing to publish negative results, or challenges to established, well-known results.


Jennifer Doudna

Jennifer Doudna is Professor of Chemistry and of Molecular and Cell Biology at the University of California, Berkeley, where she holds the Li Ka Shing Chancellor’s Chair in Biomedical and Health Sciences. She has been a Howard Hughes Medical Institute Investigator since 1997. She was elected to the American Academy of Arts and Sciences in 2003.

I found myself in the field of genome editing after doing work here at the University of California, Berkeley, first in collaboration with Jill Banfield, and more recently with Emmanuelle Charpentier, who is based in Europe.

Genome editing is a technology that enables scientists to make changes to the DNA of cells, changes that are so precise that we can fix a single letter in the entire human genome that might give rise to a specific disease, like cystic fibrosis. We can use genome editing in much the way that you would use your word processing program to change a typo in a document.

This technology came about in a both remarkable and unassuming way, through a basic research project conducted to figure out how bacteria fight viral infection. My own involvement in this project started almost ten years ago when Jill Banfield contacted me about some interesting DNA sequences she was finding in data her lab was producing, from bacteria that were isolated in various interesting environments. What was discovered through her work, and then through the research of a few labs around the world at that time, was that bacteria carrying sequences called CRISPRs, repetitive arrays of short bits of DNA that include sequences from viruses, have an adaptive immune system. They have the ability to acquire DNA sequences from their invading parasites like viruses, and they can then keep a permanent record of those sequences in the genome in such a way that these bugs can defend themselves against future infections by the viruses.

At the time, Jill wondered whether those bits of DNA acquired from viruses were actually operating in this immune system at the level of a cousin of DNA called RNA. RNA molecules are able to form chemical interactions with DNA in a sequence-specific way, allowing them to find matching DNA sequences. We started investigating this, and that line of research eventually led me to collaborate with Emmanuelle Charpentier. We worked together to figure out the function of a protein that is central to this pathway; it operates as an RNA-guided, DNA-cleaving protein.

It turns out that cells, including our own human cells as well as plant and other kinds of cells, have the ability to detect DNA breaks and repair them in a precise fashion, something that has been appreciated for several decades now. These discoveries were made through a great deal of research in other labs. By using an enzyme like the protein Cas9, a remarkable little molecular machine that can be programmed with an RNA sequence matching the DNA site we would like to change in the cell, we can trigger a break in the DNA that is then repaired in a precise fashion. If we zoom into a cell, right to the nucleus, we find the DNA packaged in a structure called chromatin; the bacterial enzyme has to search through the entire DNA sequence of the cell to find a site that matches the sequence in its guiding RNA. When that happens, the enzyme latches onto the DNA, unwinds it, and then makes a very precise break in the DNA sequence. At that point, the broken DNA is handed off to the repair machinery of the cell, and this can lead to the integration of new DNA sequences.
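
To make the search step concrete, here is a minimal sketch, in Python, of the matching problem just described: scan a DNA sequence for a twenty-letter site that matches the guide RNA and sits next to the short “NGG” motif (the PAM) that the commonly used Streptococcus pyogenes Cas9 requires. The sequences are invented for illustration, and the real enzyme finds its target through three-dimensional diffusion and RNA-DNA base pairing, not by string comparison.

```python
# Toy model of RNA-guided target search: find positions in a DNA string where
# a 20-letter site matches the guide sequence and is immediately followed by
# an "NGG" motif (the PAM recognized by S. pyogenes Cas9).

GUIDE_LENGTH = 20

def find_target_sites(genome: str, guide: str) -> list[int]:
    """Return approximate cut positions (about 3 bases upstream of the PAM)."""
    genome, guide = genome.upper(), guide.upper()
    assert len(guide) == GUIDE_LENGTH, "SpCas9 guides are typically 20 nucleotides"
    hits = []
    for i in range(len(genome) - GUIDE_LENGTH - 2):
        site = genome[i : i + GUIDE_LENGTH]
        pam = genome[i + GUIDE_LENGTH : i + GUIDE_LENGTH + 3]
        if site == guide and pam[1:] == "GG":  # the "N" in NGG can be any base
            hits.append(i + GUIDE_LENGTH - 3)  # Cas9 cuts about 3 bp from the PAM
    return hits

# Invented sequences, purely for illustration
genome = "TTACGGATCCGATTACAGGCTTACGATCCATGGAGTCCA"
guide = "GATTACAGGCTTACGATCCA"
print(find_target_sites(genome, guide))  # -> [27]
```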

Once we understood the function of this bacterial enzyme, we suddenly appreciated that this could be adapted or harnessed as an effective technology for just this kind of genome engineering. This discovery, originally published in 2012 in Science, triggered research work in labs around the world, which began adopting genome editing techniques for all sorts of applications. The discovery has also led to a very important discussion about the ethics of using this kind of technology. What should we do, now that we have a simple, effective, and widely available technology for engineering genomes, including the genomes of human embryos?


Richard A. Muller

Richard A. Muller is Professor of Physics at the University of California, Berkeley, and Faculty Senior Scientist at the Lawrence Berkeley National Laboratory. He was elected to the American Academy of Arts and Sciences in 2010.

I have been working in serious scientific research for fifty years. In that time, the public’s trust in science has declined, and it is now at its lowest point. What went wrong?

I would like to give some illustrative anecdotes as a way to explain this decline in the public’s trust in science. The first anecdote concerns the Chicago Museum of Science and Industry. Seven or eight years ago, the museum conducted a survey. People outside the museum were stopped and asked, “Name a great living scientist you would like your son or daughter to emulate.” The name that came in first place was Al Gore. Tied for second place with Albert Einstein was Bill Gates. This is the public’s understanding of what it means to be a scientist; it simply means that you know how to talk about the sciences.

In 2000, I wrote a book on the scientific field of paleoclimate, and soon after, as I became more involved in global warming as a writer and a columnist, I was highly critical of Al Gore’s An Inconvenient Truth. The reason was that I did not know yet whether climate change was real or not. I did know, however, that what was being presented in An Inconvenient Truth was largely incorrect: the information was exaggerated, distorted, misleading, and, in some cases, just plain wrong.

I received so many questions about global warming that I decided I had to do some substantial reading on the subject. That led to some substantial research. My daughter Elizabeth and I started our own research organization to study global warming, which we named “Berkeley Earth.” After bringing in some other expert scientists, particularly in the discipline of data analysis, and after nearly three years of research, we reached a conclusion: global warming is real. It is roughly of the same magnitude that everybody else has said it is, and it is caused by humans. This was a solid scientific result. We found this result by addressing the issues that the skeptics were raising.

Let me provide another anecdote. In the physics department at Berkeley, there was a mix of Nobel laureates and ordinary professors, like me. Over lunch one day, one of the Nobel laureates said, “Global warming is in the news again. Everybody here agrees that global warming is solidly established, right?” Now this was before I did my research on global warming. Most of the people gathered meekly raised their hands, but I did not. He asked me, “Rich, are you not agreeing?” And I said, “Well, I’m surprised, professor, that you were convinced that the data choice bias was unimportant and that the poor siting of so many stations did not lead to a mistaken result.” He replied, “Oh, is there a problem with the selection of the data?” He was a Nobel laureate, and people listened to his opinions, but this opinion was not based on his own careful scientific analysis, as his Nobel Prize-winning work had been, but on informal articles by journalists.

I imagined that had I not objected, others would have cited the fact that this famous Nobel laureate had been convinced, even though I now knew that his conviction was not based on careful scientific study.

Afterwards, several of my younger colleagues came up to me and said, “Thank you.” There was the assumption that everyone would agree, that there would be a consensus even if you didn’t know what the scientific issues were.

Today, scientists find themselves using authority as their means of deciding what information and what conclusions to accept. A Nobel laureate might pass on a piece of information, and another person says, “This was told to me by a Nobel laureate, and therefore it must be true.” I personally will tell you that global warming is real, but I say so only after having done extensive, rigorous, difficult work.

Let me give you another example. A scientist, whose name you would likely recognize, invited Thomas Friedman to give a talk on campus. Friedman is not a scientist, and in my opinion he said many things that were absolutely not true. This prominent scientist sat, listened, made only positive comments, and then he thanked Friedman for his excellent talk. That was the end of it.

I went up to this prominent scientist afterwards and I said, “You know that most of what Friedman said is not based on science. It is either an exaggeration or a distortion; it is just not true.” The scientist answered, “Yes, but it gets our people excited. It gets them to work on the subject.” I responded, “They are going to find out eventually that these claims are not right. They will then feel they were fooled.” His response was, “Trust in me; this will work out well.”

I think we are now at the point at which the public has learned it was fooled about global warming. People watched films like An Inconvenient Truth, and they now come up to me and say, “What do you mean the polar bears aren’t dying because of receding ice? What do you mean the rates of hurricanes are actually going down?” These extreme misrepresentations are pushed by some scientists, because the public doesn’t respond as well to the less dramatic truth, such as the one that the Berkeley Earth team verified, that there has been 1.5 degrees Celsius of warming in the last 250 years. They respond to the specters of Hurricane Katrina and Hurricane Sandy, though there is no solid scientific link between global warming and these events. Ironically, in these scenarios, we are abandoning the methods and criteria of science because the problem is so important! It should be just the opposite. When the issue is critical, that is precisely when we need to practice science in its most disciplined way.

A final anecdote for you: I went to Washington, D.C., to raise money for our global warming study, because I wanted to address all of the issues raised by the skeptics and to find out whether they were right or wrong. I talked to a top scientist working in the government. I was there to give advice on the status of global warming: what is known and what is not known. I described our then-nascent Berkeley Earth project, in which we would reanalyze all the raw data and assess the data selection bias.

The top scientist listened, then said, “This is perfect; this is just what we need.” “So, how much money can I get to support this work?” I asked. His answer was, “Don’t ask for any because you know we can’t fund you.” “And why not?” “If we were to fund you, then our opponents would say that there is still some skepticism left.” “But there is skepticism left.” “Yes, but we can’t say that publicly.” This is the terrible state of the public funding of global warming research right now. The subject is treated as settled, even though top scientists in government know better, and skeptics cannot get monetary support because of political impediments.

As scientists, we are partly at fault because we have not trusted the public; we do not trust that they will reach the right conclusions. We try to teach the public that they need to be told whom to listen to, even though there is nothing more antithetical to science than an appeal to authority. We feel we have to put a spin on scientific facts, make them dramatic; in the case of global warming, we think we need to talk about drought, floods, and storms in order to get the public interested. In the end, it doesn’t work. I think the growing disbelief in global warming in the United States today is largely a result of the scare stories previously put forth, which are now seen as exaggerations.

We need to learn again how to trust the public. We have to earn again our reputations for being absolutely objective. We cannot put a spin on facts or tell the public how to interpret them. Our job is only to give information, and trust that the public is smart enough to take that information in the right way.


Pamela Ronald

Pamela Ronald is Professor in the Department of Plant Pathology and the Genome Center at the University of California, Davis; and Director of Grass Genetics at The Joint Bioenergy Institute.

I am a plant geneticist, and I study genes that make plants resistant to disease and tolerant of stress. Over the years, many people have started to worry about plant genetic engineering.

Genetic improvement has been carried out for many thousands of years. Today, we are all familiar with modern corn. It yields about one hundredfold more grain than teosinte, the ancient ancestor of corn. I work on rice, which is the staple food for more than half of the world’s people. Rice yields are reduced by diseases, pests, and environmental stresses. For example, although rice grows well in standing water, most varieties will die if they are completely submerged for more than three days. This is a big problem for farmers and their families in South and Southeast Asia, where there are 70 million farmers who live on less than $2 a day. The Intergovernmental Panel on Climate Change predicts not only that drought and heat will shift around the globe, but also that flooding will become more problematic in places like Bangladesh.

I was fortunate to have a colleague, David Mackill at the University of California, Davis, who had been working with his graduate student Kenong Xu (now a professor at Cornell University) to study an ancient variety of rice with an amazing property: it could withstand two weeks of complete submergence under water. For about fifty years, breeders had been trying to introduce this trait using conventional breeding practices. Because they were dragging in lots of different genes at the same time, they were not able to deliver a variety that farmers would adopt. David asked if I would help them isolate the gene, because we had some experience isolating genes from rice. I hoped that if we were successful, we could help improve the lives of millions of farmers, who would be able to produce rice even when their fields were flooded.

Through ten years of research, Kenong Xu and the team were able to isolate the gene encoding this important trait. We showed that we could engineer rice with a gene named Submergence Tolerance 1A, or Sub1A, and that the resulting plants were tolerant of two weeks of flooding. This was really exciting, but it was only a laboratory experiment; our main goal was to help farmers.

So, breeders at the International Rice Research Institute carried out a field trial, using a different genetic technique called marker-assisted breeding. The breeders made a time-lapse video that shows the results of one of their field trials, which took place in the Philippines. Both the conventional variety of rice and the variety with the Sub1 gene grew well at first. However, when the field was flooded for two weeks, only the variety carrying the Sub1 gene thrived. In fact, through these controlled field experiments, the breeders were able to harvest threefold more grain from their field. Over the last few years, the Bill and Melinda Gates Foundation has helped to distribute this seed to farmers. Last year, three and a half million farmers grew this new rice variety, and they saw a threefold to fivefold increase in yield when their fields were flooded.
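
The logic of marker-assisted breeding is simple enough to sketch in a few lines of code. The cartoon below, written with invented genotype data rather than the breeders’ actual pipeline, shows the core selection step: instead of flooding a field and waiting to see which plants survive, breeders genotype seedlings at a DNA marker tightly linked to the Sub1 locus and keep the carriers for the next cross.

```python
# Cartoon of marker-assisted selection with made-up data: keep seedlings that
# carry the donor allele at a marker tightly linked to the Sub1 locus, without
# having to flood a field and score survival.

# Hypothetical progeny from a cross: genotype at the Sub1-linked marker,
# "+" = allele from the submergence-tolerant donor,
# "-" = allele from the high-yielding recurrent parent.
progeny = {
    "plant_01": ("+", "+"),
    "plant_02": ("+", "-"),
    "plant_03": ("-", "-"),
    "plant_04": ("+", "-"),
    "plant_05": ("-", "-"),
}

def carries_sub1_marker(genotype: tuple[str, str]) -> bool:
    """Keep a plant if it carries at least one donor allele at the marker."""
    return "+" in genotype

selected = [name for name, genotype in progeny.items() if carries_sub1_marker(genotype)]
print(selected)  # -> ['plant_01', 'plant_02', 'plant_04']
```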

When it comes to inserting genes from bacteria and viruses into plants, people tend to get queasy. I often hear, “Yuck! Why would scientists do that?” The reason is that sometimes this approach is the safest, cheapest, and most effective technology to advance sustainable agriculture and enhance food security.

Consider the case of a common caterpillar pest in Bangladesh; this pest can destroy an entire eggplant crop if it is not controlled. For this reason, farmers spray very powerful insecticides several times a week. When infestations are bad, they spray the crops twice a day. We know that some insecticides can be very harmful to human health, especially when farmers and their families fail to use proper protection. The World Health Organization estimates that three hundred thousand people die every year because of misuse of insecticides.

To reduce chemical sprays on eggplant fields, a team of Cornell scientists working with Bangladeshi scientists decided to try a genetic approach that builds on an organic farming technique. Organic farmers often spray an insecticide called Bt, derived from the bacterium Bacillus thuringiensis. Bt is highly specific to caterpillar pests, but it is nontoxic to humans, fish, and birds; in fact, it is less toxic than table salt. This strategy, however, doesn’t work well for eggplant farmers in Bangladesh, because Bt is difficult to find, expensive, and doesn’t prevent the insect from getting inside the plant. With genetic engineering, scientists are able to cut the gene from the bacterium and insert it directly into the eggplant. After two years of field trials, researchers asked farmers about the results. The farmers reported that they had been able to reduce their insecticide use, often down to zero. Further, they can save their seeds and replant each year.

People ask me how we can know, with certainty, that it is safe to eat new genes in our food. Genetic engineering has been used for over forty years, in medicine and in the production of cheese, without much controversy. In all that time, there hasn’t been a single instance of harm to human health or the environment. And after twenty years of study and rigorous peer review by thousands of scientists, every major scientific organization in the world has concluded that the crops currently on the market are safe to eat, and that the process of genetic engineering itself is no more risky than conventional methods of genetic improvement. These are the same organizations that most of us trust when it comes to other scientific issues, like the safety of vaccines or the changing climate.

Each crop has to be looked at on a case-by-case basis. The FDA doesn’t use the term “GMO,” because it is so difficult to define. In my experience, the term “GMO” means something different to each person.

Instead of worrying about the genes in our food, I believe we need to focus on the three pillars of sustainability: the social, economic, and environmental impacts. We must be sure that the poor have access to plentiful and nutritious food. We must ask if farmers in rural communities can thrive. We must make sure that everyone can afford the food. And we must minimize environmental degradation. We have a huge set of challenges in front of us. As a scientist, I believe that we should celebrate scientific progress and use the most appropriate safe technology to advance the goals of sustainable agriculture. It is our responsibility to use our discoveries to alleviate human suffering and safeguard the environment.

© 2016 by Randy W. Schekman, Jennifer Doudna, Richard A. Muller, and Pamela Ronald, respectively

To view or listen to the presentations, visit https://www.amacad.org/consensusandcontroversy.
