As the Academy continues to look at issues related to public perceptions of risk, uncertainty, and scientific research through its Public Face of Science initiative, it partnered with the University of Chicago to organize a public symposium on “Communicating Scientific Facts in an Age of Uncertainty.” The public symposium, held in Chicago on February 20, 2017, featured presentations by Olufunmilayo I. Olopade (Walter L. Palmer Distinguished Service Professor of Medicine and Human Genetics at the University of Chicago) and Arthur Lupia (Hal R. Varian Collegiate Professor of Political Science at the University of Michigan) and a discussion moderated by Robert Rosner (William E. Wrather Distinguished Service Professor in the Department of Astronomy & Astrophysics and Physics at the University of Chicago). The event, which included remarks by Robert J. Zimmer (President of the University of Chicago), Eric D. Isaacs (Robert A. Millikan Distinguished Service Professor and Executive Vice President for Research, Innovation and National Laboratories at the University of Chicago), and Jonathan F. Fanton (President of the American Academy of Arts and Sciences), served as the Academy’s 2051st Stated Meeting. The following is an edited transcript of the presentations and discussion.
Robert J. Zimmer
Robert J. Zimmer is President of the University of Chicago. He was elected a Fellow of the American Academy in 2007.
It’s a great pleasure to welcome you to the University of Chicago, and to welcome the American Academy of Arts and Sciences here for this event about communicating scientific facts in an age of uncertainty. This topic is of perennial interest, as it is always important to communicate about science and we are forever in an age of uncertainty. The current nature of the national discourse, however, gives the subject increased salience.
Among the several challenges in communicating about science that I am sure will be discussed during this program, there is an underlying, perhaps even foundational, conundrum we need to confront: how do we communicate about uncertainty? Each of us understands that various forms of uncertainty are intrinsic to the entire scientific process. But communicating with a public audience about uncertainty in science seems to be particularly difficult.
Let me illustrate by considering one of the most natural and familiar of scientific questions, one of intrinsic public interest: does what you eat affect how long you can expect to live, your health, and how good you feel, and if so, how? Now, we all know the answer to the first part of that question is yes. But consider the ways in which we describe work connected to the second part of the question. This is presented as a scientific finding, reached after a long study involving many people over many years, arriving at a definitive conclusion about what you should be eating. Usually, the only uncertainty that is implied is an acknowledgement that the conclusions are statistical – you are x percent more likely to live n years longer if you focus your diet according to some particular guidelines. Sometimes, the additional uncertainty of confidence intervals is mentioned. However, some years later there is a new scientific finding that asserts the previous finding was not quite right, perhaps even all wrong. Scientists automatically recognize that it could be that the original work was flawed, but alternatively it may simply be that one now has newer techniques and technological capacity, can ask and address different questions, and has other related discoveries at hand that were not previously available, all of which may have contributed to reaching a different conclusion. But this understanding of intrinsic uncertainty in what we call scientific findings is something that we do not often communicate to a public audience.
Nor is it easy to do so effectively. In a world in which people want simple answers, in which they are most comfortable with yes or no and black or white, and are surrounded by a discourse at all levels from the time they are young that gives statements in this form, including by scientists themselves, how can we actually communicate this intrinsic uncertainty of science? Without it, one falls prey to the difficulty that when the next diet report comes out nobody actually believes it unless they want to, because it contradicts the last one, and everybody knows that another one’s coming out in the future. Now this may be an extreme example, but this phenomenon, sometimes in milder form, actually appears in many scientific issues of public interest. If you fail to acknowledge uncertainty, you risk undermining confidence in the whole enterprise as results are regularly “overthrown.” If you do recognize uncertainty, how do you indicate that it is still a scientific finding, rather than just another guess? How to communicate effectively to a general audience about this type of uncertainty is to me an unresolved conundrum.
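The conundrum above can be made concrete with a toy simulation (all numbers are hypothetical, chosen purely for illustration): two studies of the same underlying dietary effect can publish headline point estimates that look contradictory, even though their confidence intervals remain mutually consistent.

```python
import random
import statistics

random.seed(1)

# Hypothetical parameters, for illustration only.
TRUE_EFFECT = 1.0   # assumed "true" benefit of the diet, in years of life
NOISE_SD = 4.0      # person-to-person variability
N = 200             # participants per study

def run_study():
    """Simulate one study; return (mean, CI lower, CI upper) for a 95% CI."""
    sample = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N)]
    mean = statistics.fmean(sample)
    sem = statistics.stdev(sample) / N ** 0.5
    return mean, mean - 1.96 * sem, mean + 1.96 * sem

study_a = run_study()
study_b = run_study()
print("Study A: %.2f years (95%% CI %.2f to %.2f)" % study_a)
print("Study B: %.2f years (95%% CI %.2f to %.2f)" % study_b)
# Two headlines can disagree ("diet adds years!" vs. "benefit unclear")
# while both studies are consistent with the same underlying effect.
```

The "overthrown" result is often just a second draw from the same noisy process, which is exactly the kind of uncertainty that rarely survives translation into a headline.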
I offer these thoughts as an introduction to our topic, which I believe is important, fascinating, and complex.
Eric D. Isaacs
Eric D. Isaacs is the Robert A. Millikan Distinguished Service Professor in the Department of Physics and the James Franck Institute at the University of Chicago. He is the University’s Executive Vice President for Research, Innovation and National Laboratories.
I am a physicist, a former director of Argonne National Laboratory, and the Executive Vice President for Research, Innovation and National Laboratories at the University of Chicago, and I don’t believe in facts. You see, I’m a scientist, and as scientists, our goal is to engage with what is real, regardless of our belief systems. We look at evidence, we develop hypotheses that seem to explain the evidence, and then test those hypotheses. If belief is involved in this at all, it is my belief that science and the scientific method are the most reliable means that we have to understand our world. So we need to talk about science, about public policy, and about the seemingly increasing disconnect between the two, which we really need to understand. It’s not hard to understand why some people are frightened or anxious: the ease of access to information makes it easy to feel as though the world is entirely out of control. And people want reassurance; they want to hear that they can protect themselves and their families in a world that seems increasingly complex, chaotic, and dangerous. And they seem to be willing to listen to those who can offer them that reassurance and a few simple solutions. But that’s not a message that we as scientists are really able to give the public. The best we can do is what we do really well: cite our research, give evidence, and talk about probability, or the uncertainty in the results that we share with the public. As scientists, we think that should be enough.
Many of us fly many thousands of miles a year with the knowledge that we face, on average, a one-in-eleven-million lifetime chance of dying in a plane crash. And then, most of us spend a good chunk of that time in flight trying to figure out how much our frequent flying is actually affecting our individual risk. But the individual sitting next to you, the one who breaks into a visible sweat and grasps your hand when the plane takes off? He or she is unlikely to be comforted by safety statistics.
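As an aside, the arithmetic behind such risk comparisons is trivial; the difficulty is emotional, not computational. A sketch using the one-in-eleven-million figure from the talk and a rough, illustrative ballpark for lifetime driving risk (the driving figure is an assumption for demonstration, not a cited statistic):

```python
# The flying figure is the one cited above; the driving figure is a rough
# order-of-magnitude assumption, used only to illustrate the comparison.
lifetime_odds_flying = 1 / 11_000_000   # cited in the talk
lifetime_odds_driving = 1 / 100         # hypothetical ballpark

ratio = lifetime_odds_driving / lifetime_odds_flying
print(f"Under these assumptions, driving carries roughly {ratio:,.0f} times "
      f"the lifetime risk of flying.")
```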
Scientists embrace uncertainty and complexity: we use the scientific method to explore and explain fundamental scientific concepts. But this can be at odds with the way that many make sense of complexity using prior beliefs, personal biases, and powerful narratives that have been repeatedly told to them. And these shortcuts can really disconnect the policies we’re seeing today from scientific evidence. So here is the essential question, the one that may decide the future of our country and our planet: how do we talk about science in a world that is shaped by belief and yearning for certainty? This is not a partisan issue; there are people on all points of the political spectrum who question the impact of climate change on the world and who also, at the same time, question the safety of childhood vaccinations. So the answer to this question cannot be found in partisan politics. It is also not an issue of intelligence: there are plenty of smart, well-educated people who have accepted ideas and policies that are not supported by scientific evidence. So the answer, as hard as it may be for us to accept, cannot be found exclusively in education and outreach. It is really hard to beat emotion solely by reciting evidence; you can’t soothe the anxious air passenger next to you by telling him or her not to worry, that he or she is more likely to die in a car crash on the way home from the airport than in the airplane itself. So we need to change the conversation. We need to understand the worldview of people who are rejecting science-based public policy. What are their concerns, what are their fears, and what can we offer them that will meet their needs? We need to find strong emotional arguments. It’s hard for scientists, but it is really powerful, and something that I don’t think we do enough of. I also think we need to find ways to talk about uncertainty without creating anxiety.
We also have to understand that the words we use aren’t heard by the audiences we need to persuade in the same way we understand them. We’re speaking in a language that often isn’t what our audiences need to hear. So can we do this? I really believe we can.
Our panelists today will explore these topics further, sharing more about their scientific work, the controversies they face in working with patients, the public, and those who are shaping our policies, and the strategies and arguments that they have found persuasive.
Robert Rosner
Robert Rosner is the William E. Wrather Distinguished Service Professor in the Departments of Astronomy & Astrophysics and Physics at the University of Chicago, as well as in the Enrico Fermi Institute, the Computation Institute, and the Harris School of Public Policy Studies. Elected a Fellow of the American Academy in 2001, he is Codirector of the Academy’s Global Nuclear Future Initiative.
Nothing illustrates the challenges of communicating science and scientific fact to the public better than the many public controversies over science-related topics. And while some of the controversies mentioned earlier, like climate change and vaccines, are contemporary, these kinds of controversies have a long history. The Greeks debated whether the Earth was round. During the Renaissance, in the debate around whether the Earth orbits the sun, Galileo paid a great personal price for his support of Copernicus’s heliocentric model of the solar system: he was brought before the Inquisition and put under house arrest for the remainder of his life. There are many other examples. Is evolution real? Is fluoridation safe? Is homeopathic medicine real or a fraud? What are the health effects of power lines, or in general, of electromagnetic fields? What is the role of vitamin C in cancer? Are GMOs safe? All of these are examples of controversies that have their roots in science, but have entered the public realm. But science, as we all know, is replete with controversy. There are many more controversies that do not see the light of day in the public realm; only a tiny fraction of scientific controversies become public. Why some and why not others?
Now, some cases you could argue are inherently unavoidable for reasons that are basically extrinsic to the science itself. For example, in some cases, economic interests intrude – the obvious example is smoking. Think of the efforts of the tobacco companies in funding research that tried to deny that smoking was a health issue. This is also true in the case of climate change: some fossil fuel companies, in particular the oil companies, have sponsored similar kinds of research to protect their financial interests. Then there is the theological side of things. Galileo, who I mentioned earlier, had a formidable opponent in the Catholic Church. The same happened with evolution. In that case, it was predominantly Protestants in the South who supported the Scopes trial and the conviction of John Thomas Scopes, the teacher who dared to discuss Darwinian evolution. But what about the rest? What about the scientific controversies in the public realm that are not driven by concerns like theology or profit? And is it the case that, somehow, we are not figuring out how to communicate our concerns, as Bob Zimmer said, and the uncertainties in our results, in a way that the public can accept and understand? The other side of the coin is, in fact, the social science issue: to what extent do people, when confronted with facts that seem to be opposed to their deep-seated beliefs, actually change their mind? Do they change their mind?
Olufunmilayo I. Olopade
Olufunmilayo I. Olopade is the Walter L. Palmer Distinguished Service Professor of Medicine and Human Genetics, Dean for Global Health, and Director of the Center for Innovation in Global Health at the University of Chicago. She was elected a Fellow of the American Academy in 2010.
I’m going to speak as a scientist and also as a medical doctor. I happen to work in genetics, one of those areas that’s actually quite controversial. Controversial in the sense that we know that genetics is not deterministic, and yet, any time you hear geneticists talk about the importance of their work, it’s about how they cloned a gene, it does XYZ, and, as a result, if you have an alteration in this particular gene, you may get cancer. And so, when I started my career at the University of Chicago, I really wanted to understand genetics a little bit deeper, because my mentor had spent her career arguing with her colleagues about the importance of genetics in cancer etiology. At that time, everyone thought chromosome alterations in cancer were epiphenomena. Everyone was convinced that cancer was caused by what we had in our food or the environment, and so it possibly had nothing to do with genetics.
Pierre Paul Broca, the famous French scientist, observed generations of women in families who developed breast cancer, and in 1866 he wrote about it. But the process took generations: it took modern-day geneticists nearly twenty-five years of finding families like Broca’s to finally map and identify the long arm of chromosome 17 as the locus of the BRCA-1 gene. At the time, Mary-Claire King and Francis Collins really pushed to map disease genes that explain the genetic bases of diseases. Then, scientists went to Congress to ask for $5 billion to execute the Human Genome Project. The proposal was basically: Why map genes one at a time? Why don’t we just find all of them? The assumption was that each of us has maybe five or more genes that are altered and could potentially cause disease, and if we could identify those genetic mutations, then we would certainly get to predictive and precision medicine.
One important aspect that came out of the Genome Project was remembering that genetics had, in fact, been used for bad purposes in the past. Eugenics, of course, arose from physicians who actually believed that you needed to get rid of people who were not “fit.” So, in the proposal to Congress, 5 percent of the Human Genome Project budget was devoted to studying the ethical, social, and legal implications of the research we were about to do. I remember participating in interdisciplinary groups that came together to think about these issues. As doctors, we had initially just said why not? Let’s find the harmful genetic mutations and figure out a way to cure genetic disorders. Then, we met with religious leaders and social scientists who asked why we assumed that’s what people would want. That was really humbling early in my career, when I had thought that if we just found this gene, we could actually prevent people from getting breast cancer. And it turned out it was the wrong answer. Not only are there many reasons why people want to live with whatever variations that they have, it’s also not realistic to expect that we’re going to get everyone to a perfect state of health. And it turned out that, as we did more studies, our knowledge about the Human Genome was incomplete: we couldn’t have just one Human Genome Project, because each of us has our own personal genome. It cost us $5 billion to clone and put together the first map of the genome, but the more we’ve studied, the more we realize that there’s still so much dark matter in our genome that we don’t yet understand.
So it’s been twenty years since we identified BRCA-1 and we’ve successfully translated the science into the clinic. We can now test for a large number of cancer susceptibility genes, but we don’t know nearly enough to guide every patient through the process. Even though we thought we could use genetic testing to predict who was going to have cancer and who was not, it’s still probabilistic, not definitive. How do you communicate cancer risk when we don’t even fully understand how these genes function in a complex human being? As scientists we need to learn how to help patients when we can, but also to recognize our limitations when we don’t have all the information.
How do we appropriately communicate risk? I have come across many situations in families segregating breast and ovarian cancers in which individuals choose not to know, or do not accept our scientific explanation. Based on personal beliefs, some families will reject an inherited genetic mutation as the cause, convinced that all the cancers in the family were related to environmental exposure. There are also instances where families are afraid to come forward, because they are concerned about stigma, insurance discrimination, or other social aspects that scientists don’t always consider. Thus far, we have been unable to deliver on the promise of improved population health from our investments in the Human Genome Project. This is just one example of how we can promise too much; unless we deliver in full, the public is not going to be willing to accept that these discoveries, limitations and all, can actually improve human health. Scientists need to do a better job of not only making the discoveries, but actually talking to people about how the results of their research would impact them. As a physician who always tries to talk to families, I have come to understand that there are individuals who simply don’t want to know, and that’s their right. And there are others who, even if you tell them that they don’t need to know, still want to know. And that’s really the beauty of human beings, right? We’re unpredictable.
My second example involves the Human Papillomavirus (HPV) vaccine. We can eradicate cervical cancer as a leading cause of cancer deaths in women globally by vaccinating young girls before the age of eleven. The former governor of Texas had planned to get every student in Texas vaccinated. And then the pushback began with people thinking that vaccinating children against HPV was akin to giving them permission to be promiscuous. And that was the end of it. So now the United States is one of the countries with the lowest rates of HPV vaccination, even though we have scientific evidence that shows you can essentially eliminate cervical cancer through herd immunity by vaccinating the majority of young girls in a community. And what’s one of the fastest growing cancers in men in this country? Head and neck cancers related to HPV. Of course, in order to have herd immunity, we need a certain proportion of the population to be vaccinated. But a lot of families aren’t adopting HPV vaccination because of their religious beliefs. So our vaccination rate is about 37 percent, whereas in other countries it’s significantly higher, such as in Australia, where it is up to about 90 percent. Unfortunately, this vaccine that can essentially eliminate cervical cancer can only save lives if it is widely accepted and adopted.
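The herd-immunity point above can be made quantitative. A standard epidemiological rule of thumb holds that if an infection’s basic reproduction number is R0, transmission dies out once a fraction 1 − 1/R0 of the population is immune. The sketch below applies that formula to a few illustrative R0 values; the numbers are assumptions for demonstration, not measured HPV parameters.

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune so that each
    infection causes, on average, fewer than one new infection."""
    if r0 <= 1:
        return 0.0  # the infection already dies out on its own
    return 1.0 - 1.0 / r0

# Illustrative values only (not measured HPV parameters):
for r0 in (1.5, 2.0, 4.0):
    pct = 100 * herd_immunity_threshold(r0)
    print(f"R0 = {r0}: need about {pct:.0f}% of the population immune")
```

Under this simple model, a 37 percent coverage rate falls below the threshold for any R0 above about 1.6, which is why coverage on the order of Australia’s roughly 90 percent matters so much.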
We have such a well-educated population, and we’re making such sophisticated scientific discoveries in this country, and yet so many of us now doubt that well-proven and long-standing techniques, like vaccination, actually work. Or rather, some believe that they actually cause more harm than good. And I think we, as physicians, have wasted some opportunities to educate the general public. Because in the early part of the twentieth century, children were becoming paralyzed from polio and dying of measles or chicken pox, and the average lifespan was less than sixty years. But now that people are living longer, we’ve somehow forgotten what it used to be like. So it’s up to us also to convey the importance of our work, including these enormous past successes.
Arthur Lupia
Arthur Lupia is the Hal R. Varian Collegiate Professor of Political Science at the University of Michigan and Chair of the National Academy of Sciences Roundtable on the Communication and Use of Behavioral and Social Sciences. He was elected a Fellow of the American Academy in 2007.
I work on how people make decisions when they don’t know very much, which, conveniently, is pretty much always. I also work on how to convey complex ideas to diverse audiences. I apply this research in my work with many different science and public service organizations on developing, implementing, and evaluating communication strategies. As a result of what I have learned in that work, I’m now bilingual: I am fluent in “Democrat” and “Republican.”
In the context of science communication, I think we all agree that science’s insights continue to transform our lives: they improve quality of life for people around the globe by helping individuals and governments make decisions that improve health and reduce disease. Science’s insights grow the economy by increasing the effectiveness of factories, offices, and farms.
Given the range of science’s influence in the world today, you would think that science’s future as a generator of social value would be very bright. But it doesn’t feel like that, does it? A lot of people feel that science is under attack. People are nervous and they’re scared.
Many people believe that our continuing capacity to understand our world and make life better for present and future generations depends on science. They want to know what they can do to help science continue to provide great value to society. For these people, I want to point to two challenges in attempting to communicate scientific facts in an age of uncertainty. One of these challenges is called motivated reasoning. And the other has to do with increased competition for attention and influence. Let’s talk about motivated reasoning first.
Motivated reasoning is a way of processing information. But I didn’t initially believe in this concept. I have some background in mathematics, and when I was in graduate school, I wanted to believe that people are efficient information processors: that they would take information, evaluate it by its accuracy, and then move forward accordingly. Unfortunately, almost all of the evidence suggests that people don’t process information in this way.
Motivated reasoning, a term from psychology, describes a different way people process information. It leads some people to love science and leads other people to work very hard to deny science’s relevance or value. So how does motivated reasoning work?
Motivated reasoning is the practice of paying attention to and seeking to inflate the importance of information that’s consistent with a point of view you already hold. Motivated reasoning also entails ignoring or minimizing the importance of information that challenges your existing views. A less technical term for this phenomenon is hearing only what you want to hear.
Motivated reasoning affects a lot of people, and this next point may be controversial, but that group includes you. Suppose you are a liberal who identifies with Democrats and regularly votes for them. And suppose this evening I were to ask you to welcome our next speakers, Ann Coulter and Ted Cruz (who, by the way, are both highly educated). Would you be willing to listen to them tonight with an open mind? And for conservatives, would you listen to Rachel Maddow and Elizabeth Warren with an open mind? Or, in either case, would you only think about how “they just don’t get it?”
This is a problem for all of us. I work with a lot of groups, and in one climate communication organization, our goal, and I apologize if I offend anyone with what I am about to say, is to keep the right from lying and the left from exaggerating – and try to be accurate in terms of what science really knows. In each case, we are trying to counter effects of motivated reasoning in how people process information about climate.
Now let’s turn to the topic of competition. More people than ever are using the Internet to distribute information of all kinds, leading to hyperintense competition for attention and influence. This avalanche of content has changed people’s expectations about the kind of information they can get, the kinds of information they should be able to get for free, and most important, the kinds of information they can trust. When we put scientific information on the Internet, it’s important to know that not only are we dealing with people who are motivated reasoners, but we’re also competing with all of the other content on the Internet – I’m talking about cat videos and Pokémon, which many people find highly entertaining. For science information to educate a particular group of people, that group has to decide to access our content and then stay with it long enough to learn from it. With that competition in mind, the question becomes, why would they do that?
In my most recent book, Uninformed, published by Oxford University Press last year, I point to the importance of providing information that is not just factually accurate, but conveyed in ways that people will want to hear. This means rigorously pursuing the intersection between the content of your science and the problems that are most important to your target audience.
If I could tap into someone’s core concerns, that is, the things they worry about when they go to sleep at night or when they look at their children, and find a link between my science and those core concerns, I would have a better chance of speaking to a receptive audience. But if I speak in abstractions that they can’t access, the likelihood of me losing the battle for attention to cat videos or Pokémon is great. And this isn’t because people have bad character.
One of the reasons we can’t help using motivated reasoning is that our attentive capacity is so profoundly limited. But our brain has great ways of tricking us into thinking we perceive more than we know. We pay attention to very little of what we’re exposed to, we remember very little of what we pay attention to, and we use very little of what we remember. So motivated reasoning is a way for us to try and make sense of the world. And it’s only in rare moments that we counteract that instinct. Most of the time we’re looking for information that will make us feel good.
With these challenges in mind, some of us want to communicate scientific information in an age of uncertainty to improve people’s lives. And that means improving decisions. When you walk into an area like policy, the competition for science isn’t just cat videos, it’s other ways of knowing. In policy and in other decision contexts, there are four ways of defending a claim to know something. These four ways are collectively exhaustive, but not mutually exclusive.
One way of knowing is appeals to metaphysics. That is, we say that there are phenomena beyond our ability to perceive and that these phenomena affect what is true and what is false, what is good and what is bad. There are some people in our society who claim to have special access to these phenomena and there are others who rely on them for guidance about what is true and what is good. This is a very powerful way of knowing throughout the world.
A second way of knowing is appeals to personal experience, or testimony. That is, a person tells you about what they saw or how they felt at a particular moment. They describe their feelings and testify to the validity of what they felt. These personal narratives are a common way of trying to help other people understand some things about the world.
A third way of knowing is what I’ll call the space between God and man. We could call it culture. This is history, art, and other pursuits that take elements of the past, represent them in a digestible way for audiences in the present or future, and say that because this happened in the past and we interpret that past in a certain way, we can now know something important about the present or future.
I would like to describe the fourth way of knowing through an analogy. Suppose I have a device I can hold in my hand, and at one end there’s a big red button, at the other end there’s a green light, but it’s not lit. When I press the button, the light turns green. And someone might ask, well what happens when I press the button? If I have conducted my science in accordance with best practice, I can answer that it turns green, and that it does so regardless of who presses the button. In other words, when science is done in accordance with best practices, when we’re rigorous about how we choose cases, how we categorize what we observe, how we analyze what we categorize, and how we interpret what we analyze, we create knowledge that’s true regardless of theological commitments, personal experiences, or cultural connections. That is the power of science. That is what gives science special ability to improve policy. In some cases, science is our last and best defense against wishful thinking.
In a complicated world, there are moments when we need that kind of knowledge, but we have to understand that the other three ways of knowing don’t disappear. In fact, there are many moral and ethical questions for which the other ways of knowing offer better guidance about what should be done. Sometimes in science we forget about the power of the other ways of knowing. When we present ourselves as having the only way of knowing, particularly in policy contexts in which other people are smart enough to recognize our error, people trust us less and we actually do damage to our ability to use science to improve quality of life in situations in which other ways of knowing are useful.
So when scientific claims are competently conveyed – that is, with respect to the core interests of an audience – the target audience is more likely to be interested in what we have to say. And when we are true to the scientific method, we can use that moment to convey information that is true for them, true for other people, and can serve as the basis of a stronger society.
Robert Rosner
As a question regarding other ways of knowing, I would like to bring up the case of vaccination. In 1998, The Lancet published an article by a physician named Andrew Wakefield who claimed that there was a connection between vaccination and autism. That paper has since been debunked and retracted. Yet it remains true that there are those, including Robert F. Kennedy, Jr., who simply do not accept that there is no connection. So the obvious question is, why is it that a presumably highly educated person would be so motivated to hold onto a belief that’s directly contradicted by scientific studies?
Olufunmilayo I. Olopade
I think one of the challenges of really dealing with human beings who have suffered, whether it be a loss or a disease, is understanding that they’re looking for explanations for what went wrong, they’re looking for something to hold onto. And so, autism is on the rise – we know that – and there are many families who are experiencing the challenges of raising autistic children. And because we don’t always have a reason to explain a phenomenon, patients who are suffering will find their own explanations. Cancer is another subject that we can’t sufficiently explain, and that gap has motivated a lot of people to actively search the Internet for answers. And people are comforted by the answer that best explains what they think has just happened to them or their loved ones. Grief turns into activism and advocacy. In some cases, those advocates have actually helped us to advance science communication. But the challenge that we face is assuming that this way of thinking is not legitimate. I’ve been challenged by advocates who absolutely would not accept the fact that all the cancers in their family were caused by genetics, because they’re holding onto something else. The motivations, the cultures, and the beliefs that people hold onto are not only informed by science, they’re also informed by their lived experience. And until we have definitive answers, there will be alternative ideas out there. Our job is to keep looking for solutions to alleviate the suffering, and reduce that need for answers.
To add to that comment, sometimes people who believe in the power of vaccines inflame the controversy by using the term “antivaxxers” and then going out in public and attacking the folks with these other beliefs. That action in and of itself heightens the idea that there’s a controversy on the subject of vaccine effectiveness and safety. And so, people know that autism is frightening, they’re afraid for their children, and now they are being reminded that there’s a controversy. In some cases, people feel backed into a corner, and that’s when motivated reasoning really kicks in.
Dan Kahan at the Yale Law School has a cultural cognition project. He shows that these are really instances in which people no longer hear the debate, they just kind of dig in on their side. The best way to make progress is to focus on the benefits of the vaccine, to talk about how horrible the diseases are, and to try to create a channel through which people can refocus on the benefits of vaccination. If you spend so much of your time participating in controversy and repeating false claims in an effort to debunk them, you’re fighting against yourself.
And who are the right promoters in society of this kind of message that focuses on the benefits of vaccines? Is it the scientists themselves?
Olufunmilayo I. Olopade
This is one of the reasons why scientists should get out of their bubbles. Scientists need to interact with real human beings. If you can’t communicate your science to somebody who’s totally clueless about your work, then you are actually doing yourself a disservice. When you publish a paper and you think it’s the definitive result, and you communicate it in that way, then you have not done the public a service. We need the humility that requires us to be skeptical of our own findings. We can’t just hype the results. And as exciting as immunotherapy for cancer may be, for example, we can’t overplay it and set impossible expectations. If we want the public’s support, we need their trust. But I do think the American public supports a lot of science. Now we need to show our appreciation by going out and talking to more people about it.
Scientists aren’t always the best messengers. One of the things that an audience has to do is trust you, because if they don’t trust you, they can’t hear you. Where does trust come from? One of the places it comes from is a perception of shared values. So, if we are looking down our nose at an audience, they usually pick up on that. When I was working with climate scientists about fifteen years ago, one of the things we talked about was the importance of religious leaders in the climate sphere. There were some pastors, particularly in the evangelical faith, who talked a lot about the responsibility of their congregation for stewardship of God’s creation. And if they could reach their congregation by reminding them of this responsibility, something the congregation already firmly believes, and if the pastor can connect that shared belief to a call to protect the environment, that would be much more effective than a scientist giving an abstract lecture on global climate change.
With regard to the issue of the whole process being distorted by somebody deliberately trying to affect an audience’s reasoning, such as the tobacco industry funding research to protect its own business interests, what do you recommend for communication strategies in that particular case?
The initial reaction to a piece of information is almost always emotional. To the extent that there is any rational or cognitive element, it happens later. And usually it doesn’t happen at all. We react, we have a feeling, and we move on to the next thought. In terms of how to get people to engage a scientific view of climate, here’s an analogy I like to use: imagine a tall man and a short man in a wrestling match. You might think that the tall man, because he’s much bigger, has the advantage. But in wrestling, the smaller man, closer to the ground and crouched down, can bring the bigger man down. The same is true when you enter a debate with someone assuming you have all the facts and they don’t. In terms of communication, you have to get close to the ground, down to where the core problems and concerns of the public are. Often we scientists tell the abstract story first, and then later on move to how it can directly affect someone’s life. But the way to approach an audience is to connect the information to their core concerns first and then move to abstractions later, if you can get them sufficiently interested. This approach means beginning by asking people, What about the health of your children? What about pollutants in your water? What about something you care about directly in your life? Wouldn’t you like to know how to protect it? If you tell that story, some people will want to know more, and then you will have an audience. Most of the time, we tell the story backwards, and we turn off people who would listen to us otherwise.
This is very similar to a related problem: selling science in general, not just the big problems you’re talking about. If you’re trying to sell material science, for example, which is a fundamental science that includes the discovery of new materials, why should I care? You could just tell me to pick up my iPhone, which uses materials discovered thirty years ago. But most scientists want to avoid that approach because it starts with the promise of a miracle. It is harder to express, though, that we don’t know what the outcome will be, or if we’ll be able to overcome the challenges anytime soon. That isn’t necessarily going to inspire a nonscientist.
And when I talk about strategy, I’m not talking about dumbing it down; I’m talking about us smartening up about how we convey what we know to other people. Most of us in science are trained to talk to folks who study almost exactly what we study using some of the methods that we use. We call them “ideal reviewers” or “the ideal conference panel discussant.” But when we want to reach other people, it’s an away game. Most everything interesting about an act of learning happens between the other person’s ears. So, we have to start with narratives and facts that they can hear. And then we can build understanding that way.
Returning to the autism and vaccination issue, as an example, do you think that a way of reaching the resisters could be to present the problem in terms of which is the worse, or the least bad decision? What are the chances of developing autism versus the chances of the disease you’re vaccinating against? What are the relative harms of the two? Would that be a way of addressing the issue, and persuading people to look at it in a little bit more open-minded way?
Based on the research that I’ve done, I think that most of the people who are currently skeptical of the vaccines/autism link would not be interested in that conversation. It’s like asking them to make a tradeoff between a cow and magic beans. The autism is the cow; they know what it is, they’ve seen children with autism. They fear autism. The other diseases, like measles, are abstractions, the magic beans. They can’t really imagine them. A better strategy is to have human scale stories about the effects of these diseases and circulate them. But for the folks who have already made the decision against vaccination, it’s likely they’ll never get the statistics, because autism is this real scary thing that they want to avoid pretty much at all costs.
Olufunmilayo I. Olopade
Another thing to consider is that everyone can post their opinion on the Internet. There are communities of people chatting online independent from so-called experts, and now, in medicine, most patients come in having already done some online research and may even have an opinion about what you’re trying to get them to do. So I think the conversation that scientists ought to be having is how to get our voices out there more often, including into these online forums, so that we can amplify the voice of science. People are already having these conversations and recruiting others to their views, so how can we expect, with one publication, to get all of them to come to our opinion? There’s a deep pool of ideas that are out there, and you need to compete. Scientists can’t take for granted that because we have done the research that everyone is going to accept it as the gospel truth.
We as scientists focus on how best to portray facts and convince people of facts. Social scientists, meanwhile, have looked at whether rhetorical prowess alone matters more than having your facts straight. In a debate, your ability to present may be as important as whether or not your portrayal is accurate. I’m curious what you would say about that.
I think that it’s overstated just a little bit. We can all think of an example where somebody used rhetorical prowess over fact. And we assume it must happen all the time. But in fact, most efforts to convince people of most things fail. Most communication campaigns fail to have their intended effect. But if we’re talking about something and it’s consistent with a person’s lived experience, they’ll accept it. If we talk to a person and we agree on what the problem is, and on what a good and bad solution is, and there’s an idea that science can help it, then there’s a lot of interest in that science. Where the interest starts to fall apart, or where the doubt comes in, or where rhetoric can really have an effect is when we don’t agree on what the problem is, or we don’t agree on what a good or bad solution is, because the problem could be primarily moral or ethical. That’s where someone could use a rhetorical flourish to move a conversation away from facts, and persuade through morals and ethics. But if we agree on what the problem is and we seek a more technical solution, it’s harder to do that.
Thinking about alternative sources of information and these online communities, it seems that the numbers aren’t in our favor. Even if we’re very, very skilled scientists, and know how to tell stories and listen, we are outnumbered. So often, the most trusted people in our lives are our friends and family, whereas even if scientists do a good job of trying to talk with people, they actually don’t rate among the most trusted voices. So I’m not convinced that asking scientists to communicate in a different way is going to get us anywhere here in this new world. What are some of your ideas about that?
I’m not certain that we’re outnumbered on the science. Let’s take climate change, for example. I think the most reputable polls on climate change will say that 80 to 90 percent of self-identified Democrats and 50 to 60 percent of self-identified Republicans endorse the two basic propositions that climate change is occurring and that it’s human-caused (and the partisan divide largely happened after Kyoto, for what it’s worth). But where things break down is not on that, it’s on the moral and ethical question: what should we do about it? Some very well-informed conservative legislators accept the two propositions, but don’t want to do anything about it. In private, they will tell you that they understand the science, but publicly they can’t “go there,” because once they do, they’ll be pressured to do something. So in public, they say, “the science is uncertain, so we can’t act.” And in the case of vaccines, there is a very small segment of the population who connect vaccines with autism – it might not even be above 20 percent. So, in that case, these are huge victories in terms of science.
Our problem in the scientific community is that we want it to be 100 percent, and we don’t understand why it isn’t. But there are many cases in which science conflicts with other things that people want to believe. And so we’re not going to get 100 percent on these issues, particularly if we’re trying to influence policy. What I am trying to do with my work is to help scientists not slash our own tires before we leave the driveway. A lot of times, we make it worse for ourselves by not knowing how to talk to other people.
Olufunmilayo I. Olopade
It’s also important to cultivate the culture of the scientific method early on in life. Let’s not start talking about science only when you get into high school or the university; we need to develop a scientifically literate citizenry, and that has to start early. Unfortunately, I think that the vast majority of the population live in a place where they can’t access scientific discoveries, and if they can’t access the information, they can’t believe it or experience it. And that lived experience really influences a lot of decisions that people make.
I’m wondering how much of this is just, perhaps, that society has changed? Particularly American society. At one point post – World War II, science was a hero, science was going to solve all of our problems. Now, people aren’t so sure. We need to create a society in which our children, young adults, and adults are good consumers of information, so they can make decisions and understand that there is a government process and a scientific process. Can you comment on that?
I would like to respond in two parts. First, one of the fundamental things we have to understand in science, to articulate a strategy, is how the world has changed for us.
Prior to about forty years ago, for most people, most of the information about science they could get was in their home, in their library, in their local school, and maybe in their church. And that was it. There wasn’t very much in the newspaper, on TV, or on the radio. And most people didn’t have the wealth or the resources to go to another town to get scientific information.
For about nine hundred years, people who studied at universities and became scientists had a monopoly over the provision of information. And we got used to it. We got used to not being questioned by the public. We got used to giving terrible presentations and blaming it on our students if they didn’t get it. The Internet broke the monopoly.
So the first thing to realize is that now we have to compete with many more sources of information. If we do not compete in this information space, people who can tell their stories much more effectively are going to get more eyeballs.
Second, how do we get people to pay attention to facts? In policy, there’s not always a distinction between facts and values, because your values affect what facts you think are important. For example, one of the most divisive debates in America today is over abortion. Both pro-life and pro-choice people have some pretty interesting facts on their side, many of which can be validated by science. Although these different groups will have huge disagreements on which facts are most important, both sides actually have facts at their disposal. So I think one thing that’s important is to get people used to the idea of critical thinking: show how examples of critical thinking can help them with decisions, and show that science is a form of critical thinking that can help them. And if we’re not competing in the information space to try and find the link between our science and people’s core concerns, we’re going to lose the battle for attention.
Marketing is a bad word in academia. But if you look at the Republican Party or a company that’s trying to sell something, they have a large communications arsenal behind it. You can argue that scientists have a large arsenal, too, but it’s not a coherent group that speaks with one voice. So, there needs to be kind of a communal push on some of these ideas. Otherwise, they can’t stick. Sometimes there’s a lone voice, or sometimes scientists are out there promoting conflicting ideas, because they have differing opinions about a scientific fact or approach. Can you say a little bit about that kind of collective marketing?
One-size-fits-all is really dangerous, because you’re competing on all different types of levels. When I work with an organization to try to help it communicate more effectively, we focus on two things. First is the mission statement: what is the value that you provide to a particular group? That’s the core of a narrative. Second, we look at a set of prospective learners, people we want to talk to. If there is nothing that they care about, then we have nowhere to go. But if we can find an audience where there’s an intersection, then we have to hit that intersection. This isn’t like some kinds of marketing because you aren’t selling your soul. Instead, you’re working really hard to think about this intersection between the content of your science and the set of things people want to hear.
The National Academy of Sciences, for example, has struggled with this kind of messaging for years. On their new website, “From Research to Reward,” they have a five-minute film talking about the relationship between a set of economic algorithms and kidney transplants. And if you’re not moved or crying by the end of it, we need to check if you have a pulse.
The story is that an economist, Al Roth, developed these algorithms several decades ago, and now they are used to form networks of people who can donate and receive kidneys. The algorithms match up large sets of people: if you want to donate a kidney to your loved one but you’re not a match, you can go into these marketplaces, and the algorithms will find someone who is a match for you and someone who’s a match for your loved one, and you fill each other’s need. And thousands of people now are alive because of it. The couple in the movie are from central casting, so, in a sense, they’re not necessarily science advocates. But they’re telling the story, they’re taking this thing and making it human. It’s just math. But at the end of the day, this math saved this life – these thousands of lives. That’s an example of a way to stay true to the science but communicate it in ways that people can really hear.
I think there’s an interesting question about the role of public education in the preparation of consumers of knowledge, and I think there’s a distinction to be made between individual decision-making and public policy. And in a democracy, people have rights to make decisions that affect their lives: they can choose whether or not to vaccinate their children. And we can tell them their child cannot go to public school without being vaccinated. So my question is, how do we begin to prepare citizens, in public education settings, which is the only place that we can do that, to be willing to develop dispositions to wrestle with uncertainty? And what are the criteria that we’re going to use in making these difficult kinds of decisions? I’m suggesting there’s another partner in this, beyond, first, the scientific community and, second, people who are listening to scientists and communicating their ideas. The other leg in this tripod, if you will, is public education. And it’s not just about early preparation in science as a field; across many of the disciplines we’re teaching in schools, there are issues of probability. So how do we help young people become disposed to approach and confront uncertainty?
Olufunmilayo I. Olopade
That’s a good point. One thing that’s different in a lot of countries that have good health systems, in which they can actually use research to inform policy, is that a lot of vaccines are tied to school attendance. In other countries, the government tells everyone to go get vaccinated, and they get vaccinated, and there are no questions around it. But here, because we don’t have a unified system of educating our children and supporting their health, or anything that is sort of communal, it becomes individualized. You have to go find your doctor to recommend vaccination, and depending on what state you live in, or even what community you live in, you may not have access to vaccination even if you wanted it. So there’s a lot of work to be done in terms of the public sphere: How do we educate for science? How do we make sure people have access to science early on? And how do we do science that can inform policy that we would all accept?
© 2017 by Robert J. Zimmer, Eric D. Isaacs, Robert Rosner, Olufunmilayo I. Olopade, and Arthur Lupia, respectively
To view or listen to the presentations, visit https://www.amacad.org/scientificfacts.