In March 2010, some two hundred environmental and climate scientists convened at
the Asilomar Conference Center in Pacific Grove, California, near Monterey. Their
goal: to head off a mounting conflict between science and the public over the emerging
topic of “geoengineering”—the intentional modification of the
planet or its climate system to counteract the increasingly dire consequences of global warming.
Over the past several years, as the climate problem has steadily worsened, a growing
number of researchers have become convinced that geoengineering options—whitening
low-level sea clouds to reflect solar radiation back to space, for instance, or
injecting sulfate particles into the stratosphere to achieve the same effect—should
be studied and perhaps field-tested on a small scale. These scientists would have
us move, cautiously and deliberately, into a world where geoengineering might be
available as a last resort, a planetary insurance policy if the warming really
gets out of control.
But not everyone trusts scientists to exercise wisdom and restraint if handed such
powers. Resistance is growing among those who suspect that researchers suffer from
a steep case of hubris and are itching to “play God” with the planet.
In particular, a Canada-based civil society organization called the ETC Group mobilized
a bevy of left-wing organizations to criticize the 2010 geoengineering gathering
(an event intentionally meant to echo a famous 1975 Asilomar meeting in which biomedical
scientists assembled to set guidelines for research on recombinant DNA). Their sign-on
letter labeled the conference organizers “almost exclusively white male scientists
from industrialized countries” and implied that financial interests might
be pulling the event’s strings. The ETC Group has previously charged that
scientists are part of a “geoengineering lobby,” working in step with
those who would make big money from the deployment of planet-altering technologies.
Meanwhile, street protests have taken place outside scientific meetings where geoengineering
is under discussion. The battle has begun between scientists and activists to win
over the broader public—which, at least for the moment, appears almost entirely
clueless. According to survey data gathered by Anthony Leiserowitz of the Yale Project
on Climate Change, 74 percent of Americans have never heard of geoengineering. Another
26 percent say they have heard of it, but most appear to be misinformed, with some
confusing it with geothermal energy. Less than 1 percent of Americans appear to
know what “geoengineering” really means, or what the fight is truly about.
In sum, it’s yet another brewing conflict between science and society—one
that seems set to explode at an unspecified time in the future, at which point there
will be little reason to expect the calm voice of scientific reason to prevail over
alarmism, demagoguery, and simple fear.
Here we go again.
What should the scientific community do when conflicts erupt between scientists
and members of the public, as is beginning to occur over geoengineering? A steady
stream of rifts has arisen over the years, on topics ranging from climate change
and evolution to vaccination and genetically modified foods. In the future, as scientific
and technological advances have an increasingly profound influence on policy and
society, that stream may become a torrent.
From a scientist’s perspective, members of the public desperately need to
understand the scientific basics of a given issue in order to make good decisions
about it. When scientists find their expertise rejected—especially by activists
who seem biased or ill-informed, and who may even have a penchant for street theater—it’s
a slap in the face, a mockery of their hard work and dispassionate methodology.
One response to such offenses is simply to dismiss the public, to paint average
Americans as stupid, scientifically illiterate, or emotional. During the 1970s,
Nobel laureate James Watson famously dismissed those hoping to constrain recombinant
DNA research as “kooks,” “incompetents,” and “shits.”
Another more recent example of such lashing out was captured in the 2006 documentary
Flock of Dodos by scientist-filmmaker Randy Olson. Olson gathered a group
of scientists around a poker table to talk about the anti-evolutionist “intelligent
design” movement and how to respond to it. One offered the following strategy
for addressing the creationists: “I think people have to stand up and say,
you know, you’re an idiot.”
Whether or not these scientists recognize it, they are working in what science and
technology studies (STS) scholars have dubbed the “deficit model.” They
assume that if only their fellow Americans knew more about science and ceased to
be in a state of knowledge deficit, a healthier relationship between science
and the public would emerge.
Yet there is another possibility: perhaps scientists misunderstand the public and
fail to connect in part because of their own quirks, assumptions, and patterns of
behavior. Indeed, there is no guarantee that increasing scientific literacy among
the public would change core responses on contested scientific issues, for those
responses are rarely conditioned by purely scientific considerations. Scientists
and non-scientists often have very different perceptions of risk, different ways
of bestowing their trust, and different means of judging the credibility of information
sources. Moreover, members of the public strain their responses to scientific controversies
through their ethics or value systems, as well as through their political or ideological
outlooks—which regularly trump calm, dispassionate scientific reasoning.
The powerful influence of politics and ideology is underscored by a rather shocking
survey result: Republicans who are college graduates are considerably less
likely to accept the scientific consensus on climate change than those who have
received less education. These better-educated Republicans could hardly be said
to suffer a knowledge deficit; a more apt explanation is that they are politically
driven consumers of climate science information—and often quite voracious
ones at that. They strain information through a powerful ideological sieve and end
up loudly supporting a viewpoint that is incompatible with modern scientific understanding.
A more scientifically informed public, then, is not necessarily the same as a public
that will side with scientists more frequently. Perhaps what is needed instead is
a public that is more familiar, comfortable with, and trusting of scientists; that
is more regularly engaged by the scientific community on potentially controversial
subjects; and moreover, that is engaged before truly fraught conflicts are
allowed to emerge.
Fortunately, in recent years the deficit model has begun to lose its grip. A smattering
of recent books, with titles like Don’t Be Such a Scientist and Am I
Making Myself Clear? exhort researchers to better understand their nonscientific
audiences and the often counterintuitive dynamics of communication. In an innovative
twist, meanwhile, a much noted 2009 survey by the Pew Research Center for the People
& the Press, undertaken in collaboration with the American Association for the
Advancement of Science, inverted the traditional “scientific illiteracy”
paradigm. The survey not only polled Americans about their views of science but
also polled scientists about their views of Americans. Revealingly, it found that
while Americans tend to have positive views of the scientific community, scientists
tend to consider the public ignorant and the media irresponsible.
The resulting headline: “Public Praises Scientists; Scientists Fault Public, Media.”
Possibly the most sweeping effort yet to challenge deficit thinking
took shape as a series of four workshops organized over the past year-and-a-half
by the American Academy of Arts and Sciences and funded by the Alfred P. Sloan Foundation.
Entitled “Improving the Scientific Community’s Understanding of Public
Concerns about Science and Technology,” the interdisciplinary sessions homed
in on four areas where conflicts between scientists and the public have either already
emerged or seem ready to sprout up: the disposal of nuclear waste, the future of
the Internet, the dissemination of personal genetic information, and the adoption
of new energy technologies intended to fix our climate crisis and wean us off our
dependence on foreign oil.
Collectively, these four sessions sought to invert the common complaint that the
public needs to understand more science; instead, they suggested, perhaps scientists
need to understand more public. As Stanford University’s Thomas Isaacs,
chair of the workshop on nuclear waste, put it: “In order to be successful,
we have to do more than think we know it all, and our job is simply to tell people—and
if they don’t understand, then our job is to tell them a little bit louder.
That tends not to work.” Later in the same session, Eugene Rosa, a public
opinion expert at Washington State University, criticized the “hypodermic
needle” view of the scientist-public relationship, according to which scientific
facts are to be “injected” into Americans almost as if they are in need
of medicine—a cure that rarely, if ever, seems to take.
Rather than telling the public to take its scientific shots, the American Academy
sessions suggested that if there is a divide between scientists and the public,
perhaps both sides bear a responsibility for its existence and for bridging
the gap. Indeed, scientists and technical experts may shoulder an even greater responsibility,
considering their dramatic advantage in the knowledge arena and the funding resources
at their disposal. Most important, no one benefits from the too-common practice
of lobbing missiles across the “culture war” divide between scientists
and various subsets of the American public. This strategy simply leads to damaged
trust, a hardening of attitudes, and long smoldering conflicts—the unending
battles over the teaching of evolution and the science of climate change being the
primary cases in point.
A review of the four American Academy workshops, then, sets us on a path toward
a better, less contentious, and more productive means of managing—and heading
off—conflicts between scientists and various publics. However, the workshops
also show that there is some distance to go before scientists are accustomed to
seeing the world through the eyes of the many and diverse groups of citizens affected
by their work.
One of the workshops treated a decades-old and much studied American scientific
dispute, one in which a wealth of data and experience can be brought to bear in
discussing the causes for rifts between experts and the public: the conflict over
how and where to dispose of the nation’s spent nuclear fuel and high-level radioactive waste.
Although it is difficult today to remember any other reality, Americans have not
always been deeply divided over nuclear power. During the 1950s and 1960s, a nation
buoyed by slogans like “Atoms for Peace” overwhelmingly supported its
deployment. But in the wake of the Three Mile Island and Chernobyl accidents, and
then the conflicts over arms control during the Reagan years, a nuclear divide emerged.
For many members of the public, the problem of how and where to dispose of the nation’s
nuclear waste ranked among the most contentious aspects of the debate.
For an eloquent testimony to this fact, consider the long and dysfunctional history
of attempts to establish a national nuclear waste repository at the remote Yucca
Mountain site in Nevada. When a 1987 amendment to the 1982 Nuclear Waste Policy
Act designated Yucca as the sole site to be studied for its suitability as the nation’s
central waste repository (removing several other sites from contention), the basis
for the choice included highly scientific and technical considerations about geology,
hydrology, and tectonic activity, among many other factors. Nevertheless, the legislation
was quickly dubbed the “Screw Nevada Bill” by locals, who saw a political
ploy to dump on their state. Soon, Nevadans’ sense of grievance found political
champions like current Senate Majority Leader Harry Reid, who has opposed the Yucca
plan for two decades.
Meanwhile, the U.S. government began to spend what would eventually total $9 billion
on the research and infrastructure necessary to establish Yucca Mountain as a nuclear
waste repository. Beginning in 1987, teams of government scientists set to work
studying the Yucca site as the law required—and found themselves “pilloried
on a regular basis” by anti-nuclear activists as well as by many Nevadans,
according to Hank Jenkins-Smith, a political scientist at the University of Oklahoma
who has studied the Yucca case. The Yucca process, he opines, “was optimized
to create as much antagonism [as possible] between the way scientists understood
the world and their view or their model of the public.”
Nevertheless—and however unwelcome—the research progressed, so much
so that the Yucca site has been dubbed “the most studied real estate on the
planet.” Yet in the last year, it has become apparent that political opposition
(which includes dozens of lawsuits) is more than capable of trumping long-term government
financial commitments. Although the Bush administration moved to open Yucca by about
2020, the Obama administration has reversed course. Yucca Mountain is “off
the table,” Energy Secretary Steven Chu remarked recently. In the meantime,
the nation’s nuclear waste remains in more than one hundred temporary storage
facilities located across the country, some quite close to populous areas.
Yucca Mountain is just one example of a long-standing but problematic strategy of
identifying nuclear waste disposal sites through an approach that has been called
“decide, announce, defend.” In the past, sites have been selected through
bureaucratic and technocratic processes. Experts, working largely outside the public’s
ken, have been called on to determine whether they are safe and sustainable. Often
these technical decisions are then sprung upon the public—which has resisted.
And no wonder: the different sides approach the issue from different paradigms or
worldviews. If scientists who specialize in nuclear issues often feel unfairly attacked
by the public, the reality is that for many members of the public, scientific and
technical justifications alone—however sound—do not suffice to quell
their fears about nuclear waste disposal, its long-term safety, and its proximity
to where they live. In other words, on a topic that stirs emotions as much as this
one does, the science can very easily be good enough for the scientists but not
good enough for everyone else.
The American Academy workshop on nuclear waste highlighted a striking example of
this phenomenon. In 1991, the American Nuclear Energy Council launched a Nevada
ad campaign that employed scientific spokespersons to convince the public that the
Yucca repository itself, and the transport of waste to the site, would be safe.
However, observed Eugene Rosa, the campaign backfired dramatically: just 15 percent
of respondents in a follow-up survey said the ads made them feel more supportive
of the repository. A whopping 32 percent of respondents were moved in the opposite
direction, and roughly half did not change their opinions. Rather than softening
resistance, the ad campaign hardened the views of those who already opposed the
repository—precisely the opposite effect from what was intended.
Is there a better model for handling the fraught issue of nuclear waste disposal,
and can it lead to a different result than the policy mess—and gigantic waste
of time, effort, and taxpayer money—that is Yucca Mountain? Finding such an
approach could be especially significant in light of the growing recognition that
nuclear power, because it is carbon-free, is likely to serve as a core component
of any future solution to our intertwined climate and energy problems. No matter
how strongly desired, a “nuclear renaissance” will not be possible without
a resolution to the problem of waste disposal.
A different approach to managing potential conflicts over nuclear waste has been
attempted in Canada, under the auspices of the country’s Nuclear Waste Management
Organization (NWMO). Instead of “decide, announce, defend,” the new
approach is “engage, interact, cooperate.” Founded in 2002, the NWMO
undertook a sustained three-year program to engage the Canadian public on how to
dispose of nuclear waste and to consider—sometimes over scientists’
objections—the public’s views on the ethics and societal implications
of any waste disposal decision. The NWMO also explicitly promised that every community
would retain veto power over the location of a waste site in its neighborhood.
While the final decision on Canada’s waste repository site has not yet been
made, those involved in the NWMO process report that, thus far, even critics have
remained engaged and supportive. Dialogue has not broken down; rather, it has been
fostered and strengthened.
This kind of thinking is also becoming increasingly prominent in the U.S. context,
where the Nuclear Regulatory Commission (NRC) has undertaken new measures to strengthen
public support of its activities. According to Janet Kotra, head of the NRC’s
High-Level Waste Public Outreach Team, these steps include improving the ability
of government scientists to engage with citizens in well-designed, effective public
meetings. As Kotra put it at the American Academy meeting: “I will never forget
a former colleague who said, ‘You mean, I have to dumb down my presentation
for Ma and Pa Kettle?’ And of course, the answer to that is, yes, if you see
it that way. But if you see it that way, I don’t want you talking to them.”
If scientists want to better connect with the public on its own
terms, improved communication will be vital to their success. As Thomas Isaacs stated
at the conclusion of the nuclear waste workshop, “I think we’re talking
the talk and we’re starting, some at least, to walk the walk. But that’s
the challenge that remains.”
To unseat the deficit model and get scientists and the public talking on equal terms,
a variety of institutional barriers must be overcome. One problem is that the incentive
system in science remains highly inimical to greater public engagement. Scientists
who value or excel at public outreach often face the explicit or implicit scorn
of their peers, for whom success in technical research is the epitome of scientific
achievement and all else is secondary or even a waste of time. While attitudes may
be slowly changing in the academy, most young scientists today are still largely
trained in the mold of their professors—although, as we’ll see, some
are beginning to rebel.
Furthermore, science journalism—supposedly the means of bringing scientific
information to the public so that scientists don’t have to—is in steep
decline, at least within traditional media institutions like newspapers and television
news networks. This fact makes improving the communication and outreach abilities
of scientists more crucial than ever: increasingly, there is no one else to do this
work for them.
How exactly should scientists go about engaging different segments of the broad
American public? The nuclear waste workshop participants noted two separate communication
roles for scientists, both of which are vital (and both of which have been neglected
in the past). One is slow, steady engagement with the public on issues of concern—being
available, being open and ready to listen, and working to defuse conflicts before
they begin. Another is crisis communication, so that if and when a major event occurs
with the potential for a long-term or dramatic impact on public opinion (such as
the Three Mile Island meltdown in the nuclear arena or, in the realm of climate
change, the infamous “Climate Gate” scandal over scientists’ stolen
email messages), representatives of the world of science are able to respond quickly
before irreversible damage is done.
The nuclear waste workshop drew heavily on the work of social scientists, public
opinion researchers, and media specialists (including current and former journalists).
If scientists wish to better prepare for potential conflicts with the public—and
manage existing ones to achieve better outcomes—it will be essential to involve
these “experts.” True, they do not hail from the hard sciences. But
they have much needed skills: the ability to determine where different subsets of
the public stand on a particular issue based on survey data, for instance, and experience
studying issue cycles and patterns of media coverage so as to determine where the
tipping points may lie and which types of arguments, or frames, seem to be gaining
or losing momentum as public debate progresses and evolves. For example, social
scientist Matthew Nisbet of American University has demonstrated that with any
nascent science-policy issue (geoengineering and nanotechnology are good examples),
a series of latent meanings are already present in public discourse that could gradually
harden into dominant views on the matter.
Understanding the terms of a science-policy debate before it goes fully public—and
grasping how a particular interpretation of the issue could rise to the fore due
to a confluence of media coverage and pivotal events—would better prepare
scientists for managing the issue before it becomes widely contested. This point
deserves close attention from scientists thinking about geoengineering, and should
also guide our interpretation of two other American Academy workshops devoted to
gaps between scientists (or technical experts) and the public. Both workshops focused
on areas where scientists have already begun to anticipate future policy issues
or conflicts, but where the public seems largely unaware or ill-attuned. One concerned
the evolution of the Internet. The second covered the uses (and misuses) of personal
genetic information in an age of “personalized medicine” and direct-to-consumer
marketing of genetic tests for a variety of purposes, ranging from studying one’s
ancestry to uncovering potential health risks.
In the American Academy workshop “The Next Generation of the
Internet,” participants seemed less certain than the nuclear waste experts
about how to approach the inversion at the heart of the undertaking: the idea that
scientists (and, in the case of the Internet, technical experts) need to understand
the public, and not just vice versa. Nevertheless, the vast gap between skilled
Web technologists and average Internet users was immediately recognized. “Many
Internet experts or computer scientists are not trained in human behavior,”
opined meeting chair David Clark, an Internet expert at MIT. “They understand
the public interacts with the Internet differently, yet lack the training to effectively
incorporate public behaviors into Internet design.”
Experts and citizens also differ widely in their outlook on the Web’s future.
Experts tend to be much more concerned about issues of privacy and security than
most members of the public, who seem to want the Internet simply to function as
a reliable utility and don’t appear to worry much about entering their personal
credit card information or social security numbers on any number of websites. This
lack of concern raises a potentially troubling question: how would a public that
thinks of the Web largely as a utility—an appliance—react to a future
in which governments impose identity requirements for Web use, essentially requiring
every user to be identified by the equivalent of a driver’s license? Perhaps
they would not worry about such a development nearly as much as they should.
The overwhelming impression conveyed by the “Next Generation of the Internet”
session was that many potential problems involving security, personal identity,
and privacy could develop as the Internet evolves—problems that experts can
begin to anticipate but that the average citizen scarcely considers or worries about.
What kinds of public reactions might be expected if any of these issues were to
explode and become a matter of mass media coverage or crisis? How might we prepare
citizens for different eventualities of the Internet’s future? That was a
subject the session largely left unresolved.
Similar questions emerged from the American Academy workshop on the “Spread
of Personal Genetic Information.” As human genome sequencing becomes faster
and cheaper due to inexorable technological advances, it is becoming possible to
envision a Gattaca-like world in which knowledge of one’s own genetic
makeup is a given, not only to oneself but potentially to others as well. Indeed,
in the past half-decade genetic testing companies like 23andMe and DecodeMe have
begun marketing their wares directly to consumers, but many experts wonder how valuable
the information provided can be without the help of a skilled interpreter or genetic
counselor. Still, some citizens will undoubtedly seize upon the results and may
use them to shape their health choices.
As we move into this new world, scientists caution that there is a “mythos
of the gene” that has led much of the public to think of individual tracts
of DNA as directly linked to particular traits or disease susceptibilities. “There
is very good historical evidence from about 100 years ago to today that the public
has a very powerful notion of the influence of genes and attributes to it much more
power really than the scientific community does,” noted Philip Reilly, Chief
Medical Officer of Genetix Pharmaceuticals in Cambridge, Massachusetts.
While observable traits certainly run in families—as do diseases—in
many cases their emergence, expression, and characteristics are conditioned by hundreds,
sometimes more than a thousand, separate genes, as well as by interactions with
the environment and random events in human development. The increasing speed and
declining cost of gene sequencing provide some access to this complexity, but the
information revealed may not be particularly profound: it is not as if any single
gene “causes” anything in the vast majority of cases. Yet members of
the public may latch on to newly revealed genetic information anyway and scurry
with their 23andMe reports straight to their doctors, who may not know how to handle
or advise about the results.
Many other potential problems could arise in a world of cheaper, easier, and largely
unregulated access to personal genetic information. Will there be discrimination
based upon one’s genes? Will there be more terminations of pregnancies based
on five-week fetal genome sequencing and the alleged “flaws” it reveals?
Will law enforcement agencies have universal DNA databases for all citizens? Will
particular genetically based diseases become linked to particular races—echoing
eugenics, Tuskegee, and other nightmares of the earlier days of genetics and biomedical
science? Certainly, one of the most important recognitions about the “public”
that came out of the workshop is the fact that particular segments, such as the
African American community, have very good, historically grounded reasons to be
suspicious of medical research and advances, particularly with regard to genetics.
In general, however, the personal genetics session featured a fair amount of “hand
waving” about what the public does and does not believe about genetics. “A
number of us have said, ‘The public believes this, the public believes that,’”
objected Harvard psychologist Steven Pinker at one point. “But what is our
evidence for what the public believes? In my experience many scientists have a condescending
attitude towards what the public believes.” While the assembled scientists
and experts could envision many potential flashpoints in the future of personalized
genetics, they were less able to describe with any certainty how the public would
respond to such controversies or scenarios—much less how scientists might
prepare the public for these situations.
To be fair, the genetics workshop participants knew well what they didn’t
know. As Duke University’s Huntington Willard put it, “There’s
a thousand publics out there that one could address, any of whom has to be understood
by the scientists in order to know how to deal with them, how to work with them,
engage them, try to benefit them and be benefited by them.” It sounds, in
short, like a research agenda.
From this survey of three out of the four American Academy workshops
on scientists’ understanding of the public, general patterns begin to emerge.
On issues where a long-standing conflict exists between scientists and the public—
such as nuclear waste disposal—social scientists have also been long engaged
and have conducted considerable research on the conflicts and corresponding public
views. What’s more, scientists are probably more likely to be conversant with
this social science research, and can perhaps glean from it a better path forward.
But decades into such debates, the political and societal rift already exists. The
crisis-communication opportunities have probably been missed or squandered, and
much analysis is retrospective and “woulda, coulda, shoulda” in nature.
Battle lines have hardened (as in the Yucca Mountain case), and it may be far too
late to “fix” the situation.
On issues that are new and emergent, by contrast—the future of the Internet,
the spread of personal genetic information, geoengineering—there is comparatively
less solid research available to help scientists glean what the public “thinks”
and how it is likely to respond to future controversies. The experts are able to
glimpse, or at least imagine, what some of these controversies might look like.
But they are unaccustomed to mapping them onto existing public opinion configurations
or understandings and, in many cases, are not particularly comfortable with doing
so. Moreover, the requisite data and social science analyses may not exist in the first place.
The obvious suggestion, then, is that scientists and social scientists should team
up earlier in the issue cycle and figure out—together—how to
envision different scenarios in which a nascent field of science may impact or alarm
society. They should do so based on a well-researched and scientific sense
of where the public stands and where it is likely to move when prompted by events.
Such an anticipatory approach would not only better serve the public, it would have
the added benefit of enabling the scientific community to prepare for any crises
or conflicts that may occur.
In other words, a forward-looking collaboration is needed between research scientists,
social scientists, public engagement experts, and trained and skilled communicators.
The latter may or may not be scientists, but they should be ready to move, on a
moment’s notice, to address controversies and concerns. Meanwhile, in the
absence of any pressing conflagration, public engagement initiatives could help
sculpt a citizenry that will be less likely to distrust the scientific community,
or reject its expertise, and more willing to understand the scientific perspective
(so long as scientists approach the public openly and take citizens on their own terms).
In the competitive world of academia, how would such a forward-looking research-and-response
infrastructure be established? How would it move gingerly across policy areas and
disciplinary divides? As it happens, precisely such an initiative already exists—for
one scientific issue, anyway. That issue is nanotechnology. The National Nanotechnology
Initiative (NNI) is an interagency research effort that was launched in 2000 and
organized and given greater prominence by the 21st Century Nanotechnology Research
and Development Act of 2003. This law requires federally funded research on the societal impacts
of nanotechnology, thereby codifying an impulse already strongly present at the
NNI’s creation: that it should foster interdisciplinary research and sustained
efforts in public engagement.
Why was the central U.S. initiative to fund nanotech research—an innovative
technology that we hope will generate economic growth and new industries, if not
a “new industrial revolution”—so sensitive to societal impacts?
Nanotechnology had been viewed for some time as a potential subject for future controversy;
many feared it would be the next “GMO” issue. With the release of Michael
Crichton’s 2002 novel Prey, in which nanobots wreak havoc, and Sun
Microsystems cofounder Bill Joy’s 2000 warning in Wired magazine about
a world of “gray goo” that could result from nanotech run amok, the
groundwork seemed well prepared for such an outcome.
Therefore, the NNI has focused heavily on engaging social science researchers to
undertake the anticipatory work that will allow us to imagine how a future full
of nanotech innovations may evolve and to envision the public’s place in that
future. As David Guston, the head of the NSF-funded Center for Nanotechnology in
Society at Arizona State University, explains, “We structure dialogues between
scientists, engineers, social scientists, stakeholders, and users around a variety
of different socio-technical trajectories in a given technological space.”
Indeed, the 2003 Nanotechnology Research and Development Act is, according to Guston,
the first piece of U.S. legislation that instructs researchers to conduct social
science alongside pure science and engineering work and to involve the public and
determine what its values are in connection with nanotechnology. The model provides
much to build on, and could easily be applied to, say, synthetic biology research
and (perhaps especially) geoengineering research.
But the NNI is not the only positive sign on this front. There is also a demographic
and educational phenomenon occurring right now at universities across the country
that could be turned to the advantage of those who wish to bring scientific research,
and scientists, into better contact with society.
Surveys of young university scientists show that many would like to do something
other than follow in the research footsteps of their mentors—especially
at a time of fierce competition for a relatively small number of traditional academic
jobs. In a recent survey of one thousand graduate-level science students at a top
research institution (the University of California, San Francisco), fewer than half
designated academic research as their top career choice. Instead, these young scientists
are often interested in public engagement and communication, but face limited career
opportunities to pursue these goals.
In other words, if there is a crying need to forge better connections between scientists
and the public, there is also an army of talent within universities looking for
such outreach work. That base is young, optimistic, and stands ready to be mobilized.
The final American Academy workshop, which delved into issues surrounding climate
and energy, neatly blended many of the characteristics of the workshops discussed
above. On the one hand, it addressed a much studied and long-standing science-society
problem, one where it is far too late to stave off massive, entrenched conflict:
global warming. Anthony Leiserowitz of Yale University, a leading expert on climate
change and public opinion, made this point crystal clear in his presentation. Leiserowitz
has classified Americans into six now-famous groups based on reactions to the issue;
as of January 2010, his results were as follows: “alarmed” (10 percent),
“concerned” (29 percent), “cautious” (27 percent), “disengaged”
(6 percent), “doubtful” (13 percent), and “dismissive” (16
percent). (Disturbingly, the last group has grown dramatically from just 7 percent
in 2008, as climate-science denial has experienced a strong resurgence.)
As Leiserowitz’s results suggest, we understand the public very well on climate
change. We know Americans are thoroughly polarized and view the issue through partisan
lenses—which explains why better informed and educated Republicans are more
likely to reject modern climate science, whereas better informed and educated Democrats
respond in precisely the opposite fashion.
At the same time, the session also showed that despite the seemingly irreversible
political polarization of the public around climate change, there is much greater
potential to achieve solutions if the issue is reframed around new energy innovations.
Americans are broadly in favor of advancing energy technologies, regardless of their
political affiliation. (This finding neatly explains the recent trend in leaving
the word “climate” out of the title of various pieces of energy legislation
in the U.S. Congress.)
If we are going to throw our weight behind a variety of energy innovations, from
wind farms and solar installations to smart meters and electric cars, now is the
time for scientists and social scientists to work together to anticipate the kinds
of public resistance that may emerge to aspects of the new energy future. The American
Academy session did just that. To give but one example, the session featured a revealing
presentation, by Roopali Phadke of Macalester College, about the growing anti-wind
energy movement, which is motivated by a set of aesthetic concerns about the marring
of landscapes that scientists and the wind industry have often treated lightly or
callously. Phadke suggested that the American anti-wind movement is “growing
at a rapid pace” and is mobilizing around a common platform of concerns. Statements
by opposition leaders also suggest that future campaigns are less likely to take
the form of polite protests and may consist of more “direct actions”
against wind farms. (Incidentally, controversies over wind power installations recall
a lesson from the nuclear waste saga: don’t spring a wind farm on a community
without consulting it first.)
Happily, social science research is already in progress on how members of the public
are responding, or are likely to respond, to new energy innovations—for while
Americans express strong support for these innovations, all humans also have a tendency
to resist change when it is thrust upon them quickly, as some of these technologies
may well be.
Moreover, whether old or new, energy systems require large facilities, which have
to be put somewhere. Thus, while the public may support less carbon-intensive fuels
in theory, there may also be great resistance to attempts to obtain large volumes
of natural gas from newly reachable shale resources, often located in parts of the
country (Michigan, the eastern United States) that are not accustomed to major extraction
endeavors. Similarly, capturing carbon dioxide and removing it from the atmosphere
sounds wonderful in theory— but then it has to be stored, likely underground
and perhaps in close proximity to a community that feels uncomfortable with the prospect.
Ensuring a new energy future does not merely require an understanding of the potential
for resistance to new sources of power, or new technologies for environmental cleanup.
We must also understand how members of the public make energy decisions on an individual
and household level, where dramatic efficiency gains (and emissions reductions)
are possible. If there was one extremely heartening theme from the American Academy
meeting it was that this, too, appears to be a major growth area for research. As
Jan Beyea, an independent scientist, put it after a presentation on public adoption
of smart meters, smart appliances, and new auto technologies: “Almost every
study I cite is 2009. This area has exploded. . . . This is the time to be in it,
and I hope we can head off some of the problems ahead of time.”
Overall, the four American Academy sessions represent a critical
step in forging a more fruitful relationship between scientists and the public.
They demonstrated how little scientists often know or understand about non-scientific
audiences and technology users—and yet, at the same time, also highlighted
the fact that there is reliable data on the public to be obtained, a sound methodology
for doing so, and many opportunities for research collaborations awaiting those
who wish to undertake such projects.
As this knowledge takes hold, the hope is that it will produce more than just interdisciplinary
research. What is ultimately needed is a systematic and forward-looking way of gathering
diverse thinkers—from the hard sciences, the social sciences, and among communication
specialists—who can peer ahead at scientific issues, identify impending controversies,
and determine methods for staving off conflict. Needless to say, these researchers
will also necessarily have studied, in great detail, what can be learned from past
mistakes on issues such as nuclear waste disposal or climate change.
In sum, scientists and their institutions must set up an integrated system of research
and action that will anticipate future problems and determine how to handle
them. If the goal is to preserve public trust or to head off conflicts before they
become so fraught that there is no chance to defuse them, then reactive measures
will not suffice.
Fortunately, there are scientific means available for studying the public
and how it responds to scientific controversies—and one can hope that, in
the long term, scientists will come to embrace them.