Winter 2014 Bulletin

2013 Induction Ceremony Class Speakers

By Xiaowei Zhuang, Marc Trevor Tessier-Lavigne, Alison Gopnik, Paula Fredriksen, and Phyllis M. Wise

On October 12, 2013, the American Academy inducted its 233rd class of Fellows and Foreign Honorary Members at a ceremony held in Cambridge, Massachusetts. The ceremony featured historical readings by Sally Field (actor, producer, director, and screenwriter) and Ken Burns (documentary filmmaker). It also included presentations by five new members: Xiaowei Zhuang (Harvard University and Howard Hughes Medical Institute), Marc Tessier-Lavigne (Rockefeller University), Alison Gopnik (University of California, Berkeley), Paula Fredriksen (Boston University and Hebrew University, Jerusalem), and Phyllis M. Wise (University of Illinois at Urbana-Champaign); their remarks appear below. The ceremony concluded with a memorable performance by Herbie Hancock (pianist and composer).


Xiaowei Zhuang

Xiaowei Zhuang is Professor of Chemistry and Chemical Biology and Professor of Physics at Harvard University and a Howard Hughes Medical Institute Investigator. She was elected a Fellow of the American Academy in 2013.

It is my great pleasure and honor to speak on behalf of Class I, the mathematical and physical sciences. I would like to dedicate my speech to the scientists who, driven by curiosity and armed with mathematical and physical principles, invent tools that have transformed our knowledge and changed our lives, often in unexpected ways.

One of the major attributes that distinguish human beings from animals is that humans use tools in fascinating ways. Scientists are among the most creative tool inventors and users, developing marvelous technologies to explore the wonders of nature. For many of these technologies, their most profound applications were not foreseen at the time of their creation. As a physicist who ventured into biology, I would like to give two examples of physical tools that have transformed the life sciences and medicine in unanticipated ways.

Nuclear magnetic resonance (NMR) is one such example. NMR was originally discovered as an interesting physical phenomenon of nuclei in a magnetic field. Later, through its ability to precisely determine the structures of molecules – both chemical compounds and large biomolecules – NMR spectroscopy has transformed chemistry and structural biology. More recently, NMR-based imaging, more commonly known as MRI, has become a powerful method used by doctors to diagnose pathological tissues such as brain tumors. But when Isidor I. Rabi, Felix Bloch, and Edward M. Purcell first detected NMR signals in the 1930s and 1940s, they probably did not anticipate the enormous influence their discovery would have on life sciences. And they surely did not predict how many patients’ lives would today be saved by MRI.

The second example, one that is near and dear to my heart, is optical microscopy. Although often debated, the invention of the optical microscope is generally attributed to Galileo, one of the founding fathers of physics. Legend has it that Galileo took inspiration from the telescope that he used to look at the stars in the sky and invented a microscope with which he could study small objects on Earth. It was Antonie van Leeuwenhoek, generally known as the father of microbiology, who popularized the use of optical microscopes in biology. Using handcrafted lenses and microscopes, van Leeuwenhoek discovered bacteria, sperm, and, along with Robert Hooke, the cell. Optical microscopy has since become one of the most widely used methods of investigating the microscopic world of living things. Using modern microscopes today, we can observe signals from objects as small as a single molecule. Recently, the century-old resolution limit of optical microscopy has been overcome through inventions in physics and chemistry. Thanks to these advances, we can now use optical microscopes to see with nanometer-scale resolution how tiny molecules are arranged in cells, which is helping us understand how these molecules function together to give life to a cell. As visionary as Galileo was, he probably could not have foreseen the enormous contributions optical microscopy would make toward our understanding of the living world.

These are but two examples of the vast number of technologies that were originally invented for studying physical matter but ended up changing the way we investigate living systems. Still, many mysteries of life remain too difficult to solve today, due to the lack of proper tools. One prominent example is how the billions of neurons in our brain work together to give us cognitive power – how we think, in other words. The White House recently recognized this question as one of the twenty-first century’s great challenges, and in response, President Obama announced the BRAIN Initiative – Brain Research through Advancing Innovative Neurotechnologies – earlier this year. Clearly, more tools are needed.

As we ponder what new tools to invent and what new discoveries to make, let me reiterate that many of the technologies and scientific discoveries we rely on today were not originally intended for their current applications. Rather, they came about due to the curiosity of scientists, their innate craving for understanding how nature works. Such curiosity-driven research has advanced science and technology and benefited human well-being in profound ways. It is critical for us to remember this, especially now, as funding gets tighter and trendy science becomes both more fundable and more publishable. I hope that we still do scientific research because of our curiosity and our pure love of understanding things. I hope that research institutes and funding agencies do not judge scientific research based solely on, for example, the impact factors of the journals in which the work is published. I hope that Congress does not base its funding decisions exclusively on whether or not the research will directly lead to a cure for cancer or other diseases. I may be too greedy here; we first have to hope that Congress will reopen our government.

When I was at the White House listening to the president’s announcement of the BRAIN Initiative as a bold research effort to revolutionize our understanding of the human mind and uncover new ways to treat brain disorders such as Alzheimer’s, I remember hoping that understanding the human mind alone could be sufficient motivation for the initiative. Important as it is to cure devastating diseases like Alzheimer’s, our curiosity to understand the human mind may ultimately lead to even greater breakthroughs, the effects of which we cannot even begin to fully contemplate.

Let me end with a quote by Antonie van Leeuwenhoek: “My work, which I have done for a long time, was not pursued in order to gain the praise I now enjoy, but chiefly from a craving after knowledge.” Scientists are a curious bunch, whose cravings for knowledge have long served science and society well. Let’s hope it will remain that way.

© 2014 by Xiaowei Zhuang


Marc Tessier-Lavigne

Marc Tessier-Lavigne is President and Carson Family Professor at Rockefeller University. He was elected a Fellow of the American Academy in 2013.

I am deeply honored to accept induction into the American Academy of Arts and Sciences and to represent my fellow honorees in the biological sciences. The most pressing matters I can discuss today are the wonderful and timely opportunities to advance human health that lie before us, the challenges that simultaneously threaten our ability to benefit from them, and the steps I believe must be taken to overcome those obstacles and realize the full potential of science to improve the human condition.

Let me start by celebrating the fact that we live in a golden age of biomedical research. The past several decades ushered in a glorious revolution in the life sciences that has created unprecedented opportunities to make rapid progress in understanding, treating, and preventing disease. To give just two examples: today, we can sequence an entire human genome for only a few thousand dollars. This capability, unimaginable twenty years ago, is enabling us to decipher the genetic basis of normal human variation and of diseases, both rare and common. We also can now grow artificial stem cells, providing unprecedented windows into human biology. In my own field, this technology provides the first noninvasive route to generating human nerve cells from patients with neurodegenerative diseases like Alzheimer’s: these nerve cells can be studied in the test tube to understand why these diseased cells die prematurely. Powerful tools like these, and the growing body of knowledge they enable, have the potential to lead to effective medications for poorly treated diseases.

And these opportunities are coming none too soon, because the health challenges facing us are immense. It is true that we have seen great improvements in health during our lifetimes – mortality from heart disease and stroke has been cut in half in the past forty years, HIV infection is no longer lethal but is manageable – yet despite these successes, the growing prevalence of illnesses such as antibiotic-resistant infection, diabetes, and neurodegeneration threatens to overwhelm us. As the population ages, the number of people afflicted with Alzheimer’s disease in the United States is projected to triple by 2050, with costs reaching $1 trillion a year. Without effective therapies, the human suffering and economic toll will be devastating.

Now is clearly the time to take maximum advantage of the huge opportunities for discovery in biomedicine. Yet instead of redoubling our efforts, we are sliding back.

On the government side, funding for biomedical research has been flat for a decade and has lost 20 percent of its purchasing power due to inflation. Scientists spend more time raising money than doing research, and younger scientists are increasingly discouraged from entering research careers. If both trends continue, we are in danger of wrecking our scientific enterprise, a situation made worse by the current government shutdown.

Meanwhile, the private sector’s drug discovery efforts are not making adequate progress. Despite important successes, the number of drugs approved annually by regulators has remained flat for decades, yet the cost to make a drug has doubled every five years, climbing to over $2 billion, an astonishing amount. High costs result partly from ever-increasing regulatory requirements, but the root problem is in research and development. Companies are now adept at making drugs that modify biological targets, but have a poor record of identifying which targets are best to reduce disease and which patients will benefit most from those drugs. The upshot is that only one in twenty-four expensive drug start-up projects leads to a marketed drug – a huge and costly attrition. And many ailments – for instance, most psychiatric diseases – still cannot be tackled adequately because we do not yet know their underlying causes.

We therefore face a double crisis in generating and in applying knowledge. I believe we can nonetheless succeed, provided we deal with key funding and organizational challenges. In this context, I offer a few prescriptions for progress, some of which echo points made earlier today.

First, we must continue to work to reverse the decline in public investment in basic research. From my years of experience in the private sector, I can say that industry builds its applications on the fundamental discoveries made in academia. You cannot apply what you do not know, and history has shown that truly transformative knowledge comes from curiosity-driven research.

Second, we must simultaneously encourage and facilitate public/private interactions. For instance, for new therapies, insights from academia are already helping industry do a dramatically better job of target selection and patient selection, keys to reducing attrition and containing costs.

At the same time, while academia must engage with industry, it should not be asked to do industry’s job. There is a trend in the highest reaches of government toward favoring applied, rather than basic, research, since it helps secure public support. But if resources are constrained – as they are – and if something has to give, then it should be the public investment in applied, not basic, research. The reason is simple: there are alternative funding sources for applied research – industry, disease foundations, philanthropists – but there is only one significant source for basic research: government. Industry cannot and will not fund it. We must make the perhaps unpopular case that the greatest gains will come if the public sector supports basic research, and industry supports applied research, with both working hand-in-glove to translate research results.

We must also draw on all disciplines, not only biomedicine. We must draw from the physical sciences, for powerful imaging technologies, as Professor Zhuang has explained; from the social sciences, to understand and encourage behavior that supports good health; and from the arts and humanities, to nourish patients’ souls.

Finally, we must free up sufficient resources so brilliant young scientists can launch their careers, even if this constrains senior scientists like those of us here today. If we fail to do this, the younger generation will continue to drift toward other professions, to the detriment of all.

If we can summon the discipline to tackle these challenges, I believe the future will be bright; and so it behooves all here today, individuals of great talent, accomplishment, and influence, to continue to exercise leadership – to make that bright future ours.

© 2014 by Marc Tessier-Lavigne


Alison Gopnik

Alison Gopnik is Professor of Psychology and affiliate Professor of Philosophy at the University of California, Berkeley. She was elected a Fellow of the American Academy in 2013.

I am delighted to be here, speaking to the Academy on behalf of social scientists. I am a social scientist myself, but I also study the very best social scientists in the world, namely, babies and young children. That is not a joke or a metaphor; our scientific work over the past few years has shown that babies and very young children learn so much about the world so quickly because they implicitly use the techniques of science. They analyze statistics, perform experiments, and then use the data to construct theories and make causal inferences. They figure out intuitive theories about the world around them, and revise them on the basis of the new data that they have. And not only do they do this deep and profound science, but they do it spontaneously, as part of their everyday play, without even the incentive of becoming members of the American Academy.

And like the rest of us social scientists, babies and young children find that the most fascinating, the most important, the most profound problem – the one they work on the hardest – is figuring out what is going on in the minds of the other people around them. In fact, from an evolutionary perspective, babies and young children are designed for learning. One of the great puzzles of human evolution is why it is that we have children at all. Most people in the audience have figured out at least the superficial, proximal answer to that question; but, of course, there is a deeper evolutionary explanation. After all, babies are, not to put too fine a point on it, useless. In fact, they are arguably worse than useless because we have to spend so much time and energy taking care of them. And for human beings, our babies are useless for much longer than the young of any other species. We have a much longer period of childhood, of protected immaturity, than any other species. And that extended period of development seems to have been linked in the course of human evolution with our great cognitive abilities.

Why would that be? The answer seems to be that childhood gives us a protected period in which all we have to do is explore and learn. We human beings have a wonderful evolutionary strategy: we can rely on learning. You can plunk us in any new, varied, unpredictable environment and we can learn how to cope with that environment. But the downside to that trait is that until you have actually done all of that learning, you are going to be helpless. You don’t want to wonder, while the mastodon is charging at you, “How shall I deal with this? A slingshot, maybe, or possibly a spear?” You want to have all of that learning in place before you actually face the real challenges of being an adult human being.

And that is what we have evolved to do. There is a kind of evolutionary division of labor between children, who have nothing to do but learn and whom we protect and invest in, and us adults, who take the things that we have learned as children and put them to use to fulfill our adult goals. In fact, when computer scientists are trying to create machines that can learn, they use a similar strategy. It turns out that if you want a system to learn, and especially if you want a system to be able to learn things that are novel and unexpected, the very best strategy is to first give that system free rein to explore, to look around, to play, and only then narrow it down to solve particular problems.

Evolution seems to have used this strategy in its invention of humans, but then we humans discovered it for ourselves in the seventeenth and eighteenth centuries when we actually invented science (and also when the Academy was being founded). What we discovered was that by giving adults, who are not as swift as two- and three-year-olds, a protected place in which they could exercise their curiosity, explore, and play, we could make discoveries that would eventually provide benefits to everybody, just as the discoveries that we make through sheer curiosity and play as children helped the entire species thrive in evolutionary history.

From this perspective, it is not that children are little scientists; it is more that scientists are big children. Yet recently, this powerful strategy of providing a period of protected, playful, exploratory learning has been under pressure on two fronts. Rather than investing more in childhood, we are actually disinvesting in children. Very young children are more likely to live in poverty than any other group: 20 percent of American children are growing up in poverty, and that number is actually increasing. And when we do invest in early childhood education, children and teachers – rather than being able to exercise this exploratory, playful learning – are caught between the pressure of parents, who want their children to go to Harvard, and policy-makers, who want to show that their investments have not been wasted, usually evidenced through standardized test scores. And those same policy-makers are cutting the research budgets of both basic science, which is fueled by curiosity and a spirit of play, and basic social science, which uses that curiosity and spirit of play to figure out how we ourselves work as human beings.

Alongside the defunding of these curiosity-driven pursuits, there is increasing pressure on the scientific disciplines to produce immediate and applicable results. The irony is that the biological and psychological sciences that we have used to start to understand young children show how misguided and counterproductive this approach actually is. It is not simply that we would like to be able to do basic research; our science shows that basic research is the way to get to the most interesting and important solutions to our applied problems. What I would argue is that we should take a page from evolution: if we, in the future, want to thrive in a complicated, unpredictable, constantly shifting world, we should stop trying to make our child scientists be more like grown-ups, and instead continue to let our grown-up scientists be more like children.

© 2014 by Alison Gopnik


Paula Fredriksen

Paula Fredriksen is the Aurelio Professor of Scripture Emerita at Boston University and Distinguished Visiting Professor of Comparative Religion at the Hebrew University, Jerusalem. She was elected a Fellow of the American Academy in 2013.

Art, music, drama; language, literature, and poetry; history, philosophy, religion – these are some of the premier subjects constituting that area of our culture that we designate “the humanities,” the disciplined study of the human experience. As a family of academic disciplines, the humanities are a product particularly of the European Renaissance. Those were the good old days, when “man was the measure of all things.” The products of humanistic scholarship presupposed a certain construction of intellect, or of mind, or of self, as an autonomous “knower.” This idea, in turn, reflected commitments to or presuppositions about the individual as a moral agent freely exercising his or her will. And in these good old days, no chasm yawned between the humanities and science, which was the disciplined study of the universe. Both stood on a continuum of meaning that in a sense defined Western culture.

That was then. This is now. Despite Descartes’s best efforts, this humanist foundation has eroded. How? We could list the names of those thinkers whose work marks the stages on our road to post-modernism – Marx, Darwin, Einstein; Freud, Wittgenstein, Heidegger – but such a response would itself be “humanist,” an attempt to identify causes through the generation of an intellectual history.

Let me pull the camera further back. Looking at where we are in 2013, what defines the cultural gap between the Renaissance and us, here, now? The answer lies in a huge mosaic of issues, changes, and factors. The one most evident to me is the commercial development of technology. This child of the scientific revolution has grown much more powerful, socially, than its parent. (Creationists use email, too.) Technology indexes man’s control over the universe. And this control – most significantly, in the production of energy – has been rewarded by huge influxes of money and power. It has profoundly affected human society for good and for ill; profoundly affected, mostly for ill, the planet itself. Technology’s rewards – power and wealth – are immediate and quantifiable. More than anything else, it is these developments in the commercial deployment of technology that have displaced humanism. And if humanism has been displaced, where does this leave the humanities?

What are the humanities good for? What metric measures their worth? We were told a few weeks ago that people who read Chekhov score higher on psychology exams measuring “empathy” than people who do not. That’s nice. People who read good literature tend to write better than people who do not. This observation seems to commend literature, but it is really a commendation of good “communication skills,” something that many employers look for. That is nice, too. These apologetic efforts interpret and measure the humanities’ practical utility: majoring in English or in comp lit, they urge reassuringly, does not necessarily disqualify you from having a job. I am reminded of the New Yorker cartoon wherein a Mafioso addresses his elementary-school-age son and asks, “And how do you expect to be a made man without a good liberal arts education?”

Money complicates this picture in simple ways. The sciences bring huge grants to institutions; the humanities do not. The price of a college or university education has skyrocketed. What is the value of a degree that costs over $200,000 and prepares you for no job? How, practically, can a philosophy degree help you to pay back your education loans? Surely, only the independently wealthy can be indifferent to this problem. The rest can only rejoice if their seventeen-year-old opts for Wharton over St. John’s. Where does this leave the humanities?

I was born in 1951. When I was a child, one of the earliest and most significant, most imaginatively liberating leaps forward was the transition from picture books to chapter books. A page of unbroken prose allowed me as a reader to conjure persons and places however I wanted. A vestige of this value lingers in our hesitation to see a film made of a favorite book: we have already pictured the characters in a certain way and don’t want the disruption of seeing them embodied by somebody different. (I have to add that Colin Firth helped me get over this fear with his Mr. Darcy.)

I started teaching at the university level in 1977. By the early 1990s, I finally acceded to my students’ requests that I assign a textbook. The sources and articles that filled my syllabus were too various for them; they wanted a unified view of the material. By the mid-2000s, I could no longer tolerate doing my own homework assignments because I could not stand all the visual noise on the textbook page. Sidebars, maps, and graphs; photos, timelines, and study questions: the spread was so congested, so broken by boxes imitating windows on a computer screen, that I could scarcely pick out the exiguous thread of prose supposedly binding them all together.

What had happened? The short answer, I think, is: computers. (We can now include in this class tablets and smartphones.) Reading, too, is a technē, a skill that enables control over texts. What I have noticed as an educator is that the physical and cognitive act of reading has become progressively harder for the generations of students who have passed through my classroom. Images, sound bites, the staccato communications of social media: this is what students read. Connected prose is laborious. (Grammar is defunct.) Think again of the Renaissance, and wonder: if the very nature of literacy is changing – indeed, if it has changed already – then where does this leave the humanities?

My short answer is, I don’t know. I am a historian: I understand things only after they have happened. But just as the digital revolution has challenged our idea of what a “book” is, surely all of these seismic changes in our culture and society will alter also our idea of what a “university” is, what a “department” is, what a “major” is, what a “degree” is – and indeed, this is already happening. The modern university is also the brainchild of the Renaissance. It has had a glorious six-hundred-year run, but what comes next, I do not know.

So I cannot say what institutional shapes the humanities will take in the future; and I do not know what changing standards of literacy will do to humanistic learning. I do know that the humanities help you to grow your soul. They articulate and enrich your experience of living. They connect us with each other, across cultures, across centuries, across generations. This is a wonderful enrichment.

I would like to close by briefly telling a story of two experiences that I had in the past couple of weeks. The first is about me and Homer; the second is about me, Beethoven, and the city of Boston. Book 17 of The Odyssey: Odysseus is home, he’s mad, and he’s been disguised by Athena to look like a beggar so that nobody, for his own safety, will recognize him. But Athena forgot about one person: Odysseus’s dog, Argos. Argos is blind, he’s wasted, he’s covered with lice, and he’s lying on a dung heap, but he hears his master’s voice when Odysseus speaks to a palace servant. In that moment, Argos lifts his head, pricks up his ears, wags his tail, and dies. (I also saw Old Yeller because, as I said, I was born in 1951.) I sat on my porch, sobbing over the issue of The New Yorker that carried a translation of this particular passage of The Odyssey. My husband asked me, “What’s wrong?” I replied, “Homer does have legs. The dog scene still works.”

The second experience centered on Beethoven’s Ninth Symphony, performed by the Boston Philharmonic at Symphony Hall. Benjamin Zander was the conductor, and he spoke about an interesting observation he had made while reviewing Beethoven’s tempo markings for the Ninth Symphony. Each of the first three movements, if performed at the tempo that Beethoven indicated, lasted exactly thirteen minutes, and the choral movement, the glorious fourth movement, lasted twenty-one minutes, which meant that the entire symphony was brought home in exactly sixty minutes. But the context of this performance of the Ninth Symphony was also special. The first scheduled performance of the symphony had been canceled; it had been slated for Patriots’ Day, the day of the Boston Marathon. As a result, more than one hundred of the injured from the marathon bombing were present at Symphony Hall for the rescheduled performance, and so were a goodly number of the first responders. Off Zander went, carrying the rest of us with him, leading the symphony in a majestic gallop. It was all we could do, by the time the chorus entered, to keep from standing. When the symphony ended, everybody jumped up and erupted in applause, and the lady standing next to me, a perfect stranger, flung her arms around me. There was incredible electricity in that room, made possible through music and through human community.

Human interconnectedness. The power of disciplined imagination and of feeling. No matter how our culture goes on to configure itself, people will crave this interconnection. Humans are the hardware, but the humanities are the software. Digital revolutions notwithstanding, we the people have the priority. After all, we were the first World Wide Web.

© 2014 by Paula Fredriksen


Phyllis M. Wise

Phyllis M. Wise is Chancellor of the University of Illinois at Urbana-Champaign. She was elected a Fellow of the American Academy in 2013.

As the chancellor of a major research university, my job is to blend the idealism of 43,000 students with the practicality of running a billion-dollar enterprise with more than 2,500 faculty and 8,000 employees. An institution like the University of Illinois has no shortage of idealism. Every fall, we welcome about 7,000 students who leave their communities and their families to join us and 36,000 other students in the middle of cornfield country. They come from all over – predominantly from Illinois, but also from every state in the nation, and over one hundred countries around the world – and with incredibly different backgrounds. It is our privilege to work with these students, to teach and learn from them during these formative years of their lives. When I think about the students who grace our campus, I am reminded of Daniel Burnham, the great Chicago architect, who advised, “Make no little plans. They have no magic to stir men’s blood.”

When I think about the time that the students will spend with us, I think about the amount of change that will occur during only those four to six years of study. I think about Thomas Friedman, author of The World Is Flat (2005), and, six years later, That Used to Be Us (with Michael Mandelbaum, 2011). Friedman has said that when he wrote the first book, Facebook didn’t exist, tweeting was something birds did, 4G was a parking space, the cloud was what was in the sky, LinkedIn was a prison, and apps were what you did when you wanted to go to college. But despite the rapid pace of change, the mission of academic leaders and faculty remains steadfastly the same.

Our mission is to extend the boundaries of the minds of our students, and to extend the boundaries of what is possible for the faculty so that they can pursue what they must. Our mission is to combine that idealism with practicality. For a leader of a public research-intensive university with eroding support from the state, with revolutionary research and innovations in learning, with rising tuition that is obstructing our wish to provide excellence and access for our students, this is an incredible time in higher education. In a visioning exercise conducted at the University of Illinois over the last few years, I have asked two questions: what are society’s greatest challenges going to be twenty to fifty years from now, and what is the role of the major public research university in the United States? We gathered information from many people, including our faculty, staff, students, and alums, and the community leaders in Urbana-Champaign and in Chicago. And we came up with six emerging themes, such as energy and the environment, health and wellness, and cultural understanding.

These themes will frame our strategies over the next several years for how we recruit new faculty and develop new courses. Are we ambitious? Yes, we are. But I hearken back to Daniel Burnham: “Make no little plans.” And we are not alone in our ambition; we share it with the great universities of this country, which are together the envy of the world. I believe that higher education’s great contribution to civilization has been to develop the talent of predominantly young people. It fueled the Industrial Revolution, it fueled the knowledge economy, and today, it provides social mobility to people who may otherwise never dream of becoming leaders of our society.

Higher education, particularly at research universities, has transformed agriculture, medicine, communications, energy, our study of the environment, and transportation. And if we plan carefully, higher education will continue to play this role as a shaper of our world. I strongly believe that as educators and academic leaders, we owe it to the people of this world to provide to their daughters and sons the most transformative educational experiences that we possibly can, while we also meet the challenges of society, providing the basic discoveries, innovations, and applications that will make the world a better place for us all.

© 2014 by Phyllis M. Wise

To view or listen to the presentations, visit https://www.amacad.org/content.aspx?i=1364.