Thursday, June 21, 2018
AAAS 2018 Forum on Science and Technology Keynote

I recently wrote a short editorial for Science in which I celebrated the science and technology golden age that we’re living through.  It’s really astonishing if you think about it.  Never has the pace of discovery been so rapid, the range of achievements so broad, and the changing nature of our understanding so revolutionary.  Despite what one often reads in the press, and despite the way politicians often prey on our fears, our future is astonishingly bright.  In ways we couldn’t have imagined just a generation ago, we’re catalyzing new technologies, powering new businesses, fostering new industries, and improving lives.  These are remarkable times.

But there’s a catch, and it’s a big one.  Not everybody is benefitting equally from our efforts.  Inequality of all kinds—of income, of wealth, of education, of opportunity—has for the past 30 years been rising alarmingly quickly.  This is a grave problem for us as a society, and today I’d like to focus on what we can do as scientists, engineers and engaged citizens to address it.  In particular, I’ll discuss two vast economic sectors where I think science and technology can help get us back on track, improving conditions and opportunities for all:  health science and manufacturing.

In both sectors, innovation will be the key.  The economist Robert Solow demonstrated that nothing spurs economic growth more than technological advance, and that’s certainly how we’ve managed to keep expanding opportunities in the United States.  In the twentieth century, we established ourselves as an economic powerhouse and global leader in large part by creating new technologies—and we managed that feat by making major commitments to the science behind those technologies.
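Solow’s finding can be stated compactly.  In the standard growth-accounting framework (a textbook sketch, not part of the original remarks), output Y depends on capital K, labor L, and a technology level A, and the contribution of technology is measured as a residual:

```latex
% Aggregate production with technology level A, capital K, labor L:
%   Y = A K^{\alpha} L^{1-\alpha}
% Taking logs and differentiating with respect to time gives the decomposition:
\frac{\dot{Y}}{Y} = \frac{\dot{A}}{A} + \alpha \frac{\dot{K}}{K} + (1 - \alpha) \frac{\dot{L}}{L}
```

The residual term (the growth rate of A) is the part of growth not explained by adding more capital or labor; it is what Solow identified with technological advance, and in his estimates it accounted for the bulk of U.S. growth in output per worker.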

The twenty-first century, I’m convinced, will belong to those who commit themselves in even more substantial ways to innovative science and technology.  Naturally, I think we’re up to the challenge—but only if we continue doing innovative science and translate that work into practical new technologies that will keep us internationally competitive.  And on both of those fronts, we’ve got lots of possibilities in health science and manufacturing.


Health Science

Let’s start with health science.  The healthcare sector makes up 17% of our economy, and it’s growing fast.  It will soon constitute one-fifth of our economy.  That’s substantial.  We need to do everything we can to ensure a healthy society, of course, but achieving that goal requires strategies that are economically viable.  And that means making the most of the economic opportunities that the health science sector offers us.

A new report from the Information Technology and Innovation Foundation lays out the many strengths of the science side of the health sector, among them the following:

  • Health sciences employ more than 1.2 million U.S. workers.
  • The average wage is close to $125,000 in the pharmaceutical sector and just over $85,000 in the medical device sector.
  • The U.S. captures more than 40% of the world’s major patents in pharmaceuticals and medical devices.
  • American pharmaceutical firms invest more in R&D than firms in any other nation.

That’s all terrific news.  But the story is more complicated than these facts suggest.

Other nations, for one thing, have learned from our success and are working hard to catch up.  China, Ireland, Singapore, the UK, and India have policies to increase their performance in this sector—and as a result they’re already gaining ground.

In addition, despite its many strengths, our health-science sector is running a trade deficit.  The numbers are sobering.  In pharmaceuticals, for example, the U.S. ran deficits last year of roughly $56 billion, and in medical devices $4 billion.  This is a serious issue.  Deficits of this magnitude signal that competitors are moving in on a high-tech sector that we thought we dominated.

And, there’s another side to this story: the rising cost of health-care delivery.  Relative to the size of our economy, we spend about a third more on health care than other developed nations—no small sum when you’re talking about a fifth of the world’s largest economy!  Not only that, the government is on the hook for the most expensive part of this system: care for the elderly.  And like many developed countries, we are facing a dramatic demographic bulge—the dreaded aging of the Baby Boomers—that promises to exacerbate this problem.

This is not a new problem.  For decades, we’ve been discussing how to redesign the health-care system to stop its costs from ballooning—but onward and upward those costs still go.  So today I’d like us to consider a different approach.  Might it be possible for us to innovate our way out of this problem?

I think so.

For starters, we already know that innovative science and technologies have made it possible for our aging population to live healthier and more economically productive lives.  Think of what we’ve accomplished in the past century.  By developing antibiotics, we’ve dramatically reduced pneumonia, which used to be a leading cause of death.  By developing new heart-disease therapeutics, through both new drugs and new devices, we’ve reduced deaths from heart disease by over 70 percent in the last 50 years.  By developing entirely new approaches to HIV/AIDS, we turned what was a death sentence into a chronic, treatable disease.  And, new targeted drugs and immunotherapy now provide lasting remissions in some previously untreatable cancers, with the promise of more to come.  I could go on.

We also know that innovation can make a major economic difference in the cost of health-care delivery.  These costs are massive.  For example, the Agency for Healthcare Research and Quality estimated that the annual direct medical cost for cancer in 2015 was over $80 billion.  A 2017 study for the American Diabetes Association found the direct medical cost of diagnosed diabetes was $237 billion.  A recent UCSF study found that 1% of the Medicare population has End Stage Renal Disease (kidney failure).  And just this 1% accounts for 7% of the Medicare budget:  the annual cost of dialysis is $42 billion.  But innovation offers us the promise of actually solving disease challenges, of cutting these staggering treatment costs, not to mention reducing the terrible human toll.
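The End Stage Renal Disease numbers above are worth checking with simple arithmetic.  If $42 billion in annual dialysis costs represents 7% of Medicare spending, the implied total Medicare budget is about $600 billion (an inference from the figures above, not a number given in the talk), and the 1% of beneficiaries with kidney failure cost roughly seven times the program average per person:

```python
# Back-of-envelope check of the End Stage Renal Disease (ESRD) figures.
# The $42B dialysis cost and the 1% / 7% shares come from the talk; the
# implied Medicare budget is derived from them, not quoted directly.

DIALYSIS_COST = 42e9          # annual cost of dialysis, in dollars
ESRD_SPENDING_SHARE = 0.07    # ESRD share of the Medicare budget
ESRD_POPULATION_SHARE = 0.01  # ESRD share of Medicare beneficiaries

# If $42B is 7% of the budget, the whole budget is about $600B.
implied_medicare_budget = DIALYSIS_COST / ESRD_SPENDING_SHARE
print(f"Implied Medicare budget: ${implied_medicare_budget / 1e9:.0f}B")

# 1% of patients consuming 7% of spending means ~7x the average per-patient cost.
cost_multiple = ESRD_SPENDING_SHARE / ESRD_POPULATION_SHARE
print(f"Per-patient cost multiple: {cost_multiple:.0f}x")
```

These two lines make the paragraph’s point concrete: a condition affecting one patient in a hundred can dominate program costs, which is why a genuine treatment innovation for kidney failure would have outsized economic leverage.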

Finally, the health sciences themselves are also a major economic positive: this is an advanced technology sector where the U.S. retains an innovation lead.  Further innovation could help us better translate these gains into our economic benefit.

It’s clear, then, that innovation has done great things for the health sector in the past.  The question now is:  How can we do more of it in the future?

One answer is to turn to what I and others have called the Convergence Model—an integrated, cross-disciplinary approach to research and development that, in my view, offers the most promising approach to tackling our innovation challenge.

What do I mean by “convergence”?  I mean taking very different fields—the engineering, physical, and life sciences—and bringing them together in the pursuit of common research goals.

As those of you who know me can attest, I’ve been a proponent of the Convergence Model for a long time.  It’s something I’m absolutely passionate about.  In 2011, I participated in writing a report on Convergence, based on a gathering we held here at the AAAS.  We came up with a definition that still works well, I think.

“Convergence,” (from the report), “means a broad rethinking of how all scientific research can be conducted, so that we capitalize on a range of knowledge bases, from microbiology to computer science to engineering design.  In other words, the convergence revolution does not rest on a particular scientific advance but on a new integrated approach for achieving advance.”

I made the convergence of engineering and the life sciences one of my top priorities while I was the president of MIT, which means I can tell you firsthand just how promising and powerful a model convergence is for the health sector.  Biomedicine, especially, has been one of the major beneficiaries of our convergence efforts, both at MIT and in the larger community of scientists and engineers who work in the Boston area.

While implementation of convergence models has accelerated over the last several decades, many have championed it going back a very long time.  To offer just one example that’s near and dear to my heart, Karl Taylor Compton, MIT’s president from 1930 to 1949, and one of the major architects of the remarkable technology developments of World War II, recognized the potential of the convergence of biology with engineering.  He anticipated that it would ultimately be just as powerful—and as socially and economically transformative—as the convergence of physics and engineering that produced the technological miracles of the electronics industry—and that would soon give rise to the computer and information industries of the late 20th century.  In 1939 Compton described a curriculum for a new discipline, Biological Engineering, and in 1942 he changed the name of MIT’s biology department to the Department of Biology and Biological Engineering.  But Compton was well ahead of his time.  The biologists of his day didn’t yet have the necessary tools or interest, and within a few years the newly named Department of Biology and Biological Engineering reverted to its former identity.

However, (with apologies to Carmen Reinhart and Ken Rogoff) this time IS different, and now IS the right time to really put this model to work, on many fronts.

Over the last 15 years or so, an acceleration of convergence efforts has allowed us to significantly expand our existing knowledge base and has led to new medical therapies.  We’ve developed new drug-delivery systems at the nanoscale, new disease diagnostics, new predictive computer models of disease, and new interventions for genetic disorders. And we’re closing in on the management of diseases that have long proven refractory to our efforts: several kinds of cancer, a whole range of neurologic diseases and psychiatric disorders, adult-onset diabetes, and more.

Trust me—this is big.  Over the course of my career, and especially during my time at MIT, I’ve had the great fortune to meet many of the pioneers of this Convergence Model, and I’ve seen how they have translated lab discoveries into marketplace products.  Since 2007, for example, MIT’s Koch Institute for Integrative Cancer Research has fostered an exciting mash-up of engineers, clinicians, and biologists who have been working together to understand, diagnose, and treat cancer and other diseases in new ways.  Their work has already spawned more than 50 startups, at least some of which promise to revolutionize how we practice medicine.  The potential benefits to our health sector are vast.

Given all of the potential benefits, what can we do to accelerate the Convergence Movement?  We all need to think hard about this.  But I can suggest a few promising approaches:

  • First, Cross-Agency Initiatives:  I’m a big fan of government-sponsored initiatives that are designed to facilitate convergence-based thinking about specific topics or problems.  Some are already getting off the ground, like:  the Brain Initiative, to help revolutionize our understanding of the brain, particularly important given the challenges of Alzheimers and other neurodegenerative diseases; the Microbiome Initiative, to understand and better utilize the communities of microorganisms that live within us; the Cancer Moonshot, to accelerate promising cancer breakthroughs; and the Precision Medicine Initiative, to use big-data analytics to evaluate information about disease at the individual level.
  • Second, University Research Institutes:  I like this idea too—no surprise, given that I’ve played a role in creating a few of them myself at MIT.  One of the big advantages of cross-cutting research initiatives is that they leave departmental and school-based organizations in place.  While faculty appointments and the organization of undergraduate teaching can still reside in departments and schools, faculty can amplify their research through cross-disciplinary collaborations.  Other universities have adopted this convergence institute model for health science research, including Northwestern, UCSF and UC Berkeley, Harvard, Michigan, Georgia Tech, Illinois Champaign-Urbana, and more.  At MIT we used this strategy for the Institute-wide MIT energy initiative as well as for cancer.
  • Importantly, this strategy has been deployed in government research, too.  DARPA’s new Biological Technologies Office is a terrific example.  With an explicitly convergence-based approach, the Biological Technologies Office brings the life sciences into DARPA’s model for breakthrough innovation, a model that has spawned an untold number of new technologies out of computer science and the physical sciences.
  • And third, Interagency Planning:  We need significant coordination at a high level in government, where leaders can commit collectively to translate ideas into action.  I’m a strong proponent of an interagency working group—consisting of NIH, NSF, DOD, FDA, and DOE—that would work collaboratively to develop a 5-year strategic plan for advancing biomedical science, and beyond, using the convergence model.  OSTP has driven this kind of game-changing collaboration in the past, and we would welcome their adoption of the challenge of fostering cross-agency convergence strategies.

Those are just a few examples of ways we can use the convergence model to spur innovation and growth in the health sector.  I’m sure other ideas have already occurred to you.  That’s just as it should be—we all need to put our heads together, no matter what our fields and backgrounds, to think in new ways about how to do science and develop technologies to make this major part of our economy as dynamic and productive as possible.



Manufacturing

Manufacturing is a second economic sector where better performance can make a major difference in societal wellbeing and geopolitical stability—and where innovative science and technology can make that better performance possible.

Like health sciences, manufacturing is one of the biggest sectors in the economy, accounting for 12% of GDP.  It’s the sector in which we achieve most of our productivity gains, and it’s our largest job multiplier.  Manufacturing far outpaces our service sectors in the jobs it creates, both directly and indirectly.

That said, American manufacturing is in decline.  From 2000 to 2010, in part due to the Great Recession of 2008-09, the sector lost one-third of its jobs—5.8 million of them.  That’s serious.  And almost a decade after the end of the Great Recession we’re only just getting back to pre-recession levels of industrial production.  Not only that, our investment level in manufacturing capital, plant, equipment, and IT remains very low, as does our manufacturing productivity.  These are all signals of a sector that has been hollowing out.  A very sobering prospect.

But, there’s some good news, too:  on average, the manufacturing jobs that remain in this country still pay significantly better than jobs in the service sectors.  And manufacturing jobs are still the largest job multiplier; that is, they create more jobs in other areas than any other kind of job does.  But the reduction in the number of jobs—both directly and through the multiplier effect—means the median income for our working class has declined, and the growing gap between our struggling working class and our increasingly prosperous upper-middle class is putting strains on our social order and our political system.

The obvious solution to growing income inequality is to enhance quality job prospects throughout the economy.  We’re trying, but to date we’ve largely ignored the manufacturing sector in our efforts—a huge mistake, given the sector’s size and its transformative economic potential.

Part of the problem is that we have moved much more slowly in manufacturing innovation than competitor nations have.

One possible structural reason for this reaches back to the end of World War II.  As the war wound down, the federal government put together an ambitious and innovative R&D system that put a big emphasis on basic research.  The effort succeeded wildly, creating great advances for us, but we lost sight of continuing to innovate in manufacturing.  This actually isn’t very surprising.  Our post-war policies allowed us to lead the world in manufacturing.  More distant, forward-looking innovation didn’t really seem like a priority.

Other nations had to think and work differently.  They had to play “catch up.”  Germany and Japan, for example, had to completely rebuild their production systems after the end of the war, and they made innovation in manufacturing a priority.  You may recall that Japan invented “quality manufacturing” and used it to capture leadership of the automotive and consumer electronics sectors from the U.S. in the 1980s.  Germany’s famous production engineering has enabled it to run the largest trade surplus in manufactured goods ever, including with Asian nations.

Other countries have since followed the lead of Germany and Japan.  Think, for example, about how Korea, Taiwan, and China have all used manufacturing-led innovation systems to invigorate their economies.  Korea and Taiwan now lead much of the world’s semiconductor sector, and in 2011 China passed the U.S. in manufacturing output—China is now the world’s leading industrial nation.

It’s well past time for us to implement a serious program of manufacturing innovation.  In recent decades, we’ve proven ourselves, once again, to be second-to-none when it comes to innovation in technology and discovery.  But if we want to ensure our continuing social, political, and economic health, we need to think innovatively about manufacturing too.  Can we use science and technology to make possible innovations in the sector that will lead to a more prosperous future?

You won’t be surprised at my answer to this question.

I think we can.

The strategy, in my view, will be not to focus on the assembly-line–style work of the past—even though that itself was a manufacturing innovation of ours that helped spur great growth and productivity in the twentieth century.  Instead, today we need to focus our energies on what’s often called “advanced manufacturing”—that is, the emerging suite of high-tech production technologies that are transforming the American economy and the economies of competitor nations around the world.  These include digital production, advanced materials, nanomanufacturing, biofabrication, and mass customization (including additive manufacturing, or 3D printing).  These are revolutionary manufacturing technologies with almost boundless potential, and the countries that think most innovatively about how to develop and deploy them will be the countries that lead the world economically in the decades ahead.  And we need to be one of those countries.

I made advanced manufacturing another of my top priorities during my time as MIT president, and I’m proud to have served from 2010-2012 as the inaugural co-chair of the Advanced Manufacturing Partnership, a White House–led task force of government, industry, and academic leaders.  Those of us involved in that effort all agreed that we need to recognize advanced manufacturing as a highly creative process that must be fully integrated with the innovation system that we now associate primarily with our R&D efforts.

The Boston Globe recently ran an article on how my state, Massachusetts, is moving into advanced manufacturing.  Let me quote the opening of that article, because it so clearly showcases the potential that advanced manufacturing offers us as a country.

“After a prolonged period of little or no growth in the manufacturing industry,” the article begins, “Massachusetts—one of the country’s original powerhouses—is reasserting itself as a leader in what is known as advanced manufacturing, selling products that are transforming the industry nationwide.  Innovation helped drive production output in the state to a record $50 billion last year and boost employment by 1,700 jobs in the first quarter, the biggest year-over-year increase since 2000-2001.”

A little later, the story notes that this turn to advanced manufacturing has led to a rise in what’s known as “reshoring”—the opposite of “offshoring.”  “At the same time,” the reporter writes, “there are signs that more manufacturing is coming back from overseas … as the costs of separating production from research, and the risks of far-flung quality control, become more apparent …”  

This is very encouraging—and we need to encourage more of it nationally.

The question, of course, is how.

In fact, we’re already off to a good start.  As many of you know, between 2011 and 2017, the federal government sponsored the creation of 14 Advanced Manufacturing Institutes—collaborations between small and large companies, universities, community colleges, states, and state and regional economic agencies.  Let me give you just a few examples:

  • Power America, a power-electronics manufacturing institute based in North Carolina;
  • Advanced Robotics Manufacturing (ARM), based in Pennsylvania;
  • America Makes, for 3D printing, based in Ohio;
  • the Advanced Regenerative Manufacturing Institute (ARMI), for tissue engineering, based in New Hampshire; and
  • Advanced Functional Fabrics of America (AFFOA), for integrating IT technologies into fibers and textiles, based in Massachusetts.

These institutes are already making a difference.  In 2016, for example, in an applied R&D project that was backed and organized by Power America, researchers from John Deere—the American producer of farm and construction vehicles—teamed up with the Department of Energy.  Within a year, they developed a prototype high-power inverter for the hybrid motors used in heavy-duty construction vehicles and trucks.  The new inverter has much higher efficiency and fewer breakdowns than traditional transformer-based inverters.  It has become a key enabler for hybrid and electric-powered heavy-duty equipment—the kind of technology that will create jobs and help us establish ourselves as a global leader in a new generation of energy-efficient vehicles.

We need more institutes like these, along with the many technologies they will help bring into being.  But as we create more and more of these institutes, we will also need to broaden our efforts.  Let me offer only two (of many) important new directions:

  • One component would be A Communications Network:  I envision a collective structure that will allow all of our manufacturing institutes to share technologies and best practices.  Here, too, we’ve made a promising start, with a developing network called Manufacturing USA, which is already sharing approaches to thorny IP issues, to skills education, and to cybersecurity protections for new digital manufacturing processes.
  • Another, critically important component is More Workforce Education:  New manufacturing technologies will never be adopted unless we have a trained workforce ready to implement them.  We need to better integrate community colleges with industry to develop curricula for the new skills.  New online courses could be integrated with “learning by doing” in industry settings, to provide a blended learning model for advanced manufacturing skills.  In just one example, MIT Professor Harry Asada is developing “TeachBot” for the Advanced Robotics Manufacturing institute.  Collaborative, voice-commanded, flexible robots will assist workers with simple but precise tasks, as a new aid to small manufacturers.  Let me be clear, in this time of fear about “robots replacing people”:  in this example, it’s robots helping people.  This innovation will require employees who can install, operate, program, repair, and update these new technologies.  TeachBot’s desktop robot “demonstrator” and its online interactive “instructor” are designed to educate our workforce to take on these new jobs.

All of this exciting promise represents the world of “tough-tech”:  capital-intensive, long-cycle technologies.  None of these great, tough-tech manufacturing ideas will make it into the marketplace without long-term investment.  We need better strategies to encourage long-term investing.  One possible strategy has been proposed by Larry Fink, the chairman and CEO of the investment group BlackRock:  government-supported tax benefits that grow in proportion to the length of time an investment is held.  It’s one idea that could encourage capital to flow to tough-tech companies, benefiting the American economy and improving lives around the world.  We need to come up with more ideas like it—and then act on them.



There’s a lot to be excited about here.  Like health science, manufacturing is such a large and important sector that if we can breathe new life into it, we will significantly boost our economic fortunes, both nationally and internationally.  To my mind, the opportunities are vast—but they depend greatly on our committing ourselves as a country to consistent funding of fundamental research, along with developing new strategies to accelerate the transfer of new discoveries into new marketplace products, just as we did during the second half of the twentieth century.

This brings me back to that editorial I recently wrote for Science.  In the editorial, I asked what we can do to make the most of the scientific golden age we’re living through, and I suggested that a vital first step is to recognize the critical role that institutions play in nurturing the scientific enterprise.  I think that’s a point worth reiterating for all of you here with us today.

All too often science and discovery are viewed in terms of individual achievements:  what someone did to win the Nobel Prize or a MacArthur “genius” award; what someone else did to achieve tenure or to launch a billion-dollar business.  This isn’t surprising:  the institutions that support scientific inquiry—universities, research centers, federal funding agencies, and private philanthropies—are designed to foster individual achievements, amplify individual abilities, and protect individual efforts.  But achieving success in science is a team sport, and our nation’s institutions make it possible for our scientists to take part.

I’ve made quite a lot of policy suggestions in this talk—about how to harness the power of convergence and advanced manufacturing to boost our health and manufacturing sectors.  As I’m sure you noticed, these suggestions assume the existence of healthy scientific institutions—institutions that at home and abroad sponsor innovation, enable communication, and translate new ideas into action.

Unfortunately, our scientific institutions today are under political attack, and skepticism abounds about the utility of the research enterprise and even higher education.  Also under attack are many of the core principles that unite scientists:  that objective reality can be discovered; that anyone can compete in a game governed by ideas; that disagreements are best resolved by assembling facts to test competing views; and, that science and the application of scientific principles have the capacity to improve lives.

I don’t need to tell any one of you how misguided this all is.  And not just because vibrant institutions of science can help make possible the innovations in health and manufacturing that I’ve been talking about today.  In supporting individual inquiry and enabling innovation and discovery, our scientific institutions also guard democratic principles and foster human advance.  They convene people with shared purpose and amplify their impact.  It is easy to assume that our institutions can stand on their own, but they cannot.  None of science’s successes are solely “mine” or “yours.”  They are all “ours”—and it is our shared responsibility to actively defend the institutions that enable them and contribute so importantly to our well-being and prosperity.

If we want to encourage a recommitment to innovation-based economic growth, we must also recommit ourselves to the defense of our research and education enterprises that provide unrivaled opportunities for both individual success and societal advance.