Somewhat to my surprise, Walter Isaacson’s new book, The Innovators, a group portrait of the men and women who invented computers and the Internet, is riveting, propulsive, and at times deeply moving. My surprise is not rooted in doubts about Isaacson’s skills; he is considered to be the leading biographer of the digital age for a reason. I was surprised because I find books about technology unreadable. I enjoy machines as much as the next Amish-by-disposition American, which is to say, among other things, that I don’t care very much about where they come from, and on those occasions when I do apply myself to the study of machines, I usually fail to understand how they work.
One of Isaacson’s jealousy-provoking gifts is his ability to translate complicated science into English—those who have read his biographies of Einstein and Steve Jobs understand that Isaacson is a kind of walking Rosetta Stone of physics and computer programming. Thanks to my close reading of The Innovators, I could probably explain, with a gun to my head, the principles of semiconduction.
But it is the very human humans behind the digital revolution who are the main focus of The Innovators, and they are the reason I found this book to be not infrequently inspiring. I read The Innovators this past summer, a comprehensively unhappy summer, as Gaza was on fire and ISIS was erupting and Ebola was beginning its fatal run across West Africa. Here at home, the mood long before the summer had soured. We are living through a period of straitened dreams, of doubt about our country and its purpose, and of widespread cynicism about its most important institutions. What I’m saying is that right now I’m a sucker for optimism, and The Innovators is one of the most organically optimistic books I think I've ever read. It is a stirring reminder of what Americans are capable of doing when they think big, risk failure, and work together.
One of the surprising features of Isaacson’s latest book, coming, as it does, after his biography of Steve Jobs—who is generally, though not entirely correctly, understood to be the model of the radical (and congenitally irascible) American—is that it is a paean to cooperation, to the idea of force-multiplication through collective effort and, in particular, to the transformative power of the diamond triangle of industry, academia, and government. (In the interview published below, I ask Isaacson why America has traditionally been the seedbed of global innovation, and whether that will continue.)
Isaacson sets out to accomplish several large things in The Innovators. Since he is fundamentally an optimist, he argues that human-computer symbiosis, rather than artificial intelligence, represents the main and best path forward, and he makes a compelling case that A.I., whether it manifests itself in benevolent or malevolent form, always seems to be 20 years away for good reason. (For a dystopian view of our future robot overlords, see this interview Isaacson just conducted with Elon Musk.) Building an “intimate connection between humans and machines” is what Isaacson says he believes in, and what he argues for.
The Innovators is also an extended argument for the U.S. to renew its commitment not only to the funding of basic scientific research, but to the rebuilding of an equitable and universally accessible public education system. Isaacson tells the story of Jean Jennings, an early computer programmer (one of six women who made themselves quietly indispensable in the development of the University of Pennsylvania’s ENIAC computer), who grew up practically penniless in Alanthus Grove, Missouri, but was able to pull together $76 in tuition each year to earn a mathematics degree from Northwest Missouri State Teachers College. The same education today, Isaacson notes, would cost $14,000, a 12-fold increase even after adjusting for inflation.
Another goal of The Innovators is to restore to history the many women who were instrumental in the development of computing, first and foremost Lord Byron’s daughter, the visionary mathematician Ada Lovelace, whose work on Charles Babbage’s Analytical Engine makes her, in essence, the world’s first computer programmer. Isaacson makes it a point to celebrate the achievements of other women in computing, including Admiral Grace Hopper, and the aforementioned Jean Jennings, who, with her female ENIAC colleagues, had been shamefully forgotten. (One maddening moment in The Innovators comes when Jennings and the other women were excluded from a January 1946 celebration at the University of Pennsylvania held to mark the first public demonstration of ENIAC. “That night there was a candlelit dinner at Penn’s venerable Houston Hall,” Isaacson writes. “It was filled with scientific luminaries, military brass and most of the men who had worked on ENIAC. But Jean Jennings and Betty Snyder were not there, nor were any other women programmers.” Isaacson quotes Jennings: “Betty and I were ignored and forgotten following the demonstration.”)
Mainly, though, The Innovators is a group biography of men who, building on each other’s achievements (and occasionally borrowing each other's achievements), accomplished extraordinary things. The heroes of this book include such figures as Vannevar (rhymes with "achiever") Bush, who is something of a hometown hero at The Atlantic, which in 1945 published his article, “As We May Think,” perhaps the most important single article about technology ever written. In it, Bush predicted the coming of personal computers, the Internet, and, in essence, Wikipedia.
Isaacson’s other heroes include J.C.R. Licklider, the father of interactive computing; Douglas Engelbart, the creator of the mouse (and much else); and Alan Kay, who is more or less the father of the personal computer. The lives of these men, who are known to almost no one today outside the world of technology (compare their fame to that of men such as Bill Gates and Steve Jobs, who stand on their shoulders), are testaments to collaboration, entrepreneurship, curiosity, and risk-taking.
These first chapters, about figures largely unknown outside Silicon Valley, are fascinating. Later chapters deal with better-known figures. Steve Jobs, about whom Isaacson probably knows more than any other observer, makes an extended appearance, and Isaacson has drawn a vivid portrait of Bill Gates. I imagine that most readers will find these later chapters more interesting than the front-half profiles, but it is the men and women who did their work before the rise of the celebrity innovator that I found so exceptionally interesting.
Below are portions of an interview I conducted with Isaacson about this book. Read it; his answers are illuminating.
Jeffrey Goldberg: You had already set out on this book when you were diverted by an offer you couldn’t refuse, to write the biography of Steve Jobs. Did the act of spending so much time with Jobs, and immersing yourself in his thinking, change the focus or idea behind The Innovators?
Walter Isaacson: The main thing I learned from Jobs was the importance of the connection between the humanities and technology. And that became a theme in this history of the digital age. Ada Lovelace represented this, and all the great innovators—Alan Kay at Xerox PARC among them—realized that beauty mattered and that our technology should have a streak of humanity in it.
Goldberg: A literal streak of humanity, by which I mean: you are quite skeptical about the future of artificial intelligence, and everyone worries about a dystopian artificial-intelligence future in which machines run away from us.
Isaacson: The intimate connection between humans and machines is something that Steve Jobs really believed in, and it was a great counter to the notion that the machines would take over. The other thing that was important from Steve, something that I had to wrestle with when understanding him, was that he had this quality of loner individualism that made him a difficult person to work with, and yet he was also a builder of really strong teams. That helped me appreciate the importance of collaboration and teamwork, but also the importance of having a strong visionary as part of the team. Steve seemed on the surface to be a prickly, difficult teammate, but in fact he brought together the strongest and most loyal team of any company in the digital age. So I had to get beyond looking at him as a kind of headstrong loner.
Goldberg: So he wasn’t a radical individualist in the classic American model?
Isaacson: The American mythology is of the person with the radical individual streak, but what Tocqueville missed is that individualism is not antithetical to forming associations. Americans have been great at barn raisings and quilting bees and all sorts of common endeavors that were undertaken by very individualistic and pioneering people.
Goldberg: Stay on this idea of pioneering for a moment. The West Coast/East Coast divide. As a Penn guy, this nags at me. By rights, the University of Pennsylvania should be to digital innovation what Stanford actually is today, because of the work done there on ENIAC, just as Bell Labs should be Xerox PARC. But they aren’t. What causes incubators to go stale?
Isaacson: The early East Coast pioneers were the University of Pennsylvania and Bell Labs, but they were hierarchical and they didn’t allow for entrepreneurial growth. For example, (John) Mauchly and (J. Presper) Eckert, the two who create ENIAC, wanted to commercialize it at Penn but couldn’t. Nor was there an entrepreneurial culture at Bell Labs; you didn’t find there the sort of anti-authoritarian, entrepreneurial culture that you saw develop in the Bay Area in the 1970s. You have a cultural mix in the Bay Area that includes Stanford being a very entrepreneurial place, where people are encouraged to do startups, where you have a counterculture that emanates from the hippie movement and the anti-war movement, plus you have the individualist Whole Earth Catalog mentality that involves wanting to have access to tools.
Intel is founded by Robert Noyce and Gordon Moore, who chafed under the authoritarian, hierarchical system at Fairchild. They ran a division of Fairchild but had to report back to headquarters on the East Coast. So they start their own company, which becomes Intel, with almost no hierarchy, just an open work space where Noyce and Moore sit in the middle of the room.
Goldberg: East Coast culture would have swamped them?
Isaacson: Even Xerox, when it decides that it wants an entrepreneurial research center, decides to put it across the country from its headquarters.
Goldberg: Go to something that I worry about: the status of the U.S. as a whole as the best incubator of technology and risk-taking invention. Most of the action in The Innovators takes place in America, but I don’t think that’s because the author is American, it’s just that such a disproportionate percentage of the most important innovations were created here. Why is the U.S. the seedbed of digital innovation, and will it remain in the dominant position for a long time to come?
Isaacson: The U.S. indulges and even encourages risk-taking and failure. The pioneering spirit translates into an entrepreneurial spirit. You have a mix of anti-authoritarian startup junkies and venture capitalists willing to roll the dice. This also brings us to the education question. America had a great education system, the kind in which a young girl from Alanthus Grove, Missouri, Jean Jennings, could go to a state college for $76 a year and become a mathematician and a programmer of ENIAC. Today, that same college costs $14,000 a year. Our education system used to serve everybody. Now there’s a divide between the education wealthy people get versus what poorer people get.
Goldberg: Has the pace of innovation slowed because of this?
Isaacson: I don't think so. I do think that America’s education system does promote creativity. It allows people to question and challenge. Einstein ran away from his school in Germany because he hated the fact that it was considered improper to challenge the teacher. I suspect that in a lot of Asian countries, the notion of challenging the teacher is less accepted than it is in the United States. I also think that societies that are comfortable with the free flow of information and the clash of opinions tend to be more creative in the information age because that’s the DNA that defines the information age.
Goldberg: Another huge task you’ve set out to achieve with this book is to remind people of the contributions women have made in advancing technology, starting, of course, with the pioneer Ada Lovelace.
Isaacson: I think I’m careful to show that a lot of women played these important roles, but also how they were not allowed to be as much a part of the system. On Ada Lovelace, sometimes you hear the criticism that she wasn’t a great mathematician, at which point I ask the critics to explain Bernoulli numbers, and how you would write a program to generate them, something I wrestled with for several days. And wrestling with it caused me to admire her even more, to admire her ability to write such a program.
Goldberg: Which figure in this group biography do you admire the most? It’s pretty clear that William Shockley is the one you least admire.
Isaacson: J.C.R. Licklider is certainly among the most noble. I think there’s a little-known succession of people who, instead of pursuing artificial intelligence, pursue an intimate connection between humans and machines, and that succession starts with Vannevar Bush, then Licklider, then Doug Engelbart, Alan Kay, and Steve Jobs. Someone like Licklider is a true hero of mine because he knew how to form teams; he envisioned interactive computing with easy-to-read screens—because he was developing an air-defense system where the console jockeys had no room for error if they misread the screen. Then there’s the Intergalactic Computer Network, which showed that he had a great sense of humor; it showed that he had a good “aw shucks, let’s do this together” sense about him. He wasn’t the sort of person who was trying to take credit for the big idea. Then he becomes the first director of the Pentagon office that creates Arpanet, which becomes the backbone of the Internet. His fingerprints are on everything, but he doesn’t claim credit for everything, which proves the maxim that there’s an unlimited amount you can get done if you don’t seek credit.
Goldberg: Is this one of the reasons you wrote this book, to give credit where credit is due?
Isaacson: As a kid, I was a real electronics geek. I loved soldering circuit boards. When I started doing digital media at Time, Inc., I couldn’t figure out who invented computers, or who invented the Internet. I became fascinated by these little-known people, but I also realized—and remember, I’m at Time, where we were always putting an individual on the cover, and I’m also a biographer—I realized that these contributions were collaborative in nature, and that many of the people who invented computers and invented the Internet worked in teams. We don’t do a very good job of celebrating people who work in teams. We celebrate the big, high-profile individual. But this was something I did with Evan Thomas when we wrote “The Wise Men,” which was about six not-very-well-known people who shaped foreign policy. I wanted to write a story about people who work in groups.
Goldberg: Another of your obvious heroes is Vannevar Bush, who wrote the 1945 Atlantic article, "As We May Think," which might be the most important article about technology ever published. What was Bush’s genius?
Isaacson: Vannevar Bush built a big analog computer at M.I.T. He was a great, great academic. He helps found Raytheon, so he understood the corporate world. And finally, he manages America’s military research efforts during World War II, overseeing computers and the Manhattan Project. He’s able to get people to collaborate—government, private industry, and universities. This becomes the core of America’s creative strength. And then he writes the article in which he says that machines are going to be extensions of our minds. They’re going to amplify our minds; they are going to help us think. This was a counter to the idea that machines were going to replace us, machines that were going to think instead of us. In Vannevar Bush’s other great article of 1945, “Science, the Endless Frontier,” which unfortunately wasn’t published in The Atlantic, he argued that government has to fund basic scientific research because that becomes the seed corn for future inventions. Especially in the Eisenhower years, government spent a lot of money encouraging basic scientific research that led to things like transistors and microchips and rockets to the moon and the Internet. Government has cut back radically on basic research funding today. There are two things that make me worry about America’s future in innovation. One is the cutback in basic research funding for universities by the government. The other is the decline of America’s K-12 education system, and the fact that it is a two-tiered system for rich and poor.
Goldberg: Is it possible that the U.S. could cease being the world leader in digital innovation?
Isaacson: I’m more optimistic than that. I didn’t write this book as a warning. I’m optimistic because I can just look at the data points. We still have people creating Google and Amazon and Facebook and Snapchat. My worries are pretty specific—the cutbacks in basic research funding, the problems in our K-12 education system. But venture capital is doing fine. People who are well-educated are doing fine. We still have a tolerance for risk—just talk to Travis Kalanick at Uber. Ask him what he did before Uber and he’ll tell you about all the companies that flamed out. And you still don’t see companies and ideas like Uber springing up from Europe and Asia. So I’m an optimist.
Goldberg: Is there a lesson for Washington in the way the tech sector has worked?
Isaacson: It’s an interesting question. Government has trouble being as entrepreneurial as the private sector. As you know, people in the private sector have a tolerance for risk and failure that doesn’t really exist in government. You know what happens in government—you get devastated even if you make a gaffe, much less have an actual failure. So there is something there.