The Myth of the Innovator Hero

Every successful modern e-gadget is a combination of components made by many makers. The story of how the transistor became the building block of modern machines explains why.

[Image: portraits of Steve Jobs, Thomas Edison, and Benjamin Franklin (Wikipedia)]

We like to think that invention comes as a flash of insight, the equivalent of that sudden Archimedean displacement of bath water that occasioned one of the most famous Greek interjections, εὕρηκα. Then the inventor rapidly translates the stunning discovery into a new product. Its mass appeal soon transforms the world, proving once again the power of a single, simple idea.

But this story is a myth. The popular heroic narrative has almost nothing to do with the way modern invention (conceptual creation of a new product or process, sometimes accompanied by a prototypical design) and innovation (large-scale diffusion of commercially viable inventions) work. A closer examination reveals that many award-winning inventions are re-inventions.

Most scientific or engineering discoveries would never become successful products without contributions from other scientists and engineers. Every major invention is the child of far-flung parents who may never meet. These contributions may be just as important as the original insight, but they will not attract public adulation. They will not be celebrated by the media, and they will not be rewarded with Nobel prizes. We insist on celebrating lone heroic path-finders, but even the most admired and most successful inventors are part of a remarkable supply chain of innovators who are largely ignored in favor of the simpler mythology of one man or one eureka moment.



Perhaps nothing explodes the myth of the Lonely Innovator Hero like the story of modern electronics. To oversimplify a bit, electronics works through the switching of electronic signals and the amplification of their power and voltage. In the early years of the 20th century, switching and amplification were done (poorly) with vacuum tubes. In the middle of the 20th century, they were done more efficiently by transistors. Today, most of this work is done on microchips (large numbers of transistors on a silicon wafer), which became the basic building block of modern electronics, essential not only for computers and cellphones but also for products ranging from cars to jetliners. All of these machines are now operated and controlled by -- simply stated -- the switching and amplification of electronic signals.

The dazzling and oversimplified story about electronics goes like this: The transistor was discovered by scientists at Bell Labs in 1947, leading directly to integrated circuits, which in turn led straight to microprocessors whose development brought us microcomputers and ubiquitous cellphones.

The real story is more complicated, but it explains how invention really happens -- through a messy process of copy, paste, and edit. The first transistor was patented more than two decades before the Bell Labs breakthrough, in 1925, by Julius Lilienfeld. In 1947, Walter Brattain and John Bardeen amplified power and voltage using a germanium crystal, but their transistor -- the point-contact transistor -- did not become the workhorse of modern electronics. That role has been played by the junction field-effect transistor, which was conceptualized in 1948 and patented in 1951 by William Shockley. Today, even the Bell System Memorial site concedes that "it's perfectly clear that Bell Labs didn't invent the transistor, they re-invented it."

Moreover, germanium -- the material used in the epochal 1947 transistor -- did not become the foundation of modern electronics. That role fell to silicon, an element roughly 150,000 times more common in the Earth's crust than germanium.



This is where another essential invention comes into the story. Semiconductor-grade silicon must be ultrapure before doping, the deliberate addition of tiny amounts of impurities that changes its conductivity.

The number of transistors on a chip went from 2,300 in 1971 to 1 million by 1990 to 2 billion by 2010
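The growth figures above imply the steady doubling popularly known as Moore's law. A minimal sketch (treating the cited counts as approximate round numbers) shows the implied doubling time:

```python
import math

# Transistor counts cited in the article (approximate round figures)
counts = {1971: 2_300, 1990: 1_000_000, 2010: 2_000_000_000}

# Doublings implied over the full 1971-2010 span
years = 2010 - 1971
doublings = math.log2(counts[2010] / counts[1971])
doubling_time = years / doublings

print(f"{doublings:.1f} doublings in {years} years "
      f"= one doubling roughly every {doubling_time:.1f} years")
```

The arithmetic works out to about 20 doublings in 39 years, or one every two years, consistent with the commonly quoted form of Moore's law.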

In order to lower the production cost of silicon wafers, the crystal from which the wafers are sliced must be relatively large. These requirements led to new methods of silicon purification (purity of 99.9999 percent is common) and to ingenious techniques for growing large crystals, both enormous technical accomplishments in their own right. The story of crystal-making began in 1918, when Jan Czochralski, a Polish metallurgist, discovered how to convert extremely pure polycrystalline material into a single crystal; procedures for growing larger crystals were introduced in the early 1950s by Gordon Teal and Ernest Buehler at Bell Labs. Soon afterwards Teal became chief of R&D at Texas Instruments, where a team led by Willis Adcock developed the first silicon transistor in 1954.


Vaclav Smil is the author of more than 30 interdisciplinary books.


