There's now a staggering amount of relevant information young academics need to consume before embarking on significant careers
There's good news and bad news in the latest research on aging and innovation in science, according to The Scientist, reporting on a recent paper by the economists Benjamin Jones and Bruce Weinberg.
Their analysis of 525 Nobel Prize winners (182 in physics, 153 in chemistry, and 190 in medicine) between 1900 and 2008 revealed that while the mean age at which laureates did their prize-winning work was around 37 in all three fields in the early 20th century, it has since risen to roughly 50 in physics, 46 in chemistry, and 45 in medicine.
So there's hope for older researchers. But the mathematician Jordan Ellenberg, writing in Slate a few years ago, presented a disheartening counterpoint to productive maturity: the progress of his discipline has lengthened the time needed to learn enough to make a significant original contribution:
[T]here's simply much more mathematics to learn than there was 100 years ago. The undergraduate curriculum at Princeton brings students to the state of the art in research -- as it was around the time of Poincaré's death in 1912. A year of backbreaking work in graduate school suffices to turn the clock forward to 1950 or so. At the age when a contemporary student first opens a current research journal, Galois had already been dead for two years (footnote: apologies to Tom Lehrer). In literature, pace Harold Bloom, it's possible to produce a great work without a deep knowledge of the work that went before. Not so in mathematics, not any more; maybe, in fact, not ever.
It's an open question whether the extension of the human lifespan, academics' growing preference for remaining active (often with full teaching loads) past 65, and new electronic tools and instruments are winning the race against the need to acquire the additional knowledge accumulated over the last hundred years.
Benjamin Jones' speculation is worth considering:
If people are naturally very productive in their 20s, either because there is some innate physiological advantage or because they are just very energetic and have strong incentive, but instead they're saddled with having to learn all this accumulated knowledge, that does suggest that we are taking a chunk out of people's innovative capabilities at a time in their lives when those capabilities are potentially very high. That suggests there's a really strong opportunity cost. That doesn't mean that's an easy problem to solve because it is necessary for these scholars to become experts before they can really make a big contribution. It's also the case that there's more to know and you just can't know everything. One implication is that people become much narrower experts. That also makes people's creativity a bit narrower.
I continue to be skeptical about low-hanging-fruit theories. (My most recent thoughts on the issue are here.) But it's intriguing to wonder whether the increase of knowledge, far from being a purely exponential process, might also have an element of negative feedback.