The supposedly short shelf life of public intellectuals

February 8, 2022 • 9:15 am

The following intriguing post, written by author and independent researcher Tanner Greer, appeared on the website “The Scholar’s Stage”.  Greer’s thesis is that “public intellectuals” (he uses Thomas Friedman as his main example) have a short “shelf life”—the period during which they’re producing truly creative work and ideas that intrigue the public.

Note, this differs from the period of most productive scholarship: a public intellectual becomes so by expounding creative ideas in a popular and easily understood way, and in a way that influences intelligent people in general. Many academics famous in their fields are not public intellectuals because they don’t write for or address the public.

The period of fertility Greer gives public intellectuals is 7-10 years, though as Greer admits, the fame of public intellectuals can last well beyond their “sell-by” date. In general, though, he sees this creative period of public recognition running through one’s thirties, with a falloff after that. Greer does allow that in a few areas, like history, public intellectualism can start later or last longer than a decade.

We all know that most mathematicians and physicists do their pathbreaking work when young, but is this true for public intellectuals as well?  After all, Greer notes that such people usually become “famous” for their academic work before they transition into the more general kind of punditry that characterizes “public intellectuals”.  I’m not sure I agree, but decide for yourself after reading Greer’s piece, which you can see for free by clicking on the screenshot below (note that the piece is two years old but still relevant):

Greer’s contention:

Several months ago someone on twitter asked the following question: which public thinker did you idolize ten or fifteen years ago but have little intellectual respect for today? [1] A surprising number of people responded with “all of them.” These tweeters maintained that no one who was a prominent writer and thinker in the aughts has aged well through the 2010s.

I am not so harsh in my judgments. There are a few people from the last decade that I am still fond of. But the problem is inevitable. This is not a special pathology of the 21st century: when you read intellectuals of the 1910s talking about the most famous voices of the 1890s and early 1900s you get the same impression. You even get this feeling in a more diluted form when you look at the public writing of the Song Dynasty or Elizabethan England, though the sourcing is spottier in those eras and there was no ‘public’ in the modern sense for an individual living then to intellectualize to. But the general pattern is clear. Public intellectuals have a shelf life. They reign supreme in the public eye for about seven years or so. Most that loiter around longer reveal themselves oafish, old-fashioned, or ridiculous.

. . . Thus most humans develop their most important and original ideas between their late twenties and early forties. With the teens and twenties spent gaining the intellectual tools and foundational knowledge needed to take on big problems, the sweet spot for original intellectual work is a person’s 30s:  these are the years in which they have already gained the training necessary to make a real contribution to their chosen field but have not lost enough of their fluid intelligence to slow down creative work. By a person’s mid 40s this period is more or less over with. The brain does not shut down creativity altogether once you hit 45, but originality slows down. By then the central ideas and models you use to understand the world are more or less decided. Only rarely will a person who has reached this age add something new to their intellectual toolkit.

He also quotes from a textbook (reference not given but there’s a reference list at the end):

In most fields creative production increases steadily from the 20s to the late 30s and early 40s then gradually declines thereafter, although not to the same low levels that characterized early adulthood. Peak times of creative achievement also vary from field to field. The productivity of scholars in the humanities (for example, that of philosophers or historians) continues well into old age and peaks in the 60s, possibly because creative work in these fields often involves integrating knowledge that has crystallized over the years. By contrast, productivity in the arts (for example, music or drama) peaks in the 30s and 40s and declines steeply thereafter, because artistic creativity depends on a more fluid or innovative kind of thinking. Scientists seem to be intermediate, peaking in their 40s and declining only in their 70s. Even within the same general field, differences in peak times have been noted. For example, poets reach their peak before novelists do, and mathematicians peak before other scientists do.

Still in many fields (including psychology) creative production rises to a peak in the late 30s and early 40s, and both the total number of works and the number of high quality works decline thereafter. This same pattern can be detected across different cultures and historical periods….

Greer’s example:

To give you a sense of what I mean by this, consider the career of a public intellectual whose career peaked in the early aughts. Thomas Friedman is now the butt of a thousand jokes. He maintains his current position at the New York Times mostly through force of inertia, but secondly through his excellent connections within the Davos class and his sterling reputation among those who think as that class does. But this was not always so.

(I rarely read Friedman, so I can’t weigh in here. But I’m sure many readers do and can.)

He reviews Friedman’s career, which began with him earning his laurels: two Pulitzer Prizes for journalism (reporting from a war zone) in his late 20s and 30s. He was not a public intellectual then—that phase began, in Greer’s view, in Friedman’s early forties, with two Pulitzers under his belt and then a number of books based on his columns, including the famous The World is Flat, which appeared when Friedman was 52. Then, according to Greer, Friedman, though still a public intellectual, had passed his shelf life:

Friedman would close out the decade with another book and three documentaries. These were mostly restatements of his columns (which in turn drew heavily from ideas he first introduced and developed between Lexus and The World is Flat). Friedman was still a part of the national conversation, but his perspective had lost its originality. His columns began to bleed together. This is the era when “Friedman Op-Ed Generators” went viral. Increasingly, Friedman was not argued against so much as joked about. By 2013 or so (just as he was turning 60) Thomas Friedman was done. Not technically so—between then and now he would rack up two more books, hundreds of columns, and heaven knows how many appearances at idea festival panels and business school stages. But intellectually Friedman was a spent force. His writing has been reduced to rehashing old rehashes, his columns the rewarmed leftovers of ideas grown old a decade ago. It is hard to find anything in his more recent books or columns that has mattered. He is able to sell enough books to live comfortably, but you will have difficulty finding anyone under 50 who admits they have read them. Friedman lingers still as a public figure, but not as a public intellectual. His thinking inspires no one. The well has run dry.

Why did this happen? According to Greer, Friedman was no longer living in the world he inhabited when he was younger. He was no longer a journalist covering a war and drawing his ideas and columns from his experience of that earlier world; instead he was recycling ideas and living in an entitled world where he jets around giving lectures. The well had run dry.

According to Greer, what holds for Friedman holds for others as well. Greer offers two reasons for the decay of public intellectuals:

1.) Brain decay. 

I suspect the underlying mechanism behind this pattern is brain cell loss. Neuroscientists estimate that the average adult loses around 150,000 brain cells a day; in the fifty years that follow the end of brain maturation (ca. years 25-75), the average brain will lose somewhere between 5-10% of its neurons.[3] Fluid intelligence begins declining in a person’s 30s.[4] This implies that most humans reach their peak analytic power before 40. Crystal intelligence holds out quite a bit longer, usually not declining until a person’s 60s or 70s. This is probably why historians reach peak achievement so late: the works that make master historians famous tend towards grand tomes that integrate mountains of figures and facts—a lifetime of knowledge—into one sweeping narrative.

He gives little evidence that loss of brain cells is responsible for public intellectuals’ decline (after all, is 5-10% so substantial that it erodes creativity?). And there’s some special pleading here: he says that historians and philosophers are exceptions because their public intellectualism builds on studies and thoughts that “crystallized over the years.” Well, that sounds good, but before we start advancing theories, can we have some data? Remember, though, that he’s not talking about scholarly or academic achievement or excellence, but about the avidity of the public to consume what the person produces. And it’s hard to get data on that. (For scientists, we could get an idea of academic achievement over the years using a number of metrics, including citations. I do that below for myself.)
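A quick back-of-the-envelope check of Greer’s own figures, using the common estimate of roughly 86 billion neurons in an adult human brain: 150,000 cells/day × 365 days/year × 50 years ≈ 2.7 × 10^9 cells, and 2.7 × 10^9 / 8.6 × 10^10 ≈ 3%. That’s at or below the bottom of his 5-10% range, and it says nothing about which neurons are lost, so the question stands.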

2.) Loss of the experiences that produced their renown.  Again Greer uses Thomas Friedman (poor guy!) as an example:

Recognizing this helps us make sense of many interesting aspects of human social life. I think often about Vaisey et al.’s 2016 study, which demonstrated that most shifts in social attitudes occur not through change in attitudes at the individual level, but through intergenerational churn.[5] Old attitudes die because generations that hold them literally die off. Such is the stuff of progress and disaster.

Such is also the problem of the public intellectual. A public intellectual’s formative insights were developed to explain the world he or she encountered during a specific era. Eras pass away; times change. It is difficult for the brain to keep up with the changes.

Not impossible, just hard. And this brings my second, sociological explanation into play. There are things that a mind past its optimum can do to optimize what analytic and creative power it still has. But once a great writer has reached the top of their world, they face few incentives to do any of these things.

Consider: Thomas Friedman’s career began as a beat reporter in a war-zone. He spent his time on Lebanese streets talking to real people in the thick of civil war. He was thrown into the deep and forced to swim. The experiences and insights he gained doing so led directly to many of the ideas that would make him famous a decade later.

In what deeps does Friedman now swim?

We all know the answer to this question. Friedman jets from boardroom to newsroom to state welcoming hall. He is a traveler of the gilded paths, a man who experiences the world through taxi windows and guided tours. The Friedman of the 20th century rushed to the scene of war massacres; the Friedman of the 21st hurries to conference panels. What hope does a man living this way have of learning something new about the world?

More importantly: What incentive does he have to live any other way?

What, then, should you do if you aspire to this status—and who does, really? Doesn’t it often happen by accident, when someone is just driven to write a book and get their ideas out into the ether? (Greer’s emphasis below.)

There are practical implications for all this. If you are an intellectual, the sort of person whose work consists of generating and implementing ideas, then understand you are working against time. Figure out the most important intellectual problem you think you can help solve and make sure you spend your thirties doing that. Your fifties and sixties are for teaching, judging, managing, leading, and dispensing with wisdom. Your teens and twenties are for gaining skills and locating the problems that matter to you. Your thirties are for solving them.

Crikey! Does anybody really set out to do this?

When you have a public presence and read a piece like Greer’s, your mind automatically reviews your own career. I do not regard myself as a public intellectual, not by any means, but simply as an (ex-)scientist who has a bit more public presence than other scientists. But my career, both scholarly and in the public eye, doesn’t fit Greer’s narrative. (Remember, though, that we shouldn’t attack his thesis by citing anecdotes, even though he himself adduces little more than anecdotes.)

Perhaps I can say that my scientific creativity peaked in my late thirties—around 1989, when I was doing some decent experiments—but if that were the case, my citations by other scientists should have begun dropping by the time I was sixty, around 2009. That was palpably not the case according to the plot below (this is the first time I’ve ever seen these data!): I was still doing experiments, and the citation rate started rising. But remember, Greer’s talking about public renown, not academic renown. By the time I was thirty—at my supposed creative peak—I was only getting started (note that the plot below begins when I was forty!)
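(If you’d like to try this on your own record, charting citations per year is simple; below is a minimal sketch in Python using matplotlib. The yearly counts in it are placeholders, not my data; substitute the numbers from your own Google Scholar profile.)

    import matplotlib.pyplot as plt

    # Citations per year; the counts below are placeholders, not real data.
    # Replace them with the yearly numbers from your own citation profile.
    citations_by_year = {
        1990: 120, 1995: 180, 2000: 260, 2005: 420,
        2010: 700, 2015: 950, 2020: 1100,
    }

    years = sorted(citations_by_year)
    counts = [citations_by_year[y] for y in years]

    plt.bar(years, counts, width=3)  # one bar per sampled year
    plt.xlabel("Year")
    plt.ylabel("Citations in that year")
    plt.title("Citations per year (placeholder data)")
    plt.show()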

My proudest achievement, from the period when I think my mind was most capacious and supple, was writing the book Speciation with Allen Orr. It came out in 2004, when I was 54, and when I look at it now, I think, “How could I have thought of that stuff?” I know now that I couldn’t write that book again; my mind is duller and, of course, I’m retired.

I suppose my “public presence” began in 2009, when I published Why Evolution is True and was also writing a lot of book reviews for well-known venues. But at that time I was 60, and Faith Versus Fact came out six years later. The citations below don’t measure that kind of “public presence”, and I don’t know how to measure it, nor do I much care. I’m content with what I’ve done.

So I don’t fit the narrative; my scientific achievements did not come primarily in my thirties, as I was still a postdoc for part of that time and didn’t get a real job until I was about 33. And remember that Greer argues that creative achievement peaks during one’s thirties.

The public intellectuals I do know, mainly Richard Dawkins and Steve Pinker, don’t seem to me to have passed their sell-by date even now, because they’re still followed and read avidly, and their stuff is not mere recycling of old ideas. Steve’s Better Angels book came out when he was 57, and that, whether you agree with its thesis or not, is a work of thought and originality.

Dawkins hews a bit better to Greer’s thesis: The Selfish Gene, arguably his most famous book, came out when he was just 35, but since 1976 he’s been turning out a number of books that aren’t recyclings of ideas from that first book, but offer further and different ways to consider evolution. (Richard has said he’s most proud of The Extended Phenotype as a work of originality. Its thesis is not at all the thesis of the earlier work.)

My favorite “popular” book by Dawkins, The Blind Watchmaker, came out when he was 45, and The Ancestor’s Tale when he was 63. Yes, he’s still writing about evolution, but his biggest burst of fame came when he was 65 with the publication of The God Delusion. I don’t think you can say he’s missed a beat over all that time, and of course he’s entered a new field: anti-theism, though I maintain that with him, as with me, opposition to religion grew out of the frustration of trying to sell evolution to a recalcitrant, religious public. At any rate, he’s still writing about evolution with his new book on the evolution of flight, and he’s still doing pretty much what he was doing forty years ago.

But I digress.  It’s clear that many people decline both academically and as public intellectuals as they age, and part of that may be due to loss of brain cells. But I can’t say that either Pinker or Dawkins has shown obvious loss of neurons! And both continue to be immersed in the worlds that motivated their “publicness.” Maybe, then, the public intellectuals I know well just happen to be exceptions to Greer’s Rule.

In the end, Greer’s thesis may be a case of Hitchens’s razor: “What can be asserted without evidence can be dismissed without evidence.” But I won’t be quite that glib, for there are two phenomena (creativity and public engagement) that do generally decline, and this deserves some scrutiny.

32 thoughts on “The supposedly short shelf life of public intellectuals”

  1. I read both of your popular books, and that is how I found your blog. And I notice that the bias here is about writing and publishing and being able to live off being a writer/thinker. Lawrence Block is a popular writer who explores this bias a lot. Writing has to make money to be valuable. “Shelf life” and “sell-by date” are weird terms to apply to people, but determining a person’s value by their creative output seems to me the harshest of all punishments.

  2. Possibly Greer is out of date as well. Whatever you call yourself, the introduction and continued management of your website has extended your time by a number of years, and it continues. Maybe Greer is too old to consider that. Certainly Dawkins has lasted well past this use-by date. The world changes faster all the time, so if Greer does not watch out, the “intellectuals” may come and go before he has even heard of them. In the world of evolution these things can last a long time.

  3. A conjecture:
    The ideas that eventually led to your “scientific creativity peak[ing] in [your] late thirties—around 1989, when [you were] doing some decent experiments”, those ideas you may have started to form in your twenties. Then it took years to refine these ideas and wait for the right opportunity to do an experiment.

    Peak creativity, engaging in truly novel ways of thinking for the first time, may happen early in a research career. But many researchers are not aware of that. Except for mathematicians.

  4. I suspect that there are many factors at play. Some will be more important than others.

    However, I suggest that one key element is that once you have earned public acclaim, it is very easy to slide into defence of your original ideas to protect your reputation, whether or not that defence is warranted. Epicurus held that seeking fame for itself was unwise and likely to cause unhappiness.

    Some intellectuals do ‘mature’ though… Darwin updated his Origin of Species for each edition (although it might have been more accurate if he hadn’t). But then Darwin was a gentleman who mostly avoided the limelight.

  5. I have a speculation as to why public intellectuals’ creativity dries up after their 30s, at least in the humanities. I think that for most people it is during their 20s or 30s that their understanding of the world crystallizes and is likely to change little in their later decades. So, after that time they simply regurgitate their worldviews adapted to changing circumstances. In other words, when an idea or belief takes hold in the human mind it is very hard (although not impossible) to change it.

  6. I wouldn’t be surprised if it’s actually partly a case of regression to the mean. Most people will never do anything that will make them a “public intellectual”, however one wants to define it, even for those who work in fields in which that’s possible. For those that do, such occurrences will be outliers even in their own lives, a rare flash of brilliance or catchiness or being in the right place at the right time. Then they revert to their more “normal” state.

    Another factor, of course, is the fact that once one is in a position of fame or notoriety, one’s incentives change. And humans respond to their incentives. When you’re not at the top, there’s less risk in thinking or trying or creating something new, and if one thing doesn’t work, you can try another. When you’re already successful, you don’t tend to upset the apple-cart; you play it safe(r), and you also have people around you parroting your own and related ideas back to you in a reinforcement bubble, and it’s more comfortable. That doesn’t incentivize originality much.

    People ARE perfectly capable of remaining creative and original as time goes by…especially if their circumstances require them to do so. Science, thankfully, especially in the most active fields, often provides surprising changes to prior understanding (as in physics and biology for most of the last century and a half), forcing those who aren’t too set in their ways to adjust creatively to new ideas.

    1. “Another factor, of course, is the fact that once one is in a position of fame or notoriety, one’s incentives change.”

      I’ve never been “in a position of fame or notoriety”; I’m a poet, after all, so that would be oxymoronic. But I think you’re right, Robert, that “once one is in a position of fame or notoriety, one’s incentives change.”

      I spent 10 years trying to get my first poetry collection published; the manuscript was never on my desk for more than a day in all that time. I was incentivized mainly by the fact that I knew my work was good and no one else seemed eager to agree. Finally, when I was in my thirties my first book won the Princeton Poetry Series competition and was nominated for a Pulitzer Prize. So I now had at least some “notoriety” and started being concerned and careful about losing it, deciding that from now on I’d write only good poems. Which is suicide for a poet, since you have to write at least 10 “bad” poems for every one “good” one. I suspect this change in motivation when one succeeds after a long struggle is not uncommon among writers.

      But you’re a writer, Robert—are you basing this observation on your own experience?

        I’ve never been successful enough to know; I’m just basing this on the entirely understandable way people behave in response to the various pressures, internal and external, that weigh on them.

        But I’ve certainly been through losing literally (almost) everything that ever mattered to me, and I’m still alive (from a certain point of view, anyway), so I probably am a little less afraid of loss than some others might be. You know what the song says, “Freedom’s just another word for nothin’ left to lose.”

        Still, if I were to suddenly become an internationally acclaimed and wealthy author, I might become more risk averse. I’m eager to try the experiment! ^_^

        1. That link was hilarious, Ken, not to mention chucklesome, droll, farcical, riotous, risible, sidesplitting, uproarious, and just plain funny. I miss ol’ Garrison.

  7. My scientific career also is the complete opposite of Greer’s thesis. I’m 64 now, and last year was my best year for citations (approx. 1900). I started my “second life” as a quantitative biologist in 2006, at the age of 49. Before that, I had almost zero citations. So there is hope for us old codgers even after we pass our thirties.

    1. Heck, Lou, the way things are heading, you may turn out to be the Grandma Moses of quantitative biologists. 🙂

  8. I’ve long noticed a phenomenon among aging artists. Whether it’s actors, directors, writers, musicians, etc., an artist who blazed onto the scene due to the vibrancy of their talent and originality later mellows, but always seems to think “I’m better now than when I was younger.” They talk about how they were irascible and idealistic when young and didn’t have all the tools, but how with age they’ve gained wisdom, perspective, more sophistication, and a better grasp of their craft. And almost inevitably these same people are now producing more banal, forgettable work.

    This creative burst thing really does seem to be for “the young.”

    (Of course a statement like that, and an article like the one Jerry cites here, will immediately bring to any reader’s mind exceptions…but it seems more the rule than the exception, IMO.)

  9. This isn’t exactly breaking news. Back in 2003, one of this nation’s leading public intellectuals — Richard Posner, then the chief judge of the federal 7th Circuit Court of Appeals (and a one-time SCOTUS nominee heir apparent) and a former law professor at the University of Chicago (not to mention, our host’s cat buddy) — wrote a book about it, Public Intellectuals: A Study of Decline.

    The focus of Posner’s book was, nonetheless, a bit different — the decline in the quality of people deemed “public intellectuals,” rather than the decline in the quality of work of individuals once they are so deemed.

    1. Edit: the paperback edition of Posner’s book that I read was published in 2003; the original hardcover edition, in 2001.

  10. Nearly all of the successful academics I know, as measured by tenure, publications, citations, grants, etc., are those who appear to actively avoid being a public figure and generally just “stay in their lane”: mentoring students, serving on various college committees, submitting grant proposals and manuscripts, and the like. Financially, all of these professors are set for life. It may explain why there appears to be so little pushback against some of the excesses of the progressive left. It just isn’t worth it, especially if it is all just another fad that will soon blow over.

    As I see it, being a public intellectual also means being a public target and being under constant pressure to keep the gravy train flowing. Thomas Friedman is a good example of someone who capitalized on his initial fame and has managed to rest on his laurels ever since, even though in my view, his ponderings were (and still are) rather dull and unoriginal.

    Taking more extreme positions is another way to keep the gravy flowing, so as to capitalize on clicks and outrage. Plenty of examples for this approach, too (e.g., Jordan Peterson, Dave Rubin, and too many on Fox and the political right to name).

  11. I feel that Greer dismisses artists out of hand. Here’s a list of artists (conceived broadly) who come readily to mind for their continually increasing creativity into ripe old age: Michelangelo, Verdi, Monet, Matisse, Picasso, Clint Eastwood, Martha Argerich, Martha Graham…I think that one mark of a great artistic genius is that their powers don’t fall off as they age. (Vaal, I was composing this as your comment appeared.)

  12. Tanner Greer’s leading example, Thomas Friedman, has been the butt of jokes for longer than Mr. Greer seems to suppose. Many were mocking the mustachioed mahatma in print — none, perhaps, more mordantly than Matt Taibbi — even when Friedman was supposedly at the peak of his public-intellectual powers.

  13. Then there is the question of musical creativity in old composers. Some of the most interesting works of Janacek, Ralph Vaughan Williams, and Havergal Brian were products of their old ages. On the other hand, Jean Sibelius, who was famously self-critical, published not a note after age 64, though he lived another 28 years. The question was explored by a psychologist and music lover in the Psychoanalytic Review at: https://pep-web.org/search/document/PSAR.087.0429A .

    As for me, I believe I had one new idea about a year ago, but then I forgot it. There are those who claim that I have been slipping into dementia since my first academic appointment a long time ago.

  14. There is also the issue that even in the world of ideas there are fads and fashions. Thinkers go in and out of style among the intelligentsia, often in the same way clothing or bands do. Someone like Richard Dawkins is currently “out of fashion” among many woke people, but this might change if wokery declines.

  15. That is interesting, but another side to the public shelf life is the audience. I posit (without evidence, but what the hey) that a public person’s audience ages with them, and that audience just stops paying attention to the rehashing of old ideas after a while. Younger people may not pick up on the aging public intellectual because they don’t relate to them as they would to a younger and more relatable intellectual.

    It may be the case that Pinker and Dawkins have “kept” longer because their public work falls into the areas of history and philosophy, those being fields that are kinder to older persons, according to Greer.

  16. Martha Argerich
    Not a public intellectual
    But
    Still performing (piano) at over 80

    : https://youtu.be/AYkQleTcck8

    She is a force of nature. Note her _fluency_ in _multiple_ languages, and survival of .. I think ovarian cancer.

    I think the notion of a 30 year old peak “productivity” in humans will go away as lifespan and age (two different things) both increase due to … well, basically due to Enlightenment Now!

    Martha Argerich displays extraordinary ability even at 80.

    Add on for my comment :

    John Williams (composer)
    90 today

  17. “…I think my mind was most capacious and supple, was when I wrote the book Speciation with Allen Orr. It came out in 2004, when I was 54, and when I look at it now, I think, ‘How could I have thought of that stuff?’ I know now that I couldn’t write that book again; my mind is duller and, of course, I’m retired.”

    Reminds me of what Bob Dylan said in a 60 Minutes interview with Ed Bradley in 2004: he’s incapable of writing intricate lyrics like those to “It’s Alright, Ma (I’m Only Bleeding)” now, though there are other things he can do better. (He did not elaborate on what those other things might be. 🙂 )

    Here’s the relevant minute-and-a-half clip from that interview.

  18. I think our host is a good example countering Greer. He wrote ‘Speciation’ with Orr and then WEIT, but then in his sixties (?) he wrote a great, already classic treatise on a somewhat different subject, ‘Faith vs Fact’.
    No, I don’t think he has lost much of his intellectual sharpness; if anything, it has increased.
    Pinker would be another example.
    I find this in myself too: not as keen on ‘difficult details’, but having a better, broader, and sharper view of things. Or maybe that’s just a delusion of early Alzheimer’s kicking in (for me, not our host or Pinker, that is).

  19. I miss Hitch. Whilst he has tended to disappear from the public eye, his idea that extraordinary claims require extraordinary evidence is still one of the most powerful ideas in the battle against stupidity and ignorance.

    1. I believe almost all of Hitch’s books are still in print and continue to sell briskly. (This is certainly the case regarding God Is Not Great.) And, though I do not follow such things nearly so closely, I think his videos on YouTube still get their fair share of views, too.

      Rumors of his waning may be greatly exaggerated. 🙂

  20. I’m pretty sure my best work was a book I published when I was 55, and it was a whole new area for me.

    If anyone’s interested, it’s “The Contradictions of Jazz.”

  21. This is a matter which, for me anyway, stimulates a huge number of thoughts of intense interest. Please pardon the length below!

    One is the fairly easy trichotomy into people who are mostly in a single one of three categories:
    1/ Creators of (approximate) truth in science—writ large—including history and, somewhat less IMHO, the other social sciences; these days almost entirely people labouring in academic institutions, including institutes which may have no enrolled students seeking formal qualifications;
    2/ Creative artists in music, painting, fiction, poetry, etc.; and
    3/ People who put forth ideas of reasonable seriousness, though sometimes aiming more to entertain, to the ‘general’ (not really) public. Nothing worthwhile for me to add here.

    Of course there are those who move in a couple of these categories eventually.

    As for 2/, and in music at least, at the highest level of what I admire, I completely disagree with the thesis of that book, taking Bach, Mozart, Beethoven and Verdi as glaring examples of almost monotone improvement, from already fantastic early stuff, as the years went by—and for me also Mahler. I admit my lack of interest in, and knowledge of, most music less than a century old, except as a temporary diversion.

    I generally agree with the book with respect to 1/, but, per Jerry’s comment, I think logic, mathematics, and theoretical physics all involve ages a bit younger than the book puts forth. I took the trouble to look up a few dates on this:

    Einstein, born 1879, had, by the end of his 30th year in 1909, all but a small number of his biggest ideas (including, e.g., the Equivalence Principle for General Relativity, though its completion was only achieved around 1915 IIRC), the others being from the big 1905 “miracle year”: Special Relativity, Brownian motion (closing the case for the existence of atoms/molecules), and photoelectric phenomena and photons (a huge step towards quantum theory). I think the EPR paper much later has also been huge, so not all was during his initial 30 years.

    Gödel, born 1906, had done his completeness and especially his two incompleteness contributions by his mid-20s, and, I think, the biggest part of the consistency of choice, and especially of the continuum hypothesis, by 1936, i.e., his first 30 years, despite a serious illness intervening.

    I haven’t looked up Maxwell and Newton dates in that vein.

    Schrödinger was fairly young, and Heisenberg and Dirac very young, as was von Neumann, in finally creating quantum mechanics.
    So was Alan Turing. I am thinking of the two places where von Neumann was just a bit too late to do what he was undoubtedly capable of, despite huge contributions there and later; and both of the latter two died much too early.

    In pure math, where I have some professional history personally, the examples of people in their 20s, even younger, doing their best stuff are too numerous to even start listing. But a few personal opinions about the giants ‘recently’:

    Grothendieck did an extraordinary amount, almost entirely in an early 15-year period.
    Serre, another Fields Medalist of the middle late 1900s, got that medal younger than all others, yet now past 90 he may still be active; he certainly was one of several who had a big role, well after age 50 in his case, in the eventual proof of Fermat’s Last Theorem.
    The two others who seem to me the best (perhaps I rather mean those who affected me the most) were especially Atiyah, and also Milnor—again, work in their very early 30s and earlier (and not only their Fields Medal work, but continuing work, though I’d hesitate to compare it with their earlier careers).

    An amusing, and I’m pretty sure true, story about John Milnor has him as a student rushing in late to a Friday undergraduate lecture (by the knot theorist Fox at Princeton, I believe). He saw on the blackboard some problem which he assumed to be in the homework assignment, and so solved it over the weekend. But it turned out to have been, until that weekend, a famous unsolved problem in the field, and it became his (perhaps first) publication, I think in the Annals of Mathematics, the world’s top journal at that time.
    World fame can come very early, and relatively speaking almost instantaneously, in this subject. Milnor produced very many other extremely fundamental contributions, much of this well before the informal age-40 limit on Fields Medal work. Now there’s also the Abel Prize, not just for youngsters’ achievements; all four of the above, I believe, have also got it.

  22. Greer’s hypothesis seems to me to be blatant ageism, propped up with assertions and a few examples. And ageism arises from fear of old age, which promptly comes back to bite us, because if we’re lucky, sooner or later we become the hated Other: an old person. As for the hypothesis, as an old person I ask my favourite question: does it matter? Our host’s take on Greer and all the discussion here is far more interesting to me.
