The supposedly short shelf life of public intellectuals

February 8, 2022 • 9:15 am

The following intriguing post, written by author and independent researcher Tanner Greer, appeared on the website “The Scholar’s Stage”.  Greer’s thesis is that “public intellectuals” (he uses Thomas Friedman as his main example) have a short “shelf life”—the period during which they’re producing truly creative work and ideas that intrigue the public.

Note that this differs from the period of most productive scholarship: a public intellectual becomes one by expounding creative ideas in a popular, easily understood way that influences intelligent people in general. Many academics famous in their fields are not public intellectuals because they don't write for or address the public.

The period of fertility Greer gives public intellectuals is 7-10 years, though, as he admits, their fame can last well beyond their "sell-by" date. In general, though, he sees this creative period of public recognition running through one's thirties, with a falloff after that. Greer does allow that in a few areas, like history, public intellectualism can start later or last longer than a decade.

We all know that most mathematicians and physicists do their pathbreaking work when young, but is this true for public intellectuals as well?  After all, Greer notes that such people usually become “famous” for their academic work before they transition into the more general kind of punditry that characterizes “public intellectuals”.  I’m not sure I agree, but decide for yourself after reading Greer’s piece, which you can see for free by clicking on the screenshot below (note that the piece is two years old but still relevant):

Greer’s contention:

Several months ago someone on twitter asked the following question: which public thinker did you idolize ten or fifteen years ago but have little intellectual respect for today? [1] A surprising number of people responded with “all of them.” These tweeters maintained that no one who was a prominent writer and thinker in the aughts has aged well through the 2010s.

I am not so harsh in my judgments. There are a few people from the last decade that I am still fond of. But the problem is inevitable. This is not a special pathology of the 21st century: when you read intellectuals of the 1910s talking about the most famous voices of the 1890s and early 1900s you get the same impression. You even get this feeling in a more diluted form when you look at the public writing of the Song Dynasty or Elizabethan England, though the sourcing is spottier in those eras, and there was no "public" in the modern sense for an individual living then to intellectualize to. But the general pattern is clear. Public intellectuals have a shelf life. They reign supreme in the public eye for about seven years or so. Most that loiter around longer reveal themselves as oafish, old-fashioned, or ridiculous.

. . . Thus most humans develop their most important and original ideas between their late twenties and early forties. With the teens and twenties spent gaining the intellectual tools and foundational knowledge needed to take on big problems, the sweet spot for original intellectual work is a person’s 30s:  these are the years in which they have already gained the training necessary to make a real contribution to their chosen field but have not lost enough of their fluid intelligence to slow down creative work. By a person’s mid 40s this period is more or less over with. The brain does not shut down creativity altogether once you hit 45, but originality slows down. By then the central ideas and models you use to understand the world are more or less decided. Only rarely will a person who has reached this age add something new to their intellectual toolkit.

He also quotes from a textbook (the specific reference isn't given, but there's a reference list at the end of Greer's piece):

In most fields creative production increases steadily from the 20s to the late 30s and early 40s, then gradually declines thereafter, although not to the same low levels that characterized early adulthood. Peak times of creative achievement also vary from field to field. The productivity of scholars in the humanities (for example, that of philosophers or historians) continues well into old age and peaks in the 60s, possibly because creative work in these fields often involves integrating knowledge that has crystallized over the years. By contrast, productivity in the arts (for example, music or drama) peaks in the 30s and 40s and declines steeply thereafter, because artistic creativity depends on a more fluid or innovative kind of thinking. Scientists seem to be intermediate, peaking in their 40s and declining only in their 70s. Even within the same general field, differences in peak times have been noted. For example, poets reach their peak before novelists do, and mathematicians peak before other scientists do.

Still, in many fields (including psychology) creative production rises to a peak in the late 30s and early 40s, and both the total number of works and the number of high-quality works decline thereafter. This same pattern can be detected across different cultures and historical periods….

Greer’s example:

To give you a sense of what I mean by this, consider the career of a public intellectual whose career peaked in the early aughts. Thomas Friedman is now the butt of a thousand jokes. He maintains his current position at the New York Times mostly through force of inertia, but secondly through his excellent connections within the Davos class and his sterling reputation among those who think as that class does. But this was not always so.

(I rarely read Friedman, so I can’t weigh in here. But I’m sure many readers do and can.)

He reviews Friedman's career, which began with his earning his laurels: two Pulitzer Prizes for journalism (reporting from a war zone) won in his late 20s and 30s. He was not a public intellectual then; that phase began, in Greer's view, in Friedman's early forties, with two Pulitzers under his belt and then a number of books based on his columns, including the famous The World is Flat, which appeared when Friedman was 52. Then, according to Greer, Friedman, though still a public intellectual, had passed his shelf life:

Friedman would close out the decade with another book and three documentaries. These were mostly restatements of his columns (which in turn drew heavily from ideas he first introduced and developed between Lexus and The World is Flat). Friedman was still a part of the national conversation, but his perspective had lost its originality. His columns began to bleed together. This is the era when "Friedman Op-Ed Generators" went viral. Increasingly, Friedman was not argued against so much as joked about. By 2013 or so (just as he was turning 60) Thomas Friedman was done. Not technically so—between then and now he would rack up two more books, hundreds of columns, and heaven knows how many appearances at idea festival panels and business school stages. But intellectually Friedman was a spent force. His writing has been reduced to rehashing old rehashes, his columns the rewarmed leftovers of ideas grown old a decade ago. It is hard to find anything in his more recent books or columns that has mattered. He is able to sell enough books to live comfortably, but you will have difficulty finding anyone under 50 who admits they have read them. Friedman lingers still as a public figure, but not as a public intellectual. His thinking inspires no one. The well has run dry.

Why did this happen? According to Greer, Friedman was no longer living in the world he had inhabited when he was younger. He was no longer a journalist covering a war and drawing his ideas and columns from first-hand experience, but a pundit recycling old ideas and living in an entitled world, jetting around to give lectures. The well had run dry.

According to Greer, what holds for Friedman holds for others as well. Greer offers two reasons for the decay of public intellectuals:

1.) Brain decay. 

I suspect the underlying mechanism behind this pattern is brain cell loss. Neuroscientists estimate that the average adult loses around 150,000 brain cells a day; in the fifty years that follow the end of brain maturation (ca. years 25-75), the average brain will lose somewhere between 5-10% of its neurons.[3] Fluid intelligence begins declining in a person’s 30s.[4] This implies that most humans reach their peak analytic power before 40. Crystal intelligence holds out quite a bit longer, usually not declining until a person’s 60s or 70s. This is probably why historians reach peak achievement so late: the works that make master historians famous tend towards grand tomes that integrate mountains of figures and facts—a lifetime of knowledge—into one sweeping narrative.

He gives little evidence that loss of brain cells is responsible for public intellectuals’ decline (after all, is 5-10% so substantial that it erodes creativity?). And there’s some special pleading here: he says that historians and philosophers are exceptions because their public intellectualism builds on studies and thoughts that “crystallized over the years.” Well, that sounds good, but before we start advancing theories, can we have some data? Remember, though, that he’s not talking about scholarly or academic achievement or excellence, but about the avidity of the public to consume what the person produces. And it’s hard to get data on that. (For scientists, we could get an idea of academic achievement over the years using a number of metrics, including citations. I do that below for myself.)
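As a back-of-envelope check on Greer's numbers (using the commonly cited estimate of roughly 86 billion neurons in the adult human brain, a figure Greer himself doesn't supply), the daily-loss rate he quotes works out to:

150,000 cells/day × 365 days/year × 50 years ≈ 2.7 billion cells
2.7 billion / 86 billion ≈ 3%

That's a cumulative loss of only about 3%, at or below the bottom of Greer's own 5-10% range, which makes it even harder to see how cell loss alone would erode creativity so dramatically.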

2.) Loss of the experiences that produced their renown.  Again Greer uses Thomas Friedman (poor guy!) as an example:

Recognizing this helps us make sense of many interesting aspects of human social life. I think often about Vaisey et al.'s 2016 study, which demonstrated that most shifts in social attitudes occur not through change in the attitudes at the individual level, but through intergenerational churn.[5] Old attitudes die because the generations that hold them literally die off. Such is the stuff of progress and disaster.

Such is also the problem of the public intellectual. A public intellectual’s formative insights were developed to explain the world he or she encountered during a specific era. Eras pass away; times change. It is difficult for the brain to keep up with the changes.

Not impossible, just hard. And this brings my second, sociological explanation into play. There are things that a mind past its optimum can do to optimize what analytic and creative power it still has. But once a great writer has reached the top of their world, they face few incentives to do any of these things.

Consider: Thomas Friedman’s career began as a beat reporter in a war-zone. He spent his time on Lebanese streets talking to real people in the thick of civil war. He was thrown into the deep and forced to swim. The experiences and insights he gained doing so led directly to many of the ideas that would make him famous a decade later.

In what deeps does Friedman now swim?

We all know the answer to this question. Friedman jets from boardroom to newsroom to state welcoming hall. He is a traveler of the gilded paths, a man who experiences the world through taxi windows and guided tours. The Friedman of the 20th century rushed to the scene of war massacres; the Friedman of the 21st hurries to conference panels. What hope does a man living this way have of learning something new about the world?

More importantly: What incentive does he have to live any other way?

What, then, should you do if you aspire to this status—and who does, really? Doesn't it often happen by accident, when someone is just driven to write a book and get their ideas out into the ether? (Greer's emphasis below.)

There are practical implications for all this. If you are an intellectual, the sort of person whose work consists of generating and implementing ideas, then understand you are working against time. Figure out the most important intellectual problem you think you can help solve and make sure you spend your thirties doing that. Your fifties and sixties are for teaching, judging, managing, leading, and dispensing wisdom. Your teens and twenties are for gaining skills and locating the problems that matter to you. Your thirties are for solving them.

Crikey! Does anybody really set out to do this?

When you have a public presence and read a piece like Greer's, your mind automatically reviews your own career. I do not regard myself as a public intellectual, not by any means, but simply as an ex-scientist who has a bit more public presence than other scientists. But my career, both scholarly and in the public eye, doesn't fit Greer's narrative. (Remember, though, that we shouldn't attack his thesis by citing anecdotes, even though he himself adduces little more than anecdotes.)

Perhaps I can say that my scientific creativity peaked in my late forties—around 1989, when I was doing some decent experiments—but if that were the case, my citations by other scientists should have begun dropping by the time I was sixty, around 2009. That was palpably not the case: according to the plot below (this is the first time I've ever seen these data!), I was still doing experiments and the citation rate kept rising. But remember, Greer's talking about public renown, not academic renown. By the time I was thirty—at my supposed creative peak—I was only getting started (note that the plot below begins when I was forty!).

My proudest achievement, from the period when I think my mind was most capacious and supple, was writing the book Speciation with Allen Orr. It came out in 2004, when I was 54, and when I look at it now, I think, "How could I have thought of that stuff?" I know now that I couldn't write that book again; my mind is duller and, of course, I'm retired.

I suppose my "public presence" began in 2009, when I published Why Evolution is True and was also writing a lot of book reviews for well-known venues. But at that time I was 60, and Faith versus Fact came out six years later. The citations below don't measure that kind of "public presence"; I don't know how to measure it, nor do I much care. I'm content with what I've done.

So I don't fit the narrative; my scientific achievements did not come primarily in my thirties, as I was a postdoc at the start of that decade and didn't get a real job until I was about 33. Remember, Greer argues that original intellectual work peaks during one's thirties.

The public intellectuals I do know, mainly Richard Dawkins and Steve Pinker, don’t seem to me to have passed their sell-by date even now, because they’re still followed and read avidly, and their stuff is not mere recycling of old ideas. Steve’s Better Angels book came out when he was 57, and that, whether you agree with its thesis or not, is a work of thought and originality.

Dawkins hews a bit better to Greer's thesis: The Selfish Gene, arguably his most famous book, came out when he was just 35, but since 1976 he's been turning out a number of books that don't recycle ideas from that first book but offer further and different ways to think about evolution. (Richard has said he's most proud of The Extended Phenotype as a work of originality; its thesis is not at all the thesis of the earlier work.)

My favorite “popular” book by Dawkins, The Blind Watchmaker, came out when he was 45, and The Ancestor’s Tale when he was 63. Yes, he’s still writing about evolution, but his biggest burst of fame came when he was 65 with the publication of The God Delusion. I don’t think you can say he’s missed a beat over all that time, and of course he’s entered a new field: anti-theism, though I maintain that with him, as with me, opposition to religion grew out of the frustration of trying to sell evolution to a recalcitrant, religious public. At any rate, he’s still writing about evolution with his new book on the evolution of flight, and he’s still doing pretty much what he was doing forty years ago.

But I digress.  It’s clear that many people decline both academically and as public intellectuals as they age, and part of that may be due to loss of brain cells. But I can’t say that, with Pinker or Dawkins, either has shown obvious loss of neurons! And both continue to be immersed in the worlds that motivated their “publicness.” Maybe, then, the public intellectuals I know well just happen to be exceptions to Greer’s Rule.

In the end, Greer's thesis may be a case of Hitchens's razor: "What can be asserted without evidence can be dismissed without evidence." But I won't be quite that glib, for there are two phenomena (creativity and public engagement) that do generally decline with age, and this deserves some scrutiny.