I’m doing a bunch of reading trying to balance science with non-science (the latter mostly fiction). Click on the book cover reproductions to go to the Amazon links.
Science: Flights of Fancy by Richard Dawkins. It’s a good book and I learned a lot, though it’s centered more on the adaptation of flight than on concepts. However, Richard does bring in a lot of the ideas (the genic viewpoint, adaptive compromise, the necessity of step-by-step adaptation, and so on) from his earlier works. I’m wondering if this isn’t Richard’s equivalent of Darwin’s book on earthworms, where he instantiated with worms his idea that even small changes can add up to big results over a long period of time. The book seems to be written for young adults, but I enjoyed it nonetheless. Nota bene: there will be an “event”
More science: A lot of stuff on the Galápagos to prepare me for lecturing in February. (I prepare well in advance!) It’s good to revisit the old stuff, so I reread these books:
I haven’t read The Voyage of the Beagle for decades, and it was instructive to go through it again. The section on the Galápagos is very short, and I’d forgotten that it didn’t contribute much to The Origin. Reading this and other sources, it’s clear that Darwin did not have an “aha moment” in the islands, and in fact only used them trivially in constructing his big theory. One reason is that he messed up his collecting notes, especially for finches and tortoises, and thus was unable to suss out their meaning until others (notably the ornithologist John Gould) properly classified the species. I was also surprised at how judgmental Darwin was about animals, finding them “odious”, “horrible” and “ugly” (he has very few kind words to say about the marine iguanas!). But he was young: only 22 when he embarked on the 5-year voyage. The book is alternately boring and absorbing, the latter when he comes close to thoughts about evolution (he was a creationist on the trip). There is far too much geology, since Darwin was more engaged with geology than biology when he went aboard.
It’s interesting to contrast the sickly Darwin of his later years with the robust youth described in his journal, a time when he’d camp out in the snow in a blanket and ride all day through swamps in the rain, not to mention hiking up every mountain he saw.
Beagle, which made Darwin’s reputation as a naturalist, was published three years after his return to England, and within five years after that he was a full-fledged advocate of evolution and natural selection. But he was timorous, and only published his ideas when forced to by Wallace’s coincidental discovery of natural selection in 1858. The Origin, which came out in 1859, was published fully 23 years after the Beagle returned to port.
Here’s a good book on the influence of the Galápagos on biological thought, beginning with the discovery of the uninhabited islands in the 16th century and continuing on to the present day. Darwin’s own trip to the archipelago occupies about 1.5 chapters.
And I went back to the Galápagos section of Janet Browne’s magisterial two-volume biography of Darwin: Voyaging and The Power of Place. The first book ends as he arrives home on the Beagle, never to leave England again. This is truly the definitive biography of Darwin, and I was amazed, on rereading the Beagle section, at how much scholarship Browne managed to sneak into a biography that is truly a page turner. You MUST read this book: both volumes.
Fiction: I once again started Donna Tartt’s The Goldfinch, but for some reason found it unreadable: the writing seemed too self-conscious to me. Only rarely do I start a book and not finish it, so I’ll never know how it comes out.
On the advice of two friends, I got the book below instead, which won the Pulitzer Prize for Fiction in 2015—a year after Tartt’s book won the same prize. I’ve just started it, but it’s a parallel narrative of two young people: a blind French girl who flees Paris for Saint-Malo with her father, and a German youth who’s destined to fight for the Nazis. As I’m only a wee bit into it, I can’t pass final judgment, but it hooked me instantly, and I can see why it was so lauded. This one I know I’ll finish.
This is, of course, your impetus to tell us all what you are reading, and whether you recommend it.
Of the several independent assertions that constitute Darwin’s “theory of evolution” in On the Origin of Species, Darwin regarded the idea of natural selection as his most important and original. After all, it alone explained how naturalistic processes could lead to the remarkable adaptations of animals and plants heretofore seen as some of the strongest evidence for God. And although the idea of evolution itself had been broached by others before Darwin, including his own grandfather Erasmus, natural selection seemed to be sui generis.
Well, not entirely. It was anticipated by several people, including the Scottish polymath James Hutton in 1794. But the most remarkable precursor to the idea of natural selection was published by Scottish horticulturalist and agriculturalist Patrick Matthew (1790-1874) as an appendix to his book On Naval Timber and Arboriculture (1831). Although the book was about how to build ships using wood, and what kind of wood to use, Matthew added a 28-page Appendix. In that Appendix were 29 sentences that laid out what he called “selection by the law of nature”, which bore a striking similarity to the idea made famous by Darwin 28 years later.
THERE is a law universal in nature, tending to render every reproductive being the best possibly suited to its condition that its kind, or that organized matter, is susceptible of, which appears intended to model the physical and mental or instinctive powers, to their highest perfection, and to continue them so. This law sustains the lion in his strength, the hare in her swiftness, and the fox in his wiles. As Nature, in all her modifications of life, has a power of increase far beyond what is needed to supply the place of what falls by Time’s decay, those individuals who possess not the requisite strength, swiftness, hardihood, or cunning, fall prematurely without reproducing—either a prey to their natural devourers, or sinking under disease, generally induced by want of nourishment, their place being occupied by the more perfect of their own kind, who are pressing on the means of subsistence.
. . . There is more beauty and unity of design in this continual balancing of life to circumstance, and greater conformity to those dispositions of nature which are manifest to us, than in total destruction and new creation. It is improbable that much of this diversification is owing to commixture of species nearly allied, all change by this appears very limited, and confined within the bounds of what is called Species; the progeny of the same parents, under great difference of circumstance, might, in several generations, even become distinct species, incapable of co-reproduction.
The self-regulating adaptive disposition of organized life may, in part, be traced to the extreme fecundity of Nature, who, as before stated, has, in all the varieties of her offspring, a prolific power much beyond (in many cases a thousandfold) what is necessary to fill up the vacancies caused by senile decay. As the field of existence is limited and pre-occupied, it is only the hardier, more robust, better suited to circumstance individuals, who are able to struggle forward to maturity, these inhabiting only the situations to which they have superior adaptation and greater power of occupancy than any other kind; the weaker, less circumstance-suited, being prematurely destroyed. This principle is in constant action, it regulates the colour, the figure, the capacities, and instincts; those individuals of each species, whose colour and covering are best suited to concealment or protection from enemies, or defence from vicissitude and inclemencies of climate, whose figure is best accommodated to health, strength, defence, and support; whose capacities and instincts can best regulate the physical energies to self-advantage according to circumstances—in such immense waste of primary and youthful life, those only come forward to maturity from the strict ordeal by which Nature tests their adaptation to her standard of perfection and fitness to continue their kind by reproduction.
Well yes, that has variation, differential survival, culling of most individuals in a species, speciation, and adaptation—all features of Darwin’s own theory. It’s a remarkable anticipation of Darwin’s ideas.
Does this mean that Matthew deserves credit for the idea of natural selection? Only as an anticipation of Darwin’s far more thorough explication (Darwin, by the way, never read Matthew’s Appendix). Matthew deserves no more credit for natural selection as a popular idea than does Erasmus Darwin for evolution. Matthew’s ideas weren’t adopted, were almost never cited, had no influence in biology, and Matthew never realized until after The Origin was published (and sold out the printing in a single day) that he once had within his grasp The Big Idea that explained the design-like features of nature.
Nevertheless, several people have tried to diminish Darwin’s idea by pointing out that Matthew had it first—and that Darwin plagiarized it. The latest attempt is by Mike Sutton in this book published two months ago (click on image to go to Amazon link):
I haven’t read it, but according to Geoff Cole, a cognitive scientist at the Centre for Brain Science at the University of Essex, who reviewed the book in the latest issue of Evolution (click below for free access), Sutton’s book is a real hit job on Darwin.
The title of Sutton’s book clearly asserts that Darwin took credit for Matthew’s theory, and it’s true that once Patrick Matthew had read The Origin, he argued for his own precedence, even though Darwin had never seen the “incriminating” sentences above. Sutton also claims that Matthew’s idea had real priority because Naval Timber was cited by others before 1859, but as Cole notes in a very critical but polite review, those citations were almost all to the book itself, not to the ideas in the Appendix.
Cole also notes Sutton’s ridiculous accusations of Darwin’s “plagiarism”:
What is most uncomfortable about Sutton’s thesis is his treatment and personal attack on Darwin. He suggests that Darwin “was a plagiarist who lied repeatedly” and undertook “deliberate, knowing fraud”. Indeed, “the biggest science fraud in history”; fraud that Darwin supposedly hoped “nobody would notice”. Sutton also expresses suspicion about the chronic illness Darwin was known to suffer; a subject that many historians have written about (e.g., Hayman, 2009). From every single account of Darwin and how he went about his life, these “lies” are the complete opposite of what we know about the man. I have lost count of the number of times I have seen a scholar write that a particular event “is testament to his honesty”. As Browne (1985) stated, “By the time Descent of Man was published in 1871 reviewers were falling over themselves to congratulate Darwin’s ‘unassailable integrity and candour’ and his ‘wonderful thoroughness and truthfulness’” (Browne, 1985, pp. 257 & 258).
Every serious historian who’s studied Darwin’s life knows that he was neither a plagiarist nor a liar, although he did, understandably, want to preserve credit for his own ideas. After Matthew wrote a claim of his priority in The Gardeners’ Chronicle in 1860, Darwin not only published an acknowledgement of Matthew’s precedence in the same magazine, but also inserted this long acknowledgment of Matthew’s work into the 3rd edition of On the Origin of Species:
In 1831 Mr. Patrick Matthew published his work on ‘Naval Timber and Arboriculture,’ in which he gives precisely the same view on the origin of species as that (presently to be alluded to) propounded by Mr. Wallace and myself in the ‘Linnean Journal,’ and as that enlarged on in the present volume. Unfortunately the view was given by Mr. Matthew very briefly in scattered passages in an Appendix to a work on a different subject, so that it remained unnoticed until Mr. Matthew himself drew attention to it in the ‘Gardener’s Chronicle,’ on April 7th, 1860. The differences of Mr. Matthew’s view from mine are not of much importance: he seems to consider that the world was nearly depopulated at successive periods, and then re-stocked; and he gives, as an alternative, that new forms may be generated “without the presence of any mould or germ of former aggregates.” I am not sure that I understand some passages; but it seems that he attributes much influence to the direct action of the conditions of life. He clearly saw, however, the full force of the principle of natural selection. In answer to a letter of mine (published in Gard. Chron., April 13th), fully acknowledging that Mr. Matthew had anticipated me, he with generous candour wrote a letter (Gard. Chron. May 12th) containing the following passage:—”To me the conception of this law of Nature came intuitively as a self-evident fact, almost without an effort of concentrated thought. Mr. Darwin here seems to have more merit in the discovery than I have had; to me it did not appear a discovery. He seems to have worked it out by inductive reason, slowly and with due caution to have made his way synthetically from fact to fact onwards; while with me it was by a general glance at the scheme of Nature that I estimated this select production of species as an à priori recognisable fact—an axiom requiring only to be pointed out to be admitted by unprejudiced minds of sufficient grasp.”
Cole explains patiently why Darwin should get nearly all the credit for the idea of natural selection. A few excerpts from Cole’s excellent review:
Who then should be credited with discovering the process by which evolution occurs? Matthew, Hutton, Maupertuis, Wells? Or anyone else who also chipped in? The answer is simple. Charles Darwin.
. . . A necessary condition of insight is that the knowledge must be reflected upon and placed within the appropriate context. Unless a person fully recognises what they have said, done, or found, no formal insight has occurred. There is no priority.
. . . I suspect Matthew was annoyed with himself, as I was with myself, for not realising the importance of what he had written. That may have been why he dedicated so much of his later efforts on his priority claim. If he had realised he would surely have submitted an academic paper outlining his theory; a paper that was only about the theory. Given fear of religious establishment, this could have initially been anonymously penned. He may have even published a book on the origin of all life forms and how the development of every single species can be explained. He would have also repeatedly used his phrase “the process of natural selection”, a phrase Sutton places great emphasis on, as opposed to the one time he did so in Naval Timber. As it was, there was no paper or book. There was no in-depth development of ideas about evolution and how it relates to divergence, heredity, the geological record, geographic distribution, classification, morphology, and embryology. No lengthy discussion of how there are problems and “difficulties” with his own theory. There was not 30 years of methodical work in which he used his theory to explain aspects of cross-pollination and movement in plants, not to mention work on human psychology, sexual behaviour, and emotions. There were no lengthy and numerous discussions with colleagues about his theory and when he should go public.
In fact, Sutton acts like a creationist, arguing that generations of evolutionary biologists have realized that Matthew should really get credit for the idea; but we have, because of our mindless adulation of Darwin, kept that quiet:
Essentially, Sutton has to explain why generations of evolutionary biologists and the like have never come to the same conclusion as himself. The usual explanation is that we are all involved in a “cover up” (p. 5) or part of the “Darwin Industry”, as Sutton calls it, in which a “loosely affiliated in-group of scientists, historians of science, other writers, publishers, editors, and journals, share a common goal to protect the perception of Charles Darwin as a genius science hero” (p. 10). But how about this for an alternative explanation? Those generations of biologists have independently decided that there is nothing to see here, that Darwin should be honoured with discovering evolution. Furthermore, if a few sentences in which natural selection is referenced warrants priority, as Sutton seems to believe, then why pick out Patrick Matthew? Why not his predecessors, Hutton, Wells, or Maupertuis? In fact, shouldn’t Matthew be accused of plagiarism, having failed to acknowledge the fact that his “own original child” was described at least 30 years before by various others?
Sutton’s book is the latest in his decade-long attempt to undermine Darwin’s priority. Like all the others before it, this one will fail.
Of that there’s no doubt. Matthew’s independent musings about natural selection are a remarkable coincidence, but he didn’t make much of them, didn’t examine them further, and certainly didn’t try to integrate them into a grand theory of organic evolution. But judge for yourself: I hope you’ve read The Origin, so just peruse Matthew’s brief discussion and then ask yourself whether Matthew should get the lion’s share of the credit for the idea of natural selection.
One brief correction of Cole’s fine review: on its first page it describes Darwin as being “the ship’s naturalist” on the voyage of the Beagle. That’s a common misconception, for an “official” naturalist—the ship’s surgeon Robert McCormick—had already been designated. Darwin sailed on the Beagle at his own expense, and his position was that of both a “self-funded naturalist” and the “captain’s companion”. He was taken aboard largely to provide gentlemanly company for Captain FitzRoy, with whom he dined and conversed. Darwin’s researches and collections during the voyage were made of his own volition and enthusiasm.
This is the fifth book I’ve read by Kazuo Ishiguro, who in my view has a strong claim to be the finest novelist writing in English. That’s unusual because his first language is Japanese, yet, like Joseph Conrad or Isak Dinesen, he’s overcome whatever strictures a different first language imposes: his prose is not only indistinguishable from a native speaker’s, but excellent even by the standard of native-born writers. In Ishiguro’s case, this may come in part from his moving to England when he was about five; he’s visited Japan only sporadically since then.
Ishiguro was awarded the Nobel Prize for Literature in 2017—before this book came out. (I recommend listening to Ishiguro’s wonderful 45-minute Nobel Prize lecture here.)
I was, as you know, trying to read all the Booker Prize novels I haven’t read, which are many, but I took a break, on a friend’s advice, to read Klara and the Sun (2021), Ishiguro’s latest novel.
At any rate, of the five Ishiguro books I’ve read, two are great masterpieces (The Remains of the Day and Never Let Me Go), and this book, which was nominated for a Booker Prize but didn’t win, is very good but not a masterpiece of world literature. It comes close, though, and I recommend it very, very highly. (By the way, if you haven’t read the two I recommended above, do so immediately, and then go watch the movies made of both. The movies are fantastic; I actually saw both before reading the books, and the disquieting film Never Let Me Go is what got me onto Ishiguro.)
In the short video on the Amazon Page, Ishiguro touts his book almost reluctantly, but does say this:
It’s probably not inaccurate to say that Klara and the Sun is positioned in terms of its imaginary world and its approach somewhere midway between Never Let Me Go and Remains of the Day.
You can (and should) order the book; the screenshot below goes to the Amazon page:
I don’t want to recount the entire plot, which would reveal many spoilers (you can see it on the Wikipedia page), but I do want to say a few things. Why is it positioned between those two books? The subject and narrator of the book is a “robot”, an “AF” (artificial friend) named Klara. She is purchased by the mother of Josie, a chronically ill teenager, to be Josie’s friend and prepare her for college. (AFs were made in this tale to relieve loneliness.) The story, like that of Never Let Me Go, takes place in a not-too-distant dystopian future, and the first-person narration, as in The Remains of the Day, is also presented by a servant (in this case the AF, in the other book the head butler), both of whom come to realizations about life and humanity during the story.
Because the narrator of this book is an AI in a presumably humanoid body (Klara has hair, legs, and other organs, but they’re never explicitly described), the language used is simple, the way a newly made and naive robot entering the world would write. Moreover, Klara doesn’t perceive her surroundings the way humans do: she is so constructed that most of the time she sees the world as a series of boxes in which separate images appear. As the book progresses, you learn more and more about what’s going on, just as you do in Never Let Me Go. There is, for example, the practice of being “lifted”, which you must undergo to go to college, but only halfway through the book do you learn what that means. And Klara learns more and more about humans and her environment; before being purchased, she’d been awaiting a customer in an AF shop.
There are many themes in this book, the main one being Klara’s attempt to understand what it is to be human, and especially to be Josie. Klara was obviously instilled with some “feelings”—or simulacra of feelings like empathy (don’t ask me how)—in the factory, but the subtleties of human emotion have to be learned over time. Fortunately, Klara is a keen observer, and at the end of the book reviews her experiences and what she’s learned. It’s not clear whether she has real inner experiences—though it seems she does—or whether her AI program is just being updated.
The main emotion she can’t master is love, and she’s always inquiring of Josie and her paramour, Rick, if they are in love. I’ll leave the resolution to you except to say that the book, although it could be considered science fiction, is an absolutely serious novel that inquires “What does it mean to be human?” Klara is also religious in her own way, worshiping the Sun as a god because she herself apparently runs on solar power. She thus attributes the healing power of the Sun to everything, and has her own form of paying homage to it, and even some prayers to the Sun. This plays a crucial role in the book.
The rest I leave to you so you can have the great pleasure of reading this short book (about 310 small pages). I give it two thumbs and two big toes up. Of all the dreck on the market today, this stands out. As I said, it’s not a classic novel, but it comes damn close.
Here’s an 8½-minute video of Ishiguro talking about this book, noting that it started as a children’s book (and was inspired by picture books for children), but quickly expanded into a novel dealing with artificial intelligence.
I may have mentioned Jason Rosenhouse‘s new book before, but I just finished it and wanted to give it two thumbs up. The image below links to the Amazon site:
This book is a withering critique of the so-called “probabilistic” arguments against evolution promoted by Intelligent Design advocates like Michael Behe and William Dembski. Jason is ideally equipped to write about them as he’s both a professor of mathematics at James Madison University and a diligent reader of creationist and ID literature. An earlier book of his, Among the Creationists: Dispatches from the Anti-Evolutionist Front Line, describes his many visits to creationist meetings and gatherings and his attempt to suss out the psychology of anti-evolutionists without being judgmental.
In this book, though, Jason pulls no punches, analyzing and destroying the arguments that evolution simply could not have occurred because the probability of getting organisms, proteins, or “complex specified information” is too low to be explained by materialistic processes. Ergo, the ID arguments supposedly point to the existence of the Intelligent Designer, whom we all know is God. (IDers like to pretend that it could be a space alien or the like, but it doesn’t take much digging to descry the religious roots of ID, sometimes described as “creationism in a cheap tuxedo”.) The real target of ID is not just evolution, but naturalism or “materialism”, as they sometimes call it. Their ultimate goal is to sneak religion into public schools and into mainstream science. But they’ve already lost.
Still, the religious motivations aren’t important when the calculations are wrong—or rather, can’t be made. Jason’s main point in this book is that although complex-looking mathematics is often invoked to show the improbability of naturalistic evolution, IDers lack information about the probability space to plug into their equations, so they can’t come to any mathematically-based conclusions. (And when we do have information, like that bearing on the claim that evolution violates the Second Law of Thermodynamics, or that the evolution of chloroquine resistance in the malaria parasite is impossible, that evidence doesn’t support the IDers’ and creationists’ claims.)
In the end, all IDers do, argues Jason, is throw sand in the layperson’s eyes with fancy equations, and then simply assert, without actually making valid calculations, that evolution by natural selection is too improbable to have occurred.
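To see concretely why a one-shot improbability calculation misleads, here’s a minimal sketch of my own—in the spirit of Dawkins’ famous “weasel” program, not code from Jason’s book—contrasting the vanishingly small probability of a target string arising in a single random draw with the modest number of generations cumulative selection needs to reach it:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def single_shot_probability():
    # The "tornado in a junkyard" calculation: probability that all 28
    # characters come up right in one random draw, (1/27) ** 28.
    return (1 / len(ALPHABET)) ** len(TARGET)

def cumulative_selection(seed=0, pop=100, mut=0.05):
    """Generations needed when each round keeps the best mutant."""
    rng = random.Random(seed)
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        # Each offspring copies the parent, mutating each character
        # with probability `mut`.
        offspring = [
            "".join(c if rng.random() > mut else rng.choice(ALPHABET)
                    for c in parent)
            for _ in range(pop)
        ]
        # Selection: keep the child that matches the target best.
        parent = max(offspring,
                     key=lambda s: sum(a == b for a, b in zip(s, TARGET)))
        generations += 1
    return generations

print(f"single-shot probability: {single_shot_probability():.3e}")
print(f"generations under cumulative selection: {cumulative_selection()}")
```

The one-shot probability is about 10⁻⁴⁰, while selection typically finds the target in well under a few hundred generations. The toy model’s parameters (population size, mutation rate, even the idea of a fixed “target”) are arbitrary choices for illustration, which is exactly the point: without real values for such parameters, improbability equations compute nothing about actual evolution.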
Other arguments that are less mathematical, for example that bacterial flagella could not have evolved in an adaptive, step-by-step process, are also discussed, and Jason shows how they’ve been refuted.
Another admirable aspect of the book is that Jason writes very clearly and elegantly, so it’s easy to read. Here are two specimens of his prose that also make his main point:
What about specificity? Dembski’s theoretical development of this concept essentially required graduate-level training in mathematics. He helped himself to copious amounts of notation, jargon, Greek letters, and equations. Anyone unaccustomed to wading through prose of this sort could easily come away thinking it represented work of depth and profundity just from the level of technical detail in its presentation.
However, when it came time to discuss the specificity of an actual biological system, the flagellum in this case, all of the technical minutiae went clean out the window. For all the use Dembski made of his elaborate theoretical musings, they might as well never have existed at all. He just declared it obvious that the flagellum was specified and quickly moved on to other dubious claims. At no point did he attempt to relate anything in reality to the numerous variables and parameters he included in his mathematical modeling.
As Jason shows, the lack of the parameters needed to show that evolution is too improbable to have happened in a Darwinian way is a ubiquitous problem for ID. One more quote:
This pattern, of introducing difficult mathematical concepts without ever really using them for any serious purpose, is ubiquitous in anti-evolution discourse, and this fact goes a long way to explaining why mathematicians and scientists are so disdainful of it. Professionals in these areas strive for the utmost clarity when presenting their work. Used properly, the jargon and notation permit a level of precision that simply cannot be achieved with more natural language. This might seem hard to believe, since a modern scientific research paper will be unreadable for anyone without significant training in the relevant discipline. But the problem is not a lack of clarity in the writing. Rather, it is just that the concepts involved are difficult, and experience is needed to become comfortable with them.
. . . In section 2.6, I remarked that anti-evolutionist arguments play well in front of friendly audiences because in that environment the speakers never pay a price for being wrong. The response would be a lot chillier if they tried the same arguments in front of audiences with the relevant expertise. Try telling a roomful of mathematicians that you can refute evolutionary theory with a few back-of-the-envelope probability calculations, and see how far you get. Tell a roomful of physicists that the second law of thermodynamics conflicts with evolutionary theory, or a roomful of computer scientists that obscure theorems from combinatorial search have profound relevance to biology.
You will be lucky to make it ten minutes before the audience stops being polite.
If you want a clear and convincing response to IDers’ (and earlier creationists’) claims that evolution could not have happened without God or a Designer because it’s simply improbable via naturalism, read this book.
It will convince you, as Laplace supposedly tried to convince Napoleon about astronomy, that science—in this case, evolution—has no need of the God hypothesis.
I’ve finished my second novel in my quest to read all the Booker Prize winners (I read eight before I started this odyssey). The latest one was The God of Small Things by Arundhati Roy. Published in 1997, it won the Booker the same year.
This is just a mini-review as I’m still processing the book. To avoid spoilers, I’ll be brief.
The plot involves a pair of fraternal twins (boy and girl) growing up in Kerala, India, in the 1960s in an extended family of Christians. The story, which jumps back and forth in time over about 25 years, touches on many aspects of Indian life, including the Communism of Kerala (India’s only Communist state), the caste system (still very much alive then), and the ambiguity of love. The central story is the close relationship between the boy, Estha, and his sister Rahel, which borders on romantic love. Their close relationship with each other ultimately culminates in a not-so-wonderful act of incest near the end. Both are also involved in a close but platonic relationship with a “dalit”, or untouchable, who befriends them and ultimately becomes the lover of their mother. That forbidden love triggers the shocking ending of the book.
When I was about halfway through this book, I found the writing so atrocious—with the author seemingly trying to show off by capitalizing words to emphasize them, using the same phrase over and over again with slight variations, and spelling words in weird ways (“Gnap” for nap)—that I almost put the book down. Even though the peculiarities of spelling and grammar may reflect the view of a child, they are ultimately annoying.
It is only near the end, when the family finds out that the twins’ mother is involved with a dalit, that things pick up, the bad writing falls away, and the story becomes mesmerizing. It is this contrast between the slow and annoying bulk of the book and the absorbing finale that makes me unable to pass judgment on the book right now. I’m glad I read it, but I don’t know if it merits a Booker.
“The God of Small Things”, by the way, is a leitmotif, reflecting the fact that seemingly insignificant occurrences can have huge consequences.
My advice is “read it”, because many critics loved the thing, though some felt the way I do: conflicted at best.
I’m ready for my next Booker Book, but I’ve been sidetracked because a good friend just finished Ishiguro’s new novel and wrote me this about it:
I read Ishiguro’s latest recently, Klara and the Sun. It is, again, a superb and profound (and profoundly touching) novel.
My friend has excellent taste in literature, so I’ve asked the U of C’s Interlibrary Loan to get me a copy (I have no room on my shelves for more books). Meanwhile, I’m pondering readers’ latest suggestions for Booker winners. I’ll have to read Klara first because I simply love Ishiguro. Klara and the Sun was “longlisted” for the 2021 Booker Prize, but I’m starting to find out that not all Booker winners are masterpieces, at least in my view.
In my quest to read all the novels that have won the Booker Prize, I finished my first one on the list (I read four or five others in the past): Disgrace, by J. M. Coetzee (1999). Highly lauded, it not only won the Booker Prize, but was a major impetus for Coetzee’s getting the Nobel Prize for Literature in 2003.
Not wanting to convey spoilers, I’ll just say that the story involves two parallel cases of “disgrace”: a professor who has an affair with one of his students and then, refusing to defend himself, is fired; and the professor’s daughter, with whom he goes to live after his expulsion. She too meets up with an even worse fate, but again refuses to take steps to assure herself a better life after her own misfortune. In both cases I take “disgrace” to mean a character’s refusal to try to mend a broken life.
I found the novel readable (not saying a lot) and the plot engrossing, but the prose plodding. When I really love a novel, the prose has to be a major factor, just as the musicality of poetry is an inseparable part of its overall effect. When the Nobel Prize committee gave Coetzee its citation, it said, among other things, this:
J.M. Coetzee’s novels are characterised by their well-crafted composition, pregnant dialogue and analytical brilliance. But at the same time he is a scrupulous doubter, ruthless in his criticism of the cruel rationalism and cosmetic morality of western civilisation. His intellectual honesty erodes all basis of consolation and distances itself from the tawdry drama of remorse and confession. Even when his own convictions emerge to view, as in his defence of the rights of animals, he elucidates the premises on which they are based rather than arguing for them.
Coetzee’s interest is directed mainly at situations where the distinction between right and wrong, while crystal clear, can be seen to serve no end. Like the man in the famous Magritte painting who is studying his neck in a mirror, at the decisive moment Coetzee’s characters stand behind themselves, motionless, incapable of taking part in their own actions. But passivity is not merely the dark haze that devours personality, it is also the last resort open to human beings as they defy an oppressive order by rendering themselves inaccessible to its intentions. It is in exploring weakness and defeat that Coetzee captures the divine spark in man.
. . .In Disgrace Coetzee involves us in the struggle of a discredited university teacher to defend his own and his daughter’s honour in the new circumstances that have arisen in South Africa after the collapse of white supremacy. The novel deals with a question that is central to his works: Is it possible to evade history?
Well, maybe, but I’m not sure what “evading history” means here. Overcoming misfortune, or simply denying it? Both the professor and his daughter are stuck with what happened to them, and refuse to move on, so if that’s “evading history,” well, it’s not something that everyone does. In fact, Coetzee’s failure to make me really understand why the professor refused to defend himself seems a failure of character delineation.
The theme, then, isn’t anything near universal, at least to me. One can understand the feeling of ennui and hopelessness in The Plague, which can be redeemed by caring for others, but that’s something that resonates with many readers. After all, we’re all mortal.
So my take is that Disgrace is a good book, but not a great one—not near the quality of other winners like those by Ishiguro, Barker, or Scott. Yet sometimes I think I don’t know how to read novels, and perhaps I’m missing the subtle themes and qualities that struck other critics, the Booker Prize committee, and the Swedish Academy.
But nevertheless, I persist. I am halfway through the next choice, The God of Small Things, by Arundhati Roy, which won the Booker in 1997. It’s about a pair of twins and their extended family in Kerala, India, and is set in the 1960s. So far I’ve found the plot engaging (it goes back and forth in time) but the prose overly baroque, with Roy apparently trying to show off her writing skills. At times the writing is so self-consciously “clever” that it makes me cringe. The style, while unusual, doesn’t seem to grow out of the author herself, as it does with writers like Cormac McCarthy, whom I love.
By the way, I noticed a similarity between Roy’s opening paragraph and part of Thomas Wolfe’s “Poem to October” from his novel Of Time and the River:
“May in Ayemenem is a hot, brooding month. The days are long and humid. The river shrinks and black crows gorge on bright mangoes in still, dustgreen trees. Red bananas ripen. Jackfruits burst. Dissolute bluebottles hum vacuously in the fruity air. Then they stun themselves against clear windowpanes and die, fatly baffled in the sun.”
“October is the richest of the seasons: the fields are cut, the granaries are full, the bins are loaded to the brim with fatness, and from the cider-press the rich brown oozings of the York Imperials run. The bee bores to the belly of the yellowed grape, the fly gets old and fat and blue, he buzzes loud, crawls slow, creeps heavily to death on sill and ceiling, the sun goes down in blood and pollen across the bronzed and mown fields of old October.”
Any more suggestions for Booker winners I should read next?
I’m a sucker for lists of what people are reading, as it tells us something about them and also can be a source of good things to read. I suppose, though, that when a famous person is asked what books they’re reading, they may well pad the list with books that make them look more serious and intellectual.
But I don’t think that’s the case with this list from Blinkist Magazine of nine books that Elon Musk found extremely influential in his life. Now Blinkist seems a bit slippery to me, since its mission appears to be to distill long books down into bite-size 15-minute audio bits that can help you succeed. And it’s all about what will help you get ahead in life, rather than books that could change your point of view.
Nevertheless, this is a genuine list of books that Musk reads, and it says he “reads a lot”:
Elon Musk, the billionaire CEO of SpaceX, Tesla, and other game-changing tech companies, somehow finds time to read a lot of books when he’s not sending rockets into space. From classic sci-fi works to complex studies on artificial intelligence, Musk credits books with helping him achieve his success. In fact, when asked how he learned to build rockets, he famously replied, “I read books.”
But to tout its Reader’s Digest-like format, Blinkist also adds at the beginning:
According to a study by the Bureau of Labour Statistics, most Americans find time to read just 17 minutes per day. At that rate, it could take you more than a month to read one of Musk’s recommended non-fiction titles.
OH NO! Well, why not try reading more than 17 minutes a day? And noting that Americans can’t “find the time” to read even 17 minutes a day doesn’t impress me much. Think of the hours that the average person spends online or in front of the telly.
But I fulminate. Here’s the list of the books Musk recommends. The article gives a short paragraph on each, which I won’t reproduce (click screenshot to read). I’ve added the Amazon link to each book, and also note whether I’ve read it:
1.) Steve Jobs by Walter Isaacson. A biography; I haven’t read it.
3.) Zero to One by Peter Thiel with Blake Master. It’s about how to build a business; I haven’t read it.
4.) Merchants of Doubt by Naomi Oreskes & Erik M. Conway. It’s about disinformation and environmental issues, and I haven’t read it.
5.) Life 3.0 by Max Tegmark. Another book about AI, and one I haven’t read.
6.) The Big Picture by Sean M. Carroll. Now I’m impressed, as this has no business relevance but shows pure intellectual curiosity on Musk’s part. I have read it, and liked it.
7.) Lying by Sam Harris. Another impressive book; I have read it. While it’s not one of Sam’s best (I disagree with his view that it’s never okay to lie), it’s nevertheless a thoughtful work. But had Musk wanted to recommend a short life-changing book by Harris, I would have preferred Free Will.
9.) The Wealth of Nations by Adam Smith. I’m ashamed to admit that I haven’t read it, but it’s the one real “classic” on Musk’s list.
Now remember, these are books that Musk says could “change your life”, and I suspect he means that largely in a vocational sense.
I could make a list of nine or ten books that changed my life, but could not ever guarantee that they’d change yours (one of mine would be Zorba the Greek by Nikos Kazantzakis). But I will divulge the two books I’m reading now (I usually read one at a time):
1.) What is Real? The Unfinished Quest for the Meaning of Quantum Mechanics by Adam Becker. This is an absorbing book that I picked up in my lifelong and desperate quest to understand something that I’ll never grasp. Yes, I know the phenomena, but this book is about whether quantum mechanics is simply a useful mathematical apparatus for predicting things, or actually describes a real, underlying world. So far I’m a third of the way through, and don’t know the answer.
2.) People Love Dead Jews: Reports from a Haunted Present by Dara Horn. I’ve just begun this, and don’t have much to say about it yet. Horn is a prizewinning novelist, but here she tackles the striking fact that whenever she’s asked to write about Jews (she is Jewish), it’s always about dead Jews, as in the Holocaust. This seeming affection for ex-Jews contrasts with the rising anti-Semitism Horn sees in the present, and the fact that she’s not asked to write about living Jews.
I also just finished a book that a reader recommended: the 1400-page doorstopper A Suitable Boy by Vikram Seth (a good travel book). I thought it was very good, though it could have used a bit of pruning, especially in the sections about politics. Also, the main character, Lata, never seems to come to life in the way that some of the other characters do. Nevertheless, I enjoyed the hell out of it, finished it, and am grateful for the suggestion.
So, this is your cue to let us know what you’re reading, and whether you recommend it.
The tweet below from Steve Pinker, which is spot on, brought this NYT book review to my attention. Once again, the paper dilates on the supernatural without any warning to the reader that there’s no evidence for the efficacy of “precognition”—being able to see into the future, a form of extra-sensory perception (ESP). Yet the book review implies that there might be something to it.
Here’s the tweet:
Yet again the NYT treats the paranormal credulously, failing to indicate clearly that it does not, um, exist. Harnessing ESP to Forestall Death and Disaster https://t.co/WugzOVR5yj
The NYT piece reviews this book (note the title). The account may be true, but the “death foretold”? Fuggedabout it. (Click to go to Amazon link):
The author of this bit of clickbait is W. M. Akers, whose bona fides, as given by the NYT, are these: “W.M. Akers is the author of ‘Westside,’ ‘Deadball: Baseball With Dice’ and the newsletter ‘Strange Times.’ His most recent novel is ‘Westside Lights.’”
The review (click to read):
Knight’s book tells the story of a British psychiatrist named John Barker, who was drawn to the supernatural and especially to precognition. He thought that if he could suss out credible instances of people foreseeing disasters, he might be able to prevent those disasters (think of a non-crime version of the precogs in “Minority Report”). Here’s one instance of precognition that got Barker’s juices flowing:
On Oct. 21, 1966, Lorna Middleton woke up choking. The sensation passed, leaving behind melancholy and a sense of impending doom. After a lifetime of experiencing premonitions of misery and death, Middleton, a North London piano teacher, recognized the signs. Something hideous was on the way.
A few hours later, workers on a heap of coal waste in South Wales watched with horror as the 111-foot tower of “spoil” collapsed and cascaded down the mountain toward the village of Aberfan — thousands of tons of slurry and rock bearing down on the primary school. It was just past 10 in the morning and the classrooms were full of students doing spelling exercises, singing songs, learning math. When a 30-foot wave of refuse slammed into the building, they were buried alive. One hundred and forty-four people died that day. One hundred and sixteen were children, most between 7 and 10 years old. It was the sort of horror that makes people demand meaning — the sort for which meaning is rarely found.
Now that’s not a very precise example of precognition, is it? In fact, thousands of people probably had bad dreams that night, and where is the coal spoil in Middleton’s nonspecific dream? This seems like nothing more than pure coincidence. And how could precognition work, anyway? This doesn’t appear to be a subject of much interest to Knight—or Akers.
But in fact coincidence is what Barker was trawling for, looking for cases in which real “precogs” could be used in a practical way. If only the exact nature of the disaster could be predicted! Barker got to work, teaming up with Alan Hencher, a postal employee whose migraine headaches were supposed to predict disasters (but of what sort?), and Lorna Middleton, who had the bad dream that was followed by the coal-spoil avalanche:
Barker used his connections at The Evening Standard to solicit premonitions of the disaster. He found 22 he believed credible, including Middleton’s — he believed any vision accompanied by physical symptoms to be particularly strong. On the back of this research, The Standard recruited Barker to create a standing “Premonitions Bureau” that could catalog predictions and check to see how many came true. The Standard brass saw it as an offbeat way to sell papers. Barker considered it his chance to save the world.
“He wanted an instrument that was sensitive enough to capture intimations that were otherwise impossible to detect,” writes Knight. “He envisaged the fully fledged Premonitions Bureau as a ‘central clearinghouse to which the public could always write or telephone should they experience any premonitions, particularly those which they felt were related to future catastrophes.’ Over time, the Premonitions Bureau would become a databank for the nation’s dreams and visions — ‘mass premonitions,’ Barker later called them — and issue alerts based on the visions it received.”
So were any disasters averted? Nope, of course not. What we got is what we expected: there were dozens of premonitions, and some of them roughly matched something that happened, but most (more than 97%) did not. And even when they didn’t match, Barker and company stretched the premonitions so they’d be sort-of true:
In the first week of 1967, Barker and the Standard staff began sorting predictions into categories like “Royalty,” “Racing,” “Fire” and “Non-specified disasters.” (The science correspondent Peter Fairley often drew on the racing file for betting tips.) Once categorized, they would wait to see what happened, and attempt to connect the tragedies on the news page with the prophecies in their files.
Along with Alan Hencher, a postal employee whose migraines seemed to anticipate disaster, Middleton became Barker’s best source. He greeted her successful predictions with glee. When the death of the astronaut Vladimir Komarov bore out her warning of peril in space, Barker wrote to say, “You were spot on. Well done!” When Bobby Kennedy was assassinated after months of her warning that his life was in danger, Barker called it her best work yet.
But what about Middleton’s unsuccessful predictions—her “worst work”? The review says nothing, except that they fudged the unsuccessful guesses to make them seem more accurate:
If they sometimes had to stretch to make the news fit what Middleton and her fellows dreamed up — letting tornadoes in the Midwest satisfy a prediction of catastrophic weather in California, for instance — Barker saw no problem. He was overjoyed with the success of his star psychics and hoped to scour the country to find more like them. He believed second sight was as common as left-handedness. It didn’t matter that the premonitions were rarely specific enough to be useful, warning simply of a train to derail somewhere, an airliner to crash at some point. Barker believed he was onto something cosmic.
Rarely specific enough to be useful? Why don’t they give us one instance in which a prediction was useful, and evidence that the predictions that proved accurate were more common than could be accounted for by coincidence (e.g., was there one person whose precognitions were almost invariably accurate)? If this worked, that person would have won a million bucks from James Randi (nobody ever did). Even according to the author’s count, only 3% of the predictions “came true” (mostly from Middleton and Hencher). And that, I’m sure, is stretching it.
The text here gives one no assurance that anything other than coincidence was involved. To make a scientific and definitive statement about the efficacy of precognition, you’d need a rigorous and accurate set of tests, tests incorporating fraud-detectors like James Randi. There are no such tests that have proved successful. The NYT does not mention this.
But wait! There was one “successful” prediction: Middleton and Hencher predicted that Barker would soon die (they gave no date), and a year and a half after the Premonitions Bureau was founded, Barker had a cerebral hemorrhage and croaked. Is that uncanny, or just coincidence?
The review concludes that the precog experiment was indeed “worth a shot”:
Barker’s psychics’ predictions had proved accurate, but they did not help him avoid his fate. He had hoped to use the Bureau to change the future. It had not even come close. By Knight’s count, only 3 percent of the Bureau’s predictions came true — nearly all of the successes from Middleton and Hencher. It found no useful data and prevented no tragedies. But that doesn’t mean it wasn’t worth a shot.
“We confer meaning as a way to control our existence,” writes Knight. “It makes life livable. The alternative is frightening.”
Three percent is better than nothing. Even false meaning is preferable to fear.
I’m sorry to have to say this, but that conclusion is bullshit. If “false meaning is preferable to fear”, then we should all become religious. And, anyway, what kind of fear does 3% of coincidental matches dispel? What is the frightening abyss into which we must gaze if none of the predictions were even remotely true? The last two paragraphs are pure New Yorker-style prose: they sound good, but they say nothing.
Some of the evidence for precognition that people found convincing came from Daryl Bem. This evidence has not held up (see also here). Doesn’t the NYT or its authors owe us that information, or the fact that there is no conceivable way that the laws of physics could even allow precognition? No, because the paper is wedded to cosseting our “spiritual” side.
The complete title of McWhorter’s new book is Woke Racism: How a New Religion Has Betrayed Black America, and we’ve talked before about some of the contents that McWhorter posted on his earlier Substack column. The book isn’t yet out in paperback, but I got a hardback copy several weeks ago from interlibrary loan. (I have no more room to put books on my shelves–not even 2 inches of space.) You can either wait until the paperback appears this fall, get it from the library, borrow it, or buy the hardbound copy for $18.01. But don’t wait to read it.
I recommend it most highly. (You knew I would.) It’s a short read—187 pages of text—and written in a simple but punchy style. McWhorter doesn’t pull any of those punches, either, describing the performative character of “woke racism” in a way that only a black man could get away with. (For instance, he says that a lot of people’s offense is simply a lie.)
You can get a taste of the style from the Amazon site “look inside” feature, and the topics from Table of Contents. Here are the contents and then a table from the first chapter which shows the contradictory nature of what McWhorter calls “third wave racism” (Electism):
A screenshot, since I can’t transcribe it:
The lens through which McWhorter views “wokeism” is as a religion: a real religion, not just a metaphor, though one lacking a God to worship. Although I don’t think this trope is absolutely necessary for McWhorter to make his case, it does add considerably to our understanding of the phenomenon. The “Elect” (his word for the “woke”) will brook no dissent, believe in an original sin (racism, of course), demonize those who are against them, cast them to a social-media hell (or worse: getting them fired or banned), have a common set of tenets that, as shown above, contradict each other (cf. Christianity: God is loving but if you don’t accept him you’ll burn forever), and have a set of inerrant prophets, including Ibram Kendi, Robin DiAngelo, and Ta-Nehisi Coates. Their words are not to be questioned; the prophets are to be worshipped and evoked as often as possible.
The book is not intended for The Elect because, as McWhorter asserts, their minds aren’t open. That’s true, just as my book Faith Versus Fact wasn’t intended for fundamentalist religionists. In both cases our books were intended for either those on the fence, those with open minds or, in McWhorter’s case, for those who already dislike Wokeness but want a critical analysis of its flaws as well as some bucking up. Wokeism may, for instance, repel you for reasons you don’t understand, and McWhorter supplies those reasons.
There are several, and since this isn’t a full review, I’ll just touch on them. First, “Electism” (or, as I prefer, “Wokeism”) is largely performative: it is a show of virtue without really accomplishing anything to lessen the inequalities that have plagued black people. How, for example, does firing a professor who explicates the “fill-in” word in Chinese “ne-gah” (just as “like” is a fill-in word in American English), accomplish anything to eradicate racism? We know of dozens of such performances. Academia is full of them, and they’ve spilled over into society at large. I see them every day.
Don’t get McWhorter wrong: he does see inequality between blacks and whites as a serious problem, but also thinks that black people have to lend a hand in helping us fix it. I’ll mention his solutions below. But by laying out the arrant stupidity (well, “misguidedness”) of performative Electism, he not only helps us understand it, but also helps us fight it and stop flagellating ourselves as irreparably broken racists. In this sense the book is heartening. It doesn’t aim to perpetuate racism by mitigating white guilt, but to show that much of that guilt is unwarranted.
In fact, McWhorter’s notion is that Electism actually harms black people in several ways. One way, which I’ve seen at my own university, is by infantilizing them: treating them as an especially sensitive group that must be coddled rather than respected. Once you realize how this infantilizing is done—and it’s done by both blacks and whites, but is especially odious when done by whites—you can see signs of it everywhere. And this infantilizing lowers both the expectations we have for black achievement and the standards that we hold everyone to. It is, in fact, the very reason why the meritocracy is being dismantled, and why colleges and schools are getting rid of standardized tests. But this doesn’t help black people. How could it? It may get more of them into universities, but McWhorter claims that, in the elite schools at least, poor secondary-school education plus a culture that doesn’t prize learning leads to many black students being underprepared, and either dropping out of or changing schools.
Another virtue of the book is that, like Mill’s “On Liberty,” McWhorter constantly anticipates the objections of the Woke and defuses them in advance. These include the idea that McWhorter must be a self-hating black, that we need affirmative action for all minorities, whether or not they’re disadvantaged, and that affirmative action must be based solely on how one is grouped racially. It must also last forever.
The initial chapters describe the phenomenon of Electism, make the case that it’s a real religion, and give many examples—you’ll be familiar with some—of how Electism plays out in everyday life. It’s horrifying to see what the Elect have gotten away with, but of course they get away with their shenanigans for one reason only: white people really don’t want to be called racists, and will do nearly anything to avoid that label.
Electism meets the road in the last two chapters. Chapter 5 contains McWhorter’s recommendations for how to really help black people. They may sound too few, or too silly, but the more one thinks about them, the more they make sense. In his view, there are only three correctives:
1.) End the war on drugs
2.) Teach reading properly (he recommends phonics, and knows whereof he speaks)
3.) Get past the idea that everybody must go to college
Each of these has wide ramifications that you can imagine if you think about them. But you needn’t, for McWhorter gives the rationales in detail. Sadly, none of these things are being emphasized or accomplished by the Woke, and none of them are the subject of the performative wokeism we encounter every day.
The last chapter deals with people who oppose performative wokeism but still want to help black people. What do you do when the Elect come for you? McWhorter sees acting on his advice as critical, for Electism is no longer a problem with colleges alone. It plagues all of American (and much of British and Canadian) society. McWhorter’s suggestions include not engaging the Elect (they won’t listen), not apologizing for your actions or views if you advance them in good faith, and, most important, standing up to the woke. Don’t buy their bullshit, don’t let them make you feel guilty, and, if you disagree, just say so and walk away. And build your own group of like-minded people who are also antiracist.
That, of course, requires that you “out yourself” as an opponent of the Elect. I have already done so, but what do I have to lose? I don’t use Twitter, I have my own platform here, and I’m retired. Nobody can fire me. But there are many who do have things to lose. McWhorter’s advice is to stand up for your principles, even if you suffer by doing so. Just as with atheists, the more people “come out”, the more heartened their ideological confrères become, and the more likely they’ll be to join in. The Elect, of course, will deem you a racist simply for opposing their mishigass. Don’t let them get away with it.
McWhorter finishes the book by addressing those who agree with his arguments:
The Elect will ever be convinced that if you join these brave, self-possessed survivors, you are, regardless of your color, a moral pervert in bed with white supremacy.
But you aren’t and you know it.
Buy and read this book. Surprisingly, the professional reviews have been good (it even got a star from Kirkus!), and it’s selling quite well. Don’t miss out.
Oh, and let me add that, as you might expect, the book is wonderfully written with simple and stylish prose. But if you’ve read McWhorter before, you’ll expect that. He’s a national treasure, a man whose voice is especially urgent as America tears itself apart over racism.
The title of this post is mine (Jerry’s), and since there’s no word that has replaced “wokeness”—a word I construe as “performative social justice, often going to ludicrous extremes, that has little effect on society”—I’ll use that one. The articles below were called to my attention by reader Smith Powell, and his commentary was substantial enough that I asked him if I could post it (along with some additions that I’ll identify, like adding the screenshots). He gave permission, and so I’ll put it between the lines. My own additions and comments are in brackets with a “JAC:” beginning the note.
Smith Powell briefly discusses a book review and a letter to the editor that appeared in a recent issue of Science, giving his reaction to both. I’ve left in the “deadnaming” for the book review simply because many of you may have read the author, Riley Black, who was well known for science writing before transitioning to the female gender and taking the name “Riley Black”. Her piece turns out to be far more of a manifesto for inclusiveness than a book review, committing the cardinal sin of book reviewing: assessing the book that the author should have written instead of the one he actually wrote.
Commentary by Smith Powell
A recent issue (24 December 2021) of Science had two woke items of interest. The first was a review of Dinopedia by Darren Naish. The review is called “Revisiting paleontology’s greatest hits”. The second is a letter “Transgender rights rely on inclusive language” with multiple authors.
[JAC: Click on the screenshots, which may take you to the article. pdfs are available via judicious inquiry]:
The reviewer, Riley Black, is herself an author of books and many articles and blog posts on dinosaurs, and she has written about a number of other science topics. Until recently, Ms. Black was associated with Scientific American, where she wrote for a blog entitled Laelaps. She used to write as Brian Switek but came out as transgender and non-binary in 2019.
Ms. Black makes her point in the very first paragraph of her review [emphases mine]:
Dinosaurs garner esteem that is often reflected onto the people who search for, excavate and study them, and therein lies a fundamental problem with the ever-increasing number of popular tomes about the “terrible lizards” hitting bookshelves. Even as the field of vertebrate paleontology pushes to become more inclusive, personages from decades past remain the only experts many members of the public encounter. Although there is a trove of dinosaurian information to recommend paleontologist Darren Naish’s short encyclopedia Dinopedia, it does little to correct this antiquated view of who is, or can be, a paleontologist.
Ms. Black notes that Naish has written a “friendly and breezy tour of dinosaurs and what paleontologists have come to know about them.” She further notes that the book is illustrated with Naish’s own drawings and that “the result is a solid primer on dinosaur science…” Again, she makes her point when she writes, “Nevertheless, the book offers a view of modern dinosaur scientists that is practically petrified.”
Ms. Black continues:
Naish includes profiles of a handful of paleontologists: Robert Bakker, Jack Horner, Halszka Osmólska, John Ostrom, Richard Owen, Greg Paul, and Paul Sereno. These figures were indeed pivotal in the dinosaur debates and discussions of the late 20th century, but Naish’s decision to focus on them, rather than on contemporary paleontologists, makes the book feel decades out of date rather than representative of modern dinosaur studies. Aside from the gender imbalance, nonwhite scientists and researchers from the Global South are given short shrift.
[JAC: I’ve added the next indented section]
Naish’s book joins a number of recent titles that have failed to effectively convey the increasingly diverse practice of paleontology. I, too, have fallen far short in achieving equality and inclusivity in my writing.
There is no doubt that the Dinosaur Renaissance was huge for paleontology. Many of the children who were inspired by the museum exhibits, books, and films that debuted during that time are paleontologists or fossil fans still. But the height of that era’s dinomania was nearly three decades ago, when discussions about diversity and representation in the field often occurred in the background, if they happened at all. Paleontologists today openly consider such issues, as well as adjacent topics such as the ethics of collecting specimens and samples in other countries and the repatriation of illegally exported fossils.
Still, there is much work to be done. In a field in which even gender equity between white cisgender researchers has been difficult to achieve, now is not the time to reaffirm the male-dominated days as representative of where the field stands today.
Ms. Black gives no hints as to what debates and discussions have been prominent in the last couple of decades that should have been addressed by Naish, nor does she offer any examples of researchers who should have been recognized by him. Indeed, she notes:
Change is likely to come slowly. Diversity in paleontology is currently highest among volunteers, students, and early-career researchers, all of whom are less likely to be conducting research that is covered by the press, and less likely to write books themselves.
But she does offer a mea culpa when she writes above, “I, too, have fallen far short in achieving equality and inclusivity in my writing”.
In summary, it appears to me that Ms. Black thinks that Darren Naish has written a nice book on dinosaurs that is marred because it is not sufficiently woke.
[JAC: I’ve added the excerpt below. I was appalled when I read the review, for it’s far more about the author failing to socially engineer paleontology than it is about paleontology itself. In other words, Black criticizes the book for failing to be the kind of book she wanted. But the kind of book she wanted would be far more about making paleontology more “inclusive” than about dinosaurs themselves. And Black doesn’t even name the advances or the BIPOC paleontologists that she wants to see represented, even after admitting that there are few of them. I suggest that Black herself write that book! The excerpt:]
Again and again, op-eds and sociological studies have pointed out that a lack of visible representation affects who goes into science and who is supported through its process, which, in turn, affects scientific theory and thought. It is time to start embodying the change we wish to see. Ensuring that popular accounts of paleontology reflect the field’s 21st-century practitioners would be a strong step toward this goal.
But “embodying the change we wish to see,” as Black has done, is not the same as imbuing every aspect of the world with a single change you wish to see.
The second item in this issue is a letter [above] that notes in the first sentence, “Inclusive language around sex diversity has never been more important”. Do the authors mean “gender diversity”? Apparently not, as they write:
It is important to recognize the context-dependent and multidimensional nature of sex. Rather than privilege any characteristic as the sole determinant of sex, “male” and “female” should be treated as context-dependent categories with flexible associations to multiple variables (such as, but not limited to, genitalia, gametes, or karyotype). The usage of “male” and “female” should be explicitly defined in any given study. Failing to do so promotes harmful language (such as “male chromosomes” rather than “Y chromosomes”) that attributes an essential “maleness” or “femaleness” to traits, obscuring the true biological mechanisms at work (e.g., the Tdf gene leads to testicular development, not to “being male”). No one trait determines whether a person is male or female, and no person’s sex can be meaningfully prescribed by any single variable.
I am not a biologist. I thought sex was very much bimodal with a very small percentage of indeterminate cases. I guess I need a little help in understanding this issue. [JAC: Smith Powell is actually correct in his criticism. Sex is for all intents and purposes binary and bimodal, for it depends on whether the individual is capable, or would be capable, of producing eggs or sperm. There is only a tiny, tiny minority of people who, at birth, elude this dichotomy.]
I’m further confused as the authors continue:
Awareness of the distinction between sex and gender is another vital element to inclusive, quality research. Conflating the two harms and invalidates gender minorities by implying that these distinct attributes are inextricably linked.
[JAC: It is the authors who are conflating sex and gender here. There is an accepted biological meaning of “male and female” in organisms that have two sexes. Yet in the first paragraph they mistake sex for “gender” when they say that sex is “context-dependent and multidimensional.” It is not.]
In conclusion, the authors write:
As scientists, we must push back against the misappropriation of biological terms by promoting precise language that focuses on the variables themselves (e.g., “menstruating people”) and acknowledging that people express these variables in ways that may not conform with a binary system of sex or gender. This both creates a more inclusive environment for gender-diverse scientists and reinforces that sex is a context-dependent summary of a multidimensional variable space.
“Menstruating people”? Isn’t that an example of conflating sex and gender? I think I understand some of the words individually, but when they are put together as these authors did, I see a word salad with some words apparently not meaning what I thought.
JAC: Apparently it’s not sufficient to recognize that “gender”—the sex role or aspects of sex roles that people assume or behave according to—is indeed a spectrum. No, they want to distort biology itself by claiming that sex itself is equally continuous and a “spectrum.” This is a Lysenkoist tactic to try to make nature conform to an ideology.
But I swear, I spent my whole career sorting fruit flies—thousands a day—and maybe once every six months I’d find one “gynandromorph”: a fly that was part male and part female (this happens rarely when an X chromosome gets lost during female development, leaving some tissues XX and others XO). These mosaic individuals, which must constitute less than 0.00001% of flies, are not a third “sex,” but a developmental accident. And they are invariably sterile. For all the other flies, when I dissected one that looked like a male, it had testes and sperm. When I dissected one that looked like a female, it invariably had ovaries and nascent eggs.