Amazon review of “The War on Science” volume rejected for using “woke” as pejorative

March 5, 2026 • 11:00 am

Reader Jon Gallant recently finished the essay collection compiled and edited by Lawrence Krauss, The War on Science: Thirty-Nine Renowned Scientists and Scholars Speak Out About Current Threats to Free Speech, Open Inquiry, and the Scientific Process. (Luana and I have a piece in it adapted from our Skeptical Inquirer article on the ideological subversion of biology.)

Jon decided to leave a review of the book on its Amazon page (his review is shown below in the Amazon rejection). Yep, his submitted review was rejected. He sent the rejection to me and I reproduce it and his emailed speculations (with permission).  I’ve put a red box around the submitted review:

At first I was puzzled, as I don’t follow Amazon reviews and know nothing about the ideology of the site or company.  Can you guess why the review was returned with requests for changes?  I suspect you’ve guessed correctly, though we can’t be sure.  I asked Jon what he thought, and here’s some of his response:

Use of the term “woke” in a less than reverential tone is no doubt classified by Amazon’s editors as “hate speech”.  After all, it makes wokies feel unsafe.  My hunch is that the dopier Communications majors from the 2010s work as review editors at Amazon.  The better-connected ones get into the editorial offices of some Nature publications we have encountered.

In truth, I can see no other explanation.  The review was not worshipful enough of wokeness, and in fact made fun of it, even expressing a hope that it would disappear.  If you have another explanation, by all means put it in the comments. I had no patience to read Amazon’s “community guidelines” to see if there were other infractions.

I don’t know if Jon will resubmit his review, but I thought that this was both sad and amusing. The other reviews (126 of them) are bimodal (70% five star, 18% one star), and it’s also amusing to look at the negative ones, most of them finding the book guilty of association with the wrong people, or not hard enough on Trump and right-wing assaults on science (not its purpose).

The New York Times highlights faith again

March 2, 2026 • 10:45 am

Originally I was going to call this post “The New York Times coddles faith again,” but there is not all that much coddling in this review of Christopher Beha’s new book Why I Am Not an Atheist.

What puzzles me is that the review is on the cover of the NYT’s latest Sunday book section. That position is usually reserved for important or notable books, but Timothy Egan’s review doesn’t make the book seem that interesting. Could it be that the cover slot came from the book being about . . . . God? At any rate, given that Beha’s book came out February 17, its Amazon ranking of only 1,562 (unimpressive for a new book on the benefits of faith) and its mere 8 reviews (all 5-star reviews, of course) are not signs that this is a barn-burner that will fill the God-shaped lacuna in the public soul.

Beha previously published an excerpt of his book in the NYer, which I discussed in my recent post “A New Yorker writer loses faith in atheism.” I found Beha’s arguments lame; in that post I summarized the book this way and provided some information on the author. From my post:

Even the title of this New Yorker article is dumb: “faith in atheism” is an oxymoron, for a lack of belief in gods is not a “faith” in any meaningful sense. But of course the New Yorker is uber-progressive, which means it’s soft on religion. And this article, recounting Christopher Beha’s journey from Catholicism to atheism and then back to a watery theism, is a typical NYer article: long on history and intellectual references, but short on substance. In the end I think it can be shortened to simply this:

“Atheism in all its forms is a kind of faith, but it doesn’t ground your life by giving it meaning. This is why I became a theist.”

So far as I can determine, that is all, though the article is tricked out with all kinds of agonized assertions as the author finds he cannot “ground his life” on a lack of belief in God. But whoever said one could? Still, it plays well with the progressive New Yorker crowd (same as the NY Times crowd) in being soft on religion and hard on atheism. The new generation of intellectuals needs God, for to them, as to Beha, only a divine being can give meaning to one’s life.

Christopher Beha, a former editor of Harper’s Magazine, is the author of a new book, Why I Am Not an Atheist, with the subtitle Confessions of a Skeptical Believer. The NYer piece is taken from that book.

You can read the Sunday NYT review by clicking on the screenshot below, or find it archived for free here.

Here’s the cover highlighting the book (thanks to Greg for sending me a photo of the paper version he gets).  Stuff like this roils my kishkes:

Reviewer Timothy Egan is somewhat lukewarm about the book, even though he avers that he is a believer who had his own search for faith, as well as an inexplicable faith epiphany. The NYT identifies him this way:

Timothy Egan is the author of “A Pilgrimage to Eternity: From Canterbury to Rome in Search of a Faith,” among other books, and a winner of the National Book Award for nonfiction.

So both author and reviewer are believers, and the MSM (including the NYT) is rife these days with either promotions of religious books or softball reviews of them. All of this centers on the search for meaning in these dire times, a search that always winds up filling the “God-shaped hole” in our being. That is something Egan apparently documents in his own book, and it is, of course, the subject of Beha’s.

As I noted when reviewing Beha’s New Yorker piece, he went back and forth from a youthful Catholicism to a materialistic atheism and then found his way back to God again, always tormented by the fact that he saw an angel who spoke to him when he was 15.  As reviewer Egan says:

As someone who also saw something inexplicable (a long-dead saint opening her eyes from a crypt in Italy), I preferred the teenage Beha who was filled with religious wonder. Not to worry. By the end of the book, he returns to the angel with an expanded view. It was both miracle and real. “I know what ‘caused’ these visitations, from a strictly material standpoint, but I also know what they in turn caused — a lifelong journey that I am still on.”

Not to worry! That statement alone speaks volumes. But Egan continues:

In between are several hundred pages that make up that journey, almost all of it through the mostly atheistic philosophers of the Western canon. Unlike a traditional pilgrimage, this book is an odyssey of the mind. Beha debates the old masters: Descartes, Kant, Locke, Mill, Hobbes, Camus, Nietzsche and many, many others, but he starts with a poke at the “New Atheists” Sam Harris, Richard Dawkins, Christopher Hitchens and the like — all of them now passé, in his view.

This tells you two things: the reviewer is soft on spiritual experiences, since he himself had one (see the link three paragraphs back), and the author bashes New Atheism as “passé”, a cheap shot that gives the movement no credit for pushing along the rise of the “nones” and making criticism of religion an acceptable thing to discuss.

But Egan is still somewhat critical of the scholastic tenor of the book, so it’s not a totally glowing review:

Beha is not a stone thrower or even much of a picker of fights. He reveres the great minds, to an obsessive degree. He’s the guy you wanted as your college roommate in the pre-A.I. era. Or maybe not. He’s done all the reading and even wrote a memoir about it, “The Whole Five Feet,” recounting the year he consumed all 51 volumes of the Harvard Classics series. Just looking at the list makes most of us tired.

He climbed that mountain, so we don’t have to. But, alas, at times in his new book he gets lost in the clouds. Here’s a sample, discussing Immanuel Kant, the German philosopher: “Kant is here invoking two binaries we’ve already discussed. The first is that between a priori and a posteriori truth; the second is that between analysis and synthesis.”

But Beha is sincere, honest and likable on the page. I found his personal story more engaging than his intellectual one. He started to doubt his faith at 18 when he nearly lost his twin brother to a car accident. He suffered from depression and life-threatening cancer, drank too much and took too many drugs. (He was an atheist for a long time.)

But as for the things I highlighted in my own take on Beha’s NYer article—things like the “faith in science” that we supposedly have, and the “romantic idealism” that is coequal to science in its inability to apprehend universal truths—of these things Egan says nothing. Nor does he point out that many people (I’m one) have found satisfaction without God, though many of us neither have a God-shaped hole nor are actively looking for meaning. Instead, Egan’s take is anodyne, for one simply cannot get away with pushing nonbelief in the New York Times. What you can do is bash atheism in general and New Atheism in particular.

Egan:

Ultimately, atheism failed [Beha], as it did some in the French Revolution who briefly converted the Notre-Dame Cathedral into the spiritually barren Temple of Reason. The religion of nonreligion can be like nonalcohol beer: What’s the point?

I have to interject here to note that “nonreligion”—atheism—is no more a religion than not drinking is a form of alcoholism. The trope that atheists have “faith” is simply ridiculous. What they have is a failure to be convinced of a phenomenon for which there is no evidence. But I digress. Egan continues his review’s peroration:

Beha is not interested in trying to sway those who’ve given up on God. He simply wants to explain what moved him back to the faith of his fathers, “listening to the whispering voice within our souls.” There’s no Road-to-Damascus conversion. He’s not blinded by the light. It’s more about his often miserable life getting better with the right woman, a Catholic confession, regular attendance at Mass. And that woman — “she was the reason I believed in God” — isn’t even a believer. She’s a lapsed Episcopalian.

If Beha doesn’t necessarily win his argument with Russell, give him credit for following the imperative of all sentient beings — to deeply consider the mystery of ourselves in an unknowable universe.

“I don’t believe I will ever see things clearly; not in this mortal life,” he concludes. “The best we can hope for is to be looking in the right direction, facing the right way.”

The proper response to this conclusion is “meh”.

Another critique of Agustín Fuentes’s claim of a sex spectrum in humans and other species

February 1, 2026 • 11:20 am

Although the view that sex is a spectrum, and that there are more than two biological sexes in humans and other species, is still prevalent among the woke, others are realizing that sex in humans (and nearly every other species of plant and animal) is indeed a binary, with a tiny fraction of exceptions in humans. These include individuals with “differences of sex development” (DSDs) and vanishingly rare hermaphrodites. Estimates of exceptions in our species range from 0.005% to 0.02%.

The rise of the “sex is a spectrum” notion is due solely to the rise of gender activism and to people who identify as nonbinary or transgender.  But gender is not the same thing as biological sex: the former is a subjective way of feeling, while the latter is an objective fact of biology based on a binary of gamete types.

I personally don’t care if someone identifies as a member of a nonstandard gender, but I do care when people like Steve Novella, who should know better, argue that biological sex is not a binary but a spectrum. In fact, there are far more people born with more or fewer than 20 fingers and toes than are born as true intersexes, yet we do not say that “digit number in humans is a spectrum.”

It’s a shame that many of those who claim that sex is a spectrum are biologists, who of all people should recognize the sex binary and its many consequences, like sexual selection. The misguided folks include the three main scientific societies studying evolution, which issued a statement that biological sex was a spectrum, and further that this was a consensus view. (Their original statement is archived here.) The societies then took down their claim when other biologists pointed out its inanity (see here, here, and here). And it’s not only biologists who recognize the ideology behind the claim that sex is a spectrum; the public does, too. NBC News reported this in 2023 (note the conflation of sex and gender):

A new national poll from PRRI finds Americans’ views on gender identity, pronoun use and teaching about same-sex relationships in school deeply divided by party affiliation, age and religion.

Overall, 65% of all Americans believe there are only two gender identities, while 34% disagree and say there are many gender identities.

But inside those numbers are sharp differences. Fully 90% of Republicans say there are just two genders, versus 66% of independents and 44% of Democrats who believe the same.

Sadly, if you’re on the side of truth in this debate, at least as far as the number of sexes goes, you’re on the side of Republicans. So it goes. Further, Americans and sports organizations themselves are increasingly adopting the view that trans-identified men (“transwomen,” as they’re sometimes called) should not compete in sports against biological women. This is from a 2025 Gallup poll:

Sixty-nine percent of U.S. adults continue to believe that transgender athletes should only be allowed to play on sports teams that match their birth sex, and 66% of Americans say a person’s birth sex rather than gender identity should be listed on government documents such as passports or driver’s licenses.

Thus, although wokeness is like a barbed porcupine quill, easy to push in but hard to pull out, I’m pretty confident that the claim of a biological sex spectrum will eventually decline even further. But there are still some ideologues who twist and misrepresent the facts to argue that there are more than two sexes. (The argument centers on humans, of course.) One of these is Princeton anthropologist Agustín Fuentes, who has written several papers and a recent book arguing for the human sex spectrum. I’ve pushed back on his arguments many times (see here), and wrote a short review of his book Sex is a Spectrum, a book that should be read with a beaker of Pepto-Bismol by your side. There’s another and better critical review of Fuentes’s book by Tomas Bogardus, here, which Bogardus has turned into his own new book, The Nature of the Sexes: Why Biology Matters.

This post is just to highlight another critical review of Fuentes’s book and his views on sex, one written by Alexander Riley and appearing at Compact. You can get to a paywalled version by clicking on the title below, but a reader sent me a transcript, and I’ll quote briefly from that below.

A few quotes (indented). I don’t know how readers can access the whole review without subscribing:

Fuentes, an anthropologist who has extensively studied macaques, begins with a primer on the evolution of sexual reproduction in life on the planet. To show how “interesting” sex is, he offers the example of the bluehead wrasse, a fish species in which females can turn into males in given ecologies. The example, he says, is “not that weird” in biology.

But the reality is that species like this one most definitely are weird, not only in the animal kingdom, but even among fish, who are among the most sexually fluid animals. Among fish, the number of species that are sexually fluid in this way is perhaps around 500 … unless you know that there are approximately 34,000 known fish species. In other words, even in the most sexually fluid animals, transition between male and female by one individual can happen in only 1.5 percent of the total species. What Fuentes describes as “not that weird” is certainly highly unusual. [JAC: note that switching from male to female or vice versa does not negate the sex binary.]

This sleight of hand is typical of Fuentes’s handling of evidence. He attacks a classic argument in evolutionary biology that differences in male and female gametes (sperm and eggs, respectively) explain many other differences between the two sexes. In short, because eggs are much costlier to make than sperm, females have evolved to invest more energy in the reproductive chances of each gamete compared to males. This bare fact of the gamete difference means that, according to the Bateman-Trivers principle, males and females typically develop different mating strategies and have different physical and behavioral profiles.

The distortion below is typical of ideologues who promote Fausto-Sterling’s data even when they know it’s incorrect:

Fuentes notes that what he calls “3G human males and females,” that is, those individuals who are unambiguously male or female in their genitalia, their gonads (the gland/organ that produces either male or female gametes), and genes, do not make up 100 percent of human individuals. He goes on to suggest that at least 1 percent of humans, and perhaps more, do not fit the 3G categories. This is a claim unsupported by the facts. The citation he links to this claim is an article by biology and gender studies professor Anne Fausto-Sterling. The claim made by Fausto-Sterling about the percentage of those who are intersex has been thoroughly debunked. She includes a number of conditions in her category of intersex (or non-3G) that are widely recognized as not legitimately so classified. One such condition (Late Onset Congenital Adrenal Hyperplasia, or LOCAH, a hormonal disorder) makes up fully 90 percent of Fausto-Sterling’s “intersex” category. Individuals with LOCAH are easily classed as either male or female according to Fuentes’s 3Gs, and nearly all of them are able to participate in reproduction as normal for their sex. The percentage of those who are actually outside the 3G male or female classes is likely around 0.02 percent, which means that 9,998 out of every 10,000 humans are in those two groups.

What’s below shows that trans-identified men do not become equivalent to biological women when they undergo medical transition:

Transwomen are much more likely to exhibit behaviors of sexual violence and aggression than women. A 2011 study showed clearly that even male-to-female transsexuals who had undergone full surgical transition, and who therefore had undergone hormone therapy to try to approximate female hormonal biology, still showed rates of violent crime and sexual aggression comparable to biological males. They were almost twenty times more likely to be convicted of a violent offense than the typical female subject. This is reason enough to keep individuals who have male hormonal biology out of spaces in which they interact closely with semi-clad girls and women.

And Riley’s conclusion:

The fact that Fuentes can make such ill-founded claims without fearing serious pushback is an indication of how captured academic culture is by the ideology behind this book. A healthy academic culture would not so easily acquiesce to political rhetoric masquerading as science.

Yes, anthropology has been captured—especially cultural anthropology—and, as I said, even some biologists have gone to the Dark Side. I have nothing but contempt and pity for those who know that there are two sexes but twist and mangle the facts to conform to the woke contention that the sexes can be made interchangeable. But I should add the usual caveat that, with a few exceptions like sports and prisons, transgender people should be given the same rights as everyone else.

Another sign of people rejecting the “sex is a spectrum” claim is that Fuentes’s book didn’t sell well. Despite coming out less than a year ago, it’s now #301,447 on Amazon’s sales list, and has only 25 customer ratings, averaging 3.8 out of 5 stars. It didn’t exactly fly off the shelves.

Here are two Amazon reviews by savvy readers (note: none of the reviews on Amazon are by me):


Short takes: An excellent movie and a mediocre book

January 21, 2026 • 11:30 am

In the last week I’ve finished watching an excellent movie and reading a mediocre book, both of which were recommended by readers or friends. I rely a lot on such recommendations because, after all, life is short and critics can help guide us through the arts.

The good news is that the movie, “Hamnet,” turned out to be great. I had read the eponymous book by Maggie O’Farrell in 2022 (see my short take here), and was enthralled, saying this:

I loved the book and recommend it highly, just a notch in quality behind All the Light We Cannot See, but I still give it an A. I’m surprised that it hasn’t been made into a movie, for it would lend itself well to drama. I see now that in fact a feature-length movie is in the works, and I hope they get good actors and a great screenwriter.

They did. Now the movie is out, and it’s nearly as good as the book. Since the book is superb, the movie is close to superb. That is, it’s excellent but perhaps not an all-time classic, though it will always be worth watching. Author O’Farrell co-wrote the screenplay with director Chloé Zhao, guaranteeing that the movie wouldn’t stray too far from the book. As you may remember, the book centers on Agnes, another name for Shakespeare’s wife Anne Hathaway, a woman who is something of a seer (the book has a bit of magical realism). The story covers the period from the meeting of Shakespeare and Agnes until Shakespeare writes and performs “Hamlet,” a play that O’Farrell sees as based on the death from plague of their only son Hamnet (a variant of “Hamlet”; names were apparently interchangeable in Elizabethan England). I won’t give away the plot of the book or movie, which are the same, except to say that the movie has a bit less magic and a little more of Shakespeare’s presence. (He hardly shows up in the book.)

The movie suffers a bit from overemotionality; in fact, there’s basically no time in the movie when someone is not suffering or in a state of high anxiety. But that is a quibble. The performances, with Jessie Buckley as Agnes and Paul Mescal as Shakespeare, are terrific. Buckley’s is, in fact, Oscar-worthy, and I’ll be surprised if she doesn’t win a Best Actress Oscar this year. The last ten minutes of the movie focus on her face as she watches the first performance of “Hamlet” in London’s Globe Theatre, and the gamut of emotions she expresses in a single close shot is a story in itself. Go see this movie (bring some Kleenex for the end), but also read the book. Here’s the trailer:

On to the book. Well, it was tedious and boring, though as I recall Mother Mary Comes to Me, by Indian author Arundhati Roy, was highly praised. Roy’s first novel, The God of Small Things, won the Booker Prize and I loved it; her second, The Ministry of Utmost Happiness, was not as good. I read Mother Mary simply because I liked her first book and I try to read all highly touted books from India: I’ve been there many times, I love to read about the country, and Indian writers are often very good.

Sadly, Mother Mary was disappointing. There’s no doubt that Roy has had a tumultuous and varied life, and the memoir centers on her relationship with her mother (Mary, of course), a teacher in the Indian state of Kerala. The two have a fraught connection that, no matter how many times Roy flees from Kerala, is always on her mind. It persists during Roy’s tenure in architectural school, her marriage to a rich man (they had no children), her later discovery of writing, and her entry into Indian politics, including time spent with Marxist guerrillas and campaigning for peaceful treatment of Kashmiris.

The book failed to engage me for two reasons. First, Mother Mary was a dreadful person, capable of being lovable to her schoolchildren one moment and a horrible, nasty witch the next. She was never nice to her daughter, and the book failed to explain (to me, at least) why the daughter loved such a hateful mother. There’s plenty of introspection, but nothing convincing. Since the central message of the memoir seems to be this abiding mother/daughter relationship, I was left cold.

Further, there’s a lot of moralizing and proselytizing, which is simply tedious. Although Roy portrays herself as self-effacing, she comes off as a hidebound and rather pompous moralist, something that takes the sheen off a fascinating life. Granted, there are good bits, but overall the writing is bland. I would not recommend this book.

Two thumbs down for this one:

Of course I write these small reviews to encourage readers to tell us what books and/or movies they’ve encountered lately, and whether or not they liked them. I get a lot of good recommendations from these posts; in fact, it was from a reader that I found out about Hamnet.

Michael Shermer interviews Matthew Cobb on his Crick biography

January 18, 2026 • 9:45 am

Here we have an 83-minute interview of Matthew Cobb by Michael Shermer; the topic is Francis Crick as described in Matthew’s new book Crick: A Mind in Motion. Talking to a friend last night, I realized that the two best biographies of scientists I’ve read are Matthew’s book and Janet Browne’s magisterial two-volume biography of Darwin (the two-book set is a must-read, and I recommend both volumes, though Princeton will issue a one-volume condensation in June).

At any rate, if you want to get an 83-minute summary of Matthew’s book, or see if you want to read the book, as you should, have a listen to Matthew’s exposition at the link below.  I have recommended his and Browne’s books because they’re not only comprehensive, but eminently readable, and you can get a sense of Matthew’s eloquence by his off-the-cuff discussion with Shermer.

Click below to listen.

I’ve put the cover below because Shermer mentions it at the outset of the discussion:

My brief interview of Matthew Cobb about his new biography of Francis Crick

January 7, 2026 • 11:00 am

Matthew Cobb’s new biography of Francis Crick has been out for only a short time, but I’ve never seen a review that was less than enthusiastic (check out this NYT review). I finished it last week, and was also enthusiastic, finding it one of the best biographies of a scientist I’ve ever read. It concentrates on Crick’s science, but his accomplishments were inseparable from his personality, which embraced not only science but also poetry (the book begins and ends with a poet), drugs, women, and philosophy (he was, by the way, a hardcore atheist and determinist).

But I digress. I really recommend that if you have any interest in the man and his work, which of course includes helping reveal the structure of DNA, you get this book and read it. It is a stupendous achievement, based on tons of research, sleuthing, and interviews, and only a geneticist could have written it. But it’s not dull at all: Matthew has always written lively and engaging prose. Crick is also a good complement to Matthew’s previous book, Life’s Greatest Secret, about how the genetic code was cracked.

Relatedly, a biography of Jim Watson by Nathaniel Comfort is in the works, but it hasn’t yet been published.

After I finished the book, I had a few questions about Crick and his work, and asked Matthew if I could pose them to him and post his answers on this site. He kindly said “yes,” and so here they are. My questions are in bold; Matthew’s answers are in plain text. Enjoy:

What one question would you ask Crick if he could return from the dead? (Perhaps something that you couldn’t find out about him from your research.)

I think I would probably ask him about his view of the state of consciousness research. His key insight, with Christof Koch, was that rather than trying to explain everything about consciousness, researchers should look for the neural correlates of consciousness – neurons that fired in a correlated manner with a visual perception – and ask what (if anything) was special about how they fired, their connections, and the genes expressed within them. Since his death, we have obtained recordings from such neurons, but far from resolving the issue, consciousness studies have lost their way, with over 200 different theories currently being advanced. What did he think went wrong? Or rather, is it time to use a more reductionist approach, studying simpler neural networks, even in animals that might not be thought to be conscious?

 

Why did it take ten years—until the Nobel prize was awarded—for people to appreciate the significance of DNA?

Most people imagine that when the double helix was discovered it immediately made Watson and Crick globally famous and the finding was feted. That was not the case, mainly because the actual evidence that DNA was the genetic material was restricted to Avery’s 1944 work on one species of bacterium (this was contested) and a rather crappy experiment on bacteriophage viruses (this was the famous paper by Hershey and Chase from 1952; the experiment was so messy that Hershey did not believe that genes were made solely of DNA). So although the structure of DNA was immediately obvious in terms of its function – both replication and gene specificity, as it was called, could be explained by reciprocal base pairs and the sequence of bases – there was no experimental proof of this function. Indeed, the first proof that DNA is the genetic material in eukaryotes (organisms with a nucleus, including all multicellular organisms) did not appear until the mid-1970s! Instead, people viewed the idea that DNA was the genetic material as a working hypothesis, which became stronger through the 1950s as various experiments were carried out (e.g., Meselson and Stahl’s experiment on replication) and theoretical developments were made (e.g., Crick’s ideas about the central dogma). It’s notable that the Nobel Prize committee awarded the prize in 1962, just after the first words in the genetic code were cracked and the relation between DNA, RNA and protein had been experimentally demonstrated.

 

A lot of the latter part of the book is on Crick’s work on neuroscience (and, later, consciousness). You claim that he made enormous contributions to the field that really pushed it forward. Could you tell us a bit about what those contributions were?

Although he did not make a great breakthrough, he helped transform the way that neuroscience was done, the ideas and approaches it used. From the outset – a 1979 article in a special issue of Scientific American devoted to the brain – he focused attention on one particular aspect of brain function (he chose visual perception), the importance of theoretical approaches rooted in neuroanatomy, the need for detailed maps of brain areas and the promise of computational approaches to neural networks. All these things shaped subsequent developments – in particular the work on neural networks, which he played a fundamental part in, and which gave rise to today’s Large Language Models (he worked with both Geoffrey Hinton and John Hopfield, who shared the 2024 Nobel Prize in Physics for their work on this in the 1980s). And, of course, he made the scientific study of consciousness scientifically respectable, taking it out of the hands of the philosophers who had been tinkering with the problem for three thousand years and hadn’t got anywhere. Later, in a perspective article he published on the last day of the old millennium, he reviewed recent developments in molecular biology and predicted that three techniques would become useful: classifying neurons not by their morphology but by the genes that are expressed in them, using genetic markers from the human genome to study the brains of primates (the main experimental system he advocated using), and controlling the activity of neurons with light by using genetic constructs. All three techniques – now called RNAseq, transcriptional mapping and optogenetics – are used every day in neuroscience labs around the world. Indeed, within a few months of the article appearing, Crick received a letter from a young Austrian researcher, Gero Miesenböck, telling him that his lab was working on optogenetics and the results looked promising. During his lifetime, Crick’s decisive leadership role was well known to neuroscientists; now it has largely been forgotten, unfortunately.

 

Is there anything a young scientist could learn from Crick’s own methods that would be helpful, or was he a one-off whose way of working cannot be imitated?

I think the key issue is not so much Crick as the times in which he worked. As he repeatedly acknowledged, he was amazingly lucky. From 1954 to 1977 he worked for the Medical Research Council in the UK. He did no teaching, no grading, and was not involved in doctoral supervision (I’m not even clear how many PhD students he technically supervised – 4? 3? 5? – which highlights that even if he had his name on a bit of paper, he had little to do with any of them). Apart from a couple of periods, he had no administrative duties, and only one major leadership post, at the Salk, which nearly killed him. He wrote one major grant application at the Salk (the only one he ever wrote), but basically he was funded sufficiently well to simply get on with things. And what did he do? ‘I read and think,’ he said. Try getting that past a recruitment or promotions panel today! In a way, the onus for the creation of more Cricks does not lie with young researchers, but with established scientists – they need to allow young people the time to ‘read and think’, and to value failure. Most ideas will turn out to be wrong; that’s OK. Or at least, it was to Crick. Many senior researchers (and funders) don’t see things that way. However, even without such changes, young scientists can adopt some of Crick’s habits. Here’s my attempt to sum up what I think were the lessons of his life and work:

  • Read widely and avidly, even engaging with ideas that might seem eccentric or pointless, as ‘there might be something in it’ (one of his favourite phrases).
  • Talk through your ideas with your peers – try to find the weak spots in each other’s arguments.
  • At least in the initial stages of research, don’t get bogged down in the details that might counter your interpretation/theory – Crick and Brenner called this the ‘don’t worry’ approach. They figured that unconnected contrary data points might not undermine their ideas, and would eventually turn out to have specific, varied explanations.
  • Write down your ideas in the form of memos or short documents (keep them short). Writing helps you clarify your ideas and shapes your mind – do not use AI to do this! You can then share your writing with peers, and it can serve as a target for discussion and debate.
  • Master the art of clear writing. Avoid jargon, keep your ideas straightforward. Again, the only way to develop this skill is to write – badly at first. So rewrite, edit, recast your writing – it will improve your thinking.
  • Above all, make sure that the science you do is *fun*. That was a word that Crick repeatedly used, and he genuinely got great pleasure from doing science and thinking about it. Seek out an area in which you can have fun and aren’t bogged down by drudgery.

Click below to get the book on Amazon:

A book recommendation: Ian McEwan’s “What We Can Know”

November 26, 2025 • 11:00 am

I decided when I read the NYT review of Ian McEwan’s latest (and 18th) novel, What We Can Know, that I had to read the book.  (Click the screenshots to read the review if you have NYT access, or find the review archived here.)  I quote some of the encomiums from the review:

Ian McEwan’s new novel, “What We Can Know,” is brash and busy — it comes at you like a bowling ball headed for a twisting strike. It’s a piece of late-career showmanship (McEwan is 77) from an old master. It gave me so much pleasure I sometimes felt like laughing.

McEwan has put his thumb on the scale. This is melodramatic, storm-tossed stuff. There is murder, a near kidnapping, a child hideously dead of neglect, multiple revenge plots, buried treasure and literary arson. Writers treat other writers’ manuscripts and reputations the way Sherman treated Georgia. No one is a moral paragon.

. . . I’m hesitant to call “What We Can Know” a masterpiece. But at its best it’s gorgeous and awful, the way the lurid sunsets must have seemed after Krakatau, while also being funny and alive. It’s the best thing McEwan has written in ages. It’s a sophisticated entertainment of a high order.

I had to get it via interlibrary loan, and since it’s new it took some time. But I did get it, and read the 300-page book in a week. And yes, it’s excellent.


I’m a fan of McEwan, and especially like his novels Atonement (made into a terrific movie) and the Booker-winning Amsterdam. This one also does not disappoint. The NYT gives a plot summary, but I’ll just say that it’s a novel about a poem, and the action takes place in two years more than a century apart: 2014 and 2119. A well-known British poet named Francis laboriously pens a “corona” poem for his wife Vivien on her 53rd birthday. It would be hard to write a normal corona, much less one that, like this one, is said to be a masterpiece. Here’s what the form comprises, according to Wikipedia:

A crown of sonnets or sonnet corona is a sequence of sonnets, usually addressed to one person, and/or concerned with a single theme. Each of the sonnets explores one aspect of the theme, and is linked to the preceding and succeeding sonnets by repeating the final line of the preceding sonnet as its first line. The first line of the first sonnet is repeated as the final line of the final sonnet, thereby bringing the sequence to a close.

Imagine how hard that would be to write: the first lines of the sonnets, put in sequence at the end, have to form a stand-alone sonnet that rhymes properly! To see an example, go here, though that corona has only 12 rather than 14 included sonnets. At any rate, Francis’s poem gains a national reputation even though Francis won’t let it be reproduced or published; it is read aloud on Vivien’s birthday to a dozen guests and then given to her, handwritten on vellum. Only Vivien ever sees the poem on the page.

Over a hundred years later, with the world devastated by nuclear exchanges, global warming, and skirmishes, a scholar named Thomas Metcalfe, specializing in poetry of the early 2000s, decides to track down the corona to see why it was so renowned despite being unpublished (a nostalgia for the past pervades the 22nd century). As he searches for the work, the story flips back and forth between the 21st and 22nd centuries, giving us two casts of characters, both of which engage in adultery and, in the earlier century, crime.  These intrigues determine the fate of the poem, but I won’t give away the ending. The novel starts a bit slowly, but builds momentum to a roller-coaster finish.  And yes, it’s the best novel of McEwan’s I’ve read since Atonement.

This one I recommend highly. I keep hoping that McEwan, like Kazuo Ishiguro, will win a Nobel Prize, for he’s pretty close to that caliber. (I tend to lump the two authors together for some reason.) Do read it if you like good fiction, especially if you like dystopian fiction. Two thumbs up!

By the way, the novel makes constant references to things going on in 2014: cellphones, social media, and people prominent today. I was surprised to find on p. 282 (near the end) a reference to Steve Pinker. In the earlier century, the pompous poet Francis and his wife invite a couple over to dinner, and the man, named Chris, who is relatively uneducated, uses the word “hopefully” in a sentence to mean “I hope”. That was (and still is, to me) a faux pas, and Francis rebukes the speaker at the dinner table, saying that he doesn’t want to hear that word in his house again. (What a twit!) But at a later dinner, Chris, rebuked again for the same word, takes Francis apart, showing how he used the word properly and noting that, in addition, a bloke named Pinker said it was okay (I presume this is in Pinker’s book The Sense of Style). Here’s the passage from p. 282. Chris is speaking, explaining how he discovered that it’s okay to say “hopefully”:

“I don’t know a thing. First time Francis jumped down my throat, I look on Harriet’s shelves. She pointed me towards Burchfield’s Fowler and a bloke called Pinker. Seems like some ignorant snob years back picked on hopefully, and a mob of so-called educated speakers got intimidated and joined in and scared each other into never using the word and crapping on anyone who did. Pathetic!”

Below is the book with a link to the publisher. Read it. And, of course, my reviews hopefully will prompt readers to tender their own recommendations. If you have such a book, please name it and tell us why you liked it in the comments below.