Short takes: An excellent movie and a mediocre book

January 21, 2026 • 11:30 am

In the last week I’ve finished watching an excellent movie and reading a mediocre book, both of which were recommended by readers or friends. I rely a lot on such recommendations because, after all, life is short and critics can help guide us through the arts.

The good news is that the movie, “Hamnet,” turned out to be great. I had read the eponymous book by Maggie O’Farrell in 2022 (see my short take here), and was enthralled, saying this:

I loved the book and recommend it highly, just a notch in quality behind All the Light We Cannot See, but I still give it an A. I’m surprised that it hasn’t been made into a movie, for it would lend itself well to drama. I see now that in fact a feature-length movie is in the works, and I hope they get good actors and a great screenwriter.

They did. Now the movie is out, and it’s nearly as good as the book. Since the book is superb, the movie is close to superb: excellent, if perhaps not an all-time classic, but always worth watching. Author O’Farrell co-wrote the screenplay with director Chloé Zhao, guaranteeing that the movie wouldn’t stray too far from the book. As you may remember, the book centers on Agnes, another name for Shakespeare’s wife Anne Hathaway, a woman who is something of a seer (the book has a bit of magical realism). The story covers the period from the meeting of Shakespeare and Agnes until Shakespeare writes and performs “Hamlet,” a play that O’Farrell sees as based on the death from plague of their only son Hamnet (another form of the name Hamlet; spellings were apparently variable in England then). I won’t give away the plot of the book or movie, which are the same, save to say that the movie has a bit less magic and a little more of Shakespeare’s presence. (He hardly shows up in the book.)

The movie suffers a bit from overemotionality; in fact, there’s basically no moment in the movie when someone is not suffering or in a state of high anxiety. But that is a quibble. The performances, with Jessie Buckley as Agnes and Paul Mescal as Shakespeare, are terrific. Buckley’s is, in fact, Oscar-worthy, and I’ll be surprised if she doesn’t win a Best Actress Oscar this year. The last ten minutes of the movie focus on her face as she watches the first performance of “Hamlet” in London’s Globe Theatre, and the gamut of emotions she conveys in a single close shot is a story in itself. Go see this movie (bring some Kleenex for the end), but also read the book. Here’s the trailer:

On to the book. Well, it was tedious and boring, though as I recall Mother Mary Comes to Me, by Indian author Arundhati Roy, was highly praised. Roy’s first novel, The God of Small Things, won the Booker Prize and I loved it; her second, The Ministry of Utmost Happiness, was not as good. I read Mother Mary simply because I liked her first book and try to read all highly touted writing from India, as I’ve been there many times, I love to read about the country, and Indian writers are often very good.

Sadly, Mother Mary was disappointing. There’s no doubt that Roy has had a tumultuous and varied life, and the memoir centers on her relationship with her mother (Mary, of course), a teacher in the Indian state of Kerala. The two share a stormy connection that, no matter how many times Roy flees Kerala, is always on her mind. It persists through Roy’s years in architecture school, her marriage to a rich man (they had no children), her later discovery of writing, and her entry into Indian politics, including time spent with Marxist guerrillas and campaigning for the peaceful treatment of Kashmiris.

The book failed to engage me for two reasons. First, Mother Mary was a horrible person, capable of being lovable to her schoolchildren one moment and a nasty, vindictive witch the next. She was never nice to her daughter, and the book failed to explain (to me, at least) why the daughter loved such a hateful mother. There’s plenty of introspection, but nothing convincing. Since the central thread of the memoir is this abiding mother/daughter relationship, I was left cold.

Further, there’s a lot of moralizing and proselytizing, which is simply tedious. Although Roy presents herself as self-effacing, she comes off as a hidebound and rather pompous moralist, something that takes the sheen off a fascinating life. Granted, there are good bits, but overall the writing is bland. I would not recommend this book.

Two thumbs down for this one:

Of course I write these small reviews to encourage readers to tell us what books and/or movies they’ve encountered lately, and whether or not they liked them. I get a lot of good recommendations from these posts; in fact, it was from a reader that I found out about Hamnet.

Michael Shermer interviews Matthew Cobb on his Crick biography

January 18, 2026 • 9:45 am

Here we have an 83-minute interview of Matthew Cobb by Michael Shermer; the topic is Francis Crick as described in Matthew’s new book Crick: A Mind in Motion. Talking to a friend last night, I realized that the two best biographies of scientists I’ve read are Matthew’s book and Janet Browne’s magisterial two-volume biography of Darwin (the two-book set is a must-read, and I recommend both volumes, though Princeton will issue a one-volume condensation in June).

At any rate, if you want an 83-minute summary of Matthew’s book, or want to see whether you should read it (you should), have a listen to his exposition at the link below. I have recommended his and Browne’s books because they’re not only comprehensive but eminently readable, and you can get a sense of Matthew’s eloquence from his off-the-cuff discussion with Shermer.

Click below to listen.

I’ve put the cover below because Shermer mentions it at the outset of the discussion:

My brief interview of Matthew Cobb about his new biography of Francis Crick

January 7, 2026 • 11:00 am

Matthew Cobb’s new biography of Francis Crick has been out for only a short time, but I’ve yet to see a review that is less than enthusiastic (check out this NYT review). I finished it last week and was also enthusiastic, finding it one of the best biographies of a scientist I’ve ever read. It concentrates on Crick’s science, but his accomplishments were inseparable from his personality, which embraced not only science but also poetry (the book begins and ends with a poet), drugs, women, and philosophy (he was, by the way, a hardcore atheist and determinist).

But I digress. I really recommend that if you have any interest in the man and his work, which of course includes helping reveal the structure of DNA, you get this book and read it. It is a stupendous achievement, based on tons of research, sleuthing, and interviews, and only a geneticist could have written it. But it’s not dull at all: Matthew has always written lively and engaging prose. Crick is also a good complement to Matthew’s previous book, Life’s Greatest Secret, about how the genetic code was cracked.

As a complement, a biography of Jim Watson by Nathaniel Comfort is in the works, but hasn’t yet been published.

After I finished the book, I had a few questions about Crick and his work, and asked Matthew if I could pose them to him and post his answers on this site. He kindly said “yes,” and so here they are. My questions are in bold; Matthew’s answers are in plain text. Enjoy:

What one question would you ask Crick if he could return from the dead? (Perhaps something that you couldn’t find out about him from your research.)

I think I would probably ask him about his view of the state of consciousness research. His key insight, with Christof Koch, was that rather than trying to explain everything about consciousness, researchers should look for the neural correlates of consciousness – neurons that fired in a correlated manner with a visual perception – and ask what (if anything) was special about how they fired, their connections, and the genes expressed within them. Since his death, we have obtained recordings from such neurons, but far from resolving the issue, consciousness studies have lost their way, with over 200 different theories currently being advanced. What did he think went wrong? Or rather, is it time to use a more reductionist approach, studying simpler neural networks, even in animals that might not be thought to be conscious?


Why did it take ten years—until the Nobel prize was awarded—for people to appreciate the significance of DNA?

Most people imagine that when the double helix was discovered it immediately made Watson and Crick globally famous and the finding was feted. That was not the case, mainly because the actual evidence that DNA was the genetic material was restricted to Avery’s 1944 work on one species of bacterium (this was contested) and a rather crappy experiment on bacteriophage viruses (this was the famous paper by Hershey and Chase from 1952; the experiment was so messy that Hershey did not believe that genes were made solely of DNA). So although the structure of DNA was immediately obvious in terms of its function – both replication and gene specificity, as it was called, could be explained by reciprocal base pairs and the sequence of bases – there was no experimental proof of this function. Indeed, the first proof that DNA is the genetic material in eukaryotes (organisms with a nucleus, including all multicellular organisms) did not appear until the mid-1970s! Instead, people viewed the idea that DNA was the genetic material as a working hypothesis, which became stronger through the 1950s as various experiments were carried out (e.g., Meselson and Stahl’s experiment on replication) and theoretical developments were made (e.g., Crick’s ideas about the central dogma). It’s notable that the Nobel Prize committee awarded the prize in 1962, just after the first words in the genetic code were cracked and the relation between DNA, RNA and protein had been experimentally demonstrated.


A lot of the latter part of the book is on Crick’s work on neuroscience (and, later, consciousness). You claim that he made enormous contributions to the field that really pushed it forward. Could you tell us a bit about what those contributions were?

Although he did not make a great breakthrough, he helped transform the way that neuroscience was done, the ideas and approaches it used. From the outset – a 1979 article in a special issue of Scientific American devoted to the brain – he focused attention on one particular aspect of brain function (he chose visual perception), the importance of theoretical approaches rooted in neuroanatomy, the need for detailed maps of brain areas and the promise of computational approaches to neural networks. All these things shaped subsequent developments – in particular the work on neural networks, which he played a fundamental part in, and which gave rise to today’s Large Language Models (he worked with both Geoffrey Hinton and John Hopfield, who shared the 2024 Nobel Prize in Physics for their work on this in the 1980s). And, of course, he made the study of consciousness scientifically respectable, taking it out of the hands of the philosophers who had been tinkering with the problem for three thousand years and hadn’t got anywhere. Later, in a perspective article he published on the last day of the old millennium, he reviewed recent developments in molecular biology and predicted that three techniques would become useful: classifying neurons not by their morphology but by the genes that are expressed in them, using genetic markers from the human genome to study the brains of primates (the main experimental system he advocated using), and controlling the activity of neurons with light by using genetic constructs. All three techniques – now known as RNAseq, transcriptional mapping and optogenetics – are used every day in neuroscience labs around the world. Indeed, within a few months of the article appearing, Crick received a letter from a young Austrian researcher, Gero Miesenböck, telling him that his lab was working on optogenetics and the results looked promising.
During his lifetime, Crick’s decisive leadership role was well known to neuroscientists; now it has largely been forgotten, unfortunately.


Is there anything a young scientist could learn from Crick’s own methods that would be helpful, or was he a one-off whose way of working cannot be imitated?

I think the key issue is not so much Crick as the times in which he worked. As he repeatedly acknowledged, he was amazingly lucky. From 1954-1977 he worked for the Medical Research Council in the UK. He did no teaching, no grading, was not involved in doctoral supervision (I’m not even clear how many PhD students he technically supervised – 4? 3? 5? – which highlights that even if he had his name on a bit of paper, he had little to do with any of them). Apart from a couple of periods, he had no administrative duties, and only one major leadership post, at the Salk, which nearly killed him. He wrote one major grant application at the Salk (the only one he ever wrote), but basically he was funded sufficiently well to simply get on with things. And what did he do? ‘I read and think,’ he said. Try getting that past a recruitment or promotions panel today! In a way, the onus for the creation of more Cricks does not lie with young researchers, but with established scientists – they need to allow young people the time to ‘read and think’, and value failure. Most ideas will turn out to be wrong; that’s OK. Or at least, it was to Crick. Many senior researchers (and funders) don’t see things that way. However, even without such changes, young scientists can adopt some of Crick’s habits. Here’s my attempt to sum up what I think were the lessons of his life and work:

  • Read widely and avidly, even engaging with ideas that might seem eccentric or pointless, as ‘there might be something in it’ (one of his favourite phrases).
  • Talk through your ideas with your peers – try to find the weak spots in each other’s arguments.
  • At least in the initial stages of research, don’t get bogged down in the details that might counter your interpretation/theory – Crick and Brenner called this the ‘don’t worry’ approach. They figured that unconnected contrary data points might not undermine their ideas, and would eventually turn out to have specific, varied explanations.
  • Write down your ideas in the form of memos or short documents (keep them short). Writing helps you clarify your ideas and shapes your mind – do not use AI to do this! You can then share your writing with peers, where it can serve as a target for discussion and debate.
  • Master the art of clear writing. Avoid jargon, keep your ideas straightforward. Again, the only way to develop this skill is to write – badly at first. So rewrite, edit, recast your writing – it will improve your thinking.
  • Above all, make sure that the science you do is *fun*. That was a word that Crick repeatedly used, and he genuinely got great pleasure from doing science and thinking about it. Seek out an area in which you can have fun and aren’t bogged down by drudgery.

Click below to get the book on Amazon:

A book recommendation: Ian McEwan’s “What We Can Know”

November 26, 2025 • 11:00 am

I decided when I read the NYT review of Ian McEwan’s latest (and 18th) novel, What We Can Know, that I had to read the book.  (Click the screenshots to read the review if you have NYT access, or find the review archived here.)  I quote some of the encomiums from the review:

Ian McEwan’s new novel, “What We Can Know,” is brash and busy — it comes at you like a bowling ball headed for a twisting strike. It’s a piece of late-career showmanship (McEwan is 77) from an old master. It gave me so much pleasure I sometimes felt like laughing.

McEwan has put his thumb on the scale. This is melodramatic, storm-tossed stuff. There is murder, a near kidnapping, a child hideously dead of neglect, multiple revenge plots, buried treasure and literary arson. Writers treat other writers’ manuscripts and reputations the way Sherman treated Georgia. No one is a moral paragon.

. . . I’m hesitant to call “What We Can Know” a masterpiece. But at its best it’s gorgeous and awful, the way the lurid sunsets must have seemed after Krakatau, while also being funny and alive. It’s the best thing McEwan has written in ages. It’s a sophisticated entertainment of a high order.

I had to get it via interlibrary loan, and since it’s new it took some time. But I did get it, and read the 300-page book in a week. And yes, it’s excellent.


I’m a fan of McEwan, and especially like his novels Atonement (made into a terrific movie) and the Booker-winning Amsterdam. This one also does not disappoint. The NYT gives a plot summary, but I’ll just say that it’s a novel about a poem, and the action takes place over two years more than a century apart: 2014 and 2119. A well-known British poet named Francis laboriously pens a “corona” poem for his wife Vivien on her 53rd birthday. It would be hard to write a normal corona, much less one that, like this one, is said to be a masterpiece. Here’s what the form comprises, according to Wikipedia:

A crown of sonnets or sonnet corona is a sequence of sonnets, usually addressed to one person, and/or concerned with a single theme. Each of the sonnets explores one aspect of the theme, and is linked to the preceding and succeeding sonnets by repeating the final line of the preceding sonnet as its first line. The first line of the first sonnet is repeated as the final line of the final sonnet, thereby bringing the sequence to a close.

Imagine how hard that would be to write: the first lines, put in sequence at the end, have to form a stand-alone sonnet that also rhymes properly! To see an example, go here, though that corona includes only 12 sonnets rather than 14. At any rate, Francis’s poem gains a national reputation even though Francis won’t let it be reproduced or published; it is read aloud on Vivien’s birthday to a dozen guests and then given to her, handwritten on vellum. But only Vivien ever sees the written poem.

Over a hundred years later, with the world devastated by nuclear exchanges, global warming, and skirmishes, a scholar named Thomas Metcalfe, specializing in poetry of the early 2000s, decides to track down the corona to see why it was so renowned despite being unpublished (a nostalgia for the past pervades the 22nd century). As he searches for the work, the story flips back and forth between the 21st and 22nd centuries, giving us two casts of characters, both of which engage in adultery and, in the earlier century, crime.  These intrigues determine the fate of the poem, but I won’t give away the ending. The novel starts a bit slowly, but builds momentum to a roller-coaster finish.  And yes, it’s the best novel of McEwan’s I’ve read since Atonement.

This one I recommend highly.  I keep hoping that McEwan, like Kazuo Ishiguro, will win a Nobel Prize, for he’s pretty close to that caliber. (I tend to lump the authors together for some reason.) But do read it if you like good fiction, and dystopian fiction even more. Two thumbs up!

By the way, the novel makes constant references to things going on in 2014: cellphones, social media, and people prominent today. I was surprised to find on p. 282 (near the end) a reference to Steve Pinker. In the earlier century, the pompous poet Francis and his wife invite a couple over to dinner, and the man, named Chris, who is relatively uneducated, uses the word “hopefully” in a sentence to mean “I hope.” That was (and is to me) a faux pas, and Francis rebukes the speaker at the dinner table, saying that he doesn’t want to hear that word in his house again. (What a twit!) But at a later dinner, Chris, rebuked again for the same word, takes Francis apart, showing that he used the word properly and that, in addition, a bloke named Pinker said it was okay (I presume this refers to Pinker’s book The Sense of Style). Here’s the passage on p. 282, in which Chris explains how he discovered that it’s okay to say “hopefully”:

“I don’t know a thing. First time Francis jumped down my throat, I look on Harriet’s shelves. She pointed me towards Burchfield’s Fowler and a bloke called Pinker. Seems like some ignorant snob years back picked on hopefully, and a mob of so-called educated speakers got intimidated and joined in and scared each other into never using the word and crapping on anyone who did. Pathetic!”

Below is the book with a link to the publisher. Read it. And, of course, my reviews hopefully will prompt readers to tender their own recommendations. If you have such a book, please name it and tell us why you liked it in the comments below.

A book recommendation: “The Overstory”

November 11, 2025 • 9:15 am

I don’t often recommend books—well, at least not on a weekly basis—but I’ve just polished off one that ranks among the best books I’ve read in the past several years. It’s a Pulitzer winner (when I’m trawling for fiction I check the Booker and Pulitzer winners): The Overstory (2018) by Richard Powers (see the Wikipedia entry here). I recently finished another fat book of his, The Time of Our Singing (2003), and while it was engrossing and worth reading, I was somewhat put off by Powers’s use of language that seemed show-offy, as well as the emphasis on music, which made me (a classical-music ignoramus) unable to fully appreciate a lot of the allusions.

This time the book is not about music but about trees. Well, about how trees and the desire to preserve them affect the lives of nine people. Some of the characters work on trees, like a biologist who discovers how trees communicate with one another (I think she’s modeled on Suzanne Simard), while another, a Sikh computer programmer (all the characters are in North America), has his tree encounter by falling out of one, which paralyzes him from the waist down. Most of the others are deeply into ecology, motivated by knowing that trees are essential for our well-being and the health of the planet, and are being cut down at an incredible rate. All but one of the characters (the computer guy) become eco-activists, chaining themselves to trees and, ultimately, committing crimes against tree-cutting firms, which they see as evil. The stories have a lot of sadness, but also a lot of joy, and you follow the characters from youth through old age: the narrative is chronological but jumps from person to person. A sense of doom hangs over the whole story, with repeated references to our future demise via global warming and loss of natural habitat.

The title surely refers to the view that trees are more important than transitory humans (humans are surely the “understory”), and that the wanton cropping of forests is immoral.  That is the “truth” of this book, but it’s not a universal truth, as many clearly don’t agree with it. But if you think that humans are exacerbating global warming and creating a potential catastrophe, as I do, then yes, that warning is real. But we don’t have to learn it from this book; scientists have already told us that.

Powers still plays with language in a way that’s occasionally irritating, but the story is mesmerizing—a page-turner, even though it’s 500 pages long. Here are a few plaudits from the book’s Amazon page, just to let you know that others find it equally impressive. And remember, it got a Pulitzer, which is usually, though not inevitably, a guarantee of literary quality. (I find the Booker a more reliable guide.)

“It changed how I thought about the Earth and our place in it…. It changed how I see things and that’s always, for me, a mark of a book worth reading.”
― Barack Obama

“The best book I’ve read in 10 years. It’s a remarkable piece of literature, and the moment it speaks to is climate change. So, for me, it’s a lodestone. It’s a mind-opening fiction, and it connects us all in a very positive way to the things that we have to do if we want to regain our planet.”
― Emma Thompson

“Monumental… The Overstory accomplishes what few living writers from either camp, art or science, could attempt. Using the tools of the story, he pulls readers heart-first into a perspective so much longer-lived and more subtly developed than the human purview that we gain glimpses of a vast, primordial sensibility, while watching our own kind get whittled down to size.… A gigantic fable of genuine truths.”
― Barbara Kingsolver, The New York Times Book Review

“The best novel ever written about trees, and really, just one of the best novels, period.”
― Ann Patchett

“I’ll never see a tree the same way again.”
― Louis Sachar, New York Times

“I’ve read a lot of good books, but the last truly great book I read was The Overstory, by Richard Powers.”
― Ed Helms, New York Times Book Review

And yes, The Overstory may change your life by affecting how you view trees, and that is reason enough to read it. That, of course, is not a “truth” but a “way of feeling,” which is one of the reasons to read books like this.

You can get this big book for only eleven bucks on Amazon, or thirteen for the hardback. It would make a wonderful holiday gift for your friends who love nature.

Finally, as always, I proffer this not just as a recommendation, but also as a solicitation. What books have you read lately that you’d recommend to other readers? I get many good recommendations from posts like these, so don’t be hesitant to tell us what you like.

The cover:

Matthew’s biography of Francis Crick gets a glowing review in Nature

November 3, 2025 • 10:00 am

Matthew’s new biography of Francis Crick is the third one published, but, according to this glowing review in Nature, is by far the best of the lot. I’ve read a lot of it in draft and, while I can’t compare it to the other two, I can tell you that Matthew’s is worth buying and reading, and you don’t have to be a biologist to understand it. Just have a gander at the final assessment of reviewer Georgina Ferry:

Of Crick’s three biographers, Cobb comes closest to making the case that Crick belongs in the scientific pantheon alongside Isaac Newton, Charles Darwin and Albert Einstein, arguing that “Crick’s thinking changed how the rest of us see the world”. Ridley’s book (Francis Crick: Discoverer of the Genetic Code, 2006) is an entertaining primer but brief, unreferenced and unindexed. In his authorized biography Francis Crick: Hunter of Life’s Secrets (2009), Olby is as thorough as Cobb but perhaps more reverent, glancing coyly at Crick’s preoccupations with drugs and sex, whereas Cobb makes them essential accessories to his intellectual pursuits.

Ferry’s review (click on the headline below to read, or find it archived here) occupies nearly three pages of the journal—the longest book review I’ve seen in Nature. That alone tells you of the book’s importance. Matthew must be chuffed (in fact, he told me so), and the only other review he needs now is a good one in the New York Times. I hope they’re reviewing it, for Crick was one of the greatest scientists of our era, though the NYT often pays scant attention to science books.

Click to read the review. And yes, there are drugs and sex.

Crick is best known to the layperson as the co-discoverer, with J. D. Watson, of the structure of DNA, but he did far more, including hypothesizing the existence of a three-base code for amino acids and of messenger RNA to carry the code into the cytoplasm to make proteins, and formulating the “central dogma,” best characterized as “information can go from DNA to protein, but information cannot get from the protein back to the genetic material.”

Now Crick, like his contemporary polymath J. D. “Sage” Bernal, was no saint, at least not if saintliness is what you demand of great scientists. He was a complex human being, and that complexity, including affairs and drug-laced parties, is part of Crick’s life. But it can also be seen as instantiating the same tendencies that helped make his career: his need to interact with others, and his desire to open the “doors of perception” when he worked on consciousness at the end of his life.

Let me give just a few quotes from the review. I tell you, had I written this book I’d be popping champagne corks today:

In a magisterial new biography, Crick, zoologist and historian Matthew Cobb revisits the double-helix breakthrough, a discovery he discussed in forensic detail in his book Life’s Greatest Secret (2015). Yet, this time, the publication of the structure and the immediate aftermath of the discovery occupy just 41 pages. Instead, Cobb explores how Crick’s thinking, writing and interactions with others transcended that brilliant, yet contested, episode, revolutionizing molecular biology and influencing evolutionary and developmental biology, visual neuroscience and ideas about consciousness.

At the same time, he makes a more sustained attempt than either of Crick’s previous biographers (Matt Ridley and Robert Olby) to answer several questions. Who was Crick? What kind of person was he? What did he care about?

Crick was notoriously reluctant to divulge personal information or even have his photograph taken. Combing through a remarkably comprehensive set of personal and professional archives with meticulous attention to detail, Cobb has reconstructed Crick’s relationships with those who were essential crew mates on his intellectual odyssey.

People will of course be curious about the Rosalind Franklin episode in the elucidation of DNA’s structure, though the whole DNA-structure narrative occupies only about 40 pages in the book. Matthew’s view is outlined below, and I believe he’s written on this site that Franklin should have gotten the Chemistry Nobel Prize with Wilkins, but she died of ovarian cancer before such a prize could be awarded (Nobels are not given posthumously).

Cobb presents the double-helix story as much more of a collaboration with chemist Rosalind Franklin and biophysicist Maurice Wilkins at King’s College London than Crick and Watson acknowledged in their iconic 1953 paper (J. D. Watson and F. H. C. Crick, Nature 171, 737–738; 1953). He exonerates Crick and Watson of theft, but not of bad manners. “They should have requested permission to use the data,” Cobb writes. “They did not.”

The elucidation of the triplet code and the mechanism for translating it into proteins was done by Crick in association with Sydney Brenner, who won his own Nobel Prize much later, in 2002. And this collaboration brings up some of the “unsaintly” behavior of Crick. From the review:

These landmark findings involved numerous experiments overseen by Brenner’s highly skilled research assistant, Leslie Barnett; Crick himself was notoriously clumsy in the laboratory. Cobb acknowledges her “vital” role but we learn nothing about her as a person. Various long-suffering secretaries also appear fleetingly: they formed part of Crick’s essential support system, some became close friends, and it would have been good to hear more of their voices (and perhaps less of Kreisel’s). As for the lovers, they drift by like ghosts: noted, occasionally quoted, but not identified. “Not our business”, says Cobb.

After this period, Crick was fruitlessly distracted by problems of development and the origin of life, going “off the rails” according to the reviewer. But then he found his footing again when he moved to the Salk Institute in 1977 and began working on consciousness.

. . . for the rest of his life focused mainly on tackling the second of the two problems that he had identified at the outset of his career: the basis of human consciousness. Homing in on the question of how humans experience the visual world, he once again became a brilliant influencer and synthesizer of ideas from both neuroscience and machine learning. His 1994 book The Astonishing Hypothesis argued that all conscious experience stems from brain activity and nothing else; however, it fell short of explaining how. Although this theory was not particularly astonishing to most neuroscientists, it made an enormous public impact.

Well, Crick was certainly right about that: where else could consciousness come from, unless it's some supernatural phenomenon outside the ambit of physics? Yet the neurological basis of consciousness is still contested both by scientists (including the deluded "panpsychists" who think that everything in the Universe is conscious) and by laypeople who haven't thought about the problem. The problem, of course, is connected with determinism, and Crick was certainly a determinist. As I've written elsewhere, J. D. Watson told me that he and Crick were motivated to find the structure of DNA partly to demonstrate that the "secret of life" had a purely chemical and materialistic basis.

Here’s the final paragraph of the review: the cherry on the sundae:

Cobb is reliably excellent in maintaining the narrative momentum of a life in science that was anything but mundane. His gripping and accessible account is generous while calling out flaws as he sees them, and discreet when that could hurt the feelings of living friends and relatives. What made Crick Crick, he argues, was his lifelong attempt to “chase the intellectual high” produced by flashes of unique insight. Crick was not, he concludes, a saint or a hero but “an extraordinarily clever man with limits to his interests and perception”.

Are you ready to read the book now? I hope so, and note that I get nothing out of blurbing it here. I do get an autographed copy, though, for having helped Matthew find a fact about baseball in the book (box scores are forever).

You can order the Crick bio from the UK by clicking on the screenshot of the British version below, or here if you’re in the US. And of course there’s always Amazon. The book comes out in three days in the UK and on November 11 in the U.S. (The UK cover is much better, but the contents are identical.)

Kathleen Stock leaves her lane, says that creationist arguments “undermine her faith in science”

October 10, 2025 • 10:15 am

Having read one of her books (Material Girls: Why Reality Matters for Feminism), which I liked, and knowing how Kathleen Stock (OBE) was hounded out of the University of Sussex for her gender-critical views but has stood her ground since, I've been an admirer, though I haven't followed her doings much. I see this from Wikipedia:

On 9 March 2023, Stock, alongside tennis player Martina Navratilova and writer Julie Bindel, launched The Lesbian Project.  The purpose of the Lesbian Project, according to Stock, is “to put lesbian needs and interests back into focus, to stop lesbians disappearing into the rainbow soup and to give them a non-partisan political voice.”

Stock is a lesbian, and as you see above, she doesn't want gay women stirred into the "rainbow soup" with the "T"s. Yet, at least from that book, I don't see Stock as a transphobe, but rather as someone who thinks hard about the slippery concept of "gender" and who doesn't see transwomen as fully equivalent to natal women.

But I have to ratchet back some of my admiration for Stock in view of what she has just published: a semi-laudatory review of a creationist/ID book, God, the Science, the Evidence by Michel-Yves Bolloré and Olivier Bonnassies. The Sunday Times also extolled the book (which is a bestseller, by the way); I dissected some of its arguments here. As far as I can tell—and the book isn't yet available to me—the authors give the standard creationist guff touting a "God of the Gaps", arguing that things science doesn't yet understand, like how the Universe began or how life began, are prima facie arguments for God. Of course, things we now have scientific explanations for, like lightning and plague, were once prima facie arguments for God as well, but the new book apparently sees the existence of a complicated god as more parsimonious than saying "we don't yet know, but all the evidence given for God that science has investigated has proven to be purely materialistic."

Sadly, Stock has somewhat fallen for the God of the Gaps, to the extent that the book has “undermined her faith in science”.

If you subscribe to UnHerd, you can read Stock’s hyperbolically-titled review by clicking on the screenshot below, or you can read it for free as it’s archived here.

The very beginning of the review, in which Stock invokes "the eternal truths of religion", gets things off to a bad start. (Is she joking here? I don't think so.) There's the usual incorrect claim that religiosity is increasing in the West. Then she says the god-of-the-gaps arguments have weakened her faith in science. Bolding is mine, and excerpts from Stock's review are indented:

The eternal truths of religion are having a moment. Church pews are filling up with newcomers. Gen Z is earnestly discussing demons and sedevacantism on social media. This might, therefore, seem like a good time to publish a book which purports to lay out a positive empirical case for the existence of a supreme being.

God, the Science, the Evidence by Michel-Yves Bolloré and Olivier Bonnassies, out this week in English, is already a best-seller in Europe. It comes with endorsements from various luminaries, including a Nobel Prize winner in physics. Reading it hasn’t affected my religious tendencies either way, but it has definitely undermined my faith in science.

Leibniz once asked: “Why is there something rather than nothing?” Bolloré and Bonnassie’s answer is that God originally decreed “let there be something”; and they think that 20th-century developments in physics, biology, and history support this hypothesis. Their basic strategy in the book is to keep asking “What are the chances of that?” in a sceptical tone, concluding that only the truth of Christianity can explain otherwise unlikely natural circumstances.

Now Stock isn't completely laudatory about the book, especially its "Biblical" evidence for God (see below). But saying that her faith in science has been weakened by God-of-the-Gaps arguments means she thinks the priors for God have increased: that the scientific evidence bearing on the Big Bang, the "fine-tuning" of the universe, the complexity of single-celled organisms, and "the stunning efficiency of the double helix" has been weakened, strengthening the evidence for God, or at least for something divine. But even here she waffles, ultimately concluding that these arguments "empty nature of mystery":

Fine-tuning arguments remain interesting, though. Ultimately, they don’t work to rationally justify Christianity, or indeed any other kind of concrete theology, because of the large gaps they leave. One big problem is about how to calculate the probabilities of physical laws being as they are; for on many secular views of the laws of nature, their being different from the way they are is, precisely, physically impossible. But even leaving aside that technical issue, God’s intentions in designing the universe still look worryingly vague: what was He calibrating the background physical laws for, exactly? Was it just to bring carbon into the universe; or carbon-based life forms, generally; or humans, specifically; or even just one human in particular — Liz Truss, say, or Craig from Strictly? Why did He adopt so painfully indirect and slow a manner of implementation, and not just magic up the Garden of Eden in a trice instead, like a pop-up at the Chelsea Flower Show? The nature of God also looks pretty vague, defined only as whoever it was that came up with the floorplans: are we talking just one cosmic architect, or a committee?

Effectively, then, though fine-tuning arguments empty nature of mystery, treating it like a piece of machinery we might one day fully understand, they return all the obscurity to God.

The problem is that although her points against fine-tuning are decent, and she raises several other arguments against a divine origin, she rejects the creationist arguments not because there are materialist explanations for fine-tuning, but because she wants these things to remain a mystery. I suspect this because of what she says at the end of her piece:

Perhaps, then, we are at an impasse: two mutually incompatible explanations of how we got here, each with its own measure of confusion and darkness. We could just stipulate that a creationist God, by definition, gets all the glamorous mystery, while the material world gets rational comprehensibility; He is whatever started things off, but that which we cannot otherwise hope to know. Or perhaps — and this would be my preference — we could give up flat-footed quests to prove the existence of the supernatural by rational means; we could start becoming alert to immanence, rather than simply hypothesising transcendence. That is: we could stop treating the natural world as if it were an Agatha Christie novel, where the only real mystery is how exactly the body got into the library.

I prefer our flat-footed attempts to explain things materialistically instead of becoming “alert to immanence,” whatever that means.  What we see throughout the review is Stock not just sitting on the fence, but pirouetting on it, going from one side to the other.  I still don’t know why her faith in science has been undermined, as God-of-the-Gaps arguments have been around for decades, if not centuries.  I would note that my faith in Stock has been undermined.

But in one area her review is good. As I said, it's in treating the "evidence" that Bolloré and Bonnassies adduce for God from the Bible. Here's a bit:

Or take the authors’ argument that the historical Jesus must have been the Messiah, by attempting to rule out more prosaic rival explanations. Jesus can’t have been just another wise sage wandering round the Levant, they suggest, because he sometimes said crazy things. Equally, though, he can’t have been a crazy man, because he sometimes said wise things. The possibility that both sages and madmen sometimes have days off seems not to have occurred. The next chapter is of similar argumentative quality: could the Jewish race have lasted so long, been so intensely persecuted, yet achieved so much — including producing “the most sold book in history” and achieving “many unexpected and spectacular military victories” — had God not been intervening on their behalf all along?

By the time you get to the book’s treatment of the Fatima “sun miracle” — not to mention the authors’ insinuation that God instigated it in order to precipitate the Soviet Union — images of Richard Dawkins leaping around with glee and punching the air become irresistible. As chance would have it, only this week Scott Alexander published his own, much more rigorous, exploration of the Fatima sun miracle than the one offered by Bolloré and Bonnassies in their chapter. I recommend that they take this as a sign from God, and give up the explanation game forthwith.

If you have the patience, do read Scott Alexander's very long piece on the Fatima "sun miracle." (Spoiler: he suggests a naturalistic explanation.)

What I don't understand about Stock and her review, then, are four things:

1.) If Stock, as a philosopher, can skillfully debunk Biblical miracles, why doesn't she adduce the naturalistic explanations for fine-tuning, the origin of life, and the complexity of one-celled organisms? Granted, she does raise questions about why God would make the universe as it is, but she stops there.

2.) How exactly did the book "undermine her faith in science"? She's not clear about this. Does she find God-of-the-gaps arguments somewhat convincing?

3.) What does she mean in the title "Science can't prove the ineffable"? "The ineffable" means "things that cannot be expressed in words." But of course stuff we don't yet understand can't be expressed in words simply because we don't understand it, not because there's something "transcendent" about it. If the title and subtitle are the work of an editor, well, I've always had the right to okay my titles.

4.) What is the "immanence" she speaks of? Is this the usual interpretation that God is to be found everywhere in the world instead of outside it? That is, is she a pantheist? If so, what evidence does she have for "immanence," or is that just something she chooses to believe? And does she worry about where this "immanence" she accepts comes from? If the Universe is really a god in itself, why could it not be a NOT-GOD in itself—that is, something purely naturalistic?

This is a murky review, ending without the reader knowing what Stock really thinks. That's unseemly for a philosopher.

h/t: Chris, Loretta