More by Matthew on Crick, Watson, and DNA

November 15, 2025 • 10:45 am

Matthew’s biography of Francis Crick just came out, and I’m delighted, as I’m sure he is, with the spate of glowing reviews. I haven’t seen a bad one yet, and some of them rate the book as superlative. It is certainly one of the best science biographies going, and I hope it wins the Royal Society Science Book Prize.

I’ll finish up my endorsements of the book (the reviews will keep coming, though) by highlighting two more: one in Science and the other in the Times of London. But first you can listen to Matthew talking about J. D. Watson, who just died, on this BBC show (Matthew’s bit, which is the only discussion of biology, goes from the beginning to 9:35). As Matthew says, “This is the most important discovery in biology since Darwin’s theory of evolution by natural selection. It transforms our understanding of heredity, of evolution–of everything to do with biology.”

The American voice you hear in the interview comes from an old recording of Watson himself.

The moderator then wants to discuss Watson’s sexism and racism, and Matthew eventually gets to it. First, though, Matthew discusses the involvement of Maurice Wilkins and Rosalind Franklin in the DNA structure, and says, as he always does, that the history was complicated, that the discovery was more collaborative than people think, but also that Crick and Watson failed to ask Franklin for permission to use her data, which was a scientific boo-boo. Watson’s further accomplishments are discussed (the Human Genome Project, the upgrading of Cold Spring Harbor Laboratory). The mention of Watson’s personal arrogance, sexism, and racism starts at 6:50, and Matthew manages to decry it (calling it a “terrible legacy”) without seeming nasty, something he’s good at.

Next, two reviews, the first in Science. It’s very positive, and I’ll give some excerpts (access should be free by clicking on the headline below).

In October 1958, Francis Crick and his wife, Odile, hosted a party at their house in Cambridge to celebrate Fred Sanger’s Nobel Prize in Chemistry. During the festivities, a rocket was launched from the roof terrace, which landed on the roof of a nearby church and necessitated the services of the local fire brigade (1). This otherwise inconsequential event is an apt metaphor for the scientific assault on mysticism and vitalism that the atheist Crick and his contemporaries helped pioneer through their pursuit of a new “chemical physics” of biology—an endeavor that would eventually help describe the nature of life itself. In his magnificent and expansive new biography, Crick: A Mind in Motion, Matthew Cobb forensically explores and electrifies this important chapter in the history of science through the exploits of one of its key protagonists.
Magnificent and expansive! You’ll be seeing those words on the cover.  And some of these, too:

Another intriguing theme Cobb explores is Crick’s friendship with the psychedelic beat poet Michael McClure (6). Crick was so taken by the charismatic poet, in particular, a stanza in McClure’s “Peyote Poem”—“THIS IS THE POWERFUL KNOWLEDGE / we smile with it”—that he pinned it onto a wall in his home. For Crick, the beauty inherent in the solution of a complex scientific problem and the aesthetic euphoria and sense of revelation it created were reminiscent of the perceptual effects of consuming a hallucinogenic compound, such as peyote.

Cobb also touches on Crick’s eugenicist proclamations and details some of his other disastrous forays into the social implications of science, which ultimately led him to permanently abstain from such activities. Crick’s notable lack of engagement with the 1975 Asilomar meeting, which sought to address the potential biohazards and ethics of recombinant DNA technology, was in stark contrast to Watson and biologist Sydney Brenner. Crick never explained his silence on the topic of genetic engineering (7).

Complex, energetic, freethinking, dazzling, and bohemian, Crick was also ruthless, immature, misogynistic, arrogant, and careless. The phage biologist Seymour Benzer noted that Crick was not a “shrinking violet.” Maurice Wilkins described Watson and Crick as “a couple of old rogues,” and Lawrence Bragg more politely observed that Crick was “the sort of chap who was always doing someone else’s crossword.” Cobb, however, has arrived at a somewhat more benign and nuanced interpretation of the events surrounding the discovery of the double helix, the collaborative nature of which, he asserts, was obfuscated by the fictional narrative drama of Watson’s bestseller The Double Helix.

Crick is set to become the definitive account of this polymath’s life and work. We must now wait patiently for historian Nathaniel Comfort’s upcoming biography of James Watson to complement it.

In my view, the phrase “definitive account of this polymath’s life and work” is really the most powerful approbation the book could get.

You can see the review from the Times of London by clicking below, or find it archived here:

If the age of the lone scientific genius has passed, was Francis Crick among its last great specimens? His name will for ever be bound to that of James Watson and their discovery in 1953 of the double-helix structure of DNA. Yet it is a measure of Crick’s influence that this breakthrough, transformative as it was, is done and dusted barely 80 pages into Matthew Cobb’s absorbing new biography.

Cobb, a zoologist and historian of science, presents Crick (1916-2004) as the hub round which a mid-century scientific revolution revolved — a researcher and theorist of unstoppable curiosity, who unravelled the secret code behind heredity before helping to reinvent the study of the mind and consciousness. More than 70 years on, it is easy to forget how penetrating Crick’s insights were — how, before he came along, we did not know how life copies itself and the molecular mechanism behind evolution was a mystery.

But Cobb’s book is no hagiography. Briskly paced, it concentrates on Crick’s scientific life, but also offers glimpses, some unflattering, of the man behind the lab bench. The picture it builds is of a brilliant, garrulous and often exasperating individual.

. . . Cobb writes with clarity and a touch of affection for his subject. His Crick is radical in science and conservative in temperament; deeply irreligious yet moved by poetry; a philanderer who adored his wife. Above all he is insatiably curious — a mind in motion, indeed. And yes, he may also represent something that may now be lost: the era when a single intellect could sit at the centre of a scientific revolution. Crick might be best known for his collaboration with Watson and his notorious debt to Franklin. However, in the crowded, collaborative landscape of 21st-century research, where knowledge advances by increments, achieved by vast teams who work with ever growing volumes of data, it is hard to imagine another individual whose ideas will so completely redefine the life sciences.

I’d call that a good review as well. Kudos to Dr. Cobb. I told him he should celebrate by going off on a nice vacation, but I’m betting he won’t.

Matthew on the subject of his latest book: Francis Crick

November 1, 2025 • 11:00 am

As I’ve mentioned several times, Matthew has written what is the definitive biography of Francis Crick, one of the great polymaths of our time. It comes out in the first two weeks of November.

Today you can see an article that Matthew wrote about the book for the Observer, but he and I both urge you to buy the book itself (the publisher’s site is here, a U.K. purchasing site is here, and the U.S. Hachette site, here, gives a 20% discount with the code CRICK20).

Click the headline to read the article for free:

But what is this about poetry?  Here are a few excerpts from the article.

In 1947, aged 31 and with his career in physics derailed by the war, Francis Crick, the future co-discoverer of the DNA double helix, returned to research, focusing on two fundamental biological problems: life and the brain. Over the following half century, he made decisive contributions to both these fields, becoming one of the most significant thinkers of the 20th century. In 1994, the Times hailed Crick as the “genius of our age”, comparing him to Isaac Newton, Mozart and Shakespeare, while after his death in 2004, parallels were drawn with Charles Darwin and Gregor Mendel.

Like Darwin, Mendel and Newton, Crick changed how the rest of us see the world. He drew out the implications of DNA structure, developed new ways of understanding life and evolution, and later convinced neuroscientists to adopt computational and molecular approaches, and to study the nature of consciousness.

Crick’s aim was not just to make discoveries about two fundamental scientific riddles; he also wanted to replace the superstitious and religious ideas that marked these questions. This did not mean he was stuffy or unimaginative – he was fascinated by the flux of perception and emotion he found in poetry, particularly the work of psychedelic Beat poet Michael McClure, who became a close friend. Poetry and science co-existed in his approach to the world.

Once, when I met with Jim Watson during one of his yearly visits to Chicago, he told me that part of the motivation for his and Crick’s attempt to find the structure of DNA was to confirm materialism (aka atheism): they wanted to show, as Watson told me, that the “secret of life” was a molecule that, in the right milieu, could produce a whole organism. More excerpts:

Crick’s scientific achievements have recently tended to be reduced to those few weeks in Cambridge in February 1953, when he and James Watson discovered the structure of DNA. The widely believed story that they stole the data of King’s College London researcher Rosalind Franklin is untrue: Watson and Crick knew of Franklin’s results and those of Crick’s close friend Maurice Wilkins, but they did not provide any decisive insight into the structure of DNA. Franklin knew that the pair had access to her data and bore no grudge; she soon became friendly with both men, and was particularly close to Crick and his wife, Odile.

Watson and Crick subsequently explained that had they not found the structure, then Franklin, or her colleague Wilkins, or someone else, would have done so – it was inevitable. Crick and Watson succeeded because they were lucky, smart, somewhat unscrupulous, and determined.

And the poetry:

The imaginative aspect to Crick’s thinking extended to his vocabulary. In 1953, he told a friend that the double helix made him swoon every time he thought of it; this was because of its beauty, a term he often used rather than the word “elegance”, frequently employed by physicists and mathematicians. Biological results are often messy and complex, not elegant. They are nevertheless beautiful, because of their evolutionary roots and the contingent factors that have shaped them.

This sense of beauty, of deep relationships underlying complex phenomena, drove Crick’s scientific work and was linked to his fascination with poetry. As he explained:

“I hope nobody still thinks that scientists are dull, unimaginative people… It is almost true that science itself is poetry enough for them. But there is no effective substitute for the subtle interplay of words and from time to time one becomes wearied by the exact formulations of science and longs for a poetry which speaks to one’s bones.”

But here I disagree with Crick:

Although Crick admired the works of WB Yeats and TS Eliot, by the mid-1960s he had fallen out of love with them because of their mystical views. As he explained in a letter to his friend, the novelist CP Snow, he felt “you can’t be a major poet without a solid foundation of silly ideas (almost everybody thinks Yeats’s ideas silly but to me Eliot’s are just as bad)”.

Yes, Yeats was a mystic, which of course is antiscientific, but both he and Eliot wrote poetry that was non-mystical (think of Yeats’s gorgeous “The Lake Isle of Innisfree” or Eliot’s “The Love Song of J. Alfred Prufrock”).

. . .That Crick’s otherwise penetrating mind never challenged his old prejudices and could not master political issues highlights that he was not a flawless hero nor – no matter what graffiti in 1960s Cambridge proclaimed – a candidate for the post of God. Instead, he was an extraordinarily clever man with limits to his interests and perception.

Crick’s withdrawal from cultural debates coincided with a series of shifts in his world. He and Odile moved from Cambridge to California, where he worked on neuroscience and consciousness at the Salk Institute in San Diego.

In his 50s, Crick used LSD and cannabis and became fascinated by Michael McClure’s materialist psychedelic poetry, which he admired for what he described as its fury and imagery and for its open embrace of biology: “When a man does not admit that he is an animal, he is less than an animal,” proclaimed McClure. Crick’s friendship with McClure ran through the second half of his life, and he did not see it as being in contradiction with his scientific views.

. . .In 2004, on the day that Crick died after a long illness, McClure completed what he described as his finest poem, dedicated to Crick. Full of the muscular sensation and vivid imagery that Crick appreciated, one stanza seems to represent McClure’s attempt to grapple with his friend’s inevitable end:

PERHAPS WE RETURN TO A POOL

– STEADY AND SOLID;

ready and already completed in fireworks

and lives and non-lives – thin and faint

as powerful odours stirring

my moment’s soul in the mind of place.

Below is a photo of Crick from Wikipedia with the caption, “Francis Crick in his office. Behind him is a model of the human brain that he inherited from Jacob Bronowski.” 

Photo: Marc Lieberman; derivative work: Materialscientist. CC BY 2.5, via Wikimedia Commons.

Cobb on Crick: The “Central Dogma”

December 2, 2024 • 9:45 am

As I’ve mentioned several times, Matthew Cobb has written what will likely prove the definitive biography of Francis Crick (1916-2004), co-discoverer of the structure of DNA and a general polymath. While writing it, Matthew came across some Crick material showing that biologists and historians have misunderstood Crick’s “Central Dogma” of molecular biology.

Matthew has corrected the record in the piece below from the Asimov Press. Click the headline, as it’s free to read:

You may have learned this dogma as “DNA makes RNA makes protein,” along with the caveat that it’s a one-way path. But Matthew shows that this was not Crick’s contention. I’ve indented Matthew’s words below:

The Central Dogma is a linchpin for understanding how cells work, and yet it is one of the most widely misunderstood concepts in molecular biology.

Many students are taught that the Central Dogma is simply “DNA → RNA → protein.” This version was first put forward in Jim Watson’s pioneering 1965 textbook, The Molecular Biology of the Gene, as a way of summarizing how protein synthesis takes place. However, Watson’s explanation, which he adapted from his colleague Francis Crick, is profoundly misleading.

In 1956, Crick was working on a lecture that would bring together what was then known about the “flow of information” between DNA, RNA, and protein in cells. Crick formalized his ideas in what he called the Central Dogma, and his original conception of information flow within cells was both richer and more complex than Watson’s reductive and erroneous presentation.

Crick was aware of at least four kinds of information transfers, all of which had been observed in biochemical studies by researchers at that time. These were: DNA → DNA (DNA replication), DNA → RNA (called transcription), RNA → protein (called translation) and RNA → RNA (a mechanism by which some viruses copy themselves). To summarize his thinking, Crick sketched out these information flows in a little figure that was never published.

Crick’s figure is below. Note that the dogma is simply the first sentence typed in the diagram, implying that information from either DNA or RNA, translated into a protein, cannot get back into the DNA or RNA code again. Thus changes in protein structure cannot go back and change the genetic code (see the bottom part of the diagram).

As you see, the DNA → RNA → protein “dogma” is an extreme oversimplification of Crick’s views. And he meant the word “dogma” to mean not an inviolable rule of nature, but a hypothesis. Nevertheless, Crick was widely criticized for using the word “dogma”.

But getting back to the diagram:

The direct synthesis of proteins using only DNA might be possible, Crick thought, because the sequence of bases in DNA ultimately determines the order of amino acids in a protein chain. If this were true, however, it would mean that RNA was not always involved in protein synthesis, even though every study at that time suggested it was. Crick therefore concluded that this kind of information flow was highly unlikely, though not impossible.

Crick also theorized that RNA → DNA was chemically possible, simply because it was the reverse of transcription and both types of molecules were chemically similar to each other. Still, Crick could not imagine any biological function for this so-called “reverse transcription,” so he portrayed this information flow as a dotted line in his diagram.

We now know, though, that the enzyme “reverse transcriptase” is used by some RNA viruses to make DNA that is inserted into their hosts’ genomes.

Here’s what Crick said he meant by the “Central Dogma,” and, in fact, this schema has not yet been violated in nature:

In other words, in Crick’s schema, information within the cell only flows from nucleic acids to proteins, and never the other way around. Crick’s “Central Dogma” could therefore be described in a single line: “Once information has got into a protein it can’t get out again.” This negative statement — that some transfers of information seem to be impossible — was the essential part of Crick’s idea.

Crick’s hypothesis also carried an unstated evolutionary implication; namely, that whatever might happen to an organism’s proteins during its lifetime, those changes cannot alter its DNA sequence. In other words, organisms cannot use proteins to transmit characteristics they have acquired during their lifetime to their offspring.

In other words, there can be no Lamarckian inheritance, in which environmental changes affecting an organism’s proteins become ingrained in the organism’s genome and thus permanently heritable.
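For readers who like things spelled out mechanically, here is a minimal Python sketch of the schema as I read it from Matthew’s piece. The function and the “general”/“special” labels are my own illustration, not Crick’s notation or anything from the article; they stand for the four observed transfers and the two merely conjectured ones, with the one hard rule being that nothing flows out of protein:

```python
# A minimal sketch (my illustration, not Crick's own diagram) of the
# information transfers described above.

GENERAL = {("DNA", "DNA"),      # replication
           ("DNA", "RNA"),      # transcription
           ("RNA", "protein"),  # translation
           ("RNA", "RNA")}      # RNA copying in some viruses

SPECIAL = {("RNA", "DNA"),      # Crick's dotted line; later found as reverse transcription
           ("DNA", "protein")}  # direct synthesis, judged highly unlikely

def classify(source, target):
    """Classify an information transfer under the schema sketched above."""
    if source == "protein":
        # The essential negative claim: once information has got into a
        # protein, it can't get out again.
        return "forbidden"
    if (source, target) in GENERAL:
        return "general"
    if (source, target) in SPECIAL:
        return "special"
    return "unknown"

assert classify("DNA", "RNA") == "general"        # transcription
assert classify("RNA", "DNA") == "special"        # reverse transcription
assert classify("protein", "DNA") == "forbidden"  # no Lamarckian route back
```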

Matthew discusses several suggested modifications of Crick’s version of the Central Dogma. Prions, misfolded proteins that cause several known diseases, were thought by some to replicate themselves by somehow changing the DNA that codes for them, but it’s now known that prions either are produced by mutations in the DNA or transmit their pathological shape by directly interacting with other proteins. Prion proteins do not change the DNA sequence.

Some readers here might also be thinking that “epigenetic inheritance”, in which DNA is modified by chemical tags affixed to its bases, might refute the Central Dogma, as those modifications are mediated by enzymes, which of course are proteins. But as Matthew notes, those modifications are temporary, while the DNA sequence of nucleotides (sans modifications) is forever:

In other cases, researchers have pointed to epigenetics as a possible exception to Crick’s Central Dogma, arguing that changes in gene expression are transmitted across the generations and thus provide an additional, non-nucleic source of information. But still, epigenetics does not violate Crick’s Central Dogma.

During an organism’s life, environmental conditions cause certain genes to get switched on or off. This often occurs through a process known as methylation, in which the cell adds a methyl group to a cytosine base in a DNA sequence. As a result, the cell no longer transcribes the gene.

These effects occur most frequently in somatic cells — the cells that make up the body of the organism. If epigenetic marks occur in sex cells, they are wiped clean prior to egg and sperm formation. Then, once the sperm and eggs have fully formed, methylation patterns are re-established in each type of cell, meaning that the acquired genetic regulation is reset to baseline in the offspring.

Sometimes, these regulatory effects are transmitted to the next generation through the activity of small RNA molecules, which can interact with messenger RNAs or proteins to control gene expression. This occurs frequently in plants but is much rarer in animals, which have separate lineages for their somatic and reproductive cells. A widely-studied exception to this is the nematode C. elegans, where RNAs and other molecules can alter inheritance patterns.

No matter how striking, though, none of these examples violate Crick’s Central Dogma; the genetic information remains intact and the epigenetic tags are always temporary, disappearing after at most a few generations.

That should squelch the brouhaha over epigenetics as a form of Lamarckian evolutionary change, as some have suggested that epigenetic (environmental) modifications of the DNA could be permanent, ergo the environment itself can cause permanent heritable change. (That is Lamarckian inheritance.) But we know of no epigenetic modifications that last more than a couple of generations, so don’t believe the hype about “permanently inherited trauma” or other such nonsense.
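And if a toy example helps, here is a drastically simplified Python model (again mine, not from Matthew’s piece) of the germline-reset argument: the nucleotide sequence is copied every generation, while methylation marks are wiped and re-established by each offspring’s own environment, so an environmentally induced mark cannot ride along indefinitely on its own:

```python
# Toy model of germline epigenetic resetting (illustrative only; real
# methylation biology is far more complicated than this).

def next_generation(sequence, parental_marks, offspring_environment):
    """Produce an offspring genome under a simplified germline reset."""
    child_sequence = sequence                 # sequence replicated intact (mutation ignored)
    _ = parental_marks                        # parental marks are wiped before egg/sperm form
    child_marks = set(offspring_environment)  # marks re-established by the offspring's environment
    return child_sequence, child_marks

genome, marks = "ATCGGCTA", {2, 5}  # parent with two environmentally induced methylation sites
for _ in range(3):                  # three generations in a mark-free environment
    genome, marks = next_generation(genome, marks, offspring_environment=set())
print(genome, marks)                # -> ATCGGCTA set(): sequence persists, marks vanish
```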

And there’s this, which again is not a violation of Crick’s “Dogma”:

. . . enzymes can modify proteins in the cell after they have been synthesized, so not every amino acid in a protein is specified in the genome. DNA does not contain all the information in a cell, but Crick’s original hypothesis remains true: “Once information has got into a protein it can’t get out again.”

Now Matthew does suggest a rather complicated way that the Dogma could be violated, but it’s not known to occur, though perhaps humans might use genetic engineering to effect it. But you can read about it in his piece.

It’s remarkable that Crick’s supposition that information in a protein can’t get back to the DNA or RNA code—made only three years after the structure of DNA was published—has stood up without exception for nearly seventy years. This is a testament to Crick’s smarts and prescience.

And if you remember anything about the Central Dogma, just remember this:

“Once information has got into a protein it can’t get out again.”

Kipnis on Hitchens

December 23, 2023 • 1:00 pm

Here we have one snarky writer dissing another, and I admire both of them. This one is Laura Kipnis, writer and Northwestern University professor, going after Christopher Hitchens, who needs no introduction. It’s a fairly short piece in Critical Quarterly, free to read by clicking below (pdf is here).

Those of you who admire Hitchens—and I’m one—will have to endure him taking a drubbing about his retrograde views of women and politics, his fixation on Bill Clinton, and his overstatements about his sex life. Yes, he was sometimes wrong, but often he was right (viz., Mother Teresa, religion, the Elgin Marbles), and he was always witty and thoughtful.

Just a few excerpts, and I’m gonna go home and cook a nice dinner with a nice bottle.

I can be as humourless as the next leftwing feminist but for some reason Christopher’s, what to call it – lasciviousness? antiquarianism? – amused more than offended me, though his public anti-abortion stance was noxious and, one suspects, hypocritical. Colour me surprised if that particular edict was upheld in practice. In any case, I never thought of him as someone you’d go to for instruction on feminism, and increasingly not on any political question, yet it was perplexingly hard to hold his bad politics against him. Mocking him on gender could even be fun, as at least there, unlike elsewhere, the positions seemed lightly held. When he published his notorious ‘Why Women Aren’t Funny’ piece in Vanity Fair, I responded (I hope a teensy bit funnily) in Slate, where he also frequently wrote, that though it was a fascinating portrait of female nature and relations between the sexes, it was unclear to which decade it applied – it had the slightly musty air of 1960s-ish Kingsley Amis, wrapped in nostalgia ‘for the merry days when sexual conquest required an arsenal of tactics deployed by bon-vivantish cads on girdled, girlish sexual holdouts. “Oh Mr. Hitchens!” you imagine one of the potential conquests squealing at an errant hand on nylon-clad knee.’

My problem with Christopher, hardly mine alone, was (to state the obvious) simply that he was one of the more charming men on the planet and mixed with liquor, this is a dangerous combination. Like most people who knew him at all, a few of the drunkest nights of my life were spent in his company. Conversations were funny, flirtatious, frank. Yet the rightward turn and increasing political rigidity also made him seem ridiculous: eruditely shrill.

And Kipnis avers that Hitchens was obsessive about Bill Clinton, particularly about his philandering (or rape, or whatever he actually did):

Christopher, on the other hand … Something about Bill Clinton’s sex life seemed to derange him. He was off the rails on the subject, literally sputtering. I tried to put it to him that he seemed, well, overinvested. It seemed way too personal, somehow off. What was it about Bill Clinton that had this unhinging effect on him? (I was kind of drunk at that point myself.) I suppose I expected him to at least pretend to ponder the question, devote maybe a few seconds to a show of self-examination. Anyone would. Not him. He was barricaded against anything I could say, also against the ‘what is this “about” for you’ sort of conversation that drunk people are known to have, which is one of the fun things about drinking. Something obdurate and hardened switched on instead. Thinking was not what was taking place, just pre-rehearsed lines and a lot of outrage.

. . . . When I later tried (and failed) to read No One Left to Lie to, his anti-Clinton screed, it reminded me of what had seemed so deranged and shrill that evening in Chicago. Of course, there’d be much more of that to come: the bellicose over-certainty about Iraq, the increasingly militaristic posturing – there was a comic rigidity about it. I’m thinking of what philosopher Henri Bergson wrote in his 1900 book on laughter about what turns people into comedic figures: being unaware of something automatic or mechanistic in your attitudes or actions, like Lucille Ball on the chocolate factory assembly line, turning into an automaton herself as the line keeps speeding up. Inflexibility is funny, though also a tragic waste of whatever’s human in us. The human is elastic.

Kipnis is smart and is not without humor herself, so one has to seriously consider her point of view. But what we can all agree on is what she says towards the end:

There was a sentence of Christopher’s that I always remembered, from a review of something by Richard Yates. I wished I had written it. Regarding Yates: ‘It’s clear that he’s no fan of this smug housing development or the new forms of capitalism on behalf of which its male inhabitants make their daily dash to the train.’ It’s a sentence I’m sure he gave little thought to, but I loved its man-of-the-world swoop – from a writer’s oeuvre to the banalities of suburban marriage to the mode of production, crammed into an offhandedly elegant sentence. There were always things to admire in his sentences, even as his political instincts went to shit.

The man could write. And when he wrote and was right, it was great stuff, like “God Is Not Great.”

Dan Dennett: a new book and an interview in the NYT

August 27, 2023 • 12:00 pm

I recently finished Dan Dennett‘s new autobiography, I’ve Been Thinking (cover below; click to get an Amazon link), and I was deeply impressed by what a full life the man has had (he’s 81).  I thought he spent most of his time philosophizing, writing, and teaching philosophy at Tufts; but it turns out that he had a whole other life that I knew little about: owning a farm in Maine, sailing all over the place in his boat, making tons of apple cider, hanging out with his pals (many of them famous), and traveling the world to lecture or study. Truly, I’d also be happy if I had a life that full. And, as Dan says in his interview with the NYT today, he’s left out hundreds of pages of anecdotes and other stuff.

Although I’ve taken issue with Dan’s ideas at times (I disagree with him on free will and on the importance of memes, for example), you can’t help but like the guy. He’s sometimes passionate in his arguments, but he’s never mean, and of course he looks like Santa Claus. Once at a meeting in Mexico, I was accosted by Robert Wright, who was incensed that I’d given his book on the history of religion a bad review in The New Republic.  Wright plopped himself down beside me at lunch, so I was a captive audience, and proceeded to berate and harangue me throughout the meal. It was one of the worst lunch experiences I’ve ever had.

Because of Wright’s tirade, I was so upset that, after the meal was done, I went over to Dan, jumped in his lap, and hugged him (telling him why). I was greatly relieved, for it was like sitting on Santa’s lap. Now Santa, who’s getting on, has decided to sum up his career. The book is well worth reading, especially if you want to see how a philosopher has enacted a life well lived.

In today’s paper there’s a short interview with Dan by David Marchese, who has been touted as an expert interviewer. I didn’t think that Marchese’s questions were that great, but read for yourself (click below):

I’ll give a few quotes, mostly about atheism and “other ways of knowing.” First, the OWOK. Marchese’s questions are in bold; Dennett’s responses are in plain text. And there are those annoying sidenotes that the NYT has started using, which I’ve omitted.

Right now it seems as if truth is in shambles, politics has become religion and the planet is screwed. What’s the most valuable contribution philosophers could be making given the state of the world? 

Well, let’s look at epistemology, the theory of knowledge. Eric Horvitz, the chief scientist at Microsoft, has talked about a “post-epistemic” world.

How? 

By highlighting the conditions under which knowledge is possible. This will look off track for a moment, but we’ll come around: Andrew Wiles proved Fermat’s last theorem.

It was one of the great triumphs of mathematics in my lifetime. Why do we know that he did it? Don’t ask me to explain complex mathematics. It’s beyond me. What convinces me that he proved it is that the community of mathematicians of which he’s a part put it under scrutiny and said, “Yep, he’s got it.” That model of constructive and competitive interaction is the key to knowledge. I think we know that the most reliable path to truth is through communication of like-minded and disparate thinkers who devote serious time to trying to get the truth — and there’s no algorithm for that.

Note this bit: “the most reliable path to truth is through communication of like-minded and disparate thinkers who devote serious time to trying to get the truth.” This means that all knowledge, including the “other ways of knowing” of indigenous people, has to be vetted by like-minded and disparate thinkers. If it hasn’t been, it’s not another way of knowing, but only a way of claiming to know.

But wait! There’s more!

There’s a section in your book “Breaking the Spell” where you lament the postmodern idea that truth is relative. How do we decide which truths we should treat as objective and which we treat as subjective? I’m thinking of an area like personal identity, for example, where we hear phrases like, “This is my truth.” 

The idea of “my truth” is second-rate. The people who think that because this is their opinion, somehow it’s aggressive for others to criticize or reject them — that’s a self-defeating and pernicious attitude. The recommended response is: “We’d like to bring you into the conversation, but if you’re unable to consider arguments for and against your position, then we’ll consider you on the sidelines. You’re a spectator, not a participant.” You don’t get to play the faith card. That’s not how rational inquiry goes.

Marchese asks too many questions about AI and ChatGPT, topics which, while they may be important, bore me to tears. He also gets a bit too personal. He should have stopped inquiring after the first answer below.

There was something in your memoir that was conspicuous to me: You wrote about the late 1960s, when your pregnant wife had a bowel obstruction. 

Yeah, we lost the baby.

You describe it as “the saddest, loneliest, most terrifying” time of your life. 

Yes.

That occupies one paragraph of your memoir. 

Yes.

What is it indicative of about you — or your book — that a situation you described that way takes up such a small space in the recounting of your life? 

Look at the title of the book: “I’ve Been Thinking.” There are hundreds of pages of stories that I cut at various points from drafts because they were about my emotional life, my trials and so forth. This isn’t a tell-all book. I don’t talk about unrequited love, failed teenage crushes. There are mistakes I made or almost made that I don’t tell about. That’s just not what the book’s about.

Finally, the good stuff about atheism and religion. Although regarded as one of the “Four Horsemen of New Atheism” along with Hitchens, Dawkins, and Harris, Dan has been the least demonized of them, probably because he’s not a vociferous anti-theist and regards religion as a phenomenon deserving more philosophical study than opprobrium. Nevertheless, he makes no bones about his unbelief:

We have a soul, but it’s made of tiny robots. There is no God. These are ideas of yours that I think a lot of people can rationally understand, but the gap between that rational understanding and their feelings involves too much ambivalence or ambiguity for them to accept. What is it about you that you can arrive at those conclusions and not feel adrift, while other people find those ideas too destabilizing to seriously entertain? 

Some people don’t want magic tricks explained to them. I’m not that person. When I see a magic trick, I want to see how it’s done. People want free will or consciousness, life itself, to be real magic. What I want to show people is, look, the magic of life as evolved, the magic of brains as evolving in between our own ears, that’s thrilling! It’s affirming. You don’t need miracles. You just need to understand the world the way it really is, and it’s unbelievably wonderful. We’re so lucky to be alive! The anxiety that people feel about giving up the traditional magical options, I take that very seriously. I can feel that anxiety. But the more I understood about the things I didn’t understand, the more the anxiety ebbed. The more the joy, the wondrousness came back. At the end of “Darwin’s Dangerous Idea,” I have my little hymn to life and the universe.  That’s my God — more wonderful than anything I could imagine in detail, but not magical.

So how do you understand religious belief? 

No problem at all. More people believe in belief in God than believe in God. [Marchese takes issue with this in a sidenote.] We should recognize it and recognize that people who believe in belief in God are sometimes very reluctant to consider that they might be wrong. What if I’m wrong? That’s a question I ask myself a lot. These people do not want to ask that question, and I understand why. They’re afraid of what they might discover. I want to give them an example of somebody who asks the question and is not struck down by lightning. I’m often quoted as saying, “There’s no polite way of telling people they’ve devoted their life to an illusion.” Actually, what I said was, “There’s no polite way of asking people to consider whether they’ve devoted their life to an illusion, but sometimes you have to ask it.”

There are better questions that could have been asked. For example, I would have asked Dan, “What do you think has been your greatest contribution to philosophy?” and “What has been your biggest error in your work on philosophy?”  Readers might suggest other questions below, though I’m not going to convey them to Dan!

A photo of Dan en famille, with caption, from the interview. I knew him only after his beard turned white, so I wouldn’t have recognized him:

Two of my photos of Dan. The first is in Cambridge, MA, on the way to the “Moving Naturalism Forward” meeting in 2016. We drove the three hours from Boston to Stockbridge, and Richard had to fly back early because of a hurricane warning. Ergo Dan argued with me about free will for the entire three-hour return drive on the turnpike from Stockbridge to Boston (it was not covered with snow). That was something to remember, but I gave no ground:

And Dan at a symposium on religion at the University of Chicago in 2019.  It was tedious at times, and I think Dan is showing some impatience here with the annoying lucubrations of Reza Aslan.

Brief review: “Steve Jobs” by Walter Isaacson

June 18, 2023 • 9:15 am

This weekend I finally polished off Walter Isaacson’s big book (570 pp. of text) Steve Jobs, a 2011 biography of the tech entrepreneur, design genius, and prickly human being. I’m not sure why I took it from the library—I have a feeling a reader suggested it—but I’m glad I did, as I found it an excellent description of the man and his short life (he died at 56 of pancreatic cancer in 2011, two weeks before Isaacson published the book).

It’s the first biography I’ve read that seems to be cast in an interview format:  that is, much of the text involves quotes from people who interacted with Jobs, which, woven together, bring the book to life (Isaacson had more than 40 interviews with Jobs alone, up to right before he died).  Two aspects of Jobs stick out:

a.) The man was a technical genius, devoted to producing products that people didn’t know they needed, integrating those products into a seamless whole (including proprietary software), and controlling the entire chain from idea to device: from the factories making the materials and the casings of his computers and iPods to the concept (and design) of the Apple stores themselves. No detail was too small: he worried for weeks, for instance, about the nature and color of the plastic encasing the first Macintosh. His explicit aim was to meld art and technology, creating a beautiful product that was not only sui generis, but one that was easy to use and gave pleasure to the user. Here is a list of the products that, according to Isaacson, “transformed whole industries” (pp. 565-566):

  1. The Apple II
  2. The Macintosh
  3. Toy Story and other Pixar blockbusters
  4. Apple stores
  5. The iPod
  6. The iPhone
  7. The App Store
  8. The iPad
  9. iCloud
  10. Apple itself, “which Jobs considered his greatest creation, a place where imagination was nurtured, applied, and executed in ways so creative that it became the most valuable company on earth.”

b.) The man was largely a jerk, at least as portrayed in the book. At once mercurial, charismatic, tyrannical, and hateful, he was fully capable of telling a waitress that the food she served was shit, firing somebody on the spot, and telling his employees that their work was “crap”. He knew this, and said, according to Isaacson, “This is who I am, and you can’t expect me to be someone I’m not.” But Isaacson adds, “I think he actually could have controlled himself, if he had wanted.” Well, as a determinist I don’t buy it; not unless “he wanted” means changing his style based on environmental influences on him—like other people telling him to shape up. But one can also argue that his personality—the combination of charisma and Manichean authoritarianism—is what allowed him to accomplish what he did. Under “Reception” in the Wikipedia article, his colleagues and friends say this about the biography:

A number of Steve Jobs’s family and close colleagues expressed disapproval, including Laurene Powell Jobs, Tim Cook, and Jony Ive. Cook remarked that the biography did Jobs “a tremendous disservice”, and that “it didn’t capture the person. The person I read about there is somebody I would never have wanted to work with over all this time.” Ive said of the book that “my contempt couldn’t be lower.” [JAC: he probably meant “higher.”]

Still, even if Isaacson overemphasized the odious side of Jobs—and Jobs told Isaacson to write what he wanted, never vetting anything as Jobs “had no skeletons in his closet that couldn’t come out”—the biography is well worth reading. I came away with the sense that I’d encountered a once-in-a-lifetime character, and would dearly have liked to have met him. He certainly has changed my life, as I’ve never used any computers or music devices that weren’t made by Apple. And you’ll never use your Apple computer or iPad again without thinking of the man behind it.

The book was #1 on Amazon in the year it was published, and sold 3 million copies in the U.S. in the first four years alone. I’d recommend it highly; the paperback is selling for only $11.60 (the hardback is $18.69) on Amazon.

Jobs was diagnosed with pancreatic cancer after a routine kidney scan in late 2003. It was one of those rare forms of the disease that isn’t invariably fatal, and had he undergone surgery at the time, he might have lived. But he didn’t want his body “opened up,” and for nine months he sought alternative therapies involving diet, acupuncture, and other forms of useless treatment. He was finally operated on, but the cancer had spread. Nevertheless, he lived another eight years, dying at 56 on October 5, 2011. (Isaacson’s book came out on October 24th.) Who knows what he could have come up with had he undergone that first operation in time (which, of course, still might not have worked)?

Below I’ve put his commencement address at Stanford in 2005, which tells three stories about his life that helped make him what he was. It was the only commencement address he ever gave, and he wrote it himself. (The last story is about his cancer, which he’d already had for two years.)  This is what Isaacson said about the talk (p. 457):

The artful minimalism of the speech gave it simplicity, purity, and charm. Search where you will, from anthologies to YouTube, and you won’t find a better commencement address. Others may have been more important, such as George Marshall’s at Harvard in 1947 announcing a plan to rebuild Europe, but none has had more grace.

Judge for yourself; it’s only 15 minutes long:

Below is the first half of Jobs’s introduction of the iPhone in 2007 (this is part 1; part 2 is here). He always introduced these products on a darkened stage with one screen, directly demonstrating his devices to the cheers of a worshipful crowd. And he always wore jeans, New Balance sneakers, and a black Issey Miyake turtleneck. There is no script to read from, though of course he’d practiced the presentation.

His cancer recurred the next year, invading his liver and mandating a liver transplant in Tennessee.

And here is a good 60 Minutes interview of Isaacson by Steve Kroft, discussing the book and Isaacson’s view of Jobs. The final anecdote (at 27:20) is also the ending of the book, and is enough to bring you to tears.

If you want a decent one-hour video biography of Jobs, go here.

Was Leonardo da Vinci Jewish?

April 5, 2023 • 1:15 pm

I paid particular attention to this piece because it was published in Tablet, which has a decent history of accurate reporting. That doesn’t mean I believe the claim that one of history’s greatest painters was Jewish, but they do cite a Leonardo da Vinci authority who came to the conclusion, despite his leanings to the contrary, that this was indeed the case. If he’s right, and Leonardo was a landsman, then perhaps we should change his name to Lenny da Vinci.

This isn’t a joke, though; click on the article below to see the facts, which are suggestive but not strong enough to convince me of Leonardo’s Jewishness with high probability.

I’ll have to quote a bit to show you the evidence. Here’s the new theory:

In all likelihood, Leonardo da Vinci was only half Italian. His mother, Caterina, was a Circassian Jew born somewhere in the Caucasus, abducted as a teenager and sold as a sex slave several times in Russia, Constantinople, and Venice before finally being freed in Florence at age 15. This, at least, is the conclusion reached in the new book Il sorriso di Caterina, la madre di Leonardo, by the historian Carlo Vecce, one of the most distinguished specialists on Leonardo da Vinci.

And the conventional wisdom as adumbrated in the 2019 article below (click screenshot) from the Jerusalem Post (quote is from Tablet piece):

The official version of da Vinci’s birth is that it was the fruit of a brief fling between the Florentine solicitor Piero da Vinci and a young peasant from Tuscany called Caterina, of whom almost nothing was known. Yet there had long been a seemingly unfounded theory that Leonardo had foreign origins and that Caterina was an Arab slave. Six years ago, professor Vecce decided to kill the rumor for good. “I simply found it impossible to believe that the mother of the greatest Italian genius would be a non-Italian slave,” he told me. “Now, not only do I believe it, but the most probable hypothesis, given what I found, is that Caterina was Jewish.”

The new evidence (my emphasis):

Vecce was the right man for the job—he published an anthology of da Vinci’s writings and a biography, Leonardo, translated into several languages, and he collaborated on the exhibition of da Vinci’s drawings and manuscripts at the Louvre and Metropolitan Museum in 2003. He embarked on the research for his latest book during the reconstruction of da Vinci’s library, which is where he found the document that changed everything. Dated Nov. 2, 1452, seven months after Leonardo’s birth, and signed by Piero da Vinci [Leonardo’s father] in his professional capacity, it is an emancipation act regarding the daughter of “a certain Jacob, originating from the Caucasian mountains,” and named Caterina. According to the document, Caterina’s owner appears to have been the wife of rich merchant Donato di Filippo, who lived near the San Michele Visdomini church in Florence, and whose usual solicitor for business was Piero da Vinci. The date on the document is underlined several times, as if da Vinci’s hand was shaking as he proceeded to the liberation of the woman who had just given him a child.

Leonardo’s mom Caterina, instead of being Italian, is hypothesized as coming from the Caucasus; she was brought to Italy as the property of the merchant Donato di Filippo, who put her to work and fathered several children by her. Vecce argues that Caterina came through Constantinople to Venice and then to Florence, where she became pregnant by Piero, Filippo’s solicitor and Leonardo’s father:

From there, we can follow Caterina to Venice, and then to Florence where she was brought by her new master, Donato di Filippo, who put her to work both in his clothing workshop and at the service of his wife. That she was a sex slave is attested by the fact that she already had several children by Filippo when, at 15, she met da Vinci, Filippo’s solicitor, who at first “borrowed” her as a nanny for his daughter Marie and then fell so much in love with her that he freed her from slavery after Leonardo’s birth. “Da Vinci himself was no stranger to the Jews,” says professor Vecce. “His main customers were among the Jewish community of Florence.”

So much for that. Leonardo later left Florence for Milan, where Caterina, his putative mom, joined him and died in 1493. There’s a bit of unconvincing evidence that Leonardo’s painting “Annunciation” has hints of his mother’s origin, but would he really have known?

I’m not sure if the above convinces you (and I’m on the fence), but it did convince the skeptic Carlo Vecce, who is no tyro when it comes to Leonardo.

For counterevidence, though, read this article from 2019. Note that in all likelihood, the “evidence” that convinced Vecce was not available to author Erol Araf:

At the time there were already several claims that Leonardo was Jewish (under Jewish law, if your mother is a Jew, so are you; Jewishness can be regarded as traveling along with mitochondrial DNA). But here Araf takes issue:

As additional proof that he was ashamed of his mother’s origins as a lowly Jewish slave, the implausible argument has been advanced that he treated her funeral as an embarrassment. This contention is not supported by facts: The burial costs listed in the Codex Forster – under a receipt containing wax and lemon juice – include expenses for a doctor, sugar, wax for the candles, bier with a cross, four priests and four altar boys, the bells and the gravediggers. It all cost a very tidy sum of 123 soldi; a not-insignificant amount.

So much for that. And the best evidence Araf could adduce at the time is this:

Martin Kemp, emeritus professor of art history at Oxford University and recognized as a leading Leonardo scholar, has researched the origins of Leonardo’s mother hoping it will put an end to “totally implausible myths” that have built up about Leonardo’s life. He analyzed 15th-century tax records kept in Vinci, Florence. In various interviews, preceding the publication of his book Mona Lisa: the People and the Painting, written together with Dr. Giuseppe Pallanti, an economist and art researcher, Kemp argued that the evidence was obtained by meticulously kept real estate taxation declarations.

“In the case of Vinci,” Kemp said, “they verified that Caterina’s father, who seems to be pretty useless, had a rickety house which wasn’t lived in and they couldn’t tax him…. He had disappeared and then apparently died young. So Caterina’s was a real sob story.” The records also showed that Caterina had an infant stepbrother, Papo, and her grandmother died shortly before 1451, leaving them with no assets or support, apart from an uncle with a “half-ruined” house and cattle. In short, she was a poor orphaned peasant girl who fell on hard times and in love with Leonardo’s rakish father.

The crucial question, then, since Leonardo was born in 1452, is whether Caterina had a real Italian father whose existence can be documented in the paper record. Also, Kemp’s claim that mother Caterina was a “poor orphaned peasant girl who fell on hard times and in love with Leonardo’s rakish father” doesn’t comport with Carlo Vecce’s claim that Caterina was an enslaved foreigner owned by a merchant whose solicitor, Piero da Vinci, fathered Leonardo and later freed her. And was a child produced while Caterina was under the thumb of Piero da Vinci?

So we have a riddle wrapped in a mystery inside an enigma. Are there any living relatives of Leonardo who could be used to establish whether his mitochondria came from the Caucasus? I don’t know, and can’t be arsed to find out. Only history will adjudicate this one, and Vecce’s book is available, though as yet only in Italian. The title means “Catherine’s smile”:

One of my favorite Leonardos, Lady with an Ermine (1489-1491). I was lucky enough to see it at the Czartoryski Museum in Kraków, Poland.