A. N. Wilson beats up Darwin again, this time in The Times

August 29, 2017 • 1:15 pm

I don’t know how A. N. Wilson managed to get several deeply misguided pieces published about his new Darwin biography, Charles Darwin: Victorian Mythmaker (out on September 7 in the UK, December in the U.S.), but the piece in the Evening Standard, “It’s time Charles Darwin was exposed for the fraud he was” (oy!), has now been supplemented with a piece in The Times (screenshot below; click to see the article). I haven’t yet read this book, but nearly every statement that Wilson makes in the two pieces is dubious, and seems designed to push two ideas: that Darwin’s views were not original with him (Wilson claims he stole them all from either A. R. Wallace or Darwin’s own grandfather), and that evolution is neither true nor a science. Added to this mixture is the notion that Darwin really used evolution to buttress the status quo: to elevate white supremacy, with rich British families like the Wedgwoods (Darwin’s wife’s family) at the top of the heap (I’m not making this up).  Wilson’s distortions have not gone unnoticed by scientists; see the damning review in New Scientist.

As I said, Wilson has got his ideas printed in a long piece in the Times with a wonky title:

If you know anything about Darwin or his works, you’ll see that this piece is the equivalent of the Times printing a long screed about how Einstein stole the theory of relativity from someone else, and it wasn’t even true anyway. I’ll give just a few quotes from the piece, which seems, like the one in the Standard, to be a précis of his book.

Wilson’s first paragraph in the article:

Charles Darwin’s version of the evolutionary idea was presented to the world in 1859 with his book On the Origin of Species. It is often spoken of as a work of science. Some have even called it the greatest scientific work ever written. Whatever you make of it, it is a strange book. Most of its central contentions, such as the idea that everything in nature always evolves gradually, are now disbelieved by scientists, and the science of genetics has made much of it seem merely quaint.

Here are what I see as The Origin’s central contentions:

  1. Evolution occurs: that is, populations undergo a change in their genetic constitutions over time.
  2. That change is “gradual” rather than instantaneous: it can be rapid or slow, but substantial change takes hundreds to millions of years.
  3. Species lineages split over time, so that one species can divide (“speciate”) into two or more. This creates the “branching bush” of life starting from a single ancestor.
  4. Because of (3), all species have common ancestry; that is, any pair of species has a common ancestor some time in the past, and more closely related species have more recent common ancestors.
  5. The driving “force” behind adaptive evolution is natural selection. (It’s not really an externally imposed “force,” but a description of the differential propagation of genes based on their ability to replicate, which is often correlated with how many offspring are produced that contain those genes.) This materialistic process of gene sorting results in the wonderful adaptations we see in plants and animals; divine creation is not needed.

None of these “central contentions” are “disbelieved by scientists.” They are standing tall and firm after 158 years.

Here’s some of Darwin’s “theft of ideas”:

Darwin had several reasons for wishing to conceal where his evolutionary ideas came from. He was acutely conscious that the most famous evolutionary scientist in British history was his own grandfather, Erasmus Darwin who, 70 years before the Origin of Species, had posited the idea that life had a single origin, from which all the different species evolved.

While Erasmus Darwin suggested the idea of evolution, most notably in a posthumously published poem, he was not “the most famous evolutionary scientist in British history.” He wasn’t even an evolutionary scientist. And Erasmus never came up with the idea of natural selection. His suggestions that species had evolved, which of course others had made before him, were not worked out and don’t deserve any of the credit that his grandson got for The Origin. In fact, a man named Patrick Matthew even came up with the idea of natural selection independently of Darwin (as, of course, did Wallace). Darwin gets the credit because, in The Origin, he worked out the consequences of his ideas in detail and showed how they explained previously enigmatic facts about biogeography, embryology, morphology, vestigial organs, and so on.

Oh, and get a load of this:

We see here a classic evolution of mythology. And this is not surprising. Because Darwinism, as opposed to some of his groundbreaking work of natural history, such as in his studies of barnacles and earthworms, and his wonderful book on the expression of emotions in animals, was a religion from the start.

A religion? Where is the god? Where the supernatural? Although Wilson said he spent five years “researching” his Darwin biography, it knocks me flat that he can produce a paragraph like that. Yes, the studies of barnacles and earthworms were interesting and scientific, as was his work on emotions in animals, but, as Steve Gould noted, all of Darwin’s work beginning with The Origin can be seen as related in some way to the tenets of that groundbreaking book. The earthworm book (which is a good read, by the way) tries to show how slow forces working over long periods of time can create great changes—one of the lessons of The Origin. The emotion book was designed to show that human emotionality and behavior are continuous with what we see in other animals, ergo our own traits could have been a product of evolution.

Often declared to be dead, as it is here, Darwinism and its major tenets refuse to lie down. But why on earth did the Times publish something so manifestly wrong?

Happy birthday, Max Delbrück!

September 4, 2016 • 9:45 am

by Matthew Cobb

As Jerry pointed out earlier, the scientist Max Delbrück was born 110 years ago today. Because many readers may never have heard of him, Jerry asked me to sketch his life. Here you are:

Max Delbrück (1906-1981) was a key figure in the history of post-war genetics, pioneering the molecular investigation of viruses and winning the Nobel Prize in Physiology or Medicine in 1969 “for discoveries concerning the replication mechanism and the genetic structure of viruses.” Born in Germany, Delbrück trained as a physicist and worked in Copenhagen with Niels Bohr on quantum mechanics before turning to biology in the 1930s. In 1935, together with the Russian geneticist Nikolai Timoféeff-Ressovsky and the radiation physicist Karl G. Zimmer, Delbrück published a paper in German entitled “On the Nature of Gene Mutation and Gene Structure,” known subsequently as the “Three-Man Paper.”

This important piece of research was recently translated into English, together with an excellent introduction. In 1942, this paper, which attempted to explain the size of genes – which the three men assumed to be proteins – and their mode of mutation, caught the eye of the quantum physicist Erwin Schrödinger, who was in Dublin preparing for his inaugural lecture under the title ‘What is Life?’ In 1944 the lecture (in fact, there was so much material that it turned into three lectures) was published as a small book, which has never gone out of print since.

Schrödinger’s account of how physics could be used to investigate biology, including his emphasis on Delbrück’s model of mutation, entranced a generation of scientists, many of them physicists. Watson, Crick and Wilkins, who won the Nobel Prize in 1962 for the double-helix structure of DNA, were all inspired by Schrödinger’s book, and by Delbrück’s approach.

In the 1940s (he had by now fled Nazi Germany for the USA), Delbrück, together with his friends and colleagues Al Hershey and Salvador Luria, began working on viruses that infect bacteria – these were known as ‘bacteriophages’, and the researchers who studied them were later called ‘the phage group’. The idea was that by studying viruses and their mode of replication, you could learn something fundamental about how life works, as viruses were seen as a kind of fundamental living particle.
Delbrück (left) and Luria at Cold Spring Harbor Laboratory during the ‘phage course’, late 1940s-early 1950s.

Among the young researchers the phage group attracted was Jim Watson, who famously switched from ornithology to molecular genetics. Much that is good – and some that is bad – in the molecular biology lab traditions that exist around the world flows from the way that Delbrück worked. To spread the techniques employed in his niche area, he set up a training course at Cold Spring Harbor Laboratory (the ‘phage course’), where budding molecular geneticists could learn key methods.

This open, sharing attitude to science was important for ensuring that methods and findings were quickly transmitted. The social side of lab work was important, too – he and his lab members would go on hiking trips at the weekend. When it came to data, though, there were no holds barred: precise, even pedantic and aggressive, questioning of speakers presenting their results was the norm.

Delbrück was famous for denouncing findings, generally mistakenly. For example, when Seymour Benzer presented data showing that he had created a Drosophila mutation that altered the fly’s body clock, Delbrück walked out of the lecture, saying “I don’t believe a word of it!”. Benzer was right, and Delbrück was wrong.

Delbrück’s reputation for backing the wrong side in any scientific argument led to a joke, according to which a young researcher came out of Delbrück’s office looking pale. “What’s up?” a friend enquired. “Didn’t he like your results?” “No,” said the researcher, aghast: “He said he thought they were right.” Ho ho.

Probably the best and most perplexing example of this attitude took place in 1943, when Delbrück, who was then at Vanderbilt University in Nashville, was shown a letter from Oswald Avery, a bacteriologist who worked at the Rockefeller Institute in New York. Avery had written to his brother, who worked at Vanderbilt, describing the amazing results he had found which suggested that, in pneumonia bacteria, genes were made of DNA.

This result, which was not published for another eight months, would eventually transform the whole of biology and medicine, and Delbrück was one of the first people to hear about it. What did he do? Nothing. He did not immediately try and see if his bacteriophage viruses used DNA, he simply ignored the discovery. “You simply did not know what to do with it,” he later said. Delbrück was not alone – other researchers similarly did not accept, or understand, Avery’s finding.

But Delbrück was a very smart man who was interested in what genes are made of. It is bewildering that he did not ‘get it’, while many others, such as the young student Erwin Chargaff and the French bacteriologist André Boivin, immediately and enthusiastically adopted the new DNA-centred view, helping to shape post-war biology as they did so.

Delbrück was an inspiration to many researchers, and his influence, in particular his skepticism and his attention to detail, is a tremendous legacy. He even played an important role in showing how evolution by natural selection works. In a 1943 experiment that many researchers name as their favourite ever (yes, we all have favourite experiments!), Delbrück and Luria showed that mutations occur randomly, using bacterial resistance to bacteriophage. [JAC: My favorite experiment is the Meselson and Stahl experiment showing that the replication of DNA is “semiconservative.”]

Finally, history could have turned out very differently. A couple of years back, I learned that in 1946 Delbrück accepted a job at the University of Manchester, where I work, where there were researchers studying viruses and, famously, Alan Turing was turning to work on biological questions, in particular pattern formation. Delbrück visited Manchester, signed on the dotted line, and then for some reason that I cannot explain, chose to go back on his word and abandon bomb-damaged, smoke-blackened post-war Manchester for the sunny heights of the California Institute of Technology.

Bob Trivers’ (and my) take on famous evolutionary biologists

April 28, 2015 • 1:00 pm

Several readers sent me a link to a piece by Bob Trivers called “Vignettes of famous evolutionary biologists, large and small” (Trivers is of course also a famous evolutionary biologist). His essay is at the Unz Review, whatever that is, so I would otherwise have missed it.

Anybody who knows Bob, as I do, also knows that he has strong opinions as well as a colorful past, so when he’s talking about his colleagues, you know he won’t pull any punches. And this piece doesn’t disappoint, though I do take issue with his abrasive remarks about my own Ph.D. advisor, Dick Lewontin. (Of course I may not be seen as objective about this, as I truly admire Dick.)

Trivers discusses his take on five evolutionary biologists:

W. D. Hamilton
Stephen Jay Gould
Richard Lewontin
Philip Darlington (a short piece), and
George C. Williams

I don’t have time to describe what these people were famous for, but you can check the links if you’re interested. And I’m curious why he omitted two other Harvard people he surely knew, E. O. Wilson (with whom Trivers worked closely, I believe) and Ernst Mayr.

It’s no surprise that Trivers gives big encomiums to Hamilton and Williams, both of whom were enormously accomplished adaptationists who had no beef with Bob. Nor is it surprising that he rips apart both Lewontin and Gould, who both had severe reservations about adaptationism (viz., The Spandrels of San Marco paper) as well as political disagreements with Bob over sociobiology.

Of the five, I met only Gould and Lewontin, who were both on my thesis committee. Although Darlington was at Harvard, we never crossed paths. But I know many people who were colleagues of both Williams and Hamilton, and without exception they had only good things to say about them—especially Hamilton, who was apparently a lovely individual.

Trivers mentions that Hamilton was a dreadful lecturer, which I’ve also heard, but in all other respects he was a real Idea Man, and although some of his notions were wonky (including the idea that AIDS came from polio vaccines), his ideas about behavior, kin selection, disease, and so on made him perhaps the most important evolutionary theorist of the late 20th century.

Gould comes in for the greatest thrashing, especially for his flawed analysis of Samuel Morton’s cranial-capacity data described in The Mismeasure of Man. Apparently motivated by his anti-racist sentiments, Gould didn’t look too closely at what Morton actually did before accusing him of unconsciously manipulating data. (Gould’s book, however, is well worth reading for the other stuff.)

Trivers also goes after Gould for his (and Eldredge’s) theory of punctuated equilibrium, and here I think he’s right. If you construe that theory as being not just about patterns in the fossil record but about evolutionary process—about traits being molded by species selection—then Gould was simply wrong, and Trivers’s conclusions are correct. Yet in the last chapter of my book Speciation (coauthored with Allen Orr), I think we make a persuasive case that species selection has operated in nature, and has molded the frequency array of characters that we see around us (e.g., what proportion of all birds show sexual dimorphism for color?). Gould’s mistake, I think, was to suggest that species selection could somehow create adaptations themselves rather than just affect the array of existing adaptations.

When I think about Gould’s scientific achievements, I come up with very few concrete discoveries of any note. But he was seriously important in restoring paleobiology as a respectable discipline, for he had the rhetorical and writing skills to revive that field. And that, at least, is an accomplishment worth celebrating. Further, Gould’s Natural History essays and other popular pieces were always interesting, if sometimes tendentious, and surely helped awaken the public to the marvels of evolution.

As for Gould as a person, I had little use for him. In my experience the man was arrogant, preening, and completely lacked empathy, especially for us poor students trying to ask him questions. He often treated people very shabbily. Gould was a smart man and an eloquent man, but not a nice man. But we’re used to such people in science.

I don’t really want to answer Trivers’s attacks on my advisor Dick Lewontin. Suffice it to say that my experience in his lab was a great one, and I always found Dick caring, helpful, and willing to go the extra mile for his people. He was a humanitarian, even if you disagreed (as I did) with his Marxism. Trivers does allude obliquely to Lewontin’s skills at assembling a good lab and training people in the following largely negative assessment (my emphasis):

Lewontin’s story is that of a man with great talents who often wasted them on foolishness, on preening and showing off, on shallow political thinking and on useless philosophical rumination while limiting his genetic work by assumptions congenial to his politics. He ran a successful lab for many years, and easily raised large sums of research funds, so many U.S. geneticists remember him fondly for their time with him at Harvard, as a grad student or post-doc, but as an evolutionary thinker, never mind geneticist (beyond his early work on linkage disequilibrium), he has turned up mostly empty and the best of his ex-students concede he had done little of note for more than 20 years. [JAC: Remember that Dick is now 86!]

Those who know Dick knew what he accomplished, and although his 1974 book, The Genetic Basis of Evolutionary Change, didn’t achieve the sweeping synthesis it aimed for, he had ample accomplishments to his name, and produced tons of students and postdocs who, inspired and influenced by him, comprise a large moiety of modern evolutionary biologists. Your legacy in science isn’t just your own work, but the work that would not have been done without your influence. And Dick has an immense legacy.

Trivers also levels this accusation:

By the way, Lewontin would lie openly and admit to doing so. Lewontin would sometimes admit, in private at least, that some of his assertions were indeed fabrications, but he said the fight was ideological and political—they lied and so would he.

All I can say is that I never heard Lewontin lie or admit he lied for political reasons. It would have been better had Trivers given some examples.

So I have countered some of Trivers’s experience with my own. But I’d love to see him write similar assessments of the other evolutionists he met in his career, including Ed Wilson and Ernst Mayr. Trivers may sometimes be wrong, but he’s always interesting.

*******

And while I’m talking about famous evolutionary biologists, let me take this chance to congratulate my old pal Russell Lande (a fellow grad student in Lewontin’s lab) on his election to the National Academy of Sciences today. Someone fix his Wikipedia page!

A great profile of Al Pacino

September 15, 2014 • 10:34 am

I’m not big on articles about Hollywood stars, but this is an exception. John Lahr, the head drama critic for The New Yorker (and son of Bert Lahr, the Cowardly Lion), has written a terrific profile of actor Al Pacino, and I’m pleased to say that it’s online for free.

Pacino hasn’t done much lately, for, as Lahr notes, he was swindled out of millions of dollars by his business manager (now in jail), and has had to take some pretty crummy roles, including a tour in which he simply talks to audiences, to recoup his dosh. But Lord, the man has some great roles behind him, including those in The Godfather series (especially #2), Scarface, Serpico, Dog Day Afternoon, and a number of plays on Broadway that I’ve never seen.

What really struck me about Pacino, now 74 years old, is his absolute immersion in his craft and his character—to the extent that he lives his character well after the camera has stopped rolling, and seems to have very little life beyond acting. He’s had children and girlfriends, but never a permanent relationship. Lahr discusses Pacino’s longest-term relationship, with actor Diane Keaton:

The conversation turned to Diane Keaton’s bittersweet second memoir, “Let’s Just Say It Wasn’t Pretty,” which had been published the week before and in which she discussed “the lure of Al.” “His face, his nose, and what about those eyes?” Keaton wrote. “I kept trying to figure out what I could do to make them mine. They never were. . . . For the next twenty years I kept losing a man I never had.” Sola expounded on the astuteness of Keaton’s observation. “Al has this ephemeral, childlike quality about him,” she told me. “His friend Charlie used to say he’s like smoke. He’s there, but he’s not there. That’s maybe what drove the women crazy. You want to catch him, but you can’t because Al is—”

It sounds like an incomplete life, but, oddly, I found myself envying Pacino. He is in “the flow” nearly all the time, even if that keeps him from having what most people consider a normal life. He doesn’t seem to miss it, either. At any rate, I’d recommend reading Lahr’s “Al Pacino’s Driving Force.”

Beverly Hilton Hotel

Teller reviews Martin Gardner’s autobiography

January 5, 2014 • 12:13 pm

As a stripling I was an avid reader of Martin Gardner’s “Mathematical Games” column in Scientific American, though I was often too young (or too dumb) to follow it. Gardner died in 2010 at age 95, but near the end he wrote his autobiography, Undiluted Hocus-Pocus, which is reviewed by the magician Teller in today’s New York Times. Two bits of the review are striking, including this one, in which Gardner describes his decidedly old-fashioned writing methods:

Gardner, who died in 2010, wrote “Undiluted Hocus-Pocus” at the age of 95 in a one-room assisted-living apartment in Norman, Okla. He worked on an old electric typewriter and edited with scissors and rubber cement as he stood at the lectern from which he had long addressed the world in print. “I am given five pills every morning after breakfast,” he writes. “My blood pressure is low, my cholesterol is so-so, and my vision is perfect.” But, he adds, “at 95 I still have enough wits to keep writing.”

But this is really surprising to me, and I suspect was to Teller as well:

The final part of this book may make science buffs uneasy. Gardner, like a human Möbius strip combining the faith and skepticism of his parents, explains that he believes in God, even though he is aware that “atheists have all the best arguments. There are no proofs of God or of an afterlife. Indeed, all experience suggests there is no God.” Carl Sagan once asked Gardner if he believed simply because it made him happier. Gardner said yes. “My faith rests entirely on desire. However, the happiness it brings is not like the momentary glow that follows a second martini. It’s a lasting escape from the despair that follows a stabbing realization that you and everyone else are soon to vanish utterly from the universe.”

This seems to be a case of a man who forced himself to believe despite all evidence to the contrary, simply because it made him happy. That’s a mindset that I simply can’t fathom, especially in a guy like Gardner. Nevertheless, Teller gives the book a thumbs-up: “His radiant self lives on in his massive and luminous literary output and shines at its sweetest, wittiest and most personal in ‘Undiluted Hocus-Pocus.’”

Martin Gardner

Pinker: If the world is a safer place, why can I still die?

December 22, 2013 • 3:11 pm

UPDATE:  From boston.com via reader roqoco (in the comments): Pinker and David Byrne discuss Byrne’s new book, with Pinkah in kickass footwear (my emphasis):


Former Talking Heads frontman David Byrne (left) and Harvard cognitive scientist Steven Pinker (right) kicked off BU’s new arts initiative Monday with a talk focused on Byrne’s new book, “How Music Works.”

Pinker, taking the reins (appropriately, considering his commanding black cowboy boots), introduced theories and then explained why they’re likely wrong, while Byrne, often through nervous laughter, replied with a few alternate (and equally uncertain) hypotheses.

Is that cool or what?

______________

This short and lighthearted video of Steve Pinker, posted just three days ago, is apparently part of a NOVA web series called “The Secret Life of Scientists and Engineers.” Each scientist has a single secret revealed in his or her clip.

Note that he’s wearing a Western belt; I wonder if there are cowboy boots below.

The show has its website here, and you can listen to dozens of clips from the technocracy (photos arrayed vertically on the left), which apparently are continuing to be posted.  If you’ve found some good ones (I haven’t had time to watch), let us know below.

A rather personal interview with Dawkins

September 15, 2013 • 11:06 pm

The Sydney Morning Herald has published a rather revealing interview with Richard Dawkins: “An arch atheist reveals his poetic soul.” It includes a number of Dawkins’s statements that have elicited controversy—and will continue to do so. As I’m visiting Auschwitz today, I’ll let readers argue about these themselves, but, as always, be temperate and considerate of fellow commenters.

Excerpts from the interview:

He frowns: “Hmmm, well, yes … I chose not to make this a misery memoir or talk about my feelings too much. I do go into bullying a bit; I wasn’t bullied but I am ashamed of having not stood up to bullies on behalf of others. I wasn’t beaten a great deal but when I was it damn well hurt.” He doesn’t blame the headmaster who inflicted the beatings and had two canes – one more painful for greater crimes. “They all did it and it’s wrong to judge the past by the standards of today. For example, my childhood was thoroughly racist in a benign, paternalistic sort of way. The Africans were all ‘boys’ – nice and funny but you couldn’t actually trust them to do any sort of competent job.”

. . . Recently Dawkins has been attacked for claiming that teaching a child about the fires of hell is worse than child sexual abuse. On Twitter people have said things such as, “Oh Richard, I’m such a fan of yours but you really have gone too far …” He frowns: “I just don’t get that. I think the reason is that when people think of sexual abuse they think of a horrendous experience like being raped or buggered violently. But I was talking about things like what happened to me – a master at school stuck his hand down my trousers and had a little fiddle and that was it. It was unpleasant but not the same thing.”

But what about a father persistently going into his daughter’s bedroom for years? “Oh yes, awful. But there is a spectrum of awfulness and the horror of telling a child about hell is somewhere in the middle: you burn forever, your skin peels off and you grow another so it can burn off again. If you were a child who really believed that, wouldn’t it traumatise you more than having someone stick his hand up your skirt?”

. . .It is seven years since The God Delusion but Dawkins can’t stop writing, lecturing, blogging and tweeting about the iniquities of faith. He comes out at the sound of the bell every time; he rings the bell himself.

“I do, I do. But I get fed up with being treated as though I was a nasty, humourless, negative person. I feel that if people can’t argue with you, the best they can do is criticise you as a person.” But his own attacks are often ad hominem too – he is contemptuous of people who disagree with him. “I’m contemptuous of their ridiculous beliefs, but not of them as individuals.”

I do think it would be best for Richard to lay off the Twitter. If there’s one thing I’ve learned after years of writing, both here and in popular magazines, it’s that you should not respond to criticism in all but the most severe cases.  In certain quarters even an apology is sufficient to damn you even more severely. And I still see no need to tw- – t except to note that I’ve published yet another piece on this site (this piece, BTW, is number 6003).

I have read Richard’s book, and I had the opposite reaction to that of the interviewer:

In his autobiography he writes entertainingly of family, school, friends and undergraduate days in Oxford; but once he starts postgraduate work, we are plunged into detailed descriptions of research, early forays into computer programming, complete with diagrams, and some fancy linguistics. I enjoyed the book, I tell him, up until he disappeared into the laboratory. It is, in a sense, a book of two halves.

I found the “life history” stories intriguing but not nearly as lively as when Richard writes about science. When he describes his Ph.D. experiments on animal decision-making, the book comes alive. It’s clear that what excites him much more than the recitations of his life story is the history of his scientific work.

I realize that the eyes of many readers will glaze over when they get to the parts about computer programming and chicken experiments, but for me, as a scientist, they show an engagement and joy with ideas that is far more intense than Dawkins’s engagement with other human beings.  The book ends with the writing and publication of The Selfish Gene.

h/t: P