Shabby science reporting in the New York Times

January 10, 2021 • 11:00 am

I’ve noticed lately that the quality of science writing in newspapers has declined, even in The New York Times, which used to have some really good writing, especially by Carl Zimmer, who no longer seems to appear in its pages as often.

_____________________

CORRECTION:  Zimmer is still writing prolifically in the NYT, but covering a beat—vaccination—that I’d missed, (mis)leading me to believe that he was engaged in activities other than writing for the NYT. He’s asked me to correct this in a comment below, so I’ll just add his comment here:

If you had bothered to look at my author page at the Times, you’d see that I have been busier than ever there as I help cover the science of the pandemic. Over the past 10 months, I’ve written 93 stories about Covid-19, which comes to about two articles a week. Please correct your post. You are misleading your readers about my work. https://www.nytimes.com/by/carl-zimmer

I guess he was peeved. The misstatement was my fault, of course, and I’ve fixed it, but I have to say that this is a rather splenetic reply from someone whose work I’ve always praised.

_____________________

Rather, in place of long-form biology and physics, a variety of people now write for the Times’s biological “Trilobite” column, and seem to take a more gee-whiz approach to science, producing short columns that are also short on information.

Part of the problem may be that many of these columns are written by freelancers who haven’t spent most of their writing careers dealing with biology. My general impression is that the NYT is starting to reduce its coverage of science. That would be a damn shame, since it was the only major paper to have a full science section (I don’t get the print edition any longer, so I don’t know if they still have the Tuesday science section I’d read first).

The sloppiness shows up in this week’s column, which reports a new genome-sequencing study in Nature of monotremes: the platypus and the echidna (“spiny anteater”). I have only scanned the paper briefly, and will read it thoroughly, but on reading the NYT’s short summary I spotted two errors—not outright misstatements of fact, but incomplete descriptions of the truth, where an extra word or two would have made the column not only more accurate but more interesting.

Here’s the article (click on the screenshot):

 

Maybe I’m being petulant, but here are two quasi-misstatements in the piece. First, this one (emphases are mine):

When the British zoologist George Shaw first encountered a platypus specimen in 1799, he was so befuddled that he checked for stitches, thinking someone might be trying to trick him with a Frankencreature. It’s hard to blame him: What other animal has a rubbery bill, ankle spikes full of venom, luxurious fur that glows under black light and a tendency to lay eggs?

The facts: Only the males have ankle spurs, and of course only the males have venom. (This probably shows that the trait is used not for defense against predators, but for male-male competition during mating.) Females have no venom and have rudimentary spur nubs that drop off before maturing. Of course, females have the genes for producing ankle spurs and venom, as those genes don’t know which sex they’ll wind up in—just like human males have genes for vaginas and breasts and human females carry genes for penises. But the sex-development pathway prevents the expression of venom and spurs in females, just as it prevented me from developing a vagina.

The sex-limitation of the spurs isn’t mentioned in the Nature piece, but every biologist who knows their platypuses also knows that only the males have venom spurs. And, by the way, the echidna has some genes that used to produce venom, but they’re non-expressed “pseudogenes” that have become inactivated. That shows that the ancestral monotreme was almost certainly venomous (this isn’t mentioned in the NYT piece, either).

About those egg-yolk genes:

For instance, many birds and insects have multiple copies of a gene called vitellogenin, which is involved in the production of egg yolks.

Most mammals don’t have the vitellogenin gene, said Dr. Zhang. But the new genomes reveal that platypuses and echidnas have one copy of it, helping to explain their anomalous egg-laying — and suggesting that this gene (and perhaps the reproductive strategy itself) may have been something the rest of us lost, rather than an innovation of the monotremes. 

Well, yes, mammals do have the vitellogenin gene. In fact, our own species has three of them, but, as in other mammals, they’re pseudogenes—genes that are there in the genome but are broken and not expressed. Humans and other placental mammals don’t require egg yolk because we’re nourished through the placenta, not yolks in shells. The platypus has two vitellogenin genes (described in the Nature paper as “genes,” so the statement that platypuses and echidnas have “one copy” is misleading)—they’re just not “functional” genes.

Now you may say this is quibbling, but it’s not. First of all, the statement that platypuses have one copy of the egg yolk gene is wrong. They have two, but one doesn’t function. More important, the statement that there are nonfunctional yolk genes in all mammals says something powerful about evolution, something that I discuss in my book Why Evolution is True. Those “vestigial” and nonfunctional genes are evolutionary remnants of our ancestors who did produce egg yolk. Why else would they be there in our genome, doing nothing? Chickens, who of course evolved from reptiles, as we did, have all three vitellogenin genes in working order.

Another error, then, is the statement “suggesting that this gene. . . may have been something the rest of us lost.” No, we didn’t lose it; it’s still there in our genomes. And there’s no “suggestion” about it: it’s sitting there in our DNA, has been sequenced, and has been shown to be nonfunctional. Finally, we KNOW that this gene is NOT an innovation of the monotremes, and have known that for a long time (e.g., see here). It was inherited from their reptilian ancestors.

This isn’t flat-out erroneous science reporting, but it’s incomplete science reporting—the summary of a paper phoned in to the NYT. (I also find the Times’s summary curiously devoid of what’s really new in the paper; at least half of it reprises what we already knew.) More important, the reporter missed a good chance to give some powerful evidence for evolution, both in ourselves and in monotremes, whose genomes harbor some dead egg-yolk genes that are active in our avian and reptilian relatives. And yes, those echidnas have dead genes for venom.

h/t: Gregory

Superfluous article of the month

September 4, 2019 • 12:30 pm

Do we really need another article telling us that evolution isn’t always “progressive,” going in a straight line towards traits that we consider “advanced”? (These are nearly always traits that humans have, like intelligence, high consciousness, and big brains.)

This form of evolution, often represented by the “straight line” diagram of human evolution shown in the new The Conversation article below, also called “orthogenesis,” is said to misrepresent evolution in several ways. It implies, for instance, that there’s an inherent directionality to evolution, which isn’t true (though in some cases, like arms races, it can approximate truth). It could be taken to imply that the directionality isn’t conferred by natural selection, but by some teleological force, like the “drive to consciousness” broached by computer scientist David Gelernter in a recent, dreadful, and grossly misleading critique of evolution. And it implies a scala naturae—a “scale of nature”—that could be (and was) taken as a ranking of how “evolved” something was. In the case of human races, the scale was used to imply that some races (invariably white ones) were more evolved than, and hence superior to, their pigmented brethren.

Click on the screenshot to read this short piece:

Perhaps I’m being too captious here. Perhaps misconceptions about evolution like this one need constant rebutting as each new generation becomes prey to scientific errors or the blandishments of creationists. Still, the three authors, all biologists, spend most of their time decrying the cartoon depiction of orthogenesis, like the one above, presenting lots of examples (easy to find) but neglecting some really interesting glosses on the idea. In fact, all they really say is that evolution doesn’t work this way, pointing out three errors (an excerpt):

Originating with Plato and Aristotle, this view gets three main things wrong.

First, it holds that nature is organized hierarchically. It is not a random assortment of beings.

Secondly, it envisions two organizing criteria: things progress from simple to perfect and from primitive to modern.

And thirdly, it supposes there are no intermediary stages between levels in this hierarchy. Each level is a watertight compartment of similar complexity – a barnacle and a coral reef on the same rung are equally complex. No one is halfway between two steps.

Well, I’d argue that the first “error” is not necessarily implied by the figure, which shows straight-line evolution within a lineage; the hierarchy comes from the branching of lineages, which is barely mentioned. And I’d argue as well that the progression doesn’t really imply that there are no intermediate stages. It just shows selected segments of an evolutionary lineage.

Had I written this, I would have added a few other points to flesh it out:

a.) You can convert a branching bush into a straight line simply by following one line of ancestry. Here’s part of a slide I use to show that point in the evolution of the modern horse, which traces only the path to one twig on the luxuriant historical branching of equids:

The human “progression” above can be derived from picking out one lineage in the evolution from early australopithecines to modern H. sapiens, but hominin evolution was a branching bush, and many of the twigs went extinct. In fact, we are still ignorant of the exact lineage that took early hominins to modern ones.

b.) I would have added that sometimes evolution might take place in one direction, but that’s because natural selection, which can be reversed, drives it that way. No irreversible teleological forces are involved. One example is predator-prey “arms races,” in which predators become ever faster, prompting the prey to also evolve fleetness. Or there might be an “open niche” in which mutations push evolution in a single direction. Brain size (and intelligence) in humans might be one example, although even here not all lineages got bigger and bigger brains: some died out and some, like H. floresiensis, might even have evolved reduced brain size as a consequence of smaller body size (the origin of this species is, of course, a mystery). Another example might be the ancestors of whales, which likely found an open niche in the sea, full of unexploited food. Ergo early whales became more and more amphibious and then fully marine. Natural selection made them that way: more marine whales presumably got more fish and experienced less competition.

c.) I would have pointed out examples in which evolution is regressive, losing features that evolved adaptively in other lineages. Fleas lost their wings, as did penguins. Tapeworms lost most of their sensory systems and their entire digestive system. Some Antarctic fish have lost their swim bladders, and the subgroup of icefish have also lost their hemoglobin, becoming the only vertebrates to lack that protein (dead hemoglobin genes still reside in their genomes, giving evidence of their ancestry). To compensate, they have also lost their scales, so that they can exchange oxygen through their skin. (By the way, I know of no good adaptive story for the loss of hemoglobin in this group).

At any rate, the misconceptions about orthogenesis give plenty of opportunity to impart lessons about, and cool examples of, evolution. What a pity that the three authors blew this chance (Conversation articles can be longer), choosing instead to concentrate on example after example of straight-line evolution.

As I said, the article is not harmful to scientific education; in fact, it’s marginally useful. But it could have been much more useful.

h/t: Michael

Marianne Williamson’s vaccine woo

August 2, 2019 • 12:45 pm

I have to admit that when I first saw Marianne Williamson in the Democratic debates my jaw dropped. How did this woomeister, who doesn’t hold elective office and never did, manage to get on stage with credible candidates? I did know about her history of anti-vaccination efforts, her denial of depression and other mental illnesses, and her attacks on “Big Pharma,” but I couldn’t square that with her being on the Big Stage with real candidates. And yet some people like her! One of them is the ever-cringeworthy David Brooks, who wrote this editorial in today’s New York Times (click on screenshot). The title alone is astounding!

How does she know how to beat Trump? With a moral uprising! By defeating the dark psychic force of collectivized hatred! Well, Ms. Williamson (and Mr. Brooks), that’s easier said than done. Where would you propose that we begin?

Here’s Brooks:

It is no accident that the Democratic candidate with the best grasp of this election is the one running a spiritual crusade, not an economic redistribution effort. Many of her ideas are wackadoodle, but Marianne Williamson is right about this: “This is part of the dark underbelly of American society: the racism, the bigotry and the entire conversation that we’re having here tonight. If you think any of this wonkiness is going to deal with this dark psychic force of the collectivized hatred that this president is bringing up in this country, then I’m afraid that the Democrats are going to see some very dark days.”

And she is right about this: “We’ve never dealt with a figure like this in American history before. This man, our president, is not just a politician; he’s a phenomenon. And an insider political game will not be able to defeat it. … The only thing that will defeat him is if we have a phenomenon of equal force, and that phenomenon is a moral uprising of the American people.”

A moral uprising of the American people might also include criticism of those, like Williamson, whose anti-vaxer views would lead to the death of children. And this is what the no-punch-pulling Orac (a surgeon) does in this new article at Respectful Insolence (click on screenshot):

In case you’re wondering about Orac’s title, first, he appears to have gotten her name wrong (I’ve made a comment on his site to that effect), though in the rest of the article he gets it right. More important, part of his article is a criticism of a somewhat exculpatory article about Williamson by science writer Faye Flam at Bloomberg.com (click on screenshot):

Faye is a friend of mine, so this hurts doubly, but I think Orac rightly rakes her over the coals for casting Williamson as a misunderstood “science skeptic” rather than as a science denialist. Here’s how Faye gives Williamson a pass:

The accusation of being “anti-science” has become a popular and effective way to discredit people, at least in certain circles. Self-help guru turned presidential candidate Marianne Williamson is learning that after her debate performances.

People often end up accused of being “anti-science” when they question scientific dogma, but questioning dogma is what science is all about. Donald Trump could be more accurately labelled as anti-science for the blatant cutting of funds for important scientific studies – though even he may not be opposed to the scientific enterprise so much as he is trying to protect his friends in industry at the expense of science and people exposed to pollution.

A particularly scathing anti-Williamson critique appeared in the Daily Beast, though the author couldn’t seem to find much fault with anything said in this week’s debate, instead digging up past statements. Indeed, she has dealt with some new-age ideas that are unscientific or even antithetical to science, but not more so than much organized religion is.

Williamson seems likely to disappear from the national conversation soon, and critics are right to go after her lack of policy experience. Criticizing her, or any other candidate, on the basis of ideas and experience makes perfect sense. But trying to discredit skeptics with the label of “anti-science” is not very scientific.

Well, the response to this, especially if you know Williamson’s history as well as her weaving-and-bobbing views now that she’s been called out for her anti-vaxerism and wonky views on mental illness, is this:

Here’s the thing about science (and being “antiscience”). There’s a hierarchy, gradations, if you will, of how unscientific or antiscientific your beliefs are. Believing something for which there is no scientific evidence and, in fact, there is plenty of scientific evidence that refutes that belief is on the extreme end, as is believing such things based on conspiratorial thinking. That’s what Williamson has a long history of doing with respect to vaccines. Remember what she has said on more than one occasion?

And here’s one of MW’s tweets (and a response):

Orac also dismantles Williamson’s claim that chronic illnesses in children have risen to 54%, which is simply a lie. When pressed on this, or on her views that mental illness is just reified “sadness,” she tends to revert to her attacks on “Big Pharma.” Now Orac isn’t a huge fan of the pharmaceutical industry, but for Williamson it’s a displacement activity, designed to divert attention from her profoundly antiscientific views when she’s called out for lying.
Orac:

Yes, it must be conceded that there is a legitimate debate to be had over the treatment of mental health and the issue of regulatory capture in the regulation of pharmaceutical companies and their products, but Williamson’s dismissal of so much depression as “medicalizing normal grief” is a vast oversimplification and exaggeration. Of course, when Melber gets around to the issue of “skepticism” on vaccinations (a horrible, horrible, horrible choice of a word for this) and tries to press her on it, we see her lay down this “I’m not antivaccine” antivaccine patter:

I think it’s an overstatement to say that I cast skepticism on vaccination. [Orac note: Actually, it’s an understatement.] On the issue of vaccinations I’m pro-vaccination, I’m pro-medicine, I’m pro-science. On all of these issues, what I’m bringing up that I think is very legitimate and should not be derided and should not be marginalized, particularly in a free society, is questions about the role of predatory Big Pharma.

I’ll take “I’m not antivaccine, I just question big pharma” for $800, Alex.

Orac has a lot to say, but even if you just skim his piece and listen to Williamson’s flaky lucubrations, you’ll wonder why anyone takes her seriously. Is it because Americans don’t know how settled the question of vaccination safety and efficacy really is? I don’t know.

Here’s one more bit from Orac’s piece, but you should watch this video of her with Anderson Cooper first (this was posted yesterday):

Orac’s take on this:

In this segment, Anderson Cooper focused primarily on Williamson’s past statements about antidepressants and psychiatric drugs. Cooper pressed her on her past statements about antidepressants “numbing” people, pointing out quite reasonably that depression itself numbs people. In response, Williamson goes full woo, denying that she’d ever said what she’s been documented saying and then going on:

What I’ve talked about is a normal spectrum of human despair, normal human despair, which traditionally was seen as the purview of spirituality and religion, that which gave people comfort, gave people hope and inspiration in their time of pain. And with the advent of modern psychotherapy, a lot of the baton passed from religion and spirituality to modern psychotherapy, which was an interesting transition. Then, over the last few years, very very quickly, the baton was passed again to psychopharmacology, and so a nuanced conversation was lost regarding the nature of human despair.

Holy hell. Marianne Williamson’s entire objection to modern psychopharmacology for depression is that it has pushed aside religion and spirituality as the primary means of dealing with “human despair.” Given that she’s a New Age grifter, one shouldn’t be surprised. She doesn’t like a disease-based model of clinical depression because it cuts into her grift. She even goes on to suggest that the treatment of depression is seeking to keep us from feeling normal sadness after, for instance, the death of a loved one, which is a complete mischaracterization of modern psychotherapy.

If that isn’t antiscience, I don’t know what is.

Williamson, of course, got her start by coddling religion, and has simply leveraged that into her non-goddy but still wooey spirituality.

In this case, I think that Faye’s piece is off the mark, for it does conflate healthy skepticism with bald-faced denialism. There’s a huge difference, and Faye’s conflation of these damages the public understanding of science.

And nobody should view Williamson as a viable Presidential candidate, much less a thoughtful human being.

 

 

h/t: Michael

The BBC unwisely jumps on the epigenetics bandwagon

April 8, 2019 • 10:00 am

About two weeks ago, the BBC’s “Future” website published a long science article touting the importance of epigenetic effects in humans: the idea that various behaviors, traumas, and psychological propensities produced in parents by the environment can be transmitted to their offspring. This is supposed to act in a “Lamarckian” way: the environment modifies the parents’ DNA or proteins by putting chemical markers on them, and these modifications get passed on without any change in the genetic code. In other words, it’s the inheritance of an acquired character, something that is generally ruled out by the way genes work.

Click on the screenshot below to see the BBC’s breathless take:

The article gives several reports of the kind of stuff that’s inherited: health problems passed on to the sons of Civil War prisoners (but not their daughters), changes in stress hormones in the offspring of Holocaust survivors, increased mortality among the grandsons of Swedish males who survived a famine, and, in mice, increased sensitivity to a chemical odor in the offspring and grand-offspring of mice that had learned to fear that odor by getting a shock when they smelled it. The BBC then touts all this as having big implications for humans:

But if these epigenetic changes acquired during life can indeed also be passed on to later generations, the implications would be huge. Your experiences during your lifetime – particularly traumatic ones – would have a very real impact on your family for generations to come. There are a growing number of studies that support the idea that the effects of trauma can reverberate down the generations through epigenetics.

. . .if humans inherit trauma in similar ways, the effect on our DNA could be undone using techniques like cognitive behavioural therapy.

“There’s a malleability to the system,” says Dias [Brian Dias, the author of the mouse study]. “The die is not cast. For the most part, we are not messed up as a human race, even though trauma abounds in our environment.”

At least in some cases, Dias says, healing the effects of trauma in our lifetimes can put a stop to it echoing further down the generations.

Well, we want to heal the effects of trauma in our lifetimes because trauma is painful, and none of these studies shows any way to stop the supposed inheritance of trauma save by not exposing parents to trauma in the first place. In other words, the clinical implications of all this work are negligible.

But, as I’ve emphasized repeatedly, studies showing the “legacy of trauma” are more often than not flawed, relying on p-hacking, small sample sizes, and choosing covariates, like sex, until you get one that shows a significant effect. Further, there is no evidence for the inheritance of epigenetic effects in any organism beyond two or three generations, for epigenetic markers get reset, being wiped out during sperm and egg formation.

Finally, almost every study cited in the BBC report—save the Civil War study, which is too new to have garnered general acceptance—has been subject to criticism, criticism barely mentioned by the BBC. The mouse odor study by Dias and Ressler, for instance, was criticized in Genetics by Gregory Francis, who said that Dias and Ressler’s work was too successful:

The claim that olfactory conditioning could epigenetically transfer to offspring is based on successful findings from both the behavioral and neuroanatomical studies. If that claim was correct, if the effects were accurately estimated by the reported experiments, and if the experiments were run properly and reported fully, then the probability of every test in a set of experiments like these being successful is the product of all the probabilities in Table 1, which is 0.004. The estimated reproducibility of the reported results is so low that we should doubt the validity of the conclusions derived from the reported experiments.

Why was it “too successful”? Francis gives a number of reasons, which include unconscious manipulation of the data, poorly designed studies, and unreported experiments. Regardless, the mouse odorant experiments—and remember, even the effects reported lasted just two generations—should only be mentioned if you include Francis’s caveat. The BBC somehow overlooked that.
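To make the statistical logic concrete, here is a minimal sketch of the “excess success” calculation Francis uses. The per-experiment probabilities below are made-up placeholders, not the actual values from his Table 1 (which aren’t reproduced here).

```python
import math

# Hypothetical per-experiment success probabilities (estimated power).
# These are illustrative placeholders, NOT the actual values from
# Francis's Table 1.
powers = [0.60, 0.55, 0.58, 0.62, 0.57, 0.55, 0.60, 0.58, 0.56, 0.59]

# If the experiments are independent, accurately estimated, and fully
# reported, the chance that every single one comes out "successful"
# is the product of these probabilities.
p_all_successful = math.prod(powers)

print(f"P(all {len(powers)} experiments successful) = {p_all_successful:.3f}")
# With ten moderately powered tests the product is tiny (about 0.004 here),
# which is why an unbroken string of successes is itself suspicious.
```

The exact figure doesn’t matter; the point is that a long run of uniformly “successful,” moderately powered experiments is itself improbable, which is what made Francis suspicious.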

In humans, both the Swedish and Dutch famine studies and the pitifully small sample in the Holocaust study (whose results have largely been disowned by the authors themselves) have been analyzed in a useful post by Kevin Mitchell, a neurogeneticist in Dublin, who rejects all the conclusions. After reviewing the corpus of highly touted human studies published through May of last year, he winds up with this:

In my opinion, there is no convincing evidence showing transgenerational epigenetic inheritance in humans. But – for all the sociological reasons listed above – I don’t expect we’ll stop hearing about it any time soon.

Mitchell also has a useful take on why, given the methodological and statistical issues with the human “epigenetic” findings, they’re still accepted by journals and beloved by the media. The BBC is just one of many examples of the latter; Mitchell cites several breathlessly uncritical articles in the media about epigenetic inheritance in humans.

I’ll reproduce Mitchell’s analysis below about the misguided public love of epigenetic inheritance in humans, but bookmark his article if you want a useful guide to skepticism about such studies:

So, if these data are so terrible, why do these studies get published and cited in the scientific literature and hyped so much in the popular press? There are a few factors at work, which also apply in many other fields:

    1. The sociology of peer review. By definition, peer review is done by experts in “the field”. If you are an editor handling a paper on transgenerational epigenetic inheritance in humans (or animals), you’re likely to turn to someone else who has published on the topic to review it. But in this case all the experts in the field are committed to the idea that transgenerational epigenetic inheritance in mammals is a real thing, and are therefore unlikely to question the underlying premise in the process of their review. [To be fair, a similar situation pertains in most fields].
    2. Citation practices. Most people citing these studies have probably not read the primary papers or looked in detail at the data. They either just cite the headline claim or they recite someone else’s citation, and then others recite that citation, and so on. It shouldn’t be that way, but it is – people are lazy and trust that someone else has done the work to check whether the paper really shows what it claims to show. And that is how weak claims based on spurious findings somehow become established “facts”. Data become lore.
    3. The media love a sexy story. There’s no doubt that epigenetics is exciting. It challenges “dogma”, it’s got mavericks who buck the scientific establishment, it changes EVERYTHING about what we thought we knew about X, Y and Z, it’s even got your grandmother for goodness sake. This all makes great copy, even if it’s based on shaky science.
    4. Public appetite. The idea of epigenetic effects resonates strongly among many members of the general public. This is not just because it makes cute stories or is scientifically unexpected. I think it’s because it offers an escape from the spectre of genetic determinism – a spectre that has grown in power as we find more and more “genes for” more and more traits and disorders. Epigenetics seems to reassure (as the headline in TIME magazine put it) that DNA is not your destiny. That you – through the choices you make – can influence your own traits, and even influence those of your children and grandchildren. This is why people like Deepak Chopra have latched onto it, as part of an overall, spiritual idea of self-realisation.

That’s a good and thoughtful analysis.

So caveat lector, and, BBC, you really were derelict in publishing that article. You misled the public about the findings of these studies, as well as about their implications for clinical psychology.

I’ve put a test below where you can analyze what seems to be an error made by the BBC in its analysis.

h/t: Amy

*************

A TEST FOR READERS

The BBC, in referring to the Civil War trauma that had effects on the offspring of prisoners, rules out one form of cultural transmission by saying this:

But what if this increased risk of death was due to a legacy of the father’s trauma that had nothing to do with DNA? What if traumatised fathers were more likely to abuse their children, leading to long-term health consequences, and sons bore the brunt of it more than daughters?

Once again, comparing the health of children within families helped rule this out. Children born to men before they became PoWs didn’t have a spike in mortality. But the sons of the same men after their PoW camp experience did.

As I interpret this (and I haven’t seen the study), the comparison doesn’t rule out the abuse hypothesis at all. Why not?

Five Books: Adam Hart-Davis’s choice of the best books on popular science

February 28, 2019 • 1:00 pm

Adam Hart-Davis is an English writer, photographer, and broadcaster, known for being the presenter of several popular BBC series. In a Five Books piece (click on screenshot below), Hart-Davis lists and discusses what he sees as the five best popular-science books. According to the site,

Adam Hart-Davis says clear simple writing is the key to an accessible science book. Selects the five books he believes offer the best introduction to Popular Science. Includes works from Darwin, Watson and Hawking.

I’ll show his choices and give a few of his words about the book (indented) and my own take (flush left):

Micrographia, by Robert Hooke.

In 1665 he produced this extraordinary book. I have a facsimile edition here, not an original. It is big, about a foot high and nine inches wide. It is beautifully printed – there is all this old-fashioned type with the long S and so on and it contains lovely pictures. He was, luckily for us, a very good draftsman. And some of the drawings are just the same as the pages and some of them pull out to make a picture about two foot square. The most famous of all is this picture of a flea. He was almost the first person to use a microscope as a scientific instrument and he looked at things like fleas and drew wonderful pictures of them – and showed people a new world.

JAC: Haven’t read it, though I’m sure Matthew Cobb did for his book The Egg and Sperm Race. Readers who have read it should weigh in below. It does seem an odd choice, though.

Stonehenge Decoded by Gerald S. Hawkins

Well, this book was really interesting for me because it was the first popular science book I had come across. It was published in 1965 originally. I was doing my PhD at the time and this book came to me because I had just joined one of those new-fangled book clubs! I was surprised that there were science books that were readable. I had heard about treatises on the electron or whatever it is but I had never come across a book like this.

. . . it was certainly quite important to me to see that science could be made popular in this way and I think it influenced me quite a lot. I think it showed a lot of people science could be written for the layman.

JAC: I’ve never heard of this book, and it’s quite dated now. Further, it was apparently chosen for sentimental reasons rather than its intrinsic merit. I’ll give this one a pass.

A Brief History of Time by Stephen Hawking. 

Well, I actually did read it. I got stuck in Chapter Six and I then read it again recently and it is a lovely book. It is very hard because he is trying to describe very complicated things, but he has actually done a very good job. I have started on his latest book, The Grand Design, which I think is rather easier. But this book is important not just because he is stuck in a wheelchair and is a brilliant cosmologist but also because it is a really difficult subject aimed at the general reader. This sort of cosmology, looking at whether or not black holes emit radiation, is a very esoteric sort of question. It is not like Stonehenge where people ask things that we can all understand, like, if you look through this gap can you see Capella.

JAC: I really did try to read it, and, like Adam, got stuck. I never finished it, joining the ranks of those who give it the reputation of “The Least-Read Popular Science Book of All Time.” When I read a technical science book, I prepare for a long slog and make sure my brain is well oiled, but for popular science books I expect them to be easier reads, which may be a flaw in my reading style. But I found Hawking’s book tedious and not well written. I still claim that one can make cosmology intelligible without its being a slog.

The Double Helix by James D. Watson.

This is a really interesting story. The discovery of the double helix was fascinating because various people were working on it – Linus Pauling in California and Rosalind Franklin in King’s College London and she was very close. James Watson acquired her results without asking her, which I think was really bad news. And they went off and made this wild guess and they guessed right. And full marks – they were bright young men both of them – but they made a brilliant guess and the result was that they and Maurice Wilkins shared a Nobel Prize and Rosalind Franklin didn’t, which was very unfair.

JAC: Whatever you think of Watson, who has ruined his own reputation through bigoted remarks, this book belongs on the list. I think it’s the best account of a scientific discovery I’ve ever read.  It’s engaging, takes you right back to Cambridge when the discovery was made, and doesn’t spare the controversy and personal animosity involved in a race for a great discovery. As for Rosalind Franklin’s work, I share the view of Matthew Cobb, who wrote a piece about the Crick/Watson/Franklin/Wilkins controversy in the Guardian four years ago:

It is clear that, had Franklin lived, the Nobel prize committee ought to have awarded her a Nobel prize, too – her conceptual understanding of the structure of the DNA molecule and its significance was on a par with that of Watson and Crick, while her crystallographic data were as good as, if not better, than those of Wilkins. The simple expedient would have been to award Watson and Crick the prize for Physiology or Medicine, while Franklin and Wilkins received the prize for Chemistry.

Sadly, Franklin had died of ovarian cancer before the prize was awarded, so she never got her medal.

Watson has written several books since this one, including books on genetics and more personal volumes along the lines of The Double Helix, but none of the latter are nearly as good as The Double Helix. 

The Formation of Vegetable Mould through the Action of Worms, with Observations on their Habits by Charles Darwin.

It’s a wonderful Victorian title. Of course his most famous book is On The Origin of Species and that is actually rather hard work because he was desperately trying to persuade people of his thesis and he collected an absolute mountain of data and you had to wade through this stuff. But he was actually rather a good writer if he was able to let his hair down. His account of the Voyage of the Beagle is lovely. It is a sort of travel book full of derring-do and wonderful adventures. But I love this book about earthworms, which wasn’t published until 1881, because it just shows what a lovely naturalist he was.

JAC: Yes, this is an engaging and underappreciated book, larded with probably unintended humor and “citizen scientist” observations. It’s also short and, as Adam says, not as “hard work” as The Origin. But the place of The Origin in science, and indeed in human history, is overwhelming and secure, while Earthworms at best can be seen as instantiating Darwin’s evolutionary view that slow and tiny processes can create big changes over a long span of time. I would still say that The Origin is the one to read for the science. Remember, it was intended for the public, and was a bestseller in Darwin’s time. It’s not an easy read, to be sure, but there are parts that are wonderfully written and the “one long argument” is compelling and thrilling. If you want to read other Darwiniana, don’t forget The Voyage of the Beagle.

What else would be on my list? Well, I’ve already done a Five Books article on evolution books, which include not just The Origin but also The Blind Watchmaker, which I see as Dawkins’s best fusion of scientific exposition and lyrical writing. Those two would be on my “top five” list.

For other science books, I can’t leave out The Peregrine by J. A. Baker, which is a natural-history book—the best book on a single species ever written. The prose is ineffably moving. I quite like Microbe Hunters by Paul de Kruif, which had a huge influence on my becoming a scientist. But I haven’t read it in years and it may be dated, or I might have outgrown it. I prefer to leave it unread in my dotage.

Carl Sagan’s The Dragons of Eden must be considered, as well as The Demon-Haunted World: Science as a Candle in the Dark and The Varieties of Scientific Experience: A Personal View of the Search for God. Stephen Jay Gould’s books are in the second rank, though I think his books of essays, especially the early ones, should be considered.

I haven’t been overly impressed by more recent science books, and can’t think of one written in the last 15 years that excited me enough to even consider it for the “top five.”

Readers, of course, are welcome and invited to suggest their favorite popular science books.

Science posts go unread. . .

January 17, 2018 • 1:00 pm

I’ve kvetched before about how readers seem to ignore science posts, which started out as the heart of this website and are still dear to my own heart. In response, readers often say that they do read them but simply can’t comment because they don’t have the expertise. That’s fair enough and isn’t a problem for me. But then I decided to look at how many of those posts are actually viewed compared to posts about politics, food, and other stuff. Here are some recent data; I’ve chosen the non-science posts randomly, without looking at the views, and put up some science posts that go back about a month or so.

A false report on hijab cutting. Posted yesterday, 671 views.

Leisure fascism: Vegan says that a carnivore can’t eat tofu because it’s “cultural appropriation”. Posted two days ago, 1,147 views.

Hybrid speciation in Amazonian manakins? SCIENCE POST. Posted Jan. 14. 395 views.

An academic explores the performative social construction of masculinity among South Texas Hispanics by analyzing the size of their barbecues and spiciness of their condiments. Posted Jan. 13, 646 views.

The magnificent obsession: man takes over a decade to design and build a Boeing 777 model out of paper. Posted Jan. 12, 743 views.

Trump denies making “shithole countries” remark. Posted Jan. 12, 1182 views.

Evidence that raptors spread brushfires to flush out prey. SCIENCE POST. Posted Jan. 11, 769 views.

Surprise! Pinker smeared again by those who distort his words. Posted Jan. 10, 6,591 views.

The origin of human music? Male palm cockatoos use a stick to beat rhythmically on hollow trees. SCIENCE POST. Posted Jan. 9, 726 views.

The Left: shut up about the Iranian protests or you’ll make things worse. Posted Jan. 3, 1,763 views.

There is no monolithic “Twitter” that makes pronouncements. Posted Dec. 29, 729 views.

Editors of Science name the biggest science advances of the year. SCIENCE POST. Posted Dec. 28, 384 views.

Hybrid speciation in Galápagos finches. SCIENCE POST. Posted November 26,  708 views.

HuffPo finds marginalization, sexism, bigotry, bullying, child abuse, and exploitation in “Rudolph the Red-Nosed Reindeer”. Posted December 25, 1,182 views.

Even if people don’t comment on science posts, this non-systematic trawling of posts shows what I suspected: the ones that deal with science, particularly research papers, aren’t read as often. I’m not chastising readers, for what interests you is what interests you, but it is a bit distressing to me. All I can say is that it’s infinitely harder to write one of these posts than it is to bang out something about the Templeton Foundation, cats, or postmodern academia. If people want me to continue dissecting science papers, they’ll have to at least view them.
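For what it’s worth, here is a quick, rough tally in Python of the view counts listed above; the exact numbers matter less than the gap.

```python
from statistics import mean, median

# View counts copied from the list of posts above.
science     = [395, 769, 726, 384, 708]                      # SCIENCE POSTS
non_science = [671, 1147, 646, 743, 1182, 6591, 1763, 729, 1182]

for label, views in (("science", science), ("non-science", non_science)):
    print(f"{label:>11}: n={len(views)}, mean={mean(views):.0f}, median={median(views)}")

# science    : n=5, mean=596, median=708
# non-science: n=9, mean=1628, median=1147  (the Pinker post inflates the mean)
```

Science posts average roughly 600 views apiece; non-science posts run well over twice that on average, and still substantially higher by the median.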

SciBabe, paid by Splenda, touts its product

January 6, 2018 • 12:00 pm

Yvette d’Entremont writes about popular science, especially consumer scams and misconceptions, on her website SciBabe. Her site’s bio notes that she has bachelor’s degrees in theater and chemistry from Emmanuel College in Boston, a master’s degree in forensic science with a concentration in biological criminalistics from Anglia Ruskin University in Cambridge, England, and that she worked for eight years as an analytical chemist. Most of her stuff appears to be good and constitutes worthy debunking of fads and misconceptions, like the supposed “dangers” of GM food and the touted benefits of jade vagina eggs. A few years ago I wrote a controversial post criticizing her and other women’s use of sex to sell science, concluding that the advantages of popularizing science were offset by the use of female pulchritude and dirty jokes, which, to me, seemed to contribute to the objectification of women. As I recall, d’Entremont responded sharply to what I said, and possibly will to this post as well.

I note all this here because I’m not on a vendetta against d’Entremont, but want to criticize another aspect of her blogging in this post, one that I see as an unalloyed problem: her writing of science posts showing that a product is safe to use at the same time she’s being paid and sponsored by the company that makes that product. I’m referring to Splenda, an artificial sweetener that I myself use when trying to lose weight. Splenda is largely sucralose, a sugar substitute that isn’t broken down and metabolized by the body, so it contributes fewer calories than sugar. Because of added “bulking agents,” a packet of Splenda actually has 3.4 calories, or 31% of the calories of a single packet of sugar (which itself has 10.8 calories). But the fact that there are fewer than 5 calories per “serving” of Splenda means it can legally be labeled as “zero calories”. (I’d prefer that it be labeled more honestly: something like “69% fewer calories than sugar.”)

Splenda also appears not to cause dental caries or have any injurious effects on diabetics. I like it because it tastes pretty close to sugar and because research shows that it’s safe, but it wasn’t until today that I found out it’s not really a “zero calorie” product. At any rate, I was reading one of d’Entremont’s posts on Splenda (she has at least two, here and here, both of which tout the product’s safety), and saw the following:

So let’s talk about it. I’ve partnered with Splenda to try to combat some of the more pervasive myths about low calorie sweeteners like sucralose that I see every day on social media. And I see them on my timeline or in my inbox every day. Are they causing weight gain? Are they causing stomach pains?

Are they safe?

Aware of how this “partnering” looks, she uses some of her trademark snark to defuse the issue:

. . . Before every single shill accusation shows up, yes, let’s just get this out of the way. Indeed, I’m working with Splenda and these are all things I never would have said otherwise. I’m kicking my heels up on a desk made of fossilized unicorns wearing a coat made of Dalmatians, sipping a martini made from the tears of my enemies. Specifically Gwyneth Paltrow…. Wait, we collected that fluid from a jade egg, you say? Goddamnit.

…Or more accurately I really like Splenda because it’s safe for everyone, it bakes well (which is important for someone like me who loves to bake), and if this is something that you like the taste of in your diet, you deserve to understand why the science says it’s safe.

There’s a bit of an error, though, in her treatment of the product’s “zero calorie” reputation:

The amount of calories in a daily iced coffee I would have needed for how sweet I like my coffee? 60 (four raw sugar packets). For the record, I now take mine with three Splenda and one raw sugar – just 15 calories, if it’s a day when I skip the cream,

Well, no, for that implies there truly are no calories in Splenda, so that she can claim that a single packet of raw sugar plus three of Splenda has exactly one-fourth the calories of four raw sugar packets. In fact, it has 25.2 calories (15 + 3 × 3.4), or 68% more calories than she represents (see the quick check after her quote below). She concludes:

Or more accurately I really like Splenda because it’s safe for everyone, it bakes well (which is important for someone like me who loves to bake), and if this is something that you like the taste of in your diet, you deserve to understand why the science says it’s safe. Over the coming months, I want to address all the questions and concerns that people have had about Splenda and low calorie sweeteners in general. It’s a field where chemiphobia has run rampant, leading to incorrect assumptions about diets, calories, and health.

So have you heard some crazy things on the internet about Splenda? Comment. Email. Ask, but don’t let it go unchecked without asking, and I will do my damndest to answer with evidence.  I’m not going to find any random paper to support my positions. I’ll hunt for quality evidence and papers that come from the most reputable resources possible. I wouldn’t expect you to trust your health to anything less.
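As promised above, here is a back-of-the-envelope check of that calorie arithmetic, a small sketch using only the figures already cited: her 60 calories for four raw-sugar packets, and 3.4 calories per Splenda packet.

```python
# Quick sanity check of the Splenda calorie claim discussed above.
SUGAR_PACKET_KCAL = 60 / 4    # her figure: 60 calories for four raw-sugar packets
SPLENDA_PACKET_KCAL = 3.4     # per-packet calories, per the bulking-agent math above

claimed = 1 * SUGAR_PACKET_KCAL                            # treats Splenda as zero calories
actual  = 1 * SUGAR_PACKET_KCAL + 3 * SPLENDA_PACKET_KCAL  # one sugar + three Splenda

print(f"claimed: {claimed:.1f} kcal")                        # 15.0
print(f"actual:  {actual:.1f} kcal")                         # 25.2
print(f"understated by {(actual - claimed) / claimed:.0%}")  # 68%
```

It’s not a huge number of calories either way, but it shows how the “zero calorie” label quietly pads the math.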

Her sponsorship is noted by the product’s own publicity blog, Splenda living:

Working with two content creators – Yvette d’Entremont, a scientist also known as SciBabe and the parody ecard platform Someecards, we at SPLENDA® Brand will be introducing digital and social content with one goal: to empower fans of the SPLENDA® Brand to take an active role in busting myths about sucralose. We also created a unique hashtag to help you identify this content on social media: #DebunktheJunk. The content will be available beginning today at www.Someecards.com/Splenda and on SciBabe.com/debunkthejunk. It will continue to be released in the months to come so be sure to stay tuned for additional information and resources that help you debunk junk science!

These content partners were specifically selected because they have expertise in translating what can often be complicated concepts into understandable, relatable terms, and they are supporters of the brand’s passion for discerning good science from junk science. Additionally, they are SPLENDA® Brand fans.

So we have someone who’s paid by the makers of a product telling us good things about the product.  To me, this represents a perceived conflict of interest that should not exist in a science popularizer.

Now note that I am not accusing d’Entremont of distorting the science about Splenda because she’s sponsored by the product. In fact, I don’t think that’s the case. Although there appears to be an error in the product’s favor in her calculations, I think that comes simply from her accepting its characterization as a “zero calorie” product. As far as I know, d’Entremont has otherwise accurately represented the qualities and usefulness of the sweetener, and in both of her posts (the latest last November) she notes that she’s sponsored by Splenda. She’s also written other posts and articles defending the safety of non-Splenda sweeteners.

To my mind, it’s simply not good for one’s reputation as an objective science popularizer to create the appearance of a conflict of interest. Would I take money from Splenda and at the same time write articles telling everyone how safe it was, even if I believed it? Nope—not a chance. d’Entremont, if she responds to this, will undoubtedly say that she believes in Splenda, and that their sponsorship doesn’t have an iota of influence on what she says or the topics she writes about. And that may be true. But there is a reason why politicians and the like are supposed to put their financial investments in a blind trust when in office—to prevent the appearance of a conflict of interest. When people like Hillary Clinton take money for their foundation from foreign donors while they’re in government, that’s a problem, but of course they always say, like everyone who takes dough and then does something to help the donor, “I was not influenced by the money.”

Sometimes that’s true, but it’s best to avoid the problem entirely by not creating the appearance of a conflict. In the case of d’Entremont, I’d recommend that she either ditch the Splenda sponsorship or stop writing about it. (For the good of the public, I’d recommend she do the former. For her own financial good, perhaps the latter is preferable.) I realize that science popularizers have a tough time making a buck unless they’re someone like Neil deGrasse Tyson, but one’s reputation for objectivity seems to me too precious to sully with the appearance of a conflict like this one.

Finally, I’d recommend reading her articles in general, especially if you’re interested in product scams and popular misconceptions about products. She has a recent piece in Cosmopolitan on the stupidity of colon “cleanses” and “juicing” that should be read by the large number of people who practice this worthless cleansing in the hopes it will “detox” them.

In which a predatory journal wants my paper

December 15, 2017 • 8:00 am

Every week or so I get an invitation to republish one of my papers about evolution and genetics in some wildly inappropriate journal. These are, of course, the predatory journals that glom onto nearly any scientist, however irrelevant their research, to get money (you have to pay to publish in them). Here’s an email that just arrived:

Editor ARCJGO [newsletter@neweraevents.co]

Dear Dr. JA Jerry A Coyne,

Greetings from ARC Journal of Gynecology and Obstetrics

It has been a great experience reading your research article Theodosius Dobzhansky on Hybrid Sterility and Speciation  and we hope that you are continuing to pursue research work in the subject. We would like to know more about your current research work. So, we recommend your name as one of our honorable authors who can contribute to the upcoming issue of ARCJGO.

Articles are invited from all the related aspects of Gynecology and Obstetrics. ARC Journal of Gynecology and Obstetrics.Welcomes you to submit any type of articles on various aspects of Gynecology and Obstetrics, but not limited to the below given classification.

  • Infertility
  • General Gynecology
  • Laparoscopic Surgery
  • Pregnancy Diagnosis
  • Female Urology
  • Puberty

Some of our recently published papers

  1. Treatment of Infertility by Natural Factors in a Patient Who Had Seven Failed Procedures of In Vitro Fertilization
  2. Intravesical Electromotive Botulinum Toxin in Women with Overactive Bladder – A Pilot Study
  3. Antenatal Magnesium Sulphate (Mgso4) for Fetal NeuroProtection Prior to Preterm Labor: Mini-Review

We promise you to provide the best editorial service for you and will support you to publish the article at the earliest possibility. Kindly acknowledge this email. For any further clarification, you can always reach us on:gynecology@oamedicaljournals.com

Note: Article Publication Charges is 450 USD for the articles submitted on or before December 20th, 2017.

If your research interest/topic doesn’t match this journal, you can visit our complete list of journals at: Journals List

Best wishes
InduSri. K
Editorial Assistant
ARC Journal of Gynecology and Obstetrics

Theodosius Dobzhansky was a famous evolutionary geneticist and my academic grandfather. My article was simply a celebration of his famous 1936 paper that reported the first genetic analysis of hybrid sterility, a form of reproductive isolation. It was published in Genetics. Why and how this ob-gyn journal found it, and why they want it for ARC Journal of Gynecology and Obstetrics, is beyond me.

Sometimes I’m tempted to respond to these emails and see if I can get this work into an inappropriate journal. But of course I’d have to pay big bucks to do that, for this offer isn’t without strings!

The BBC gets evolution wrong again when describing a new discovery of early eutherian mammals

November 7, 2017 • 9:30 am

Well, after uncritically publishing a piece on the new “species” of orangutan (and not even seeking out any dissenting voices, unlike the BBC’s Discover Wildlife site), the BBC news site once again engages in a misleading piece of science reporting. The misguided piece has the headline below (click on screenshot to go there); I’ll get to the text in a minute:

But first, the finding, documented in a paper by Steven Sweetman et al. in Acta Palaeontologica Polonica (reference and free download below). What they found, in a fossil bed in Dorset, were two tiny teeth and several tooth fragments. From the nature of these “tribosphenic” (three-cusped) teeth, they concluded that what they had was fossil material of a eutherian (placental) mammal from 145 million years ago: the earliest placental known after the therian ancestor (early mammal) split into the eutherians and metatherians (marsupials). In other words, this was on the placental side of the split between all placental mammals and marsupials, a split that took place about 160 million years ago. And it lived only about 15 million years after that split. That’s a very early eutherian mammal.

Can they conclude this from the teeth alone? I suspect so; paleontologists have a long history of studying teeth, and the authors are confident that these three-cusped teeth, which they identified as belonging to two species, are truly from placentals rather than marsupials. Read the paper if you want to know more, but be aware that it’s full of paleontological jargon. At any rate, here are the teeth, which, as you can see, are only about 4 mm long (about a sixth of an inch).

Fig. 4. Stereo scanning electron micrographs of studied eutherian mammal specimens from the Berriasian Purbeck Group of Dorset, southern England; in occlusal view. A. Durlstotherium newmani gen. et sp. nov., NHMUK PV M 99991. B. Durlstodon ensomi gen. et sp. nov., NHMUK PV M 99992.

Here’s a reconstruction of the two species in the paper, showing one of them being eaten by a theropod dinosaur:

(from paper): Fig. 7. Artist’s impression of the Purbeck lagoon at dusk with Durlstodon gen. nov. (left foreground), Durlstotherium gen. nov. (right and center foreground) and the theropod Nuthetes holding a captured Durlstotherium (centre middle distance). Artwork by Mark Witton.

These were clearly small shrewlike mammals, and probably nocturnal insectivores, something that we’ve long thought to be what the placental ancestor was like (see Dawkins’s The Ancestor’s Tale). A recent paper in Nature Ecology & Evolution by Roi Maor et al. (free download, reference and pdf link below) reconstructed the ancestral behavior of the first mammals by going back from the habits of extant mammals, and concluded that the chances were very high that the first mammals were nocturnal. That comports with an old hypothesis, which may well be true, that the earliest mammals were few, and were nocturnal because they were forced to hide from the many existing predatory reptiles. Only when the asteroid destroyed many reptiles, so the story goes, could mammals radiate into the many niches they occupy today, including those earlier filled by reptiles. This story might be what happened, for the radiation of mammals seemed to occur a while after the “ruling reptiles” had largely vanished.

This is all well and good. What is not well and good is the BBC’s characterization of this discovery. Here’s what they said (my emphasis):

Fossils of the oldest-known ancestors of most living mammals, including human beings, have been unearthed in southern England.

Teeth belonging to the extinct shrew-like creatures, which scampered at the feet of dinosaurs, were discovered in cliffs on the Dorset coast.

Scientists who identified the specimens say they are the earliest undisputed fossils of mammals belonging to the line that led to humans.

They date back 145 million years.

“Here we have discovered from the Jurassic coast a couple of shrew-like things that are to date unequivocally our earliest ancestors,” said Dr Steve Sweetman of Portsmouth University, who examined the ancient teeth.

We know with reasonable certainty that this was an early eutherian mammal. What we do NOT know, but what the BBC asserts, is that these animals “belonged to the line that led to humans” and, worse, “are our earliest ancestors.” That last mischaracterization is even more dire given that it was pronounced by one of the paper’s authors.

If either of these species were our earliest ancestor, it would have to be the species that gave rise to all extant eutherian mammals, including us. We don’t know that: all we know is that it was on the eutherian side of the eutherian/metatherian split, and lived soon after that split occurred. Further, we don’t even know if it was OUR ANCESTOR, for it could have been part of a radiation of these small creatures that went extinct without leaving descendants. In fact, given the frequency of extinction (99% or more of fossils represent species that died out without leaving descendants), that’s the likeliest result. It may be our relative, but not our ancestor.
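
You can see why with a back-of-the-envelope simulation. The sketch below is my own toy model with made-up rates, not anything from the paper: it starts a radiation with a handful of founding species, lets each lineage persist, split, or go extinct at every time step, and then asks how many of the founders left any living descendants.

```python
import random

# Toy birth-death model of a radiation (rates and seed are arbitrary).
# Each living species carries the id of the founding lineage it descends
# from, so at the end we can see which founders still have descendants.

def simulate(n_founders=20, steps=100, p_extinct=0.05, p_speciate=0.05, seed=1):
    random.seed(seed)
    living = list(range(n_founders))           # founder id of each living species
    for _ in range(steps):
        next_gen = []
        for founder in living:
            r = random.random()
            if r < p_extinct:
                continue                        # this lineage dies out
            next_gen.append(founder)            # this lineage persists...
            if r > 1 - p_speciate:
                next_gen.append(founder)        # ...and sometimes also splits in two
        living = next_gen
        if not living:
            break
    return set(living)                          # founders with living descendants

survivors = simulate()
print("Founders that left living descendants:", sorted(survivors))
```

In virtually every run, most founding lineages vanish without descendants, which is exactly why a 145-million-year-old fossil is far more likely to be a cousin of our ancestors than an ancestor itself.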

It’s like saying that, if the diagram below represents the phylogeny of hominins, Ardipithecus ramidus or Sahelanthropus tchadensis was “our earliest ancestor”, or “one of the oldest-known ancestors of living humans”. The shrews, like the many hominin species in the diagram below that went extinct, were not demonstrably our ancestors, much less “unequivocally our earliest ancestors”. The two hominin species just named were certainly our relatives, and on the hominin side of the hominin/chimp split, but they weren’t on the lineage leading to modern humans, nor were they our ancestors.

It irks me when a respected site like the BBC gets this kind of science reporting wrong. Because the writer either doesn’t understand evolution or wanted to make the findings more “gee-whizzy”, she wrote a misleading article. Pity.

UPDATE: I added this comment underneath the BBC’s report:

 

Addendum by Greg Mayer

Jerry and I have been discussing this this morning, and he invited me to add my two cents here.  As Jerry correctly points out, it is in general difficult or impossible to say that a particular fossil is an ancestor, as opposed to being a member of the group that includes an ancestor (the latter sometimes phrased as “near” the ancestor), since the chance of that fossil being exactly in the ancestral line (as opposed to near it) is slim.

But it’s also objectionable to call something the “earliest ancestor” without qualification (as the article, and even more regrettably, at least one of the scientists involved, do), even if you think it is in the exact ancestral line. And that’s because the earliest ancestor goes back to the beginning of life: the “earliest ancestor” of every life form on earth is a billions-of-years-old urprokaryote, and it is the same ancestor for all of life (unless life arose more than once). We also, of course, have myllokunmingiid, fish, reptilian, etc., ancestors that come between our bacterial and earliest mammalian ancestors. So we must specify the taxon, to which we belong, of which the fossil in question is supposed to be the earliest known member. What they should have said is that the new fossils are the earliest known members of the group from which all later eutherian mammals (including Homo) are descended, or for brevity, the “earliest known eutherian”. (Reader Andrew Norman got this right in the comments here earlier this morning.)

This case reminds me of a similar, and erroneous, claim in the press about Protungulatum (although the BBC did better that time), and of the Darwinius scandal.

h/t: H. Stiles

________

Sweetman, S. C., Grant Smith, and David M. Martill. 2017. Highly derived eutherian mammals from the earliest Cretaceous of southern Britain. Acta Palaeontologica Polonica, in press, available online 07 Nov 2017. doi: https://doi.org/10.4202/app.00408.2017

Maor, R., T. Dayan, H. Ferguson-Gow, and K. E. Jones. 2017. Temporal niche expansion in mammals from a nocturnal ancestor after dinosaur extinction. Nature Ecology & Evolution. doi: 10.1038/s41559-017-0366-5

The Daily Beast distorts epigenetics with bogus claims that children can “inherit memories of the Holocaust”

September 24, 2017 • 9:15 am

I’ve written extensively on this site about recent claims that environmental modifications of DNA, through either methylation (sticking a -CH3 group onto DNA bases) or changes to the histone scaffolding that supports the DNA, can constitute a basis for evolutionary change. This claim is simply wrong. To date, while we can show that environmental “shocks” given to animals or plants can sometimes be passed on to their descendants, the inheritance lasts at most three or four generations, then disappears. This cannot in principle support evolutionary change, which requires DNA changes that are permanent, so that they can spread through a population and effect a long-term genetic transformation.

Further, the changes shown are almost never “adaptive”—that is, they usually don’t produce anything that would enhance reproduction even under the environmental conditions in the lab that produce them.

Finally, extensive genetic mapping of real adaptations in nature, ranging from insecticide resistance in mosquitoes to lactose tolerance in humans and to armoring in marine stickleback fish, shows that the changes invariably reside in the sequence of DNA bases themselves, not in add-on methylation or histone changes produced by the environment. I don’t know a single adaptation in nature that, when we track down its genetic basis, turns out to reside in some environmentally modified, epigenetic change in DNA that can be transmitted for generations.

(Let me add here that epigenetic changes have promoted adaptive evolution when those changes reside in the DNA itself: that is, there are stretches of DNA that, in effect, tell the organism things like “put a methyl group on nucleotide X”. But these changes are themselves the product of conventional natural selection, and the epigenetic marks are produced by the DNA itself, not by the environment.)
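
The arithmetic behind this is easy to show. Here’s a minimal deterministic sketch (my own illustration with made-up numbers, not based on any study discussed here) contrasting a beneficial variant permanently encoded in DNA with an environmentally induced mark that confers the same advantage but is erased in a fraction of offspring every generation:

```python
# Toy model: a permanent DNA variant vs. a transient epigenetic mark,
# both with the same (assumed) 10% reproductive advantage. The numbers
# are invented for illustration only.

def next_freq(p, advantage):
    """One generation of selection on a haploid variant with fitness 1 + advantage."""
    mean_fitness = p * (1 + advantage) + (1 - p)
    return p * (1 + advantage) / mean_fitness

p_dna = p_epi = 0.05      # both start at 5% frequency
advantage = 0.10          # assumed selective advantage
erasure = 0.30            # fraction of offspring in which the mark is erased each generation

for _ in range(200):
    p_dna = next_freq(p_dna, advantage)
    p_epi = next_freq(p_epi, advantage) * (1 - erasure)

print(f"DNA variant frequency after 200 generations: {p_dna:.3f}")
print(f"Epigenetic mark frequency after 200 generations: {p_epi:.2e}")
```

With these assumed numbers the DNA variant ends up near fixation while the mark effectively vanishes: the per-generation erasure swamps any plausible selective advantage, which is why transient epigenetic inheritance cannot be the basis of long-term adaptation.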

Nevertheless, because the idea of evolution caused by environmental modification of our genomes is both appealing and “nonDarwinian”—violating how scientists think evolution works—it appeals to a subset of people who think that the theory of evolution is woefully incomplete. These are the “Kuhnabees” like Steve Gould, who think a brand new evolutionary paradigm is in order. (The Templeton Foundation gives out millions of bucks to people trying to reach this conclusion.) I’ve been critical of this type of revisionist excitement, not because I want to defend the modern evolutionary synthesis at all costs, but for the reasons stated above: there’s no evidence for lasting, environmentally caused, adaptive modification of DNA, and genetic mapping experiments on real evolved adaptations invariably show that the evolutionary changes reside in the DNA’s sequence of nucleotide bases: the order of Gs, Cs, Ts, and As.

But the juggernaut rolls on, promoted by articles like this one in The Daily Beast by Elizabeth Rosner, “Can we inherit memories of the holocaust and other horrors?” (Subtitle: “In the trailblazing field of epigenetics, researchers are finding evidence that the descendants of victims of atrocities are inheriting those experiences in their DNA.”)

It’s the standard boilerplate article, showing some environmental modification of DNA that can be inherited for a few generations, but then bears a title and subtitle that are wholly misleading, and are echoed in the article’s text.

Rosner cites three experiments, only one of which I’ve been able to read. That one is a study of mice given an electric shock when exposed to a particular odor (reference and free access below). After a while, the mice froze in the presence of the odor (cherry blossoms), even in the absence of a shock. This is a pretty normal result. But then the authors bred subsequent generations of mice and showed that they, too, froze in the presence of that specific odor but not others. This behavior was associated with hypomethylation in sperm (reduced methylation) affecting a particular gene associated with olfaction. This genetic change was, in turn, associated with changes in brain structure. This is a correlation, not causation, but it’s certainly intriguing.

But the inheritance of this behavior lasted only two generations: it was last seen in the grandchildren of the exposed mice. While author Rosner says the inheritance lasted “three subsequent generations” beyond the offspring of the tested mice—that is, a total of four generations of inheritance—I can see only two generations of inheritance. Unless I’m misreading, Rosner has misrepresented the results, and of course there was no testing a hundred generations later, which would be necessary if this freezing behavior (which is simply a neural response to shock, and not necessarily adaptive) were to be the basis of a real adaptation.

Now reacting to an odor behaviorally is not the same thing as remembering the Holocaust. Rosner gets that conclusion from a study of 32 Holocaust survivors (a small sample, with Rosner citing only a book and not the original papers). Apparently that study showed that PTSD was more common in the offspring of PTSD-afflicted Holocaust survivors than in the population as a whole (again, I haven’t seen the data, and don’t know what the control group was, which should properly be the kids of non-PTSD Holocaust survivors). And I’m dubious when the study’s author, Rachel Yehuda, says she’s found this correlation “‘inexplicable’ by any other means than intergenerational transmission.”

Well, cultural transmission is also intergenerational (traumatized parents could treat their children in a way that these kids would themselves be traumatized), but let’s assume that the author of that study managed to show that the inheritance was truly genetic rather than cultural. (This kind of separation isn’t easy, and often uses adoption studies.)

And if the changes were genetic, were they due to methylation or to changes in histones? Nothing is said in the article. What Rosner and other summaries say is that Holocaust survivors have lower levels of cortisol, a hormone that helps people recover from trauma (clearly not an adaptive response), but also lower levels of an enzyme that breaks down cortisol, helping store metabolic energy—something that might be adaptive under starvation. Most important, the children of these survivors also had lower levels of cortisol—but (unlike their parents) also higher levels of the enzyme that breaks it down, an environmental modification of biochemistry that was inherited across one generation. Both of these modifications are palpably nonadaptive in the offspring, as reduced cortisol makes you recover more slowly from trauma, and so the offspring of Holocaust survivors would be more susceptible to trauma.

While you can make up a story about why being more likely to get PTSD if your parents were traumatized might be adaptive, you’d have to engage in some tortuous argument here. I could, with a minute’s thought, make up an adaptive story were the results to be in the opposite direction. Further, you can easily argue that this is simply a biochemically induced change, heritable for just one generation, that doesn’t modify offspring in an adaptive way. Too, there’s no evidence to date that this change in behavior persists for more than one generation. I’d also like to see adoption or other studies showing that the correlation between parents and offspring is due to genetic rather than cultural inheritance. Finally, note that what we have here are “heritable” changes in mental illness, not heritable “memories of the Holocaust”, as the article’s title implies.

Finally, Rosner mentions a book that produces what I see as very slim evidence for heritable PTSD:

Psychiatrist Nirit Gradwohl Pisano published a book titled Granddaughters of the Holocaust: Never Forgetting What They Didn’t Experience. She focused on ten subjects who are survivors’ grandchildren and, following current theories in epigenetics, found evidence of what she refers to as the “hard-wired” PTSD passed down to the descendants of survivors.

“[These] ten women provided startling evidence for the embodiment of Holocaust residue in the ways they approached daily tasks of living and being … Frequently unspoken, unspeakable events are inevitably transmitted to, and imprinted upon, succeeding generations. Granddaughters continue to confront and heal the pain of a trauma they never experienced.”

Ten subjects! Yes, it’s two generations of inheritance, but did Pisano rule out, in her book, cultural transmission of propensity to PTSD? How did she know it was “hard-wired” (i.e., in the DNA)? Note, too, the bogus “unspoken, unspeakable events inevitably transmitted to. . . succeeding generations.” That is a gross distortion, and one that Rosner doesn’t even bother to examine critically. These grandchildren didn’t inherit memories of the Holocaust—at best they inherited a propensity to get PTSD. The only reason they’d even KNOW about the Holocaust is by cultural transmission from their ancestors or through reading. The events are not remembered at all, at least genetically!

So, in answer to Rosner’s title question, “Can we inherit memories of the Holocaust and other horrors?”, the answer is “we have no evidence for that”. And the subtitle’s claim that we “inherit experiences in our DNA” is just wrong. Rosner and The Daily Beast, unfortunately, have been hit by the epigenetics juggernaut, and in a way that makes them pass on completely misleading implications about inheritance. This is science reporting at pretty much its worst: a sundae of misconceptions topped with a clickbait cherry of a title.

h/t: Saul

________________

Dias, B. G., and K. J. Ressler. 2014. Parental olfactory experience influences behavior and neural structure in subsequent generations. Nature Neuroscience 17(1): 89–96. doi: 10.1038/nn.3594.