Evidence for evolution: Hairless animals have dead genes for a full coat of hair

January 15, 2023 • 9:30 am

In Why Evolution is True, one of the most telling pieces of evidence I adduce for evolution is the existence of dead (nonfunctional or “vestigial”) genes in the DNA of living species. For example, mammals like us carry three dead genes for making egg yolk. Evolution has rendered them nonfunctional, as mammalian embryos are nourished through the placenta, but the genes are still there in the genome, disabled by mutations.

The genome of nearly all animals we know is a veritable graveyard of dead genes. These, like our egg-yolk genes, constitute irrefutable evidence for evolution. They’re still there because we inherited them from a common ancestor, but evolution usually inactivates unneeded genes not by snipping them out of the genome, but by allowing “inactivation” mutations to kill the genes’ production of protein. (Alternatively, inactivation can occur through mutations that disable the promoter region that causes a gene to be transcribed.) The genes just sit there, “silent signs of history.”

Both types of dead genes were found in this new eLife paper, and they are genes that normally promote the growth of hair in the hairy relatives of species that have lost most of theirs. That the genes are still there, but are nonfunctional, simply can’t be explained by anything other than common ancestry. That’s why creationists, like the chowderhead I’ll highlight in the next post, ignore them. (Similarly, they ignore the biogeographic evidence from oceanic islands—also explained in Why Evolution is True—because there’s no creationist explanation save “Well, God wanted things to look like they’d evolved.”)

This is a long and complicated paper from eLife, but the popular version in Science Alert, shown below, is not sufficiently detailed. I’ll try to simplify the eLife paper but give more information than the popular précis.

The pdf for the eLife paper is here, and the reference is at the bottom.

The short take: the authors sequenced a handful of relatively hairless species descended from hairy ancestors, looking for genes common to this set that (a) were likely involved in producing hair, but (b) had been inactivated in these species by “relaxation of selection.” That is, there was no longer natural selection in these species to maintain a coat of hair (and good reasons not to have one), and so mutations inactivating the genes (and their controlling elements) accumulated. Further, natural selection can accelerate this trend by favoring gene variants that reduce hair, either because making hair uses up metabolic energy that isn’t needed or, more likely, because hair is an impediment to these species’ lifestyles.

The interesting thing about the paper is that, by sequencing the DNA of relatively hairless species, they found sets of genes in common among the hairless species, implying that there were common evolutionary-genetic pathways for hair reduction. This is what’s called convergent evolution, which usually refers to similar appearances of organisms that have similar lifestyles but aren’t closely related—like the marsupial mole and the placental mole—but in this case it’s convergence at the level of genes.

Here are the species they looked at:

naked mole rat

. . and a subset of all species studied showing their evolutionary relatedness. I love the example they use for humans:

(from the paper): Hairless species show an enrichment of hair-related genes and noncoding elements whose evolutionary rates are significantly associated with phenotype evolution. (A) Phylogenetic tree showing a subset of the 62 mammal species used for analyses. Note that all 62 species were included in analyses and only a subset are shown here for visualization purposes. Foreground branches representing the hairless phenotype are depicted in orange alongside photographs of the species.

Most of these animals have some hair, but the authors conjecture, with reason, that their ancestors were much hairier. This is likely to be true, though the elephant and manatee had a recent common ancestor, and it’s not clear whether their hairlessness evolved twice. The authors support independent loss by adducing the hairy mammoths, which were more closely related to modern elephants, implying that the ancestral pachyderm was hairy. But hairy elephants like mammoths could have represented the re-evolution of hair in a relatively hairless ancestor. Likewise with the dolphins and orcas: I’m not sure how these two, which are fairly closely related marine mammals, could be taken as independent losses of hair. (On the other hand, the walrus, less closely related, could have lost its hair independently.)

They also looked at 52 other species, for you need DNA sequences from hairy animals for comparison. The figure above shows some of the hairy species whose DNA was sequenced (they looked at a lot of genome: nearly 20,000 coding genes and 350,000 regulatory regions).

Surprisingly, they found a fair number of genes that lost function (or experienced “relaxed selection”) in all of the hairless species. Not all of the genes had a known function, but most were associated with the hairs themselves, the hair follicles, or the dermal papillae, the crucial structures that allow hair to grow. Here’s a list of five genes and a table of the likelihood that they would have changed so rapidly in all the species. The colors show where the genes act.

(From the paper): Diagram of hair shaft and follicle with shading representing region-specific enrichment for coding and noncoding sequence. Both coding and noncoding sequence demonstrate accelerated evolution of elements related to hair shaft (cortex, cuticle, and medulla). Noncoding regions demonstrate accelerated evolution of matrix and dermal papilla elements not observed in coding sequence. All compartment genesets were compiled from Mouse Genome Informatics (MGI) annotations that contained the name of the compartment except the arrector pili geneset (Santos et al., 2015).

Note that both coding (genes) and noncoding (controlling-element) DNA was involved; in fact, among all the genes identified as likely contributors to hairlessness, there were more noncoding changes than coding changes, which is often what we find when either new structures evolve or old structures are lost. I used to think—and wrote a controversial paper about this with Hopi Hoekstra—that structural (coding) genes were more important in evolutionary change, but the data show that it might be the other way around. In other words, Hopi and I may have been wrong.

Now the paper is long and complicated, and bits of it are beyond my pay grade, but I do have a few comments. First, the significance levels they use to ascertain common evolution of genes among the set of relatively hairless species are not that small. They even highlight genes, as you can see above, with adjusted probability values above 0.05; conventionally these would be considered “nonsignificant”. I’m not sure why they did that. However, as you can see from the table above, some of the adjusted probabilities were very, very small: the p value for noncoding sequences in the hair cortex is, for instance, 0.000003. I’m confident that they did at least find some genes that changed rapidly in the entire group of hairless species.
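A note on those adjusted probabilities: with nearly 20,000 genes and 350,000 noncoding regions tested, raw p-values must be corrected for multiple testing, which is why the paper reports adjusted values. As a minimal sketch (my illustration, with made-up p-values; the paper’s exact procedure may differ), here’s the standard Benjamini-Hochberg false-discovery-rate adjustment:

```python
def bh_adjust(pvals):
    """Benjamini-Hochberg FDR adjustment: each p-value is scaled by
    (number of tests / its rank), then made monotone from the top down."""
    n = len(pvals)
    order = sorted(range(n), key=lambda i: pvals[i])  # indices, smallest p first
    adjusted = [0.0] * n
    prev = 1.0
    # Walk from the largest p-value to the smallest, enforcing monotonicity
    for rank_from_end, i in enumerate(reversed(order)):
        rank = n - rank_from_end  # 1-based rank of this p-value
        adj = min(prev, pvals[i] * n / rank)
        adjusted[i] = adj
        prev = adj
    return adjusted

# Hypothetical raw p-values, loosely echoing the range seen in the table
print(bh_adjust([0.000003, 0.01, 0.04, 0.20]))
```

The point is that a tiny adjusted value like the cortex’s 0.000003 survives even a genome-scale correction, while values near 0.05 are marginal.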

Second, they’re not sure in some cases that the rapid gene evolution was indeed associated with loss of gene function. You can tell this for coding genes because there will be a premature “stop” (nonsense) codon in the DNA sequence, so the mRNA codes for a truncated, nonfunctional protein. They don’t talk about this in detail, but simply use “rapid evolution” as an index of nonfunctionality. (I may have missed something.)

Finally, for pairs like the elephant and manatee on one hand and the dolphin and orca on the other, I don’t have a lot of confidence that their loss of hair occurred independently.

Nevertheless, we can have confidence, given the low probability values, that some structural and controlling DNA has evolved independently in a group of hairless species, causing them to lose hair. That’s a case of convergent evolution of genes that is quite novel.

Oh, I forgot to mention why these species lost hair. In our species, it probably happened to promote easier cooling of our bodies via sweating as we evolved into upright creatures on the savanna. This probably also holds for rhinos and elephants, especially because elephantine species in northern climes, like mammoths, were hairy. In marine mammals it’s obvious: hair is useless for insulation, and is just an impediment to swimming. As for armadillos and pigs, it’s anybody’s guess. Wild pigs are pretty hairy (at least the ones I’ve seen), but armadillos have shells, and that serves to insulate the animal (they do have hair on their bellies, but it’s sparse).

An armadillo’s belly from Flickr:


THE UPSHOT:  These are likely cases of “vestigial genes,” though they would become textbook examples if we knew exactly what the genes did and, importantly, could show beyond doubt that they have been inactivated in the hairless species. Those data will come some day, but in the meantime I prefer to cite the broken egg-yolk genes in mammals: remnants of genes that produced nutrients for the embryos of our reptilian, fishy, and amphibian ancestors. That is a very solid case.

You can read the “popular” take below:

h/t: Barry


Kowalczyk, A., M. Chikina, and N. Clark. 2022. Complementary evolution of coding and noncoding sequence underlies mammalian hairlessness. eLife 11:e76911. https://doi.org/10.7554/eLife.76911

My interview about evolution with Ray the Producer

September 4, 2022 • 1:35 pm

Yesterday I had an interview with “Ray the Producer” (his YouTube channel, “Allah Who?”, is here); he tends to interview people who are critical of Islam. I was invited on to talk about evolution, a theory that is widely rejected by Muslims, especially those who are Qur’anic literalists. And so the 1.5-hour conversation is about the evidence for evolution and why people reject it. (Ray is an ex-Muslim atheist.)

Here’s the video, and remember that I had about three hours of sleep when I did it yesterday morning. As always, I haven’t listened to it as I cannot abide seeing myself on video. If you can, and want to, here it is for your delectation.

Lactase persistence in populations that drink milk: a classic story of human evolution re-evaluated

July 29, 2022 • 9:15 am

The classic tale of “gene-culture coevolution” in humans—the notion that cultural changes in behavior changed the selection pressures that impinged on us—is the evolution of “lactase persistence” (LP) over the past four thousand years.  LP is a trait that allows you to consume, as an adult, lots of milk or dairy products without suffering the side effects of indigestion, flatulence, or diarrhea.

Young children are able to tolerate milk while nursing, of course, but after weaning many of them no longer tolerate milk—they are lactose intolerant (LI). The ability to digest lactose goes away after weaning because the gene producing the necessary enzyme gets turned off.

The gain of LP, which enables you to drink milk and eat dairy products into adulthood without ill effect, rests on single mutations in the control region of the gene producing lactase, an enzyme that breaks down the milk sugar lactose.  These mutations have arisen independently several times, but only after humans began “pastoral” activities: drinking milk from domesticated sheep, goats, and cows. And the mutations act to keep lactase turned on even after weaning. (Why humans turn off the gene after weaning isn’t known, but it presumably involved the metabolic cost of producing an enzyme that wasn’t used in our ancestors, who didn’t drink milk after weaning until about 10,000 years ago—when farming and animal domestication began.)

Based on analysis of fossil DNA, the LP mutations began spreading through Europe (starting from what is now Turkey) about 4000 years ago. And so the classic story—one that I taught in my evolution classes—is that humans began drinking milk from captive herds, which gave an advantage to retaining the ability to digest milk even after weaning. Ergo, natural selection for the nutritional benefits of milk led to the spread of LP mutations, as their carriers may have had better health (and hence more offspring) than individuals who turned off the enzyme at weaning.

This leads to the “coevolution” that is the classic evolutionary tale: a change in human behavior (raising animals for milk) led to selection for the persistence of the milk-digesting enzyme, and thus to genetic evolution. The “coevolution” part is the speculation that being able to digest milk without side effects would cause humans to raise even more dairy animals and drink even more milk, intensifying the selection for LP, and so the gene for LP would keep increasing in frequency.

A new paper in Nature, which is being touted all over social media, argues against this classic story, suggesting that it’s more complex than previously envisioned.  Although the new results are touted as overturning the earlier story, they really don’t. There is still human genetic evolution promoted by a change in culture, and there’s still a reproductive advantage in drinking milk.

The new part of the story is simply that that reproductive advantage comes not constantly (as previously envisioned), but only during times of famine and disease, when those who couldn’t digest lactose were at a severe disadvantage because the diarrhea caused by lactose intolerance would contribute to the death of diseased or malnourished individuals. This is a twist on the main story, but doesn’t overturn it completely. There’s still the connection between culture and human evolution, and there’s still a reproductive advantage to LP that leads to natural selection and genetic evolution of our species.  What’s different is how and when the selection acts (see “the upshot” at the bottom).

Click the title screenshot below to read, or you can download the pdf here. The full reference is at the bottom, and Nature deemed this worthy of two News and Views pieces in the same issue: (here and here).

First, the authors show the spread of dairy use in the figure below (the redder the color, the more milk usage over time in Eurasia). This was estimated from the frequency of potsherds with milk residue (click to enlarge). By 1500 BC, milk use was widespread.

Caption (from Nature): Interpolated time slices of the frequency of dairy fat residues in potsherds (colour hue) and confidence in the estimate (colour saturation) using two-dimensional kernel density estimation. Bandwidth and saturation parameters were optimized using cross-validation. Circles indicate the observed frequencies at site-phase locations. The broad southeast to northeast cline of colour saturation at the beginning of the Neolithic period illustrates a sampling bias towards earliest evidence of milk use. Substantial heterogeneity in milk exploitation is evident across mainland Europe. By contrast, the British Isles and western France maintain a gradual decline across 7,000 years after first evidence of milk about 5500 BC. Note that interpolation can colour some areas (particularly islands) for which no data are present.

One reason the authors doubt the classical story is that while dairying and milk-drinking by adults began about 10,000 years ago, the gene for LP (determined from sequencing “fossil DNA”) didn’t spread widely until about 4,000 years ago.  Why is that? The mutation for LP is dominant, which means it could have spread widely very quickly, as even carriers of one copy would have a reproductive advantage. This temporal disparity is what led the authors to propose their alternative hypotheses for the spread of the LP alleles (there are several).
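The point about dominance can be made concrete with a toy deterministic population-genetics sketch (my illustration; the selection coefficient and starting frequency are invented, not estimates from the paper). A beneficial dominant allele is exposed to selection in heterozygotes from the start, so it climbs far faster from rarity than a beneficial recessive allele, which hides from selection until homozygotes become common:

```python
def generations_to_reach(p0, s, target, dominant=True):
    """Generations for allele A (initial frequency p0, selective advantage s)
    to reach `target` frequency under deterministic selection in a large
    randomly mating population."""
    p, gens = p0, 0
    while p < target and gens < 100_000:
        q = 1 - p
        if dominant:
            # fitnesses: AA = Aa = 1+s, aa = 1
            w_bar = (p * p + 2 * p * q) * (1 + s) + q * q
            p = (p * p + p * q) * (1 + s) / w_bar
        else:
            # fitnesses: AA = 1+s, Aa = aa = 1
            w_bar = p * p * (1 + s) + 2 * p * q + q * q
            p = (p * p * (1 + s) + p * q) / w_bar
        gens += 1
    return gens

# Dominant vs. recessive beneficial allele, same advantage, same start
print(generations_to_reach(0.01, 0.05, 0.5, dominant=True))
print(generations_to_reach(0.01, 0.05, 0.5, dominant=False))
```

With these made-up numbers the dominant allele reaches 50% in on the order of a hundred generations, the recessive one in thousands; that contrast is why a dominant LP allele that sat at low frequency for millennia calls for an explanation.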

Further, when the authors tried to correlate the frequencies of the LP allele with the frequency of milk use (the classical explanation), they found no correlation—that pattern was indistinguishable from a general rise in frequency over Europe regardless of milk use.

One other set of data led to the new hypothesis. That is the observation that LI people in both Britain and China can still drink lots of milk without suffering any measurable health or reproductive effects (milk drinking has recently proliferated in China).  Of course, things are different now from 4000 years ago, but one of the differences led to the authors’ two hypotheses: the spread of the LP allele was promoted especially strongly in prehistoric times by the prevalence of famine and of disease—with the latter coming often from animals, either domesticated or those that hang around settlements. (As the authors note: “about 61% of known and about 75% of emerging human infectious disease today come from animals”).

So the authors erected two hypotheses, the crisis mechanism and the chronic mechanism. I’ll let them describe the hypotheses that they tested (my emphases):

Given the widespread prehistoric exploitation of milk shown here and its relatively benign effects in healthy LNP individuals today, we propose two related mechanisms for the evolution of LP. First, as postulated in ref. 24, the detrimental health consequences of high-lactose food consumption by LNP individuals would be acutely manifested during famines, leading to high but episodic selection favouring LP. This is because lactose-induced diarrhoea can shift from an inconvenient to a fatal condition in severely malnourished individuals and high-lactose (unfermented) milk products are more likely to be consumed when other food sources have been exhausted. This we name the ‘crisis mechanism’, which predicts that LP selection pressures would have been greater during times of subsistence instability. A second mechanism relates to the increased pathogen loads—especially zoonoses—associated with farming and increased population density and mobility. Mortality and morbidity due to pathogen exposure would have been amplified by the otherwise minor health effects of LNP in individuals consuming milk—particularly diarrhoea—due to fluid loss and other gut disturbances, leading to enhanced selection for LP. We name this the ‘chronic mechanism’, which predicts that LP selection pressures would have increased with greater pathogen exposure.

In other words, the reproductive advantage of having the LP allele came from the reproductive disadvantage (through death) of lactose-intolerant people during times of famine and disease.

They tested the two hypotheses by correlating indices of famine and of disease deduced from archeological and paleontological evidence:

Crisis mechanism: “Subsistence instability”, or famine, was assessed by prehistoric fluctuations in population size, which, the authors say, is correlated with the likelihood of famine (they provide no evidence for the latter supposition). But the correlation gives a significantly better fit to the pattern of LP allele frequency than just assuming uniform selection over time and space.

Chronic mechanism:  The authors hypothesized that the frequency of disease would correlate with the likelihood of “zoonoses” (diseases caught from animals), which itself would correlate with temporal variation in settlement densities.  These data, which to me would be correlated with “prehistoric fluctuations in population size” above, also explained LP allele frequencies better than an assumption of uniform selection.

Of course, there’s no reason (and the authors say this) that both mechanisms couldn’t operate together. Curiously, though, indices of the density of domestic animals did not support the “chronic mechanism,” though measurements of the proportion of wild animals around humans did.  This implies that, if the “chronic mechanism” is correct, people were getting sick not from their horses, dogs, cattle, or sheep, but from wild animals (perhaps from eating them).

Other hypotheses that the authors mention but didn’t test include “drinking milk as a relatively pathogen-free fluid”, allowing “earlier weaning and thus increased fertility.” I would add that if diseases are causal here, they could come not from being around animals, but having drunk contaminated water, giving an advantage to those who prefer milk. But there’s no way of assessing that from the archaeological record.

The upshot: On the last page of the paper the authors say that they’ve debunked the prevailing narrative:

The prevailing narrative for the coevolution of dairying and LP has been a virtuous circle mechanism in which LP frequency increased through the nutritional benefits and avoidance of negative health costs of milk consumption, facilitating an increasing reliance on milk that further drove LP selection. Our findings suggest a different picture. Milk consumption did not gradually grow throughout the European Neolithic period from initially low levels but rather was widespread at the outset in an almost entirely LNP population. We show that the scale of prehistoric milk use does not help to explain European LP allele frequency trajectories and thus it also cannot account for selection intensities. Furthermore, we show that LP status has little impact on modern milk consumption, mortality or fecundity and milk consumption has little or no detrimental health impact on contemporary healthy LNP individuals.

Instead, they say that they find support for the increase of LP alleles through both famine or pathogen exposure.

Well, the data are the data, and their indices comport better with those data than does the classical hypothesis—the “prevailing narrative.” I’m still not convinced that their proxies for famine or disease are actually correlated with famine and disease themselves, but other researchers will undoubtedly dig into that.

What I want to emphasize is that if the work of Evershed et al. is accurate, it still does not overturn the story of gene-culture “coevolution”.  The “coevolution” is still there, the fact that a change in human culture influenced our evolution is still there, and the fact that drinking milk conferred higher reproductive fitness is still there. What has changed is only the nature of selection. Granted, that’s a significant expansion in understanding the story, but to listen to the media—social or otherwise—you’d think that the “classical narrative” is completely wrong. It isn’t. It’s still correct in the main, but the way selection acted may be different from what we used to think. The media love “evolution scenarios are wrong” tales, and that seems to be the case with at least some of the stuff I’ve seen in the news and on social media.


Reference: Evershed, R.P., Davey Smith, G., Roffet-Salque, M. et al. 2022. Dairying, diseases and the evolution of lactase persistence in Europe. Nature. https://doi.org/10.1038/s41586-022-05010-7

Vox’s evidence for evolution from vestigial traits in humans

June 14, 2022 • 2:00 pm

Here’s an old video from Vox that shows morphological evidence for evolution in the human body based on vestigial organs and traits. Most of these can be found in Why Evolution is True, but it’s good to see them in video like this.

This takes the website back to its original aim: giving the readers evidence for why evolution is true.

A vestigial trait of birds that may have been functional in ancestors: remote-sensing of vibrations in the bill (still active in the kiwi)

December 11, 2020 • 10:00 am

A new scientific paper from the Proceedings of the Royal Society Series B (first screenshot below) tells a rather complex story that I’ll deliberately simplify to save space. The paper is behind a paywall, but a pdf may be found via judicious inquiry, and the reference is at the bottom.

The article above is aptly summarized by Veronique Greenwood in the New York Times’s “Trilobite” column.

Three groups of birds have evolved a remarkable feature: the ability to remotely sense prey (i.e., detecting prey without touching them) by sticking their bills in the ground and sensing vibrations. These groups are the kiwis, the ibises, and some shorebirds. The detection can occur either through the direct sensing of vibrations of prey movement, or the reflection of sound waves off hard-shelled prey as the bird sticks its beak into the ground. This feature is called “remote touch.”

It turns out that the bill tips of “remote touch” birds are pitted with small depressions containing cells called “Herbst corpuscles”, which are the motion-detecting organs. These birds also have an expanded area of the brain that is used to process the extremely important touch signals.

Other species have different ways of using their bills to detect prey by touch. Ducks and geese have a bony organ at the tip of their bills that also has pits with Herbst corpuscles, but these are organized differently, with mechanoreceptors beside them. These organs are what ducks use in “dabbling”—turning their butts up and sticking their beaks into the dirt or sediments to forage. Finally, parrots have a different kind of bill-tip organ with receptors not located in the bone.

Below are photos from the paper showing the different types of bill tips. The authors also examined skulls of hundreds of living species and dissected beak tissue from many to see if there were Herbst cells associated with the bill pits.

First, a bird without remote sensing, as with most birds. It’s a kelp gull (Larus dominicanus). There are a few pits at the tip of the bill, but soft-tissue analysis showed no receptor cells. It does not forage by touch.

Here are two birds with remote sensing. First, the hadeda ibis (Bostrychia hagedash). Note the highly pitted bill tip organ (enlarged). It also has the Herbst cells as well as an enlarged bit of the brain for detecting touch. (This bird, like all other birds save the ratites and tinamous, falls into the large group of species called neognaths.)

Ditto for the kiwi, which falls into the other group of birds, the paleognaths, a small group that contains only the large flightless birds or ratites (emus, ostriches, etc.) plus the tinamous, which can fly, but not well. Its remote sensing organ with Herbst cells is located at the very tip of its long bill. Indeed, the ratio of bill length to skull size is one of several keys to diagnosing whether these birds have remote sensing.

Finally, the tinamou, a paleognath that has a remote-sensing organ containing the pits but no Herbst corpuscles in them. But this species doesn’t feed by probing the ground. The ratites, like the ostrich and emu, also have a pit-studded bill but no vibration-detecting cells. The pits seem to be a leftover from an ancestor whose pits were useful because they contained vibration-detecting cells. In other words, they’re a vestigial trait.

The other ratites also lack the expanded brain regions for processing information from the touch receptors. This makes sense, for while it may not cost much to retain some pits in the bill when you don’t need them, brain tissue is metabolically expensive, and if you’re not using it it would pay to divert those resources to other functions that would help you reproduce.

As I said, the presence of the pits in birds that don’t use them suggests that this trait is a vestige carried over from an ancestor.  One can distinguish the remote-touch birds from other species by a combination of bill length/skull size ratio, number of pits, and spacing between the pits.
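As a purely illustrative sketch of how such a morphometric diagnosis might be coded (every threshold here is invented for illustration; the paper’s actual discriminant analysis is more sophisticated):

```python
def looks_like_remote_toucher(bill_to_skull_ratio, pit_count, mean_pit_spacing_mm):
    """Flag a specimen as a candidate remote-touch forager when its bill is
    long relative to the skull and its bill tip is densely pitted.
    All cutoffs are hypothetical, chosen only to illustrate the logic."""
    long_billed = bill_to_skull_ratio > 0.6                          # hypothetical cutoff
    densely_pitted = pit_count > 100 and mean_pit_spacing_mm < 0.5   # hypothetical cutoffs
    return long_billed and densely_pitted

# A kiwi-like specimen vs. a gull-like one (made-up measurements)
print(looks_like_remote_toucher(0.8, 300, 0.2))
print(looks_like_remote_toucher(0.3, 10, 2.0))
```

The appeal of a rule like this is that it needs only bony characters, which is what lets the authors apply the same diagnosis to fossil skulls below, where no soft tissue survives.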

But which ancestor? It turns out that we have fossil skulls of ancient extinct birds, the lithornithids, which are very early paleognaths. Although soft tissue wasn’t available for these birds, some of the species show the mechano-sensing organ—as evidenced by the number and spacing of the pits, as well as the bill/skull ratios characteristic of remote foragers. Here’s a photo of two lithornithid skulls; captions under the photos are from the paper (click to enlarge photo).

Cranial fossils of two species of lithornithids, showing high degree of pitting on the surfaces of their beaks, similar to all extant palaeognathous birds, potentially indicative of a bony bill-tip organ. (e) Lithornis promiscuus: (i) skull and attached maxilla (USNM 391983) showing the shape of the beak relative to the skull; (ii) distal portions of maxilla and mandible (USNM 336535). ( f ) Paracathartes howardae: maxilla (USNM 404758) and distal portion of mandible (USNM 361437).

The conclusion is that putative ancestors of the paleognaths were remote-touch-sensing species. The fact that living paleognaths like emus and tinamous still retain the pits suggests that this nonfunctional “organ” is a useless remnant of a trait inherited by all paleognaths from a lithornithid ancestor.  Indeed, the authors think that the ancestor of all birds might have been a remote-sensing prober (my emphasis):

Our analyses corroborate that the basal palaeognaths, the small, volant lithornithids, had a tactile bony bill-tip organ enabling them to use remote touch to locate buried invertebrate prey items. This finding, combined with our understanding of the evolution of the lithornithids, suggests a Cretaceous origin of the remote-touch sensory system in modern birds before the palaeognath-neognath split.

As for why, among living paleognaths, only the kiwi has a functional touch organ when it was present in an extinct ancestor, that could be explained by either of two scenarios. The first involves the organ’s loss in a more recent ancestral species and then its re-acquisition in just the kiwi lineage. The second is that the kiwi kept an ancestral remote-probing organ while all the other paleognaths lost it. The authors are unable to distinguish between these two scenarios.

What about the neognaths that have remote-sensing organs, like the ibis or shorebirds? Did they retain the ancestral touch organ while all other neognaths—the vast majority of living birds—lost it? Probably not; as the authors say, this is an independent case of evolution.

What is even more fascinating is the possibility that this ability to detect prey remotely may have been present in the reptilian ancestors of birds, which many scientists think were the theropod dinosaurs:

Interestingly, there is increasing evidence that some non-avian theropods had specialized sensory structures located on the distal portion of their rostra, based on a high degree of external foramina/pitting preserved on their mandibles. We speculate that perhaps such sensitive snouts in non-avian theropods may have been precursors to the evolution of remote touch in their avian relatives.

It’s interesting to note that alligators and crocodiles also have touch-sensitive “dome receptors” in their upper jaws, also associated with pitting in the bones.  The archosaurs are a group of early reptiles ancestral to both birds and crocodilians, and maybe the receptors we see in crocs and gators are related, in some way, to the pits in the beak of the kiwi.

This is all speculative, but what seems pretty solid is that the bill pitting and useless “touch organs” in non-kiwi ratites and tinamous are vestigial remnants of a functional organ in an ancient ancestor. And that’s evidence for evolution.


Toit, C. J. d., A. Chinsamy, and S. J. Cunningham. 2020. Cretaceous origins of the vibrotactile bill-tip organ in birds. Proceedings of the Royal Society B: Biological Sciences 287:20202322.

An atavistic claw in a duckling?

June 2, 2020 • 12:30 pm

The other day I took a picture of this juvenile mallard—one of Honey’s babies—and a friend noticed it had what appeared to be an atavistic claw on its wing. At least I think it’s on its wing; it could be on a foot tucked behind the bird. But I doubt it.

Here I’ve circled it:

And enlarged it:

The question is whether this is an atavistic claw: the remnant of the claw that was on the reptilian forelimb, and was also prominent in early birds (ignore the labeling of Archaeopteryx as “the earliest known bird”).

Birds also have “spurs”, which are outgrowths of bone that aren’t developmentally homologous to a true claw. But the duckling above seems to have a true claw; it doesn’t look like a bone spur, but is recurved and apparently made of keratin.

Real bird claws, as in the hoatzin, grow from a digit in the bird wing; in this case it would be the “thumb”. Here’s a “normal” bird.


But birds like hoatzins have true claws, especially in the chicks, which use them to climb back into trees when they fall in the water. I’ve put an Attenborough video of this behavior below the picture, and what it shows is that the claim that a “vestigial” character has to be nonfunctional to be considered vestigial is incorrect. Vestigial traits are simply remnants of traits that evolved earlier but have been coopted for a different function (“exaptations”, Steve Gould might call them). The hoatzin’s claw, very useful for the bird, is certainly a vestigial trait, and is just as much evidence for evolution (of birds from reptiles) as if it were completely nonfunctional.

Some species of waterfowl are known to have these claws (see here and here), but I can’t find something explicitly on mallards.

So we have a mystery here, and I’ve asked a few experts to weigh in. This is either a true atavistic claw in the wing, or one of the claws (nails) in the duck’s rear foot, which could be tucked behind it. You can weigh in, or wait for an answer. Stay tuned.

h/t: Nicole, Greg


My talk in Tallahassee in late March

February 25, 2020 • 12:00 pm

In almost exactly one month, I’m speaking to the Tallahassee Scientific Society in Tallahassee, Florida. My talk is on Thursday, March 26, and I think the time and venue are the same as those for the previous speaker: 7 p.m. at Tallahassee Community College’s Center for Innovation on Kleman Plaza. The topic is “Why Evolution is Still True”, and I’ll give a brief rundown of the evidence for evolution (updated in light of new discoveries), followed by discussion of why Americans remain so resistant to this scientific truth.

I’ll give one more announcement in mid-March or so, and all are welcome to come. I believe they’ll also have my two trade books on sale, which I’ll be glad to autograph. And, if you tell me the genus and species of any felid besides the house cat, I’ll draw a cat in it.

Here’s a photo I sent them to use for advertising the talk; the picture is from Wikipedia so it’s in the public domain. Toes, teeth, and size!

Photo credit: H. Zell (from Wikimedia Commons; CC license CC BY-SA 3.0).


More evidence for evolution: Horse embryos start forming five toes, and four primordia disappear

February 10, 2020 • 9:00 am

When I started this website in 2009, my intention was just to publicize my new book, Why Evolution is True. On the advice of my publishers, I created a site with the idea of occasionally posting new evidence for evolution to complement what was in the book. I expected to post about once a month or so.  Well, what a monster this has become!

But today I’m writing about some new work that fits perfectly with the original aim of this site. It’s a paper in the Proceedings of the Royal Society by Kathryn Kavanagh et al. that gives developmental evidence for the five-toed ancestry of modern one-toed horses. You can read the paper by clicking on the screenshot below or reading the pdf here; a full reference is at the bottom. If you want a shorter but less informative piece, the New York Times has a report.

Lots of organisms show developmental evidence for their evolution from very different ancestors; I describe some of this in chapter 3 of Why Evolution is True.  Embryonic dolphins, for example, develop hindlimb buds, which in their four-legged ancestors went on to become legs, but in the dolphin the buds regress before birth, leaving newborns with no hind limbs. We humans, like all terrestrial vertebrates, begin development by forming structures that, in our fishy ancestors, went on to become gill slits. In reptiles, amphibians, and mammals, though, those structures are transformed into other features, like our esophagus.

In my evolution class I talk about the lanugo: the thick coat of hair that human fetuses develop at about six months after conception. It’s shed before birth—but not in chimps, our closest relatives, who are born hairy (remember: we are the “naked ape”). The transitory formation of that coat of hair in our species, which is of no use to the embryo, can be explained only by our descent from a primate with hair.

So if these transitory features disappear, why do we see them at all? We’re not sure, but their appearance may be necessary to provide developmental “cues” for features that do remain. Remember, development is a very complex process that requires a nexus of coordinated features appearing at the right times. In the dolphin, for instance, the hind limb buds may provide cues for the development of other skeletal structures and then disappear because they are no longer needed; natural selection would remove them because they’re cumbersome, a waste of resources, and unnecessary in a marine mammal.

Horses are another example. We know from the fossil record that modern horses evolved from five-toed ancestors, but what we have left, the lower leg and hoof, are the remnants of only the middle toe. The other four toes disappeared over time (we can see this in the fossil record), though the two toes flanking the middle one remain as vestigial “splint bones” on the horse’s leg. The outer two toes are gone completely. Here’s a drawing of one of the vestigial toes in a modern horse: a splint bone (lateral view; there’s one on the other side, too):

Source: Atlanta Equine Clinic

But you can see all five toes if you look closely during development, as reported in this new paper. If you get horse embryos at the right early stage of development, you can see the primordia for all five toes forming, with the outer ones fusing and shrinking to leave only the middle toe, which becomes the lower leg and hoof. Previously, nobody had been able to see this evidence of ancestry in embryos, but Kavanagh et al. managed to get the right material.

The authors procured (don’t ask me how) horse embryos from artificially inseminated mares, and analyzed four of them by making tissue sections of embryos between days 29 and 35 after copulation. (There is a very narrow window of time to see the primordia for all five digits, as shrinkage and disappearance of the four superfluous ones is fast.)

First, here’s a diagram of what the primordia look like. On the right side you see the evolutionary progression of horse ancestors (also shown below), starting with five, then four, and then three, with the two side toes gradually being reduced to the splint bones. We’re not sure why this happened, but a likely explanation is that at the time these horses were evolving—and they evolved in what is now North America—the climate was drying up and the forests of the West were giving way to grasslands. While toes are good for running fast around trees and vegetation, if you’re escaping predators in a featureless grassland, you want a hoof to run fast and straight.

On the left side, in blue, are the toe primordia in four embryos; those blue bits are regions where cartilage would normally condense and then bone would form. (“FL” is “foreleg” and “HL” is “hindleg”.) You can see that two of the embryos have five primordia early on, with the central one the largest, as it’s the toe that will become the lower leg and hoof. The other two embryos have three primordia, with the central one the largest.

Figure 1. (a) Illustration of arrangement and relative sizes of pre-cartilaginous condensations in developing Equus FL and HL digits based on reconstructions of histological sections of 30–35 dpc embryos from this study. (b) Fossil transition series of adult horse FL digits (isometrically scaled) showing the sequence of reduction of anterior and posterior digits and increasing dominance of central digit III. (i) Phenacodus (AMNH 4369), (ii) Hyracotherium (AMNH 4832), (iii) Mesohippus (AMNH 39480 and AMNH 1477), (iv) Hypohippus (AMNH 9407), (v) Hipparion (AMNH 109625), (vi) Dinohippus (AMNH 17224). Illustration from Solounias et al. [6]. (Online version in colour.)

Here’s a photo of the two embryos showing five toe primordia at different points along the legs. You can see five shadowy primordia, with the largest in the middle in (b) of embryo 1 and (g) of embryo 2. (“Proximal” is toward the body and “distal” is toward the future hoof.) In later embryos studied by other people, the big primordium in the middle goes on to develop into the lower leg and hoof, and the others disappear.


Now of course we don’t need this kind of evidence to show evolution, or even to show the evolution of toe loss in horses, as we have an excellent fossil record of horses and a good idea of their “family tree”. Here’s a figure from the Encyclopedia Britannica:

But the developmental evidence is a nice confirmation of what the fossils tell us, and adds the information that the evolutionary sequence of toe loss is mirrored in the developmental sequence of modern one-toed horses. This is a version of the “biogenetic law” stating that “ontogeny recapitulates phylogeny” (i.e., development mimics evolutionary history). That law has many exceptions, for sometimes the ancestral stages are completely lost in embryos, but it does hold for horse toes. First five, then three, then one—in both development and in the fossil record.

By the way, sometimes the side toes don’t disappear, but, probably through a screwup in development, form rudimentary adult toes, producing polydactylous horses like this one:

Similarly, sometimes the dolphin’s hind limbs don’t disappear, and we get a dolphin with little legs sticking out of its rear, like this one that I show in my “evidence for evolution” talk:

UPDATE: I forgot to include the authors’ point that many vertebrates have lost toes from the ancestral five, and that these species are ripe for embryological investigations of the type shown here. They give a table of some of these animals. Would embryos of the camel or the three-toed jerboa, for instance, show five toe primordia that are then lost? We don’t know, and the absence of five primordia wouldn’t disprove evolution, for the retention of toe primordia is a lucky (for us) feature of development, and isn’t expected in every case.


Kavanagh, K. D., C. S. Bailey, and K. E. Sears. 2020. Evidence of five digits in embryonic horses and developmental stabilization of tetrapod digit number. Proceedings of the Royal Society B: Biological Sciences 287:20192756.


Nathan Lents on the imperfection of the human body (it’s evolution, of course)

January 10, 2020 • 12:45 pm

UPDATE:  I found out that the well-known evolutionary geneticist John C. Avise published a related book in 2010, but one that concentrates on a different line of evidence for evolution. John’s book (screenshot of cover below with link to Amazon) lays out the many suboptimal features of the human genome. He thus concentrates on molecular evidence, noting the many features in that bailiwick whose imperfection gives evidence for evolution and against intelligent design.  Lents’s and Avise’s books thus make a good pair, since the former seems to deal mostly with anatomy and physiology and the latter with molecular data. I’ll be reading both of them.


Biologist Nathan Lents, whose abbreviated c.v. is given below, has been featured on this site before, both as a critic of creationism (good), but also as a defender of the Adam-and-Eve apologetics pushed by his religious friend Josh Swamidass (bad). But chalk up another two marks on Lents’s “good” side.  First, he’s written a book (click on screenshot below) that lays out all the suboptimal features of the human body—features whose imperfection gives evidence for evolution. I’m getting the book for teaching purposes, and here’s the Amazon summary:

Dating back to Darwin himself, the “argument from poor design” holds that examples of suboptimal structure/function demonstrate that nature does not have a designer. Perhaps surprisingly, human beings have more than our share of quirks and glitches. Besides speaking to our shared ancestry, these evolutionary “seams” reveal interesting things about our past. This offers a unique accounting of our evolutionary legacy and sheds new light on how to live in better harmony with our bodies, in all their flawed glory.

Nathan Lents is Professor of Biology at John Jay College and author of two recent books: Not So Different and Human Errors. With degrees in molecular biology and human physiology, and a postdoctoral fellowship in computational genomics, Lents tackles the evolution of human biology from a broad and interdisciplinary perspective. In addition to his research and teaching, he can be found defending sound evolutionary science in the pages of Science, Skeptic Magazine, the Wall Street Journal, The Guardian, and others.

And here’s a half-hour Center for Inquiry talk, clearly based on his book, in which Lents discusses how the flaws in the human body instantiate evolution. It’s not just that there are flaws—which support the notion that natural selection doesn’t produce absolute perfection, but simply the best result available given the existing genetic variation—but, more important: those flaws are understandable as the result of our evolution from ancestors who were different from us.

Some of Lents’s examples (like our broken gene in the Vitamin C synthesis pathway), are discussed in WEIT, but others, like the bizarre configuration of our nasal sinuses, aren’t. I haven’t seen the book, but it looks like a good compendium of evidence for evolution using something that everyone’s familiar with: the glitches and bugs in the human body.

It’s a good talk, and Lents is an energetic and lucid lecturer. I recommend that you listen to this, for you’ll learn stuff that will stay with you, and also serve to help you argue with creationists.

h/t: Michael

Vestigial limb muscles in human embryos show common ancestry—for the gazillionth time

October 6, 2019 • 9:00 am

There are three kinds of vestiges that constitute evidence for evolution, or rather its sub-claim that modern species share common ancestors. I discuss all three in Why Evolution is True:

1.) Vestigial traits that persist in modern species but either have no adaptive function or a function different from the one they served in that species’ ancestors. The vestigial ear muscles of humans are one, the flippers of penguins (functional, but not for flying in the air) are another, and the coccyx in humans (sometimes with attached “tail muscles” that can’t move it) is a third.

2.) Vestigial genes that are functional in our relatives (and presumably were in our ancestors) but have been inactivated in some modern species. There is no explanation for these “dead genes” save that they were useful in ancestors but aren’t useful any longer. Examples are “dead” genes that code for egg-yolk proteins in humans (but don’t produce them); a dead gene for vitamin C synthesis in humans (we don’t make the vitamin because that gene is inactivated, but rather get it from our diet); and the many dead “olfactory receptor” genes in cetaceans (whales, dolphins, etc.)—genes that were active in their terrestrial ancestors but became inactivated because “smelling” underwater uses different genes and traits.

3.) Features in development that are transitory, and whose appearance makes sense only under the supposition that those features were present in common ancestors and persist in some descendants but not others. The lanugo (a transitory coat of hair in human embryos) is one.

Today’s paper, which just appeared in the journal Development, shows several other “transitory” evolution-attesting features. Diogo et al. show that human embryos develop muscles that disappear as development proceeds, but those muscles don’t disappear in some of our relatives, including closely related ones like other primates as well as distant relatives like reptiles.

Moreover, these muscles, which disappear in most human embryos, sometimes don’t disappear, persisting in adults as rare and nonfunctional variants. Or they appear in malformed individuals; both phenomena are often seen with vestigial traits. For example, some people are born without wisdom teeth, considered a vestigial holdover from our ancestors; and the human vestigial ear muscles, which move the ears in our relatives like cats and dogs, vary in functionality: some people, like me, can move those muscles and wiggle their ears, while others can’t.

Click on the screenshot below to access the paper, and the pdf is here (reference at the bottom of this post).

The authors visualized the muscles in the embryonic arm and leg by doing immunostaining—using antibodies that would affix to proteins in the muscles and also carried ancillary molecules that would make those muscles more easily visualized under the microscope in a three-dimensional way. The authors used 70 antibodies, but the main ones bound to muscle-specific proteins like myosin and myogenin.

They stained the mounted limb sections of 13 embryos (presumably from abortions) ranging from nine to thirteen weeks of gestation (quantified as “gestational weeks”, or GWs), and with the standard measurement “crown-rump length” (CR) ranging from 2.5 cm to 8.0 cm (about 1 to 3 inches). These were thus very small embryos, but the sophistication of the technique and the efficacy of the stain, combined with our knowledge of embryonic development and tetrapod muscle anatomy, enabled the authors to produce pictures like these: the muscles in the hands of 10- and 11-GW fetuses:


What they found is that human embryos show a number of muscles present in the adults of some other tetrapods (including our closest relatives, the chimps), but that disappear during human development, with a few of these “atavistic muscles” fusing with other muscles in human fetuses although remaining distinct in our tetrapod relatives.

Here’s how the authors describe the main results, listing some of the atavistic muscles in the embryos (I’ve put them in bold):

As summarized in Tables 2-5 and also noted above, various atavistic muscles that were present in the normal phenotype of our ancestors are present as the normal phenotype during early human ontogenetic stages and then disappear or become reduced and completely fused with other muscles, thus not being present/distinguishable in human adults. These include the upper limb muscles epitrochleoanconeus (Fig. 3), dorsoepitrochlearis, contrahentes 3-5 (Fig. 4) and dorsometacarpales 1-4 (Figs 3-5), and the lower limb muscles contrahentes 3-5, dorsometatarsales 1-4 (Fig. 6) and opponens digiti minimi (Fig. 6). These muscles are present in some other tetrapods, as shown in Tables 6 and 7, which summarize the comparisons with other limbed vertebrates. Of all these muscles, only the dorsometacarpales often remain in adults, fused with other muscles: all the others are normally completely absent in human adults. Fascinatingly, all these atavistic muscles are found both as rare variations of the normal adult population and as anomalies in individuals with congenital malformations such as those associated with trisomies 13, 18 and 21, reinforcing the idea that such variations and anomalies can be related to delayed or arrested development.

Here are two of the fetal atavistic muscles. First, the dorsometacarpales in the hand, which are present in modern adult amphibians and reptiles but absent in adult mammals. The transitory presence of these muscles in human embryos is an evolutionary remnant of the time we diverged from our common ancestor with the reptiles: about 300 million years ago. Clearly, the genetic information for making this muscle is still in the human genome, but since the muscle is not needed in adult humans (when it appears, as I note below, it seems to have no function), its development was suppressed:


Here’s a cool one, the jawbreaking “epitrochleoanconeus” muscle, which is present in chimpanzees but not in adult humans. It appears transitorily in our fetuses. Here’s a 2.5 cm (9 GW) embryo’s hand and forearm; the muscle is labeled “epi” in the diagram and I’ve circled it:

This muscle must have become nonfunctional, and reduced in development, over the last six million years or so, when the common ancestor of humans and chimps gave rise to our separate lineages.

An interesting sidelight of this study is that some of these vestigial muscles occur as rare variants in adult humans, either via developmental “accidents” or as part of congenital malformations. Presumably these screwups block the developmental processes that normally lead to the suppression and disappearance of the muscles in embryos. Variable expression of vestigial traits is common in organisms where the traits haven’t evolved into something else that’s useful. (For more on human vestigial traits, see the Wikipedia article on “human vestigiality”.) The authors note that when the muscles do appear in adults, they are “functionally neutral, not providing any type of major functional advantage or disadvantage.”

The presence of these vestigial muscles is pretty irrefutable evidence of evolution and common ancestry, for there’s no reason why either God or an Intelligent Designer (a pseudonym for “God” among ID advocates) would put a transitory muscle in a human fetus that’s of no use whatsoever, but just happens to resemble the fetal muscles that go on to develop into adult muscles in our relatives.  I wonder how creationists, including IDers, will explain this as the work of a designer. Will they say the muscles are really functional in a fetus? If so, why do they disappear? And doesn’t the fact that they go on to develop into functional muscles in our relatives like chimps and reptiles say something about common ancestry?

Two more points:

1.) The order of appearance of these muscles in development doesn’t completely comport with their order of evolution. This shows that the “recapitulation theory”—that the order of development mimics the order of evolution—isn’t completely obeyed. But we’ve known that for a long time. The time of appearance of a trait in development can be changed by other factors, like its usefulness in “priming” the development of other features. But this doesn’t overturn the very strong conclusion that the presence of transitory muscles in the human fetus that remain in adults of our relatives is evidence for evolution.

2.) Finally, muscles in the arms and legs that appear “homologous” (i.e., have the same evolutionary origin) may have had independent evolutionary origins, and may involve different genes, so they’re not really “homologous” in the way evolutionists use that term. As the authors note,

These differences support the emerging idea that the topological similarities between the hand and foot of tetrapods, such as humans, are mainly secondary (see recent reviews by Diogo et al., 2013, 2018; Diogo and Molnar, 2014; Sears et al., 2015; Miyashita and Diogo, 2016). This idea is further supported by the fact that the order of developmental appearance of the hand muscles is markedly different from that of the corresponding foot muscles (Tables 6, 7). As an illustrative example, whereas the lumbricales are the first muscles to differentiate in the hand, together with the contrahentes (Table 6), in the foot the lumbricales differentiate only after most other foot muscles are already differentiated (Table 7). Thus, these developmental data and evidence from comparative anatomy and from the evolutionary history of human limb muscles (see Tables 6, 7) indicate that several of the muscles that seem to be topologically similar in the human upper and lower limbs actually appeared at different evolutionary times; appear in a markedly different ontogenetic order; derive from different primordia; and/or are formed by the fusion of different developmental units in each limb.

Now the authors didn’t do this study to demonstrate evolution; like most rational people, they accepted it long ago. Rather, their stated aim was to “build an atlas of human development comprising 3D images. . . that can be used by developmental biologists and comparative anatomists, as well as by professors, students, physicians/pathologists and the broader public.” But one of the bonuses, especially for the broader public, is the very clear demonstration of the common-ancestry tenet of modern evolutionary theory.

h/t: Liz


Diogo, R., N. Siomava, and Y. Gitton. 2019. Development of human limb muscles based on whole-mount immunostaining and the links between ontogeny and evolution. Development 146: