Forewarned is forearmed: crickets warn their eggs of nearby spiders

March 1, 2010 • 7:29 am

A new paper in The American Naturalist, by Jonathan Storm and Steven Lima, shows that pregnant female crickets exposed to predatory wolf spiders can somehow “warn” the eggs they carry about the presence of those spiders, so that the offspring of those spider-exposed crickets show antipredator behavior.  This sort of looks like a case of “Lamarckian” inheritance—that is, the inheritance of an acquired trait—but it’s almost certainly not.

The authors used field crickets (Gryllus pennsylvanicus) bred in the laboratory, testing them with the wolf spider Hogna helluo.  To prevent the spiders from actually eating the crickets, the spiders were fed to satiation before the trials and their fangs were covered with wax.  Pregnant female crickets were placed in a terrarium with the spiders; the terrarium had also been “conditioned” by allowing the spider to inhabit it for two days before the trials, allowing the beast to leave scent and silk deposits in the arena.  The crickets remained exposed for ten days, while a control group was unexposed.  The offspring of both groups were then tested for “predator wariness”: the proportion of time that the crickets spent mobile or immobile in the presence of a spider.  Storm and Lima also measured the offspring’s chances of actually being eaten by a hungry spider.

Surprisingly, they found a difference—a difference in the adaptive direction.  The crickets whose moms had been exposed to spiders spent significantly more time being immobile than did control crickets.  This difference translated into survival: “exposed” crickets were eaten at a significantly lower rate than “unexposed” ones, though in this case the differences were small. (The difference apparently resulted from “exposed” crickets spending more time in refuges in the terrarium, making them less visible to the spider.)

Exposing the laid eggs and nymphs themselves to the spider and spider cues had no effect, suggesting that the mother actually does something to the eggs that changes the behavior of the crickets who hatch from them.

Finally, the authors wanted to see if this effect might apply not just in the laboratory, but in the field.  They collected pregnant female crickets from three sites in Indiana that had wolf spiders, and three nearby sites that had no wolf spiders.  Again, offspring of mothers from the “spider sites” showed significantly higher immobility in the presence of spiders (in the lab) than did offspring from “nonspider sites.”   This latter result may, however, simply reflect local natural selection: that is, the mothers in this case aren’t really “warning” their eggs about spiders, but the localities may simply show differential adaptation, so that “spider site” crickets have been selected to be more wary of predators.  There might be no warning needed: all crickets from spider sites could simply have evolved wariness. (The authors do note this possibility.)

What does this mean? Well, it’s one of a few studies in which there are facultative, adaptive “maternal effects” allowing an adaptation to be activated before it’s “needed.”  A similar result has been found in Daphnia cucullata: mother Daphnia exposed to predatory midges produce offspring having a “helmet” morphology that makes them more resistant to being eaten than are offspring of nonexposed Daphnia.

The study raises four questions:

Exactly how do mother crickets warn their eggs? Answer: we don’t yet know.  The authors suggest that exposed crickets might affect their eggs by releasing hormones that induce the antipredator behavior.  As their experiments show, though, it’s clearly something that the mother does, since exposing eggs or young nymphs to the spiders themselves shows no effect.

How could this result from natural selection? That is, how can an adaptation start to arise before it’s “needed”?  Well, this isn’t really a problem for natural selection, at least conceptually.  If there is a reliable environmental cue that persists between parent and offspring generations, any gene that permits a mother to induce her offspring to behave adaptively will be favored. There are lots of traits for which the expression begins before the relevant selection pressure appears.  Birds begin to fly south before the winter comes.  Plants use daylight cues to prepare for winter.  It’s easy to see how natural selection could favor using reliable environmental cues to trigger an adaptive behavior so it’s in place when it’s needed. What is cute about Storm and Lima’s study is that the parents actually provide the cues for their young.

Could this be Lamarckian? That is, perhaps the adaptive behavior is simply an acquired trait passed from parents to offspring, a trait that somehow got embedded into the genes—as if parents who worked out in a gym would produce more muscular babies.  Some evolutionists, like Eva Jablonka and Marion Lamb, have suggested that a form of Lamarckian inheritance could be important in evolution.  I don’t think they’re right, simply because we have little evidence that acquired traits can be inherited, and for many adaptations it’s even hard to envision how their initial appearance could be induced by the environment.

But here’s a way to test this in the spider case: use a form of “family selection,” something common in animal breeding. Simply take a bunch of cricket families, and expose one member of each family to a spider.  For the individuals who are more wary, breed the next generation from their unexposed brothers and sisters.  Then again construct a bunch of family groups from those individuals, test one individual from each new family, breed from the nonexposed relatives of the warier crickets, and so on.

After a few generations, see if this form of family selection has produced increased wariness of spiders. If it has, it shows that you can accumulate hereditary factors promoting resistance to predation without ever having been exposed to spiders. That is, while spiders are the selective factor promoting predator resistance in crickets, you can build up that resistance in a lineage that has never seen a spider.  In other words, the trait is not Lamarckian.

This is the kind of experiment that was done, half a century ago, to show that mutations conferring antibiotic resistance in bacteria were not actually induced by the antibiotic, but were there to begin with in unexposed populations. (You may have heard of “replica plating,” devised by Joshua and Esther Lederberg in the 1950s.)

Is it epigenetic? The idea of “epigenetic inheritance”—inheritance based on things other than change in the base sequences of genes—is also a popular criticism of the neo-Darwinian “paradigm”.  (Jablonka and Lamb have been especially vocal proponents of this view.)  Things like DNA methylation, for instance, can be transmitted from parents to offspring (that’s how “imprinting” of genes occurs), and there’s some evidence that such effects can persist for more than one generation.

Well, some of this epigenetic inheritance almost certainly reflects evolution based on real DNA changes. For example, if it’s adaptive to mark paternal versus maternal chromosomes differentially, as David Haig at Harvard suggests, then that differential marking itself is probably coded by the DNA.

Regardless, though, we can test whether this “adaptive maternal effect” in crickets is purely epigenetic or DNA-based.  Simply take those two populations of crickets from Indiana that show differential response to spiders, and hold them in the laboratory for about three generations without exposure to spiders.  Since non-DNA-based epigenetic differences are known to disappear after one or two generations, the populations should quickly lose not only their difference in adaptive “imprinting,” but the phenomenon of adaptive maternally-based wariness itself.   If, on the other hand, the population difference (or the trait itself) is based on changes in the DNA, the behavior will decay much more slowly (if at all) when selection maintaining the trait is relaxed.
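That decay-rate prediction is easy to visualize with a toy model (the parameters are hypothetical: a per-generation probability that the “wary” state is reset in the germ line). An epigenetic mark with a high reset probability vanishes within a few relaxed generations, while a DNA variant, with reset probability near zero, simply persists:

```python
import random

random.seed(0)

def carriers_after(pop, generations, reset_prob):
    # Fraction of a population still in the "wary" state after relaxed
    # selection, when the state is lost with reset_prob per generation.
    carriers = pop
    for _ in range(generations):
        carriers = sum(1 for _ in range(carriers)
                       if random.random() > reset_prob)
    return carriers / pop

epigenetic = carriers_after(1000, 3, reset_prob=0.7)  # marks erased often
dna_based = carriers_after(1000, 3, reset_prob=0.0)   # sequence change
print(epigenetic, dna_based)
```

Three spider-free generations are thus enough to discriminate the two hypotheses: a rapid collapse of the maternal effect points to a purely epigenetic mechanism, while persistence points to DNA.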

Storm and Lima’s study has been reported widely in the press and, to their credit,  journalists have avoided touting their results as somehow disproving Darwinism or DNA-based adaptation.  Lamarckism and epigenetic inheritance remain formal possibilities here, but in the absence of any evidence that they’ve been important in the evolution of adaptations, it’s hardly worth looking for them.

Fig. 1.  Wolf spider, Hogna helluo


Storm, J. J., and S. L. Lima. 2010. Mothers forewarn offspring about predators: a transgenerational maternal effect on behavior. The American Naturalist. DOI: 10.1086/650443

22 thoughts on “Forewarned is forearmed: crickets warn their eggs of nearby spiders”

    1. Starts off relatively OK, but the descent of the cliff over which he falls is spectacular.

      And parenthetically NB: it’s a year old.

  1. stressed parent = stressed offspring

    stressed offspring = prone to flight response

    If the offspring were SPECIFICALLY wary of wolf spiders, but not other predators, they’d have something interesting to explain.

    How is this different from the zillions of mouse studies showing maternal environment affects behavior of offspring?

  2. I don’t understand why epigenetics is considered not DNA-based, when it clearly is. If it were not, then the observed methylation would be random and non-adaptive, which is not the case.

    Epigenetics is just genetics, with some gene products traversing the germ line along with the genes. Nothing revolutionary, or indeed deserving of a new term.

    1. Simply because DNA is a sequence of nucleotides. Epigenetics is about methyl groups sitting on the DNA. When one sequences a genome (DNA), there is no info about methylation in that genome. Your comment is akin to being surprised that the carpet is not considered part of the hardwood floor on which it rests.

  3. I was going to suggest further testing about three paragraphs before I got to the “family selection” suggestions.

    I was going to suggest it differently by swapping offspring from spider sites and non-spider sites for the second, third or further generations for testing.

  4. there is only one theory of evolution!

    Or several distributions of them; here is where I part company with the inclusive formalism preferred by biologists. Testing is excluding failed theories, and it is much easier to see what is going on by adapting the formalism to that.

    Every time you change a parameter over a distributive range, because a test conclusively fails some theories in the parameter space, you have falsified a huge set of theories and retained others.

    For example, reformulating the article it would, for me, come out something like:

    – “For some unfathomable reason the authors note the predictions of a rival theory.” [“spider site” crickets have been selected to be more wary of predators].

    – “Lamarckian theories makes extra-ordinary predictions and so need extra-ordinary evidence, but satisfy themselves with predicting a subset of evolutionary predictions.”

    – “Of course each and every prediction [“adaptive maternal effect”] that can be tested strengthens evolution, but a) it is already the best tested theory there is, and b) we have more exciting stuff to test!”

    The article and your comment are good though. It’s just that I have to translate them to my preferred theory of science.

    1. “Lamarckian theories makes extra-ordinary predictions and so need extra-ordinary evidence, but satisfy themselves with predicting a subset of evolutionary predictions.”

      Duh, that came out extra-ordinary stupid.

      I mean that these theories are capable of extra-ordinary predictions as they build on extra-ordinary (non-observed) mechanisms. Yet their proponents do not try any more.

  5. This is a bit of a digression, but Jablonka & co. are misusing the word “epigenetics”, and I would advise Jerry (and others) not to follow their unfortunate lead.

    Epigenesis refers to the theory (now well established) that organismal development proceeds from a germ (a zygote for sexual species) through a series of stages to the adult stage. In animals, development generally goes from a single cell, to a group of cells, to the separation of layers of cells, to the formation and differentiation of organs (apologies to embryologists for this painfully brief summary!). The alternative to this view of development, now generally called preformation, is that a miniature adult is contained within the gametes (often localized to the sperm), and that development consists of an enlargement and unfolding of the tiny creature contained in the gamete. This latter view is, of course, wrong, but it was once a viable theory.

    There is thus no blueprint for the organism (i.e. a miniaturized representation) in development. Rather, as Richard Dawkins felicitously put it, development is a recipe: if you put the right ingredients in the right environments at the right times, you get a cake (or an organism). There is no gene for your chin, just as there is no line in the recipe for the raisin 3 cm in from the top left edge of the cake. Your chin is the result of a large number of genetic factors and environmental conditions, just as the position of the raisin is the result of ingredients and environmental conditions.

    I have often summarized this view of development (which is very important for evolution, as it is the basis of developmental studies that are now elucidating the ‘laws of correlation and growth’ that eluded Darwin) by saying that “Development is epigenetic.” And it is epigenetic, but not in the bizarre and ahistorical sense promoted by Jablonka et al.

    1. Good point. But David Haig, in a recent critique of Jablonka and Lamb’s views, pointed out that the word “epigenesis” has actually been used in two distinct ways: the way you use it and also as “non-DNA-based inheritance.”

      1. I haven’t seen Haig’s critique, but it would be correct to say there have been two usages. My usage dates back at least 200 years. Jablonka’s usage is very recent, and means something completely different from the original meaning, yet has the capacity to confuse. To use a word to mean something it doesn’t mean, and confuse people in the process, is pretty much the definition of an incorrect usage. (Or as Steve Martin put it, “talking wrong”.)

    2. The “epigenetics” usage ship sailed long ago; Jablonka & Lamb aren’t responsible for this. It is often equated with gene imprinting (not to be confused with behavioral imprinting) via DNA methylation—it was/is a handy buzzword that sounded cooler than “methylation,” which sounds like chemistry.

      However, I would say the real usage thieves are those who decided decades ago that the word “gene” should refer exclusively to protein-coding segments of DNA, all other DNA is “junk,” and nothing else that is inherited matters.

  6. What I’d like to know is whether or not there is a significant difference rather than simply a random bias due to a small number of trials (or even a non-random bias in the environment aside from the spider exposure).

    1. The difference is highly significant for “movement” in the lab, less so (p = 0.025) for mortality in the lab, and about the same (p = 0.025) for movement of wild-caught offspring. I’ve put a link to the article in my post, so you can see for yourself.

  7. I can’t tell from the description whether the nymphs of exposed mothers were less mobile only when exposed to the spiders or were they less mobile period. Seems relevant.

    1. To follow up, if the nymphs are less mobile only when the spiders are present, I am impressed. If the nymphs are less mobile whether spiders are present or not (did the experimenters check?), it could be spurious correlation.

  8. I’m really glad you posted this article – or more accurately, I sincerely hoped you’d post an article on this. I’m *just* a computer programmer, but recently started reading a lot about evolutionary theory. When I saw the article originally (or some reference to it) my first thought was “‘Lamarckian’ inheritance? I’m sure it’s not, but I don’t understand ‘Lamarckian’ inheritance enough to know why it’s not.” Now I know.

    PS: Half-way through “Why Evolution is True”.

  9. You do realize you’re talking about “microevolution” and not “macroevolution” here. Just because it proves microevolution(which most creationists will agree is true), it in no way proves macroevolution. That is, not how each generation can have slightly better characteristics that help them to survive, like finch beaks, but how the species came about in the first place.

    1. Read the links in the post below. And, of course, macroevolution is “proved” by all the evidence I give in my book, including fossils, vestigial organs, DNA sequences, and embryology.

      As I said in the comment policy some time ago, if you’re coming to this site to make creationist claims, you’d better know your evidence.

  10. I damn near comprehended this whole piece. Fascinating stuff! Living machines are certainly more interesting than the ones requiring Allen wrenches and duct tape.

    Okay… I had to Google “DNA Methylation” (and immediately regretted it), but I think I got the gist of it.

  11. “…a form of Lamarckian inheritance could be important in evolution…”

    “Lamarckian” evolution certainly doesn’t present any conceptual or mechanistic challenges for clonally reproducing, single-celled organisms. Which is most organisms, and most evolution.

    I think that Jablonka & Lamb also focus on things like environmental inheritance–where behavioral phenotypes are passed on via non-genetic means like imitative learning or “built” things–nests, etc. These things are important in some kinds of animal evolution, but I think tend to create confusion when you try to integrate them with gene evolution.
