Moar mimicry: moth mimics spider, and other cool stuff

March 20, 2013 • 6:21 am

The estimable Matthew Cobb called my attention to this post on The Featured Creature, which he found via the Facebook page of “Spider” Dave Penney, a freelance scientist (!) with his own publishing company.

Have a look at this moth. If you saw it in the wild, would you have any idea what the wing pattern means?

photo: John Horstman

photos above and below: John Horstman
Moth images at http://www.flickr.com/search/?w=77995220@N00&q=lygodium

You can see other photos by John Horstman at his Flickr photostream.

Well, it almost certainly evolved to make the moth mimic a spider, presumably a predator. As the website says, somewhat breathlessly:

Now this is an example of mimicry at its finest! This newly discovered species (2005) of moth dubbed the Lygodium Spider Moth (Siamusotima aranea) is so named for its preference of feeding on Lygodium species, an invasive Old World climbing fern, and has markings on its wings that make it look just like a spider with orange, spindly legs! This moth mimics a spider so well that I couldn’t even tell what it was at first when I saw the picture from far away!

Why would a moth evolve to mimic a spider? Earlier research on insects (see below) suggests that when prey like this sense an approaching spider, they display their wings (as in the photo below), and that deludes the spider into thinking it’s encountered another spider of its own species. That would be an aggressive conspecific encounter—one from which spiders often flee. A moth with this pattern, then, might escape predation by actually driving away the predator, or at least distracting it for long enough to allow the moth to fly away unharmed.

This evolutionary scenario is speculative, for as far as I know it hasn’t been tested (or even observed) in the moth shown below, but I don’t see any other explanation for the wing pattern.

photo: John Horstman

Note that the patterns on the moth give it eight legs, just like a spider!

National Geographic has another case of a moth mimicking a jumping spider (there’s a video at the link, too).

A paper in Science in 1987 by Mather and Roitberg (no free link, but reference below) shows something similar: patterns on a “true fruit fly” (the tephritid snowberry fly, Rhagoletis zephyria). For a long time these wing patterns were thought to be simply species-recognition marks, until clever zoologists realized that they looked like something else.

This is another rare case of a prey actually mimicking its predator—in this case a jumping spider (the authors used the zebra spider Salticus scenicus for their tests). When the fly senses something approaching, it spreads its wings a bit and wobbles from side to side, much like the gait of a jumping spider.  This makes an approaching jumping spider think that the fly is actually a conspecific spider. And the spider, sensing an aggressive encounter on tap, usually flees. The displaying fly is saved.

Experiments show that spiders fled from displaying flies at rates similar to those at which they fled from other spiders. Further experiments that obliterated the flies’ wing patterns with ink showed that this made the flies more susceptible to predation (predators didn’t flee as often, but pounced), although there were no controls for the effect of ink-painting on the flies’ well-being and vigor.

Here’s what the predator and the tephritid fly look like:

 
[photos: the jumping spider Salticus scenicus and the tephritid fly Rhagoletis zephyria]
 
Finally, a new paper in Proc. Nat. Acad. Sci. USA (free download; reference below) by Sonja Wedmann et al. describes the oldest fossil leaf-mimicking insect known, a phasmid found in 47-million-year-old deposits in Germany. The fossil is a male, remarkably preserved, and has been given the name Eophyllium messelensis. (Note: if you can’t see a fossil picture directly below, click on the icon.)
 

"Photo

 
The fossil strongly resembles some modern leaf phasmids, like Phyllium celebicum (below). Note that the female, on the left, is a much stronger leaf mimic than the male (right). According to the authors, this sexual dimorphism is quite common in phasmids.
 
[photo: Phyllium celebicum; female at left, male at right]
 

Why this ubiquitous dimorphism? It resembles some types of Batesian mimicry in butterflies, in which edible females evolve to resemble both males and females (who look alike) of a toxic or distasteful model species, but the edible males don’t show mimicry, resembling the ancestor.  In both cases, I suspect, the explanation is the same: there would be an advantage to the males evolving mimicry, too, but an even larger disadvantage to changing their pattern if females have a fixed genetic preference for the ancestral type of male.

I’m not sure whether this explanation is correct, and don’t even know if it’s been tested, but it is at least conceptually testable and makes some sense in light of what we know about sexual dimorphism (females have strong preferences for certain types of males).

The mimicry does show, though, that there were visually-hunting predators around then, for why else would these phasmids evolve mimicry?  Wedmann et al. posit that the predators may have included birds, primates, and bats, all of them known from the same fossil deposits.

_____________

 

Mather, M. H., and B. D. Roitberg. 1987. A sheep in wolf’s clothing: Tephritid flies mimic spider predators. Science 236:308-310.

Wedmann, S., S. Bradler, and J. Rust. 2007. The first fossil leaf insect: 47 million years of specialized cryptic morphology and behavior. Proc. Natl. Acad. Sci. USA 104:565-569. doi:10.1073/pnas.0606937104

Death of the world’s most deluded woman: Umm Nidal, mother of “martyrs”

March 20, 2013 • 4:58 am

What kind of mother would send her young son off to die in a suicide bombing of children—knowing he would die—and then rejoice and pass out candy after his death and the deaths of other innocents he took with him? What kind of mother would lobby her other sons to become “martyrs,” too?

A religious mother, of course. And you know what religion we’re talking about.

Umm Nidal (“mother of the struggle”: real name Maryam Farhat), who died three days ago at the age of 64, is a hero to Palestinians. That’s because three of her six sons were “martyrs”. The youngest, Muhammed, was only 17 when he died in a  suicide attack against an Israeli military academy (the mother had encouraged him to engage in this “jihad”), another made explosives targeting civilians and was blown up by one of his drones, and the third was killed by Israeli intelligence.

After Muhammed died (taking with him five Israeli students and wounding 23), she thanked Allah and handed out boxes of chocolates and halvah. She was later elected to the Palestinian parliament.

Nidal’s funeral was attended by thousands of Palestinians and many dignitaries, including the Palestinian prime minister.

On the eve of Obama’s visit, Palestinian President Mahmoud Abbas paid homage to Nidal and gave her a special award, the “Order of Sacrifice.”

Watch this video of Nidal being interviewed in 2005, after Muhammed’s death, and tell me if it’s not the most chilling interview you’ve ever seen: the pure poison of religion in action. The interview was aired by Dream2 TV, and you can read excerpts here.

In the first part of the next video, Nidal is shown praising Muhammed, who stands next to her as she sends him off to die.

And a transcript of her statement from the Hamas website, rejoicing at “the best day of [her] life”:

“How do I feel, as I promise my son Paradise, and as I offer something (my son) for Allah? By Allah, today is the best day of my life. I feel that our Lord is pleased with me, because I am offering something (my son) to His sake. I wish to offer more [sons] for Allah’s forgiveness, and for the flag [of Islam], “There is no god but Allah,” to fly over Palestine. That’s what we want. We want the rule of Islam. I’m not parting from him [as he goes] to his death, but rather I’m parting from him as he goes to a better life, the Afterlife, which our Lord has promised us. By Allah, if I had 100 children like [my son] Muhammad, I would offer them with sincerity and willingly. It’s true that there’s nothing more precious than children, but for the sake of Allah, what is precious becomes cheap.” [Hamas website, Jan. 1, 2006]

No normal mother is glad to send her sons off to war, and any normal mother hopes that they’ll come back alive.  For a mother to send her son off at the age of 17 to kill civilians, to actually hope that he (and his brothers) will die, to rejoice at his death, and to lobby her other sons to follow in those footsteps—well, that takes religion.

It’s a mercy that this monstrous woman died before she had the chance to turn her grandsons into martyrs as well.

Umm Nidal, photograph published in Palestine Today

How to write good science

March 19, 2013 • 1:29 pm

Don’t expect me to give advice here except to say that the best way to write any good science stuff is to read a lot of science writing and pay attention to what you find tedious and what you find absorbing.  Oh, and this: I see the principles of good popular science writing as nearly identical to those of good technical science writing.

I once taught a course on technical scientific writing, but it failed miserably. I concluded that the best way to teach it is to take a student’s papers and keep correcting them over and over until he/she develops a decent prose style.

We all know that Steve Pinker’s next book is on how to write good popular science, and I’m much looking forward to it. In the meantime, there’s a nice piece by evolutionary biologist Lewis Spurgin called “Science and the English language” on the website A Great Tree. Spurgin’s title is taken from George Orwell’s famous essay, “Politics and the English language” (everyone should read this), which, though purporting to be about political writing, is really about writing anything clearly. I always gave Orwell’s essay to my grad students.

Spurgin makes the point that technical scientific writing should be clear and lively, though in reality it’s deadly about 98% of the time. He explains why we should care about this:

Science is about finding the truth and making sense of things. An essential part of this is communicating clearly and honestly. The structure, grammar and choice of words used in science articles makes them vague and inaccurate, which is exactly the opposite of how they are intended, and pretend, to be. And, as Orwell recognised, lazy writing encourages lazy thinking. The imitative and pretentious nature of how we write science papers acts as a barrier to thinking critically about what we’ve done, and how our experiments might be biased.

Science writing is also full of cliché, crap puns and metaphors, and borderline plagiarism. In short, it lacks imagination. It is no wonder, therefore, that nobody enjoys reading science papers. We often enjoy the story contained within scientific studies, but I’d bet that even most scientists don’t enjoy reading journal articles for their writing. Must this be the case? One could argue that imagery has no place in science articles. I think it has, and there are some examples where it has been used well. In my field of evolutionary biology, probably the most famous use of imagery in a science paper was Gould and Lewontin’s article on the “Spandrels of San Marco” [3]. Gould was also excellent at using metaphor to illustrate the vastness of geological time (“Consider the earth’s history as the old measure of the English yard, the distance from the king’s nose to the tip of his outstretched hand. One stroke of a nail file on his middle finger erases human history”). But, sadly, for every Spandrel there are a thousand Achilles’ heels, and so much light has been shed that we’ve all gone blind.

One of the most tragic consequences of the scientific writing style is the effect it has on students. Science students find it extremely difficult to get into the primary literature, and most undergraduates will not be able to properly critique a scientific paper until their final year. Complex methods used in many modern papers, and the jargon required to explain them, form a major part of this barrier, but I have no doubt that the writing is equally to blame. . .

Spurgin then gives a list of major sins of science writing, including pretentious diction, bad metaphors, and so on. I agree. If I see the words “a suite of characters” once again, I’ll hurl, and that also goes for “utilize”, “elucidate,” “facilitate” and “myriad.” And I abhor the way that even decent writers fall into tedious and jargony prose when they start writing science, as if the gravitas of a science paper demands tedium. Well, that’s the conceit of postmodernistic lit-crit, but shouldn’t be true of science. Our job is to be clear, not obscure. Granted, some humdrum stuff is necessary in a research paper, but there’s no reason why results shouldn’t be written clearly, why one shouldn’t write in the first person, or why one can’t use some humor.

Spurgin reprints Orwell’s six dicta for good writing, which I’ve tried to adhere to myself (especially number 3!), and I’ll put them here, too:

(i) Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.

(ii) Never use a long word where a short one will do.

(iii) If it is possible to cut a word out, always cut it out.

(iv) Never use the passive where you can use the active.

(v) Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. [JAC: note that I violate this a few lines above!]

(vi) Break any of these rules sooner than say anything outright barbarous.

Finally, here are some examples of either tedious or humorous writing that I’ve collected over the years (titles are mine). I omit the authors’ names to protect the guilty.

SCIENTIFIC FASCISM (in a paper on mutations in peanuts)

“Discrepancies in individual plant estimates were reviewed in congress and the reasons for the discrepancies were examined and the sources of differences in judgment were eliminated until the observers spoke with as nearly one mind on the subject of these peanut mutants as perhaps it is possible to achieve in human experience.”

****

POMPOSITY (my footnotes)

“Humans are themselves embedded in a Wrightean viscous population, where behavioral proximity, measured as the relative allocation of interaction among individuals, replaces spatial proximity.  Cooperation is the product of a fragmented social environment and, for this very reason, is itself heterogeneously distributed.  We are socially viscous creatures, creating islands of solstice* within Malthusian necessity.

Wynne-Edwards seems to write from his own island; yet the processes he speaks of bind him to others.  He is imprecise yet richly intuitive, at his best when neither crisply** right nor wrong;  a difficult cantankerous man for those interested in rapid progress.  But recall Alexander’s (1987:18) view of progress:  to identify the core of accuracy and correctness in the works of all writers in a field, excise the flawed portions, and then build from the best that is left. Wynne-Edwards deserves this***, as do we all.”

____________________

* Reference completely obscure.  Author most likely means “solace”.
**Has the author been eating crackers?
***Deserves what?

****

OVERLY CUTE METAPHOR

“In terms of the subject of this review, the epistasis between experimentalists and theoreticians has been positive, although small, but its strength should be increased through selection for tighter linkage, so that our understanding could evolve to the maximum peak in the knowledge surface.”

****

OBSCURANTIST ACADEMESE:

“Thus, even among the horseshoe crabs (proverbial epitomes of morphological conservatism), multitudinous nucleotide differences have accumulated among evolutionary lineages (albeit at an uncertain exact pace).”

Jerry’s Translation:  Diverse species of horseshoe crabs look alike, but their DNA is different.

****

UNNECESSARY RESTATEMENT OF THE OBVIOUS

“The world is heterogeneous.  It varies from place to place and moment to moment.  As a consequence of this variation, the optimal phenotype of an individual changes.  In an ideal world, an individual would alter its phenotype to always match the optimum.  In the real world, however, organisms do not always do this.”

For your delectation: more hatred and lunacy from the Palestinian Authority

March 19, 2013 • 8:30 am

Yes, I know a lot of readers support the Palestinians as victims of Israeli oppression, and some even want the state of Israel eliminated. Others tend to ignore the Palestinian suicide bombings against civilians, the Palestinian placement of weapons in civilian areas, and other violations of the Geneva Conventions. As I’ve pointed out before, there’s a disgraceful double standard in this situation: Israelis are simply expected to act better than Palestinians, with Palestinian atrocities simply ignored. If Israel sent suicide bombers to Palestinian weddings, you know the world would raise a far stronger alarum than if Palestinians did the same thing—which they do.

I’m no diehard fan of Israel: for example, I think they need to immediately get all the settlers off the West Bank before there can be peace, and I favor the establishment of a Palestinian state. But I do see a clear double standard, and a climate in which it’s simply politically correct to ignore Palestinian transgressions and concentrate on the Israelis. Feeding into this, I think, is some anti-Semitism. That anti-Semitism is clear among Arab states. Hate-filled messages are daily fare on official Arab media—things far more hateful than the ludicrous and bigoted film “Innocence of Muslims” that rightfully angered not only Muslims, but much of the West. Yet films that are similar, and more bigoted, appear daily on official Arab television, and many of you ignore them.

You do so at your peril, for the hatred of Jews and Israel in Arab media reflects the greater problem of Islam: a visceral disgust at all things Western, and a desire to conquer the lands of heretics, apostates, and infidels. That’s why the two-state solution, which I favor, along with considerable Israeli concessions, will not quell Islamic fanaticism in the Middle East.

Here’s exhibit A (apologies to Chris Mooney!): part of an official bulletin issued yesterday by the Palestinian Authority, and printed in part by the Palestinian Media Watch (if you’re going to call the PMW biased, fine, but if you’re going to accuse them of making this bulletin up, be careful).  And remember that this week President Obama is going to the Middle East in an attempt to broker peace by talking to Israeli and Palestinian officials.

Here’s part of the bulletin (my bold):

Op-ed by Hassan Ouda Abu Zaher:

“‘History is a great lie written by the victors’ – said Napoleon Bonaparte, the source of dubious historical writing and father of Freemasonry in France. If so, is the history planted in us through TV and the standard educational curriculum indeed true? The source of this history is the West – the victor ever since the fall of Andalusia (Muslim Spain)! …Our history is replete with lies, from lies about the corrupt [Caliph] Harun Al-Rashid, which ignore the sources indicating that he dedicated one year to pilgrimage [to Mecca] and one year to Jihad (i.e., he was a good Muslim), to the lie about Al-Qaeda and the Sept. 11 events, which asserted that Muslim terrorists committed it, and that it was not an internal American action by the Freemasons, which was mentioned in the Illuminati game cards ten years before it took place, and in over 15 Zionist and Freemason Hollywood-produced films in the 1990s. The method of repeating [the lies] over and over has authenticated false facts. Had Hitler won, Nazism would be an honor that people would be competing to belong to, and not a disgrace punishable by law. Churchill and Roosevelt were alcoholics, and in their youth were questioned more than once about brawls they started in bars, while Hitler hated alcohol and was not addicted to it. He used to go to sleep early and wake up early, and was very organized. These facts have been turned upside down as well, and Satan has been dressed with angels’ wings…”
[Al-Hayat Al-Jadida, March 18, 2013]
To reprise the contentions:
  • The 9/11 attacks were not committed by Muslims, but were engineered by American Freemasons.
  • Hitler was a good man, and it’s a shame that the Nazis lost.
  • Churchill and Roosevelt were alcoholics, ergo worse than the punctilious, teetotaling (and, I should add, vegetarian) Hitler.

Remember, this is from the official bulletin of the Palestinian Authority. It whips up hatred not only against Jews (the usual stuff) but against America. And it comes two days before Obama’s visit.

If you are going to use this post to beat on Israel, consider what you are doing—and justifying.  This is Islamic lunacy, pure and simple, and we must oppose it—even if it comes from Palestine.

Do you really want to be on the side of a regime that sees Hitler as a hero?

Is religion hardwired?

March 19, 2013 • 4:45 am

The many theories for the origin and persistence of religion fall into two classes: those that think that religion piggybacks on some aspect of human nature, usually evolved (credulity, need to attribute agency to natural phenomena, tendency to accept what parents or elders tell you, and so on), and those claiming that religion is “hardwired,” that is, we have genes that directly produce in us a propensity to apprehend and/or worship God. That claim is often the one adduced by religious people, since it feeds into the notion (e.g., Plantinga’s “sensus divinitatis”) that God instilled in us the need and desire to find Him.

I find the first class of theories more credible, but secular studies of religious belief have been bogged down by the fact that none of these theories are obviously testable by science. I have suggested several times on this site, though, that the “hardwired” theory is in principle testable: all you have to do is bring up children in an environment where they’re completely free of religious knowledge or influence, and see if they spontaneously come to conceive of (and maybe worship) a God. Unfortunately, that’s impossible, because we can’t do experiments with humans. And there’s virtually nowhere that one can raise a child without some exposure to religion.

But a new paper in Trends in Cognitive Sciences by Konika Banerjee and Paul Bloom, psychologists at Yale University, claims that the evidence is largely in—and doesn’t support the “hardwired” hypothesis. (The paper is just a two-page review, and is free at the link; the reference is at bottom. The “Tarzan” in the title refers to the question of whether Tarzan, raised by apes, would come to believe in God.)

Their “evidence,” however, is pretty thin—so thin that I don’t think it shows anything.

Here’s their first assertion:

Consider belief in a divine creator. Young children are prone to generate purpose-based explanations of the origins of natural objects and biological kinds. They believe, for example, that clouds are ‘for raining’ and animals are ‘to go in the zoo’ [9]. However, there is no evidence that children spontaneously come to believe in one or more divine creators. It is one thing, after all, to think about natural entities as intentionally designed artifacts of a sort; it is quite another to generate an enduring belief in invisible agents who have created these artifacts. Indeed, other studies find that young children are not committed creationists; they are equally likely to provide explanations of species origins that involve spontaneous generation [10].

But what I see here is not evidence against hardwiring, but an absence of any evidence. And creationism is not equivalent to belief in a supernatural being that affects the world: one can be religious without being creationist.

Later, however, the kids do become creationists, but that’s imputed to cultural or parental indoctrination:

Older children, by contrast, do exclusively endorse creationist explanations. This shift to a robust creationist preference arises in part because older children are more adept at grasping the existential themes invoked by the question of species origins (e.g., existence and final cause) and also because the notion of a divine creator of nature meshes well with their early-emerging teleological biases [10]. However, these older children do not spontaneously propose novel divine creators. Instead, they adopt the particular creationist account that their culture supplies. This might be a singular God or multiple gods; it might be alien visitors or Mother Earth. If children are not exposed to such cultural beliefs, the explicit notion of an intentional creator might never arise.

Likewise, where’s the evidence that their newfound creationism comes from absorbing it from their culture, rather than appearing spontaneously as a product of their genes at a later age? No reference is cited.

The authors cite two more pieces of “evidence.” The first is this:

Some, such as Barrett [4], take children’s readiness to reason about life after death as evidence that they are ‘born believers’ in an afterlife.

This conclusion is probably too strong, however. There is no evidence that belief in the afterlife arises spontaneously in the absence of cultural support. For instance, research in rural Madagascar, where there is widespread belief in ancestral spirits, finds that the conception of an afterlife emerges in the course of development [12]. Even if children are ‘natural-born dualists’ [5], this initial stance need not directly give rise to the afterlife beliefs that are characteristic of many of the world’s religions.

Again, this isn’t evidence for a lack of “hardwiring” as opposed to cultural inculcation. After all, beliefs or behaviors generated by genes needn’t appear at the outset of development.  Interest in sex, for instance, is surely “hardwired,” but doesn’t appear till near puberty.

The authors’ final claim again appears to rest on an absence of evidence, not positive evidence:

Consider, as a test case, belief in multiple deities. This is the historically foundational religious stance, with monotheism a more recent invention [14]. It would be striking support for the generativity position if children raised in monotheistic societies declared their belief in multiple gods. However, to our knowledge, they never do. They come to believe instead in the same singular omnipotent deity that everyone else believes in.

But this shows either that children are inculcated in one omnipotent deity—OR are genetically predisposed to believe in a single deity. (The thesis, after all, is that children believe in supernatural agents, not whether it’s one or many, and even “genetic” belief in a single deity can be altered by culture, just as a genetic predisposition to have sex can be altered by the availability of pornography.)

I haven’t read the 14 papers cited by Banerjee and Bloom, but they don’t give citations for their major empirical conclusion. I am sympathetic to their idea that religion piggybacks on other evolved tendencies, simply because an individual reproductive advantage of believing in a God isn’t obvious to me, but the evidence so far is thin. Too thin, at least, to support the authors’ conclusion, from the abstract:

Drawing on evidence from developmental psychology, we argue here that the answer is no: children lack spontaneous theistic views and the emergence of religion is crucially dependent on culture.

Well, “crucially dependent on culture” can mean several things, and the “genetic” hypothesis might depend on cultural exposure too—just not cultural exposure to religion. One might simply need exposure to society and the environment.

I’m not sure how to discriminate among the many theories for the origin of religion, since it occurred in the distant past and has been culturally transmitted ever since. But at least one is testable in principle: that belief in a god or gods is hardwired, and will arise spontaneously—and in a similar form—in people who are never exposed to religion.

Yes, that’s testable in principle, but not in practice.  We still remain profoundly ignorant of how religion came to dominate our species.

______

Banerjee, K., and P. Bloom. 2013. Would Tarzan believe in God?: Conditions for the emergence of religious belief. Trends Cognitive Sci. 17:7-8.

h/t: Mauricio

CatCam: a film

March 19, 2013 • 4:15 am

Reader Susan McWilliams called my attention to a funny (and touching) cat documentary that’s a finalist in the 2013 PBS (Public Broadcasting Service) film contest. The 16-minute video is called “CatCam,” described as “a film about a cat that constantly disappears from his home.” The owner fits out the cat with a camera, and the film is the result.

More from the PBS film description:

Mr. Lee is an adopted stray cat who routinely disappears from his North Carolina home. Where does he go? To find out, an engineer straps a camera to the cat, inadvertently creating a media sensation.

About the filmmaker

Seth Keal makes his directorial debut with CatCam, which won the Jury Prize at SXSW and the Best Online Short at the Tribeca Film Festival, amongst other festival awards.  Keal’s previous credit includes producer of Emmy nominated Joan Rivers: A Piece of Work. He is also credited with multiple roles in the production for television content for National Geographic Channel and the History Channel. He currently resides in New York City.

I particularly like the ending, in which the cat Mr. Lee adopts owner Jürgen despite Jürgen’s diehard dislike of cats. They have a way of doing that. (Don’t miss Mr. Lee getting his shrimp-filled trophy!)

You can vote for this film, or others, here.

Cornell HawkCam is back!

March 18, 2013 • 10:06 am

Last year, as I recall, we followed the adventures of the two red-tailed hawks at Cornell, Ezra (male) and Big Red (female).  Well, the hawkcam is up again and alert reader Kevin informs me of this:

I don’t know whether you’ve noticed but Big Red and Ezra are back at the Cornell Hawk Cam [JAC: live video at the link]

They moved nests this year so the cam was down for a bit while they put new ones in, but there are currently two eggs.  Of course, the real action won’t start for a month or so, when we’ll get to see the usual pile of innards and pigeon feet cluttering the nest!

Here’s a screenshot. I’m not sure what the incubation period is for Buteo jamaicensis eggs, but I’m sure one of our readers can tell us when the eggs are likely to hatch (one egg was laid March 15, the other yesterday).

[screenshot: the Cornell hawk nest with two eggs]

“Fine-tuning”: is the multiverse a Hail Mary pass by godless physicists?

March 18, 2013 • 7:21 am

I’ve finished the book Atoms and Eden, a series of interviews by Steve Paulson of luminaries in the science-and-religion debates. The book includes interviews with people on all sides, including Karen Armstrong, John Haught, Sam Harris, Dan Dennett, Jane Goodall, and so on. Judging by Amazon, the book hasn’t garnered much interest, but I’d recommend it. Paulson asks some hard questions of both pro- and anti-religious people, and their answers are often revealing, showing a side of the person that might surprise you.

For example, here’s a question-and-answer involving Paul Davies, a well-known British physicist, now a professor at Arizona State University and director of BEYOND, the Center for Fundamental Concepts in Science. He’s also religious, and has published tons of accommodationist material, eventually garnering the lucrative Templeton Prize in 1995. Here’s the Q&A on p. 256:

Q: Do you think one reason the multiverse theory has become so popular in recent years is to keep the whole idea of God at bay?

A [Davies]: Yes.

Davies goes on to say this in response to a follow-up question (p. 257):

“There’s no doubt that the popularity of the multiverse is due to the fact that it superficially gives a ready explanation for why the universe is bio-friendly. Twenty years ago, people didn’t want to talk about this fine-tuning because they were embarrassed. It looked like the hand of a creator. Then along came the possibility of a multiverse, and suddenly they’re happy to talk about it because it looks like there’s a ready explanation. . . Even the scientific explanations for the universe are rooted in a particular type of theological thinking. They’re trying to explain the world by appealing to something outside of it.”

Now this struck me as bizarre. I’d written about multiverses before, in my New Republic piece on the incompatibility of science and faith, and I remembered that multiverse theory was not a face-saving device confected by physicists to get rid of the annoying God problem.  And I also knew it was a prediction from some well-regarded theories of astrophysics, not a Hail Mary pass* by god-hating scientists. But, to be sure, I sent Davies’s answer to the Official Website Physicist™, Sean Carroll, asking him, “What Davies said isn’t right, is it?” Sean wrote me back, agreeing, and I reproduce his email with permission:

That’s not right at all. As I explain in my Discover magazine piece, “Welcome to the multiverse“: The multiverse idea isn’t a “theory” at all; it’s a prediction made by other theories, which became popular for other reasons. The modern cosmological version of the multiverse theory took off in the 1980’s, when physicists took seriously the fact that inflationary cosmology (invented to help explain the flatness and smoothness of the early universe, nothing to do with God) often leads to the creation of a multiverse (see here for a summary). It gained in popularity starting around the year 2000, when string theorists realized that their theory naturally predicts a very large number of possible vacuum states (see e.g. http://arxiv.org/abs/hep-th/0302219). All along, cosmologists have been trying to take the predictions of their models seriously, like good scientists, not trying to keep God at bay. (As far as most cosmologists are concerned, God has been at bay for a long time now; the idea that current physics research is being affected in any noticeable way by the idea of God is way off base.)

There is only one connection between the multiverse and God, which is that some physicists have said that anthropic selection effects within the multiverse could lead to the kind of fine-tuning one would otherwise expect from an intelligent designer (see e.g. the subtitle of Leonard Susskind’s book “String Theory and the Illusion of Intelligent Design.”) But the connection was never a popular one, and the idea of God is essentially never mentioned in serious cosmological discussions.

Do read Sean’s two-page piece, which is the best concise explanation of the origins of multiverse theory I’ve seen.

When Sean confirmed my take on Davies’s statement, I became quite flummoxed, for Davies is a reputable physicist, and the interview was published in 2010.  He should know better than to make an erroneous statement like this, which will immediately make believers see multiverses as motivated by atheism, not science. But that’s not right.

The fact is that Davies himself appears motivated by his faith and his accommodationism, for there’s no other explanation for why a smart astrophysicist could say something so misleading.

And, as comic relief, I offer you another LOLzy statement by an interviewee: John Haught, theologian at Georgetown University and my erstwhile debate opponent. On pp. 89-90, explaining why God is hidden, he emits this gobbledygook:

“The traditions of religion and philosophy have always maintained that the most important dimensions of reality are going to be the least accessible to scientific control. There’s going to be something fuzzy and elusive about them. The only way we can talk about them is through symbolic and metaphoric language—in other words, the language of religion. Traditionally, we never apologized for the fact that we used fuzzy language to refer to the real, because the deepest aspect of reality grasps us more than we grasp it. So we can never get our minds around it.”

That is a masterpiece of equivocation. Sophisticated Theology™ doesn’t get any funnier than that.

And, as a purgative for Haught, I offer you the words of Richard Dawkins on p. 103:

Q: Yet most moderate religious people are appalled by the apocalyptic thinking of religious extremists.

A [Dawkins]: Of course they’re appalled. They’re decent, nice people. But they have no right to be appalled because, in a sense, they brought it on the world by teaching people, especially children, the virtues of unquestioned faith.

________

*For those unfamiliar with American football, Wikipedia explains this term:

Hail Mary pass or Hail Mary route in American football refers to any very long forward pass made in desperation with only a small chance of success, especially at or near the end of a half. The expression goes back at least to the 1930s, being used publicly in that decade by two former members of Notre Dame’s Four Horsemen, Elmer Layden and Jim Crowley. Originally meaning any sort of desperation play, a “Hail Mary” gradually came to denote a long, low-probability pass thrown in a desperate attempt to score a touchdown late in a game.

Here’s a famous Hail Mary pass thrown by Boston College quarterback Doug Flutie in 1984, giving his team a 2-point win over Miami.