Are humans still evolving?: a Radio 4 show.

August 17, 2011 • 5:02 am

The other day, BBC Radio 4 presented a half-hour show hosted by  Adam Rutherford:  “Human evolution versus cultural evolution,” the first of a two-part series called “In our own image: evolving humanity”.  You can hear the show at the link, and I understand it will be up for a week.

The show features evolutionary luminaries discussing the role of genetic versus cultural evolution in the formation of our own species: these talking heads include Steve Jones, Steve Pinker, Kevin Laland, Steve Stearns, and my own Chicago colleague Anna di Rienzo.

If you’ve been following this website regularly, you won’t learn a lot that is new, but you may want to listen if you’re not up to speed on the question of whether and how much humans are still evolving, which is the topic of this show.

The participants have different takes.  Most say “yes,” though Steve Jones, as always, claims that both the evidence for and the scope of genetic evolution in modern Homo sapiens are limited. He argues that since our divergence from the lineage that led to modern chimpanzees, human ancestors have evolved more slowly than those of chimps, and that the evolutionary changes in our ancestry mostly involve losses (e.g., hair).  That, of course, neglects the tremendous cognitive gain that we’ve experienced.  Jones also argues that the opportunity for selection, which involves variation in both longevity and offspring number (“reproductive success”), has decreased drastically due to improvements in health care and sanitation.
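The “opportunity for selection” that Jones invokes has a standard formalization: Crow’s index, the variance in offspring number divided by the square of the mean.  A minimal sketch, using invented family sizes (these numbers are purely illustrative), shows why narrowing the spread in reproductive success shrinks the maximum possible strength of selection:

```python
# Crow's index of the opportunity for selection: I = Var(W) / mean(W)^2,
# where W is offspring number. Hypothetical counts, for illustration only.
from statistics import mean, pvariance

def opportunity_for_selection(offspring_counts):
    w_bar = mean(offspring_counts)
    return pvariance(offspring_counts) / w_bar ** 2

pre_modern = [0, 0, 1, 2, 3, 5, 8]   # high variance in reproductive success
modern     = [1, 2, 2, 2, 3, 2, 2]   # low variance with modern health care

print(opportunity_for_selection(pre_modern))  # larger index
print(opportunity_for_selection(modern))      # much smaller index
```

Note that the index is only an upper bound: it measures how much selection *could* act, not whether the variation in offspring number is actually correlated with any heritable trait, which is the point Stearns’s Framingham work addresses.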

As I’ve discussed before, Steve Stearns and his colleagues have argued otherwise using the long-term data from the Framingham Heart Study.  Contrary to Jones, they claim that there is still sufficient variation among couples in offspring number to lead to substantial selection on health-related traits.

Steve Pinker talks about cultural versus genetic evolution; it may surprise some that he’s not a diehard evolutionary psychologist here, and argues that traits like music-making might not have been the direct object of selection, but rather evolutionary byproducts of other evolved cognitive capabilities. (Some evolutionary psychologists have claimed that traits like art and music were the direct objects of selection, since proficiency in those arts conferred higher reproductive success on their practitioners.)

Kevin Laland discusses the recent evidence in the genome for selection on humans; this involves population-genetic analysis of “selective sweeps,” which can detect signatures of natural selection by looking at the diversity of DNA variation around various genes.  Low diversity in a region indicates that a nearby gene has recently undergone a “fixation,” that is, a single “allele,” or gene copy, has risen to high frequency from one or a few original mutations.  See my previous post on this work.
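The logic of a sweep scan can be sketched in a few lines: compute nucleotide diversity (the average number of pairwise differences per site) in windows along aligned sequences, and look for a dip where a recent fixation has dragged nearby variation along with it.  The four haplotypes below are invented toy data, not real human sequence:

```python
# Toy sketch of sweep detection via windowed nucleotide diversity (pi).
# A recent sweep carries linked variants to fixation, so pi dips around
# the selected site. Sequences here are invented for illustration.
from itertools import combinations

def pi_in_window(seqs, start, end):
    """Average pairwise differences per site across seqs[start:end]."""
    pairs = list(combinations(seqs, 2))
    if not pairs:
        return 0.0
    diffs = sum(sum(a != b for a, b in zip(s1[start:end], s2[start:end]))
                for s1, s2 in pairs)
    return diffs / (len(pairs) * (end - start))

# Four haplotypes: identical in the middle (the swept region),
# but variable at both edges.
haps = ["ACGTAAAATTTT",
        "ACCTAAAATTTA",
        "ATGTAAAATGTT",
        "ACGAAAAATTCT"]

for start in range(0, 12, 4):
    print(start, round(pi_in_window(haps, start, start + 4), 3))
```

Real methods (e.g., tests based on the site-frequency spectrum or haplotype homozygosity) are far more sophisticated, but the core signal they exploit is this local deficit of diversity.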

Spencer Wells talks about genetic changes due to migration, concentrating on genes affecting skin pigmentation. He fails to mention, though, that this story is still not well understood, and that there’s controversy about the classic “melanoma versus vitamin D” explanation.

Finally, Anna di Rienzo discusses her lab’s work on the Duffy antigen showing recent selection for malaria resistance, and talks about work on other “disease genes.”

It’s a good short summary for the layperson. I love the eloquence of these scientists, as well as the timbre of Pinker’s voice and Steve Jones’s hybrid Welsh/English accent (his take on a Cro-Magnon man riding the bus in Camden Town is hilarious).  My only plaint: there could have been a bit more discussion of evidence for evolution via cultural change, that is, the “gene-culture coevolution” exemplified by the evolution of lactose tolerance in pastoral human populations (I talk about this in WEIT, and have posted on it here).

The question about evolution I get most often when talking to the general public is this: “Are humans still evolving?”  If you can’t answer that question, you’ll be able to after a half-hour investment in listening to the show.

h/t: Dom

WEIT translated into Arabic (and a Kindle edition)

August 17, 2011 • 4:25 am

For religious reasons, there’s a dearth of books on evolution in the Islamic world, and none, as far as I know, that recount the evidence for evolution.  But the National Center for Translation in Cairo has just commissioned an Arabic translation of WEIT; there was a slight delay due to regime change!

The book won’t be out  for a while (they have 24 months until publication), and I hope it gets disseminated to countries beyond Egypt.  It’s gratifying that speakers of Arabic get the chance to learn about evolution in their own language.

This makes 14 translations now, but this is the one that was most important to me.

I guess I forgot to announce that the book is also available in a Kindle edition, which you can buy here.

An awesome predator: the osprey

August 16, 2011 • 10:17 am

Courtesy of Arkive, we have this magnificent video of an osprey fishing. There is only one species of osprey (Pandion haliaetus), and it’s found on every continent save Antarctica.  It’s also the only species in its genus, and the only species in the family Pandionidae.

They eat fish almost exclusively.  There are three vignettes in the following video, and lots to see, including their amazing abilities to spot fish from high in the air, to immerse themselves completely in water while hunting, to catch fish almost larger than they can carry, and to take off from the water’s surface. Notice, in one instance, how the osprey shakes itself free of water while flying with a fish.  Notice, too, that the osprey in the last segment holds a fish with a single talon.

[Embedded video: an osprey fishing, via Arkive]

h/t: Donald

Uncle Karl tells us how to read the Bible

August 16, 2011 • 6:18 am

Karl Giberson is still wrestling in public with his faith, and I’m reminded of the last stanza of Carrion Comfort, by Gerard Manley Hopkins:

Cheer whóm though? The héro whose héaven-handling flúng me, fóot tród
Me? or mé that fóught him? O whích one? is it eách one? That níght, that year
Of now done darkness I wretch lay wrestling with (my God!) my God.

Karl’s concern—as with all the folks at BioLogos (Giberson used to be vice-president)—is whether to take the Bible as metaphor or literal truth.  In a new piece at HuffPo, “The Bible is a library, not a book,” he comes down largely on the former side. Or does he?  He recognizes that much of the Bible is fiction and metaphor. The Adam and Eve story is an example, though Giberson deliberately avoids discussing how the falsity of that tale impacts Christian theology.

The biblical references to the fixed earth and the first couple require interpretation. You cannot simply read a book like the Bible — you have to read it through complex filters to properly understand it. The most obvious such filter is that of language. The story of Adam and Eve originated as a Hebrew oral tradition, which is a long ways from an English prose translation. And there are more complex filters related to culture, author intent, literary form, historical setting, anticipated audience and so on.

Application of these filters leads many readers to conclude that the biblical story of Adam and Eve was never intended to be read as literal history. The word “Adam,” for example, is the generic Hebrew word for “man.” “Eve” means “living one.” The story is about a couple with the improbable names “Man and Living One,” who reside in a magical garden and take walks with God in the evening. It is far from obvious that this should be read as literal history.

And then Karl addresses a question I raised in a previous post: if the Bible is largely metaphor and fiction, how do we know which parts aren’t fiction? For if they all are, then fundamental tenets of Christianity, like the virgin birth, divinity, and resurrection of Christ, are simply stories. And even many liberal Christians assert that these “facts” about Jesus are non-negotiable—that is, if they are fiction then their basis for belief completely crumbles.

But how do we decide which parts of the Bible should be read literally? This question is often posed with an “Aha! I have got you” exclamation, as though the inquisitor is certain it cannot be answered. Jerry Coyne, in his endless quest to discredit all things religious, put it like this in a recent blog:

“Sophisticated” theologians who urge a non-literal reading of the Bible always put themselves in a bind. And it is this: if the Bible is not to be read as a literal account of the truth, then how do we know which parts really are true, and which parts are fiction or metaphor? Nobody has ever found a convincing way to winnow the true from the metaphorical, and so it becomes an exercise in cherry-picking.

Less triumphalist versions of this same question were posed to me by a radio listener this morning and a former student yesterday on my Facebook wall.

Giberson thinks the answer to this dilemma is “straightforward, even simple”.  What is it?  The answer is unbelievable:

The Bible is not a book. It is a library — dozens of very different books bound together. The assumption that identifying one part as fiction undermines the factual character of another part is ludicrous. It would be like going into an actual physical library and saying “Well, if all these books about Harry Potter are fictional, then how do I know these other books about Abraham Lincoln are factual? How can Lincoln be real if Potter is not?” And then “Aha! I have got you! So much for your library.”

Acknowledging that the Bible is a library doesn’t do all the hard work for us, of course. But recognizing this at least lets us avoid the so-called slippery slope where a non-literal approach in one place somehow compromises a literal approach in another.

As you see, this is a complete non-response.  The question I asked is this: which parts of the bible are Harry Potter and which parts are Abraham Lincoln?  And how can you tell? Admitting that some parts of the Bible are literal truth and others fiction does not enable us to tell them apart!  What are the criteria we should use?  We have, of course, empirical ways of knowing that Harry Potter isn’t real but Abraham Lincoln was.  We can’t apply those criteria to the divinity or resurrection of Jesus, and many scholars aren’t even certain that Jesus existed.

Giberson’s real answer, of course, would probably be something like this: “I just know that the stuff about Jesus is real. Therefore I needn’t use external or historical criteria to distinguish Biblical fact from fiction.  I know the answer by revelation—by what the church tells me and by what I feel in my heart.”

As always, interpreting the Bible rests on a combination of wishful thinking and making stuff up.

________

UPDATE:  Over at EvolutionBlog, Jason Rosenhouse has a longer and better critique of Giberson’s piece.  Here’s a bit:

As it happens, though, I do think a Christian has a way of evading this problem. He could conclude that the Biblical text is inspired only in the sense that its authors had genuine encounters with God that they then put down in writing as best they could. The words themselves are not inspired, they were written by fallible human beings. On this approach we simply abandon the notion of inerrancy, but that is good riddance to bad rubbish. To the Christian who worries that this leaves him without a firm basis for believing what the Bible says about Jesus I would simply ask what it was that convinced him of Jesus’ divinity in the first place. If it was really the complete historical accuracy of Genesis then we have a problem. But if it had something to do with religious experience, or if it was the result of positive changes in their life that occurred after coming to faith, then I fail to see how those reasons are diminished by taking a more moderate approach to the Bible.

UPDATE 2. Over at Choice in Dying, Eric MacDonald, a genuinely sophisticated (ex) theologian, also demolishes Giberson’s “library” idea. A snippet of his post:

The resurrection, in its literal meaning, is absolutely central to Christian faith as traditionally understood, that is, in accordance with the creeds and the consensus of the faithful, so it needs to stay put. That means that, by hook or by crook, Christian apologists will make the resurrection turn out to be confirmed by all the canons of critical historical scholarship, no matter to what lengths they have to go in order, to their own satisfaction, anyway, to do this. But there is simply no way that this can be done with Adam and Eve, as Giberson acknowledges. Therefore, since this story is linked to the meaning of the story of Jesus’ crucifixion, and thus to the significance of the resurrection, this must also be held onto, and made firm, but, in this case, by means of figurative interpretation.

The checker shadow illusion redux

August 16, 2011 • 4:58 am

I’ve posted on this before—it’s the most striking optical illusion I’ve seen—but this video is a far better demonstration than the graphics I used before.  It’s called the “checker shadow illusion” and was invented by Edward Adelson at MIT.  The link in the first sentence explains it: it’s based on how our visual system compensates for shadows, making the checker in shade look lighter than it really is.
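The compensation Adelson describes is often called “discounting the illuminant”: perceived lightness behaves roughly like measured intensity divided by an estimate of the local illumination, so a square sitting in shadow is judged lighter than its raw pixel value.  A crude caricature of that idea (the numbers below are illustrative, not taken from the actual image):

```python
# Crude model of lightness constancy: perceived lightness is approximately
# the measured pixel value divided by the estimated illumination. Squares
# A and B have identical gray values, but B lies in the cylinder's shadow,
# so the visual system infers a lighter underlying surface for B.
measured = {"A": 120, "B": 120}          # same gray in the image
illumination = {"A": 1.0, "B": 0.6}      # B is estimated to be in shadow

perceived = {k: measured[k] / illumination[k] for k in measured}
print(perceived)  # B comes out "lighter" once the shadow is discounted
```

This is why the illusion is a feature, not a bug: the system is recovering surface reflectance, which is usually what matters, at the cost of misreporting raw luminance.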

I can already hear the theologians crowing (as they often do) about how our senses don’t really perceive an external reality—that our idea that we can trust the world through our senses, as in science, is simply another form of faith. (Forgive me, for I just read this very argument yesterday.)

h/t: Matthew Cobb

Kitteh contest: Kizhe

August 16, 2011 • 4:38 am

Reader Theo Bromine, who writes for the Ottawa website of the Center for Inquiry, sent a story and photos of her Siamese cat Kizhe.  Sadly, Kizhe is no longer with us, but let this post be a memorial to him.

Kizhe was our 7th cat (consecutively), and our 4th simultaneous Siamese.   He was named after a mythical Russian soldier (see here), purportedly invented when the Czar misread “pod poruchiki zhe” (“the lieutenants, however”) as “pod poruchik Kizhe” (“Lieutenant Kizhe”).  Kizhe was assertive, but not aggressive, though he did have a tendency to bite holes in things (including a can of ginger ale, and a waterbed).  Like all Siamese, he loved to climb, and occasionally got himself to heights that were difficult to get down from – he was the only cat I have ever met who actually went towards a human who was trying to rescue him from a precarious situation (instead of the more common response of turning around and engaging velcro mode).

Alas, early in 2009, we noticed a small lump on Kizhe’s flank.   Its etiology defied all attempts at diagnosis (including consultations on both sides of the Atlantic and Pacific), showing some attributes of cancer and some of infection, but testing negative for all.    Despite surgery and treatment with various drugs, Kizhe’s condition continued to worsen.  We decided to give him a peaceful end to his life, and he was euthanized at home in September 2009.

We readily agreed to the vet’s request to do a post mortem and additional research.  It turned out that Kizhe had a very rare form of lymphoma, and the case has since been written up in a veterinary  journal.   Sad that he is gone, but he has left a legacy of contribution to science.

Kizhe spent most of his time during his last days on his favorite chair in our sunroom, surrounded by the sounds and smells of the backyard.

Does evolutionary convergence prove God?

August 15, 2011 • 9:44 am

Over at BioLogos, Darrel Falk, president of the outfit, presents the “evolutionary convergence” argument for God in a piece called “Was humanity inevitable?” The piece is accompanied by a Wisconsin Public Radio interview centered on Cambridge University paleontologist Simon Conway Morris and his ideas about the relationship between God and evolutionary convergence.  It’s well worth listening to, for Conway Morris’s views are criticized by others as well as receiving some surprising support.

Briefly, evolutionary convergence is the observation that some unrelated groups of animals or plants have, through natural selection, converged on similar “designs” when they find themselves in similar environments.  The classic examples (I talk about some of these in WEIT) are the placental and marsupial mammals (both, for example, have evolved mole-like forms), the vertebrate and cephalopod eyes, the fusiform shape of dolphins, fish, and ichthyosaurs, and the euphorbs of the Old World and the cacti of the New.  Convergence shows that, sometimes, genetic variation will hit on similar adaptive solutions in similar environments. It’s neither hard to understand nor difficult to accept.

Some religious biologists, however, think that God is behind this phenomenon.  I have no idea why, for the prime example they always use is human intelligence, and human intelligence is not convergent with the intelligence of any other unrelated creature.  Our mental complexity is an evolutionary one-off: like feathers or the trunks of elephants, it evolved only once.

Somehow, though, religious scientists like Conway Morris want to argue that the evolution of human-like intelligence was inevitable.  Behind this view, of course, is the faith that God acted to create humans through evolution, and that human intelligence, capable of apprehending and worshiping a God, is the acme of that God-driven process.  But why would the evolution of such an intelligence be any more inevitable than the evolution of, say, the elephant’s trunk?  I think the convergence argument for God is deeply misguided—in fact, it’s incoherent.

I’ve addressed the convergence argument before (go here, and, if you want more, trace the links back to my New Republic piece, where I discuss the issue in depth).

In the radio piece, however, Conway Morris repeats his claim, arguing that the evolution of “a sentient species, with an advanced civilization, in my view, is an inevitability.” Kenneth Miller seems to agree, but biologist Sean Carroll and philosopher Dan Dennett strongly disagree (I’m with them, of course).  Surprisingly, Richard Dawkins agrees that a high, human-like intelligence was an evolutionary inevitability, saying that “Simon Conway Morris and I are very close . . . but he thinks it implies some kind of theistic push and I don’t.”  I think Richard’s view here stems from his emphasis on “arms races” in evolution, for high intelligence is a great way to win an arms race.

UPDATE: Richard explains in a comment below where and where he does not agree with Conway Morris’s claims about inevitability (he disagrees, of course, about any theistic implications).

Falk’s piece gives both sides of the argument, but he lets Conway Morris’s erroneous claims go unchallenged. Here is one:

Given enough time and resources, [Conway Morris] says, every ecological niche will be filled up by some kind of life form.

Now how do we know that? I think it’s complete twaddle, for there are some good examples of ecological niches that have never been filled. Here’s one: snakes that eat grass. What a great niche for a snake, but although they’ve been around for 125 million years, snakes have never evolved herbivory. Yet there’s plenty of room for such creatures!  (Of course, you can always say that such a niche doesn’t exist, but then your argument becomes tautological.)  Any biologist can think of such unfilled niches.

At the end, Falk punts the question into the arms of theologians: the intellectual equivalent of tossing it into a black hole:

So is the near-certainty of human life front-loaded from the beginning? Was it predetermined from the Big Bang that human beings would eventually arise? Was it predetermined that God’s natural activity—that activity which upholds the universe and maintains all that is within it—would be sufficient for the eventual development of humans? Alternatively, was supernatural activity required for the creation of the human body? Does the Bible dictate one way or the other? Is it somehow less God’s creation if it took place through God’s natural activity? Is it somehow more God’s creation if supernatural activity was required? These are questions for theologians. Science is taking us up to the edge, as Conway Morris brilliantly shows. There, we meet the theologians, and there, we begin the journey’s next phase.

These are questions for theologians? They can tell us whether the evolution of humans was inevitable? Or that human evolution required supernatural activity?  Don’t make me laugh.  Theologians are completely incapable of answering those questions, or in fact, any questions.  Theology is not a way of finding out answers; it’s a highly developed form of rationalizing preordained conclusions.

American unbelief on the rise

August 15, 2011 • 7:20 am

Over at AlterNet, Adam Lee (whose website is Daylight Atheism) has a nice article on the rise of atheism—and atheist groups and support networks—in America, “Goodbye religion? How godlessness is increasing with each new generation.” Lee documents some horror stories about what has happened to students who opposed the incursion of prayer and religion in public schools: the principal of one such student’s school wrote to the colleges he applied for and tried to block his admission, another student’s cat was killed.  But despite this vilification, godlessness is on the rise.

The facts:

“This demographic transformation has been in progress ever since World War II, but in recent years it’s begun to seriously pick up steam. In the generation born since 1982, variously referred to as Generation Y, the Millennials, or Generation Next, one in five people identify as nonreligious, atheist, or agnostic. In the youngest cohort, the trend is even more dramatic: as many as 30% of those born since 1990 are nonbelievers. Another study, this one by a Christian polling firm, found that people are leaving Christianity at four times the rate that new members are joining.”

He quotes a sociological survey by Putnam and Campbell that says this:

” . . . Today, 17% of Americans say they have no religion, and these new “nones” are very heavily concentrated among Americans who have come of age since 1990. Between 25% and 30% of twentysomethings today say they have no religious affiliation — roughly four times higher than in any previous generation.”

This is all good, but why is it happening?  Lee says that only one theory makes sense in light of the facts: that churches are making themselves—and religion—unattractive and irrelevant by continuing to push conservative social values in an age of increasing tolerance and liberalism. We’ve all seen the rise in approval of gays and gay marriages in just the last decade, and Lee gives statistics showing not only this, but a substantial opinion by non-Christians that Christianity is “anti-homosexual.” Too, many churches hold conservative stands about women and about contraception and abortion.

The Catholic Church is, of course, the most prominent offender here, but many conservative Protestants also adhere to these dogmas.  At some point, the Catholic Church is going to wake up, for if it doesn’t liberalize it will dwindle to total irrelevance. Yet it seems blind and deaf to what’s happening.

According to Lee, then, the rise of secularism is not so much the doing of secularists or New Atheists but the result of intransigence by churches:

What all this means is that the rise of atheism as a political force is an effect, rather than a cause, of the churches’ hard right turn towards fundamentalism. I admit that this conclusion is a little damaging to my ego. I’d love to say that we atheists did it all ourselves; I’d love to be able to say that our dazzling wit and slashing rhetorical attacks are persuading people to abandon organized religion in droves. But the truth is that the churches’ wounds are largely self-inflicted. By obstinately clinging to prejudices that the rest of society is moving beyond, they’re in the process of making themselves irrelevant. In fact, there are indications that it’s a vicious circle: as churches become less tolerant and more conservative, their younger and more progressive members depart, which makes their average membership still more conservative, which accelerates the progressive exodus still further, and so on. (A similar dynamic is at work in the Republican party, which explains their increasing levels of insanity over the past two or three decades.) . . .

. . . The major churches, clinging to the inferior morality of long-gone ages, are increasingly out of step with a world that’s more enlightened, rational and tolerant than it once was. And the more they dig in their heels, the more we can expect this process to accelerate.

So much for the mantra that “religion is here to stay,” a claim that I always find annoying—and wrong in light of the dramatic decline of religion in much of Europe over the last two centuries.  If religion does stay, it will increasingly be in a less virulent form that doesn’t oppress women or gays, or intrude into the sexual lives of consenting adults. And we can count that as a victory.  (This, of course, assumes that the spread of intolerant forms of Islam doesn’t overcome this trend.)

Now maybe Lee is right (he’s surely at least partly right), but that doesn’t mean that we can’t accelerate the trend by standing up and speaking out.  As I’ve found myself, the more one points out the dangers and irrationality of faith, the more encouragement it gives others to “come out.”  I’ve discovered this from emails I’ve gotten since I became more vocal on the issue, and of course someone like Dawkins has been hugely more successful in getting others to stand up.  And I know from people who come up to me after talks that there is a large number of Americans who are nonbelievers but choose to remain silent lest they be stigmatized.  All we have to do is what gays did to gain acceptance: point out the irrationality of intolerance—and of religion—and, when we face faith-based attempts to coerce others, keep thinking that we’re mad as hell and aren’t going to take it anymore.

h/t: Grania Spingies