A remarkable case of mimicry: katydid nymph mimics ant

April 28, 2017 • 12:30 pm

The nymphs (juvenile stages) of katydids—orthopterans from the family Tettigoniidae—look pretty much like miniature katydids; here’s a screenshot of what you see when you do a Google image search for “katydid nymph” (click to enlarge):

But one species, at least, has modified its nymph stage to look like a hymenopteran. Here’s a photo by Piotr Naskrecki taken in Mozambique:

Now clearly selection is responsible for this, but what kind? Does it hide from predators by running with real ants (crypsis), or does it resemble a stinging or toxic ant that predators have learned to avoid (Batesian mimicry)? I don’t know, but it’s a lovely mimic.

h/t: Matthew Cobb, who keeps his eye on Twitter

Free Speech: Who gets to decide who speaks? But now we have a Decider!

April 28, 2017 • 10:30 am

This week we’ve seen two articles by English professors arguing that censorship is essential to ensure free speech. One, in the New York Times, was by Ulrich Baer from New York University, and the other, in the New Republic, was by Aaron Hanlon from Colby College (links go to my analyses, which contain links to the original pieces). Both professors claimed that yes, free speech was good, but that “hate speech”—speech that dehumanized people or attacked their identities or nullified their “lived experience”—was not free speech and thus was okay to censor. (By “censor,” I mean disinviting people who have already been invited to speak at universities, or harassing them in such a way that they become unable to give their talks.)

I’m thus pleased to see some pushback in the liberal press against what I consider not only dumb but dangerous arguments for censorship: arguments that, if they became policy, would allow only approved forms of speech on campus. One article, by Kevin Drum (a cat lover) is at Mother Jones, and is called “The most important free speech question is: Who decides?” Here’s an excerpt, in which Drum starts by referring to Aaron Hanlon’s New Republic piece:

The sophistry here is breathtaking. If it’s just some small group that invites someone, then it’s OK if the rest of the university blackballs their choice. After all, universities are supposed to decide what students don’t need to know. It may “look like censorship from certain angles,” but it’s actually the very zenith of free expression.

. . . But now everyone is weighing in, and here on the left we’re caving in way too often to this Hanlon-esque lunacy. Is some of the speech he’s concerned about ugly and dangerous and deliberately provocative? Of course it is. But that’s not a reason to shut it down. That’s the whole reason we defend free speech in the first place. If political speech was all a harmless game of patty-cake, nobody would even care.

Speech is often harmful. And vicious. And hurtful. And racist. And just plain disgusting. But whenever you start thinking these are good reasons to overturn—by violence or otherwise—someone’s invitation to speak, ask yourself this: Who decides? Because once you concede the right to keep people from speaking, you concede the right of somebody to make that decision. And that somebody may eventually decide to shut down communists. Or anti-war protesters. Or gays. Or sociobiologists. Or Jews who defend Israel. Or Muslims.

I don’t want anyone to have that power. No one else on the left should want it either.

Well, this argument is not new; it was made by Hitchens and parroted by me, and it’s a good argument. Who would you trust to make all the decisions about what you can hear on campus? Anyone? I can’t name anybody save someone like Hitchens, who would censor nobody. There is no clear distinction between “acceptable” and “unacceptable” speech, and different people have different views. I, for one, wouldn’t try to censor a Holocaust denialist, because I would want to hear what kind of arguments he/she would make. At the very least, hearing someone with “offensive” views gives you an idea of what your opponents have to say, and a chance to hone your own arguments. There’s not really a downside, unless you think that you need to be The Decider because the Little People might be swayed by offensive speech.

A related piece is in the Washington Post, written by Samantha Harris, vice president of policy research for the estimable organization Foundation for Individual Rights in Education (FIRE). The piece is called “Lawyer: Stop using censorship to ‘protect’ free speech,” and here’s an excerpt, which gives some tangible examples; I particularly like the invocation of Ayaan Hirsi Ali. Harris’s starting point is Ulrich Baer’s call for censorship in the New York Times:

Under this view, some enlightened group of people, claiming a monopoly on the truth, decide which viewpoints are permissible and which must be shut out because they “invalidate the humanity” of others. In Baer’s case, these impermissible views include not only Holocaust denial and white supremacy, but also opposition to illegal immigration and transgender rights, among other things.

. . . But Baer assumes, quite dangerously, that we can know in advance whose stories and experiences are “legitimate” and whose are not.

What about, for example, the lived experiences of genuine dissenters from marginalized groups — people like Ayaan Hirsi Ali, whose arguments about the treatment of women in Islam have been the frequent target of calls for censorship because of their perceived insensitivity to Muslims, though Hirsi Ali was herself raised as a Muslim and subjected to female genital mutilation?

Would Baer and others like him consider her criticism of Islam’s treatment of women to be a legitimate personal narrative, or is it one of those topics that should be off-limits because reckoning with Hirsi Ali’s argument might force other Muslims to defend their humanity? Or is it both? And if it is both, how do we decide — and who decides — which aspect should prevail?

Down the rabbit hole we go.

And what about people like Jonathan Rauch, a gay man who thinks that unfettered free speech is actually critical to minority rights? What if Rauch — and not those who believe that minority rights require the suppression of “hate speech” — is correct?

In a 2013 article for Reason magazine, Rauch described growing up gay in an era of terrible prejudice, and observed how the right to free speech was critical to the success of the gay rights movement.

Now you can say that gay rights were clearly something that should have been articulated, and those opposing them censored, but remember that long ago there were many who had arguments against gay rights, and the morality of equal rights for gays, and of their marriage, wasn’t universally accepted. Societies change, and that change is promoted by free discussion. Criticism of Islam is considered “hate speech” by many Muslims; should we ban it? Then we lose the opportunity of reforming the religion to eliminate its more oppressive tenets.

But one person has set himself up as The Decider—the person who can and has determined which speakers shouldn’t be allowed on college campuses. It’s those like Coulter and Milo, who are “Nazis” and “far-right asshats.” Further, we have to make such decisions because finance and time dictate that a college can host only a limited number of speakers (not a good argument!). We must invite only those people who promote education, not ignorance, and who open rather than close minds. (I presume, based on The Decider’s past posts, that Ayaan Hirsi Ali, Richard Dawkins and Sam Harris would not fill that bill.)

A bit of The Decider’s argument:

One catch. You want infinite free speech on campus, you have to give us infinite money, infinite time, infinite resources. Fair enough?

Somehow, I don’t think it’s coming. Especially since the same people who want to see Ann Coulter given a privileged spot on the non-infinite roster of available speaking engagements are the people who under other circumstances complain bitterly about diversity. The rage always seems to rise on behalf of far-right asshats and Nazis, like Coulter or Yiannopoulos, have you noticed?

But even if we could accommodate everyone and every single point of view, the result has a name: it’s called cacophony. I don’t see how that is useful or constructive. Universities have a mission of promoting education; should we, in the name of Free Speech, insist that we also promote ignorance? That would be incoherent.

Universities are not neutral on all issues, nor should they be. We try to encourage open-mindedness; you can’t do that by also opening the door to those who encourage the closing of minds. We try to serve a diverse community; that doesn’t work if you take a disinterested position on purveyors of hate and bigotry. We aim to be selective and teach the best ideas that have the support of an educated, informed group…the antithesis of indiscriminate acceptance of bad, unsupported, rejected falsehoods. Coulter has nothing to contribute.

I know what’s next: Marketplace of ideas! Exposing students to novel points of view! The university should take students out of their comfort zone!

This is true. We do that all the time. I introduced my students to epistasis last week — discomfort and confusion were sown everywhere. It was good. But none of these arguments apply to Ann Coulter.

. . . Further, if you think being a place for education and intelligence and learning means you’re supposed to be wide open and completely neutral on everything, letting every voice through unfiltered, you don’t understand the university. I’ll give you two words: critical analysis. The university will examine your ideas, all right, and it will judge them. Nazis don’t get to come back and demand a do-over and a new grade.

Those protests? Those are students exercising their intelligence, and then going into the public square to exercise their free speech. Why? Did you think free speech meant freedom from criticism?

Note to The Decider: none of us have ever made the stupid argument that free speech meant freedom from criticism. But criticism is different from violence, and the former doesn’t justify the latter.

Were The Decider to run a university, we would see nobody on the Right, or especially the Far Right, allowed to speak. After all, time and money are limited, and they’re asshats anyway.

In fact, cacophony is exactly what we need, for, as the Founding Fathers realized, progress comes not from a harmony of opinions, but from a clash of opinions. I would not want The Decider to decide who promotes the “best ideas” (which of course are his ideas). We wouldn’t hear from the Right, and many from the Left would also be censored—those, like Sam Harris, Christina Hoff Sommers, and Ayaan Hirsi Ali, who are the “wrong kind of Leftists.” What a constricted intellectual world!

Finally, as a geneticist, I have to say that the analogy between epistasis (gene interaction) and Ann Coulter is ludicrous.

h/t: Grania, Richard W.

Obama gets $400,000 to speak at conference organized by Wall Street investment bank

April 28, 2017 • 8:45 am

Sound familiar? Like what Hillary Clinton did when she got over $200,000 for each of two speeches to Goldman Sachs a few years ago?

Yep, Obama—our Barack Obama, former President—is scheduled in September to get nearly twice as much as Clinton for a speech: $400,000 for one hour’s work (I bet others will write the damn speech). Further, it’s a speech at a health care conference organized by Wall Street bank Cantor Fitzgerald. It’s also exactly the same amount Obama earned per YEAR as President of the U.S. The New York Times reports this:

On Wednesday, Mr. Obama’s spokesman defended the former president’s coming speech, saying Mr. Obama decided to give it because health care changes were important to him. The spokesman, Eric Schultz, noted that Cantor Fitzgerald is a Wall Street firm but pointed out in a statement that as a presidential candidate, Mr. Obama raised money from Wall Street and went on to aggressively regulate it.

Mr. Obama will spend most of his post-presidency, Mr. Schultz said, “training and elevating a new generation of political leaders in America.”

If health care changes are important to Obama, there are plenty of venues where he can express his ideas and program without lining his pockets.

Well, at least Obama isn’t in a position to make policy about healthcare any more, but I find it unseemly for him to be so grasping and acquisitive after he left the Presidency. After all, the man is already wealthy from his earlier books (if you don’t believe me, I’ll show you a picture of his mansion about two miles from where I’m sitting). For one thing, he’ll cop several million bucks as an advance on the book he’s writing. And he won’t lack for opportunities in the future. The fact that Obama regulated Wall Street and cares about healthcare is just an excuse: the real reason is that he wants lots of money. He doesn’t need tons of extra money, especially from Wall Street firms. If he has a message, let him convey it to the American public.

I’m sure there are many readers who will say, “This is fine: more power to him. If somebody’s willing to pay Obama that much for an hour’s work, let him take the dough.” But would Jimmy Carter do that? Can you keep an image as a humanitarian while taking big bucks from Wall Street? Many of us criticized Hillary Clinton for giving $200,000+ speeches to Wall Street firms, and if we now say that what Obama is doing is okay, that’s a bit of a double standard. And yes, I know Clinton did it when it was clear she would run for President, but remember that Obama will still act as an advisor to Democrats.

In fact, even other Democrats, including Bernie Sanders and Elizabeth Warren, were critical of this news. From The Independent:

Bernie Sanders has said he thinks it is “unfortunate” Mr Obama opted to receive the fee and argued the decision signifies the profound influence big business has on the political system.

“I think it just speaks to the power of Wall Street and the influence of big money in the political process,” the Democrat Vermont senator told Bloomberg.

“I think it’s unfortunate. President Obama is now a private citizen and he can do anything he wants to but I think it’s unfortunate. You have the former president of Goldman Sachs is now the chief financial advisor for President Trump, and then you have this, so I think it’s unfortunate”.

. . . Senator Elizabeth Warren has also expressed her reservations, saying she was “troubled by” the speaking fee.

“I was troubled by that,” Warren said on SiriusXM’s Alter Family Politics during an appearance to promote her new book.

“One of the things I talk about in the book is the influence of money. I describe it as a snake that slithers through Washington. And that it shows up in so many different ways here in Washington.”

Readers’ wildlife photos

April 28, 2017 • 7:30 am

Don’t forget to send in your good photos. I have a decent backlog, so if your pics haven’t appeared yet, don’t be concerned. They will.

Stephen Barnard has sent photos of birds in flight, but left the identification to you. His comment:

A few of the BIFs (birds in flight) photos I’ve taken recently. Species identification is left to the reader. I’ve posted all these before.


Friday: Hili dialogue

April 28, 2017 • 6:30 am

It’s Friday, April 28, 2017, and it’s National Blueberry Pie day. The best specimen of that genre I ever had was a lowbush (wild) blueberry pie from Helen’s Restaurant in Machias, Maine: a mixture of cooked and uncooked berries in an open-top crust, piled high with real whipped cream.  If you’re in Maine and it’s blueberry season, go to Helen’s–no matter how far you have to drive. It’s also National Heroes’ Day in Barbados (is Rihanna included?).

On this day in 1789, the mutiny on the Bounty took place: Fletcher Christian and his fellow mutineers set Captain William Bligh and 18 loyal crewmen adrift, and the mutineers wound up settling on Tahiti and Pitcairn Island. Here’s a trailer for the 1935 film version starring Charles Laughton and Clark Gable:

On April 28, 1923, Wembley Stadium was opened, and on that date in 1932, a vaccination against the yellow fever virus was announced (I’ve had the shot). On this day in 1945, Benito Mussolini and his mistress Clara Petacci were killed by a firing squad, and then their bodies publicly displayed and degraded. Two years later, Thor Heyerdahl and his mates set out from Peru on the Kon-Tiki to determine if South Americans could have settled Polynesia (in reality, the Polynesians came from Southeast Asia). On April 28, 1967, Muhammad Ali refused to be inducted into the U.S. Army as a self-declared conscientious objector. He was convicted of a felony but never served time, and the Supreme Court overturned his conviction.

Those born on this day include James Monroe (1758), Lionel Barrymore (1878), Kurt Gödel (1906), Blossom Dearie (1924), Harper Lee (1926), Ann-Margret (1941), Jay Leno (1950) and Penélope Cruz (1974). Those who died on this day include Benito Mussolini (1945) and Francis Bacon (1992). Meanwhile in Dobrzyn, Hili’s quoting Hebrew from the Biblical story of Belshazzar’s Feast. Where does she learn this stuff?

Hili: Mene, Tekel, Peres
A: What do you mean?
Hili: It’s time to pay bills again.
In Polish:
Hili: Mane, tekel, fares.
Ja: O co ci chodzi?
Hili: Znów trzeba płacić rachunki.

Out in Winnipeg, Gus got into mischief. The report from his staff:

Here’s a Gus pic from yesterday. Gus jumped into the laundry basket (clean laundry, naturally) and proceeded to clean his feet. That’s why it’s called ‘the wash’, isn’t it?

In Montreal, Linux Bernie, the new dog of readers Anne-Marie and Claude, WON the tape test. When I inquired whether the picture below was a setup, with the d*g put into the square, I was told that L.B. walked into the square, and then was simply told to “sit.” Does that count as winning?

Lagniappe—a tweet:

https://twitter.com/virtuallydead/status/857834735170723840

Nature paper suggests humans inhabited North America 130,000 years ago

April 27, 2017 • 5:43 pm

by Greg Mayer

As Jerry noted yesterday, in a new paper in Nature, Steven R. Holen and colleagues report finding the remains of a butchered 130,000-year-old mastodon in San Diego. (If you haven’t already done so, do go take a look at Jerry’s post, which includes a video press release, and illustrations from the paper.)

The key words in the first sentence are ‘butchered’* and ‘San Diego’. The first word indicates that people had taken the bones of the 130,000-year-old mastodon apart– which in itself would be a “neat, but what’s the fuss” result. It’s ‘San Diego’ that’s the cause of the fuss. The peopling of the Americas has been a contentious topic for some time, but virtually all the debate has concerned a relatively slim time interval– 12-30 kya (see here for a previous discussion at WEIT, and this news piece in Science about two recent papers with contrasting conclusions). The San Diego find is thus 100,000+ years earlier!

So what evidence do they have for this early arrival? First, they have the mastodon, whose bones were fractured in ways which they find inconsistent with damage by carnivores or the environment, but which appear consistent only with being struck with implements. They did a lot of breaking of elephant bones in order to try to simulate the damage to the mastodon, and concluded that tools alone could do the trick. The mastodon’s remains were radiometrically dated at 130.7 ± 9.4 kya. In addition to the mastodon, they also found stone tools, which they interpret as hammerstones and anvils.

These results would have many important implications for human evolutionary history; but first we must ask, are the results correct?

I must admit I’m dubious. The anvil and hammerstones are not the sorts of objects which are unquestionably manufactured– they are not like finely fluted spear points, whose human origin cannot be doubted. The breakage patterns in the bones do indicate that the breaks occurred perimortem, but I’m not sure the breaks could not be due to non-human causes. The dating is directly on the mastodon, which is good– they’ve not dated some possibly extraneous item which could have been redeposited from earlier strata. But, nonetheless, dating is subject to various artifacts.

As Carl Sagan used to say, “Extraordinary claims require extraordinary evidence.” What makes the current claim extraordinary is that there’s no other evidence of human presence in the Americas for ca. 100,000+ years after this find. And it’s not like the late Quaternary of America is an unstudied or poorly known stretch of time! I don’t regard fracture patterns and crude tools to be sufficiently extraordinary evidence to overcome, in a single go, the weight of that 100,000-year absence. It is much more reasonable to think that the new data can be reconciled with all the past data in a way that does not require us to discount the past data. And thinking “they must have made a mistake somewhere with the new data” is a perfectly plausible way of reconciling the two. This conservatism in the face of anomalies is a key part of the method of science– it properly proportions belief to the evidence.

On the other hand, the new data do not threaten to overturn any fundamental principles, merely a seemingly well-attested fact of evolutionary history, and such facts have been overturned before. So, we must ask, but what if they’re right?

The most interesting implication, to me, is that if there were people here 130 kya, they went completely extinct. It means that human habitation of an entire hemisphere is an iffy thing. The real first Americans got wiped out by something– disease, predators, climate, competitors, whatever. Who would these now extinct people have been? Well, if they got to America not too long before the radiometric date, they would probably be Neanderthaloid (by which I mean the varied archaic Eurasian subspecies of Homo sapiens with which anatomically modern humans interbred after their spread from Africa). If they came much earlier, they might have been Homo erectus (which would make Harry Turtledove’s A Different Flesh, in which the first European settlers of America encounter not Indians, but “sims”, prophetic!).

There would also be a possibility that these first Neanderthaloid Americans survived, and that the anatomically modern human colonizers of ca. 20 kya interbred with them in the course of replacing them, just as their forebears did in Asia. However, because American Indians are not, as far as I know, enriched for Neanderthaloid alleles relative to other Eurasians (who are 1-4% Neanderthaloid; a bit higher in Melanesia), this seems unlikely. (There are claims out there that Indians are enriched for Neanderthaloid genes, but I don’t know how that got started; East Asians, from which, at least generally, American Indians descend, are Neanderthaloid enriched relative to Western Europeans, which seems to indicate more than one episode of interbreeding in the course of their migration from Africa.)

* I use “butchered” here in the sense of “processed for eating”, as the bones were presumed broken apart to get at the marrow. The paper uses “butcher” in the narrower sense of “cut with a knife or similar implement”. The paper does not say the mastodon was cut with a knife or other sharp tool.


Holen, S. R., T. A. Deméré, D. C. Fisher, R. Fullagar, J. B. Paces, G. T. Jefferson, J. M. Beeton, R. A. Cerutti, A. N. Rountrey, L. Vescera, and K. A. Holen. 2017. A 130,000-year-old archaeological site in southern California, USA. Nature 544:479-483.