A paper on sexual selection is retracted

November 27, 2013 • 12:36 pm

UPDATE: Trivers and two co-authors have published a 99-page explication of the case, “The Anatomy of a Fraud: Symmetry and Dance”; you can get a free pdf here.

__________

Curiously, Nature has just retracted a paper that’s about evolution, in particular, sexual selection. The retraction, here, is short and enigmatic, and I’ll post a screenshot:

[Screenshot of the retraction notice]

The paper was actually published in 2005, and I’m not sure why retraction is warranted because “K. G. could not be contacted.” Surely there’s a deeper story here.

Below is the title and abstract of the original paper (reference and link below; free access), which reported a positive correlation among Jamaicans between measured bodily symmetry and dancing ability (as judged by a group of other Jamaicans watching films of dancers). The correlation was particularly high in males. This goes along with the common wisdom that a more symmetrical body indicates higher reproductive fitness, and makes one more attractive to the opposite sex. I don’t know a lot about the evidence, but for a while symmetry was a fad in evolutionary biology, and I guess I grew a bit suspicious about the spate of positive results. But I don’t know whether this paper was retracted because its results were wrong, or for some other reason. It’s a puzzle.

[Screenshot of the title and abstract of the original paper]

h/t: Matthew Cobb

_______

Brown, W. M., L. Cronk, K. Grochow, A. Jacobson, C. K. Liu, Z. Popovic, and R. Trivers. 2005. Dance reveals symmetry especially in young men. Nature 438:1148-50.

66 thoughts on “A paper on sexual selection is retracted”

  1. Before retracting a paper, the journal would try to get every author to agree to the retraction; presumably they couldn’t find Keith Grochow, hence the “K. G. could not be contacted.” From the link Jeff J. posted, it sounds like Grochow’s current invisibility has nothing to do with the reason for the retraction.

  2. From Jeff’s link:

    Trivers began to have doubts in 2007, when a Rutgers graduate student was unable to replicate some of the paper’s findings. On investigation, Trivers found inconsistencies in symmetry measurements between a data set kept by his group and one received from [then-postdoc William] Brown in 2007. “Not only were the values changed, they were not even internally consistent,” says Trivers.

    His beef with Nature is that it has taken them this long to retract the paper.

  3. Good for Trivers, a brilliant, if unsung, evolutionary biologist whom Steven Pinker describes as “one of the great thinkers in the history of Western thought”. He wrote the foreword to The Selfish Gene, and Dawkins himself has said the ideas in that book were greatly influenced by Trivers’s research.
    http://edge.org/memberbio/robert_trivers

    1. I thought “Trivers” was a familiar name. I think I’ll have to go and read some of the background now.

      1. You won’t be disappointed. He is a very interesting person. When he was a professor of mine in the late ’70s, he befriended Huey Newton and joined the Black Panther Party. They worked on his theories about deceit and self-deception together.

        1. Having a browse this morning, in preference to actually doing some more work.

          But Brown forgot to change the FA values themselves, he only changed the relative FAs, so his data were internally inconsistent: one could not derive one set of values from the other. It is trivial to program a computer to make both changes at the same time. You insert the [desired] values you want and then have the computer change the [components … so that] you get the false values you have inserted.

          Oh, the number of times that I’ve caught contractors doing this to the data they’re presenting to me for acceptance.

          He either forgot to do this, or as I imagine, never thought his data would be subjected to the scrutiny it was and so did not bother to protect himself against this.

          “Never thought … scrutiny” would normally be my bet. It’s depressingly common these days for data-acquisition computer technicians to be so inadequately trained in the science behind the equipment they operate that they don’t realise one set of results can be derived from other parameters in the same data set … and can therefore be used as a cross-check on the consistency of the data set (a short sketch of such a check follows this comment). There are good reasons for this – understanding the data and interpreting it is a chargeable extra service requiring extra, specially trained personnel. But it is nonetheless disheartening.
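          A minimal sketch of the kind of cross-check I mean, in Python. The record layout and the numbers are hypothetical, and I’m assuming the usual conventions that absolute FA is |L - R| and relative FA is FA divided by mean trait size; it only illustrates how one set of reported values can be recomputed from the raw measurements and compared against what was reported.

              # Hypothetical records: (left measurement, right measurement,
              # reported absolute FA, reported relative FA). All values invented.
              records = [
                  (10.2, 10.0, 0.2, 0.0198),  # internally consistent (within rounding)
                  (10.2, 10.0, 0.2, 0.0900),  # relative FA altered, absolute FA left alone
              ]

              def check_record(left, right, reported_fa, reported_rel_fa, tol=1e-3):
                  """Recompute FA and relative FA from the raw sides and flag mismatches."""
                  expected_fa = abs(left - right)                      # assumed definition: |L - R|
                  expected_rel = expected_fa / ((left + right) / 2.0)  # assumed: FA / mean trait size
                  problems = []
                  if abs(expected_fa - reported_fa) > tol:
                      problems.append(f"absolute FA {reported_fa} != recomputed {expected_fa:.4f}")
                  if abs(expected_rel - reported_rel_fa) > tol:
                      problems.append(f"relative FA {reported_rel_fa} != recomputed {expected_rel:.4f}")
                  return problems

              for i, (l, r, fa, rel) in enumerate(records):
                  for problem in check_record(l, r, fa, rel):
                      print(f"record {i}: {problem}")  # only the tampered record should print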

  4. Dear Jerry,

    How can you not be aware of the vitriol surrounding this issue? It has consumed Bob Trivers for ages, and it is he who urged the retraction of the paper, on which he was listed as a co-author, for alleged manipulation of the data by lead author William Brown. Trivers has written a whole book about the “Fraud” and his “shoddy treatment” by Rutgers, which barred him from campus after his alleged intimidation of co-author Lee Cronk. Here’s a taster:

    http://roberttrivers.com/Robert_Trivers/A_Case_of_Fraud_at_Rutgers.html

    Best,

    Jeremy Taylor.

  5. It is unfortunate that people cannot admit mistakes, malicious or otherwise, without public heartache and generalized scrutiny of the research methods employed by science. “Well, we tried to commit fraud, but we failed. We will try not to let our hypothesis dictate our data in the future.”

    It is just being human and a scientist at the same time…give them and each other a break. In any event, it is not like they are writing an article where our civilization depends on the truth of what they say.

  6. Holy Cow. I am so sorry on behalf of Dr. Trivers regarding his treatment by Rutgers, a university with which I have family connections.
    Is the symmetry-and-sexual-selection idea considered a ‘fad’ in the sense that it may not really be supported by evidence any more? It seemed fine the last time I heard about it.

  7. Well, as a woman, all I can say is that men who cannot dance well, whose moves are clumsy and out of rhythm, are the least sexually attractive men. On the other hand, those men who dance well, whose moves are sensual and flowing, smooth and with the rhythm and music, are the most sexually attractive – whether they are symmetrical in build or not, whether they are good-looking or not. 😉

      1. With sufficient instruction — which, in turn, would depend on one hell of a lot of motivation — I think I might be able to almost dance half as well as Al Gore.

        Maybe.

        b&

    1. You should submit your findings to Nature. I heard a slot for a paper on just this subject recently opened up … :S

      1. Perhaps that’s because you’re looking at the wrong moves and limit yourself to under the covers… In my long and vast experience, the way a man moves on the dance floor accurately reflects his intimate performance, be it under the covers, over the covers, on the beach, in the forest, in the fields, on the kitchen table, on the floor by the fireplace, in the sauna, etc., etc. 😀

        1. You must be luckier than me then. 🙂 I do love to dance too and love watching men (and women) who dance well. It’s a beautiful way of expressing oneself. However, I would not discount a man who had no rhythm. 🙂 I’m most attracted to the largest sex organ… the brain!

  8. “Dance reveals symmetry…”

    So, if the dancers weren’t dancing, the observers wouldn’t notice how symmetrical the bodies of the dancers were?

    That doesn’t seem right. I can spot symmetry without the object in question having to be dancing. In fact, symmetry would be easier to assess if the object wasn’t moving.

    If the point is that people are more sexually attracted to symmetrical bodies, then what does dancing have to do with it?

      1. I think the hypothesis was that when people said they were (subjectively) rating dancers, they were actually rating the dancers’ symmetry. Sort of like how, when people report loudness, they are reporting some function (which can be determined by psychophysics) of the sound intensity.

    1. If I read correctly, men who had already been deemed highly symmetric were then judged to be better dancers than their asymmetric brethren, and dance was supposed to be a stand-in for being more desirable or something.

  9. Grochow took down his entire user page and CV.
    Thankfully some of us are *actually* skilled with computers.
    Here it is: https://www.dropbox.com/s/f1tc6vjeay5sy84/grochowcv.pdf

    What did Grochow do wrong? He lied about being able to detect motion in three-dimensional space. Essentially, motion in three dimensions is “spacetime”, and Grochow was BS’ing the entire time using probability distributions and overwhelming terminology. This has been a big problem in the area of machine learning and statistics: too much greed and the rush to popularize artificial intelligence and computers have led to exploitation by individuals such as Mr. Grochow.

    This guy does not know anything about in-vivo motion, or the physics required to actually do what he was claiming (in many areas). To me, it seems like this paper was what he used to make the connections he enjoyed (like Lazowska), and he may have worked hard to bullshit his way there.

    He disgraces the very idea of mathematics as art.

  10. Well, I have Trivers’s book Deceit and Self-Deception waiting to be read – he obviously knows all about it from the inside! Great that the co-authors have done this – it gives some hope for combatting fraudulent science. I suppose scientists are just as vulnerable to this sort of thing as the rest of us, sadly.

  11. There’s an episode of Star Trek where someone who wants to set up Captain Kirk hides somewhere on the Enterprise. A special type of search is announced, and someone asks, “Does this search technique assume that the person wants to be found?”

    What should a senior scientist who is the last author on a paper assume about his student who is the first author? Why is the former on the paper at all? Did he contribute to the research? Presumably, he had some reason to be on the paper. Should part of the reason be that it is his responsibility to check the results and vouch for them? After all, he can check them more easily than can the referees of the paper.

    We all agree that fraud in science is bad. We all agree that retraction is good. We all agree that it is better if it doesn’t happen in the first place. Whose responsibility is it to nip it in the bud?

    Yes, Trivers did ask for the retraction. The question (and none of the questions here are rhetorical) is whether he should have prevented this at an earlier stage.

      1. No, not immunity. However, if one puts one’s name on a paper, one is in some sense responsible for it. The question (and it is not a rhetorical question) is whether such a person should assume that his co-authors are all completely honest. Actually, the problem is deeper. It doesn’t matter why the results are wrong: fraud, incompetence, honest mistake, whatever. When one puts one’s name on a paper, one is responsible for its conclusions.

        Again, Trivers did the right thing, against considerable opposition, once the fraud became clear. My question, and it is not a rhetorical question, is how much checking of the results one can expect from someone whose name is on a paper as a result of his supervisor status?

        1. Well, if you collaborate with someone, and are not standing over them as they do the work or record their data, there’s precious little you can do to guard against fraud. One has to simply trust in the person, and in the data that you look at before you publish it. So one is “responsible” for fraudulent work as a co-author in the sense that if you find out the work was fabricated, or data were manipulated, you are responsible for retracting the paper and calling out the miscreants. But you’re NOT necessarily responsible for perpetrating the fraud itself.

          These days, when papers frequently involve collaboration among multiple people, it’s hard to guard against one bad apple in the lot. My view (and I haven’t engaged in many collaborations on data papers) is that you must look at the data and analyses of the coauthors and see if there’s anything fishy. And, thank Ceiling Cat, I’ve had honest collaborators. But there’s no way to check if people are making up or tweaking the data one publishes.

          One guard against this is replication: can others duplicate your findings? But this is very uncommon in ecology, evolution, or, in Trivers’s case, behavioral ecology.

        2. So we agree that Trivers did the right thing when he found out. So you’re looking to convict him for not finding out as soon as you would prefer.

          If you have evidence that he could have known earlier than he did but avoided the facts, or that he did in fact know earlier and tried to cover up, then that would be relevant. Otherwise I think you’re demanding clairvoyance.

          1. No, I’m not looking to convict anyone. The question is, should a supervisor who is on a paper written by a student convince himself before publication that the data are correct, or is this asking too much?

          2. It is asking too much for lead researchers (or anyone) to never be wrong or never be misled. Do you hold yourself to that impossible standard?

          3. I agree with you, Phillip. If I’m reading the articles correctly, the data revealing the discrepancy between the reported values and their untampered-with related values (a discrepancy that was only found post facto) were all available prior to publication.

            That said, there are a lot of other factors involved here, such as simple tradition (major professors are usually automatically co-authors, in my experience), interpersonal trust, etc. The alleged miscreant here was apparently Trivers’s own post-doc; perhaps their relationship had seemed fine till now, and there was no reason to doubt one’s colleague’s veracity.

            But whatever extenuating factors there may have been, I think Trivers’s accounts of the fraud would sound better if they began with something along the lines of, “Admittedly, as a co-author and as Brown’s major advisor, I did have the chance to uncover this matter sooner, and now I certainly wish I had done so.”

        3. I have been in science (cell biology) long enough to have heard of several cases of “dodgy results” that somehow could not be replicated after the brilliant postdoc or student left the lab.

          As Jerry says, short of literally standing next to somebody every step of the way, there is nothing – nothing – that you can do to prevent being duped.

          All it takes (and the majority of cases I have heard of involved this “technique”) is e.g. either unequal loading of a gel or changing samples when loading. The latter is pretty much untraceable and the only way to prevent it is to have every single gel run independently by at least two people – quite unfeasible with current pressure to publish.

          So there, that’s your answer. Apart from that, my personal opinion is that you are bordering on concern-trolling.

  12. The paper whose methods were used to produce the results in the (now) retracted article is available here:
    http://grail.cs.washington.edu/projects/styleik/styleik.pdf

    BTW, if you guys look at Zoran Popovic’s site: http://homes.cs.washington.edu/~zoran/

    I would not at all be surprised if Zoran Popovic was a larger beneficiary, because he was a senior author. If you look on the right-hand side of his page, he keeps bragging about how some of his group gets published in Nature Biotech.

    The worst part is that this guy isn’t even good at math/statistics!! He’s just a run-of-the-mill games programmer who *thinks* he knows math.

    I was wondering why Nature was so resistant to my article (even when it was being reconsidered) earlier this year, but it turns out that their trust in Mr. Popovic/Grochow has led to a cascade of publications (from these individuals) that must be questioned.

    The sole issue is that these individuals have built their entire CS careers on this publication and parlayed it into other opportunities. However, if their initial work was fraudulent (it is, because I’ve essentially demonstrated the opposite of what they were claiming, which is to deduce the “motion contour” in spacetime using fMRI data), I am quite sure it was these individuals who are responsible for the abuse of computing science and mathematics.

    To be honest, they are very good at bullshitting about how they used the methods for “style-based inverse kinematics”. I would not at all be surprised if this trend was furthered in their later publications, as referees are less likely to reprimand an individual who was a co-author on what was perceived to be a groundbreaking study.

    Being a proud mathematician/machine-learning individual, I was always baffled as to how these stupid “parameterized models” were still popular ten years after they failed to show empirical applications; I didn’t know of Grochow’s paper at the time, and yesterday’s retraction is like a huge bombshell for our community.

    These people have not only set back our community by a decade, they have also inflated and contaminated many of its members. Thankfully the “top people” (Christopher Bishop, Neil Lawrence, Alex Smola, Sam Roweis, Lawrence Saul, John Hopfield, Rich Sutton, to name a few) are mostly unaffected. However, I can’t help but share their sentiment of feeling cheated by the individuals who perpetrated this act.

    The good thing is that our community will be much more careful about what publications are accepted at conferences, and I anticipate we will also be downsizing significantly, so that the remaining constituents will *actually* have demonstrated a fluid ability in mathematics and computing science.

  13. Sorry for the additional post, but dear gosh, these select individuals may have contaminated the credibility of the entire UW computer science dept!!!

    The more I look at Popovic’s publications, the more concern I have… it seems like he has built a huge house of cards.

    He seems to be the “boss” of AI and games, but if you look closer, it seems that almost all of his publications focus on “gaming”. It’d be easier to sneak fraudulent applications past such journals/conferences because, again, as I’ve stated a few times (this must be stressed), the level of mathematical knowledge required to have a legitimate empirical application of “Artificial Intelligence” is nontrivial.

    The bigger issue is that our mathematics (functional analysis) admits an abstract representation, which is why it is so powerful. You can rigorously measure phenomena if you truly understand how to interface abstract ideas with mathematical techniques. But again, you can also easily bullshit and get away with it… this is so disappointing.

    Given how much Popovic has published, I do not think Lazowska was *unaware* of what these people were doing. It is simply too difficult for me to believe that.

    Somewhere out there, I bet Bill (Gates) is quite disappointed.

    1. Does anyone know of a social science paper in a high-impact journal that hasn’t been retracted yet? What a mess!
