The ideologues: why we can’t use statistics any more

November 27, 2022 • 10:00 am

I could go on and on about the errors and misconceptions of the paper from Nautilus below, whose aims are threefold. First, to convince us that several of the founders of modern statistics, including Francis Galton, Karl Pearson, and Ronald Fisher, were racists. Second, to argue that the statistical tests they made famous, which are used widely in research (including biomedical research), were developed as tools to promote racism and eugenics. Third, to urge that we stop using statistical analyses like chi-squared tests, Fisher exact tests, analyses of variance, t-tests, or even fitting data to normal distributions, because these exercises are tainted by racism. I and others have argued that the first claim is overblown, and I’ll argue here that the second is wrong and the third is insane: it would not even follow from the first two claims if they were true.

Click on the screenshot to read the Nautilus paper. The author, Aubrey Clayton, is identified in the piece as “a mathematician living in Boston and the author of the forthcoming book Bernoulli’s Fallacy.”

The first thing to realize is that yes, people like Pearson, Fisher, and Galton made racist and classist statements that would be deemed unacceptable today. The second is that they conceived of “eugenics” not as a form of racial slaughter, like Hitler’s, but as a program of encouraging the white “upper classes” (whom they assumed had “better genes”) to have more children while discouraging the breeding of the white “lower classes.” But none of their writing on eugenics (which was not the dominant interest of any of the three) had any influence on eugenic practice, since Britain never practiced eugenics. Clayton desperately tries to forge a connection between the Brits and Hitler via an American (the racist Madison Grant) who, he says, was influenced by the Brits and who himself influenced Hitler, but the connection is tenuous. Nevertheless, this photo appears in the article. (Isn’t there some law about dragging Hitler into every discussion as a way to make your strongest point?)

My friend Luana suggested that I use this children’s book to illustrate Clayton’s  point:

As the email and paper I cite below show, Clayton is also wrong in arguing that the statistical methods devised by Pearson, Galton, and especially Fisher were created to further their eugenic aspirations. In fact, Clayton admits this for several tests (bolding is mine).

One of the first theoretical problems Pearson attempted to solve concerned the bimodal distributions that Quetelet and Galton had worried about, leading to the original examples of significance testing. Toward the end of the 19th century, as scientists began collecting more data to better understand the process of evolution, such distributions began to crop up more often. Some particularly unusual measurements of crab shells collected by Weldon inspired Pearson to wonder, exactly how could one decide whether observations were normally distributed?

Before Pearson, the best anyone could do was to assemble the results in a histogram and see whether it looked approximately like a bell curve. Pearson’s analysis led him to his now-famous chi-squared test, using a measure called χ² to represent a “distance” between the empirical results and the theoretical distribution. High values, meaning a lot of deviation, were unlikely to occur by chance if the theory were correct, with probabilities Pearson computed. This formed the basic three-part template of a significance test as we now understand it. . .
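For concreteness, the goodness-of-fit recipe described in that passage can be sketched in a few lines of Python. This is a minimal illustration, not anything from the article: the sample, bin edges, and seed are arbitrary choices of mine, and `numpy` and `scipy` are assumed to be available.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=1000)  # toy sample

# Bin the observations, as Pearson did with Weldon's crab measurements
edges = np.linspace(-4.0, 4.0, 9)  # 8 bins
observed, _ = np.histogram(data, bins=edges)

# Expected counts under a normal distribution fitted to the sample
probs = np.diff(stats.norm.cdf(edges, loc=data.mean(), scale=data.std(ddof=1)))
probs /= probs.sum()  # renormalize over the binned range
expected = observed.sum() * probs

# Pearson's chi-squared "distance" between data and theory
chi2 = np.sum((observed - expected) ** 2 / expected)

# Degrees of freedom: bins minus 1, minus the 2 fitted parameters
p_value = stats.chi2.sf(chi2, df=len(observed) - 3)
print(f"chi-squared = {chi2:.2f}, p = {p_value:.3f}")
```

The three parts of the template are all here: a measure of deviation (χ²), its distribution under the hypothesized model, and the probability of a deviation at least that large arising by chance.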

If the chi-squared test was developed to foster eugenics, it was the eugenics of crabs! But Clayton manages to connect the crab study to eugenics:

Applying his tests led Pearson to conclude that several datasets like Weldon’s crab measurements were not truly normal. Racial differences, however, were his main interest from the beginning. Pearson’s statistical work was inseparable from his advocacy for eugenics. One of his first example calculations concerned a set of skull measurements taken from graves of the Reihengräber culture of Southern Germany in the fifth to seventh centuries. Pearson argued that an asymmetry in the distribution of the skulls signified the presence of two races of people. That skull measurements could indicate differences between races, and by extension differences in intelligence or character, was axiomatic to eugenicist thinking. Establishing the differences in a way that appeared scientific was a powerful step toward arguing for racial superiority.

How many dubious inferential leaps does that paragraph make? I count at least four. But I must pass on to other assertions.

Ronald Fisher gets the brunt of Clayton’s ire because, says Clayton, Fisher developed his many famous statistical tests (including analysis of variance, the Fisher exact test, and so on) to answer eugenic questions. This is not true. Fisher espoused the British classist view of eugenics, but he developed his statistical tests for other reasons, even if he sometimes applied them to eugenic questions. In fact, the Society for the Study of Evolution (SSE), when deciding to rename its Fisher Prize for graduate-student accomplishment, says that the order “eugenics → statistical tests” is reversed:

Alongside his work integrating principles of Mendelian inheritance with processes of evolutionary change in populations and applying these advances in agriculture, Fisher established key aspects of theory and practice of statistics.

Fisher, along with other geneticists of the time, extended these ideas to human populations and strongly promoted eugenic policies—selectively favoring reproduction of people of accomplishment and societal stature, with the objective of genetically “improving” human societies.

In this temporal ordering, which happens to be correct (see below), the statistics are not tainted by eugenics and thus don’t have to be thrown overboard. As I reported in a post last year, several of us wrote a letter to the SSE trying to correct its misconceptions (see here for the letter, which also corrects misconceptions about Fisher’s racism), but the SSE politely rejected it.

Towards the end of his article, Clayton calls for eliminating the use of these “racist” statistics, though they’ve saved many lives since they’re used in medical trials, and have also been instrumental in helping scientists in many other areas understand the universe. Clayton manages to dig up a few extremists who also call for eliminating the use of statistics and “significance levels” (the latter issue could, in truth, be debated), but there is nothing that can replace the statistics developed by Galton, Pearson, and Fisher. I’ll give two quotes showing that, in the end, Clayton is a social-justice crank who thinks that objectivity is overrated. Bolding is mine:

Nathaniel Joselson is a data scientist in healthcare technology, whose experiences studying statistics in Cape Town, South Africa, during protests over a statue of colonial figure Cecil John Rhodes led him to build the website “Meditations on Inclusive Statistics.” He argues that statistics is overdue for a “decolonization,” to address the eugenicist legacy of Galton, Pearson, and Fisher that he says is still causing damage, most conspicuously in criminal justice and education. “Objectivity is extremely overrated,” he told me. “What the future of science needs is a democratization of the analysis process and generation of analysis,” and that what scientists need to do most is “hear what people that know about this stuff have been saying for a long time. Just because you haven’t measured something doesn’t mean that it’s not there. Often, you can see it with your eyes, and that’s good enough.”

Statistics, my dear Joselson, was developed precisely because what “we see with our eyes” may be deceptive, for what we often see with our eyes is what we want to see with our eyes. It’s called “ascertainment bias.”  How do Joselson and Clayton propose to judge the likelihood that a drug really does cure a disease? Through “lived experience”?

It goes on. Read and weep (or laugh):

To get rid of the stain of eugenics, in addition to repairing the logic of its methods, statistics needs to free itself from the ideal of being perfectly objective. It can start with issues like dismantling its eugenicist monuments and addressing its own diversity problems. Surveys have consistently shown that among U.S. resident students at every level, Black/African-American and Hispanic/Latinx people are severely underrepresented in statistics.

. . . Addressing the legacy of eugenics in statistics will require asking many such difficult questions. Pretending to answer them under a veil of objectivity serves to dehumanize our colleagues, in the same way the dehumanizing rhetoric of eugenics facilitated discriminatory practices like forced sterilization and marriage prohibitions. Both rely on distancing oneself from the people affected and thinking of them as “other,” to rob them of agency and silence their protests.

How an academic community views itself is a useful test case for how it will view the world. Statistics, steeped as it is in esoteric mathematical terminology, may sometimes appear purely theoretical. But the truth is that statistics is closer to the humanities than it would like to admit. The struggles in the humanities over whose voices are heard and the power dynamics inherent in academic discourse have often been destructive, and progress hard-won. Now that fight may have been brought to the doorstep of statistics.

In the 1972 book Social Sciences as Sorcery, Stanislav Andreski argued that, in their search for objectivity, researchers had settled for a cheap version of it, hiding behind statistical methods as “quantitative camouflage.” Instead, we should strive for the moral objectivity we need to simultaneously live in the world and study it. “The ideal of objectivity,” Andreski wrote, “requires much more than an adherence to the technical rules of verification, or recourse to recondite unemotive terminology: namely, a moral commitment to justice—the will to be fair to people and institutions, to avoid the temptations of wishful and venomous thinking, and the courage to resist threats and enticements.”

The last paragraph is really telling, for it says one cannot be “objective” without adhering to the same “moral commitment to justice” as does the author. That is nonsense. Objectivity is the refusal to take an a priori viewpoint based on your political, moral, or ideological commitments, not an explicit adherence to those commitments.

But enough; I could go on forever, and my patience, and yours, is limited. I will quote two other scientists.

The first is A. W. F. Edwards, a well known British geneticist, statistician, and evolutionary biologist. He was also a student of Fisher’s, and has defended him against calumny like Clayton’s. But read the following article for yourself (it isn’t published, for it was written for his College at Cambridge, which was itself contemplating removing memorials to Fisher). I’ll be glad to send the pdf to any reader who wants it:

Here’s the abstract, but do read the paper, available on request:

In June 2020 Gonville and Caius College in Cambridge issued a press announcement that its College Council had decided to ‘take down’ the stained-glass window which had been placed in its Hall in 1989 ready for the centenary of Sir Ronald Fisher the following year. The window depicted the colourful Latin-Square pattern from the jacket of Fisher’s 1935 book The Design of Experiments. The window was one of a matching pair, the other commemorating John Venn with the famous three-set ‘Venn diagram’, each window requiring seven colours which were the same in both (Edwards, 2002; 2014a). One of the arguments advanced for this action was Fisher’s interest in eugenics which ‘stimulated his interest in both statistics and genetics’*.

In this paper I challenge the claim by examining the actual sequence of events beginning with 1909, the year in which Fisher entered Gonville and Caius College. I show that the historians of science who promoted the claim paid inadequate attention to Fisher’s actual studies in statistics as part of his mathematical education which were quite sufficient to launch him on his path-breaking statistical career; they showed a limited understanding of the magnitude of Fisher’s early achievements in theoretical statistics and experimental design, which themselves had no connection with eugenics. Secondly, I show that Fisher’s knowledge of natural selection and Mendelism antedated his involvement in eugenics; and finally I stress that the portmanteau word ‘eugenics’ originally included early human genetics and was the subject from which modern human and medical genetics grew.

Finally, I sent the article to another colleague with statistical and historical expertise, and he/she wrote the following, quoted with permission:

There is an authoritative history of statistics by Stephen Stigler of the UoC. There’s also an excellent biography of Galton by Michael Bulmer. Daniel Kevles’s book is still the best account of the history of eugenics, and he gives a very good account of how it developed into human genetics, largely due to Weinberg, Fisher and Haldane. Genetic counselling is in fact a form of eugenics, and only religious bigots are against it. Eugenics has become a dirty word, associated with Nazism and other forms of racism.

According to Stigler, many early developments, like the normal distribution and least squares estimation, were developed by astronomers and physicists such as Gauss and Laplace in order to deal with measurement error. Galton invented the term ‘regression’ when investigating the relations between parent and offspring, but did not use the commonly used least squares method of estimation, although this had been introduced much earlier by Legendre. Galton consistently advocated research into heredity rather than applied eugenics, undoubtedly because he felt a firm scientific base was needed as a foundation for eugenics.

Like Fisher, Galton and Pearson were interested in ‘improving the stock’, which had nothing to do with racial differences;  even Marxists like Muller and Haldane were advocates of positive eugenics of this kind. I think there are many arguments against positive eugenics, but it is misguided to make out that it is inherently evil in the same way as Nazism and white supremacism.

No doubt Galton and Pearson held racist views, but these were widespread at the time, and had nothing to do with the eugenics movement in the UK; in fact, the Eugenics Society published a denunciation of Nazi eugenics laws in 1933 and explicitly dissociated eugenics from racism. People are confused about this, because the word ‘race’ was then widely used in a very loose sense to refer to what we would now call a population (Churchill used to refer to the ‘English race’: he was himself half American).

Fisher’s work in statistics was very broadly based and not primarily motivated by genetics; he discovered the distribution of t as a result of correspondence with the statistician W. S. Gosset at Guinness’s brewery in Dublin, and his major contributions to experimental design and ANOVA were made in connection with agricultural research at the Rothamsted experimental station (which has renamed its ‘Fisher Court’ as ‘ANOVA Court’). Maybe everyone should give up drinking Guinness and eating cereal products, since they are allegedly contaminated in this way.


63 thoughts on “The ideologues: why we can’t use statistics any more”

  1. As to the question of statistics, from a pedagogical perspective, I’ve heard that most secondary school systems build their math programs to lead everyone towards calculus as the ‘goal’ of the program, but that we should build towards statistics instead, which has many more real-world implications for most people -and I agree.

    Isn’t there some law about dragging Hitler into every discussion as a way to make your strongest point?)

    Godwin’s law […] is an Internet adage asserting that as an online discussion grows longer (regardless of topic or scope), the probability of a comparison to Nazis or Adolf Hitler approaches 1. -Wiki

    encouraging the white “upper classes” (whom they assumed, probably wrongly, had better genes)

    “Probably” wrongly? I would have simply said that “better genes” is inherently and objectively meaningless. “Probably” opens you up to … unfortunate implications.

    1. Wrt Godwin’s law, I hope Jerry might some time read and give his thoughts about this July article (doi:10.5061/dryad.bvq83bkb1) in which the authors link 21st century evolutionary biology to eugenics and Nazis via Fisher. Unlike other efforts that use mere allusion, this one includes the word “Nazi” (in The American Naturalist, in 2022!).

    2. This is exactly right about Calculus, which I have taught to 500+ university students. Statistics (and probability) would be far more useful. I think this is being promoted by public intellectuals more and more.

  2. What a mess. Yes, there are problems with statistical methods, notably significance testing in the era of big data (if you make enough comparisons, some will be significant by chance alone), but inherent racism is not one of them. And regarding the supposedly unholy triumvirate of Galton, Pearson and Fisher, I actually included Galton’s parent-offspring analysis of human height in Genetics: Analysis of Genes and Genomes 9th ed. because a) it demonstrates the power of that approach, and b) it was surprisingly accurate (although no one at the time realized that over 11,000 variants contribute to heritability of the trait). And I would hate to think of the status of agricultural genetics and the quality of food on our tables without Fisher’s quantitative methods. I wish Will Provine were still alive to address all of this.

      1. Clayton addresses the Bonferroni correction in the book, along with other methods of improving statistical analysis.

        Publishing the Nautilus article was not in the best interest of advancing what’s truly important in the book.
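The multiple-comparisons problem raised in this thread is easy to demonstrate. A minimal sketch (the sample sizes, number of tests, and seed are arbitrary choices of mine): run many two-sample t-tests in which the null hypothesis is true by construction, and count how many come out “significant.”

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_tests = 1000

# 1000 comparisons in which the null hypothesis is true by construction:
# both samples are drawn from the same standard normal distribution.
pvals = np.array([
    stats.ttest_ind(rng.normal(size=30), rng.normal(size=30)).pvalue
    for _ in range(n_tests)
])

naive_hits = int(np.sum(pvals < 0.05))                 # ~50 expected by chance
bonferroni_hits = int(np.sum(pvals < 0.05 / n_tests))  # corrected threshold
print(f"naive: {naive_hits}, Bonferroni-corrected: {bonferroni_hits}")
```

With no real effects anywhere, roughly 5% of the naive tests still cross p < 0.05; the Bonferroni correction mentioned above divides the significance threshold by the number of tests and eliminates nearly all of these false positives.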

  3. There is much misuse and abuse of inferential statistics, but this anti-rationalist pseudomoralist critique won’t help.

    PCC(E) writes: “In this temporal ordering, which happens to be correct (see below), the statistics are not tainted by eugenics and thus don’t have to be thrown overboard.”
    I’d go further than that. Even if inferential statistics had been invented expressly and solely to further human eugenics, it would not taint the methods as such, if they were now put to better uses. It’s like saying knives are evil because you can kill people with them. Also, the moralist writers are very probably practical eugenicists themselves. Presumably, if they were in a position to use a sperm donation, they would prefer a high-SES male’s sperm to that of a high-school-dropout prison inmate, and when they become pregnant a majority of professed anti-eugenicists do all the tests now in use, with the express aim of aborting a fetus with a major genetic disorder.

    1. >>…the statistics are not tainted by eugenics and thus don’t have to be thrown overboard.

      Does Clayton ever claim that statistics should be thrown overboard? Can someone please point out where?

        1. Carl, it is obvious that “the statistics” refers to the methods developed by Fisher, Pearson and Galton specifically, which the author claims are “What we now understand as statistics…”. He may have written a really good and balanced book, but this article is neither, so we can only conclude that either he didn’t write the article (unlikely), or the book is not that balanced, or he knows who butters his bread and is not allergic to some virtue signaling. As to quotes, how’s this: “Statistical thinking and eugenicist thinking are, in fact, deeply intertwined, and many of the theoretical problems with methods like significance testing—first developed to identify racial differences—are remnants of their original purpose, to support eugenics.”

        1. The last quote above is a concise masterwork of … whatever this genre is … it reminds me of a sort of sci-fi from the 60s – where a fanciful elaboration springs from a scientific finding – I’m thinking of the “Damn you all to Hell!” part of Planet of the Apes… maybe I should read it…

    2. Classical statistics, which was invented to improve beer, was built on the foundation of classical probability, which was invented to improve gambling. Depending on how you look at it, the field either has a rather playful history or it is sinful to its core.

  4. Even if statistics had indeed been developed entirely as a means of furthering eugenics and racist ideas, that fact would have zero relevance for whether statistics is valid. And if it’s valid we have to use it, if we want to get things right.

  5. Want to note a Sagan quote/idea from Demon Haunted World – that of _knowledgeably_ questioning authority.

    I don’t have the accurate quote handy in haste, but that’s the idea – which the author fails to accomplish.

    1. Maybe this one.
      “I have a foreboding of an America in my children’s or grandchildren’s time – […] when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness.”

      1. The problem now is that the “elites” are slipping into darkness, or pretend to be, to be better able to discredit dissidents who then can be painted as heretics and profaners of the holy DEI, of which the elites are the priesthood, without a rational discussion of the issues.

      2. OK, found it:

        Page 25
        Year : 1996

        It’s particularly aggravating because I _just_the_other_day_ made an effort to nail this down – I actually emailed PCC(E) in the process!

        I’ll have to get it into ASCII format – plain text.

        1. I keep a file of pithy quotes and links to things that I can search. It’s not that well organized, but neither am I.

    2. Not the Sagan quote you’re looking for, Thyroid, but an interesting one anyway:

      Society corrupts the best of us. It is a little unfair, I think, to criticize a person for not sharing the enlightenment of a later epoch, but it is also profoundly saddening that such prejudices were so extremely pervasive. The question raises nagging uncertainties about which of the conventional truths of our own age will be considered unforgivable bigotry by the next.

      Chapter 1, “Broca’s Brain” (p. 11)

    3. Christian apologist C.S. Lewis made a related point in his book The Screwtape Letters, with advice from a senior devil to a junior devil about insulating their human “patients” against wisdom that might reach them from previous generations:

      “… since we cannot deceive the whole human race all the time, it is most important thus to cut every generation off from all others; for where learning makes a free commerce between the ages there is always the danger that the characteristic errors of one may be corrected by the characteristic truths of another.”

  6. From the Babynames site:

    “The name Aubrey is primarily a gender-neutral name of English origin that means Noble Ruler.”

    …so any article written by Aubrey Clayton suffers from the taint of colonialism and elitism and may safely be disregarded. Isn’t that how criticism by contagion is meant to work?

  7. It’s hard to read this nonsense. Sadly, it will take a great deal of effort to refute these anti-intellectual claims, effort that could have gone into solving real problems in the world.

    As I say often (really), statistics protect us from ourselves—from our proclivity to accept our prejudices as truth. Presumably the critics know this and so, by demonizing statistics, are purposely attempting to clear the way for their prejudices to prevail.

    1. We can replace the Statistics coursework for a maths or science degree with courses called “Feelings and Hunches 101”, “Gut Instincts 210”, “Personal Truths” and “BIPOC Intuition” at the graduate level. No tests, of course; Tw@tter posts for your research project, showing your working by displaying your “adult” coloring pages, and building a shoebox diorama for your PhD thesis.

      1. Thesis? I am sure people will soon begin identifying as Doctors of Philosophy. People have been identifying as Reverends for years.

      2. A few months back I downloaded Robin DiAngelo’s PhD thesis, and tried reading a few pages. I honestly got so disgusted by the logical fallacies and enormous gaps in inferential reasoning she displayed that I just . couldn’t . go . any . further. It was infuriating. It could have been written by a bright but misguided high school student, yet she emerged with her shiny new PhD, completely vindicated in all her Freudish depth psychologizing about the “white mind”.

        She violates a basic premise of the “critical-” theories: standpoint epistemology, wherein only a person themself has privileged access to their own thoughts and experiences, but that doesn’t seem to bother her. She’s white; she knows all about white minds. And with her book sales and lectures, she’s undoubtedly set for life. Nice work if you can get it.

    1. Onion, Nautilus – the objects both exhibit a mathematical elegance, perhaps self-referential… to delightful effect…

  8. As to dragging Hitler in to make a point, you may be thinking of Godwin’s Law. It goes something like the longer an argument continues the more likely someone will mention Hitler or the Nazis to make their point.
    I tend bar and use it to break up silly arguments by saying “The first person to reference Hitler automatically loses”. It’s really quite handy.

  9. “Objectivity is extremely overrated… Just because you haven’t measured something doesn’t mean that it’s not there. Often, you can see it with your eyes, and that’s good enough.”

    “I prayed to God to help me find my wallet, and then I found my wallet. That proves God answers prayers. That’s good enough!”

    “I can see with my own eyes that the earth is flat, and that’s good enough.”

    “I made this plot of RNA-seq data, and then I squinted and looked at it sideways, and I saw with my eyes that changes in neurotransmitter expression cause lung cancer. Good enough! Nature paper, here I come!”

    Jesus Hermaphrodite Christ, what a load of anti-scientific garbage this quote is.

  10. If the cornerstone results of a field so inarguably useful as statistics initially came from eugenics, wouldn’t that in fact argue that eugenics is a useful field, if only for its spinoff contributions?

    1. I’d go further and say that eugenics was never really discredited. Its application had been tainted by racism and a poor understanding of genetics, but I don’t think that it’s ever been shown why criticisms of eugenics wouldn’t also apply to natural selection. From what I’ve seen, arguments against eugenics always seem to devolve into rhetoric about “who is to decide which genes are better.”

      1. It’s a touchy subject to be sure but who on the left is against birth control, abortion, or preventing the impregnation of people with severe intellectual disabilities? All of these are key parts of eugenics, as well as being quite personal, often heartbreaking, decisions.

      2. The only substantive difference between eugenics and natural selection (or indeed, the artificial selection practised by livestock breeders for centuries to millennia) lies in the presence of intention in those carrying out the eugenics (and indeed, artificial selection). The outcomes, from the marvels of a butterfly’s wing to the horrors of a battery chicken, are distinct from the intentions behind the process. Natural selection has no intention behind it, just the blind mindless operation of probability.
        Some people find that unsettling.

  11. This paper is a good new history of the dispute between Weldon and Bateson, including how Fisher (a generation later) resolved their dispute by knitting them together. Key quote:

    “Celebrated across the world as a major turning point in the young science of genetics, [Fisher’s 1918 paper] formally reconciled mendelian and biometric approaches to inheritance by introducing the concepts of variance and polygenicity.” Weldon was a mathematician and adopted the biometric understanding of inheritance based on statistics; Bateson was an experimentalist and adopted the mendelian understanding of inheritance based on observed effects of rare mutations. Like chocolate and peanut butter, they belonged together but stayed apart through a series of professional and personal conflicts until Fisher reconciled their views. He didn’t invent a new candy though.

    Davis argues that

    “Mendelian genetics [but not biometrics] caught on early and spread rapidly in the US, in part due to promotion by eugenicists who used genetic determinism to mobilize a racist and classist agenda that permeated US genetics and governmental policy until well after World War II.”

    I think that’s an empty claim (American racists would have used biometrics or any other understanding of inheritance to make bad claims about the superiority of white people). But her historical treatment of the ideas and their dependence on the personalities of Weldon and Pearson is interesting.

  12. One of the most dangerous ideas to come out of the far progressive left.
    What does the sweating professor really want as a replacement of statistics? What tools do we have, other than facts and statistics, to speak truth to power? If we are left to use only appeals to emotion and anecdotes, we will find that we are badly out-shouted and out-gunned and out-spent by those who oppose social and political progress.

    1. Mark, check out the Lea Davis paper I posted. Her argument is that what we really need is for researchers to put down their own egos and instead adopt “open-hearted curiosity”. A sort of kumbaya for STEM.

    2. >> What does the sweating professor really want as a replacement of statistics?

      Clayton does not argue for rejecting or replacing all statistics – only rejecting particular statistical methods that are prone to errors in favor of ones that aren’t. Bernoulli’s Fallacy details how Bayesian analysis should be used in place of older methods. Many assuredly non-woke and better-known scientists and intellectuals are banging the same drum – Steven Pinker, for example.

        1. No, thanks for pointing that out. I should have written “are less so” where I had “aren’t.” It also depends on the situation, where often the older methods yield identical results.

          In many cases, information in addition to sampling frequency makes Bayesian analysis the far better choice.
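For readers wondering what the Bayesian machinery discussed in this thread looks like in the simplest possible case, here is a hedged sketch. The coin-flip numbers are invented for illustration; this is a generic beta-binomial example of my own, not anything taken from Clayton’s book.

```python
from scipy import stats

heads, flips = 14, 20  # hypothetical data: 14 heads in 20 flips

# Frequentist route: two-sided p-value for the null that the coin is fair
p_value = stats.binomtest(heads, flips, p=0.5).pvalue

# Bayesian route: with a uniform Beta(1, 1) prior, the posterior for the
# heads-probability after seeing the data is Beta(1 + heads, 1 + tails)
posterior = stats.beta(1 + heads, 1 + flips - heads)
prob_biased = 1.0 - posterior.cdf(0.5)  # posterior P(coin favors heads)

print(f"p-value = {p_value:.3f}, P(bias toward heads | data) = {prob_biased:.3f}")
```

The frequentist output is a statement about the data given the null hypothesis; the Bayesian output is a direct probability statement about the hypothesis itself, which is the shift in emphasis Bernoulli’s Fallacy argues for.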

  13. Has anyone else here actually read the book Bernoulli’s Fallacy? I have. Aubrey Clayton, the author of both the book and the Nautilus piece under discussion, chose to tell the story of statistics alongside the story of eugenics. The statistics part is very interesting and important. I found that the already familiar eugenics parts distracted from the main focus of the book, which among other things helps explain the “replication crisis”: bad statistical methods are often to blame.

    Contrary to the thrust of Prof. Coyne’s analysis, Clayton’s reason for rejecting the statistical methods developed by Fisher and Pearson is not that they were racists, but that better methods are available. It’s a long argument exposing entrenched methodological errors (significance testing, for example) and advocating Bayesian analysis – Newtonian vs. Einsteinian statistics, if you will.

    1. If he’s making the argument that some statistical methods would be better superseded by others, then he should make the argument on scientific grounds without all the wokeness.

      Who originated the methods really is irrelevant to which are best.

      1. Clayton’s wokeness is greatly and unfairly exaggerated:

        I want to make clear what exactly I hope to accomplish by telling the stories of statistics and the eugenics movement at the same time. It may seem that in doing so I am unfairly judging people who lived over a century ago by modern standards of inclusiveness and egalitarianism. But that’s not my goal. It’s likely any intellectual of that time and place held views that would sound abhorrent to present-day ears, and my intention is not to dismiss the work of these statisticians simply because they were also eugenicists. It would be impossible to study the great works of history without engaging with authors who were not “pure” by our standards. Meanwhile, if we ignored that intellectual context and focused only on their abstract ideas instead, we would sacrifice valuable understanding. As Pearson himself once wrote, “It is impossible to understand a man’s work unless you understand something of his character and unless you understand something of his environment. And his environment means the state of affairs social and political of his own age.”

        Clayton, Aubrey. Bernoulli’s Fallacy (pp. 14-15). Columbia University Press. Kindle Edition.

        1. I’m sorry but I’m going to disagree vehemently about what you said above. I did not GREATLY AND UNFAIRLY exaggerate Clayton’s wokeness, for that wokeness is precisely what dominates the article that I analyzed. Where is the unfairness, then? It sounds like the book is very different from the article, with the latter making a far more woke argument. I doubt that you can disagree with that. But you can retract your conclusion that I was “unfair” about the article.

          1. Sorry to offend. My intention was to convey my view, having read both the book and your piece, that the book is well worth reading. I did not mean to suggest you were doing something underhanded or dishonorable, only that with fuller knowledge people might form a different conclusion about Clayton’s work.

            1. Sorry, but you keep touting the book, and yet what is under discussion is the article, which is dire, woke, and full of ludicrous assertions. You haven’t addressed Jason’s comment, and thus fail to deal with the huge dichotomy between this piece and what you ASSERT is in the book. Since the article was written to get people to read the book, something very weird is going on, something that you completely ignore.

              Well, you’ve touted the book several times already and told the readers several times it’s different from the piece. Although you’ve ignored Jason’s point, you’ve made your point–repeatedly. There’s no need to say it again. Readers can decide, based on what you said and what is in this article, whether they want to proceed to read the book.

    2. Look, I didn’t read the book, but I did read the article, so if he disses statistics in the book, it’s not for the reason he emphasizes in the article. So don’t go “contrary to the thrust of Prof. Coyne’s analysis . . .”

      1. I assumed you hadn’t read the book and am not faulting you for that. I’m sharing my opinion that what is written in the book is contrary to the thrust of your article. The book isn’t egregiously woke and has a good exposition of methodological controversies. Nowhere does Clayton say the work of Pearson or Fisher should be discounted on the basis of their eugenics views, only on accuracy grounds. I hope people will read it despite getting a bad impression from Nautilus.

  14. Thinking of Hitler and the Nazis, it’s worth pointing out that much of their science has shaped the world today as well, for better and worse.

    Just because something may have been used for evil in the past, and is tainted by it, doesn’t mean it isn’t legitimate science.

  15. In regard to the discussion under comment #11, a website at MIT has already answered the question: there will be a new higher academic degree, Doctor of Equity. Look it up at

    As for the inherent racism, colonialism, heterosexism, and other offenses of the normal distribution, perhaps we need to reconsider the Central Limit Theorem, one of the foundations of probability theory. It says that if random samples are drawn from any distribution with finite variance, of any shape, the distribution of the sample means will approximate the normal distribution as the samples grow large. It must then follow that randomness itself, the concept of distributions, and the operations called mathematics are all inherently racist, colonialist, heterosexist, etc. They must all be given up in the interest of social justice and goodthink. If Babbling Beaver is on the right track, we will soon have Doctors of Equity making precisely this demand. Of course, some Doctors of Educational Theory are already nearly there.
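    The theorem the commenter invokes is easy to demonstrate for oneself. A minimal simulation (my own sketch): sample means drawn from a decidedly non-normal distribution, here uniform on (0, 1), cluster into an approximately normal bell curve with the mean and spread the theorem predicts.

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

# Draw 10,000 sample means, each computed from n = 50 uniform(0, 1) draws.
n, reps = 50, 10_000
means = [statistics.fmean(random.random() for _ in range(n)) for _ in range(reps)]

# CLT prediction: the means are approximately Normal(0.5, sqrt((1/12)/n)).
mu = statistics.fmean(means)
sigma = statistics.stdev(means)
print(round(mu, 2))     # close to 0.5
print(round(sigma, 3))  # close to ((1/12)/50) ** 0.5, about 0.041
```

    Note that it is the distribution of the *means* that becomes normal, not the raw draws, which remain flat no matter how many are taken.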

  16. “Objectivity is extremely overrated”- It is amazing how these woke postmodernist ideas have gone from the obscure work of a few legal scholars (Derrick Bell, Richard Delgado, Kimberle Crenshaw, Catharine MacKinnon, etc.) to control over much of academia and the minds of millions of well-meaning Americans.

  17. Yet another flabbergasting example of postmodern antiscientism as embodied in wokeism: The methods and theories of “Western”, “white” science with its valuing of objectivity and rationality are regarded as inherently racist (sexist, “cisgenderist”, “ableist”, and what have you) instruments of oppression, and must therefore be eliminated.

  18. When someone talks about their values, the first thing I want to find out is how they view objective knowledge (truth). In my senior years, the single thing I’ve come to value most highly and most intentionally is truth, and science as the only means for finding it. If someone tries to discredit science because it reaches the “wrong” conclusions, I have to conclude that our value systems differ pretty much all the way down.

  19. Jesus. Like any other entry in the social justice literature, the best argument against this nonsense is liberal quotation. I swear, it’s like these people think of the list of logical fallacies as a style guide.

Leave a Reply