More touting of indigenous knowledge as coequal with modern science

April 6, 2026 • 11:00 am

Once again we have an article about how science could be improved if only it incorporated indigenous “ways of knowing”—the “braiding of knowledge” referred to in the Guardian article below (click to read).  I often see another metaphor used to express the same thing: “two-eyed seeing”, with one eye seeing the way indigenous people do, and the other the way modern science does. (I won’t use the term “Western science,” often used to denigrate it.) The implication is that modern science is half blind without indigenous knowledge.

And once again we see five things. The first is that indigenous knowledge is local knowledge, usually about how to grow food or harvest other things that enhance the lives of locals.

Second, indigenous “ways of knowing” are not science in the modern sense—the sense that involves hypothesis testing, doubt, controlled experiments, blind testing, statistics, data analysis, and mathematics.  Indigenous “science” does not avail itself of these essential items in the toolkit of science.  Rather, it usually involves trial and error (mainly about food), and if something works, it becomes “knowledge”. Such knowledge—like how to build the “clam gardens” copiously mentioned in the article below—may be true and may indeed be “knowledge” conceived of as “justified true belief”, but the justification usually doesn’t involve replication.

Third, the “braiding” is asymmetrical: modern science can contribute much more to indigenous practices than the other way around. How to build clam gardens or harvest sweetgrass is, after all, not something that’s widely applicable, while principles of genetics, quantum mechanics, chemistry, and so on are universal, and science can do a lot to help indigenous people with issues like medicine, probably the most important area of asymmetry.  We do not often adopt indigenous medical practices, but the reverse is pervasive, because modern medicine, based largely on science, works.

Fourth, the examples of indigenous knowledge given in the article are few. These articles are usually much more about touting “other ways of knowing”, and calling attention to the past oppression of indigenous people, than about the expansion of human knowledge.

Finally, the article completely neglects examples of the damage done to the environment by indigenous people, and such examples are not rare. They cannot be mentioned because what indigenous people do must be uniformly regarded as good. But it is not, as the data below the fold show.

Click below to read; the author is Leila Nargi.

Examples of indigenous knowledge. I would be remiss if I neglected the “ways of knowing” that the article says should be braided with modern science. There are not many, but the list below covers pretty much all of those given in the article. Excerpts from it are indented, and my comments are flush left.

Clam gardens:

Beginning at least 4,000 years ago, Native communities built clam gardens into the intertidal zone from Washington state through coastal British Columbia, and into south-east Alaska. They are a unique form of mariculture that provide harvestable habitat for an array of tasty ocean creatures like butter clams – collected “in great numbers, then smoked and dried and stored and traded”, Hatch said. But they also yielded red rock crab, basket cockles, sea cucumbers, limpets, sea snails and seaweeds in a veritable smorgasbord for humans and marine mammals, such as otters.

These gardens change where sediment moves and may protect against increasing shoreline erosion; studies also show that clam productivity and populations are higher inside gardens than outside them.

Yes, this is an advance in growing clams, and may have other salubrious environmental effects, though those aren’t documented. At any rate, any stemming of erosion would be limited because clam gardens are restricted in size.

Sweetgrass harvesting:

Still, the necessity of “proving” the validity of longstanding Indigenous practices can frustrate. Suzanne Greenlaw, a citizen of the Houlton Band of Maliseet Indians, is an ecologist at the Schoodic Institute, a non-profit of the National Park Service (NPS) that supports Wabanaki-led research. She participated in a 2016 study to understand how sweet grass, which grows in salt marshes, rebounds after harvesting. The study was part of a Wabanaki bid to re-establish the right to gather sweet grass from NPS land. Though the Wabanaki have made baskets from sweet grass for centuries, they have been cut off from ancestral marshes in Maine’s Acadia national park for at least 100 years.

Non-Indigenous researchers planned to conduct an environmental assessment to gauge how well plants regrew after picking, choosing sweet grass plots that had no connection to those once used by the community. This led to a comparison study in which Wabanaki practitioners demonstrated their superior understanding of how and where to harvest for the greatest ecological benefit. (They may reclaim harvest rights later this year.)

Notice that modern science will be used to verify whether the way sweetgrass is harvested affects future harvests.  But that is not indigenous knowledge; rather, it’s an in-progress attempt to verify that knowledge, with the goal of helping indigenous people who have lost their right to harvest regain their rights.

Other stuff:

More Indigenous people – Robin Wall Kimmerer, author of Braiding Sweetgrass, is a notable example – are entering academia and changing it from the inside, while some tribal nations have hired their own scientists. Non-Native institutions are seeking to undo their erasure of Indigenous cultures; the Brooklyn Botanic Garden has started to include labeling that highlights Lenape names and uses for food plants like persimmons. International environmental organizations also increasingly recognize the importance of including Indigenous voices in discussions around the climate crisis. Since 2022, there’s even been federal funding to study ways to combine Indigenous and western sciences, so each part remains distinct while being strengthened by the other.

Note that labeling plants with indigenous names is an exercise in linguistics and anthropology, not a “way of knowing”. And while indigenous people should not be excluded from discussions about practices that may affect their lives, that too is not “knowledge” but inclusion.

More:

In fact, there are many proven correlations between Indigenous-managed food systems and ecological health. Researchers at Simon Fraser University have found that when Indigenous groups in British Columbia tended forest gardens, they not only produced an impressive biodiversity of food plants – from crabapple and hazelnut and wild plum to wild rice and cranberries – they also improved forest health.

Whyte, the University of Michigan professor, works with the Sault Ste Marie Tribe of Chippewa Indians in Michigan – one of many Native nations that used prescribed burns to boost populations of sharp-tailed grouse, snowshoe hare and deer, all of which declined after the federal government’s 1911 burning ban. Collaborating with US Forest Service researchers, they conducted more than 20 ecology surveys and other projects that proved their case for fire, in the interest of establishing a co-management plan that would allow them to reintroduce this tool.

The first part is absolutely expected: if you deliberately plant diverse plants to get fruits and nuts, and compare the biodiversity with that of native forests, yes, you’ll get a more diverse “ecosystem”. If you see that as a “healthier” ecosystem because it has more ethnobotanical assets, yes, that is also true.  But surely the author doesn’t mean to imply that all North American forests should be turned into “forest gardens” for growing food.

As for controlled burning, yes, that can be useful in replacing natural burns that are no longer permitted, but in the past burns set by indigenous people could become uncontrolled.  This was particularly dire in New Zealand, where 40% of native forest (30-35% of the total land area) was burned by Māori people within 200 years of their arrival on the two main islands in the 13th century. (There were of course no non-Polynesian “colonists” then.) See below the fold for more data.

All in all, it’s not an impressive record, and hardly one that enriches modern science. Indeed, modern science is making a larger contribution to indigenous people than the other way around. Despite that, indigenous knowledge is sacralized and, the article implies, should be considered coequal with modern science.  Some quotes:

Rather than dismissing Indigenous knowledge, more western scientists are discovering its viability for themselves and adjusting their research goals to embrace it.

That represents a “massive shift”, according to Kyle Whyte, a professor of environmental justice at the University of Michigan and a member of the Citizen Potawatomi Nation. Historically, western scientists have considered themselves rigorous and empirical, while they have classified traditional Native thought as mythic, religious or plain made-up, he said.

It’s not false to say that a great deal of “traditional Native thought”, construed as “ways of knowing”, is indeed mythic, religious, or plain made-up.  But some of it is not, and insofar as this knowledge can be verified by modern science, that part is indeed “knowledge”.

Western science favors distinct disciplines – ecology, biology, geology and Supernant’s specialty, archaeology. But Indigenous knowledge considers “the earth and the water and the air and the plants and the animals as deeply interdependent and interconnected; to understand one is to understand all. And that has a lot to teach western science,” Supernant said of the importance of braiding these systems.

Notice the inaccurate term “Western science”.  And insofar as a system is dependent on other things, modern science has to deal with that. But, as my advisor Dick Lewontin said in an essay called “A reasonable skepticism”:

But this holistic world view is untenable. It is simply another form of mysticism and does not make it possible to manipulate the world for our own benefit. An obscurantist holism has been tried and it has failed. The world is not one huge organism that regulates itself to some good end as the believers in the Gaia hypothesis believe. While in some theoretical sense “the trembling of a flower is felt on the farthest star,” in practice my gardening has no effect on the orbit of Neptune because the force of gravitation is extremely weak and falls off very rapidly with distance. So there is clearly truth in the belief that the world can be broken up into independent parts. But that is not a universal direction for the study of all nature. A lot of nature, as we shall see, cannot be broken up into independent parts to be studied in isolation, and it is pure ideology to suppose that it can.

It is common to say that indigenous knowledge is superior to modern science because the former is more “holistic”.  Lewontin shows the fallacy of that claim.

Here’s another common claim you encounter in this kind of literature:

As opportunities for western and Indigenous collaborations multiply, it’s critical that Indigenous people maintain control over any knowledge gleaned and how it’s used, especially in light of western scientists’ historic penchant for extracting information that suited their own purposes and dismissing the rest. “Western science can help, as long as Native people are still decision makers. . . ” [quote from Suzanne Greenlaw, a Native American ecologist]

If this means anything beyond the way that published data are treated in modern science, then it is an unwarranted privilege. When science is published it becomes the property of humanity, and by and large those who produced the knowledge have no control over how it’s used—nor should they. If other people want to use what you’ve published for their own purposes, well, that’s the way science works. Indigenous people should have no more control over any knowledge they make public than should anybody else.

Below we see the implication that indigenous knowledge should be considered coequal with modern science (the quote is from Kisha Supernant, “Métis and Papaschase and the director of the University of Alberta’s Institute of Prairie and Indigenous Archaeology”):

What constitutes progress when it comes to braiding western and Indigenous science depends on whom you ask. “If the burden of proof remains on Indigenous communities to demonstrate, using western scientific methods, that their knowledge … is valid, I think we’re not at the place we need to be,” Supernant said. “It is difficult to braid two things together when they’re not given equal weight in the braid.”

Well, I’d say that given the toolkit that constitutes modern science and is used to establish “knowledge,” then yes, indigenous people should have to demonstrate that their knowledge really is knowledge in the modern sense before it’s used.  When the Māori want to play whale songs to infected kauri trees because whales and kauri trees were once seen as brothers, then they should have to demonstrate the phylogenetic affinity of trees and cetaceans as well as the efficacy of whale songs. (This is a real case based on mythic lore.)

Finally, the bit below strikes me as rather patronizing, treating Indigenous people like children. (“Whyte” is a “professor of environmental justice at the University of Michigan and a member of the Citizen Potawatomi Nation.”)

Whyte is encouraged that the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), which seeks to provide scientific evidence to inform government decision-making, included a chapter on Indigenous knowledge in its latest global assessment. But he sees plenty of opportunity for improvements to braiding. For starters, “Indigenous people need to be involved at the earliest stages of research,” he said. And that means western scientists “need to get into the habit of approaching potential [Indigenous] partners and saying ‘I’m interested in water. Are you interested in water?’ before any research questions have been created. Let’s just get excited together about the topic, and plan from the beginning.”

If they plan experiments on indigenous land, or experiments that affect indigenous people, then yes, there should be consultation.  But “getting excited together” before any research questions have been formulated is not the way that science works, nor should it.  Science is not an endeavor that involves research equity, and creating such equity must be an extracurricular activity. The job of science is to understand the Universe, not to create social justice or spread an ideology.

h/t Ron, Ginger K

Click “continue reading” to see what we know about the damage indigenous North Americans did to the environment. It gives the answer to a question I asked Grok.

Continue reading “More touting of indigenous knowledge as coequal with modern science”

Indigenous “ways of knowing” invade Canadian science classes

March 27, 2026 • 11:00 am

I’ve spent a lot of time, and pushed many electrons, going after the fallacy in New Zealand that indigenous “ways of knowing”—in this case from the Māori—are just as valid as so-called “Western ways of knowing,” which is what Kiwi progressives call “science”. You can see some of my pieces here, but there are many.

This sacralization of the oppressed, whereby the beliefs of minorities are given extra credibility, has now spread to Canada, a pretty woke place.  Lawrence Krauss, who now lives in British Columbia, was astonished and depressed to find indigenous (Native American) superstitions treated as science in the secondary-school curriculum.

You can read his lament by clicking the screenshot below, or find the article archived here.

Quotes from Krauss’s piece are indented, and my comments are flush left. This battle apparently needs to be fought in every country where science, which is not “Western” but worldwide, has been diluted via the efforts of “progressives” who think they’re doing a good thing. They’re not: they are impeding the education of kids by conflating superstitions and established science.

Check out the links in the first paragraph:

I now live in British Columbia (B.C.). A colleague recently forwarded me the current B.C. high school science curriculum for grades nine and twelve. It includes an embarrassing amalgam of religious gobbledygook and anti-science rhetoric. It is an insult to school children in B.C. and does a disservice to the students of the province at a time when understanding the nature and process of science is becoming increasingly important to their competitive prospects in a world dominated by technology.

You may wonder how religious fundamentalism could so effectively creep into the curriculum in a progressive place like British Columbia. The answer is simple. The religious nonsense being inserted into the curriculum has nothing to do with Christian fundamentalism; rather, it is Indigenous religious nonsense. And in the current climate, Indigenous “knowledge” is held to a different standard from scientific knowledge—or, rather, to no standard at all.

. . . In the B.C. science curriculum for grade nine, this agenda is explicit. Students are expected to: “Apply First Peoples’ perspectives and knowledge, other ways of knowing, and local knowledge as sources of information.” “Ways of knowing” are defined as “the various beliefs about the nature of knowledge that people have; they can include, but are not limited to, Aboriginal, gender-related, subject/discipline specific, cultural, embodied and intuitive beliefs about knowledge.”

Here’s one example of how indigenous superstition dilutes science. Like me and many others, Krauss has no problem with teaching this stuff as “social science or history”, but bridles at equating it with science:

For example, lesson three of the “BC Grade 9 Student Notes and Problems Workbook,” contains a section entitled “The Universe: Aboriginal Perspectives.” Over the course of two pages, the creation myths of various aboriginal peoples are described in detail, as “beautifully descriptive legends depicting the relationship between Earth and various celestial bodies.” Such subjects as the creation of the universe by a raven; the presence of water everywhere on Earth except on Vancouver Island; the eternal efforts of the Moon to get some of that water to drink; how and why a divine son and daughter team set out to make the Sun traverse the sky, while ensuring that it seems to stop in the middle of the day; how one of the jealous siblings turned into the Moon; how lunar eclipses occur when the spirit of Ling Cod tries to swallow the Moon; how one constellation of stars is the remnants of a giant bird that flew up from Earth; and how the celestial raven eventually released the Moon, stars, and Sun from boxes, in that order. These are quaint myths, and one can imagine how a reasonable science book might describe how we overcame these prehistoric notions to arrive at our modern understanding via the process of science. Instead, the conclusion at the end of this chapter reads, “These stories parallel the Big Bang Theory.”

The only answer to that is, “No they don’t.”  Krauss continues:

As if the insults to the process of science reflected in these curricular statements weren’t bad enough, when the workbook actually discusses science, it gets it all wrong. For example, the book states that, “Indications are that all galaxies are moving away from a central core area. Thus, the universe is said to be expanding.” In fact, the central premise of the Big Bang picture of our expanding universe is that there is simply no centre to the universe. The Universe is uniformly expanding but not from a single central point, but from everywhere. Elsewhere, the process that describes the power generation in stars is listed three times as nuclear fission. This is the opposite of the actual process, nuclear fusion, which explains how light nuclei combine to form heavier nuclei.

This is not surprising, for the people who tout indigenous knowledge as coequal with modern science are often not conversant with modern science. This is also true in New Zealand: advocates for native people simply look for parallels that can be used to say, “Look—indigenous people had a parallel but equally correct way of understanding the universe.” And the answer to that, too, is “No they didn’t.”

The damage done to children’s education, and to science itself, is obvious, and it is summed up by Krauss at the end:

The understanding of the modern world is based on science and that understanding was built up, often at great cost, by overcoming myth and superstition. It is a giant leap backwards to cater to such superstitions in a misguided attempt to somehow pay back Indigenous peoples for historical wrongs. Students today had nothing to do with the sins of the past, and we owe it to them to teach them the best possible science we can. That means separating religious myths from science, and in the process actually trying to get the science straight. The B.C. science curriculum is a disgrace on both counts.

Amen.  I suspect the only reason this tactic hasn’t spread to Europe is that they have—with the exception of the Sámi of Scandinavia—almost no indigenous people to sacralize. But India has plenty, and already science is being diluted there by Hindu “ways of knowing”, including the government’s establishment of institutes tasked with revealing the scientific wonders of cows and their urine, dung, and milk. When I visited India on a lecture tour, I spent a long time listening to credible scientists beef about (sorry for the pun) the stupidity of the government’s dilution of science. Their complaint? “Where’s the beef?”, for despite a big government expenditure, there was little to show. That’s what happens when “scientists” are more or less ordered to come up with results wanted by others.

Now The Atlantic touts religion—or rather, beliefs that don’t need evidence

March 27, 2026 • 9:30 am

I’ve been posting from time to time about how the mainstream media is suddenly touting religion and its benefits—a phenomenon I don’t fully understand. Now The Atlantic has joined the queue with an article by Elizabeth Bruenig, who’s written for the magazine for six years, and before that for the NYT, the WaPo, and the New Republic. She also has a master’s degree in Christian theology from Cambridge University.  All this means that she’s fully qualified to tout religion to liberals.

And in the article below she does just that, but in an unusual way.  She dismisses the need for any evidence for gods or specific religions, and takes the position that belief itself, however arrived at, is sufficient to warrant the truths of that belief. It’s bizarre, and another example of a supposedly reputable publication jumping the rails.

You can read the article by clicking below, or find it archived here.  (Thanks to the many readers who sent me this piece.)


Bruenig begins by dissing the New Atheists (unfairly, of course), and then segues into her Frozen Waterfall Moment: the epiphany that solidified her waning faith.

I grew up in a faithful Methodist household in deep-red Texas during the George W. Bush years, when the political sway of Evangelicals was at its zenith. At the same time, evangelists of a robust atheism—figures such as the biologist Richard Dawkins, the critic Christopher Hitchens, and the neuroscientist Sam Harris—toured the country offending salt-of-the-earth Americans with their contempt for religious belief. It was hard for me to ignore that a number of their assertions were clearly correct: Young-Earth creationism, for instance, instantly struck me as absurd when I first learned about it from a history teacher in my public junior-high school, who confidently told me that the world is only a few thousand years old.

That wasn’t what my family or church taught, but Christians who subscribed to those beliefs were suddenly ascendant, and their thinking colored the country’s religious landscape. Meanwhile, the New Atheists were making hay of the fact that such faithful misapprehensions about nature were easily disproved by scientific discovery. Though I continued to attend church as usual, I privately wondered whether the entire enterprise might be rooted in nothing more than a misunderstanding.

This steady diminishing of faith probably would have continued indefinitely, were it not for one brisk autumn afternoon in 2011 when, standing alone at a bus stop, I happened to witness the presence of God.

The unevenly paved lane where I waited was a quiet one-way street tucked away in a clutch of trees. I gazed down the road, preoccupied with other things—midterm exams, campus-club minutiae—and expecting the bus to trundle around the bend. A sudden icy wind tore around the corner instead, sweeping into gray branches and climbing ivy to send a spray of golden birch leaves spiraling into the sky, taking my breath along with them. And I knew that my soul was bared to something indescribably majestic and bracing—something that overwhelmed me with the unmistakable sensation of eye contact. What I saw, I felt, also saw me. Before I could rationally account for what had happened, a verse of poetry from John Ashbery came to mind:

look of glass stops you

And you walk on shaken: was I the perceived?

That seemed to explain things perfectly, jarringly so. I was dazed in class as afternoon darkened to evening.

Note that at the same time she sneers at New Atheists for their “contempt” for religion, she notes that they also dispelled misguided beliefs in creationism, so chalk that up to New Atheism. In her case, the epiphany was more mundane than the three frozen waterfalls that brought Francis Collins to Christ: hers involved a wind blowing leaves into the sky.  And for some reason that made her think of a poem that is not at all about God, but (as far as I can see) about the creative process of a writer, how that process is perceived by the poet, and how it interprets reality. It’s an okay poem, but it doesn’t rhyme, so it’s really a bunch of fragmentary thoughts, as in Ulysses, but put into verse form. At any rate, for Bruenig the wind that blew the leaves around somehow blew faith into her soul.

Surprisingly, given Bruenig’s own contempt for the need for evidence to buttress one’s faith, she spends a long time describing a new big book that appears to make the same old arguments about the facts of science that point to God (fine-tuning, the Big Bang, etc.):

The latest evidence suggests that God most likely exists, argues a big recent book by Michel-Yves Bolloré, a computer engineer, and Olivier Bonnassies, a Catholic author. Tracts that aim to prove the reality of God are hardly novel. What makes this endeavor unique, say the French writers behind God, The Science, the Evidence: The Dawn of a Revolution, is the scientific nature of their work. Medieval monks toiling away at poetic meditations on the divine have their place, the authors allow, but their own arguments are meant to surpass mere abstract justifications for belief. Instead they assert that cutting-edge empirical proof observable in the natural world makes a firm case for God. With this, they strive for the ultimate alchemy, transforming faith into fact.

Bolloré and Bonnassies’s book is part of a burgeoning genre of apologetics that relies on relatively new scientific developments and theories, like quantum mechanics and cosmology, to make an ancient case. Their book, which has already sold more than 400,000 copies around the world, arrives at a time of both bloody religious conflict and rapidly collapsing religious belief, especially among the young and the highly educated. It joins other recent projects—including two new documentaries, The Story of Everything: The Science That Reveals a Mind Behind the Universe and Universe Designed—that propose the same tantalizing theory: that there is incontrovertible proof that a divine power created the cosmos, and that this evidence is mounting.

. . . [the authors] identify a series of scientific breakthroughs that helped undermine religious faith over the centuries, including Galileo’s heliocentrism, Newton’s clockwork universe. . .

The publisher says pretty much the same thing: scientific discoveries in quantum mechanics, cosmology, the “fine-tuning of the Universe,” and the “incredible complexity of living organisms” (i.e., Intelligent Design) have dispelled materialism and naturalism:

Yet, with unexpected and astonishing force, the pendulum of science has swung back in the opposite direction. Driven by a rapid succession of groundbreaking discoveries—thermodynamics, the theory of relativity, quantum mechanics, the Big Bang, theories about the expansion and fine-tuning of the Universe, and the incredible complexity of living organisms—old certainties have been completely overturned. Materialism increasingly has the appearance of an irrational belief.

I’ll admit I haven’t read this 500-page behemoth, whose summaries recycle the same old arguments for God from science, and I’m not sure I want to. But you can see a critical review of its content archived from Medium; its author (“Matthew”) confirms the impression I got from the above, and adds that the book also throws in some theology. From Medium:

Yet what is strange is how much [the book] feels like a nostalgic throwback, it is reminiscent of the publishing fads of the 00s when New Atheism was in its peak and church book stands were full of books with titles like “The Dawkins Delusion” or “How Science Proves God” or whatever it might have been. The book even approvingly quotes Dawkins’ claim that God is basically a scientific hypothesis that we can prove or disprove, and the authors claim we should be able to look at science and find evidence of God, or at least we shouldn’t find evidence that contradicts the idea that there is a divine creator. Yet it is also far weirder than intelligent design rebuttals of atheists, the book goes beyond science, including lengthy chapters on the bible, the person of Jesus, the continued existence of the Jewish people, the persecution of scientists in the Soviet Union and (sorry Substack) for some reason, the Fatima miracle.

I will be honest up front, I found the book to be absolutely mad, hamfisted and confused. It is error strewn, misrepresents various ideas completely, and in spite of being written by two Catholics claiming to be retrieving a more ancient worldview, it largely constitutes a clumsy argument for a God of enlightenment deism, making some absolutely eye wateringly odd claims along the way. As the reviews all seem to say it is extremely “readable” but mostly because it is presented as a skim over of topics in soundbites and quotes so that it reads like a print out of a load of powerpoint slides.

. . . More to the point, I find it hard to believe we are in an “intellectual paradigm shift” when the authors have offered what is essentially undigested quotes from wikipedia and a bunch of arguments that were in vogue nearly two decades ago. This book is the definition of singing to the choir, except by the choir it must mean a very particular set of Christians inclined to share the author’s theology but not inclined to know anything about the arguments.

You can read the rest of the review for yourself.  The fact is, though, that the quality and arguments of the book are irrelevant, for Atlantic author Bruenig says that people don’t need no stinking evidence to accept gods and their natures. The argument from science, she says, is misguided (bolding henceforth is mine):

To imagine that one might find traces of the divine strewn throughout the universe, or that earthly methods of inquiry might uncover some of those signs, isn’t ridiculous. But this latest round of arguments in favor of intelligent design seems aimed mostly at establishing that God could or should exist within the rational frameworks we already employ. This is both weak grounds for belief and a fundamental misunderstanding of faith. The route to durable faith in God often runs not through logical proofs or the sciences, but through awe, wonder, and an attunement to the beauty and poetry of the world, natural and otherwise.

In other words, it’s the “beauty and poetry of the world” that convinced Bruenig of the divine. Apparently she has overlooked the ugliness of the world: the cancers in children, the incessant wars and killings, the death of thousands of innocent people in natural disasters, and even humans’ destruction of the very beauty that inspires her. Is this evidence for Satan?

It’s quite bizarre to read about Bruenig’s transformation into a believer, one who rejects science but still touts “objective evidence” for divinity.

She turned her Golden Leaf Epiphany over in her mind, and it is that epiphany—a purely emotional experience—that led her to see reality (OBJECTIVE reality) through a god-shaped lens. And she disses New Atheism again for its supposed claim that believing in gods makes one unsophisticated or dumb.  No, she’s wrong: the argument is that accepting theism means you’re credulous. Bruenig:

 I began to ask myself what it would cost me intellectually if I were to choose to metabolize the experience as it had occurred to me. That decision came with several implications. If God is real, then perhaps other things—goodness, righteousness, beauty—that are usually dismissed as matters of subjective experience might also be objectively real. That prospect was much more agreeable to me than another consequential implication of electing to believe: that, as the New Atheists had so vigorously argued, theism meant putting aside any pretensions I had of sophistication or intellect.

As I explored this problem, I spent hours in my college library reading Saint Augustine, a foundational philosopher and theologian. Here I encountered another strange sensation: Every word I read felt like remembering something I had once known but somehow forgotten.

Oh dear God, St. Augustine, a man who was a Biblical literalist (something that Bruenig rejects). Like many early theologians, Augustine argued that the Bible could be read both literally and metaphorically, but insisted on the absolute truth of what’s in print. Augustine accepted instantaneous creation from Genesis, Adam and Eve, Noah’s ark, and the whole Biblical mishigas. Bruenig ignores those parts, for she’s looking to buttress her incipient belief. (And remember that she concluded, apparently objectively, that God exists because of the feeling that swept over her when she saw the wind blow the leaves around.)  And so, after reading Augustine, she decided to accept an “objective” reality that didn’t need empirical support, and re-embraced religion:

And maybe the Christian Neoplatonists, Augustine among them, had some points as well. I contemplated this for a while before I realized that there wasn’t any sense in debating it with myself anymore. I knew what I felt, so I gave up and chose to believe.

Note that she has no evidence for Christianity, but chose to believe, even though she uses the word “objectively,” implying that other people would agree with her “choice”. (They don’t: Christians are in a minority of the world’s people.) At the end of her piece, Bruenig simply asserts that you don’t need anything but emotion to buttress your Christianity. In so doing she simply shrugs off all the arguments that have been raised against belief and says “faith is enough”, effectively immunizing her beliefs against refutation. (Bolding is mine.)

In my years of working out exactly what I believe, I have been relieved to learn that faith does not in fact demand the surrender of logic and vigorous intellectual inquiry—a case Bolloré and Bonnassies convincingly bolster with numerous testimonials from award-winning scientists. Still, to trust in the existence of God is to accept both the appearance and the possibility of being naive or delusional. No accumulation of promising developments in our analytical understanding of the world can delay confrontation with that essential fact. Having faith is a vulnerable thing.

Bolloré and Bonnassies’s arguments are more likely to shore up the faith of wavering believers than to win new converts. This itself is no small thing. The authors may even be right about the growing evidence for the existence of God secreted away in the latest science. But their approach has a history of upsets. The only way to inoculate belief against that cycle of disruption is to treat faith as a decision that transcends scientific proof.

It’s clear here that she wants to inoculate her belief against disruption (i.e., against disproof), and by arguing, “It’s true because I believe it,” she’s succeeded.  Well, good for her, but she’s not going to convince people who think that giving your life to Christianity and its beliefs (a divine Jesus who was also God, the miracles he performed, and the crucifixion and resurrection) means donning the mantle of a superstitious belief system without a rational reason to do so. Remember, emotions and feelings are not part of rationality.

This whole essay could be summed up in one sentence:  “I believe because I want to believe, and I don’t need reasons (or rationality) to do so.”

Shame on The Atlantic for pushing this pabulum!

h/t: Jim

My article in Quillette: “Can art convey truth?”

December 26, 2025 • 10:05 am

Last June I went to the Heterodox Academy’s annual meeting, this time in Brooklyn, New York. I had been asked to be on a panel, “The Duties & Responsibilities of Scholars”, which included, besides me, Jennifer Frey, Louis Menand, and John McWhorter.  The introductions were by Alice Dreger and Colleen Eren.

I knew of two of the panelists—Menand (a Harvard professor of English, distinguished author, and writer for the New Yorker) and McWhorter (a Columbia University linguist, writer, and columnist for the NYT who’s been featured regularly on this site).  That was enough to intimidate me, so I spent several months reading about the topic beforehand, concentrating on academic freedom and freedom of expression.  Some of my thinking on these topics was worked out in posts on this site that you might have read. Along the way, I realized that the “clash of ideas” touted as essential (indeed, perhaps sufficient) to guarantee the emergence of truth does not by itself produce any kind of “truth”. (This clash, discussed by John Stuart Mill and Oliver Wendell Holmes, is often said to be the reason why we need freedom of speech.) But the clash doesn’t home in on truth unless you put into the mix some empirical evidence, essential for finding the “propositional truths” defined in my article below.

That led to my realization that the purpose of universities stated by many people is incomplete. As I say in my new Quillette piece (click on the screenshot below, or find it archived here):

Likewise, the common claim that the most important purpose of colleges and universities is to expand, preserve, and promulgate new knowledge—to find consensus truths—is also wrong. Finding truth is not the purpose of the literary arts like literature and poetry, the visual and graphic arts like film, painting, animation, photography, and the performing arts like theatre, dance, and music. These fields cannot find truth because that is not why they exist nor why they are taught. (Other areas like economics and sociology, often considered part of “the humanities,” can find truth insofar as they engage in empirical study of reality.)

It’s not just art that can’t find truth without evidence, but also philosophy. (I won’t deal with math here, as I’m still thinking that one over). I don’t deal with philosophy in the article, but I haven’t yet found an example of philosophy coming up with a testable propositional truth without dragging in empirical evidence.  But this doesn’t mean I think that philosophy (or the humanities in general) shouldn’t be taught in college. As I say in my piece:

First I should address the anti-art bigot charge. Just because I see art as a source of something other than the kind of truth uncovered by science does not for a moment mean I’m dismissive of art. My undergraduate education included courses in Greek tragedy, Old English (I can still read Beowulf in the original), modern literature, ethical philosophy, and fine arts, creating in me a desire to keep learning, to keep being inspired, to keep discovering art. I have derived and continue to derive extraordinary pleasure and betterment from art and other branches of the humanities. Science gave me a career, but the arts have given me at least as much in life as science has. But what I’ve gained from art has not been truth.

The rest of the piece, which I won’t expand on as you can read it at the link below (you might have to give Quillette your email address, but you can access it for free), explains, at least implicitly, why I still think that the humanities (which include all forms of art) should be taught in schools, for the purpose of such instruction, while not finding truth, is to give us a hunger to expand our experience.  One more sentence:

. . . The real value of art, then, is not that it conveys knowledge that can’t be acquired in other ways, but that it produces emotional and cognitive effects on the receiver, usually conferring an experience of beauty. Art can enrich how we think about ourselves and other people, and, crucially, allow us to view the world through eyes other than our own. Through reflection, this expansion of experience can enhance our knowledge of ourselves. But that is subjective rather than propositional knowledge.

I showed this piece to a friend this morning, who asked me this: “Your argument is basically ‘the humanities have other uses so we need to keep them in universities.’ So it begs the question — why should they be housed in universities? You seem to suggest the answer is because it makes people feel and think in other ways. Is that kind of personal development something university resources should be dedicated to? A lot of administrators and politicians these days answer no.”

But my answer is “yes”. As I wrote her:

Yes, you ask a good question and I should have answered it. It’s sort of implicit in the piece when I relate how much I’ve benefited from learning about the arts personally, and that is from the arts (literature, etc.) having awakened my desire to learn more. The arts are one of the great areas of human endeavor, and for that alone should be taught in universities.  As I said, it sparks the desire to think about oneself, or learn other perspectives, and while that’s not truth in the scientific sense, it should be taught for that alone.  Ditto philosophy. Ethical philosophy was an important course I took in college, and without that I wouldn’t know about the history of people’s ideas on morality, even though morality turns out to be subjective.
In the end, I think that colleges should stay the way they are, save for the elimination of teaching religious dicta, as in some divinity schools, and that the purpose of a college education is more than just the expansion, production, preservation, and promulgation of (propositional) knowledge.  Why AREN’T universities the place for absorbing the artistic endeavors of humanity? Where else would you learn about them?

And I added that philosophy, which I still don’t think can find truth on its own, is one of the most valuable tools we have for sorting out dreck in arguments, and helping us home in on the truths by thinking logically. Ethical philosophy, in particular, was important to me as it made me think about exactly why I thought things were moral or immoral, and why—a quest I’m still on. So of course philosophy should stay in the college curriculum. The only thing that should be eliminated is the teaching of religious dogma (as opposed to the history and content of religion), dogma that is often promulgated in divinity schools.

The video discussion above is long (75 minutes), but if you want to listen to the bit on truth in the humanities, and see McWhorter and Menand try to tar and feather me, start about 22 minutes in and listen for about six minutes.  It’s in that section that I think McWhorter made an admission that undercut both his and Menand’s argument—an admission I note in the last paragraph of my Quillette piece:

Curiously, I think that perhaps my art-isn’t-truth stance is not as extreme and unreasonable as my eye-rolling, shoulder-shrugging friends in the humanities imply. As I mentioned, at the Heterodox Academy panel Menand and McWhorter were the eye-rollers and shoulder-shruggers, but I see that they too have run up against the objective/subjective issue in their own thinking. For example, in an exchange about whether Leonard Bernstein’s symphonies are greater than his musicals, McWhorter wound up admitting, “There is no truth: it’s a matter of informed opinion and opinion on what you have decided you value in art.” Agreed!

McWhorter makes his claim starting at 27:55.  You don’t have to watch the video, but do read the piece, which at about 2000 words is short.

Kathleen Stock on female genital mutilation, cultural relativism, and a recent (odious) paper in The Journal of Medical Ethics

December 20, 2025 • 11:00 am

Over at UnHerd, philosopher Kathleen Stock, formerly of the University of Sussex, critiques a paper in The Journal of Medical Ethics that I discussed recently, a paper you can read by clicking below. (You may remember that Stock, an OBE, was forced to resign from Sussex after she was demonized for her views on gender identity. These involved claims that there are but two biological sexes, and her cancellation was largely the result of a campaign by students.)

As I said in my earlier post, this paper seems to whitewash female genital mutilation (FGM), and does so in several ways. The authors think that the term “mutilation” is pejorative, and that “female genital modification” is more accurate and less inflammatory; the latter term covers a variety of methods of FGM, some much more dangerous than others, as well as cosmetic genital surgery on biological women and surgery on trans-identifying males to give them a simulacrum of female genitalia. (There is also male circumcision, which some lump in with the more dire forms of FGM.)

The Ahmadu et al. paper also notes that anti-FGM campaigns in Africa, where the mutilation is practiced most often, have their own harms. As Stock comments in the article below,

And so our co-authors — the majority of whom work in Europe, Australasia, and North America — tell us that anti-FGM initiatives in Africa cause material harms. Supposedly, they siphon off money and attention that could be better spent in other health campaigns, and they undermine trust in doctors.
They also cause young women to consider genital cutting as “traumatising” in retrospect, we are told, where they would not otherwise have done so. Even though some who have been subject to it can experience “unwanted upsetting memories, heightened vigilance, sleep disturbance, recurrent memories or flashbacks during medical consultations”, there is allegedly no actual trauma there, until some foreign aid agency tells them so.

And if you don’t believe Stock, here’s a small part of the section of the Ahmadu et al. paper trying to push the word “trauma” out of descriptions of FGM:

Most affected women themselves rarely use the word ‘trauma’ to describe their experiences of the practices. If they describe the experiences in negative terms, they may use words such as ‘difficult’ or ‘painful’, but some of them may simultaneously describe the experience as celebratory, empowering, important and significant. This may even accompany experiences of pain, but this pain, when made sense of in its cultural context, does not equate to trauma.

Researchers and clinicians often use the mostly biomedically based DSM-5 (the current version of the Diagnostic and Statistical Manual of Mental Disorders) to assess trauma, with a focus on post-traumatic stress disorder (PTSD). While narratives of women who have experienced a cultural or religious-based procedure may contain descriptions of symptoms that fall into the PTSD nosological category (such as ‘unwanted upsetting memories’, ‘negative affect’, ‘nightmares’ or heightened sensations, vigilance or sleep disturbance), the cross-cultural validity of PTSD as a construct and its use in migrant populations has been widely contested, because it applies Western cultural understandings to people who do not necessarily equate the experience of pain as directly causing trauma.

That is first-class progressive whitewashing, and it is exactly the move Stock describes in the passage quoted above.

Finally, Ahmadu et al. note that anti-FGM campaigns, and the term “mutilation”, have led to unfair stigmatization of some groups in the West that practiced FGM in their ancestral countries (and still practice it in the West, though to a much lesser extent). You could argue, for example, that this leads to bigotry in the West against those of Somalian ancestry, as FGM is rather common in Somalia. And I agree that it’s unfair to stigmatize an entire group because some of its members practice FGM. Only the perpetrators should be punished and the promoters rebuked. But the practice itself should be loudly decried, with the criticism aimed at the communities that employ it.

In her article, Stock rebukes the paper as a prime example of “cultural relativism,” the view that while people within a given culture can judge some acts as more moral than others, one cannot judge one culture’s behaviors as more moral than another’s. (One might, if one were stupid, even criticize such cross-cultural judgments as a form of ethical appropriation.) So, say the relativists, we shouldn’t be too quick to judge those in Somalia who practice infibulation of young women.

You can read Stock’s article by clicking below, but if you’re paywalled you can find the article archived here.

Stock is not a moral relativist, at least when it comes to genital “modification,” a term she opposes.  I’ll put up a few quotes, but you should read the whole piece, either online or in the archived version:

Progressives are notoriously fond of renaming negatively-coded social practices to make them sound more palatable: “assisted dying” for euthanasia, or “sex work” for prostitution, for instance. The usual strategy is to take the most benign example of the practice possible, then make that the central paradigm. And so we get images of affluent middle-class people floating off to consensual oblivion at the hands of a doctor, rather than hungry, homeless depressives. We are told to think of students harmlessly supplementing their degrees with a bit of escort work, not drug-addicted mothers standing on street corners. Perpetually gloomy about human behaviour in other areas, when it comes to sex and death the mood becomes positively Pollyanna-ish.

Similarly, the authors of the new FGM article are apparently looking for the silver lining. Some genital modifications enhance group identity, they say, and a sense of community belonging. And as with euthanasia and prostitution, they want us to ignore the inconvenient downsides. But at the same time, there is a philosophical component here mostly absent from parallel campaigns. It’s cultural relativism — which says that strictly speaking, there are no downsides, or indeed upsides, at all.

That is: from the inside of a particular culture, certain practices count as exemplary and others as evil. Yet zoom out to an omniscient, deculturated perspective upon human behaviour generally, and there is no objective moral value — or so the story goes. All value is constructed at the local level. Worse: when you zoom back into your own homegrown ethical concerns after taking such a trip, they seem strangely hollow. Like an astronaut returning to Earth after having seen the whole of it from space, everything looks a bit parochial.

Stock lumps the authors into three groups, which she calls the “Conservatives” (no genital surgeries of any type), the “Centrists” (okay with circumcision for males but no surgery on females), and the “Permissives” (people who think that “it is up to the parents to decide what is best for their children, and that the state should refrain from interfering with any culturally significant practices unless they can be shown to involve serious harm” [that quote is from the Ahmadu et al. paper]). These conflicting views lead to the tension that Stock and others perceive in this paper. What are the sweating authors trying to say?

Cultural relativism, while in style among progressives, is a non-starter. You can see that by simply imagining John Rawls’s “veil of ignorance” and asking imaginary people who have not been acculturated to look at various cultures from behind that veil and then say which culture they’d rather live in. If you are a young girl, would you rather be in Somalia or Denmark? If you’re gay, would you rather be in Iran or Israel? And so on.  Here’s Stock’s ending, where she asserts that not all forms of “genital modification” should be lumped together or considered equally bad:

Meanwhile in the Anglosphere, anti-FGM laws allegedly cause “oversurveillance of ethnic and racialised families and girls” and undermine “social trust, community life and human rights”. All these things, it is implied, are flat wrong. This sounds like old-fashioned morality talk to me. But then again, if old-fashioned morality talk is permissible, may not we also talk explicitly about the wrongs of holding small girls down to tables and slicing off bits of them, or sewing them up so tight that they are in searing agony? These things sound like they might undermine “social trust, community life, and human rights” too.

Rather than be a relativist about morality, it makes more sense to be a pluralist. There are different virtues for humans to aspire to, and they can’t be ranked. Sometimes there are clashes between them, resulting in inevitable trade-offs (honesty vs kindness; loyalty to family vs to one’s community; and so on). There are very few cost-free moral choices in this life. Equally, some virtues will vary according to cultural backdrop. The local environment may partly influence which virtues are paramount. For instance, family obedience and respect for elders will be stronger in places where close kinship ties help people to survive.

But still, there is always a limit on what behaviours might conceivably count as good; and that limit is whether they actively inhibit a person’s flourishing, in the Aristotelian sense. The most drastic and bloody forms of FGM obviously do so. They lead a little girl to feel distrust and fear of female carers; predispose her to infections and limit her sexual function for life; cause her pain, nightmares, and panicky flashbacks for decades.

With minimally invasive genital surgeries involving peripheral body parts, matters are not so clear. But whatever the case about those, you can’t just assume in advance that all genital modifications are equal, so that discriminating between them by different legal and social approaches is somehow “unfair”. If cultural relativism were really true, there would be no such thing as unfairness either. It would just be empty meaninglessness, all the way down. Academics with heroic designs on the English language should be careful not to fall into ethical abysses, even as they tell themselves the landscape around them is objectively flat.

Here Stock comes close to equating “more moral” with “creating more well being,” a position that Sam Harris takes in The Moral Landscape, and a position I’ve criticized. But here the niceties of ethics are irrelevant. There is simply no way that forcing FGM upon girls can be considered better than banning it.

Can mathematics and philosophy produce (propositional) truth?

December 19, 2025 • 9:45 am

I have written a piece that will be published shortly on another site; it’s largely about whether academic disciplines, including the arts, can produce “propositional truths”, that is, declarative statements about the world that are deemed “true” because they give an accurate description of something in the world or universe.  Examples are “Jerry has five fingers on each hand”, “Sheila plays the violin in an orchestra,” or “humans and other apes shared a common ancestor.” The reason I was concerned with propositional truths is that it’s often said that the search, production, preservation, and promulgation of such truths is the primary purpose of universities.  Is it? Read my piece, which will be out next week, to see. I’ll post a link when it’s up.

I won’t give my thesis here about truth and the various academic disciplines, as that’s in the other article, but in my piece I omitted two areas: mathematics and philosophy. That’s because there’s a big controversy about whether these disciplines do produce propositional truths or, alternatively (and in my view), give only the logical consequences of assumptions that are assumed to be true.

For example, a “truth” of mathematics is that 16 divided by 2 equals 8.  More complex is the Pythagorean theorem: in a right triangle, the square of the length of the hypotenuse equals the sum of the squares of the lengths of the other two sides. This is “true”, but only in Euclidean geometry; it is not true for triangles on a curved surface.  The “truth” is seen only within a system of certain assumptions: a geometry that follows Euclid’s axioms, including being planar.  All mathematical “truths” are of this type.
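The contrast can be made explicit. For a right triangle with legs a and b and hypotenuse c, the planar theorem holds exactly, while on a sphere the analogous relation is different, and the familiar formula reappears only as the sphere’s radius grows without bound (a standard result, sketched here rather than derived in full):

```latex
% Euclidean plane:
a^{2} + b^{2} = c^{2}
% Sphere of radius R (right angle between the sides a and b):
\cos\frac{c}{R} = \cos\frac{a}{R}\,\cos\frac{b}{R}
% Expanding \cos x \approx 1 - x^{2}/2 for small x recovers
% a^{2} + b^{2} \approx c^{2}, so the planar "truth" is the
% R \to \infty limit of the spherical one.
```

In other words, which statement counts as the “truth” depends entirely on which geometric axioms you start from.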

What about philosophy? Truths in that field are things that follow logically. Here is a famous one:

All men are mortal;
Socrates is a man;
Therefore Socrates is mortal.

Well, yes, that’s true, but it’s true not just because of logic, but because empirical observations for the first two statements show they are propositional truths! If they weren’t true, the third “truth” (which was tested and verified via hemlock) would be meaningless.

Here’s another of a similar nature that came from a friend:

“All As are B; x is an A; therefore x is B—doesn’t depend on the content of A and B: it’s a *logical truth*.”

Again, the statement is indeed a logical truth, but not a propositional truth, because it cannot be tested to see if it’s true or false. Nor, without specifying exactly what A and B are, can the empirical truth of this statement be judged. I claim that all philosophical “truths”—logical truths without empirical input—are of this type.
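My friend’s schema can even be checked mechanically. In a proof assistant such as Lean, the syllogism is provable without saying anything at all about what A and B mean—which is precisely what makes it a logical rather than a propositional truth (a minimal sketch, assuming Lean 4 syntax):

```lean
-- "All As are B; x is an A; therefore x is B" as a theorem schema.
-- A and B are arbitrary predicates on an arbitrary type α; the proof
-- never consults their content.
example {α : Type} (A B : α → Prop) (x : α)
    (hAB : ∀ y, A y → B y) (hx : A x) : B x :=
  hAB x hx
```

The proof goes through for any predicates whatsoever; deciding whether “All men are mortal” is actually true of the world requires empirical input that no proof assistant can supply.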

When I told my friend this, I got the reply, “This is analytic philosophy. The people who do it work in philosophy departments and call themselves philosophers: and most philosophy BA and PhD programs require a lot of it. I’m sure any of our competent philosophers would be happy to supply hundreds of propositional truths that are philosophical.”  The friend clearly disagreed with my claim that philosophy can’t by itself produce propositional truths. Insofar as philosophy is an important area of academia, then, I am not sure that it’s a discipline engaged in producing or preserving truth.

Two caveats are in order. First, this is not meant to demean philosophy or argue that it doesn’t belong in a liberal education. It certainly does! Philosophy and mathematics are indispensable tools for finding truths, and philosophical training helps you think more clearly.  Unlike many scientists, I see philosophy as a crucial component of science, one that is used every day. Hypotheses that follow logically from observations, as in making predictions from them (e.g., Chargaff’s observation, before the structure of DNA was elucidated, that in organisms the amount of A equals the amount of T, and the amount of G equals the amount of C), are somewhat philosophical, and certainly logical. Dan Dennett is a good example of how one can learn (and teach others) to think more clearly about science with a background in philosophy.

Second, I do not feel strongly about what I said above. I am willing to be convinced that mathematics (but not necessarily philosophy) gives us propositional truths. There is, for example, a school of philosophers who accept “mathematical realism,” defined this way in Routledge’s Encyclopedia of Philosophy:

Mathematical realism is the view that the truths of mathematics are objective, which is to say that they are true independently of any human activities, beliefs or capacities. As the realist sees it, mathematics is the study of a body of necessary and unchanging facts, which it is the mathematician’s task to discover, not to create. These form the subject matter of mathematical discourse: a mathematical statement is true just in case it accurately describes the mathematical facts.

An important form of mathematical realism is mathematical Platonism, the view that mathematics is about a collection of independently existing mathematical objects. Platonism is to be distinguished from the more general thesis of realism, since the objectivity of mathematical truth does not, at least not obviously, require the existence of distinctively mathematical objects.

A corollary of this is my own claim (which is mine) that the objects and “truths” of mathematics and philosophy are inapplicable to all species outside our own, as only Homo sapiens can grasp, discover, and use them. The earth spins for all the creatures and plants upon it, but the integers and prime numbers are “real” only for us. (Do not lecture me that crows can count!)

I have read some of this controversy about mathematics, but it rapidly becomes abstruse and tedious, and so I’m proffering the view of a biologist, not a professional philosopher.  I am more open to the idea of mathematics producing truths than philosophy, simply because, as one reader once commented, “You can’t find out what’s true by sitting in an armchair and thinking.”

So it’s clear I’m soliciting readers’ views here to help clarify my own thinking. Comment away!

Three Royal Societies abandon their mission to promote global and universalist science

December 1, 2025 • 10:15 am

A Kiwi who wishes to remain anonymous (of course) sent me this link to an announcement of a meeting of three Royal (Scientific) Societies: those of New Zealand, Australia, and Canada. The screenshot below also links to two other short documents, a communiqué and a statement by the Presidents of all three Societies.

The object is severalfold: to eliminate “structural racism” and inequities in science, to tout “indigenous knowledge systems” as not only different and distinct from normal science, but as having contributed valuable knowledge to science in unique indigenous ways, and to assert that indigenous people have a right to “maintain, protect, and develop indigenous knowledge systems, intellectual property, and data.”

Click below (or above) to access the three statements.

The things I agree with are these:

a.) Members of ethnic minorities have surely been discriminated against in the past, and have had difficulty entering into modern (sometimes called “Western”) science

b.) There should be outreach, expanding opportunities for anyone who wants to do science to have a chance to participate

c.) “Indigenous knowledge”, insofar as it tells us something true about the universe, is indeed a part of modern science and should be considered thus

d.)  Any research done using the resources of indigenous people should be done with their permission, collaboration, and full participation

The things I question are these:

a.) Whether structural racism—meaning formalized discriminatory practices or policies (other words are “bias” or “bigotry”)—is still in place, preventing minorities in all three countries from doing science. In the U.S., universities are bending over backwards to recruit minorities, and I can’t think of an example of formalized bias, though of course some non-minority scientists will be bigoted (I’ve also not seen many of them).

b.) The extent to which indigenous knowledge has contributed to modern science. It’s telling that, as in nearly all such documents, these three tout this knowledge as invaluable but don’t provide a single example of the kind of advances that indigenous knowledge has promoted.

And the things I take issue with are these:

a.) Indigenous knowledge is a form of “knowledge” separate and distinct from that produced by modern science. As I’ve argued repeatedly, many forms of indigenous knowledge involve things that are nonscientific in the modern sense. For example, Mātauranga Māori (“MM”) from New Zealand is described by Wikipedia this way:

Mātauranga (literally Māori knowledge) is a modern term for the traditional knowledge of the Māori people of New Zealand. Māori traditional knowledge is multi-disciplinary and holistic, and there is considerable overlap between concepts. It includes environmental stewardship and economic development, with the purpose of preserving Māori culture and improving the quality of life of the Māori people over time.

MM includes not only practical knowledge, like how to catch eels or harvest mussels, but also superstition, word of mouth, tradition, religion, and codes of behavior. Some of it is knowledge in the “justified true belief” sense, but a lot of it is not. Those who know more about Australian and Canadian indigenous “ways of knowing” can weigh in here. And none of this comports with modern science in terms of pervasive doubt, hypothesis testing, experiments, statistics, and the whole armamentarium of modern science, which stopped being “Western” a long time ago. Modern science is practiced pretty much the same way the world over.

b.) While indigenous people can surely design experiments and publish their data, they do not have control over those data in the sense of disallowing other people to use them, or refusing to supply the primary data behind anything that’s published. While the present document doesn’t say this explicitly, it implies it, and other indigenous people in New Zealand have asserted more explicitly that data are proprietary.

Here are a few quotes from the three documents linked above (direct quotes are indented; my own comments are flush left):

A description of the meeting:

Over 3 days of keynote speeches, wānanga, cultural activities, and panel discussions, top Māori and Pasifika thought-leaders engaged with First Nations experts from Canada and Australia, including Fellows from five of Australia’s learned academies.

Key themes included the need to dismantle academic barriers and inequities for Indigenous students and researchers, share decision-making about research practices and priorities, and shape research agendas to focus on Indigenous knowledges and address challenges that are important to Indigenous Peoples.

Indigenous scholars and knowledge-holders talked about their experiences in academia, and presented research ranging from the study of Indigenous histories, cultures, knowledges, and languages to environmental management and traditional legal systems.

Indigenous scholars and knowledge-holders have championed and led education and research by, with, and for Indigenous communities, and have revitalised interest and awareness in traditional knowledges through language, cultural activities, and creative arts. Their work has explored and built on Indigenous knowledge systems to generate new insights and innovations – such as research methodologies and ethical frameworks based on traditional worldviews and values.

The advances touted for indigenous knowledge (note the absence of examples and yet the assertion that indigenous knowledge systems are separate and distinct “ways of knowing”). Bolding is mine:

 The Taikura Summit has continued and built on those exchanges, and we have now learned of the achievements and experiences of hundreds of Indigenous scholars and knowledge-holders. 

We have heard more about their journeys and achievements, and some of the myriad ways in which they are advancing understanding, particularly in the study of Indigenous histories, cultures, knowledges, and languages. These scholars and knowledge-holders have shown intellectual leadership by practising and advocating for research and education by, with, and for Indigenous communities. They have revitalised interest and awareness in Indigenous knowledge systems by connecting people through cultural activities, creative arts, and languages. 

Indigenous scholars and knowledge-holders have pioneered research practices, methodologies, and ethical frameworks, grounded in traditional worldviews and values, that uplift different ways of looking at challenges and have reshaped research practices across disciplines. Their work has shown that Indigenous knowledge systems are not simply historical artefacts, but living bodies of understanding that continue to evolve and to generate new insights. 

From the Communiqué (bolding mine):

 The Summit recognises that Indigenous Peoples are the rightful leaders, authorities, and stewards of research concerning their communities, territories, and knowledges. Indigenous research is grounded in distinct systems of knowledge, practice, and ethics that have sustained societies and ecosystems for millennia. These knowledge systems, sciences and artistic forms constitute rigorous and essential ways of knowing and understanding the world. They are not supplementary to other science methodologies. They have their own integrity and value. 

Note the clear statement that indigenous knowledge systems are “rigorous and essential ways of knowing and understanding the world” and “are not supplementary to other science methodologies.” This says that indigenous ways of knowing cannot simply fuse with science into a general understanding of the universe.  But indigenous ways of knowing, insofar as they incorporate anecdotal or observational evidence, are already fuse-able with modern science. It’s all part of understanding our universe.

Finally, also from the Communiqué:

We acknowledge the enduring impacts of research practices that have marginalised, misrepresented, or appropriated Indigenous knowledge. Correcting these legacies requires fundamental transformation within institutes of higher learning and learned academies. This includes:

• addressing structural racism and inequities, including for Indigenous people with diverse sexual orientations or gender identities,

• affirming the sovereign right of Indigenous Peoples to determine their own research priorities, methodologies, and outcomes, and

• enabling Indigenous Peoples to maintain, protect, and develop Indigenous knowledge systems, intellectual property, and data.

This part involves questionable assertions, such as the one about structural racism, as well as an implication—and I may be wrong here—that the products of indigenous science belong to the indigenous people. But one thing is for sure: nobody can control the outcome of their “research methodologies,” for you don’t do research if you have already determined its outcome.

So Canada and Australia have bought into the “other ways of knowing” mentality that’s long pervaded New Zealand.

I’ll give a few quotes from my anonymous Kiwi correspondent:

I think these statements have thrown science under the bus in all three countries. If our RSTA [Royal Society of New Zealand] still retained any credibility it’s lost it now. How can you make a blanket statement about indigenous knowledge being as rigorous as other “ways of understanding” when it spans everything from empirically verifiable knowledge to superstition? This legitimises any form of quackery or snake oil provided it’s sold under a banner of cultural authority – there are no standards of universal evidence.
I’m hoping that this will lead to change in RSTA, but Canada and Australia now have the same problem! All three scientific associations have abandoned their statutory claim to leadership and  responsibility for global and universalist science.
. . . It is appalling. Probably the worst thing for me is that it says to indigenous people that they have to choose between their culture and science. How we’ve got here is that relativist ideology has been used as a Trojan Horse to smuggle non-science into science. I see no difference between this and the separation between religion and science. Religion is also culture, and biblical creationism can equally be portrayed as a “way of understanding”. What’s lost is the epistemological distinctiveness of science.
The point is not that indigenous knowledge is all myth and superstition. It’s not. But if the products of different “ways of understanding” are only legitimately viewed through their own “cultural” lens then everything devolves into a political battle – a Foucauldian universe. I think at its heart this is activist politics, and so-called science leaders have fallen for it.
Well, read above and judge for yourself. What science and scientists should ensure is that indigenous knowledge, if it’s to be considered a real “way of knowing,” comports with the knowledge produced by modern science. We cannot water down science by mixing it with legend, myth, unsupported assertions, or religion. When it comes to science, we cannot indulge in “the authority of the sacred victim.”