This piece is from the blog of the Heterodox Academy (HA), a group founded by Jon Haidt, Chris Martin, and Nicholas Rosenkranz to promote viewpoint diversity and counteract academic and ideological conformity, especially of the authoritarian sort. They regularly publish articles, and have several discussion groups, including one about STEM matters.
Last year I wrote about the HA’s “Campus Expression Survey“, in which they surveyed U.S. college students about their willingness to discuss controversial topics. Many students were reluctant to talk about controversial subjects, though not a majority for any single topic. This is what they found, summarized in the article below as well as in a recently published paper (click on the screenshot below to read the former):
Between September and November 2021, Heterodox Academy (HxA) surveyed 1,495 full-time college students ages 18–24 across the United States as to how comfortable or reluctant they were to speak their views in the classroom on five core controversial topics — politics, race, religion, sexual orientation, and gender — as well as one specific controversial topic (the COVID-19 pandemic). Students also reported their comfort or reluctance to speak their views about noncontroversial topics for comparison. The HxA researchers found that 60% of US participants expressed reluctance to discuss at least one controversial topic. Students who reported having low interaction quality with classmates (i.e., not much opportunity to get to know other students) also reported higher reluctance to discuss all five of the core controversial topics.
That’s a reasonable sample, but in general about 25–40% of students were reluctant to discuss each topic in the classroom, with 60% unwilling to discuss at least one topic. Some of this surely reflects chilled speech and fear of not sharing “tribal views”, but some of it must be general shyness. It’s not clear what the percentage would be if nobody were afraid of demonization, for even in that case some students would be reticent to speak about stuff!
Now the HA took its survey to New Zealand, and the comparison is given in the following short piece. Overall, NZ students aren’t that much different from American ones:
The rationale for studying New Zealand students:
These trends from the US campuses may seem worrying. It is possible, though, that these views reflect only the United States, with its two-party system and high rate of polarization. How similar is the situation in British Commonwealth countries like New Zealand?
Unlike the United States, New Zealand has a progressive parliamentary democracy, although the country is of course not free of political disagreement. The political system of New Zealand grapples with issues that drive political divisions in the United States as well, including racial prejudice, gun laws, vaccination, taxation, and climate change. However, on the whole, New Zealand society does not display the deep partisan mistrust that characterizes American society.
. . . Bradley Wendel has written about significant differences in the notion of fairness and trust in the government that separate the American and the New Zealand political systems.
. . . New Zealand is also a good comparison as the country has similar issues around political disagreements as the United States and shares the same social issues, including prejudice, inequality, vaccination, taxation, and climate change, that drive political divisions in the United States. At the same time, it is free of the partisan mistrust that characterizes much of American society. It is quite possible that the pattern of responses by New Zealand students would differ from their US counterparts. To find out if this is true, we replicated the US survey with 792 undergraduate students across three of New Zealand’s largest universities.
The answer is simple: yes, Kiwi students are just as wary as American students are of sharing their views in the classroom. The new survey involved 792 undergrads at three New Zealand universities. In this case, race wasn’t surveyed as a “hot topic.” Though New Zealand doesn’t have the black-white divisions that we do in America, they do have their own racial issues, with the Māori people citing pervasive racism and oppression. The failure to ask about this is not explained.
The overall figures are about the same. (Curiously, they didn’t ask what percentage of students would be reluctant to discuss at least one of these topics.)
The authors of the Soc. Sci. paper conclude this way:
The results are clear: chilled campus speech is not unique to the United States. The results do not, however, support a universal phenomenon. Like any country, New Zealand is quite distinct from the United States on some dimensions, but very similar on others. It is not possible from an analysis of New Zealand alone to tell which dimensions are relevant to campus expression or the extent to which results are the consequence of American cultural exportation. Our results ultimately represent just one, albeit significant, dataset, and we encourage other researchers to administer their own versions of the survey to their own students—and academic staff—to create a more accurate picture of the international situation on university campuses.
It’s clear that the differences among topics don’t reflect simple shyness or reticence, as the values would be more equal if that were true. But it’s also not clear how much of the reluctance to speak is due to fear of opprobrium (“chilling”) as opposed to simple shyness or unwillingness to speak in general. At least the figures don’t go above 50%—but remember that this is self-report. I would expect the true figures to be a bit higher than this.
Well, this doesn’t count as wildlife, but it does refer to the excretory habits of one species of primate. As contributor Athayde Tonhasca Júnior notes, “I strongly suspect that this subject has not been approached before in your website. . . ” Indeed! Apparently these are loo-related photos from his travels. Athayde’s notes are indented, and you can click on the photos to enlarge them.
A visit to the toilet (room), bathroom, restroom, washroom, or lavatory, is an opportunity for reflection and introspection, or to seek refuge, peace and quiet. Indeed, British men allegedly spend seven hours per year in the toilet hiding from their wives and children (according to “research” commissioned by a bathroom furniture company). But the loo – or bog, can, head, john, or latrine – can also be a place of amusement and learning.
A flamingo on duty to check your hand-washing technique in Bologna, Italy.
Unfortunately this educative and lyrical message was removed from a dentistry practice in Perth, UK:
A health warning in Scots, which is a language, a dialect or bad English, depending on who you ask (and their political views). The UK government and the European Union recognise Scots as a minority language, but many linguists place it somewhere on a dialect continuum. To the chagrin of nationalists, Scottish heavyweights Adam Smith and David Hume considered the use of Scots as an indication of poor education.
An emergency cord is great, but what if you want to order a pizza or dry your hair while bombing the bowl? (Hotel in Padua, Italy):
My travelling companion was displeased with the facilities in a Padua cafe. Squat toilets are terrible for the elderly or disabled, but they have a great advantage: you don’t need to touch anything. You learn to appreciate them when you hear the call of nature in the back of beyond. They are also better for your health, supposedly:
A latrine in the Housesteads Roman Fort, Britain, on the northernmost edge of the Roman Empire. Circa 200 AD:
Marcus: Salve, Quintus.
Quintus: Ave, Marcus. Are you well? You look a bit green around the gills.
Marcus: Tell me about it. I think that batch of garum from Rome was off.
Quintus: I hear you.
Cornelius: I hear you too, Marcus. Loud and clear! Ha-ha! Say, chaps, wouldn’t you have a spare sponge on you?
A tersorium (a sea sponge on a stick) supposedly used by the Romans to wipe themselves after using the latrine. The sponge may have been washed in a gutter with running water, or in a bucket of water, salt and vinegar. But not everyone agrees with this popular tale (kids love it). According to Gilbert Wiplinger (Austrian Archaeological Institute), the tersorium may have been nothing more than a toilet brush. Read his gripping account in the Proceedings of the International Frontinus-Symposium on the Technical and Cultural History of Ancient Baths, Aachen, Germany, 2009.
Sign in a loo in an antechamber of Perth’s Sheriff Court House. One must be at rock bottom to shoot up before facing a sheriff (a Scottish judge with powers to fine or lock you up for up to five years). For the last seven years, Scotland has maintained the unenviable first place in Europe for drug-related deaths; drugs in Scotland have a death rate almost four times the rate in the UK as a whole. These figures – together with failing education, economy and health indicators – are secondary for people in power. The one-track-mind Scottish National Party cares for little else besides breaking up the union:
Epiphany inside a loo in Perth, UK:
The facilities in the family home (today a museum) of Brazilian painter Cândido Portinari (1903-1962) in the town of Brodowski, São Paulo State, illustrate a time when homes were not cluttered with stuff and had plenty of space to spare:
Collector, philanthropist and extremely rich Ema Klabin (1907–1994) needed the loo to store some of her many priceless pieces of art. Her house in São Paulo is a museum (Fundação Cultural Ema Gordon Klabin) well worth visiting. Entrance is free:
A replica of a once common warning to men in public urinals, hotels and railroad stations in the UK. Not doing up all the buttons of your trousers (no zippers then) was a grave indiscretion:
That’s not nice. At all:
Able young non-pregnant adults can use the loo in the petrol station across the road:
In a cafe in the Brazilian coastal city of Ubatuba, you are not allowed to flush yourself. Presumably to prevent polluting the sea:
“Use the toilet as you have committed a crime: don’t leave clues behind” (loo in a São Paulo bookshop):
I found this article fascinating, and the explanations intriguing. Two YouGov polls surveyed 1,000 Americans each (2,000 total) in January, asking people to estimate the proportions of Americans in 43 different groups. A large number of these estimates were wildly inaccurate: estimates for minority groups were way too high, while estimates for “majority” groups (e.g., “Christians”) were too low. Read on; there’s a sociological explanation for such mis-estimation, though I don’t know how well supported it is.
Click to read (it’s free):
First the data, with calculations explained in the figure:
The pattern is one of overestimating the sizes of minority groups and underestimating sizes of majority groups. Groups hovering around the middle tend to be estimated more accurately:
When people’s average perceptions of group sizes are compared to actual population estimates, an intriguing pattern emerges: Americans tend to vastly overestimate the size of minority groups. This holds for sexual minorities, including the proportion of gays and lesbians (estimate: 30%, true: 3%), bisexuals (estimate: 29%, true: 4%), and people who are transgender (estimate: 21%, true: 0.6%).
It also applies to religious minorities, such as Muslim Americans (estimate: 27%, true: 1%) and Jewish Americans (estimate: 30%, true: 2%). And we find the same sorts of overestimates for racial and ethnic minorities, such as Native Americans (estimate: 27%, true: 1%), Asian Americans (estimate: 29%, true: 6%), and Black Americans (estimate: 41%, true: 12%).
A parallel pattern emerges when we look at estimates of majority groups: People tend to underestimate rather than overestimate their size relative to their actual share of the adult population. For instance, we find that people underestimate the proportion of American adults who are Christian (estimate: 58%, true: 70%) and the proportion who have at least a high school degree (estimate: 65%, true: 89%).
The most accurate estimates involved groups whose real proportion fell right around 50%, including the percentage of American adults who are married (estimate: 55%, true: 51%) and have at least one child (estimate: 58%, true: 57%).
This tendency to overestimate small groups and underestimate large ones has been seen in other studies. The data that fascinate me are of course the wild overestimates of the populations of Jews and Muslims (often cited as trying to “take over the country”), as well as of atheists and gays. I thought everybody had a rough idea of the proportion of blacks in the U.S., but this, too, is grossly overestimated. And how people can think that 30% of Americans live in New York City eludes me (30%, like all the figures, is the median among guesses). If that were true, the city would have a population of about 100 million!
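The New York City arithmetic is easy to check. Here’s a quick sketch (the figure of roughly 330 million for the U.S. population is my assumption; the post doesn’t state it):

```python
# Sanity check: if 30% of Americans lived in New York City,
# how many people would that be?
us_population = 330_000_000          # assumed total U.S. population
implied_nyc = 0.30 * us_population   # the median survey guess applied to it

print(f"{implied_nyc:,.0f}")         # prints 99,000,000 — about 100 million
```

For comparison, the city’s actual population is under 9 million, so the median guess is off by more than an order of magnitude.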
On the underestimate side, disparities are smaller, but one of them surprises me: the median estimate of the proportion of people who have read a book in the last year is just 50%, while the actual figure is 77%. I’m not sure about the reason for this disparity, but I’m still horrified that only about 3/4 of Americans have read a book in a whole year (frankly, I would have guessed that it would be less).
The authors note that the overestimates of minority groups aren’t likely to be due to fear of such groups, since actual members of those groups tend to show the same degree of overestimation as do non-members. That, too, baffles me. How could a Jew think that 30% of Americans are Jewish? I always knew it was about 2%, and that’s the correct proportion.
Now, what’s the explanation? Here’s what YouGov says:
Why is demographic math so difficult? One recent meta-study suggests that when people are asked to make an estimation they are uncertain about, such as the size of a population, they tend to rescale their perceptions in a rational manner. When a person’s lived experience suggests an extreme value — such as a small proportion of people who are Jewish or a large proportion of people who are Christian — they often assume, reasonably, that their experiences are biased. In response, they adjust their prior estimate of a group’s size accordingly by shifting it closer to what they perceive to be the mean group size (that is, 50%). This can facilitate misestimation in surveys, such as ours, which don’t require people to make tradeoffs by constraining the sum of group proportions within a certain category to 100%.
This reasoning process — referred to as uncertainty-based rescaling — leads people to systematically overestimate the size of small values and underestimate the size of large values. It also explains why estimates of populations closer to 0% (e.g., LGBT people, Muslims, and Native Americans) and populations closer to 100% (e.g., adults with a high school degree or who own a car) are less accurate than estimates of populations that are closer to 50%, such as the percentage of American adults who are married or have a child.
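The rescaling idea can be sketched numerically. This is my own toy illustration, not the meta-study’s actual model: an uncertain respondent blends a privately observed proportion with the presumed mean group size of 50%.

```python
# Toy sketch of "uncertainty-based rescaling" (illustrative only):
# a respondent shifts an observed proportion toward a 50% prior,
# weighted by how much they trust their own experience.
def rescaled_estimate(observed_pct, confidence):
    """Blend an observed proportion with the 50% prior.

    confidence = 1.0 means full trust in lived experience;
    confidence = 0.0 means falling back entirely on the 50% prior.
    """
    return confidence * observed_pct + (1 - confidence) * 50

# With a hypothetical confidence of 0.4, small true values inflate
# and large true values deflate, roughly matching the survey pattern:
print(rescaled_estimate(2, 0.4))    # true 2% (Jewish Americans)  -> ~30.8
print(rescaled_estimate(70, 0.4))   # true 70% (Christians)       -> ~58.0
print(rescaled_estimate(51, 0.4))   # true 51% (married adults)   -> ~50.4
```

Note how the mid-range value barely moves while the extremes are pulled sharply toward 50%, which is exactly the pattern of accurate mid-range estimates and distorted extremes described above. The confidence weight of 0.4 is a number I chose for illustration, not anything estimated in the study.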
I suppose this could be called “psychological regression to the mean.” It doesn’t fully convince me, though, because I’d think people would go on their “lived experience” rather than assume their experience has given them a biased sample of the size of a group. But I haven’t read the meta-study in the link.
All of us bandy about the term “critical race theory”, or use its initials, CRT. But how many of us really know what it is? And IS there really a widely-accepted canon of thought called CRT? If you were to ask me, I’d say CRT is the view that all life is a fight for power and hegemony of socially-constructed “races” that have no biological reality, that all politics is to be viewed through the lens of race, that the “oppressors” are, by and large, all biased against minorities and fight endlessly to keep them powerless, with many of the oppressors not even knowing their bias, and that different kinds of minority status can be combined into an “intersectionality” so that someone can be oppressed on several axes at once (for example, a Hispanic lesbian).
But not everybody agrees with that, and in fact there are widely different versions of CRT depending on the exponent (Ibram Kendi is perhaps the most extreme in his pronouncements), and also on the country. In the article below at Counterweight, Helen Pluckrose, co-author with James Lindsay of the good book Cynical Theories, tries to parse a meaning of CRT from all the diverse construals.
It turns out that because there are so many versions of CRT, perhaps (in my view) it’s best to stop using the term at all.
Click on the screenshot to read:
There’s Materialist CRT, Postmodernist CRT, the British Educational Association’s CRT, Critical Social Justice Anti-Racism, and even a version for higher education confected by Payne Hiraldo (a professor of the University of Vermont). I won’t give them all here, and of course there’s considerable overlap. Here’s what Helen says are the tenets from the book Critical Race Theory: An Introduction, with her interpolations. Her words are indented, and the tenets are doubly indented and put in bold:
Critical Race Theory: An Introduction describes it as a departure from liberal Civil Rights approaches:
Unlike traditional civil rights discourse, which stresses incrementalism and step-by-step progress, critical race theory questions the very foundations of the liberal order, including equality theory, legal reasoning, Enlightenment rationalism, and neutral principles of constitutional law.
and sets out four key tenets:
First, racism is ordinary, not aberrational—“normal science,” the usual way society does business, the common, everyday experience of most people of color in this country.
This is a claim that racism is everywhere. All the time. It’s just the water we swim in. It’s also claimed that most people of colour agree with this. In reality, people of colour differ on this although a greater percentage of black people believe it to be true than white people.
Second, most would agree that our system of white-over-color ascendancy serves important purposes, both psychic and material, for the dominant group.
This means that this system, which has just been asserted to exist everywhere, is valued by white people both psychologically and in practical terms. Many white people would disagree that they regard racism positively.
A third theme of critical race theory, the “social construction” thesis, holds that race and races are products of social thought and relations. Not objective, inherent, or fixed, they correspond to no biological or genetic reality; rather, races are categories that society invents, manipulates, or retires when convenient.
This argues that races are social constructs rather than biological realities which is true – “populations” are the biological categories and don’t map neatly onto how we understand race – and that society has categorised and recategorised races according to custom, which is also true. [JAC: I’d take issue with the claim that there is no biological “reality” at all to populations, races, or whatever you call ethnic groups. The classical definition of “race” is incorrect, but the view that races have no biological differences and are thus completely socially constructed, is also wrong.]
A final element concerns the notion of a unique voice of color. Coexisting in somewhat uneasy tension with antiessentialism, the voice-of-color thesis holds that because of their different histories and experiences with oppression, black, American Indian, Asian, and Latino writers and thinkers may be able to communicate to their white counterparts matters that the whites are unlikely to know. Minority status, in other words, brings with it a presumed competence to speak about race and racism.
There is much evidence that there is no unique voice of colour, and although there is good reason to think that people who have experienced racism may well have more perspective on it, they tend to have different perspectives. CRTs are more likely to regard those who agree with them as authoritative than those who disagree – i.e “Yes” to Derrick Bell and Kimberlé Crenshaw but “No” to Thomas Sowell or Shelby Steele.
After you work your way through Helen’s long piece, you realize that you simply cannot use “Critical Race Theory” unless you specify exactly what version you’re talking about. In fact, I’d say it’s best to ditch the phrase altogether and just discuss the claims. I believe that’s Helen’s conclusion as well:
If it helps to call the current anti-racist theories “contemporary critical theories of race” rather than “Critical Race Theory”, do so, but for goodness’ sake, let’s stop the endless quibbling about terminology and talk about the ideas that have deeply infiltrated universities, employment, education, mainstream media, social media and general culture.
This is vitally important for two reasons. Firstly, we need to be able address racism in society ethically and effectively. Secondly and relatedly, individuals need to be allowed to have their own views about how racism works and their own ethical frameworks for opposing it. They need to be able to discuss and compare them. This will help with achieving the first goal.
When it comes to discussing contemporary critical theories of race, we need to be able to talk about what the current theories actually say and advocate for and whether they are ethical and effective. Many people from a wide range of political, cultural, racial, religious and philosophical backgrounds would say “No” they are not, and they should be able to make their case for alternative approaches.
It is also vitally important that we are able to talk about how much influence these theories already have and how much they should have on society in general and on government, employment, mainstream media, social media and education in particular, and whether this influence is largely positive or negative. From my time listening to clients of Counterweight, I would respond, “Way too much” and “Largely negative” to these questions.
She ends with what are perhaps the most important questions, and can’t resist injecting her own opinion. Others may differ, but she says she has an open mind:
Most importantly, we need to be able to measure and discuss what effects these theories have on reducing racism, increasing social cohesion and furthering the goals of social justice. Are they achieving that or are they increasing racial tensions, decreasing social cohesion and being the driving force for many injustices in society while creating a culture of fear, pigeonholing people of racial minority into political stereotypes, and silencing the voices of those who dissent? I strongly believe, based on the reports coming into Counterweight, that it is the latter. However, I am willing to be persuaded to think differently, so let’s talk.
In the end, the theory is important only if we can get data supporting or contradicting it.
I usually avoid TED talks because they smack too much of motivational speech: like the advice of Matt Foley, who lives in a van down by the river and eats government cheese. But this one popped up when I was watching YouTube, and, listening to the introduction, I was drawn into it.
The speaker, Robert Waldinger, is director of the Harvard Study of Adult Development, a project that’s been going on for 75 years. The researchers studied 724 men over that period, asking them how they were doing and what they were doing every two years until the men died. They also did personal interviews, got medical records, and even drew the subjects’ blood.
There were two groups in the original study, which has been ongoing since the 1930s: Harvard sophomores and a “control” group of boys who came from troubled and disadvantaged families in poor parts of Boston.
Sixty of the original 724 men are still alive, and now their children are being studied as well: 2,000 more subjects. Women have been added at last. This represents an unparalleled study of what factors make for a happy and healthy life.
The answer, which may seem anodyne to you, nevertheless contradicts the Millennial answer Waldinger describes, which is the view that having fame and money make for a good life. (A “good life” is one in which the person is both healthy and happy and lives a long time.) I’ll let you listen to the video for yourself.
I think this 13-minute talk is worth hearing, both for your own well being and, perhaps, to help other people. But maybe you’ll see it as obvious and trite.
By the way, Waldinger is a psychiatrist and (disappointingly to me) a psychoanalyst and is also a Zen priest.
Is the phrase “short primer” redundant? If so, forgive me. At any rate, there’s a pretty evenhanded treatment of CRT, covering its main tenets and its implications, in Forbes. You can see it by clicking on the screenshot below:
The author’s bona fides: Redstone is “the founder of Diverse Perspectives Consulting and a professor of sociology at the University of Illinois at Urbana-Champaign. [She is] the co-author of Unassailable Ideas: How Unwritten Rules and Social Media Shape Discourse in American Higher Education and a faculty fellow at Heterodox Academy.”
Her main point is that Critical Race Theory “forms a closed system”, a “perspective that leaves no space for anyone, no matter how well-intentioned, to see the world differently.” In other words, it brooks neither dissent nor discussion.
Her concerns are these:
CRT’s critics are often portrayed as wanting to “whitewash” history and deny the reality of slavery. If the problem were that simple, the criticisms would indeed be worthy of the dismissal they often receive. Yet, there are serious concerns about CRT that are rarely aired and that have nothing to do with these points. As a result, confusion and misinformation abound and tension continues to mount.
She lays out what she sees as the four main tenets of the theory as it’s presented in schools or to the public. Note that these differ from conceptions of CRT offered by scholars in academia. Quotes from the article are indented; any comments of mine are flush left.
1. Colorblind racism—Deemphasizing the role of race and racism, including to focus on concepts of merit, is itself a manifestation of racism.
2. Interest convergence—Members of the dominant group will only support equality when it’s in their best interest to do so.
3. Race and racism are always tied together. Race is a construct meant to preserve white dominance over people of color, while making it seem like life is about meritocracy.
4. Inattention to systemic racism—An unwillingness to recognize the full force of systemic racism as determining disparities between groups is a denial of the reality of racism today (and evidence of ignorance at best and racism at worst).
I’d add to that the following three points, which are mine. (Actually, points 5 and 6 come from Ibram Kendi and point 7 from Robin DiAngelo and many others):
5. (Really a supplement to point 4): Inequalities in the representation of groups, for example disproportionately low numbers of people of color in STEM fields, are prima facie evidence of current and ongoing racism in those fields and not a historical residuum of racism in the past.
6. The only way to rectify this kind of systemic racism resulting from ongoing discrimination is to discriminate in favor of minorities (i.e., affirmative action, dismantling meritocracies, etc.). As Kendi said, “The only remedy to racist discrimination is antiracist discrimination. The only remedy to past discrimination is present discrimination.”
7. Every white person, whether they know it or not, is a racist, embodying, even unconsciously, the tenets of white supremacy instantiated in point 3 above.
According to Redstone, the downside of promulgating CRT is that all criticism of the theory is immediately dismissed as racism, so that there is no room for “principled concerns some may have about seeing every aspect of society through the lens of race and power.” Further, it may be hard to restructure society, she avers, when all social problems are fobbed off on either racism or ignorance.
Finally, in this short piece she gives her recommendations for people on all sides of the political spectrum, as well as for schools and the mainstream media. I quote:
To conservatives: Stop trying to enact legislative bans on CRT. Such bans are censorious, probably unconstitutional, and, simply put, will do nothing to solve the underlying problem.
To progressives: Stop talking about CRT and, more importantly, its related ideas as though objections to it and concerns about it are all driven by a denial of systemic racism or an unwillingness to acknowledge the reality of slavery. As I’ve pointed out here, this is to grossly miss the point. The importance of this point stands even if the loudest critics are not raising the concerns I’ve outlined here.
To the mainstream media: See advice for progressives, above.
To schools and workplaces: Critical Race Theory is a social science theory—a tool to understand the world around us. As a theory, its related ideas about race, identity, power, and fairness constitute one possible way to see the world. As with any social science theory, but particularly one this controversial, its ideas should be placed in context. Placing the ideas in context requires presenting contrasting viewpoints—for instance, perspectives that do not automatically assert that racialized explanations and solutions should be the primary lens for viewing the world. Importantly, these contrasting viewpoints are to be presented on moral footing that’s equal to CRT’s.
I can’t say I disagree with any of these prescriptions. The presentation of CRT as a given that brooks no dissent is particularly troubling to me as a scientist, because, after all, it is a “theory” and can’t be taken as absolute truth. My points #5 and #7, for example, are dubious and, I think, palpably false assertions. Yet if you raise objections, you’re not only typed as a racist yourself, but demonized. We have to beware of a theory that is presented as prima facie truth, for, like CRT, it constitutes a system that, because it cannot be shown to be wrong, cannot be assumed to be right.
This is not to say, of course, that racism doesn’t exist, or hasn’t shaped our country profoundly. It does and it has. But it’s not the only problem we face (there’s the matter of class inequality, for instance), and even fixing racial inequality is far more difficult than some adherents to CRT suggest. (Effacing history, for example, by removing statues or renaming buildings, while such efforts may be warranted, will accomplish almost nothing.) And CRT won’t touch the issue of anti-Semitism.
Here Helen has an eight-minute interview with Steve Pinker. (Note that there’s a photo of Cape Cod in the background, where Steve and Rebecca repair to their second home.) It’s mostly about wokeness and how to combat it.
The literature of identity politics and social justice, with or without capitals, is full of assertions that this or that system, conception, or object is a “social construct.” This is nearly always claimed without defining “social construct,” though most of us have a vague idea that the term means something that lacks an objective reality independent of human social agreement. And it’s usually used dismissively—not to deny something like gender identity or racism—but to deny that they exist independently of human thought. That is, the claim that “race is a social construct” is taken to mean that “there is no objective reality to the concept of race, which was simply created by humans” (the usual reason is to give groups power over other groups), but nevertheless race is seen and treated as real in the same way that the idea of a monarchy (see below) is treated as real.
I decided to look up various definitions of “social construct” to see if my notion was true. It turns out that, by and large, it is, and most definitions are pretty similar. Below, for example, are four definitions with links (given definitions and glosses are indented).
Oxford English Dictionary: A concept or perception of something based on the collective views developed and maintained within a society or social group; a social phenomenon or convention originating within and cultivated by society or a particular social group, as opposed to existing inherently or naturally.
Merriam-Webster: an idea that has been created and accepted by the people in a society. “Class distinctions are a social construct.”
Macmillan Dictionary: a concept or belief that is based on the collective views of a society rather than existing naturally
yourdictionary.com: Social constructs develop within a society or group. They don’t represent objective reality but instead are meaningful only because people within the society or group accept that they have meaning. Simply put, social constructs do not have inherent meaning. The only meaning they have is the meaning given to them by people.
For example, the idea that pink is for girls and blue is for boys is a social construct related to gender and the color of items. The collective perception that a particular color can be associated with a certain gender is not an objective representation of truth or fact. Instead, it is a social convention that came to have meaning within the context of society.
So I was correct: “social constructs” are ideas or objects or notions that do not exist independently of human decision making and social agreement. They are not “real” in the sense that without social agreement about what they mean, they would have no objective reality. Or so it is claimed.
Now the term “reality”, of course, is slippery. Money is certainly real as paper currency, but the agreement that it can be used to purchase goods, and that it has an ascribed value, is a social construct. A Martian sociologist could observe the bills themselves, but the value of a dollar would have to be ascertained by observing how it’s used. And the British monarchy is real, though it wouldn’t exist without social agreement. I won’t go on in this vein, as it leads into psychological hinterlands where I would be criticized by some no matter what I said. I simply present the definitions given above.
Now, here is a list of examples of “social constructs” along with my rough take on whether I think they really do adhere to the definitions above. You can find more examples here.
gender. Gender and gender roles are multifarious, and more are devised each day. Terms like “genderfluid” do describe real behaviors, but “genderfluid” as a given category seems to me a social construct.
gender roles. Same as above, though the behaviors may stem from biology. I would have trouble, for example, with the idea that being bisexual is “just” a social construct, for it does describe people who are attracted to members of both sexes. And there may be a biological basis for this.
sex. As I’ve argued at length, sex is a biological and objective reality: in nearly all animals it is a binary category resting strictly on gamete size. So while gender may be a social construct, sex, as in “biological sex”, is not.
sex roles. This is a mixture of objective reality and social construct. The fact that men are generally attracted to women and vice versa (with numerous exceptions, like homosexuality) has an evolutionary basis and is not something agreed on by society. And some behavioral differences between the sexes, like aggression and risk-taking, are, I think, not social constructs but partly encoded in our DNA by natural selection. Other “roles”, like the notion that guys should like blue and girls pink, are clearly social constructs.
religion. Despite the claim that people have an inborn desire to apprehend and worship deities or divine concepts, I see religion as a social construct. It may have a biological origin, as some claim (ascribing mysterious events to specific causes), but religion in the form we know it is clearly something devised by humans. I also don’t think that if we wiped out all religious sentiment from the planet, it would return with anything near the ubiquity it has today. We simply know too much about what makes things happen, and we still have no evidence for gods.
social class system. It’s an objective fact that some people are smarter than others and some make more money than others. But the idea that this makes some people superior to others is clearly a social construct, and a maladaptive one. Indian castes are similar, but they have been separated for so long, through their historical origins as well as prohibitions on intermarriage, that there are now correlations between one’s caste and one’s genes.
monarchy. A social construct and, I think, another maladaptive one.
marriage. A social construct; many societies don’t have marriage in the way we know it. The rules, rituals, and laws about marriage have all been made up by society.
countries. Clearly social constructs based on human history and either warfare or general agreement among different groups of people.
money (see above).
biological species. Not a social construct in general, but a reality existing independently of humans, most obvious in sexually reproducing animals but also in many plants (animals, after all, choose to mate with members of their own species, and that choice has nothing to do with human consensus). For a full-scale justification of species as real groups, independent of human conception, see Chapter 1 of my book with Allen Orr, Speciation.
disability. Another slippery one. Clearly someone who has lost their sight or their limbs is not as “able” to do some things as people who are relatively intact, though they may develop compensatory skills (like more acute hearing in the blind) that make them “super able” in other ways. Ergo the term “differently abled.” In general, disability combines an objective reality (the loss itself, as in blindness) with a social convention (our moral view that the disabled deserve interventions that compensate for such losses and enable them to participate more fully in society).
I should add here that I see morality as perhaps the most prominent social construct, for while it’s a fact that societies have moral systems, the specific actions viewed as “good” or “bad” have no objective justification, or even a label, independent of human agreement.
race. This is the most hot-button of the topics, so I’ve saved it for last. Clearly race is a “social construct” if by the term you mean that “races are absolutely distinguishable groups of individuals with substantial and diagnostic genetic differences.” The old Carleton Coon races of “Caucasoid, Mongoloid, Capoid, Congoid, and Australoid” have gone down the drain.
On the other hand, multi-site genetic analysis shows, in general, that humans do fall into groups largely distinguishable from their DNA, though those groups are overlapping and show gene mixing, so that many individuals cannot be said to fall into a given group. But the grouping of humans can, with fair accuracy, give an idea of someone’s geographic origins and ethnicity, because it reflects an ancient geographic separation of populations that led to their genetic differentiation. As we know, the amount of diversity within any given group exceeds the diversity between groups, but that doesn’t mean that you can’t use multiple segments of DNA, combined, to diagnose someone’s ancestry and ethnicity.
Multilocus groupings of humans, for example, show that they can be divided into various fairly distinct genetic clusters, ranging from four to seven, which correspond roughly to areas where humans were genetically isolated (Africa, Oceania, East Asia, the Americas, etc.). In the U.S., multi-site cluster analysis identifies four clusters, corresponding to whites, Hispanics, African-Americans, and East Asians (Chinese and Japanese). Further, when you assign someone’s genetic profile to one of those four clusters and then ask them, without that knowledge, what their self-identified “race” is, the match between genetics and self-identification is remarkable. As the paper of Tang et al. notes:
“Of 3,636 subjects of varying race/ethnicity, only 5 (0.14%) showed genetic cluster membership different from their self-identified race/ethnicity.”
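The statistical point here can be illustrated with a toy simulation. (This is a hypothetical numpy sketch, not the actual method or data of Tang et al.; the allele frequencies and the per-locus difference between the two simulated populations are invented for illustration.) Each locus differs only modestly between two populations, so no single locus is diagnostic, yet combining a few hundred loci lets even a plain k-means clustering recover population of origin almost perfectly:

```python
# Toy illustration: individual loci overlap heavily between two
# hypothetical populations, but clustering on many loci combined
# recovers population of origin. Not the Tang et al. analysis.
import numpy as np

rng = np.random.default_rng(0)
n_per_pop, n_loci = 100, 300

# Assumed allele frequencies: population B is shifted by 0.2 at
# each locus, a difference far too small to diagnose from one locus.
p_a = rng.uniform(0.3, 0.5, n_loci)
p_b = p_a + 0.2

# Genotypes: count of "1" alleles (0, 1, or 2) at each locus.
geno_a = rng.binomial(2, p_a, (n_per_pop, n_loci))
geno_b = rng.binomial(2, p_b, (n_per_pop, n_loci))
X = np.vstack([geno_a, geno_b]).astype(float)
labels = np.array([0] * n_per_pop + [1] * n_per_pop)

# Plain 2-means (Lloyd's algorithm) on the combined genotype matrix,
# initialized from one individual of each population.
centroids = X[[0, -1]].copy()
for _ in range(20):
    d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
    assign = d.argmin(axis=1)
    for k in range(2):
        if (assign == k).any():
            centroids[k] = X[assign == k].mean(axis=0)

# Agreement between cluster membership and true origin (up to label swap).
acc = max((assign == labels).mean(), (assign != labels).mean())
print(f"cluster/ancestry agreement: {acc:.2f}")
```

With only one or a few loci the clusters would overlap hopelessly; it is the aggregation over many loci that makes ancestry diagnosable, which is the point of the passage above.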
I won’t cite other studies showing that you can identify the location of one’s genetic ancestors with remarkable accuracy. The point is that this correspondence between genes and ancestry, and between phenotype (correlated with ancestry) and genes, means that “race”, while a loaded term—I use “ethnic groups” as a substitute—has some basis in biological reality and therefore is not a social construct. If the concept of “race” (or “ethnicity”, as I prefer to say) were purely an agreement among people in a society, having nothing to do with objective reality, you wouldn’t see the correspondence between how people identify themselves and the code in their DNA. I hasten to add that these biological identifiers of races say nothing about hierarchies, but they are biologically and evolutionarily meaningful.
All this discussion goes to show several things. First, the concept of a “social construct” is bandied about widely, but it is often used inaccurately or not defined at all. Some things seen as social constructs, like sex and race (or species, for that matter, as some misguided biologists have asserted that species in nature are purely human-defined segments of a biological continuum), actually have an objective reality independent of human consensus. Others, like a monarchy or Mormonism, are purely the results of a human consensus. Thus you need to explain what you mean when you claim that something is a “social construct”, and explain why that concept has no objective reality but is purely the result of social agreement.
Two of the last holdout areas for religion—countries and regions that have historically been resistant to nonbelief—are now becoming surprisingly secular. Those are Ireland in the West and seven countries in the Middle East—at least according to recent surveys. The stunning thing about both areas is how fast the change is coming.
Let’s take the Middle East first. There are two studies mentioned in the article below in Deutsche Welle (click on screenshot):
The article itself gives data for only Iran, but you can find data for six other countries by clicking on the article’s link to a study at The Arab Barometer (AB), described as “a research network at Princeton University and the University of Michigan.” (The sample size for that study isn’t easily discernible from the various articles about it).
First, a graph showing a striking increase in secularism across the six nations:
The change from the blue bar to the burgundy one spans at most seven years, yet every index in each country has dropped over that period save for a few that appear unchanged. The true indices of religiosity (profession of belief and attendance at mosques) have fallen dramatically. And remember, this is over less than a decade. Trust in religious leaders and Islamist parties has also dropped.
Here’s the summary among all these countries. (Note that many Muslim countries, including those in Africa and the Far East, as well as nations like Saudi Arabia and Yemen, aren’t represented.)
In 2013 around 51% of respondents said they trusted their religious leaders to a “great” or “medium” extent. When a comparable question was asked last year the number was down to 40%. The share of Arabs who think religious leaders should have influence over government decision-making is also steadily declining. “State religious actors are often perceived as co-opted by the regime, making citizens unlikely to trust them,” says Michael Robbins of Arab Barometer.
The share of Arabs describing themselves as “not religious” is up to 13%, from 8% in 2013. That includes nearly half of young Tunisians, a third of young Libyans, a quarter of young Algerians and a fifth of young Egyptians. But the numbers are fuzzy. Nearly half of Iraqis described themselves as “religious”, up from 39% in 2013. Yet the share who say they attend Friday prayers has fallen by nearly half, to 33%. Perhaps faith is increasingly personal, says Mr Robbins.
And some data from Iran, not represented in the survey above. Remember, Iran is a theocracy. The survey is for those over 19, and the sample size is large: over 40,000 “literate interviewees”.
An astonishing 47% have, within their lifetime, gone from being religious to nonreligious, while only 6% went in the opposite direction. As we see for almost every faith, women retain their religion more than men. The “non-religious people” aren’t all atheists or agnostics, but instead appear to be “nones”—those with no formal affiliation to a faith. (This includes atheists and “spiritual people” as well as goddies who don’t belong to a formal church.)
I say that many are “nones” because another study in Iran, cited in the AB article, showed that 78% of those surveyed believe in God: a lot more than the 47% below who profess to be “non-religious” (of course these are different surveys and might not be comparable). Still, in this other survey, 9% claim that they’re atheists, comparable to the 10% of Americans who self-describe as atheists.
The sociologist Ronald Inglehart, Lowenstein Professor of Political Science emeritus at the University of Michigan and author of the book Religious Sudden Decline [sic], has analyzed surveys of more than 100 countries, carried out between 1981 and 2020. Inglehart has observed that rapid secularization is not unique to a single country in the Middle East. “The rise of the so-called ‘nones,’ who do not identify with a particular faith, has been noted in Muslim majority countries as different as Iraq, Tunisia, and Morocco,” Tamimi Arab added.
Inglehart’s book, Religion’s Sudden Decline, came out January 2, so it’s brand new, and you can order it on Amazon here.
It’s a pity that Grania isn’t here to comment on this article from UnHerd’s new news site The Post, as she always had a good take on Catholicism in Ireland (she was, in fact, a German citizen born in South Africa). These data come from a study by the World Inequality Database, which I can’t access. I’ll just give the scant data for Ireland presented by David Quinn (click on screenshot):
The proportion of Irish people who say they never go to church:
That is a huge jump!
The proportion of Irish people who regularly attend church (once a month or more often):
This shows that the drop in Irish religiosity reflects a rise in those who rarely or never go to church, not a falling-off of the regulars. Quinn reports that “just under half of Irish people were coming to church less than once a month four or five year [sic] ago and this is now just 22%. Many of those sporadic attenders have stopped coming altogether.”
Over much of the 12 years this website has been going (we started in January 2009), I’ve written posts showing the decline of religiosity in the West, predicting that it is a long-term trend that will end with religion becoming a vestigial social organ. Yes, it will always be with us, but in the future it won’t be very much with us. But I thought the Middle East would be a last bastion of belief, as Islam is so deeply intertwined with politics and daily life. But that appears to be waning as well, for the Middle East is becoming Westernized in many ways, and with that comes Western values and secularism (see Pinker’s Enlightenment Now for discussion of increased secularism and humanism.) This is to be applauded, except by those anti-Whigs who say that religion is good for humanity.
Quinn echoes much of this at the end of his piece, explaining why Ireland remained more religious than England and the countries of Northern Europe:
Secularisation has swept across the whole of the western world, and Ireland is part of the West. It was impossible for Ireland not to eventually be affected by social and intellectual trends elsewhere. What almost certainly delayed secularisation in Ireland is that, in the years after we gained independence, one way of showing we had shaken off British rule was by making Catholicism an integral part of our national identity. As we no longer believe it is necessary to do this, we are now shaking off the Church.
The third factor is that, as a small country it can be particularly hard to stand out from the crowd. Once, we all went to Mass. Now, below a certain age, almost no-one goes. We were a nation of nuns and priests. Now, we are becoming a people with no direct religious affiliation: a country of ‘nones’.
Here we have two editorials purporting to say different things, but in the end reaching nearly identical conclusions.
The first, published at Persuasion (click on screenshot), is by a young writer, Sahil Handa, described by Harvard’s Kennedy school: “a rising Junior from London studying Social Studies and Philosophy with a secondary in English. At Harvard, Sahil writes an editorial column for the Crimson and is a tutor at the Harvard Writing Center. He is the co-founder of a Podcast Platform startup, called Project Valentine, and is on the board of the Centrist Society and the Gap Year Society.”
The title of Handa’s piece (below) is certainly provocative—I see it as a personal challenge!—and his conclusion seems to be this: most students at elite colleges (including Harvard) are not really “woke” in the sense of constantly enforcing “political correctness” and trying to expunge those who disagree with them. He admits that yes, this happens sometimes at Harvard, but he attributes wokeness to a vocal minority. The rest of the students simply don’t care, and don’t participate. In the end, he sees modern students as similar to college students of all eras, especially the Sixties, when conformity meant going to “hippie protests.” His conclusion: modern “woke” students, and those who don’t participate in the wokeness but also don’t speak up, are evincing the same “old bourgeois values” (presumably conformity). And we shouldn’t worry about them.
It’s undeniable, and Handa doesn’t deny it, that Wokeism is pervasive at Harvard. He just doesn’t see it as universal:
If you’re reading this, chances are you’ve heard of the woke mob that has taken over college campuses, and is making its way through other cultural institutions. I also suspect you aren’t particularly sympathetic to that mob. While I’m not writing as a representative of the woke, I do wish to convince you that they are not as you fear. What you’re seeing is less a dedicated mob than a self-interested blob.
I recently finished three years as a Harvard student—a “student of color,” to be precise—and I passed much of that time with the type you might have heard about in the culture wars. These were students who protested against platforming Charles Murray, the sociologist often accused of racist pseudoscience; these were students who stormed the admissions office to demand the reversal of a tenure decision; these were students who got Ronald Sullivan—civil rights lawyer who chose to represent Harvey Weinstein in court—fired as Harvard dean.
. . . . Nor are most students even involved in campus protest.
There are almost 7,000 undergraduates at Harvard, yet the tenure protest was attended by fewer than 50 students, and a few hundred signed the letters urging the administration to fire Sullivan. Fretful liberals do not pause to think of all the students who didn’t join: those who talked critically of the activists in the privacy of their dorm rooms; those who wrestled with reservations but decided not to voice them; or those who simply decided that none of it was worth their time.
But Sullivan was fired as a dean. The Harvard administration itself makes a lot of woke decisions, like punishing students for belonging to off-campus single-sex “final clubs” (probably an illegal punishment), and giving them “social justice placemats” in the dining halls to prepare them to go home for the holidays. The woke students may not be predominant, but they are vocal, loud, and activist. If that’s all the administration sees and hears, then that’s what they’ll cater to.
But why aren’t the non-woke students protesting the woke ones? Well, Handa says they just don’t care: they’re too busy with their studies. But it’s more than that. As he says above, the students who have “reservations” “decide not to voice them.” Why the reticence, though?
It’s because voicing them turns these students into apostates, for their college and post-college success depends on going along with the loud students, that is, on acquiescing to woke culture. The Silent Majority has, through its self-censorship, become part of woke culture. (My emphases in Handa’s excerpt below):
The true problem is this: Four years in college, battling for grades, for résumé enhancements and for the personal recommendations needed to enter the upper-middle-class—all of this produces incentives that favor self-censorship.
College campuses are different than in the Sixties, and students attend for different reasons. Young people today have less sex, less voting power and, for the first time, reduced expectations for the future. Back in the Sixties, campus activists were for free speech, and conservatives were skeptical; today, hardly anybody seems to consistently defend free speech. In 1960, 97% of students at Harvard were white, and almost all of them had places waiting in the upper class, regardless of whether they had even attended university. Today, fewer than 50% of Harvard students are white, tuition rates are 500% higher, and four years at an Ivy League college is one of the only ways to guarantee a place at the top of the meritocratic dog pile.
It would be strange if priorities at university had not changed. It would be even stranger if students had not changed as a result.
Elite education is increasingly a consumer product, which means that consumer demands—i.e. student demands—hold sway over administration actions. Yet most of those student demands are less a product of deeply understood theory than they are a product of imitation. Most students want to be well-liked, right-thinking, and spend their four years running on the treadmill that is a liberal education. Indeed, this drive for career success and social acquiescence are exactly the traits that the admissions process selects for. Even if only, say, 5% of students are deplatforming speakers and competing to be woker-than-thou, few among the remaining 95% would want to risk gaining a reputation as a bigot that could ruin their precious few years at college—and dog them on social media during job hunts and long after.
It seems to me that he does see a difference between the students of then and now. Yes, both are interested in conforming, but they conform to different values, and act in different ways. After all, they want to be “right thinking”, which means not ignoring the woke, but adopting the ideas of the woke. And that conformity extends into life beyond college, for Harvard students become pundits and New York Times writers. This means that intellectual culture will eventually conform to the woke mold, as it’s already been doing for some time.
In the end, Handa’s argument that we should pretty much ignore Woke culture as an aberration doesn’t hold water, for he himself makes the case that many Harvard students exercise their conformity by not fighting Woke culture, and even becoming “right-thinking”. After tacitly admitting that Wokeism is the wave of the future, which can’t be denied, he then reiterates that college Wokeism doesn’t matter. Nothing to see here folks except a war among elites, a passing fad:
The battle over wokeism is a civil war among elites, granting an easy way to signal virtue without having to do much. Meantime, the long-term issues confronting society—wage stagnation, social isolation, existential risk, demographic change, the decline of faith—are often overlooked in favor of this theater.
Wokeism does represent a few students’ true ideals. To a far greater number, it is an awkward, formulaic test. Sometimes, what might look to you like wild rebellion on campus might emanate from nothing more militant than old bourgeois values.
Perhaps Stalinism didn’t represent the ideas of every Russian, either, but by authoritarian means and suppression of dissent, all of Russia became Stalinist. The woke aren’t yet like Stalinists (though they are in statu nascendi), but even if they aren’t a majority of the young, the values of the Woke can, and will, become the dominant strain in American liberal culture. For it is the “elites” who control that culture. Even poor Joe Biden is being forced over to the woke Left because he’s being pushed by the woke people he appointed.
Michael Lind has what I think is a more thoughtful piece at Tablet, which lately has had some really good writing. (They’ve been doing good reporting for a while; remember when they exposed the anti-Semitism infecting the leaders of the Women’s March?). Lind is identified by Wikipedia as “an American writer and academic. He has explained and defended the tradition of American democratic nationalism in a number of books, beginning with The Next American Nation (1995). He is currently a professor at the Lyndon B. Johnson School of Public Affairs at the University of Texas at Austin.”
Lind’s thesis, and I’ll be brief, is that the nature of American elitism has changed, and has become more woke. It used to be parochial, with each section of the country having its own criteria for belonging to the elite (e.g., attending the best regional rather than national colleges). Now, he says, we have a “single, increasingly homogeneous national oligarchy, with the same accent, manners, values, and educational backgrounds from Boston to Austin and San Francisco to New York and Atlanta.” He sees this as a significant social change: a “truly epochal development.”
Click on the screenshot to read his longer piece:
In some ways, avers Lind, society is more egalitarian than ever, and what he means by that is that there is less obvious bigotry or impediments to success for minorities. And he’s right:
Compared with previous American elites, the emerging American oligarchy is open and meritocratic and free of most glaring forms of racial and ethnic bias. As recently as the 1970s, an acquaintance of mine who worked for a major Northeastern bank had to disguise the fact of his Irish ancestry from the bank’s WASP partners. No longer. Elite banks and businesses are desperate to prove their commitment to diversity. At the moment Wall Street and Silicon Valley are disproportionately white and Asian American, but this reflects the relatively low socioeconomic status of many Black and Hispanic Americans, a status shared by the Scots Irish white poor in greater Appalachia (who are left out of “diversity and inclusion” efforts because of their “white privilege”). Immigrants from Africa and South America (as opposed to Mexico and Central America) tend to be from professional class backgrounds and to be better educated and more affluent than white Americans on average—which explains why Harvard uses rich African immigrants to meet its informal Black quota, although the purpose of affirmative action was supposed to be to help the American descendants of slaves (ADOS). According to Pew, the richest groups in the United States by religion are Episcopalian, Jewish, and Hindu (wealthy “seculars” may be disproportionately East Asian American, though the data on this point is not clear).
Membership in the multiracial, post-ethnic national overclass depends chiefly on graduation with a diploma—preferably a graduate or professional degree—from an Ivy League school or a selective state university, which makes the Ivy League the new social register. But a diploma from the Ivy League or a top-ranked state university by itself is not sufficient for admission to the new national overclass. Like all ruling classes, the new American overclass uses cues like dialect, religion, and values to distinguish insiders from outsiders.
And that’s where Wokeness comes in. One has to have the right religion (not evangelical), dialect (not southern) and values (Woke ones!):
More and more Americans are figuring out that “wokeness” functions in the new, centralized American elite as a device to exclude working-class Americans of all races, along with backward remnants of the old regional elites. In effect, the new national oligarchy changes the codes and the passwords every six months or so, and notifies its members through the universities and the prestige media and Twitter. America’s working-class majority of all races pays far less attention than the elite to the media, and is highly unlikely to have a kid at Harvard or Yale to clue them in. And non-college-educated Americans spend very little time on Facebook and Twitter, the latter of which they are unlikely to be able to identify—which, among other things, proves the idiocy of the “Russiagate” theory that Vladimir Putin brainwashed white working-class Americans into voting for Trump by memes in social media which they are the least likely American voters to see.
Constantly replacing old terms with new terms known only to the oligarchs is a brilliant strategy of social exclusion. The rationale is supposed to be that this shows greater respect for particular groups. But there was no grassroots working-class movement among Black Americans demanding the use of “enslaved persons” instead of “slaves” and the overwhelming majority of Americans of Latin American descent—a wildly homogenizing category created by the U.S. Census Bureau—reject the weird term “Latinx.” Woke speech is simply a ruling-class dialect, which must be updated frequently to keep the lower orders from breaking the code and successfully imitating their betters.
I think Lind is onto something here, though I’m not sure I agree 100%. This morning I had an “animated discussion” with a white friend who insisted that there was nothing wrong with using the word “Negro”. After all, he said, there’s the “United Negro College Fund.” And I said, “Yeah, and there’s also the National Association for the Advancement of Colored People, but you better not say ‘colored people’ instead of ‘people of color’!” In fact, the term “Negro” would be widely seen as racist now, though in the Sixties it wasn’t, and was used frequently by Dr. King, who almost never used the n-word in public. “Negro” was simply the going term for African-Americans then, but now it’s “people of color” or, better yet, “BIPOC”. And that will change too. “Gay” has now become a veritable alphabet of initials that always ends in a “+”. “Latinx” isn’t used by Hispanics, but by white people and the media. It’s an elitist thing, as Lind maintains.
But whether this terminology—and its need to constantly evolve, 1984-like—is a way of leveraging and solidifying cultural power, well, I’m not sure I agree. Weigh in below.