Why Evolution is True is a blog written by Jerry Coyne, centered on evolution and biology but also dealing with diverse topics like politics, culture, and cats.
This video shows a brave man. Why? Because he’s an Egyptian living in Egypt, presumably a Muslim, who is saying The Truth That Must Not Be Said: that ISIS uses explicit justification from the Qur’an and the hadith for its barbarity. The motivation for slaughter, beheadings, and immolation, in other words, comes from Islam.
It’s Ibrahim Eissa, an Egyptian journalist speaking the truth a week ago on al Tahrir (Egyptian) T.V.
Click on the screenshot below, or go here to listen to the video.
I hope he has armed guards. . .
The transcript:
Ibrahim Eissa: Whenever ISIS carries out an act of barbarity, such as decapitations, throat slitting, or the burning of a person alive, as they did today, various sheiks tell you – if they even bother to say anything – that this has nothing to do with Islam, that Islam is not to blame, and whatever. But when the people of ISIS perpetrate slaughter, murder, rape, immolation, and all those barbaric crimes, they say that they are relying on the sharia. They say that this is based on a certain hadith, on a certain Quranic chapter, on a certain saying of Ibn Taymiyyah, or on some historical event. To tell the truth, everything that ISIS says is correct.
. . . This should not come as a surprise to anyone, as a surprise to anyone, and nobody should be shocked by what I am saying. All the evidence and references that ISIS provides to justify its crimes, its barbarity, and its horrifying, criminal, and despicable violence… All the evidence and references that ISIS provides, claiming that they can be found in the books of history, jurisprudence, and law, are, indeed, to be found there, and anyone who says otherwise is lying.
. . . When they kill a person claiming that he is an infidel, when they rape women, when they kill prisoners, and when they slaughter and decapitate people, they say that the Prophet Muhammad said so. Indeed, the Prophet said so! What was the context? The interpretation? That’s a whole different story. None of those [Al-Azhar clerics] who purport to be moderate, and who were told by President Al-Sisi to change the religious discourse, have the courage – not a single grain of courage – to admit that these things are indeed to be found [in Islamic sources] and are [morally] wrong. If it is claimed that a certain companion of the Prophet did this or that, you should respond by saying that he was morally wrong. I would like to see a single Al-Azhar cleric in Egypt have the courage to admit that Abu Bakr burned a man alive. That’s right. He burned Fuja’ah [Al-Sulami]. This is a well-known historical story.
. . . Was Abu Bakr morally wrong to burn that man alive? Nobody dares to say so. So we are left in this vicious circle, and you can expect more barbarity, because all this barbarity is sacred. It is sacred. This barbarity is wrapped in religion. It is immersed in religion. It is all based on religion. Your mission [as a cleric] is to say that while it is part of our religion, the interpretation is wrong. Do not tell people that Islam has nothing to do with this.
I would love to hear what Karen Armstrong or President Obama would say in response.
Australia, though a pretty nonreligious country compared to the U.S., still has its pockets of faith. One was recently emptied on the Australian Broadcasting Corporation’s (ABC’s) site “The Drum,” in a piece written by Simon Smart, a man described on the ABC’s site as
. . . a Director of the Centre for Public Christianity and the co-author with Jane Caro, Antony Loewenstein and Rachel Woodlock of For God’s Sake – An Atheist, a Jew, a Christian and a Muslim Debate Religion. A former history and English teacher, he studied theology at Regent College, Vancouver. He is the author of a number of books including Bright Lights Dark Nights – the Enduring Faith of 13 Remarkable Australians.
All it takes is a degree in theology to screw up one’s thinking, or so it seems in Smart’s piece “Fry vs. God: the comedian’s concerns aren’t new.” In it, Smart attacks Stephen Fry’s short response on theodicy in his interview with Gay Byrne on Irish television (I’ve put it below in case you’ve forgotten it or haven’t seen it). Asked what he’d say to God if he encountered Him, Fry responded that he’d ask God why there’s so much unmerited evil in the world, like kids getting cancer.
Fry’s short response has caused a lot of consternation among Christians. I’m a bit surprised at that, because the problem of evil, which spawned the discipline of theodicy, has been around as long as Christianity itself. Fry didn’t say anything that the Old Atheists didn’t say decades ago. But the old arguments need to be raised again with each generation of believers, and Fry is, after all, an immensely smart, popular, and likable man. His comments demand an answer.
Well, Smart has responded to Fry. As with all attempts to give a religious justification of evil, though, he fails miserably. Here’s how Smart explains why an all-loving and omnipotent god tortures children (my emphasis):
Ultimately when a Christian stops to consider the struggle of human existence they will want to point to the death and resurrection of Jesus as the centre of a very long, and still unfolding, story of how God launches a plan to redeem the world from its misery – a portrait of a God who has not remained aloof from the suffering but rather has become part of it.
The resurrection of Jesus points to God condemning all the things that have destroyed life, and promising a day when the weight of history and all the centuries of human cruelty, sadness and loss will be overcome. Is it enough? Not everyone will think so.
The biblical picture offers a promise of the possibility of a new beginning when murdered children will be raised up and restored, where families torn apart by violence will find peace and harmony again. It presents a vision of a time where crushing loneliness will be a thing of the past, where bodies broken and ravaged by disease or old age will be restored to strength and vitality, where people who have experienced grinding poverty will find abundance, where children ripped from their mother’s arms in a tsunami will be ushered in to new life. In the end Christianity is a story of the denial of the powers of darkness and violence and cruelty and hatred and heartbreak. And in their place the victory of goodness and mercy; kindness and love.
Every aspect of this vision is predicated on Jesus rising from death. If that didn’t happen, then it is right and proper to join Stephen Fry and to throw the whole thing out the window.
This is completely insane. It simply says that because of Jesus all things wrong will be made right in the next world. But that doesn’t answer the question at all, for the evil has still already occurred in our world! Even if there’s a new beginning, the end was often pretty bad. And God could have prevented those bad endings.
Is it better for a kid to die horribly of cancer, causing terrible suffering for herself and her parents, and then find peace and harmony in heaven, or for that kid never to have gotten cancer in the first place? Which way of ordering the cosmos would be better? I would have thought that a truly good God would create a world in which there was no suffering of innocents and people went to heaven as well. Or why not just have a heaven that everyone lives in, and cut out the middleman? In fact, why make people in the first place? Smart gives no answers.
This week I’ve seen more than one Christian use similar arguments to justify evil, and they all seem blinkered to the problems. If God is omnipotent, he can do anything he wants, and if he makes innocents (or animals) suffer horribly, there must be a good explanation. The only credible ones left to believers are (a) that God is sometimes evil, as he was in the Old Testament, or (b) that we don’t understand God’s ways. Any other explanation is pilpul. The most parsimonious explanation is, of course, that there is no God and “evil” is simply expected in an evolved world: a world where mutations cause cancer, people get infectious diseases, the tectonic plates move, and animals and microbes evolved to kill other animals.
At least Smart has the honesty to admit that if there were no Resurrection, then Christianity crumbles at its core. But his argument that the Resurrection justifies natural evils doesn’t hold water.
In the end, Smart plays his trump card: even if we don’t know whether there is a God, or why, if there is a good God, he creates evil, we have to keep the myths alive because of the Little People:
We may even, as Fry claims to have done, conjure up a degree of optimism in the face of the implications of a godless universe. But if we arrive at that point it would be fitting to acknowledge that, while doing nothing to rid ourselves of suffering we will have removed a source of profound hope that for centuries has sustained millions of people in the face of life’s joys and sorrows.
NASA astronaut Barry “Butch” Wilmore captured this humbling view of East Coast lights as the sun was just beginning to creep above the horizon earlier this week. The faint, blue glow of the sun’s rays passing through our atmosphere can be seen on the right, mirrored by a beautiful, green aurora on the left.
Washington, D.C., Baltimore, New York City, and Boston can be seen whizzing by. A thin layer of mid-level clouds blurs city lights over parts of the Northeast.
When Christopher Hitchens was close to death, a veil was drawn around him by his friends and loved ones, and rightly so, I guess, for it’s intrusive to inquire how a man is faring on his deathbed. But I’ve always wondered, given that Hitchens was eloquent, brave, and an atheist, what his last words were. I haven’t read any accounts of his death—including his own book Mortality, which contains a eulogy by his wife Carol Blue—so the anecdote I’m about to tell may already be well known.
While looking for some information on Hitchens, I came across an account of his memorial service, held in New York on April 20, 2012, though he died in Houston on December 15 of the previous year. The account was written, curiously enough, by Andrew Sullivan for his website The Dish, and was called “The Hitch has landed”. It’s a poignant remembrance by Sullivan, who was an usher at the service. And there’s an excerpt that gives us Hitchens’s last words:
And then his last words. As he lay dying, he asked for a pen and paper and tried to write on it. After a while, he finished, held it up, looked at it and saw that it was an illegible assemblage of scribbled, meaningless hieroglyphics. “What’s the use?” he said to Steve Wasserman. Then he dozed a little, and then roused himself and uttered a couple of words that were close to inaudible. Steve asked him to repeat them. There were two:
“Capitalism.”
“Downfall.”
In his end was his beginning.
You’ve surely seen the video below, but here is Hitchens in his last public appearance, two months before he died, receiving the Richard Dawkins Award at the Texas Freethought Convention in Houston. He left his hospital bed to speak, and these may have been his last words in public. The video is set to music, but that doesn’t detract from what he said, for he fought the enemies of reason right up to the end.
I continue to read Hitchens for inspiration, and brush off those detractors who devalue his entire life simply because he was in favor of the Iraq war. The man spent his life battling totalitarianism and irrationality, and of course he was sometimes wrong. Who among us hasn’t been? But mark a few errors against the very full column of his brilliance, his fight for what he saw as true, his copious writings on so many topics, and the eloquence that inspired us all. It’s a bromide to say of someone who’s died that “they can’t be replaced,” but in the case of Hitchens it’s undeniably true.
Human bedbugs, Cimex lectularius, are “true bugs,” that is, insects in the order Hemiptera. They are an infernal pest, sucking the blood out of people and leaving a nasty, itchy rash. (I was bitten on only one occasion, in a fleabag hotel in Peru, but there were many bites all over me, with the rash persisting for about three weeks.) Fortunately, bedbugs aren’t known to carry any diseases.
Still, they’re annoying, as you’ll know if you’ve followed the news over the past couple of years. Having been nearly eradicated by mid-twentieth-century applications of DDT, bedbugs started making a comeback when we declared a DDT moratorium, and the bugs are now common in American cities and a devil to eliminate.
Here’s a human bedbug sucking blood from the arm of a volunteer (photo from Wikipedia):
But where did bedbugs come from? Well, it’s long been known that their closest relative seems to be the bat bug, a similar insect that lives on bats, sucking their blood in the caves. The batbug also happens to be classified as the same species as the human bedbug, Cimex lectularius. The morphological differences between the two forms are trivial, but you can still tell them apart with a microscope. Below is a diagram and some text from Bad Bed Bugs highlighting the diagnostic differences:
The trick to identifying a bat bug is by looking at the length of hairs on the upper covering of the thorax. The picture above is the joining of one half bat bug (left side) and one half bed bug (right side). You’ll notice that the length of the bat bug’s hairs is longer than the width of its eye. The bed bug, however, has hairs that are shorter than the width of its eye.
There are other differences, too: as Carl Zimmer notes in a new piece in the New York Times, the human variety has longer and thinner legs than the bat variety, perhaps because the bat variety needs a firm grip on their cave-hanging hosts.
There also appear to be physiological differences. As a new paper in Molecular Ecology by Warren Booth and colleagues (reference and free link below) notes, each type does better in terms of longevity and reproduction when it feeds on its own host. A batbug forced to ingest human blood does okay, but not as well as on a bat, and vice versa. Finally, the daily rhythm (“diurnal cycle”) differs between the two forms: batbugs feed during the day, when bats are asleep in their caves, while human bedbugs feed at night, when humans are asleep in their beds.
One problem with these data, which are used by both Zimmer and Booth et al. to imply genetic differentiation, is that we don’t know whether these differences are evolved genetic differences between the forms, or only developmental/physiological responses to feeding on different species. It’s possible that if you transferred a batbug to humans, it would develop longer legs, change its feeding cycle, and become physiologically acclimated to human blood within a generation or so, and that this would reflect not evolutionary (genetic) change but a purely developmental (“plastic”) response.
That’s not a far-fetched interpretation. Human head and body lice, which are not different species, also transform their physiology and morphology as a result of acclimation and not genetics, and even Anolis lizards change the shape of their legs if they’re forced to climb on thin branches rather than clamber on tree trunks or the ground. The only way to determine if the morphological differences between bedbugs and batbugs are due to genetic/evolutionary change is to rear them over several generations on a common diet, and see if the differences persist. If they do, they’re genetic.
The reason Zimmer and Booth et al. dwell on this is that bats have been suggested as the vector that gave us human bedbugs. Bats, so the theory goes, were originally afflicted with batbugs, and early humans lived in caves alongside the infested bats. Batbugs then found a juicy new source of food nearby, a few individuals colonized humans, and the rest is history: the human bedbug.
Booth et al. wanted to see how much genetic differentiation there really is between human bedbugs and batbugs, and so their paper reports an extensive genetic analysis of several hundred individuals from both forms of the bug. The researchers looked at mitochondrial DNA, nuclear DNA (in the form of microsatellites), and at genes that had evolved in human bedbugs to resist DDT.
What they found was that batbugs and human bedbugs do indeed show significant genetic differentiation—in all three types of genes investigated. Bedbugs and batbugs clearly form two distinct genetic lineages. This is shown by statistical analysis of bugs taken from the two hosts; the figure below shows the genetic differentiation for nuclear DNA among samples of both forms taken in Europe. Brown squares are individual human bedbugs, blue are batbugs, and you can see how well separated they are (see the caption below the figure).
(Caption from paper, Fig. 4) Results of Factorial Correspondence Analysis showing genetic differentiation based on microsatellite allele frequencies for individual C. lectularius collections sampled across Europe. Samples clustered by host: brown (dark) squares represent human-associated samples, blue (light) squares represent bat-associated bed bugs.
There is, however, still some evidence of gene flow between the two forms, perhaps occurring when a batbug finds itself on a human and mates with bedbugs, or vice versa. Although most human bedbugs show the DDT-resistant form of “pesticide genes”, a few don’t, and those “susceptible” genes may have come from the batbugs, which never experienced DDT. Still, what we have here are two closely-related but genetically distinct lineages, and that is the big lesson from the paper of Booth et al. But they want to say more, and that is what Carl Zimmer highlighted in his NYT piece (see question #2 below).
Two questions remain:
1. Were batbugs the ancestors of the human bedbug? It seems likely, although neither Zimmer nor Booth et al. explicitly give the information that is crucial for answering this question: Are the batbug and bedbug more genetically similar to each other than either is to any other species in the genus? If the batbug is the ancestor of the bedbug, then the two forms have to be “sister taxa,” that is, each other’s closest relatives. Now this may indeed be the case, and may be cited in one of Booth et al.’s references, but I didn’t look them all up. I’ll take it for granted that both Zimmer and Booth et al. know that these are in fact sister taxa.
But one problem remains: do they only look like sister taxa because there has been gene flow between batbugs and human bedbugs, making them look as if they evolved recently when in fact they didn’t? This is a problem with trying to suss out the evolutionary history for any pair of species that live in the same place and occasionally hybridize. Fortunately, it can be taken care of. For example, if bedbugs and batbugs had distinct forms of genes (as they do), but those forms are still more similar to each other than to the gene forms of other species or populations in the genus, then that would imply that they are indeed sister taxa. Neither the authors nor Zimmer discuss this, but it may be such a well-known result that neither thought it necessary to mention it explicitly.
Also, the human bedbug is genetically depauperate compared to the batbug, and that’s what one would expect if only a few individual batbugs originally colonized humans, going through what we call a “population bottleneck.” The genetically depauperate nature of the human bedbug compared to the batbug also implies that if there was a colonization from bats to humans, it happened only once or a very, very few times. If colonization was frequent, human bedbugs would be much more genetically variable among populations than we see. If the bat transfer theory is correct, the colonization of humans by batbugs must have occurred in the distant past when humans lived in caves along with bats, and that would probably be about 50,000 years ago in Eurasia. (No molecular dating of the divergence was reported.)
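The logic of that bottleneck argument is easy to see in a toy simulation (my own sketch, not anything from Booth et al.; the numbers are made up for illustration): draw a handful of “founders” from a large, allele-rich source population and count how many distinct alleles survive the colonization.

```python
import random

random.seed(1)  # make the toy example reproducible

# A large source population (the batbugs), carrying 20 allele types
# at a neutral locus; each individual is represented by its allele.
source_population = [random.randint(1, 20) for _ in range(1000)]

def n_alleles(pop):
    """Count the distinct alleles present in a population sample."""
    return len(set(pop))

# A founder event: only a very few individuals colonize the new host.
founders = random.sample(source_population, 5)

print("alleles in source population:", n_alleles(source_population))
print("alleles carried by founders:", n_alleles(founders))
```

With only five founders, at most five of the source’s twenty alleles can make it across, which is why a lineage founded by one (or very few) colonization events, like the human bedbug, looks genetically depauperate next to its source.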
But what the authors and Zimmer find most exciting about the study is encapsulated in the second question:
2. Are these forms on the road to becoming different species? Are we seeing, in the form of batbugs and human bedbugs, two groups that descended from a common ancestor (on bats), and are now in the process of becoming different biological species? Indeed, Zimmer calls his piece, “In bedbugs, scientists see a model of evolution.” What he means by that is “a model of how new species form.”
We evolutionists, by and large, conceive of species as being different groups that cannot exchange genes because of biologically-produced “isolating barriers” that prevent the formation of fertile hybrids. Bedbugs and batbugs do appear to have such barriers: they don’t do well on each other’s hosts, they are active at different times of day, they seem to maintain differences in appearance, and, of course, the DNA data show a lack of genetic exchange. Now, as I said, we don’t know whether the differences in activity period, ability to thrive on the host, or morphology are based on differences in genes (we can’t assume blithely that they are), but the DNA data clearly show that these lineages don’t exchange genes very often. Could it be that we have a case of speciation in action due to host shift by the batbugs?
The answer is that we don’t know for sure. What we see are two diverged lineages, but we can’t know whether they will continue their evolutionary divergence and go on to form two “full species”, totally incapable of exchanging genes, that deserve different Latin names. It’s possible that they will maintain their status as somewhat distinct lineages, but that gene flow will be enough to keep them from achieving full reproductive isolation.
There are in fact many known cases of groups that are similar to these bedbugs in having achieved partial but not full reproductive isolation, so to imply that these bugs are unique, or that we have here a rare model of speciation in statu nascendi, is incorrect. In the book Speciation that I co-wrote with Allen Orr, we discuss many cases of “host races” in insects that show significant genetic divergence of forms living on different plants (aphids and the “true” fruit flies [tephritids] are two examples), but in which there is still gene flow between the forms. They are not considered “full species” since reproductive isolation is incomplete. In all of these cases we simply have no idea about whether they’ll go on to evolve into fully isolated species. We’d have to wait around for a couple of hundred thousand years to find that out.
The fact is that most populations of a species showing some reproductive incompatibility probably do not go on to form full species. Either they fuse back into one species, or one form goes extinct, or they maintain their status as incompletely isolated forms. To ask the question, “Are these going to become full species?” is to ask a question that can’t be answered.
Nevertheless, there is of course a lot of interesting information about batbugs and bedbugs in the paper of Booth et al., regardless of their unknown evolutionary fate. At least we know (probably) where an annoying human parasite came from, and something about the evolutionary differences between them. That might not help us eradicate bedbugs, but isn’t it fascinating to contemplate that our affliction with that creature is a remnant of our evolutionary history as cave-dwellers?
Birds remain the most popular subject for readers’ wildlife photos, and today we have contributions from three readers. First, Stuart sends a familiar but much-loved bird, and a lovely portrait it is, too:
I’m more often a landscape photographer, but a few years ago I took this photo of a kea [Nestor notabilis] when stopped in a parking area at Otira Gorge in the South Island of New Zealand. I was actually taking photos of the gorge and the road that is built into the cliff when the bird landed just over a meter from me.
The kea is on the protective barrier that prevents vehicles going over the cliff. Just after I took the photo it hopped onto the wing mirror of the car, whose window was open, less than an arm’s length from my wife.
These are really inquisitive birds, and this one apparently had absolutely no fear of humans.
I’m always amazed at the beaks of these things. But what an awesome bird!
Reader Pete Moulton sent ducks!
Here are a few duck photos from my home patch in east Phoenix.
First up is a first-year drake Northern Shoveler (Anas clypeata), showing the mandibular lamellae (along the cutting edge of his bill) which are an adaptation to his filter-feeding lifestyle. If the lamellae made you think immediately of baleen whales, you’re on exactly the right track.
[JAC: I’ve enlarged the bill so you can see the lamellae used for filter-feeding]:
This guy’s an adult drake Ring-necked Duck, Aythya collaris, showing his nictitating membrane. [JAC: See John Chardine’s photo-essay on nictitating membranes here.] These membranes occur in a variety of animals, and serve protective and maintenance functions, depending on the animal. Most bird photographers would probably throw this image away, but you may have noticed that I often shoot for didactic value; my significant other can use images like this one in the classroom sessions of her birding classes. European and Asian readers will have noticed this duck’s similarity to their own Tufted Duck, A. fuligula.
And finally a drake Gadwall (Anas strepera), a bird that should be familiar to all Northern Hemisphere duck-watchers. Gadwalls here are typically quiet and retiring, which leads me to wonder about that binomial.
When I told Pete that I was a big fan of ducks, and that many of them had cool names (“steamer duck,” “scoter,” “ruddy duck,” and so on), he sent me two moar pictures of ducks and additional information:
When someone wants to take up birding, I usually try to get them started with ducks. They’re beautiful, there’s usually a respectable variety at any given location, and most of them just sit out in the open while an inexperienced observer flips through the field guide to make identifications. Sad to say, most new birders want to start with raptorial birds, which are much, much harder to find, and a lot more difficult to identify once you do find one.
Speaking of their cool names, here’s one that’s both a whistling-duck and a fulvous duck: a Fulvous Whistling-Duck (Dendrocygna bicolor). These are very rare in my area nowadays, so this guy at a Scottsdale Park a few years ago attracted a lot of attention. Being certain that a vagrant waterfowl is really wild is always a little dicey because they’re popular in avian collections. Keepers are supposed to mark their birds with bands (Europeans call them ‘rings’), web-tags, or by removing a hallux (rear toe), but they don’t always do that. This individual showed none of these signs, and our Arizona rare bird committee ultimately deemed it a truly wild bird. Being the only Fulvous in the area, it eventually took up with a Black-bellied Whistling-Duck, and for several months thereafter the ‘Odd Couple,’ as they became known, turned up all over Scottsdale.
Finally, reader Tim Anderson from Australia sent a cormorant—and a poem (not his; below):
This is a Little Pied Cormorant (Phalacrocorax melanoleucos) in the Tumut River, Brungle, NSW. [JAC: It’s also called the “little shag.”]
Tim’s note on the cormorant poem:
The “common cormorant or shag” poem belongs to the canon of nonsense poetry in the tradition of Edward Lear and Lewis Carroll. I grew up with my father announcing these poems to me in a loud voice as we went to pick up my mother from late night lectures at the University of Queensland. Strange things we remember from our childhood.
The common cormorant or shag
Lays eggs inside a paper bag
The reason you will see no doubt
It is to keep the lightning out
But what these unobservant birds
Have never noticed is that herds
Of wandering bears may come with buns
And steal the bags to hold the crumbs.
It is Dreaded Tuesday again, and our friends on the East Coast, especially Boston, are reeling under six feet of snow. There has been more snow in Boston so far this winter than in any winter in recorded history. And more is on the way today! Meanwhile in Dobrzyn, Hili is feeling insecure. I wonder if she’s been thinking of Fitness.
Hili: Please, tell me, and be honest, are there really people in the world who believe that black cats have better supernatural powers than tabby ones with white paws and a white chest?
A: Yes.
In Polish:
Hili: Powiedz mi, tylko mnie nie oszukuj, czy naprawdę są na świecie ludzie, którzy wierzą, że czarne koty mają większe nadnaturalne moce niż koty szare z białymi łatami?
Back in the Pleistocene, I used to criticize my undergraduates for using the word “awesome” to refer to anything that was even slightly out of the ordinary. (I used to sit in the lab with several of them and the grad students, and we’d all push flies and chat. I would hear “awesome” all too often.) But that was a losing battle, and now “awesome” is firmly entrenched in Generation Z lingo as meaning “pretty good.”
But maybe it’s not too late to fight for the word “amazing.” It’s now used as a synonym for “nice,” as in, “You have to meet my amazing friend Julie.”
Yes, I know the English language evolves, but let’s look for the moment at what “amazing” really means, at least according to the Oxford English Dictionary. Here are the four definitions; the last is the one used by the Young Folk:
And here are the definitions of “amazement”; clearly the fourth one is the one that people usually mean when they say “amazing”:
The condition of being mentally paralyzed, mental stupefaction, frenzy. Obs.
Loss of presence of mind; bewilderment, perplexity, distraction (due to doubt as to what to do). Obs.
Overwhelming fear or apprehension, consternation, alarm. Obs.
Overwhelming wonder, whether due to mere surprise or to admiration.
Now I have no objection to using #4 as the positive adjective, but does the presence of Julie really evoke “overwhelming wonder”? Having met many of these “amazing” people, I think not.
And before you chide me for not accepting the changing usage of these adjectives, let’s ponder what is happening. Every word that once denoted an extreme condition of goodness, like “awesome” or “amazing,” is being systematically devalued. Eventually we will run out of words to denote the truly awesome things, like the Big Bang—or bats. And then when people want to talk about those, what words will they have?
First they came for “awesome,” and I did not speak out because there were still other good adjectives left.
Then they came for “amazing,” and I did not speak out because you get criticized for opposing the way kids warp the language.
Then they came for “stupefying,” and there were no words left because I had let them all be downgraded.
As usual, put your rants below and DO NOT TELL ME THAT I SHOULD ACCEPT ALL CORRUPTION OF LANGUAGE.