A profoundly misguided cartoon

October 9, 2015 • 12:30 pm

Usually I like Zach Weinersmith’s SMBC (Saturday Morning Breakfast Cereal) series, but this one is off the mark:

[SMBC comic of August 15, 2013]

Well, of course feeling or receiving love and getting kicked in the cojones are subjective sensations, and we don’t yet know exactly how nerve impulses become perceptions. But we do know two things. First, love and pain have emotional connotations, and when we think about them they don’t seem like chemical reactions. Love, chemical though it is, is one of the things that makes life worth living; pain is not, though it serves an immensely adaptive function: alerting us to damage to our bodies. Second, despite how they feel, love and pain really are chemical reactions at bottom, and we have good evidence for that.

But I doubt that Weinersmith is just touting the emotional value of love here; rather, he seems to be dissing the very notion that love and pain are chemical reactions. Yet they are: you can affect the affections of people and animals simply by injecting them with chemicals.

Pain, too, is a chemical reaction, or at least has something to do with nerve transmission and is therefore based on molecules. We know this for several reasons. One is the existence of congenital insensitivity to pain with anhidrosis (CIPA), a genetic disease that simply removes the ability to sense pain (as well as heat or cold) from its unfortunate victims. And they’re “victims” because, heedless of damage to their bodies, they get burned, infected, break bones, and continually hurt themselves without knowing that they’ve done it, and without seeking medical care. If you’re a CIPA sufferer and put your hand on a hot stove, you won’t remove it. The consequences are clear. And so are the implications: if a mutation in the DNA can remove the sensation of pain, and undoing that mutation can presumably bring back the sensation of pain, then pain must be a physicochemical phenomenon.

We know the same thing from local anesthetics, like the novocaine you get at the dentist’s. You’re conscious but don’t feel pain in the area where the chemical is injected. The anesthetic clearly does something to the nerves or their transmission that eliminates the subjective sensation of pain. The sensation thus has a neurological/chemical basis.

Unless Weinersmith means “believing in love” as “finding value in love”, the cartoon is profoundly antiscientific. But even if that is what he means, the existence of anesthetics and diseases like CIPA tells us that, at bottom, subjective sensation has a material, physical basis. We all know that, but many religionists reject it.

h/t: jsp

91 thoughts on “A profoundly misguided cartoon”

  1. I figured this comic was just saying “Yeah, they are chemical reactions, but that doesn’t mean they don’t matter or aren’t real.”

      1. I agree completely. The cartoonist isn’t dismissing materialism or determinism. He’s pointing out the absurdity of dismissing pain and love as “just a bunch of chemical reactions.” Those chemical reactions mean a great deal to us, even if they are completely material.

        1. The thing is, the entire cartoon is based on a straw man. When someone says they don’t believe in love, they simply mean they don’t believe in the magical component of it, not that the feeling of love (or pain) isn’t real.

        2. The cartoonist is insinuating that there are people who don’t believe in love because it’s just chemicals. Straw man. These people do not exist. Nobody says “I don’t believe in love because it’s just chemicals.” Everyone believes in love, even those who think it is material only.

          1. Technically, yes, the first character represents a strawman. But I don’t think it’d be overly charitable to interpret the first character’s stance as “if love is just the result of chemical interactions then it wouldn’t really be love”. People who say this abound. Of course, they don’t go on to conclude love doesn’t exist; instead, they think they’ve demonstrated that love must be some kind of dualistic magic.

    1. The cartoon is just a dumb strawman. Why does the other fella not believe in pain if he understands that the chemical and neural impulses of pain are real?

      Nobody is saying that the feeling of love isn’t real, just because it can be explained through materialistic means!

    2. Same here. To quote a recent post by PZ Myers: “You are not just chemicals in your brain; you are chemicals in your brain, isn’t that awesome?” In fact, without additional information suggesting otherwise, it seems obvious to me that your interpretation is the right one.

    3. This was the obvious reading to me when I first saw it, and I find it difficult to mentally reconcile with Jerry’s reading even now.

      The brown-haired guy kicking the other in the crotch is pointing out the absurdity of devaluing love as “just a chemical reaction” by demonstrating that the other person cannot dismiss the ‘mere’ chemical reaction of pain.

      The problem is not with the chemical reaction, but with the ridiculous attempts to dismiss the concept by adding a diminutive (“just”, “mere”, “only”) to the description.

      1. The problem is that when we, for instance, say that the movement of planets is governed “just” by the laws of gravity, not by flying angels, the word “just” isn’t meant to depreciate the physical phenomenon, but rather point to the fact there’s a natural explanation to it. (Though, if someone previously thought it were really the angels holding the planets, the word “just” may be a depreciation of sorts.)

        1. And no, if Zach Weinersmith drew a cartoon in which the guy who says “it’s just gravity” gets pushed off a rooftop, he wouldn’t be making any real point at all.

    4. Bingo. Weinersmith is fighting the Cherry Pion Fallacy (no such thing as cherry pie unless the ultimate constituents are cherry pion particles).

      A fallacy that common probably already has an established name, but until I learn it I’m calling it the Cherry Pion Fallacy, inspired by a comment on Sean Carroll’s blog.

  2. I think the yellow-haired dude is supposed to represent one of those (us) determinist atheists. This is one of the most annoying straw men out there. Like when Oprah was genuinely confused as to how an atheist could experience the sensation of awe.

  3. I’ve actually found that thinking about pain in terms of chemical reactions or signals from nerves to my brain makes the pain easier to tolerate. This comes in handy when I have to get needles jabbed in my arms for blood work or dental work done. My brain’s telling me something’s wrong, but I already know what’s happening and it’s nothing to worry about. So the pain essentially becomes a redundant signal and much easier to handle.

  4. You’ve misinterpreted the cartoon. I think he agrees with you.

    He’s not dissing the notion that love and pain are chemical reactions; he’s dissing people who conclude from that fact that love and pain are somehow less significant.

    He’s dissing the guy in the blue shirt; the guy in the green shirt is pointing out that something being a chemical reaction doesn’t change its value or “believability.”

    1. Well, until he tells us otherwise, I don’t think we can be sure. And, at any rate, it’s useful to remember the evidence that qualia are naturalistic phenomena, and thus don’t require some supernatural explanation.

    2. My reaction was the same as Dong’s. I don’t know about Dong’s background, but I wondered whether I see it differently because I’m not a scientist.

      I thought taupossaft’s comment a bit rude and unnecessary though. That sounds like it could be a Kiwi moniker too, and maybe not far from where I live. We want Jerry to visit sometime dude!

  5. “he seems to be dissing the very notion that love and pain are chemical reactions”

    That’s not how I read the comic at all. I’m pretty sure he’s on the same page as you.

      1. Oh yes, next time you read about, say, frog reproduction in a biology textbook, remember – it’s not “just chemical reactions”! 🙂

  6. I agree with the interpretation of comment #1, i.e., the goal is to illustrate that learning that something is a chemical reaction should not diminish it.

    > he seems to be dissing the very notion that love and pain are chemical reactions

    If I squint a bit I can *almost* read it the way you (PCC) read it — it took me several tries — but I’m pretty sure that way is not the way W. intended it, especially given how *utterly* out of character that interpretation would be given previous SMBCs.

    The point could have been made more clearly, though.

  7. Two non-seminal comments (no pun intended):

    People born without the ability to feel pain are subject to danger from any health hazard, including infections, blisters, stings, etc., as well as burns and cuts.

    Subjecting children to pain from needles, surgeries, etc. is physically painful to the child and emotionally painful to the parent. I have been in the position of helping 4 or 5 adults hold down my small child, who was energetically trying to avoid a shot of cortisone in a keloid scar from Achilles tendon surgery. There was supposed to be a series of 12 shots, but I called them off at two when I found that they could be done at his request as an adult with equal effectiveness.

  8. Somehow when people talk about the sexual reproduction of non-human animals they don’t have any problem discussing the sexual development of a growing organism, the hormonal basis of sexual behavior etc. But when it comes to the human animal, oh, it’s all magic!

  9. “But I doubt that Weinersmith is just touting the emotional value of love here; rather, he seems to be dissing the very notion that love and pain are chemical reactions.”

    -Don’t see that at all. I find it extremely difficult to see how Jerry came up with this interpretation.

    1. Perhaps it’s because I’m a scientist and reacted strongly to the dismissal of a reductionist explanation. If the artist intended a different message than the way I saw it, I’ll be glad to admit I’m wrong. Still, do you deserve to be kicked in the balls for making a scientifically true statement?

      1. I think Weiner intended the problem to be the statement “I don’t believe in love.” plus the word “just”, as if love being an outcome of chemical reactions renders the emotions resulting from it meaningless or inherently untrustworthy. Pain is a pretty blatant chemical reaction, but it also makes for a very powerful emotion, as well as a quite trustworthy one!

        The cartoon isn’t trying to advocate people getting kicked in the balls for saying stupid stuff (I think).

  10. “Pain, too, is a chemical reaction, or at least has something to do with nerve transmission and is therefore based on molecules. We know this for several reasons. One is the existence of congenital insensitivity to pain with anhidrosis (CIPA), a genetic disease that simply removes the ability to sense pain (as well as heat or cold) from its unfortunate victims. And they’re “victims” because, heedless of damage to their bodies, they get burned, infected, break bones, and continually hurt themselves without knowing that they’ve done it, and without seeking medical care. If you’re a CIPA sufferer and put your hand on a hot stove, you won’t remove it. The consequences are clear. And so are the implications: if a mutation in the DNA can remove the sensation of pain, and undoing that mutation can presumably bring back the sensation of pain, then pain must be a physicochemical phenomenon.”

    And then there are also people who are asexual, that is, people who don’t feel any sexual attraction at all.

    https://en.wikipedia.org/wiki/Asexuality

  11. I think it’s overly reductive to say that pain and love are strictly chemical phenomena. They are states and signals in a complex system; the particular medium of chemical encoding is not necessarily essential. It is not impossible that there could someday be silicon systems that experience love and pain, encoded as electrical or photonic signals and states rather than chemical reaction networks. So what love really “is” is something at a higher level of system abstraction.

    1. Actually, we don’t know whether an AI could ever be made to feel emotions the same way we do, so it’s a non-argument at this point.

      1. That’s fairly gratuitous skepticism. A silicon neural network composed of functionally equivalent artificial neurons, connected in the same topology as an animal brain, should be, well, functionally equivalent to the animal brain.

        1. It’s too early to speak about “functionally equivalent anything” in AI as our knowledge of how the brain works is still in its infancy.

          1. Well, we know that the brain is composed of a finite set of basic components, which obey specific physical laws and hence have specifiable behaviors (it doesn’t matter whether we currently know everything about those behaviors), and a finite set of interconnections that carry signals between said components. The state evolution of the system depends on those behaviors and connections. The chemistry of animal brains is sufficient to produce those behaviors, but there’s no reason to believe it’s necessary. It’s not really an AI issue; it’s a more basic question of salience. We can make a binary adder out of animal neurons, but obviously that doesn’t mean binary addition is defined by biochemistry (see the toy sketch below). The function is distinct from the medium.
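
            A toy sketch of that point (my own illustration in Python, with made-up threshold “neurons”, not anything from the cartoon or the thread): a one-bit binary adder built from abstract threshold units, where nothing about the addition depends on whether the units happen to be biological or silicon.

                # Illustrative only: McCulloch-Pitts-style threshold units.
                # A unit "fires" (returns 1) iff its weighted input sum
                # reaches its threshold; the substrate is irrelevant.
                def neuron(weights, threshold):
                    def fire(*inputs):
                        total = sum(w * x for w, x in zip(weights, inputs))
                        return int(total >= threshold)
                    return fire

                AND = neuron([1, 1], 2)   # fires only if both inputs fire
                OR  = neuron([1, 1], 1)   # fires if either input fires
                NOT = neuron([-1], 0)     # fires only if its input is silent

                def XOR(a, b):
                    # exclusive-or composed from the gates above
                    return AND(OR(a, b), NOT(AND(a, b)))

                def half_adder(a, b):
                    # one-bit binary addition: returns (sum, carry)
                    return XOR(a, b), AND(a, b)

                for a in (0, 1):
                    for b in (0, 1):
                        s, c = half_adder(a, b)
                        print(a, "+", b, "->", "sum", s, "carry", c)

            Swap the threshold units for real neurons, relays, or transistors and the addition is still the same function; it isn’t “defined by” any particular medium.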

          2. Look, we don’t know at this point how our consciousness arises from the neural network of our brain. Time will tell whether we will be able to reverse engineer the brain and build an artificial one. It’s easy to say that if we build an artificial brain that’s equivalent to ours, that brain could think and *feel* as a human brain does (ergo: the feeling of love, for example, is not confined to a biological system), but it’s a circular argument that rests on the assumption that we will in fact be able to build such a brain.

          3. No, it rests on the assumption of physicalism, i.e. that mental states (including consciousness) supervene on physical states of matter, and there is no spooky mind stuff. Whether a thinking, feeling artificial brain is feasible from an engineering standpoint is irrelevant to that argument.

            Now if you want to take issue with physicalism, knock yourself out, but then you’re into ghost-in-the-machine territory.

          4. It’s not a circular argument. I don’t have to be able to build an artificial brain to acknowledge that one could exist in principle. Function is distinct from substance; that’s the entirety of my point. I wasn’t originally talking about AI, but I don’t see any reason to assume that we won’t be able to synthesize a brain-like circuit in the not too distant future. The functional behavior of neurons doesn’t seem all that mysterious, so it’s mostly a question of mapping and replicating the network topology. We can already make chips with a number of silicon neurons comparable to a mouse brain; we just need to support the density of synapses and implement a comparable mapping of the connections. It’s not as far off as you seem to think.

          5. It is not irrelevant to the argument, because if you can’t replicate the human brain in silicon, you can’t say that silicon-based AI could ever harbor emotions like love and pain.

          6. Scientifik:

            Long before we confirmed the existence of any extrasolar planets, we could predict with confidence that their orbits would obey the same laws of celestial mechanics we’re familiar with in our own solar system.

            We have yet to discover extraterrestrial life, but we can be reasonably confident that any such life will have evolved by Darwinian selection processes, and that well-established corollaries of natural selection (kin selection, sexual selection, etc.) will apply in appropriate circumstances.

            So I don’t see why you insist that it’s circular logic to extrapolate what we know about the workings of the human brain to hypothetical artificial brains. Extrapolation of that sort is a staple of scientific reasoning. Any given prediction may turn out to be wrong, in which case we need to revise our models, but that doesn’t make prediction a crime against science.

          7. “No, it rests on the assumption of physicalism, i.e. that mental states (including consciousness) supervene on physical states of matter, and there is no spooky mind stuff.”

            I don’t have any issue with the assumption of physicalism (as all the available evidence supports it), only the assumption that we can “in principle” build the equivalent of a biological human brain. We currently have zero idea about how to give a computer neural network consciousness. And there’s no principle telling us that we ever will.

          8. There is no physical principle that says it can’t be done, so unless you think the natural brain is somehow “magical”, eventually (I cannot guess when) an artificial brain will be constructed.

          9. “There is no physical principle that says it can’t be done”

            We don’t even understand what properties of the biological neural network give the brain its consciousness, so how can you honestly say that we can in principle build the equivalent of our brain in a computer? What if we find that some aspects of wetware can’t be replicated in a computer?

      2. Actually, we do know: humans are proof that it’s possible to build such a machine. If you think there is a barrier to that, that’s pure speculation.

        Should such a barrier exist, only science would be able to find it.

  12. It seems to me that these discussions of “love” are always intended to exploit the emotional reaction that many people have when contemplating the idea of love, and to thereby derail any rational discussion of the relationship between love and whatever point they’re trying to make. I’m reminded of the trope I often hear from religious folks that since we believe in “love” even though we can’t explain it, we should believe in “god” even though he (she?), too, is inexplicable. However, in a general sense at least, we DO understand love. It is simply the name we give to a set of feelings, no different from the name we give to the feelings we call envy or confusion or anger.

  13. I see this quite differently. I have no idea what the cartoonist believes or was trying to get across, but this is consistent with a reading that makes a quite important point that atheists should care about, and have discussed for a long time. Especially, but not only, in the context of mental states. Let’s say we give a thorough account of, say, love, and find that it is, in fact, a complex chemical brain state. (There is good reason to believe it will be at least a disjunction of states, to satisfy the requirements of multiple realizability). Love is, we find, X (or X or Y or Z or …), where X, Y, Z, etc., are chemical, physical states. Non-naturalists have long countered that we have just “explained away” the mental state, that is, shown there is no love. But explaining isn’t explaining away. We’ve shown (or would have if we had the full explanation, which we don’t) what love is, but it doesn’t go away. To use the philosophical terms, we have reduced it, but not eliminated it. Same with any mental state, such as pain.

    Which is what the cartoon, in my reading, trades on. Yes, pain IS a brain state (which, if I am correct, your reading has the cartoonist denying), but the sheer, well, PAIN of it shows that it is real. It’s explained, not explained away. The cartoonist then draws the non-obvious analogy with love. “Explained” is not “explained away.” We haven’t “unwoven the rainbow” in explaining it. X IS Y, but that does not mean X is JUST Y.

    If I recall correctly, Dawkins makes this point even about the most naturalist of naturalisms, the selfish gene theory. At one level, the basic one, there would be just genes building bodies, and that’s it. But what they build is bodies capable of behaviors of love, caring, etc.

    This cartoon, and your explanation and mine, give me material for my philosophy lecture next week. My point isn’t “hey, look at me, correcting this mere biologist,” but rather, “Hey, isn’t this of interest to the philosophy of mind, and I might be wrong, or he might be wrong–let’s go to work looking at the arguments.” (Of course if you were to say more, I would include that!)

    1. “isn’t this of interest to the philosophy of mind”

      Just curious, how is the philosophy of mind different from the science of mind?

  14. To which I would add, since we are studying arguments from analogy in my class, that you point out a difference (between pain and love) which I would argue (again, not infallibly) is irrelevant to the point that the cartoon is making. I truly can’t see how the “emotional value” of love but (presumably) not pain has anything to do with it.

    1. I would propose a better analogy: our brain on love vs our brain on drugs. Discuss the two brain states 🙂

    2. How are you so sure you know what point the cartoon is making? I thought I did, but I may be wrong. Perhaps you might be as well.

      At any rate, a pinch would have sufficed; the kick in the balls means something!

      1. This is after all a cartoon, and as such is meant to make us laugh. I don’t think a pinch would have sufficed for that.

          1. I’m speaking of the cartoonist’s intention, not of his success at achieving his goal.

          2. I doubt that the intention of this mean-spirited strawman cartoon was to make anyone laugh.

  15. if a mutation in the DNA can remove the sensation of pain, and undoing that mutation can presumably bring back the sensation of pain, then pain must be a physicochemical phenomenon.

    I think this is an oversimplification. Consider a computer-controlled knitting machine in the process of knitting an argyle sweater. Changing a bit in the computer’s memory might well have the effect of destroying the pattern, and undoing the change might restore it. But we would clearly not be justified in concluding that argyle sweaters are therefore electronic phenomena.

    Pain and love are mental states. Such states (like sweaters) have a physical basis, but the fact that they’re coded or signaled chemically does not mean that they’re “just chemical reactions”. They’re cybernetic processes that happen to exploit chemistry in part of their implementation.

    1. FYI, “just chemical reactions” = just a natural process (it’s just chemical processes, hormones, genetic programming, neural activity). That’s all the shorthand means to convey. Love is not supernatural magic.

      1. If that’s what Blue-Shirt Guy meant, then why did he preface it by saying “I don’t believe in love”? Seems to me that what he’s saying is that love is nothing but chemistry, that our subjective feelings of love aren’t real.

        By kicking him in the balls, Green-Shirt Guy is saying “Is this real enough for you?”

        So no, it’s not a commentary on naturalism v. supernaturalism. It’s a commentary on eliminative reductionism.

        1. >If that’s what Blue-Shirt Guy meant, then why did he preface it by saying “I don’t believe in love”?

          Well, because he doesn’t believe love is anything supernatural.

          1. So in your view “I don’t believe in X” really means “X exists and is not supernatural.”

            Try substituting evolution or global warming for X.

          2. >So in your view “I don’t believe in X” really means “X exists and is not supernatural.”

            Yes, that’s all he could have meant in this case.

            This of course has nothing to do with not believing in evolution or global warming or any other thing, and pertains only to the proper unpacking of this strawman cartoon.

          3. “Yes, that’s all he could have meant in this case.”

            There’s plenty of evidence on this thread of other possible meanings. I gave one myself just a few comments above this.

            But if you refuse to acknowledge even the possibility of any other viewpoint than your own, there’s no point continuing the discussion.

    2. “Pain and love are mental states. Such states (like sweaters) have a physical basis, but the fact that they’re coded or signaled chemically does not mean that they’re “just chemical reactions”. They’re cybernetic processes that happen to exploit chemistry in part of their implementation.”

      Mental states are the product of the physical brain, just as the sweater is the product of a computer-controlled knitting machine. Mental states aren’t therefore some esoteric entity merely exploiting the brain.

    3. P.S. Your interpretation of mental states “exploiting” the chemistry of the brain comes dangerously close to the idea of a soul exploiting the physical body.

    1. “Much of the debate over their importance hinges on the definition of the term, and various philosophers emphasize or deny the existence of certain features of qualia. As such, the nature and existence of qualia remain controversial.”

      Good ol’ Philosophy!

  16. Well, however one sees this cartoon, I do like the SMBC cartoons. I have a bunch of favorites on this computer. The earlier ones from the author were often mean-spirited without being funny, imo, but over time they greatly improved.
