Famous philosopher and Templeton-Prize winner: science = faith

February 10, 2010 • 8:00 am

The Guardian continues its string of ludicrous essays defending religion against the encroachment of science.  The latest is a “Comment is Free” piece by Mark Vernon (you’ll remember him as the guy who wrote perhaps the all-time classic work of apophatic tripe: “God is the Question” [see a response here]), reporting (and praising) a talk by the Canadian philosopher Charles Taylor.  Taylor has raked in the cash for his efforts at reconciling science and faith: he won not only the 2007 Templeton Prize (one million pounds), but also the 2008 Kyoto Prize ($470,765).  I tell you, there’s serious money in accommodationism!

At any rate,  here’s Taylor’s point, much lauded by Vernon:  although science is usually based on evidence and rational inquiry, there are times when it’s not.  Those are the times when a scientist suddenly has an intuition, or a hunch, that turns out to revolutionize the way we see the world. So, for example, we have Einstein and relativity, Planck and quanta, and, I suppose, Darwin and evolution (although in that case his “hunch” evolved, so to speak, from looking at a lot of evidence).  According to Taylor and Vernon, these Kuhnian “paradigm shifts” are nothing other than a “leap of faith” or, to use religious words (which these people love to do), a “revelation”:

But take, for the sake of the argument, one of the best known attempts to understand what really happens in scientific reasoning, that put forward by Thomas Kuhn. . .

. . . What analysis of this kind suggests is that the reasonableness of science is partially true, during periods of what Kuhn called normal science, when puzzles are proposed and solved. However, during paradigm shifts, that evaporates. Science enters a period of flux and uncertainty until a new paradigm is settled. Intellectual wars break out too. Scientists stop talking to one another. They label opponents “heretics”. Then rational discourse breaks out once more – until the next shift.

The challenge is to understand what happens during the shifts. What processes are at play then? There’s a huge debate about this. But it is at least plausible that the rational periods of normal scientific enquiry are only possible because enough scientists have decided to go with the disruptive hunch or intuition. Certainly, they test it. And their tests “prove” it – until the next shift, that is.

So, the suggestion is that you could be forgiven for concluding that science is only possible because scientists are prepared to make a collective leap of faith, a commitment to the prevailing paradigm. Further, science just wouldn’t be possible if scientists always and everywhere adhered to the scientific method alone, the procedures that have come to define what counts as rational. Something other than repeated observations and correct inference is required for progress.

It’s because of him we have the phrase “paradigm shift” – those breaks between the science of Aristotle and Copernicus, or between that of Newton and Einstein. What happens, he thought, is that there is no procedural appeal to reason in these moments, no patient weighing of the evidence. Instead, there is a rupture, a revelation. Science finds itself teleported to a new world, in which even the questions it asked before now look foolish.

Indeed, sometimes scientists do rely on intuition.  And I suppose you could, in some cases, use the word “faith”.  Einstein, for example, thought that his theory of relativity was true simply because it had to be true: he knew in some way that his beautiful equations represented the state of the universe.  But what do we mean by “faith” here?  In the case of Einstein—or Darwin—their “faith” meant this: trust or confidence that their hunch was correct.

Taylor and Vernon, however, want us to take “faith” in its other, religious sense:  belief in God, the supernatural, and things that can’t be verified empirically.  Trusting that the reader won’t notice this sleight-of-hand, they then proclaim that, like believers, scientists take leaps of faith. Here’s what Vernon says:

To put it another way, the neat distinction between science and religion unravels, for religion involves commitments made on faith too. You might protest: revelation purports to come from God and is untestable, two characteristics that the scientist would certainly reject. Except that regardless of its source, a revelation can only make an impact if it makes sense to people, which is to say that they test it against their lives, that it can account for the evidence of their experience, like a theory. Revelation can only bear the weight of significance when people have engaged with it rationally too.

Moreover, a particularly successful religious revelation, or should we call it a “faith hunch”, may come to have global appeal: it becomes a kind of universal language. The Christian in Santa Fe can worship with the Christian in Shanghai. Perhaps in this respect religion is closer to science too. We might take Taylor’s lead and discuss, rationally if we can.

Can the lucubrations of philosophers and journalists manqués get any sillier than this?  A scientist’s confidence that he or she is on the right track is not the same as religion’s absolute belief in the verity of propositions that can’t be supported empirically.  And, of course, none of these scientific “leaps of faith” are accepted by scientists as true until they’re vetted by scientific experiment or observation.  Einstein’s general theory of relativity, for example, wasn’t widely accepted as a true theory until Eddington demonstrated the bending of starlight around the sun during an eclipse in 1919. In what way does this equate to a believer’s assertion that Jesus died for his sins because that believer simply knows that it’s so?

Now Vernon seems to know that something is amiss here. After all, he notes that “revelation purports to come from God and is untestable, two characteristics that the scientist would certainly reject.”  But he then implies that revelations have their own sort of “truth,” for they “make sense to people”, who “test [these revelations] against their lives, that it can account for the evidence of their experience.”  But is that the same as testing the theory of relativity? Certainly not, for those revelations that are “tested” against people’s experience, and “make sense” to them, conflict among people of different faiths!

To a Muslim, Mohamed was the prophet of God, while Jesus was certainly not the son of God.  To a Christian, things are reversed. To a Hindu, neither is true, and what “makes sense” is a complex polytheism.  The lack of agreement among the claims of faith, but the requirement for agreement in science, is the crucial difference between scientific truth and religious “truth.” It would be well if Vernon and Taylor could grasp this simple distinction.  True, “the Christian in Santa Fe can worship with the Christian in Shanghai”, but the Christian in Santa Fe cannot worship with the Muslim in Santa Fe!

And of course Taylor and Vernon might consider that what “makes sense” to religious people is remarkably coincident with what those people were taught as children.  Most Muslims don’t accept Islam because it makes more sense to them than, say, Christianity.  They accept it because, when they were children, they were taught that Islam was true.

All this is obvious.  What may not be obvious is the conflating of the two meanings of the word “faith” by those who assert that both science and religion rely on faith.  This is a philosophical shell game.  Maybe Vernon is taken in by it, but he’s small potatoes compared to Taylor, a man of reputation and, now, wealth.  People lap up this kind of stuff, so eager are they to hear that they really can retain their religious beliefs in the face of creeping atheism and materialism.  And they don’t want to look too hard at the arguments.  Even very smart people can be gullible when it comes to claims like this, and that gullibility translates into wealth and fame for people like Taylor.

If they want to give a Templeton Prize to a philosopher, how about Anthony Grayling or Dan Dennett?


UPDATE:  Over at Mark Vernon’s website, it says this: “Mark Vernon is a writer, broadcaster and journalist. He began his professional life as a priest in the Church of England: it may not seem an obvious step from there to journalism but writing a sermon is remarkably similar to writing a feature; and speaking to parishioners is remarkably like talking to a microphone.”

75 thoughts on “Famous philosopher and Templeton-Prize winner: science = faith”

  1. Good post, Jerry.

    Surprise, surprise, accommodationists desperately scraping the bottom of the tu quoque fallacy barrel!

    Yeah, they confuse trust and confidence (based on the fact that science works and can spot its errors) with inane, bizarre, misplaced, bonkers, grasping, futile religious faith.

  2. Nobel prize-winning physicist Steven Weinberg wrote an interesting article on Thomas Kuhn in the New York Review of Books titled “The Revolution That Didn’t Happen.” I recommend the article to those who still use the term “paradigm shift.”

  3. So…science gets new ideas based on empirical observations. Are they claiming the same is true for religion?
    For example, science tells us that human beings are an ape species living on a rocky planet whose share of the observable universe is practically zero. Will any religious scholar conclude from this that we are not so important to the creator of the universe for him to send his son to die for our sins, or a winged horse for one of us to go meet him in person?

  4. In my Intro to Philosophy of Science course, I was taught the distinction between the Context of Discovery and the Context of Justification — doesn’t everyone who writes about this stuff know this? We know that benzene has a circular structure not because Kekule dreamed of snakes eating their tails, but because he used that insight to do empirical tests. Plenty of people have had insights that are wrong — for science what is important is not how a hypothesis is reached (Context of Discovery), but how it is demonstrated to be true (Context of Justification).

    Or, put more simply, they laughed at Einstein, but they also laughed at Bozo the Clown.

    1. I wouldn’t take the snake dream story at face value.

      A hexagonal structure for benzene had already been proposed, in print, no later than 1854—six years before Kekule’s purported snake dream, and decades before his first known mention of the dream in which it supposedly all came to him in a flash.

      It’s not clear Kekule didn’t get the idea from there, or from some other source. Maybe it was just floating around.

      Or he may have reinvented the basic idea himself, when awake, and misremembered it as coming to him all at once in the dream, when in fact the dream was just a reminder, or a salience-raiser.

      Or maybe not even that; he may even have been trying to avoid giving credit to prior work, or just “telling a story” that was a bit simpler and more “interesting” than the poorly remembered humdrum truth.


      For a variety of reasons the snake dream story is highly suspect. It’s the kind of thing people are very bad at recollecting accurately, even if they’re trying to tell the unvarnished truth. People tend to forget most of the very tentative and incremental steps in working something out, and misremember it as having just “come to them” via irreducibly creative insight.

      Most of the steps are not memorable, because when you make them, you don’t know whether they lead to a solution, or the solution—you only know what leads to the solution later, when you’re pretty sure it is the solution, and by that time you’ve generally forgotten most of the actual steps.

      Once Carl Jung took the snake dream story and ran with it, the anecdote entered the realm of myth. Of course, Jung didn’t know what we know about just how fragmentary memory really is, and how much unconscious reconstruction, revision and contamination is involved in remembering things, and he had his own axes to grind about the importance of dreams.

  5. I think Taylor also mischaracterizes the nature of these “scientific leaps of faith”. It’s not like physicists thought they had it all figured out in 1900, and then Einstein, out of nowhere, had some kind of divine intuition that told him everything they thought they knew was wrong.

    Physics at the time had some unsolved problems, like squaring Maxwell’s equations with the newly discovered quantum properties of light. Einstein thought about these problems for a long time and proposed some brilliant solutions.

    I’m sure lots of physicists proposed other solutions which turned out not to hold up under experiment. Even Einstein was famously wrong in some of his “leaps of faith”, such as his proposed cosmological constant, and his skepticism towards quantum mechanics (“I am…convinced that [God] does not throw dice.”).

    Also, it’s not as though Einstein proved Newton completely wrong. It just turns out that Newton’s laws were only an approximation of the truth, while Einstein’s were somewhat more accurate approximations.

    Paradigm shifts (or whatever you want to call them) in science can be ugly sometimes, but these debates are settled rationally by examining empirical evidence. Thus science can claim progress towards greater understanding of truth in a way not possible for religion.

    1. Even Newton was aware that he didn’t quite get things right. Nasty Mercury – and blame people like Copernicus for making such good measurements to show up a flaw in Newton’s scheme. I certainly don’t envy the volume of calculations that had to be done by hand in his era though.

  6. Scientists don’t get very far convincing other scientists of their ideas with “faith”.

    Can you imagine: Imaginary Einstein just had “faith” in atomic theory, or in special relativity, but he never got the equations to fit the available data and never produced a theory that made a verifiable prediction, but his theories just became accepted on faith alone?

    Great indeed was that leap of faith!

    Scratch “faith” and replace it with “intuition” that then has to be tested against reality before anyone takes it seriously.

  7. It’s interesting that you credit the conception of quanta to Planck in the same sentence that you mention Einstein. Einstein actually won his Nobel Prize for his work on quanta. Even though Planck laid most of the groundwork (such that we have Planck’s Constant), he remained in the old guard until his death.

    1. Planck first proposed the energy quantum, but not through any fit of a priori inspiration. He did it as a fudge to make theory fit the facts about the radiation spectrum.

  8. Isn’t the obvious flaw in Taylor and Vernon’s equivalence game the large number of hunches from scientists that end up as crumpled pieces of paper in the waste paper bin? Were all those ‘revelations’ some marvellous deliverance from some ineffable source? Hardly – just brain storms that went nowhere. Probability dictates that clever blokes thinking about something a long time may eventually stumble upon the right answer. (Not to downplay them; well done to you guys who’ve done it :-)). Isn’t that more likely than attribution to some kind of ‘magical’ intuition?

    Now, how do we decide which *religious* hunches should be consigned to the waste paper bin? Oh yeah. Test ’em.

    1. Yep.

      I am not a scientist, but I am a software engineer, and sometimes when I am trying to find a problem with a system, I absolutely engage in leaps of intuition. Why, just last week I made an intuitive leap that was particularly non-linear and unexpected, and led us to get much closer to the underlying cause of a major problem.

      But on the way there, I had probably a dozen other intuitions that turned out to be completely wrong. Funny thing is, I was able to test them…

  9. “the Christian in Santa Fe can worship with the Christian in Shanghai”

    I suppose that is why there are 30,000 sects of Christianity.

    Of course, the ‘revelations’ story told about these scientists is nonsense:

    Everyone knows that Darwin based his ideas on the work of previous people plus years of observation, study and correspondence.

    Einstein’s education started at ten years old with texts in science, mathematics and philosophy, including Kant’s Critique of Pure Reason and Euclid’s Elements. He studied Maxwell’s electromagnetic theory in his mid-teens, then went to the Polytechnic in Zurich.

    Max Planck studied mathematics and physics in high school and worked with several mathematicians and physicists in his early career.

    It is an insult to these scientists to claim that their ideas just ‘came’ to them and denies their education, hard work and dedication.

  10. True, “the Christian in Santa Fe can worship with the Christian in Shanghai”, but the Christian in Santa Fe cannot worship with the Muslim in Santa Fe!

    Furthermore… Not only can the physicist in Santa Fe collaborate with the physicist in Shanghai, but the physicist in Santa Fe can collaborate with the astronomer in Santa Fe. Hell, the physicist in Santa Fe can collaborate with the astronomer in Shanghai, for that matter.

  11. I’m getting a little tired of hearing about Kuhnian paradigms.

    If we compare Newton’s laws of motion and gravitation to relativity, here’s what we get: they both describe relationships between entities so that seemingly unrelated empirical discoveries can be incorporated into one philosophical system. Newton’s laws are very good approximations to special cases of relativity, which is to say, both theories are consistent for the vast majority of all empirical observations that had been made before relativity was published.

    No leap of faith is required, since the two theories are consistent on the bulk of all observations that can be made by human beings. The only question is which better incorporates the remainder of observations into a single philosophical system with the first bunch. And due to Newton’s laws relying on the notion of an absolute frame of reference, relativity seemed even a priori to be the better theory.

    Notice: the real “leap of faith” was the belief that there is an absolute frame of reference, which A) feels “right” from a human perspective and B) was required to make sense of Newton’s otherwise relatively intuitive laws. So if you want to call the switch to relativity a “paradigm shift,” you’ll notice that instead of constituting a leap of faith, it actually eliminated a leap of faith.

    This should underscore the differences between “faith” in scientific principles and religious “faith.” Productive work in science should eliminate leaps of faith (or really, assumptions) by deriving them from lower-level principles. For religion, faith is a virtue. In science, it is a vice.
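    The comparison above can be made concrete with a standard textbook expansion (offered here only as illustration): for speeds small compared to c, the relativistic energy reduces to the rest energy plus exactly the Newtonian kinetic term, with corrections suppressed by powers of v²/c².

```latex
E \;=\; \frac{mc^2}{\sqrt{1 - v^2/c^2}}
  \;=\; mc^2 \;+\; \tfrac{1}{2} m v^2 \;+\; \tfrac{3}{8}\,\frac{m v^4}{c^2} \;+\; \cdots
```

    The correction terms are negligible at everyday speeds, which is why Newton’s laws survive as the low-velocity special case rather than being “proven completely wrong.”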

  12. “there’s serious cash in accommodationism!”

    Man, I’ll say. I wish one of my previous employers were as free with their checkbook as the Templetons are.

    1. “Heresy” comes from the Greek for “choice” — “heretics” are those who dare to “choose” their beliefs, often based on reason or evidence.

      But no, I don’t think scientists use the word except as a joke.

  13. Correction:
    If they want to give a Templeton Prize to an HONEST philosopher, how about Anthony Grayling or Dan Dennett?

    Prostitution remains prostitution; it doesn’t matter whether carnal knowledge is involved.

  14. Oh, how many scientists have had an ‘intuition’, and been proven wrong?
    And that’s not heresy or a fault, it is science at its best, getting light into the darkness, but not always finding what was sought.
    As opposed to religion which is generally about turning all light off.

  15. Oh, how many scientists have had an ‘intuition’, and been proven wrong?

    I imagine even more scientists have had a flash of intuition and then worked out that the flash of intuition was flawed.

      1. You mean Jean-Baptiste Botul’s study of Immanuel Wont?
        In BHL’s defense, Botul’s conferences at the Neo-Kantian Association of Paraguay were held in Guaraní; the gist of his argument may have waxed flabby (q.v.) in translation…

    1. I think most philosophers would consider Charles Taylor’s recent work to be retrograde, not cutting-edge. I have to say though that as I’ve followed the debates surrounding atheism, I’ve developed a bit of embarrassment at having studied philosophy as an undergrad. Yes, those trained in the sciences may sometimes oversimplify, but people in philosophy do seem to have a pattern of saying ridiculous things, even if they are a small minority.

      1. I think there’s a playful or cheeky aspect to philosophy that can’t be denied. Russell described it as a method of using simple, unarguable premises to arrive at dotty conclusions. And what a strange project! Why would anyone besides a smug contrarian admit to this sort of thing?

        Well, sometimes it is wise to engage in a bit of exaggeration. Every time we say something is “true” or “false”, we’re actually engaging in a bit of rounding up concerning the balance of probabilities. The transition from Newton to Einstein was actually the transition from one theory which was more-or-less true to one that was more true, but this is both linguistically cumbersome and slightly embarrassing to say. Evidently, as far as science communication goes, you’re going to end up humiliated no matter what you do — at least, until you suck in your gut and start asserting yourself clearly.

        So Dawkins, for instance, tells people right on the cover of TGD that if they believe in God, they are delusional; but then, if they’re willing to give his view a shot by reading and comprehending his actual views, they find that he describes himself as a “6/7 atheist”. It’s a minor leap of language that is intended to spark some outrage, a moment of rhetorical flourish to wake people up to the stakes and consequences of the ideas. Obviously he’s not alone. PZ Myers does it. Even Galileo did it, if you want to get really serious.

      2. Ben Nelson writes: “The transition from Newton to Einstein was actually the transition from one theory which was more-or-less true to one that was more true…”
        Nice, but not quite true. Newtonian physics was the best possible approximation within the reference framework and experimental data available at the time. Hence, “true” until new data, new concepts, new instruments forced re-evaluation. Within its framework, as Roger Penrose phrased it: SUPERB.
        As for Galileo’s rhetorical exaggerations: such flourishes were common devices in 17th century rhetoric, the ‘Dialogo sopra i due massimi sistemi’ being exceptional for its content, but not singular in its form. One of those who saw through its maieutics (and, not amused, criticised Galileo for offering a dialectical exchange of punch-lines rather than a rigorous, cut-and-dried analytical demonstration) was, paradoxically, the ‘heretical Aristotelian’ Antonio Rocco.

        “Every time we say something is “true” or “false”, we’re actually engaging in a bit of rounding up concerning the balance of probabilities.”
        Nicely put, but I doubt that Newton, let alone Galileo, would have identified themselves with an asymptotical approach to truth. Interestingly though, calculus growing more familiar seems to have accredited this notion by the mid-18th century. In 1764, the Göttingen physicist and eminent wit G.C. Lichtenberg wrote [Sudelbuch A 1]:

        “The great artifice of taking minute deviations from truth for the truth itself, whereupon the entire differential Calculus is built, is also the foundation of our witty thoughts, whereas the whole would collapse, should we partake of the deviations in philosophical strictness.”

        (My translation, as no complete English version of this aphorism seems extant. The original reads:
        “Der grosse Kunstgriff kleine Abweichungen von der Wahrheit für die Wahrheit selbst zu halten, worauf die ganze Differential-Rechnung gebaut ist, ist auch zugleich der Grund unsrer witzigen Gedanken, wo oft das Ganze hinfallen würde, wenn wir die Abweichungen in einer philosophischen Strenge nehmen würden.”)

      3. Occam, thanks for the note. Hopefully I’ve understood it correctly. On the question of Einsteinian and Newtonian paradigms, a lot will ride on whether we accept that probabilistic/comparative characterization of truth or a different one. But if we start from the point where we say, I think plausibly, that the probabilistic account is always the best for our purposes in retrospect, then it’s interesting to be able to talk about and infer what we have to say about the real live cases. Likewise, I don’t mean to suggest that Newton had a probabilistic sense of truth, any more than he was a fictionalist. Rather I think (at least for the purposes of the argument) that he was a realist, and naturally drawn to the realist form of language for the practical reasons I invoked. It could have just been the style of the time, but still. The point applies to Galileo as well. Duhem had his allegiances challenged after reading through the historical record and sympathizing, much to his chagrin, with some of the more sophisticated critics on behalf of the Church, even though as it turns out Galileo was right in every sense that matters.

        That Lichtenberg quote is interesting and potent (and thank you for it), especially since it seems to pose a challenge to my “philosophical playfulness” thesis, which was the main point. But of course I only presented one side of what I meant. There’s a sense in which philosophy is play, and a sense in which it is deadly serious, though in what combination we put these varying temperaments depends on the kind of philosopher.

  16. From a public hygiene perspective, this oligophrenic anticlimax raises the question whether recent NHS budget cuts are impairing the ability of the health authorities to respond to this latest outbreak of Taurocoprophagia perniciosa.

  17. We should stop using the word “faith” in connection with science and scientific theories. The correct word is “confidence”.

    Confidence always decreases as counter-evidence mounts, whereas faith typically increases.

      1. Trust is no good – the godbots love ‘trust’ because ‘trust=faith’. After all, a good scientist doesn’t really trust all that much – it’s a matter of reviewing the evidence and the arguments.

  18. There’s an interesting analogy with the field of metaheuristics (which includes genetic algorithms as well as very simple techniques like hill-climbing). [Roughly speaking, in a metaheuristic algorithm, you take a potential solution to a set problem and then “tweak” it and see if it improves. If it does, you replace your current potential solution with the new one and repeat.]

    There’s a significant trade-off in how much you tweak a solution from turn to turn. If you tweak it only a little, you can only climb whatever local optimum you’re sitting on at the moment, whereas if you tweak too much, you can move to other optima but are less effective at climbing your current one. How user-tunable the tweak size is accounts for a major part of the difference between algorithms, and some algorithms have adaptive tweak sizes, or tweak sizes that vary over time.

    I very much doubt that one would describe a hill-climbing algorithm with a large tweak size as having regular “leaps of faith”.
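    The loop described in the bracketed summary above can be sketched in a few lines (the function names and toy objective are my own illustration, not taken from any particular metaheuristics library):

```python
import random

def hill_climb(objective, x0, tweak_size, iterations=10_000):
    """Basic hill-climbing: tweak the current candidate and keep
    the tweak only when it improves the objective."""
    current = x0
    for _ in range(iterations):
        candidate = current + random.uniform(-tweak_size, tweak_size)
        if objective(candidate) > objective(current):
            current = candidate  # accept the improvement, else discard
    return current

# Toy objective: a single smooth peak at x = 3.
def f(x):
    return -(x - 3.0) ** 2

random.seed(1)  # fixed seed for a reproducible run
best = hill_climb(f, x0=0.0, tweak_size=0.5)
```

    With a tiny tweak_size the search can only crawl up whichever peak it starts on; with a huge one it can jump between peaks but makes a sloppy job of the final climb, which is exactly the trade-off described above.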

  19. Great post, Jerry, although Darwin wasn’t the only one whose hunch sprang from cold and hard appraisal of the evidence. Planck’s and Einstein’s did too.

    In the case of Planck, the assumption of the quantization of energy was the only way he could theoretically derive the empirically established curve for black body radiation. He had worked on this problem for years, painstakingly trying to get the data from known thermodynamics and statistical mechanics. In the end, quantization was basically forced on him. (And even still, he was profoundly cautious about interpreting this result physically.)

    In the case of Einstein’s general relativity, “intuition” doesn’t really come into the mix until we try to derive the precise form of the field equations. Einstein probably wouldn’t have been “sure” his field equations were the right ones if it weren’t for that fact that he used his theory to derive (with impressive accuracy) the known results concerning the perihelion precession of Mercury.

  20. And work on the psychological bases of ‘creativity’ in science (such as that by Dean K. Simonton and Kevin Dunbar) shows that the ‘creative spark’ moment is a lot simpler in practice than some quasi-mystical inspiration: sheer hard work, repetition of approaches in slightly different combinations, trying variations until eventually a solution is arrived at. Even computer-based simulations of ideas being exchanged in combination can illustrate how new ideas emerge. No need for cosmic magicians, sprinkles of fairy dust or yogic enlightenment. Sheer hard work, experimentation, dedication, persistence and exchange of ideas pay off eventually… that’s science… divine inspiration is for the lazy conmen of this world.

  21. Here are two point from an old man’s life experience concerning the misuse of the word faith:

    First, often when using a doctor for a surgical procedure, someone might argue that you “must” have faith prior to submitting to the surgery. To which I say nuts!

    When I needed cataract surgery on both eyes, I chose my surgeon and hospital on the basis of two criteria: I learned the doctor (52 years old) had performed over 7,000 such surgeries in the same hospital and had over a 98 percent success rate.
    Therefore, the word faith had nothing to do with my decision; it was made on the basis of evidence, bolstered with “rational expectation,” not faith in either the secular or the religious sense.

    Second, many times over my life when I faced a problem with construction designs, limited by what materials I had available, I would have trouble clearing my mind for sleep. Then somehow, overnight, suffering in that twilight mind between sleep and wakefulness, I would arise in the morning with a clear, lucid choice.

    After many years, I came to the conclusion that in that funny half-way sleep, my mind was able to overcome its fully conscious biases and predetermination, allowing the well- spring of new possibilities to remove those same clouds. That is exactly what I propose happened to Einstein and almost every new discovery across the board — and whether in a half conscious daydream or a night dream, new sights are found, replacing the extremely limiting biases, etc. that are a standard limiting knowledge base.

    Even then, faith, in the religious sense, does not have a damn thing to do with the discovery, which in its own turn turns on the light of newer, “rational expectations.”


  22. I sometimes wonder whether religious people ever have a single idea in their lives. You know, something that just occurs to them? I have ideas all the time. Good job I’m not religious, otherwise I’d have to assume that a god was talking to me non-stop.

  23. “It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong” (Richard Feynman)

    The Jesus stories might be beautiful, but they fail any experimental test. But that doesn’t seem to deter some “smart” people from clinging to the many religious dogmas.

  24. Nice response to a silly claim, still well worth responding to.

    One further point: Did anyone tell Taylor that the largest paradigm shift ever was the outing of spirituality, theology and revelation from philosophy and science?

    And let’s keep it that way …

  25. What utter nonsense; I would expect even a freshman to see that Vernon is an imbecile who does not understand the least thing about science. While thinking about a problem for a very long time (years or even decades), scientists sometimes come up with a good idea. They then spend a lot more time testing their ideas to see that they match available data, to see if they explain some currently unexplained but observed phenomena, and to see if they can make predictions verifiable by experiment. There is no faith – only a search for evidence which may suggest that the idea is OK, and a search for evidence which will invalidate all or part of the idea. Vernon is full of shit.

    1. Somewhere on the planet there is an essay by Isaac Asimov which covers a few of these alleged ‘paradigm shifts’ of Vernon’s. One example was the flat earth idea. If you step outside and look around, the earth looks pretty flat. In fact, for most purposes flat is a very good approximation. But with careful observation (and the ancient Greeks did this) you can find evidence that the earth isn’t quite flat. No matter how far you travel in any direction, the horizon still appears to be the same distance away – you never approach what looks like the edge of the disc you stand on. That (and numerous other things) is consistent with a spherical earth. So to some of the ancient Greeks, the earth was a sphere – a well-supported claim which the catholic church had denied for over 1400 years.

    The world did not suddenly change, nor was all that was believed to be true thrown out. No, people just had to say “oh, the earth isn’t really flat after all – it’s a giant ball”. About 2500 years later other refinements were made – the earth is slightly oblate, and later still, slightly bulkier in one hemisphere. For over 40 years we’ve had various geoid models which are even more complex than the oblate, pear-shaped globe. Hardly anything changes; we just know more, and know things in greater detail. If you step outside, the world still looks flat. If you plan to build a house, you make flat surfaces. The only loser in all this is religion; people found religion incredible and silly even before the alleged birth of the zombie god-man, and as we learn more about nature, religion only grows duller and sillier.

    2. Didn’t Asimov also note that:
      The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ (I found it!) but ‘That’s funny …’

  26. The thing that really annoys me about articles such as Vernon’s (and a similar one in the Guardian this week by Thomas Jackson http://www.guardian.co.uk/commentisfree/belief/2010/feb/09/religion-theology-science) is that they are written by people who have never performed a scientific experiment in their whole lives.

    Instead, they have formed their views on the basis of all sorts of suppositions for which there is no evidence, linked together by reasoning which is full of logical fallacies.

    They then consider the results of some scientific advance, where people have put in real effort at observing and understanding how reality is. But they don’t take the trouble to master the mathematics of it which is really necessary to understand what is going on. Instead, they read a summary designed for laymen, which uses analogies, language and concepts which laymen might have a chance of understanding. Because the world of physics at a very large scale and especially at a very small scale is seriously weird, such summaries tend to express a degree of mystery in order to convey this weirdness.

    And the religious jump on this language of mystery and then claim “but this is just what we have been saying all along!”

    It’s bunkum. It is lazy thinking. And it is at best a complete misunderstanding of the actual science and at worst a deliberate and dishonest misrepresentation of it.

  27. Good article. I’ll keep it brief. Although it may be getting on for 30 years old, the brilliant book The Dragons of Eden: Speculations on the Evolution of Human Intelligence by the peerless Carl Sagan has some interesting ideas about the left and right hemispheres of the brain – specifically the analytical left and the intuitive right, and how science is a mix of both kinds of thinking.

  28. Thank you for this piss-take on Taylor, who’s been an irritating prat longer than I can remember. As perhaps a token theist on WEIT, I have to say that Taylor’s noodling strikes me as particularly shallow, as I’ve read “good” philosophy of religion and he is most assuredly not it.

    1. I just looked up that website, and discovered that Vernon used to be a priest in the Church of England! See “update” at end of post.

  29. “If they want to give a Templeton Prize to a philosopher, how about … Dan Dennett?”

    Templeton Criteria of Merit: “The judges seek, above all, a substantial record of achievement that highlights or exemplifies one of the various ways in which human beings express their yearning for spiritual progress.”

    Dennett: “Safety demands that religion be put in cages …”

    Worthy of “spiritual progress?”

  30. This is so much bullshit, this intuition and leap of faith nonsense.

    I’m not a scientist, but I am an artist. What I know about intuition from it is that it’s not a “leap” at all; it’s an outgrowth of having internalized the basics so well that you’re free to build from the foundation of knowledge/techniques acquired after long, arduous hours of learning and/or training.

    Think of it this way: Mozart didn’t become Mozart because he was a musical genius, but because he was a musical genius who had spent hours upon hours practicing scales and etudes, to the point of knowing to perfection the relationships of sounds. Of all composers, his huge body of work demonstrates a ferocious level of traditional music training–probably several hours every day, for years, of learning to play scales.

    I could go on with examples like Joyce for literature or Picasso for art, but the process is always the same–when you have the basics down pat, then you can build on them to develop something new. It’s surely the same for science.

  31. Taylor’s understanding of science is pretty bad – he’s on record (at least as of ~25 years ago) as thinking that science has to accommodate more human factors. Fair enough, though vague, but he goes on to say that a good way to do that is to allow for things like psychoanalysis, which most of us would likely regard as pseudoscience.

  32. Einstein’s theory of relativity was not as much of a creative hunch, or as original, as popular myth would have it.

    (Most instances of “creative leaps” turn out that way on closer inspection. Hunches and creativity turn out to be heuristic inferences and incremental processes, not brilliant magical leaps of insight.)

    The idea that all motion was relative has been around a long time, probably forever. Leibniz, in particular, disagreed with Newtonian physics because he thought that absolute motion didn’t really make sense. Leibniz had a hypothesis of relativity, but never came up with a worked-out theory. Still, he always thought Newton was wrong, for reasons that still make sense, and he was right.

    When Einstein came up with his actual theory of relativity, he didn’t do it out of sheer creative insight. He mostly just took the Michelson-Morley experiment seriously, interpreting it as confirmation that relativity was probably true, that an absolute frame of reference probably could not elegantly fit the facts, and that the speed of light was constant and likely somehow fundamental.

    Leibniz did not have such good evidence that relativity was true, or a specific theoretical constraint to work with, like keeping the speed of light constant. He mainly thought that (1) moving frames of reference clearly worked—e.g., if you drop a ball on a moving ship it falls “straight down” relative to the ship’s frame of reference, (2) an absolute frame of reference was an unnecessary assumption, and (3) he couldn’t figure out what an “absolute” frame of reference could really mean anyhow. (E.g., what would it mean if the whole universe was stationary, vs. the whole universe being an inertial frame, with nothing else to gauge its motion relative to?)

    Einstein didn’t invent relativity, he just made it work. And he mainly made it work by taking empirical constraints seriously and running “experiments”—thought experiments—to rule out a lot of options, applying Occam’s razor and so on. There was no magical insight or leap of faith, just excellent reasoning, guesswork, and a careful process of elimination.

    And contra Kuhn, there was no basic incommensurability between paradigms. Before Einstein, physicists were able to understand Leibniz’s points about relativity—they understood the idea of relative motion, and the idea that there might be no additional fixed frame of reference. What they lacked, until Michelson-Morley, was compelling evidence that Newton was wrong, evidence which would both motivate and guide the search for a theory of relativity that was right.

    Einstein was certainly a genius, but he was a data-driven genius.
