Russell’s Hedgehogs and Hirst’s Shark

by Daniel Loxton, Feb 07 2012

Today I’d like to share a piece of good practical advice from philosopher Bertrand Russell—and to share some reflections that touch upon it.

To avoid the various foolish opinions to which mankind are prone, no superhuman genius is required. A few simple rules will keep you, not from all error, but from silly error.

If the matter is one that can be settled by observation, make the observation yourself. Aristotle could have avoided the mistake of thinking that women have fewer teeth than men, by the simple device of asking Mrs. Aristotle to keep her mouth open while he counted. He did not do so because he thought he knew. Thinking that you know when in fact you don’t is a fatal mistake, to which we are all prone. I believe myself that hedgehogs eat black beetles, because I have been told that they do; but if I were writing a book on the habits of hedgehogs, I should not commit myself until I had seen one enjoying this unappetizing diet.1

This straightforward advice—try not to take people’s word for stuff, especially when we’re promoting a position in public—is a core skeptical concept. It underpins all of skeptical scholarship, for responsible skeptical outreach demands the due diligence of checking everything twice. Someone says they saw something? Maybe they did, and maybe they didn’t. We ought to try to find out. Someone says they know something? Well, maybe they do—and maybe they don’t. If skeptical sources (for example) confidently assert that a case is solved or a paranormal topic debunked, we ought to ask ourselves, “I wonder if this topic is really understood, and how well?”—and then try to find out before repeating assertions from the sources we admire. Sometimes it turns out that the best available scholarship is preliminary, or incomplete, or even downright speculative.

As I’ve learned again and again from my own research experience, it’s not even safe to assume that apparently reputable secondary sources provide accurate quotations, let alone correct analysis. This reminds me of a saying in my family: “Everyone is just some guy.” Celebrity authors, paranormalists, scientists, skeptics—all just people feeling their way as best they can with the incomplete information they have in front of them. I’m just some guy; why take my word on anything much? Why take anybody’s?

And yet we have to. There is no practical alternative: we have to take other people’s word all the time, on all sorts of stuff. As I’ve argued, we lay skeptics have extremely little justifiable ability to dissent from the prevailing current of opinion among domain experts on any topic of mainstream science or scholarship. Without deep expertise earned through years of training, we are often unable even to understand why experts think the things they do, let alone determine whether they’re right.

Nor is the course always clear even among experts within their own fields—perhaps especially in areas relevant to skeptical research. Consider this troubling meditation from parapsychologist and skeptic Susan Blackmore, reflecting upon her Ten Years of Negative Research in Parapsychology:

How far could I generalize these negative results? …I had to ask whether my negative results applied only to those experiments carried out by me, at those particular times, or whether they applied to the whole of parapsychology. There is no obvious answer to that question. … How could I weigh my own results against the results of other people, bearing in mind that mine tended to be negative ones while everyone else’s seemed to be positive ones? …  At one extreme I could not just believe my own results and ignore everyone else’s. That would make science impossible. Science cannot operate unless people generally believe each other’s results. Science is, and has to be, a collective enterprise.

At the other extreme I could not believe everyone else’s results and ignore my own. That would be even more pointless. There would have been no point in all those years of experiments if I didn’t take my own results seriously.2

What’s the answer to these conflicting challenges to our wish for reliable knowledge? There is no answer. Or rather, there are techniques to somewhat reduce our fallibility—techniques we call “science”—but no magic window on reality. We’re stuck with uncertainty on all topics, at all times.

Now, don’t get me wrong. Bertrand Russell was quite right to observe, “When one admits that nothing is certain one must, I think, also admit that some things are much more nearly certain than others.”3 But let’s try to set that too-easy response aside for a moment. Let’s set ourselves on the less secure footing of genuinely confronting uncertainty—of letting the problem of uncertainty resonate for a while, before turning to the standard canned answer.

Skeptics make much of our rhetoric of the virtue of doubt, but often we mean merely that we think we are right and the other guy is wrong. We may well be correct, but the belief that we are correct is small achievement—no more and no less than what everybody thinks already. We are, all of us, built for belief. “Man is a credulous animal,” Russell explained, “and must believe something; in the absence of good grounds for belief, he will be satisfied with bad ones.”4 Given the innate human talent for unearned certainty, I submit that it is valuable for skeptics to open ourselves to the idea that the world is very much more complicated than we currently understand.

This feeling of intellectual vertigo is easier to describe than it is to achieve—and it is effectively impossible to sustain. I’m reminded here of the title of Damien Hirst’s famous sculpture featuring a preserved shark, “The Physical Impossibility of Death in the Mind of Someone Living.” Like the knowledge of our own mortality, we’re just not built to comprehend the depths of our own ignorance, nor to feel the possible truth of things we believe to be false. I mean, we can say it—“Maybe they’re right”—but moments of truly honest inward doubt are rare and vertiginous things. There’s abstract understanding that we could, in principle, for the sake of argument be wrong, and then there’s truly knowing it—and the latter decays like experimental antimatter, reverting almost instantly back to the comforting, constructed reality that served our ancestors in their search for food and shelter. Yet fleeting as that feeling is, it’s worth reaching for, experiencing, and internalizing to the greatest degree we can manage.

Maybe they’re right. I could be wrong right now.

This, after all, is the heart of both modern scientific skepticism and the older philosophical traditions that gave us the word “skeptic.” Certainty will always be suspect. The problems with knowing will always remain.

References:

  1. Russell, Bertrand. Essays in Skepticism. “Intellectual Rubbish.” (New York: Philosophical Library, 1962.) p. 70
  2. Blackmore, Susan. “The Elusive Open Mind: Ten Years of Negative Research in Parapsychology.” Skeptical Inquirer, Vol. 11, Spring 1987. pp. 249–250
  3. Russell, Bertrand. Essays in Skepticism. “Atheism and Agnosticism.” (New York: Philosophical Library, 1962.) pp. 85–86
  4. Ibid. p. 65


21 Responses to “Russell’s Hedgehogs and Hirst’s Shark”

  1. A Lopez says:

    Sorry, but I find your post a little bit confusing, to say the least. Are you trying to imply that, for instance, regarding the origins of the universe, some Yanomami guy’s “opinions” deserve the same attention and respect as some NASA astrophysicist’s? Or that, in the case of a brain tumor, the expertise of an African witch doctor and a neurosurgeon are equivalent? Surely not.
    Are you trying to suggest that in the real world we can find degrees of (un)certainty?
    What I mean is: there are things that we know for certain, and the scientific method is the best method we’ve yet found to discover them.

    • badrescher says:

      I saw absolutely nothing in this post that could be interpreted as “all opinions are equal”.

      A fundamental property of science is that it cannot ‘prove’ anything (there is no absolute certainty); nothing can. Accepting this is extremely important in the process of learning because we can learn nothing if we are certain that we already know it. Science provides us with tentative knowledge and some measure, for each bit of that knowledge, of how confident we should be that it is accurate. THAT is what this post is about – not ‘equalizing’ theories, but being humble rather than arrogant and keeping one’s mind open.

      The skeptical mantra should be “doubt”, not “doubt what other people think”.

  2. Max says:

    Navel gazing is nice, but a low Brier score is nicer.
    http://en.wikipedia.org/wiki/Brier_score
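
    For readers unfamiliar with the term, the Brier score is the mean squared difference between forecast probabilities and actual binary outcomes; lower scores mean better-calibrated predictions. A minimal sketch in Python (the function name and example numbers are illustrative assumptions, not taken from the comment or the linked article):

        # Brier score: mean squared error between forecast probabilities
        # and binary outcomes (1 = event happened, 0 = it did not).
        def brier_score(forecasts, outcomes):
            if len(forecasts) != len(outcomes):
                raise ValueError("forecasts and outcomes must match in length")
            return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

        # Example: predictions of 90%, 20%, and 60% for events that turned
        # out 1 (happened), 0 (did not), and 1 (happened).
        print(brier_score([0.9, 0.2, 0.6], [1, 0, 1]))  # 0.07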

  3. David H. says:

    Well stated, indeed, Mr. Loxton. I raise a glass to you!

  4. John K. says:

    Great post.

    It is very difficult to come to terms with the humbling observation that there is so much we cannot understand because the universe contains parts both too microscopic and macroscopic for us to ever reasonably observe. For me, being a skeptic is mostly recognizing how very much there is that I just don’t know. Once I acknowledge my vast ignorance, then I can move forward and base as many beliefs as possible on actual repeatable evidence and leave everything else in the “unknown” file. This is not always very satisfying, but it is the only way to distinguish between that which is real and that which is only imagined.

  5. tmac57 says:

    “Certainty will always be suspect. The problems with knowing will always remain.”
    Hmmmm… Daniel, are you certain about that?

  6. Phea says:

    When one considers all the people who have existed, exist now, and will exist in the future, we can begin to understand just how small a percentage of human experience we each own. It is a humbling reality. Being a humble skeptic is a tough balancing act, but a worthwhile goal.

  7. Jason Loxton says:

    This seems to just come back to the Mark Twain quote: “Supposing is good, but finding out is better.” I would add that finding out is harder. After 12 years of university education, I am both constantly baffled by my own ignorance and still have to fight to force myself to recheck “facts” before I prep a lecture, etc. It is oh so easy to fall into comfortable, complacent confidence, even when you are aware of the complexity of knowledge, the changing landscape of the known in your discipline, and the fallibility of recall.

  8. BillG says:

    “We are, all of us, built for belief.” To state the obvious, without some degree of belief we would all perish. In my view the “virtue of doubt” can be extremely liberating, though not always the easiest path.

  9. gdave says:

    “I could be wrong” and “I don’t know” may be the two most important phrases in any skeptic’s toolkit.

  10. d brown says:

    “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”
    Mark Twain. If you look it up, many of the big discoveries were not from planned research. They’re things people found by accident while looking for something else.

  11. Kitty says:

    Great article, and very true. I remember the family doctor treating a family member for an ulcer. The treatment didn’t work, and the poor relative was blamed for “eating the wrong things” and not taking the medication properly. I remember his anger that he HAD followed all the advice and treatment. Later, of course, we found out that ulcer treatments really didn’t work well, and now the treatment of ulcers is different. At the time, though, doctors just trusted, “This doesn’t work very well with the people I treat, but this is what others say does work.”

    Though Hirst’s SHARK???? Oh my, as an artist, you lost me there. Indeed, to skeptic artists (there are not just skeptics in science), his shark is a hilarious rip-off. He has more sharks, and the “famous” one is vigorously trying to decompose. More skepticism in art, PLEASE. Even with a great name, that shark isn’t worth $12 million.

    http://www.amazon.com/Million-Stuffed-Shark-Economics-Contemporary/dp/0230620590/ref=sr_1_1?s=books&ie=UTF8&qid=1328807214&sr=1-1

  12. Ted Fontenot says:

    “We are, all of us, built for belief.”

    Yes, we are, but aren’t we also built to question? If we aren’t, then how did we develop this strain of skepticism at all? It’s not as well advertised, but there is a history of disbelief as well as one of belief. Fundamentally, we don’t want to simply believe; we, ultimately, want to know that what we believe is true. Belief and disbelief are like plaintiff and defendant. They are just way stations and pacifying place-holders on the journey to “near certainty.” There has to be, at some moment, a judgment along a spectrum of probability, and we should allow, as much as we can at any given moment, that spectrum to inform and serve us in this, our quest for truth, as well as to make do in the meantime. But, yes, we are all too often too definite without sufficient warrant.

  13. John Elliott says:

    A recent T-shirt I saw sums it up very nicely: “I MAY BE WRONG, BUT I DOUBT IT.”

  14. Markx says:

    I WAS actually wrong once.

    How so? What happened?

    Well, I thought I’d made a mistake, but in the end it turned out I was correct in the first instance.

  15. sittingbytheriver says:

    After all these years, I still love that shark. The decomposition makes it more conceptually meaningful. Hirst is a genius.

  16. Dave Dolson says:

    I think the question of “what is the lay-person to believe?” is a valuable discussion to have. We are all lay-persons in most aspects of science.

    Much (all?) public policy is decided by the voting public, sometimes in referenda or indirectly through elected officials.

    For example, should fluoride be added to drinking water? Should garbage be burned or buried? Should cosmetic use of pesticides be permitted? Does fracking harm drinking water? Should children be immunized? Should industry be subsidized to create jobs?

    Even in complete ignorance of the science, there are some useful principles:
    (a) Note fallacious reasoning (“x is correlated with y; therefore x caused y”).
    (b) Follow the money: who benefits? Who is paid by whom?
    (c) Beware of anecdotal evidence (do not accept “I’ve been smoking since I was 12 and I don’t have cancer, so smoking is safe”).
    (d) Look to parallels in history (before immunization, many children died of measles, polio, …).
    (e) Learn to identify character assassinations.
    (f) Ignore statements invoking supernatural beings.

    These are off the top of my head. Is there a field of philosophy that makes these rigorous?

    So I believe that when you can’t make the observations yourself because you don’t have time for a 20-year study on 1000 individuals, maybe you can still make a good judgement of the truth, or invoke the precautionary principle.