Episode post here. Thanks to David North for his hard work on this back episode transcription!


Matt Teichman:
Hello and welcome to Elucidations, a philosophy podcast ordinarily recorded at the University of Chicago, but which today is being brought to you from San Francisco. Joining me is Kent Bach, emeritus professor at San Francisco State University, to talk about jumping to conclusions and knowing when to think twice. Kent Bach, welcome!

Kent Bach:
Hi Matt. That’s one thing we’ll talk about. What else, I don’t know.

[LAUGHTER]

Matt Teichman:
Excellent. And possibly other things too.

Ignorance as Evidence

Matt Teichman:
So, this idea that ignoring something can count as evidence is a bit counterintuitive because, usually, if you think about evidence, it’s something you’re looking at and considering, weighing in your mind whether it really is evidence for something happening. So maybe we can motivate this with an example. What would be an example where not thinking about something justifies some other claim or where your not thinking about something serves as evidence?

Kent Bach:
What I have in mind with the notion of ignorance as evidence is that when you come to believe something, you base your belief on something or other, and you could conceivably consider all sorts of possible sources of evidence against whatever it is that you believe. But most of the time you don’t, so there are two questions: what is the significance of the fact that you don’t consider certain possibilities, and what is required for you to consider them when they need to be considered? Take a simple example. You’re given a question: “Which of the following four former presidents is on Mount Rushmore?” It’s a multiple choice question and you’re given the choice between Richard Nixon, Abraham Lincoln, George W. Bush, and Dwight D. Eisenhower. Chances are you’ll immediately think of Lincoln as being the correct answer.

Matt Teichman:
That was my general idea, yes.

Kent Bach:
Oh was it? Good for you. You must know a lot about American history and South Dakota. The alternatives really don’t get taken seriously. As soon as you hear the name ‘Lincoln’, that’s it. And if you think about multiple choice exams in general, when you go through them, what typically happens? If it’s something you have some knowledge about, and you’re lucky, exactly one of the four choices will strike you as the correct one and you’ll immediately mark that as your answer and go on. Nothing holds you back. But sometimes, you are held back, because maybe one of the other choices seems just as appealing as the one that first appealed to you. Or perhaps you think of an answer that’s not even included among the four (and maybe you have that as an option under the heading ‘none of the above’). What’s going on there? You’re partly relying on your ability to invoke knowledge that you have when you need to. In most cases of relatively easy multiple choice questions, you don’t even bother: you take one answer to be the correct one, and that’s it. When there’s some kind of doubt, then you start bringing in alternative possibilities. But there could be a lot of possibilities other than the ones that you’re considering, and that list could conceivably go on and on. Somehow, you have the ability to think of relevant possibilities, or plausible possibilities, when they’re there, and you don’t bother yourself thinking about implausible possibilities (or irrelevant ones) when they’re not.

So, one question arises: how does this work? That’s not a philosophical question, it’s really a psychological one that has something to do with how you access information that you possess when you need to access it. And the simple fact of the matter is that we don’t know much about how that works. That’s a very hard thing to study. And maybe some cognitive psychologists have a handle on it, but I’m not aware of their work and I suspect the research has only just begun.

Anyway, the philosophical point is that the non-occurrence of the thought of some possibility functions for us as evidence that that possibility is too remote to be taken seriously. That’s not to say we can’t make mistakes in this area. But typically, we treat the non-occurrence of the thought of a possibility as evidence that the possibility is too remote to be bothered with. And we do all that without having to think that it’s not worth considering. Because if we had to think that, then we’d be considering it. And since there are indefinitely many possibilities that we could at any given moment think of, if all that was required was that they be possibilities, then we’d just go nuts thinking of them all. So in short, we manage to rule out most possibilities without having to explicitly consider them.

Matt Teichman:
I like this a lot. I think you just explained why I’m so terrible at multiple choice tests.

Kent Bach:
Oh?

Matt Teichman:
Because I do well on tests where the question is “what is the answer to this?” or “what is the solution to this equation?” Stuff like that. Why? Because I look at the question, I try to figure out the answer, and usually I get it right. On multiple choice tests, I look at the different answers and there’s always a trick, and I’m always distracted by the trick. And then I get lulled into thinking about that and considering it too much, and then I get the wrong answer. So in a way, doesn’t that corroborate what you were just saying? A multiple choice test can frustrate this natural process that we ordinarily have—even when we’re not taking a test—of figuring out in our heads what the relevant choices are when we’re considering the answer to a question. Maybe part of what’s happening in a multiple choice test, when you get thrown for a loop, is that irrelevant alternatives are being made salient to you and you’re considering them when you shouldn’t be.

Kent Bach:
Well, if the question is a trick question, then part of what the test writer is doing is trying to throw you off: to keep you from just relying on instinct, so to speak, and accepting the first plausible-sounding answer as the correct one. It may be that the question isn’t intended to be taken the way it first strikes you, so that it’s really not the question you thought it was, and therefore doesn’t have the answer you might’ve thought (since it answers a different question). What I think you’re talking about, and why you’re not that confident in your ability to deal with multiple choice questions, is that you second-guess yourself too soon, which is to say you don’t rely on your ability to think of possibilities only when you need to. You tend to think of them even when you don’t need to, which of course is a hallmark of a good philosopher. But whoever said that philosophers are practical?

Matt Teichman:
An important upshot of what we’ve been talking about is that in order to make decisions about what to do, you have to tune a lot of things out. You have to tune a lot of possibilities out. If I want to form a belief—if I hear the sound of my friend walking around in the next room and I draw the conclusion that my friend is in the next room—there are a lot of far-fetched scenarios that I have to rule out in order to go ahead and continue believing that. If I start wondering how I know that it’s not the FBI or something, and I don’t tune these paranoid, fantastical scenarios out, then there’s going to be too much noise in my head to draw the correct conclusion, which is that my friend is indeed in the next room.

Kent Bach:
That’s a good example. I mean, does your friend have distinctive footsteps? Do these footsteps sound different from the way that friend normally walks? That might give you reason to think: hey, maybe a burglar is snooping around. Or maybe your friend has changed his walking style. That’s another possibility. But most of the time we don’t take alternative possibilities very seriously.

Now, I’m not saying that that’s always good. I mean, we can exclude things that we shouldn’t just as easily as we can exclude things that we should. We may be better or worse in different areas when it comes to knowing what to take into account and what not to take into account. After all, there are plenty of exceptional situations. But part of what we rely on is an ability to detect that a situation is exceptional and to recognize that maybe special measures are needed to deal with it.

Matt Teichman:
Ok, so now here’s what I want to know: what if we’re not dealing with these cases of tuning irrelevant scenarios out, but rather with cases where we think hard about whether something really is good evidence? Say there’s a footprint in the sand and I want to decide whether it’s evidence that a coyote was really here. It seems that in philosophy and forensic science we have a method we can follow for thinking about “is this really evidence for such and such,” and “is it really good evidence or does it just seem like it’s evidence,” and so on and so forth. We do that by attending to the evidence, thinking about it carefully, and trying to fit it in with other evidence. But it seems like we can’t do that with stuff we ignore, because once we start thinking about it, we’re not ignoring it anymore. So how can I, when faced with the two options of correctly ignoring and incorrectly ignoring something, examine them both carefully and decide which one to ignore? The moment I’m examining them, I’m not ignoring them anymore. Does this pose a problem?

Kent Bach:
Well, take your example of a footprint: if you’re like me, you don’t know one footprint from another. So you’re not going to rely on any snap judgement you’re inclined to make, because you have no reason to think it’s correct. If, on the other hand, you’ve spent a lot of time out in the woods, and you’ve learned to track animals and to recognize footprints of animals of different kinds, and you see what in fact is a footprint of a coyote, then your thought that it’s a coyote is likely to be fairly reliable. Maybe there are a few other animals that have very similar footprints. But maybe you’re experienced enough to know either that those other animals aren’t in this area, or that further possibilities have to be considered. Maybe you look closely and you can detect subtle differences between generally similar footprints, as between coyotes and two or three other kinds of animals. So if you have the ability to detect those differences, maybe your initial thought that this is a coyote doesn’t have to be second-guessed.

This is because you’re justifiably confident enough in your own ability to tell the difference. If not, then you should be careful enough to think of the other possibilities, if you know what the subtle differences are, and look for them. Failing that, maybe you get out a book of animal prints and compare what you see to the examples in the book, or do something else to verify that the footprint really is what you think it is: a coyote’s. So a lot of it depends on your knowledge about your own knowledge. That is, how knowledgeable you are about the thing you’re making a judgement about. What I’m talking about are cases where somebody is basically very good at making snap judgements in a particular area—but once you go beyond your areas of expertise, then you’re in trouble.

Matt Teichman:
So let’s say I’ve discovered that I’m bad at snap judgements and I want to get better. Is the answer to try and develop better habits? It seems like what you want to change, if you’re bad at making snap judgements, isn’t so much ‘ah, this particular judgement has got to be better’, but rather something like preventative medicine: in advance, cultivate better habits of tuning stuff in and tuning stuff out.

Kent Bach:
Well, let me ask you a question first. When you say bad at snap judgements, do you mean you tend to make a lot of incorrect snap judgements or do you mean you fail to make snap judgements when you have perfectly legitimate reasons to make them?

Matt Teichman:
Let’s say I make incorrect snap judgements. If I have three minutes to think about something, I make the right decision. But if I only have fifteen seconds, I’m constantly tripping over myself.

Kent Bach:
Ok. Well, first of all, let’s get one thing clear. Most of the time you’re making all sorts of snap judgements without even thinking that you’re making them. For example, you’re sitting in a chair at the moment and even though we’re in earthquake country, you’re making the snap judgement that the floor isn’t going to collapse in the next minute or two. That’s a reasonable thing to assume, but you don’t even consider the alternative. At least you didn’t until I mentioned it.

So, you’re making lots of snap judgements that things are not out of the ordinary in your immediate environment. When you’re crossing a busy street, you don’t make the kind of snap judgement you make when you’re walking across the same street at five o’clock on a Sunday morning and there’s no traffic. But you make adjustments as to when it’s appropriate to take it for granted that things are as they normally are. When they aren’t, then you have to be more careful, obviously.

I’m not pretending to offer practical advice about what people should or shouldn’t do. I’m perfectly willing to concede that some people make snap judgements too readily when they should be more careful, and other people are too obsessive-compulsive to make snap judgements when they should be making them. They worry about a lot of trivial things when there’s really no need to. But the general idea I have is that not considering certain possibilities is part of what it takes for most of our ordinary beliefs to be justified, provided we’re good at knowing when not to take such possibilities into account. So it’s really more of a point in the theory of knowledge than in the realm of practical psychological advice. But I can certainly say that some people could benefit from being more thorough and other people could benefit from being more impulsive.

Thinking and Believing

Matt Teichman:
You’ve drawn an interesting distinction in your work between what you call thinking and believing. So: thinking that my friend is in the next room versus believing that he’s in the next room. And you use that distinction in very interesting ways to try to address philosophical puzzles we have about various strange things that arise when our sense of what to ignore, or what to attend to, is poorly calibrated. So, what’s this distinction between thinking and believing?

Kent Bach:
Well I think—not only do I think, I believe—that everybody has innumerable beliefs, most of which are not playing any active role in one’s thinking at a given moment. There are literally millions of things you believe, and it would be very awkward if you could somehow entertain a very large number of those beliefs at any one time. Your mind would be very cluttered if that were the case. Nevertheless, you have all these beliefs and you can bring them to bear—or somehow they come to bear—when they’re relevant to whatever you’re currently thinking about or otherwise engaged in.

I take thinking to be an active psychological occurrence so that, for example, I can think that Bismarck is the capital of North Dakota, right now. I think it’s something I believe; I’m not even sure. Maybe it’s not Bismarck. But anyway, the thought that Bismarck is the capital of North Dakota occurred to me. That’s an example of thinking.

I brought in this distinction originally in connection with the problem of self-deception, which has bothered philosophers since ancient times. The problem is: how can you deceive yourself, if deceiving yourself is anything like deceiving someone else? Because if the deceiver and the deceived are the same person, then it looks like the deception isn’t going to work. So how can there be such a phenomenon as self-deception? I brought in the distinction between thinking and believing to suggest a possible solution to that problem: namely, that self-deception, unlike deceiving other people, doesn’t require getting yourself to believe something you don’t believe. Deceiving somebody else requires getting them to believe something you don’t believe. And that’s easy enough to do; you just lie. Of course, they might recognize that you’re lying, in which case they won’t buy it. But in your own case, it can’t be quite like that. Because if you intend to deceive yourself, you’ll also be aware of your intention to deceive yourself, in which case it won’t work.

So I took what you might call a reductive approach to self-deception when I wrote on this topic, because it seems to me that it really isn’t necessary to get yourself to believe something contrary to what you believe. Let’s say that your son is a mass murderer, which is not the easiest thing for parents to believe about their sons, even sons who are mass murderers. But you might do other things that have the practical effect of not believing that your son is a mass murderer. You’ll clutter your mind with all sorts of thoughts about how wonderful your son is, what a nice boy he always was. Or you’ll think about something else. So, what you’ll do is avoid the effect that believing he is a mass murderer would normally have, by interfering with that process. In the paper, I go into various ways in which that can occur. What’s relevant here is that what goes on in your thinking needn’t match what you actually believe. So you could really believe that he’s a mass murderer but not consciously accept it, and that would be one possible outcome.

Matt Teichman:
So, the difference between thinking my son is a mass murderer and believing that my son is a mass murderer is as follows: if I think that my son is a mass murderer, that’s an event that takes place in my mind. I actively think that thought; I assent to it. Something like that. Whereas believing is more like, it’s not an event in my mind, it’s not a happening, a thing that I suddenly do. It’s rather just a state in me whereby, if I were to consider whether my son was a mass murderer, I would go ‘yeah, he is a mass murderer’. It’s the tendency that, if I were to think about it, I would think it was true.

Kent Bach:
Yes, normally when you believe something, when the question arises, you think what you believe. That’s the handy thing about beliefs: you already know what to think when the question arises. You don’t have to reconsider the question if you have a belief about it. But if it’s something that’s really hard to accept, you will interfere with the normal effects of the belief and clutter your mind with alternative thoughts, or with thoughts on other subjects, or just ‘I don’t want to deal with that’, and go on to something else. But you will not directly face up to whatever it is you believe, if your self-deception is successful (and it may be successful to a greater or lesser degree). Gradually, over time, it might become less and less successful, until finally you accept whatever it is.

Matt Teichman:
Right. So, ordinarily, when I believe something that’s a matter of indifference to me, like I really don’t care one way or the other—I don’t have any vested interest in whether or not it’s true—then if I believe it, I think it. That’s the normal case. What often is a spur to self-deception is when there’s something that I believe, but I really wish it wasn’t true; the fact that it’s true really upsets me. So as this sort of coping mechanism, I just start thinking about random other things, or I try to think the opposite of this thing that I wish wasn’t true. Which—

Kent Bach:
You may think of specific instances that support the opposite thing. So even though you have this belief, you might think of some minor—but, because you think of them, significant—reasons not to accept what in fact you already believe, or would believe if you let the evidence have its normal effect.

Matt Teichman:
And this deals with the paradox of self-deception because all the while, I haven’t necessarily given up my belief that my son is a mass murderer. All I’m doing is looking for strategies to avoid thinking about it.

Kent Bach:
Yeah. Basically, what you’re trying to do is avoid a certain thought. Which means: either keep that thought from occurring at all or, if it does occur, make it go away.

Matt Teichman:
Nice.

Snap Judgements in Communication

Matt Teichman:
So you’ve argued that having a better understanding of how we make snap judgements is important for understanding how we communicate. How does our ability to converse with each other rely on our ability to make snap judgements?

Kent Bach:
Well with communication, there are two sides to it: there’s the speaker’s side and there’s the listener’s side. In normal conversation they take turns. As I was saying earlier, we generally don’t make fully explicit what we mean. We’d have to spend a lot of time going through concrete examples to really see how this works. There’s an old cliche that it all depends on context, but what is it that depends on context? And what indeed is context, in the sense relevant to communicating?

First of all, we take for granted that we are trying to communicate. We take for granted that we’re speaking the same language. But we also have to rely on some kind of understanding of each other’s purposes in engaging in the conversation. That brings in all sorts of interpersonal knowledge: knowledge I have about you, and you have about me, and that we have about people in general or people in particular situations, whether it’s at McDonald’s, at a bank, at a baseball game, or wherever it may be. So there are all sorts of things that will constrain the range of possible things that speakers could mean, given that they say what they say.

As speakers, we’re doing something that is in a way more complicated than what the listener is doing, because we’re trying to communicate something, and we have to come up with something to say that will make evident to the listener what we’re trying to communicate. And we do that while we’re trying to think of how to do it. I’m about to start a sentence and, at the moment, I don’t know how that sentence is going to end. Well, it just ended. But I didn’t know it was going to end there, because I was intending to make it go on longer and didn’t. Try to be self-conscious of what you’re doing when you’re trying to speak, and you’ll get utterly confused, because you’ll find yourself in a very awkward interaction between what you’re thinking and what’s coming out of your mouth. That’s precisely what normally does not occur when we speak. Somehow, we’re very fluent at it when speaking our native language. We don’t have to pay extra close attention to what we’re going to say next. We just do it. We do it in such a way that we can make our intentions evident to the person we’re talking to. And that person is in turn figuring all of that out.

Now, if you ask me ‘how do we do that?’, I don’t know. That’s a very complex psychological question, and most of the research on language tends to focus on very specific kinds of cases. It’s impossible, really, to study full-fledged language use as it occurs in everyday life. So this is another area where our understanding of what really goes on is very limited, even though it’s been studied extensively.

Matt Teichman:
So, we’ve discussed a couple of examples of this general theme: everyday reasoning that involves snap judgements, and how that reasoning crucially involves the ability to figure out what’s salient and what isn’t, and to ignore the things that aren’t. Why do you think it’s important for philosophers to study this topic?

Kent Bach:
Well, a lot of it goes beyond philosophy, but that doesn’t mean it’s not important for philosophers to consider it. I mean, if there’s been any theme in philosophy over the past few hundred years, it’s that there’s much less in the mind that meets the mind’s eye than philosophers have traditionally supposed (certainly if we go back to Descartes, for example). So, let me bring in a distinction that has become very popular in the psychological literature, including in a recent bestseller by Daniel Kahneman: Thinking, Fast and Slow.

In that book, which I recommend to everybody, he invokes a distinction between System 1 and System 2. But he’s very careful to point out that these systems aren’t really distinct entities in the head, but are just abstractions from what is in fact a very complicated thing, namely, our psychology. System 1 is the fast, intuitive, almost impulsive system and System 2 is the slow, deliberate system.

A lot of times in the book, even though he denies that these are two distinct systems that operate somehow on their own, he talks as if they are. So on the one hand they’re not really separate autonomous systems, but on the other hand, he treats them to a large extent as if they were. From my point of view, we should think of them as in constant interaction so that, to the extent that we are deliberate, we are also relying on our intuitive abilities to bring information to bear when we need it.

Kahneman’s book tends to emphasize the weaknesses of System 1: our vulnerability to various kinds of statistical errors and cognitive illusions, and so on and so forth. The book is very good at documenting a wide range of these. In my opinion, it would have been interesting if he had included at least a chapter or two on all the good things System 1 does to straighten us out when our intellectualizing takes us too far afield, or when we fail to take things into account that really are relevant. Oftentimes, things that really are relevant come to mind when we least expect it. So, these processes go on which we as philosophers or psychologists don’t really understand very well, but which as ordinary people we rely on, for better or for worse. Kahneman’s book tends to emphasize the worse. I’d like to emphasize the better, and include both the better and the worse in what we acknowledge as the workings of our subconscious processes of thinking.

Matt Teichman:
Kent Bach, I both think and believe that this was a very stimulating discussion, so thank you.

Kent Bach:
I won’t dispute that.


Elucidations isn't set up for blog comments currently, but if you have any thoughts or questions, please feel free to reach out on Twitter!