Sunday, December 04, 2011

Hear, hear

Brian Leiter likes to pour scorn on the NY Times's philosophy blog The Stone (he really doesn't like Simon Critchley), but this post by Alva Noë is absolutely spot on. I can hardly decide which choice bits to put here in order to incite you to go over there and, as they say, read the whole thing. Let's try this one, which sounds very much like what I always say when this issue comes up (so naturally I like it):
The idea that a person is a functioning assembly of brain cells and associated molecules is not something neuroscience has discovered. It is, rather, something it takes for granted. You are your brain. Francis Crick once called this “the astonishing hypothesis,” because, as he claimed, it is so remote from the way most people alive today think about themselves. But what is really astonishing about this supposedly astonishing hypothesis is how astonishing it is not! The idea that there is a thing inside us that thinks and feels — and that we are that thing — is an old one. Descartes thought that the thinking thing inside had to be immaterial; he couldn’t conceive how flesh could perform the job. Scientists today suppose that it is the brain that is the thing inside us that thinks and feels. But the basic idea is the same. And this is not an idle point. However surprising it may seem, the fact is we don’t actually have a better understanding of how the brain might produce consciousness than Descartes did of how the immaterial soul would accomplish this feat; after all, at the present time we lack even the rudimentary outlines of a neural theory of consciousness.

What we do know is that a healthy brain is necessary for normal mental life, and indeed, for any life at all. But of course much else is necessary for mental life. We need roughly normal bodies and a roughly normal environment. We also need the presence and availability of other people if we are to have anything like the sorts of lives that we know and value. So we really ought to say that it is the normally embodied, environmentally- and socially-situated human animal that thinks, feels, decides and is conscious. But once we say this, it would be simpler, and more accurate, to allow that it is people, not their brains, who think and feel and decide. It is people, not their brains, that make and enjoy art. You are not your brain, you are a living human being.

We need finally to break with the dogma that you are something inside of you — whether we think of this as the brain or an immaterial soul — and we need finally to take seriously the possibility that the conscious mind is achieved by persons and other animals thanks to their dynamic exchange with the world around them (a dynamic exchange that no doubt depends on the brain, among other things). Importantly, to break with the Cartesian dogmas of contemporary neuroscience would not be to cave in and give up on a commitment to understanding ourselves as natural. It would be rather to rethink what a biologically adequate conception of our nature would be.
Well, that would be "what I always say" if I were better at saying it than I actually am. You go, Professor Noë! Woo hoo! There's more, too, so ... you know what to do.

Wednesday, November 09, 2011

Wait, what about pumpernickel?

I have just discovered that on February 16, 1962, one C. M. Mullen of G. & C. Merriam Company wrote a letter to my maternal grandfather, informing him that:
We are glad to reply to your letter of February 12 in regard to the word bagel. This word has been entered in the Addenda Section of Webster's New International Dictionary, Second Edition, for the last few years. It is now entered in its regular alphabetical place in our recently published Webster's Third New International Dictionary, the definition reading as follows:
a hard roll shaped like a doughnut that is made of raised dough and cooked by simmering in water and then baked to give it a glazed browned exterior over a firm white interior
Okay then. Carry on.

Monday, September 19, 2011

Click click

I clicked over today to an interview with Brian Leiter in a post on The Browser, and in the "Related Articles" sidebar there I noticed a link to my own recent 3QD post on Nietzsche's perspectivism (here), where it is described as "thought-provoking." How about that. You may also reach that same post by going over to the latest Philosophers' Carnival at Minds and Brains. Whichever way you get there is fine with me. (That is, there's no one single, correct way ... never mind.)

In other news, my post on Kant squeaked into the final round of 3QD's 2011 philosophy competition as a wild card (like the Red Sox will if they don't choke). So Patricia Churchland will become acquainted with my views on Kant. I repeat: how about that.

Friday, September 09, 2011

3QD philosophy prize

The voting round of the 3 Quarks Daily philosophy prize for 2011 is now open. I've got a post there, but don't just go there and vote for it! Take a look at the others too. (Then vote for it.) Deadline: Sunday night.

Wednesday, August 24, 2011

Bored with the Beguine?

I don't usually post things like this, but I want to get back into the habit of posting, and you must admit I have been pretty restrained with the "I watered my plants today" sort of blogging. However, I feel obliged to report that for the past few days the Roxy Music song "Do the Strand" has taken up residence in my skull and will not leave for greener pastures no matter what. Even now I hear therein the voice of Bryan Ferry, the thinking man's Freddie Mercury (did someone already say that, or did I just make it up?). I have no idea why this is, as I have not heard that song in, well, years probably. Such is life.

Thursday, June 30, 2011

Epistemic luck (intro)

When I was about 10 years old, I had a philosophical discussion – or disagreement anyway, as we did not get very much further than simply stating our positions (and then again, more firmly) – concerning the definition of knowledge. My interlocutor claimed that he could know that 2 + 2 = 5, even though that statement is false. I replied that no, you can't know something unless it's true. You may believe that 2 + 2 = 5, but that's not enough: you can't know it (even though, in stating your belief, that may indeed be what you'll say), because knowledge must be true.

Contemporary philosophical consensus on the subject comes down on the side of my youthful self: knowledge must be belief which is true (so there, D___ O_____!). However, it is also generally accepted that a third condition is required. And by "generally accepted" I mean "virtually universally simply assumed, on page 3 of just about every Epistemology 101 textbook, where no dissent is ever even imagined, let alone conceded actually to exist, let alone taken seriously." And yet not only are there several of us dissenters (for so I am), but we dissent in various ways for various reasons. Analytic epistemology can be as dreary a subject as there is in philosophy, but as it turns out this is really, really important to get clear on, for reasons I hope to bring out as we continue.

The proximate stimulus for this current effort was a post (now two actually) on the New York Times's online philosophy blog The Stone. Distinguished Notre Dame philosopher Gary Gutting there deigns to enlighten the great unwashed about the importance of justifying one's beliefs, as opposed, apparently, to believing any old tosh that enters one's head:
Apart from its entertainment value, Harold Camping’s ill-advised prediction of the rapture last month attracted me as a philosopher for its epistemological interest.  Epistemology is the study of knowledge, its nature, scope and limits.  Camping claimed to know, with certainty and precision, that on May 21, 2011, a series of huge earthquakes would devastate the Earth and be followed by the taking up (rapture) of the saved into heaven.  No sensible person could have thought that he knew this. Knowledge requires justification; that is, some rationally persuasive account of why we know what we claim to know.  Camping’s confused efforts at Biblical interpretation provided no justification for his prediction.  Even if, by some astonishing fluke, he had turned out to be right, he still would not have known the rapture was coming. [my bold]
True belief, in other words, is not enough for knowledge: we need justified true belief (JTB).
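Spelled out, and roughly as it appears on that page 3 of the Epistemology 101 textbook, the tripartite analysis goes:
S knows that p if and only if:
(i) p is true;
(ii) S believes that p; and
(iii) S is justified in believing that p.
Condition (iii) is the one doing the anti-luck work, and the one whose alleged necessity will concern us in what follows.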

I can already feel my eyelids drooping, so let me say this just the once and I'll be done with it. Since my dissertation (which went into it at some length, to little avail) I have resisted writing about this stuff, because to do so requires what seems to be the sort of academic nitpicking and intuition-mongering that makes so much contemporary philosophy look pointless and arcane. But if we get this right the payoff could be immense (for some of us at least, and part of my point is that you may not yet know who you are), so I hope I can count on the reader's patience as we wade together through the bog.

A relative newcomer to the current epistemological scene, Duncan Pritchard already has a zillion papers (available on his website) and an important book on the subject. As he explains on p. 4 of Epistemic Luck, the idea that true belief does not suffice for knowledge is very widely held indeed, to the point of invisibility:
[A]s befitting its status as a universal intuition—what these days we philosophers tendentiously call a 'platitude'—one finds this thesis both everywhere and nowhere at the same time. That is, whilst this line of thinking is clearly being presupposed in much of contemporary epistemological thought, the thesis itself is rarely drawn up to the surface of discussion, and even then it is left to stand as it is: a pure platitudinous intuition that is in need of no further explication.
As an example Pritchard points us to the SEP article on knowledge by Matthias Steup, who faithfully reflects the consensus of the discipline with only a gesture in the direction of argument:
Why is condition (iii) [i.e., the justification condition on knowledge] necessary? Why not say that knowledge is true belief? The standard answer is that to identify knowledge with true belief would be implausible because a belief that is true just because of luck does not qualify as knowledge. Beliefs that are lacking justification are false more often than not. However, on occasion, such beliefs happen to be true. Suppose William takes a medication that has the following side effect: it causes him to be overcome with irrational fears. One of his fears is that he has cancer. This fear is so powerful that he starts believing it. Suppose further that, by sheer coincidence, he does have cancer. So his belief is true. Clearly, though, his belief does not amount to knowledge. But why not? Most epistemologists would agree that William does not know because his belief's truth is due to luck (bad luck, in this case). Let us refer to a belief's turning out to be true because of mere luck as epistemic luck. It is uncontroversial that knowledge is incompatible with epistemic luck. [again, my bold]
Steup continues, naturally enough given his purpose in mentioning it at all, by discussing how effective the justification condition is at its appointed task, that of ruling out lucky guesses ("mere" true beliefs) as cases of knowledge, which leads into the article's next section, on the Gettier problem.

Now Steup does seem to give an argument here, with his cancer example. But when we look at it we see that there is no argument at all. Even though for some unknown reason only "most epistemologists" would agree (who dissents and why need not concern us here, as we are busy men), our platitude is "clearly" true, and, again, "uncontroversial" for no actually stated reason. I should add that given the actual state of the discussion at this point, Steup is entirely justified [!] in telling the story this way for this audience. No reason to stretch out an already long SEP article by tracking down every last objection and quashing it.

Okay, let me just close for today with this last quote. Even after a couple of objections to his JTB account in the comments, which we will discuss later on, Gutting came back in his most recent piece with this:
Plato long ago pointed out [aside: I love that rhetorical tic! If he pointed it out, it must be true – so I need not argue for it, right?] that it is not enough just to believe what is true. Suppose [oh good, another example] I believe that there are an odd number of galaxies in the universe and in fact there are. Still, unless I have adequate support for this belief, I cannot be said to know it. It's just an unsupported opinion. Knowing the truth requires not just true belief but also justification for the belief.
Again, no argument at all, just an example. It's just obvious!

All right, so I've established that the consensus view is the consensus view. Next time I'll start distinguishing the various ways one might object to this "universal consensus."

Wednesday, February 09, 2011

The past is never dead; it's not even past

I recently had occasion to dine at an establishment with which most people are at least familiar, as it is a fairly large chain. I speak of Applebee's. The food was fine; I am not a connoisseur of steak, but the Asiago Peppercorn [sirloin] steak was very tasty, and small enough to be a) eaten whole at one sitting, and b) probably accurately placed on the 550 Calories menu (served with veggies).

However, it is not the food that made the experience so memorable. It was earlier in the evening (around 6 PM) than most suburbanites dine, so the place was relatively empty, but even so (or perhaps because of this) the music was fairly loud. I went to high school in the 1970s, and as far as this restaurant was concerned, it seems that we have never left. Here, to the best of my memory, is the soundtrack to the meal for your imaginative perusal.

We arrived during Jimmy Page's extended guitar break in Led Zeppelin's "Heartbreaker", which continued, as is the custom, into "Living Loving Maid (She's Just a Woman)", the next track on Led Zeppelin II. I am leaving a track or two out, including a blues song which might have been Muddy Waters and an album rock track (clearly from that same era) which I did not recognize, but the time matches up about right so this must be about it. Continuing after Zeppelin:

J. Geils Band - Love Stinks
Fleetwood Mac - Rhiannon
Heart - Barracuda
Queen - Somebody to Love
Bob Seger - Hollywood Nights
ZZ Top - Gimme All Your Lovin' [hey, this one's from 1983!]
Tom Petty - The Waiting [1981]
The Cars - Bye Bye Love
The Beatles - Something
The Eagles - Hotel California [we left during Joe Walsh's guitar solo, which I even paused at the door to listen to]

While I never listen to this stuff voluntarily (although I do own Eliminator, from which I would have played "Got Me Under Pressure" or "I Need You Tonight"), I enjoyed this set perfectly well (except for Bob Seger, whom I can do without). In a weird time-capsule sort of way. And "Bye Bye Love" has been running through my head constantly since then. Still, I'm not going back there, Asiago Peppercorn steak or no.

Tuesday, February 08, 2011

Let's not read too much into this

Abbas just sent me this picture:

[image: a photograph of a swimming duck onto which a rabbit's head has been Photoshopped]
The title of his email was "Duckrabbit!", but when I looked at it, all I saw was a duck. I mean, it's a duck – it's got wings, it's swimming in water, the whole duck bit. But then I looked more closely at the head, trying to see it as the rabbit-looking-the-other-way which is the other aspect of the famously ambiguous drawing. I succeeded; but it turned out to be a particularly unpleasant instance of aspect-dawning, as that perception was accompanied by a sickening wave of, let's say, Unheimlichkeit, as perhaps expressed in the utterance "Yikes, that is one seriously deformed rabbit" (what with the wings and all).

Then I looked at it a third time, and I noticed that indeed, it was a picture of a duck onto which a rabbit head had been Photoshopped. (No doubt those who are more familiar with these creatures would have noticed that right away, but I rarely see either in the, um, flesh.) Now, having noticed the rabbit head as a rabbit head when looking to see the image as a rabbit, I see the duck too only as a seriously effed up specimen of its kind (and can hardly believe I failed to see it that way earlier) – I mean, look at that soft furry beak!

So, a clever trick; but (to be way too literal about it) I think we lose something in the translation to photography. The duck part is clearly a duck and only a duck; and the rabbit part (once you see it!) is clearly a rabbit and not a duck. So in a way it's really not an ambiguous figure at all, just an impossible one – where the point of the duck-rabbit figure is that it really is a picture-duck, just as much as any other, and a picture-rabbit as well ... but not at the same time. And off to the philosophical races we go.

Tuesday, February 01, 2011

Baby steps

Okay, I said I was going to start 'er up again, but not surprisingly that is easier written than done. So let's start out small. I direct your attention to the Amazon widget, which I have restocked with more timely items than were in it previously. I'm up to page 587 or so of Anna Karenina, which means that after reading it for six weeks I am about a third of the way through. Right now (in Part Three) Anna's sister-in-law Darya Alexandrovna Oblonsky (aka Dolly) is talking with Konstantin Dmitrievich Levin, a rejected suitor for the hand of Dolly's sister Kitty, at Dolly's summer house. I expect we'll get back to Anna within the next hundred pages or so. (I should mention that the direct link to Amazon provided by the widget is to an edition of the book different from that indicated by the icon, so beware.)

I've already finished the other fictional work, The Half-Made World, which was pretty good but falls short of an unconditional recommendation. It was recommended by a couple of people on a thread at Crooked Timber, where I have found they know their stuff, esp. when said stuff is science fiction-y; this book is an example of the subgenre known as "steampunk," if you know what that is. Without going into detail, I really like the set-up here, and the characters are great. I wasn't sure about the ending though, as it remains unclear whether the author is setting up a sequel or just allowing us to imagine for ourselves what happened next. Good stuff.

I've just started the Garry Hagberg book, which looks really good, if a bit intimidating in its thoroughness. The Cartesian subject/object dualism, which still pervades philosophy even after all this time, has two main aspects (duh). I say (duh), but in practice it seems very difficult to see them as related at all, let alone different aspects of the same thing. The Cartesian conception of objectivity is most directly manifested in doctrines of "metaphysical realism" in Putnam's sense (when arguing against it, that is, not when, Kerry-like, he was for it before he was against it). In such contexts we fight against it by trying to show how the Cartesian picture, in its seductive urging that we simply identify it with prephilosophical common sense ("of course there's a real world out there!"), forces incoherence upon us. I guess that's what we do in the other context too (where the bait is instead "of course my mental states are 'inner'!"), but the details end up making the two cases very different in practice. Anyway, in these former contexts we tend to spend all our time arguing about the possible senses in which the world is "real" or "independent" or "objective," and the Cartesian subject figures simply as the supposedly detached observer of the objective world however construed.

However, we can't arrive at a stable position no matter what we say about objectivity, unless we also deal with the Cartesian subject itself (qua Cartesian). Here our target is the Cartesian "inner," as manifested for example in Nagelian qualephilia or more overtly dualistic doctrines in the philosophy of mind (Chalmers, G. Strawson, etc.). But of course in this case the main charge against "dualism" has been led by materialists and other naturalists concerned to make the world safe for empirical science (here, brain science and other sorts of empirical psychology, including but not limited to evolutionary psychology). So what we end up with is a lot of straightforward reductionism/eliminativism and its more discreet heirs, all concerned to emphasize (properly enough as far as it goes) the publicity and non-spookiness of "subjective" phenomena like mental states. However, in its often overt scientism this line tends to leave in place, or at the very least not replace, the very conception of objectivity which is the objective correlate of its subjective target. This leaves it open to counterattack, though the counterattack is rarely framed in those very terms.

In any case these two anti-Cartesian projects have not really been brought into line with each other in a satisfactory way. Rorty and to a lesser extent Dennett (in his case you have to dig it out as he is not exactly forthcoming on the matter) have been onto this, but each takes some pretty important missteps by my lights. Davidson and Wittgenstein are more promising guides, even though – or possibly because – neither makes a big deal out of marrying the one criticism to the other, instead simply pursuing a more unified project from the beginning, and only subsequently allowing us to take this or that aspect of it for this or that purpose. (Or maybe we're just slow.)

Hagberg sees himself as promoting a Wittgensteinian view of subjectivity, as manifested in discussions of autobiographical writing, the philosophy of same, autobiography as philosophy, etc. – where the very overlapping of these topics is meant to bring out the Wittgensteinian nature of each, which seems promising. I read an earlier book, Art as Language, in which he argues, along Wittgensteinian lines, that art is not a language, making one wonder if perhaps another title would have been better. This new one, Describing Ourselves: Wittgenstein and Autobiographical Consciousness, looks to be written in the same style. I shouldn't complain (especially as I can do no better myself), but I have to say that this style – very, very carefully setting out the position and arguing very, very, carefully for its truth, or at least provisional plausibility, very, very carefully anticipating and defusing every. possible. objection. – makes me crazy. In some cases this is just what we want, but in Wittgensteinian contexts it really seems like one risks getting the words Just Right at the expense of messing the tune up big time. Alice Crary and Oskari Kuusela do it too, which is why I started their books with real anticipation but bogged down, like, right away. Like I said, though, I've just started the book so maybe I'm wrong.

One person who can't be said to do this is Cavell (but of course there are corresponding dangers to this approach too, as anyone who has tried to wade through Stephen Mulhall's books can tell you). His latest, which looks wild but which I have not yet begun, seems to be autobiographical, but only deals with the last few years, and only a particular line of thought/events within that time. I'll say more when and if I get to it (or not).

Lastly we have another book on Wittgenstein, on a topic very close to this blog's heart: aspect perception (and indeed there the d/r is right on the cover, along with the kid from Jarman's Wittgenstein, which I just saw the other day). It's a collection, and one of the editors, William Day, was a very advanced grad student in my program when I was there. This meant of course that I rarely saw him, and in fact I didn't even meet him until several years in, at a conference. Nicest guy you'll ever meet, and very impressive – as articulate on these matters as your humble blogger is stammeringly incoherent. So I very much look forward to reading this.

Monday, January 03, 2011

A note from the management

My goodness, I haven't been around since July?! Okay, right. Let me do a little dusting, and then let's see if we can't start the engine up.