[Vagueness] seemed to present the strongest challenge to the classical, realist picture that has always rung true to me, on which the world is largely independent of us, and the principle of bivalence holds ― every proposition is either true or false (and not both), even if we do not and perhaps cannot know which ― and other standard principles of logic hold too. The problem was that, on an unqualified realist picture, there must be a point at which subtracting just one grain from a heap takes it from being true to being false that there is a heap in front of you, which seems to be incompatible with the vagueness of the concept of a heap, which has no precise definition. For a long time I could see no satisfactory way round that objection. Then, as I was finishing my first book, Identity and Discrimination, I started thinking about the way in which ordinary knowledge requires a margin for error. It dawned on me that the need for a margin for error would explain why, even though ordinary concepts have sharp boundaries, we can't know where those boundaries are located. That explanation solved the main objection to the logical view that I had always wanted to hold. So the hard part was working out the epistemology; the logic was the easy bit. The larger purpose underlying my book Vagueness was to argue for realism like this: if realism is wrong about anything, it is wrong about vagueness (that premise was generally agreed); but realism is not wrong about vagueness; therefore it is not wrong about anything. [my bold]

Well, that's one view of the matter, anyway. Or we could just marvel at how no nettle can be too sharp for the desperate realist to grasp. I had heard this before, actually – that he was trying to defend realism against what seemed to him to be its toughest challenge – but sometimes it's better to learn to crawl before you try to walk.