I see that, even in a time of economic crisis when there is great reluctance to commit significant funding to fantasy, the term "Web 3.0" is beginning to weasel its way into the working vocabulary of the technical press, often as a way of trying to breathe life back into that beast-that-will-not-die, the "Semantic Web." I have long argued that the very concept of a Semantic Web is a product of the unfortunate collision of reckless thinking about semantics with a World Wide Web that turned out to be far messier than its credited inventor, Sir Tim Berners-Lee, either expected or now seems willing to acknowledge. What I had not realized was the extent to which that reckless thinking about semantics has a long history that extends back at least ten years before Sir Tim was a gleam in his parents' eyes.
I have the Center for Dewey Studies to thank for this historical perspective, particularly through their publication of the volume Knowing and the Known, a collection of papers by John Dewey and Arthur F. Bentley (most of them jointly authored). The first chapter in this collection, "Vagueness in Logic," was written by Bentley and first published in The Journal of Philosophy in 1945; and Bentley uses this study to expose how some of the most reputable minds of the first half of the twentieth century had been incredibly sloppy in their use of words such as "proposition," "truth," "meaning," "language," and (yes, indeed) "fact." About the only one of those minds that does not get a thorough drubbing from Bentley is that of the Polish logician Alfred Tarski, whose work Bentley describes as "like a breath of fresh air after the murky atmosphere" of the other logicians he has examined. Bentley even allows us to sample that air with the following bit of Tarski's text (translated from the Polish):
It is perhaps worth-while saying that semantics as it is conceived in this paper (and in former papers of the author) is a sober and modest discipline which has no pretensions of being a universal patent medicine for all the ills and diseases of mankind whether imaginary or real. You will not find in semantics any remedy for decayed teeth or illusions of grandeur or class conflicts. Nor is semantics a device for establishing that every one except the speaker and his friends is speaking nonsense.
Why should I strain to aim my best polemic efforts at the likes of Berners-Lee when a deceased Polish logician has already said all that needs to be said about his visions? More important is the question of why this fixation on the Semantic Web as "a universal patent medicine" should be so persistent.
My proposed answer to this latter question is to blame everything on computer science education (I was part of the early "strike force" that designed and implemented both undergraduate and graduate curricula for this would-be degree program). Back in the day (as we now say), it seemed as if the best way to understand the ultimate capability of computer software was to study the theory and practice of compiler construction, through which one could get the computer to do what one had expressed in some "programming language." The theory side of the discipline could be broken down into two sub-disciplines:
- Syntax: The study of how one recognized the "well-formed" expressions of a programming language and could then capture that well-formedness by parsing them into structural representations.
- Semantics: The conversion of each of those structural representations into a sequence of operations grounded in the "machine language" of the computer that would actually be running the program (a toy sketch of both halves follows below).
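Purely for illustration, here is a minimal sketch of those two halves in Python (my choice of language for the example, not anything from those early curricula): a hand-written recursive-descent parser turns a hypothetical arithmetic expression into a structural representation, and a small translator flattens that structure into operations for an imaginary stack machine.

```python
import re

# Syntax, step 1: split the input text into number and operator tokens.
TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(text):
    return [num or op for num, op in TOKEN.findall(text)]

# Syntax, step 2: a recursive-descent parser that recognizes well-formed
# expressions over '+', '*', and parentheses, building a nested-tuple tree.
def parse(tokens):
    pos = 0

    def factor():
        nonlocal pos
        tok = tokens[pos]; pos += 1
        if tok == "(":
            node = expr()
            pos += 1                      # skip the closing ')'
            return node
        return ("num", int(tok))

    def term():
        nonlocal pos
        node = factor()
        while pos < len(tokens) and tokens[pos] == "*":
            pos += 1
            node = ("mul", node, factor())
        return node

    def expr():
        nonlocal pos
        node = term()
        while pos < len(tokens) and tokens[pos] == "+":
            pos += 1
            node = ("add", node, term())
        return node

    return expr()

# Semantics: convert the structural representation into a sequence of
# operations for an imaginary stack machine.
def emit(node):
    if node[0] == "num":
        return [("PUSH", node[1])]
    left, right = emit(node[1]), emit(node[2])
    return left + right + [("ADD" if node[0] == "add" else "MUL",)]

tree = parse(tokenize("2 + 3 * 4"))
# tree == ('add', ('num', 2), ('mul', ('num', 3), ('num', 4)))
print(emit(tree))
# [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('MUL',), ('ADD',)]
```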
In the world of compiler construction, the concepts of syntax and semantics were "very fresh and clean" (with apologies to Robert Wilson). Programming languages were never ambiguous; and they could only be literal, never figurative. Indeed, the theory was so "fresh and clean" that it did not take long to discover that, given the right formal description of a programming language, one could go so far as to "compile a compiler" for it based on an equally formal description of the capabilities of the computer that would be running the program.
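That routinization survives to this day in parser generators. As a sketch of the idea (using the third-party Lark library for Python, my own stand-in for the compiler-compilers of that era, not anything the original literature cites), the grammar string below plays the role of the formal description of the language, the library manufactures the parser from it, and the transformer class stands in for the formal description of the target machine (the same imaginary stack machine as above):

```python
from lark import Lark, Transformer

# The "formal description" of the toy language: a grammar for '+', '*', and parentheses.
GRAMMAR = r"""
    ?expr: expr "+" term   -> add
         | term
    ?term: term "*" factor -> mul
         | factor
    ?factor: NUMBER        -> number
           | "(" expr ")"

    %import common.NUMBER
    %import common.WS
    %ignore WS
"""

# The library "compiles" a parser from the grammar; nobody writes the parser by hand.
parser = Lark(GRAMMAR, start="expr")

# The "formal description" of the target machine: one rule per grammar production.
class Emit(Transformer):
    def number(self, children):
        return [("PUSH", int(children[0]))]
    def add(self, children):
        return children[0] + children[1] + [("ADD",)]
    def mul(self, children):
        return children[0] + children[1] + [("MUL",)]

print(Emit().transform(parser.parse("2 + 3 * 4")))
# [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('MUL',), ('ADD',)]
```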
The result of this triumph of technology was that the technologists assumed they now knew "all about" syntax and semantics. In the words of Michael Feingold's translation of Bertolt Brecht, they screamed "I've mastered it without half trying," when, in fact, the compiler-compiler itself offered so little insight into the roles that syntax and semantics play in how intelligent beings actually communicate that it now stands as a prime example of what José Ortega y Gasset, in The Revolt of the Masses, called "the work of men astoundingly mediocre, and even less than mediocre." Where technologists sought the crystal clarity of engineering in their understanding of human communication, they served only to muddy the waters; and these days it seems that the increase in both available data and computational power does little more than muddy those waters even more.
If we want to filter out that mud, we would do well to go back to Bentley's critique, rather than trying to seek out new technological tricks to prop up such a blatantly deficient understanding of what we know and how we communicate what we know. We should begin with the paragraph with which Bentley concluded his article:
This problem [of our understanding of knowledge and communication], we believe, should be faced naturalistically. Passage should be made from the older half-light to such fuller light as modern science offers. In this fuller light the man who talks and thinks and knows belongs to the world in which he has been evolved in all his talkings, thinkings and knowings; while at the same time this world in which he had been evolved is the world of his knowing. Not even in his latest and most complex activities is it well to survey this natural man as magically "emergent" into something new and strange. Logic, we believe, must learn to accept him simply and naturally, if it is to begin the progress to future demands.
With any luck (and with the even fuller light that today's modern science now sheds) we may then transcend the work of the mediocre and prevent it from engendering further disappointingly ineffective mediocrity!