Today's San Francisco Chronicle had a review by Mary Eisenhart of The Big Switch: Rewiring the World, From Edison to Google, by Nicholas Carr. The focus of the review is as follows:
Carr devotes the second half of his book to the study of unintended (at least by the innovators and cheerleaders) consequences of the 20th century's technological breakthroughs and likely parallels in the 21st's.
As one might expect, Carr views most (if not all) of those unintended consequences as running the gamut from unfortunate to downright dangerous; and, as the title suggests, he makes as much of a case for Edison's "electric age" as he does for the "Internet age," which is the primary attraction for most of those "innovators and cheerleaders." Since I have now run up a count of 175 for my use of the "consequences" label, I was naturally interested in learning more about this book, particularly in light of the number of times I have used this label in conjunction with the "Google" label. More recently I have also been invoking the phrase "technocentric ignorance" when I write about such consequences. Thus, I was eager to read Eisenhart's review in its entirety. What fascinated me most, however, was her account of a game of text analysis that Carr played to make a point about the nature of consequences and how we think about them.
The game involves comparing two paragraphs. Unlike Eisenhart I would like to present both of them initially without attribution. Here is the first:
The more we teach this megacomputer, the more it will assume responsibility for our knowing. It will become our memory. Then it will become our identity. In 2015 many people, when divorced from the Machine, won't feel like themselves - as if they'd had a lobotomy.
Here is the second:
[As] machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.
Note that, at a denotational level, both of these texts come very close to saying the same thing. Nevertheless, each text is embedded in a radically different context; and it is basically that context that determines the tone the text connotes. In one case the tone is utopian, and in the other it is dystopian. Before reading further, see if you can find any clues in the "raw text" that will allow you to distinguish them.
Let me now continue the game by declaring the source of one of these paragraphs: It has been extracted from Ted Kaczynski's Unabomber Manifesto. That should be enough to establish that this particular paragraph is the one with the dystopian connotation. With this additional information, again try to figure out which is the dystopian paragraph before reading further.
All right, game over. Kaczynski's is the second of the two paragraphs. The first is by former Wired editor Kevin Kelly, cheerleader par excellence. Eisenhart wraps up her review by offering a final quotation from Carr's book and her reflection on that quotation:
"What was for Kaczynski a paranoia-making nightmare is for Kelly a vision of utopia," he writes - and it's a fact that should give us all pause as we rush headlong into the connected future.
I could not have said it better myself!