Thursday, November 6, 2008

The Machine Gets Confused

When I wrote a meditative reflection on E. M. Forster's short story "The Machine Stops" last May, the point I was trying to make was that we have become victims of those who spend too much time promoting what a technology does and not enough (if any) time addressing the consequences of what happens when the technology fails to do what it is supposed to do. Our victimization takes the form of a dependency (which I continue to suggest may best be described as addictive) on what the technology does, to the exclusion of thinking that, whatever that task may be, there are other ways to do it. Thus, the sorts of consequences I have in mind often amount to a form of withdrawal (as is the case when, for example, our electronic mail service is disrupted); but I have also suggested that the consequences may involve abandoning any "sense of reality," if not just plain old common sense.

For an illustration of this latter phenomenon, consider, as a case in point, the following story about some unfortunate newlyweds from today's SPIEGEL ONLINE:

After a wedding in the town of Hamm, just east of Dortmund, the couple set off for a hotel in a rustic village called Willingen. They switched on their car's navigation system and proceeded to follow instructions.

Unfortunately, the navigation system seemed to have no better idea of where to go than the couple had. The newlyweds found themselves driving along a bumpy, unpaved forest road toward a tall mountain. Even when a barricade blocked further progress, the navigation system led them forward. But when they tried to drive around the roadblock, their car got stuck.

Not wanting to spend the night in a pitch-black forest on the side of an 840-meter (2,755-foot) mountain, the couple called the police. By now it was about 8 p.m. The police needed another two hours to find them, since the couple was unable to say exactly where they were.

Finally, though, the cops were able to get the car back onto a main road and lead the couple to their honeymoon hotel -- where they checked in just before midnight.

The common sense part is best illustrated by the driver's conviction that the car's navigation system "knew more" than a physical barricade. Put another way, the "word" of the technology was accepted without the slightest shadow of a doubt, to the point that the driver preferred to doubt the "message" of the barricade.

This also tells us something about how the technology was designed, which is that it apparently could not tell the driver, "I don't know." In terms of my attempts to analyze what I have called "service pathology" (which is as applicable to service-providing technology as it is to human service providers), this was a case of the "pathology of ignorance" in which the consequences were pretty dire. However, just as serious was the driver's dependence on a "navigation service" for getting to a destination. As I had suggested around the same time that I was brooding over Forster, we have become a culture whose ontology no longer seems to have room for the good old-fashioned paper map, which leaves us just as vulnerable as all those in Forster's story who were helpless when their all-serving machine stopped.
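To make that design point concrete, here is a minimal sketch of what "being able to say I don't know" might look like. The names (plan_route, RouteResult) and the toy map data are my own hypothetical illustration, not anything drawn from an actual navigation product; the point is only that the routing function returns an explicit admission of ignorance instead of forging ahead with a guess:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class RouteResult:
        route: Optional[List[str]] = None   # waypoints, if a route was found
        reason: Optional[str] = None        # otherwise, an explicit admission of ignorance

    def plan_route(road_map: dict, origin: str, destination: str) -> RouteResult:
        # Look up a route; if none exists in the map data, say so rather than
        # silently steering the driver toward the nearest road the system knows.
        path = road_map.get(origin, {}).get(destination)
        if path is None:
            return RouteResult(reason=f"no route from '{origin}' to '{destination}' in the map data")
        return RouteResult(route=path)

    # The caller is then obliged to handle the "I don't know" case,
    # for example by handing the driver back to the paper map.
    result = plan_route({"Hamm": {"Willingen": ["Hamm", "A44", "Willingen"]}}, "Hamm", "Willingen")
    if result.route is None:
        print("Cannot navigate:", result.reason)
    else:
        print(" -> ".join(result.route))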

This is usually the point at which at least one technology evangelist invokes the logic of the National Rifle Association: Machines don't make mistakes; people make mistakes! This may be true, but it is not helpful. Was this "honeymoon nightmare" story an account of bad design or negligent implementation? Whether the designer failed to allow for the possibility that the system would accept an address it could not process or the coding team did not classify the hotel address properly makes little difference from the point of view of the user. It is a little bit like the automatic checkout system at my local Safeway, which works great as long as a human supervisor is present. That supervisor has been trained to a level of skill where, when things are working, (s)he can manage four checkout stations. Unfortunately, last weekend I saw the consequence of a system failure coinciding with a supervisor unprepared for that failure; and that poor Safeway had to cope with longer lines at fewer human-run checkout aisles! Thus, if we choose to call either the German or the Safeway case study a problem of people making mistakes, they are making those mistakes under the aggravation of a technology that was supposed to "make things better!"

The SPIEGEL ONLINE story also demonstrates that this is not strictly an American problem. It is a problem that can arise in any culture victimized by technology addiction, which means that, thanks to globalization, it is a world-wide problem. Unfortunately, as was the case with Forster's fictional culture, we seem more inclined to live with the problem than to try to do something about it.
