Tuesday, February 20, 2007

Big Brother is Mining You

One of Amy Goodman's guests on this morning's Democracy Now! was Canadian human rights attorney Maureen Webb, discussing her book Illusions of Security: Global Surveillance and Democracy in the Post-9/11 World. As a lawyer she had a strong command of both legal principles and specific cases, but she went further than this and discussed her own efforts to comprehend the nature of current data mining technology. This gave her the informed position of addressing what the technology can do, what its developers would like it to do, what it probably cannot do, and why all three of these questions are cloaked in secrecy: just about all of the technology has been developed for private-sector organizations that need to keep the algorithmic details proprietary. The whole thing made for a highly chilling story.

Ironically, the chill factor was enhanced by today's report by Victoria Shannon for The New York Times (which I happened to read at SPIEGEL ONLINE). For all the threats to human rights and civil liberties rooted in this new culture of data mining, it would appear that there is very strong sentiment in European countries, such as Germany and the Netherlands, in favor of following the American lead. Like it or not, Orwell's vision is now very much with us; and it appears to be with us on a global scale.

Fortunately, NPR brought on a guest who could talk about an alternative strategy for dealing with terrorist threats. This was Stephen Flynn, discussing his book, The Edge of Disaster. The most important message from Flynn's interview is that any effort to prevent terrorist acts is, at best, an exercise in frustration:

Officials should think less about security than about resiliency, Flynn says.

"We're focused almost myopically on preventing every act of terror, which is... frankly an impossibility," he says. "But what we can do, what we can control, is how we respond when terrorist incidents happen or when accidents happen."

Flynn's logic is that, if you can demonstrate that your system is resilient (meaning that, even if an attack occurs, it can come back "up to speed" quickly and with few ill consequences), then that resiliency makes you a less attractive target and contributes to lowering the likelihood of attack. I tend to buy into this, but I suspect it will not find much support from those who derive their power from keeping the general public hiding under their beds!
