Wednesday, November 18, 2009

The Complexity of the Abuse Problem

This morning the BBC NEWS Web site ran an interesting report about the problem of abusive practices, particularly towards children, in the brave new world of social software on the Internet. Given that the undiscriminating embrace of this technology may be equaled only by the mindless evangelizing that continues to promote it, these stories are valuable. Nevertheless, this particular account reflects a bias that may not be particularly productive. Here is how the story opens:

Major social networking websites have been criticised for not introducing a help button for children to report concerns about grooming and bullying.

Jim Gamble, from the Child Exploitation and Online Protection Centre (Ceop), hit out at the sites as one site, Bebo, adopted the button.

He said there was "no legitimate reason" why other sites like MySpace and Facebook had not done the same.

As Bill Clinton would have put it, I feel Gamble's pain. I have long argued that the Internet is a hazardous place whose dangers have been consistently overlooked or downplayed by social software evangelists, although most of my attention has been directed toward the hazards facing adults. By all rights the risks to children should be even greater, and we need voices like Gamble's to raise consciousness about those risks.

Nevertheless, I fear that Gamble may not entirely grasp the nature of the technology. The Bebo "button," which is illustrated on the BBC NEWS Web page, is an instance of what tends to be called reporting technology. Most of us have encountered it in one setting or another. Indeed, anyone reading this should be aware of the technology, because it is at the top of the page. If I write something that offends, then a reader can click on the "Report Abuse" hyperlink to notify the Blogger support team that I have done so; and, whatever my past rants and inquiries into the dark side may have been, I have tried very hard both to edit my text before submitting it and to stay on the right side of the boundary of normative social practices.

The rub, however, resides in that second infinitive phrase. Where is that "boundary of normative social practices"? Engaging reporting technology is basically a request for a judgment on where a particular item sits with respect to those practices. How does such judgment take place? More importantly, in the midst of the heavy volume of content flowing through the Internet pipes, how can each such judgment be made both effectively and efficiently?

The bottom line is that there are no good answers to these questions, so Gamble's indignation has missed the point. Where MySpace and Facebook are concerned, the question is not whether or not they choose to adopt a simple button-based reporting technology. The real question is what happens when the button is clicked. What sort of account (and, once again, the concept of λόγος from Plato's investigation into the nature of knowledge in "Theaetetus" rears its head in the world of Facebook) is elicited when abuse is reported? Then, what happens to that account (even if it is nothing more than "the abuse button was clicked") after it has been submitted? To offer a reductio ad absurdum example of just how vulnerable this process is, I recently learned of a situation in this brave new world of outsourcing in which reports of abuse were being read by individuals who did not understand the language of either the content or the account very well.
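To make those hidden steps concrete, here is a minimal sketch of what has to sit behind such a button. Every name in it (AbuseReport, Reviewer, ReportQueue, the example URLs) is my own illustrative invention, not anything MySpace, Facebook, Bebo, or Blogger actually runs; the point is only to expose the three things the button conceals: the account that is elicited, the routing of that account, and the human judgment that must eventually be made.

# Hypothetical sketch of an abuse-report pipeline; names and structure are
# illustrative assumptions, not any site's real implementation.
from dataclasses import dataclass, field
from typing import Optional
import queue

@dataclass
class AbuseReport:
    """The 'account' elicited when the button is clicked."""
    content_url: str
    reporter_id: str
    category: str                 # e.g. "bullying", "grooming", "spam"
    description: str = ""         # free-text account; may be empty
    content_language: str = "en"  # language of the reported content

@dataclass
class Reviewer:
    """A human who must ultimately judge the report."""
    name: str
    languages: set = field(default_factory=set)

    def can_judge(self, report: AbuseReport) -> bool:
        # A reviewer who cannot read the content cannot make the judgment.
        return report.content_language in self.languages

class ReportQueue:
    """Routes submitted reports to reviewers; the judgment itself stays human."""

    def __init__(self, reviewers):
        self.reviewers = reviewers
        self.pending = queue.Queue()

    def submit(self, report: AbuseReport) -> None:
        self.pending.put(report)

    def assign_next(self) -> Optional[tuple]:
        if self.pending.empty():
            return None
        report = self.pending.get()
        for reviewer in self.reviewers:
            if reviewer.can_judge(report):
                return reviewer, report
        # No competent reviewer available: the report stalls here.
        return None, report

if __name__ == "__main__":
    q = ReportQueue([Reviewer("Ana", {"en", "es"})])
    q.submit(AbuseReport("http://example.org/post/42", "user123",
                         "bullying", "threatening comments", "ga"))
    print(q.assign_next())  # (None, report): no one on staff can read the content

The last branch of assign_next is the point of the outsourcing anecdote: a report that no available reviewer can actually read simply stalls, no matter how prominently the button was displayed on the page.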

So, while it may distress Gamble, there is, indeed, a "legitimate reason" why MySpace and Facebook have not jumped on the reporting technology bandwagon. Worse yet, it is unclear that Bebo is quite the paradigm of vigilance that Gamble would like it to be. Consider the following excerpt from a BBC Panorama report that I cited this past March:

Internet ratings company Nielsen claims that Bebo, with its one million Irish users, was the most popular site in Ireland after Google in 2007.

Sectarianism on the site hit the headlines after threatening posts surfaced following the 2006 murder of Catholic school boy Michael McIlveen in Ballymena.

Three years on, and some pages on Bebo brazenly continue to promote violence.

Has this situation changed since March; and, if so, was the change the result of the adoption of the reporting technology that Gamble so admires?

As I see it, the real problem with social software is that problems of offense and abuse are human problems for which humans have to be in all parts of the loop from the very beginning. The most dangerous consequence of Internet volume has been the gradual erosion of person-to-person "Contact Us" mechanisms in favor of alternative technologies, such as blogs in which users can discuss problems among themselves (which may or may not be monitored by the technology support team) or FAQ pages (where one may be able to vote on how informative they were without any confidence that one's vote counts for very much). The erosion is, of course, understandable. There are just not enough individuals available for "contact" to keep up with the load of users trying to make contact.

Thus, conditions are such that technology providers search frantically for bandages because no one seems to have the resources to analyze the nature of the problem and think about solving it. One would think that there are plenty of educated people out there who would be more than happy to work the problem. Is it then a question of not wanting to put budget into those resources? If so, then we are back in the days of the Ford Pinto, when managers decided that the cost of settling with the victims of a defective product was lower than the cost of repairing the product. I wish I could say that this is another instance of a farcical repetition of history; but, where the safety of children is concerned, this just is not the case.
