These days just about every major news organization that manages a Web site also manages a team of bloggers. In some cases those blogs serve as a sort of appendix to the published editorial page, giving staff writers an opportunity to contribute longer and more frequent pieces. In other cases the bloggers are recruited from outside the staff, often with little (if any) compensation, motivated primarily by the benefit of appearing on a site that attracts a large number of readers. In the CNET organization this is called the CNET Blog Network; and, since news-related blog posts show up on my CNET News.com RSS feed, I would say that CNET makes good on the promise of drawing large numbers of readers to its bloggers.
This morning an RSS headline (viewed through Google Reader) attracted me to an article on Chris Soghoian's Surveillance State blog. However, before I get into the relationship between Soghoian and CNET, I want to observe that a statement about the (lack of) relationship between him and CNET appears twice on his blog page:
Christopher Soghoian delves into the areas of security, privacy, technology policy and cyber-law. He is a student fellow at Harvard University's Berkman Center for Internet and Society, and is a PhD candidate at Indiana University's School of Informatics. His academic work and contact information can be found by visiting www.dubfire.net/chris/. He is a member of the CNET Blog Network, and is not an employee of CNET. Disclosure.
I would take this as a model statement for both the Blog Network itself and the sort of bloggers who participate in it: everything nicely above-board and apparently important enough to CNET to be stated twice on the same page.
The headline that attracted me was:
Recovery.gov shuns transparency, blocks Google
Given the extent to which I have been interested in (and often critical of) the relationship between the new Obama Administration and the Internet, this headline was hard to resist, since it offered at least a hint of hypocrisy in what had appeared to be a major differentiating feature of the changes that Barack Obama had promised to bring to the White House. The primary support for the claim in the headline appears in the following excerpt from the blog post:
Although the site is advertised as proof of the president's commitment to transparency, its technical design seems to betray that spirit. Most importantly, the site currently blocks all requests by search engines, which would ordinarily download and index each page to make the information more accessible to the Web-searching public.
The site's robots.txt file has just a few lines of text:
# Deny all search bots, web spiders
User-agent: *
Disallow: /

Although the White House Web team did not immediately respond to a request for comment, the single-line comment at the top of the file indicates that the blocking of search engines is no accident but rather a statement of policy.
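A claim like this is also easy enough for any reader to check. The following sketch is only an illustration (the URL serves merely as the example address), built on the urllib.robotparser module in Python's standard library; it asks whether a generic crawler, or Google's, would be permitted to fetch the site's home page under the published robots.txt:

import urllib.robotparser

# Download and parse the site's robots.txt (URL used purely for illustration).
parser = urllib.robotparser.RobotFileParser()
parser.set_url("http://www.recovery.gov/robots.txt")
parser.read()

# Under "User-agent: *" with "Disallow: /", both calls should return False,
# i.e. no rule-abiding crawler may fetch (and therefore index) any page.
print(parser.can_fetch("*", "http://www.recovery.gov/"))
print(parser.can_fetch("Googlebot", "http://www.recovery.gov/"))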
Now two points about this report caught my attention on strictly journalistic grounds. The first was the reminder that blog posts and similar products of what Timothy Egan, in the blog he writes for The New York Times (for which he used to work before retiring), called "Web info-slingers" do not constitute journalism (a case that Egan made compellingly in his "Save the Press" blog post). As such, we, as readers, have no idea whether bloggers trying to get a response from the subject of a post are aware of, let alone practice, the standard procedures of journalism. Thus, the final sentence of that excerpt cannot be read as carrying the authority of reported fact. It may be well-intentioned; but it is also gratuitous.
However, that sentence does bring us over the bridge to the relationship between the CNET Blog Network and the (presumably) professional journalists on the CNET staff. This bridge was crossed by a thread of comments from readers with experience in Web design. The thread was initiated by the following anonymous comment:
When I have been working on a new website, I have configured the robots.txt file to disallow indexing. Then when it is ready to go production, I change the file to allow indexing. It is always my fear that I will forget to change this. I wonder if this task got forgotten? Has anyone contacted the recovery.gov web master to confirm that this was their intention?
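The practice this commenter describes is common enough. As a purely hypothetical illustration (and not the actual recovery.gov configuration), a site still under development is often deployed with a robots.txt that reads

User-agent: *
Disallow: /

while the production version changes the last line to an empty "Disallow:" (or drops the file altogether), so that crawlers may index everything. Forgetting that one-line change at launch would produce exactly the symptom Soghoian observed, which is precisely the possibility the commenter raises.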
This was followed by a comment of agreement from "jon_abad," who took the question of appropriate practice to the next level:
I would hope that a news organization like Cnet would bother to ask the White House's web folks for a comment in order to determine if the spider blocking in robots.txt was on purpose as opposed to just posting conjecture.
Alas, I fear the hope expressed in this comment is "a consummation/Devoutly to be wished." I have read enough contributions to the CNET Blog Network to recognize that CNET takes a minimal approach to editing its bloggers (probably somewhat along the lines of the approach that Wikipedia takes to its contributors). However, in this case the question arises as to whether anyone on the CNET news staff takes the time to read what those bloggers write. The content of this post deserves the treatment of serious journalism to determine whether this is really a "story with legs" or an alarmist reaction to standard Web design practice.
1 comment:
On a related note, I recently surfed over to the Citizen's Briefing Book, only to find it is no longer accessible. Access was blocked before archive.org got around to it. Whether the website was purely a public relations move, or the government just doesn't want Americans to know that Americans want to know about UFOs and decriminalized marijuana, I can't say; but I'm definitely disappointed that the White House has prevented history from interpreting this document.