My recent "Confluence of 'Glitches'" piece suggested that the sorts of systems failures that have recently plagued institutions as large as United Airlines and the New York Stock Exchange may be traced back to a major nosedive in what, back when I was both teaching and researching computer science, used to be called "software quality." Indeed, it has been so long since I have encountered that phrase that I have to wonder whether it still has any currency in software engineering practice. The fact is that our day-to-day life is more and more dependent on software embedded in the devices and systems that we now use heavily. This is no longer just a question of OS X software not being as good as it used to be. Within the last 48 hours I had an encounter with a security guard dealing with a screen freeze on his cell phone. I have a Blu-ray player that keeps telling me to wait while it upgrades its software; even after that "improvement," it still freezes on some Blu-ray discs. Comcast recently decided to implement a "smarter" HDMI interface, which has now been upgraded to the point of being worse than it was when I first used an HDMI connector. To add insult to injury, OS X Firefox can no longer display Blogspot pages with the utility bar at the top of the screen!
Recently, the Business section of the San Francisco Chronicle ran a story about the rise of "coding boot camps" in the Bay Area. This was a corporate-driven response to the observation that coding talent is not what it used to be. Still, it is worth asking just what sort of training those boot camps are providing. Could it be that software has gone down the tubes due to a competitive environment in which fast delivery is more important than product reliability? (After all, Microsoft conditioned us to expect that all software needs periodic updating.) Is this a sign of just how much we are at the mercy of technology that no longer works according to plan?