I found Jason D. O’Grady’s piece about Apple Television in The Apple Core department of ZDNet to be a useful read. If nothing else it offers a reality check on all that hagiography now being circulated about Steve Jobs. O’Grady begins with Jobs’ “vision statement,” as it was documented in Walter Isaacson’s biography:
I’d like to create an integrated television set that is completely easy to use. It would be seamlessly synched with all of your devices and with iCloud. It will have the simplest user interface you could imagine. I finally cracked it.
This clearly has high “cool” value; and I can even imagine Jobs promoting it to a rapt audience of the Apple faithful. However, whatever his gifts for interface design, this may have been a case where he ignored (either willfully or accidentally) both the question of how we now interact with our television sets and the question of whether we want our television set to behave like a computer.
Personally, I am not sure I want my television set to be a computer, particularly after what things have been like since I got a Blu-ray player with Wi-Fi connectivity. The bottom line is that the Internet-based services are not quite at the level I want yet; but that is not what bothers me. The real problem is that this device keeps upgrading its software, and this can be disruptive. If I want to play a Blu-ray or DVD, I have to wait for the upgrade to complete. When I want to play a CD, the television is not even turned on; so I have no way of knowing whether an unresponsive player is simply in the middle of an upgrade.
What these new Internet-based devices seem to have overlooked is a fundamental axiom:
Sometimes you just want your television set to be a television set.
Whatever my misgivings about Comcast may be, they seem, for the most part, to handle upgrading the set-top box software in ways that do not disrupt my television viewing. The question is not just whether the Apple Television interface is “the simplest user interface you could imagine.” The question is whether there are certain fundamental aspects of operation that make a computing device significantly different from a television set or a CD player, and software upgrades have a lot to do with the nature of that difference.
I like to think of this as a “functionality gap.” I am sure that this gap will eventually close, but I suspect that the act of closing it will involve a fair amount of pain for early adopters. Naturally, early adopters tend to be good at living with that pain, particularly if their feedback is honored. Steve Jobs did not always have the best opinion of others, and that may well have included much of the Apple user community. Nevertheless, he did appreciate the value of learning from the decisions you make, both the good and the bad. I have to wonder whether post-Jobs Apple will be able to negotiate the learning path that may ultimately lead to our computer supplanting our television.
Meanwhile, I have a bright idea of my own. I like the way in which service providers like Comcast are trying to make it possible for me to see on my computer screen the same things I can see on my television through a set-top box. There are still a lot of bugs in the solutions in place; but I get the impression that there are a lot of decision-making heads out there that know how to keep their eyes on the prize. Perhaps one step in the right direction will be to view that television-to-computer connection from the other side. What would it take for my set-top box to support a virtual remote terminal to my computer, just as software currently exists to let one computer serve as a remote terminal to another? Such a service would get set-top box users comfortable with the idea of letting their television set also be their computer, which would provide a user community better equipped to give feedback to those development teams trying to close the real gap in question.