The father of a good friend of mine once lamented that his son entered college as a Republican Windows user but graduated as a Democrat and, perhaps worse, a Mac lover. The two conversions are parallel because Apple fans are the progressives of the computer industry, who believe that computers will continue to become not just gradually better but dramatically better. These improvements, they have been taught, arrive as dramatic jumps in power and ease of use every five or 10 years -- the digital equivalent of punctuated equilibrium, to borrow a term from Stephen Jay Gould.
Their quixotic faith in this evolutionary process has persisted even though the next quantum leap is now five or 10 years overdue. For some, the release of Apple's new operating system, Mac OS X, was going to be the next jump. But Apple's public beta test of its biggest OS update since the Mac was first released in 1984 may throw cold water on notions that the computer industry is due for another big advance and that Apple will be the company that makes it happen.
It's common knowledge among fans of Christmas and ketchup that waiting for the good stuff is half the fun. But after too lengthy a period of waiting, we begin to doubt that the delivery will be made and begin to resent the thing we wanted, as if the ketchup had solidified in the bottle. Eventually we stop shaking and tapping and just eat the hamburger plain.
Mac fans used to enjoy waiting for the Next Big Thing because they were almost always rewarded. In the '80s and early '90s, what the Mac delivered really was big. In fact, with the exception of the Internet, what the Mac came up with constitutes the modern conception of "what computers do." Revolutionary trends invented or first capitalized on by Apple during this period include the graphical user interface, networking, plug and play, 3.5-inch floppies, CD-ROM drives, digital imaging, digital video and laser printers.
And that's just a start. This proliferation of paradigm shifts seemed as if it would continue forever at a constant or even increasing rate. But of course the pace of change for the better in the computer world had to slow, as it has in every new industry before it. Just because personal computers sprang seemingly from nowhere into the forefront of pop culture does not mean they are exempt from the same cycle as every previous miraculous technology, from flint arrowheads to blow-dryers.
The dynamic at work is a simple one. Inventors mostly refine previous ideas by building the proverbial better mousetrap. But every once in a while, when the time and place are just right, they see the need for a mousetrap in the first place while everyone else just puts out some grain at night to keep the mice from climbing into their shoes. This second kind of invention doesn't simply improve on processes we are already familiar with, but creates entirely new ways of doing things or entirely new things to do. And because the invention is new, there are usually immediately obvious ways to make large improvements upon it that turbocharge its early development. Gradually the refinements become less and less exciting -- certain users may even perceive them as backward steps.
Which brings us back to Apple, the arguably deserving poster child of innovation in the realm of the PC. The advancements that Apple created or adopted in the '80s were so important to the computer industry that it became axiomatic that some company, probably Apple, would periodically continue these great leaps forward. Given the quantum leap from typing lengthy, arcane commands to selecting sensible actions from a pull-down menu, anticipation was high as to what would come next. Mac lovers settled in with popcorn, eager to witness the next quantum leap in the relationship between man and machine, while DOS/Windows users were just happy that they could use their cheap PC as a cash register and a payroll system.
But something funny happened on the way to the revolution: Things continued to gradually improve. Microsoft borrowed whatever major features of the Mac OS it was technically able to. Both companies then simply refined and refined. Meanwhile breathless pundits and well-meaning R&D departments began searching for bandwagons to jump on: speech recognition, artificial intelligence, handwriting recognition, virtual reality -- anything that hinted at a new era of computing. One by one each promise proved mostly empty. Processors kept getting faster but only gamers could figure out what to do with them. Anticipation fatigue set in. Soon the only thing left to do was take all the suddenly boring computers and make them more interesting by hooking them all up to one another via the Internet. But a Web browser, which does a lot for person-to-person and person-to-database interaction, does depressingly little for person-to-computer interaction, just as the installation of a CB radio fails to qualitatively alter the mechanics of the driving experience.
Even while the Internet was barging its way into the zeitgeist, many people still felt that Apple might be able to pull off the next paradigm shift. Apple labored for years on various tantalizing "next generation" OS technologies (code-named Copland and Gershwin) but failed to produce the promised new system. Then Steve Jobs returned. He implemented not only a fantastic idea for remaking the company by giving Macs a lower price point and colorful translucent cases but also a concrete plan for a completely rewritten Mac OS based on software from his old company, NeXT. Suddenly a lot of people's expectations got too high again.
OS X has been available to the public in beta form for weeks now, and its unveiling brought a collective cry of "Huh?!?" Whatever it is, it is definitely not a completely new and revolutionary way of dealing with computers. All the new stuff is either great but invisible plumbing or skin-deep candy coating on things the Mac OS has provided for years. OS X is still primarily about pointing and clicking and files and folders and windows and menus, which is why it has received such a lukewarm response. From Microsoft, people expect, at best, incrementally less annoying. From Apple, they demand insanely great.
But that demand is unfair to Apple and to the industry as a whole. The computer business is growing up and doesn't need anyone to drive it to soccer practice anymore. Its growth spurts have gone away along with its bad complexion. Apple did a lot of amazing things in the '80s, as did many other companies. It was an extraordinary time in which an enormous number of factors came together to create an explosion of development. It's unlikely that a sea change in the basic operation or functionality of computers will occur on that scale ever again. While it's true that we would have a difficult time recognizing a year 2030 model PC transported back in time to us now, it won't be because computers changed quickly. Instead, a gradual metamorphosis will transpire as one feature is added here and another becomes outmoded there.
The tech industry as a whole has been melancholic of late. The major trade shows, the closest the industry gets to religious festivals, come and go with little of the traditional secular fanfare announcing exciting new products. Software and hardware sales are down, and if tech stocks were any more bearish they'd be on television trying to prevent forest fires. Apple, in particular, is suffering from deflating stock prices and plummeting profits. But is this really Apple's fault? Perhaps the current anomie afflicting all things digital can be traced to a realization in the collective unconscious that we're all going to be pointing and clicking and shuffling files around for a long time to come. Maybe the revolution is over.