To make a sweeping, possibly unfair generalization about an entire swath of humanity, computer geeks come in at least two distinct subspecies. One is familiar from popular culture -- the unkempt, hairy, paunchy recluse lacking in social graces. An ugly stereotype, to be sure, but these people do exist. Less well-known is the second kind of geek, the kind of guy Julius Caesar feared -- the lean and hungry geek. These geeks come in compact packages, thin and wiry. They sport close-shaved goatees rather than long hair and rabbinical beards. They regard the world with blazingly intense eyes, taking in everything, evaluating it, wondering how to fix it. These are the visionaries, the geeks who don't like to waste their time doing unimportant, boring stuff. Their impatience is reflected physically: Their bodies shiver with a nervous, tightly contained energy, just waiting to explode into the "flow" of all-night coding sessions, or, in the case of Neal Stephenson, 900-page novels.
Lean and hungry geeks tend to be ambitious, and "Cryptonomicon," Stephenson's latest book, fits the bill -- it's an insanely ambitious techno-thriller/historical novel that critics are mentioning in the same breath as Thomas Pynchon's "Gravity's Rainbow." It's being labeled Stephenson's "crossover" book, mainly because it doesn't fit neatly into the same science-fiction slot as his last two novels, "Snow Crash" and "The Diamond Age." But Stephenson hasn't actually left his home territory. If anything, the truth is the opposite. "Cryptonomicon" is clear proof of Stephenson's ongoing intention to delve ever deeper into the heart of the digital era, to lay out in detail both excruciating and poetic the awesome influence the computer has exerted on the 20th century.
"Cryptonomicon," like "Gravity's Rainbow," is partly set during World War II. But it also takes place in the present day. The drama set in the earlier period centers around the invention of the digital computer as a code-breaking means of thwarting the Nazis. Meanwhile, in 1999, descendants of the WWII characters are gallivanting around the Far East, using computers in a venture-
I caught up with Neal Stephenson as he came through San Francisco on his book tour. He's not an easy interview (lean and hungry geeks rarely are) -- but I already knew that from previous interactions. In person, he is measured and restrained, judicious almost to a fault, offering a sharp contrast to the flamboyant exuberance that makes his novels such giddy, enjoyable rides. He doesn't get carried away. He doesn't like interpreting his own work. He's heard most of my questions too many times already. It is all too obvious that he is only meeting with a reporter because he is doing his duty promoting his novel, that he'd much rather be back home in Seattle pounding away at his keyboard, at work on his next delirious masterpiece.
His attitude is not uncommon in writers or, for that matter, computer geeks. And there's no question that Stephenson is both. His credentials as a writer don't need repeating. But his identity as a geek may run even deeper. He comes from a family of engineers and physicists, on both sides, and he was programming in BASIC when he was 15. He's been programming all his life -- he once even wrote an image processing program for the Macintosh -- but never as a paying job. He certainly doesn't try to hide it. When I meet him, he is wearing a T-shirt with the word "hackers" emblazoned across the chest.
I want to find a way to unleash the energy lurking inside of Neal Stephenson, the lean and hungry geek, but he's too cagey for me. He anticipates the direction of every question. He even prefaces his answers with framing meta-commentary: "This is where I'm going to be annoying," he says; "this is where I'm going to be evasive."
"This is where I'm not going to be helpful."
He's never actually annoying. He's quite civil, and he seems genuinely apologetic when our time runs out. But you sure don't want to ask him dumb questions. Like, is he happy with the comparison to Pynchon?
"I'm not unhappy with it," he says. He doesn't actually roll his eyes, but you can feel him thinking, "Well duh, what ambitious writer who just wrote a technologically obsessed book set partially in WWII wouldn't want to be compared to Pynchon?"
Well then, what about the lectures on the mathematics underlying cryptographic theory that Stephenson includes in his novel -- won't mainstream readers be taken aback? Did he worry about that?
"Not really," he says. "When it comes down to it, the few pages of the book that have equations on them don't contain a whole lot of plot or character development. So if you skip over that stuff you miss very little."
Indeed, to refrain from including the hard stuff, says Stephenson, would be tantamount to giving in to what he sees as a popular reluctance to face up to the implications of technological progress.
"To me it seems like there is a kind of a strange denial in a lot of our culture, about just how important science and technology have been this century," says Stephenson. "There's just an unwillingness to come to grips with it at all. I don't deprecate people who feel that way, but I do think that at the end of a century like this one it's not the end of the world if you toss an equation into a work of art."
"Cryptonomicon" is a book about many things -- World War II, the Philippines, venture capital and the high-tech economy, to pick just a few -- but the axis around which everything revolves is precisely that issue of how important science and technology have been -- as viewed from "the end of a century like this one." The novel's journey back in time follows directly, Stephenson says, from his ruminations about the future.
"The more I thought about the future of computing the more interesting it was to consider the history of it. This is true not only in computing but in a lot of areas. Maybe we could have known more about what was going to happen in the Balkans if we paid more attention to the history there. I started feeling the need to put things in a longer historical context."
Part of that historical context is the rise of what Stephenson calls "hacker culture." Until very recently, the culture generated by computer hackers was an underground phenomenon, usually misunderstood by the mainstream as something illicit and vaguely dangerous.
Stephenson's own success is one sign of the changing times. Before the emergence of the Internet as a mainstream phenomenon, before Wired magazine suddenly elevated the computer geek into a cultural icon, publishing "Cryptonomicon" as a major hardcover release would have made no sense. But the undeniable importance of techno-culture at the close of this century has spawned a widespread popular desire to understand how we got here. "Cryptonomicon," which Stephenson envisions as just one installment of a series of novels taking place in the past, present and future, makes eminent sense when seen as a response to this social hunger.
Both the novel and the hubbub of contemporary computing culture are fueled by the energy that seethes through one crucial contradiction -- the computer's significance as a tool both for hiding meaning from view and for enhancing access to information. The word "code," after all, can refer to something that has been encrypted and hidden, but it also refers to the basic building blocks of a program -- something that, with a little knowledge, is right out there in plain view.
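A toy example makes the duality concrete. The Python sketch below is mine, not Stephenson's, and its hash-derived keystream is nowhere near a real cipher -- but it shows both senses of "code" at once: the source is open for anyone to read, while what it produces is meaningless to anyone without the key.

```python
# Illustrative only: this is NOT a secure cipher and is not taken from the
# novel. The point is that the source code is completely in plain view,
# while its output is hidden from anyone who lacks the key.
import hashlib
from itertools import count

def keystream(key: bytes):
    """Yield an endless stream of pseudo-random bytes derived from the key."""
    for counter in count():
        # Hash the key together with a block counter to get 32 fresh bytes.
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR data against the keystream; applying it twice recovers the original."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

if __name__ == "__main__":
    ciphertext = xor_crypt(b"Attack at dawn", key=b"swordfish")
    print(ciphertext.hex())                         # unreadable without the key
    print(xor_crypt(ciphertext, key=b"swordfish"))  # b'Attack at dawn'
```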
If Stephenson's obsession with the meaning of code weren't obvious enough from "Cryptonomicon," consider what he decided to do after finishing his humongous tome. He took a little "break" and dashed off a 40,000-word meditation on the cultural significance of computer operating systems, "In the Beginning was the Command Line." Not only does the essay illustrate just how logically "Cryptonomicon" follows from Stephenson's earlier work, but it also provides useful clues on how to view the evolution of Stephenson's entire body of work.
"Ever since the Mac came out," writes Stephenson, "our operating systems have been based on metaphors." But as Stephenson matured as a computer user, he found himself increasingly disenchanted with the metaphorical stuff that came between him and his computer. He abandoned the Mac-style point-and-click "graphical user interface" (GUI). Instead he opted for direct contact via text input at the "command line" prompt.
Stephenson's psychological transition was encouraged by his growing infatuation with the Linux operating system, the flag bearer of the so-called open-source software movement. "Open-source software" refers to software in which the underlying source code to a computer program is made freely accessible to all, rather than locked away from users as a proprietary corporate secret. For Stephenson, his operating-system change of heart was a sign of upward evolution. He found it empowering and liberating to move from a metaphorical GUI desktop to a command-line interface, from closed code to open code.
Stephenson's last three novels follow a similar trajectory. In "Snow Crash" Stephenson won the enduring adulation of geeks everywhere by delivering two fabulously cool metaphors for what the computer could offer the world -- the Metaverse, an online reality in which hackers donned their favorite personas and acted out their fantasies, and the Librarian, a helpful digital entity, not unlike a real-life librarian, who is also a really, really neat way of imagining how we puny humans might someday be able to plumb the database of all recorded information. Then in "The Diamond Age," he went a step further, entrancing his readers with the "Young Lady's Illustrated Primer," a "smart" book that delivered lessons in life and computer theory disguised as interactive fairy tales, all with the intention of educating its young female owner in how to thrive in a treacherous world.
In "Cryptonomicon," Stephenson cuts to the chase. Instead of elaborate metaphors for how the computer works or might work, he brings us directly to ground zero: Alan Turing, the creator of the first digital computer, is even a character in "Cryptonomicon." As the novel flips back and forth across the last 50 years, we see both the birth of the computer and its current state-of-the-art implementation. Stephenson goes so far as to include actual code for an encryption algorithm in the text.
It's as if, after giving his readers a pair of advanced metaphor-based GUIs with which to contemplate the role of the computer in modern life, Stephenson has now decided to show us the "source" -- to deliver to us the truth about computing in all its raw, unblemished command-line beauty.
I lay out for him my painstakingly constructed analysis. What does he think? Is this a fair appraisal?
He nods his head briefly.
"That's a good analogy," he says. And then he pauses, waiting for the next question. (Lesson for would-be interviewers of geeks: Never ask them a yes-or-no question, because that's all you'll get in response.) But Stephenson does agree with the general thrust of my questions about the contradictory nature of code.
"That's the basic contradiction I'm trying to deal with here," says Stephenson. "There's always been this duality between secrecy and openness. The digital computer as we have it today was born in the attempt to deal with codes, to go into these impenetrable messages and bring back the information. In that time the codes that we were breaking were to us a sinister force. We had to break these codes or the bad guys were going to take everything over. Now, the computer is all about openness and spreading information to every corner of the world. But at the same time, we're finding that the more we do that, the more we are perceiving a need to encrypt our stuff, to keep it out of the hands of the bad guys."
It's a messy situation, especially for engineers and hackers used to thinking of the world in neat binary terms of ones and zeros, or as a set of problems that can all eventually be solved. The "open-source" advocates that Stephenson rhapsodizes about in his essay on operating systems are often the same people working hardest to ensure that individuals can keep their personal information private.
As our all too short interview comes to a close -- Stephenson doesn't want to keep a photographer waiting -- I try, again, to pull his novel and his essay into line with one another. I observe that the very same open-source hackers who are luxuriating in Stephenson's beloved command-line world are hard at work devising their own Macintosh-like GUIs. Isn't this going backward? I ask him. To me, the gist of Stephenson's writing -- his comments about the importance of science and technology, his enthusiasm for incorporating code and equations into his novel -- adds up to a strong authorial point of view: The world would be a better place if people were smarter about their relationship with computer technology. But those open-source hackers are busily striving to make it easier for people to be stupid.
Doesn't the world need more smart users? I ask Stephenson.
"I think we need an upgrade path," he answers. "I think we need a way to encourage people to become smart users."
Is "Cryptonomicon" part of that upgrade path? Showing people the "source" -- delivering to them the roots and history of computing culture -- is this Stephenson's contribution to social smartening-up?
"I can see where you are going," says Neal Stephenson. "It would make a nice wrap-up for your story. I don't know. I mean, the only way I could see that happening is if somehow this makes geek culture a little more accessible to people, so they don't feel like they are becoming some kind of monster as they learn how to use this kind of technology."
He is too modest -- a rare compliment to bestow upon a hacker. With "Cryptonomicon," Stephenson has embroidered the phrase "computer literacy" with a whole new layer of meaning. He has become the poet laureate of hacker culture. So why even bother with the dumb questions? Cut the book tour short and send this man back to Seattle. He's got some more writing to do.