Sam Williams' article raises some intriguing issues at the junction of computer science and history. What's most remarkable, however, is the idea that computer scientists/programmers should place themselves in charge of preserving this history. Contrast that situation with the visual arts or literature: art history is not preserved and analyzed by artists, nor is literary history the province of writers. Historical scholarship is the domain of trained historians, whose research methods enable them to see beyond the superficial, qualitative aspects of their subject and place it in an evaluative context. Grady Booch himself notes that literature and art each have a unique critical discipline, while software has none -- though it should. It should be clear that, by analogy, computer science/software should have its own historical discipline, tended by historians as criticism is tended by critics.
When one makes the appropriate distinction between aesthetics and history, it is clear that the "elegance" of a particular piece of code, as Booch sees it, matters less than its influence on later authors and its role in shaping subsequent events. Unless sound historical research is applied to the collection of software examples, the end result will be no more than that -- a mere collection. It is interesting to note that the Computer History Museum Williams mentions is staffed and advised primarily by computer scientists, and that its chief curator is not a trained historian. I will be curious to see how, without a historian's perspective, this institution determines which things are significant and why. What seems important to computer scientists today may turn out to be of minor relevance in the long course of this very young discipline's history.
-- Deborah Beranek Lafky
Williams' article was right on point. One of my earliest programming jobs out of college was to reverse-engineer the Univac object code of a still-in-use NASA telemetry program for which the source was lost, and port it to an IBM mainframe. And this was in 1985. I'm 43 years old -- perhaps ancient by Salon.com reader demographics, but not really all that much of a greybeard -- and I've seen countless languages, "structured programming methods," and compiler philosophies bloom, have their day in the sun, and wither away.
Some of them deserved to die; others should be preserved. I'm sure this is not an original idea, but maybe someone should start thinking along the lines of a mega-Rosetta Stone: every time a new computational tool (a language, a method, etc.) becomes generally accepted by the software community, it gets added to the "Stone" in such a way that it can be mapped to a prior tool. Not that I know how to do this, of course, but it's an interesting concept (there's a cartoon of what I mean in the P.S. below). Since we're still dealing, essentially, with von Neumann computing, I don't see any reason why it couldn't be done. Love Salon.com -- keep up the great work!
-- Eric Bobinsky
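P.S. To give a feel for what I mean -- and this is only a toy, with every tool name and translator invented for illustration, since real translation between tools is far harder than a lookup -- the Stone could be modeled as a registry in which each newly accepted tool maps back to one prior tool, so anything written for a newer tool can be walked down the chain to a common ancestor. A sketch in Python:

    # A cartoon of the "mega-Rosetta Stone": every accepted tool registers a
    # translation back to one prior tool, so an artifact written for a newer
    # tool can be walked down the chain to a common ancestor. All tool names
    # and translators below are made up purely for illustration.

    class RosettaStone:
        def __init__(self, root):
            self.root = root      # the oldest tool on the Stone
            self.mappings = {}    # tool -> (prior tool, translator function)

        def register(self, tool, prior, translate):
            # A tool may only join the Stone by mapping to a tool already on it.
            if prior != self.root and prior not in self.mappings:
                raise ValueError("unknown prior tool: " + prior)
            self.mappings[tool] = (prior, translate)

        def to_root(self, tool, artifact):
            # Apply each translator in turn until we reach the root tool.
            while tool != self.root:
                prior, translate = self.mappings[tool]
                artifact = translate(artifact)
                tool = prior
            return artifact

    # Imaginary chain: FORTRAN <- C <- Python, with stub translators.
    stone = RosettaStone(root="FORTRAN")
    stone.register("C", "FORTRAN", lambda src: "(was C) " + src)
    stone.register("Python", "C", lambda src: "(was Python) " + src)
    print(stone.to_root("Python", "x = 1"))  # walks Python -> C -> FORTRAN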