The Land of Forgotten Software: Always There, but Rarely Noticed

Published in Coder stories

Jan 30, 2019

6 min


Some things are built to last. But maybe not software. The world of software moves quickly. Languages come in and out of fashion. Frameworks appear out of the blue. New versions are released that are completely backward incompatible. Everyone is focused on the shiny new thing that appears every six months or so. Nonetheless, a few bits of software have stood the test of time. They were created a long time ago, but are still used everywhere. Most programmers today know these programs, but not the names of the programmers who created them. What follows are the stories behind four of them.

Pipe dreams

Douglas McIlroy had a dream. His vision was of industrialized programming, where programmers took pre-built, reusable components “off the shelf” and assembled programs with them, in much the same way a plumber might use a standard pipe fitting or an automobile-assembly-line worker might bolt on a fender.

At a conference sponsored by the NATO Science Committee and held in Garmisch, Germany, October 7-11, 1968, he opened the presentation of his paper titled Mass Produced Software Components with this:

“We undoubtedly produce software by backward techniques[…] Software production today appears in the scale of industrialization somewhere below the more backward construction industries. I think its proper place is considerably higher, and would like to investigate the prospects for mass-production techniques in software.”

He added:

“Of course mass production, in the sense of limitless replication of a prototype, is trivial for software. But certain ideas from industrial technique I claim are relevant[…] The idea of interchangeable parts corresponds roughly to our term ‘modularity,’ and is fitfully respected. The idea of machine tools has an analogue in assembly programs and compilers. Yet this fragile analogy is belied when we seek for analogues of other tangible symbols of mass production. There do not exist manufacturers of standard parts, much less catalogues of standard parts. One may not order parts to individual specifications of size, ruggedness, speed, capacity, precision, or character set.”

The amazing thing is: everything he said is still pretty much true fifty years later!

McIlroy is the author of the first, and most important, program from the land of forgotten software—Unix Pipes. In fact, the entire Unix operating system still bears the imprint of his vision, despite him stepping down from leading its development team at Bell Labs in 1986. The rules of Unix are unbelievably simple, yet developers have been amazingly consistent in their application, even though there is no real enforcement mechanism.

The “official” overarching paradigm of Unix is the idea that the power of a system comes more from the relationships among programs than from the programs themselves.

In the foreword to the special issue of The Bell System Technical Journal devoted to the UNIX Time-Sharing System, he writes:

  • Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features.
  • Expect the output of every program to become the input to another, as yet unknown, program. Don’t clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don’t insist on interactive input.

Unix Pipes are the very embodiment of this philosophy. Although perhaps few of you know who McIlroy is (he is still alive and kicking!), it is very likely that most readers do know who Dennis Ritchie was. Let’s give the last word to him:

“One of the most widely admired contributions of Unix to the culture of operating systems and command languages is the pipe.”
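
To see the philosophy in action, here is a minimal sketch, in Python rather than the shell, of wiring two classic “do one thing well” tools together with a pipe. It assumes a Unix-like system with sort and uniq installed, and words.txt is just a placeholder file name:

```python
import subprocess

# Equivalent to the shell pipeline:  sort words.txt | uniq -c
# The output of sort becomes the input of uniq, connected by a pipe.
sort_proc = subprocess.Popen(["sort", "words.txt"], stdout=subprocess.PIPE)
uniq_proc = subprocess.Popen(
    ["uniq", "-c"],
    stdin=sort_proc.stdout,
    stdout=subprocess.PIPE,
    text=True,
)
sort_proc.stdout.close()  # let sort see a broken pipe if uniq exits early

output, _ = uniq_proc.communicate()
print(output)
```

Neither program knows anything about the other; the pipe is the only contract between them.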

Y2K or 2036?

How many of you know what these two dates have in common? Would it help to say that the answer is Dr. David L. Mills?

Mills is the man who, in 1981, designed NTP, the Network Time Protocol, while working at COMSAT, the US-government-regulated consortium that laid the satellite foundations of the modern world. The base NTP protocol has been running, substantially unchanged, since then, making NTP the longest-running application protocol in existence. Thirty-eight years is a long time.

There were fears in 1999 that it would all go pear-shaped when we ticked over into the millennium, but NTP shrugged it off as though it was just another day. The new doomsday prediction is 2036, when the 32-bit seconds field of the NTP timestamp rolls over, but don’t assume NTP is not up to the task. After all, it was built to withstand nuclear attack (not really, but that was the rumor about the early packet-switching network ARPANET for many years).

“What could be simpler than a time protocol?” you might ask. Just send a query to some public time source (NIST, for example), and when you receive the current time from the server, you set your computer’s clock, right? Sure. Easy. As long as you know just one thing: how long it took for the answer to get back to you.
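
Real NTP-style clients answer that question with four timestamps taken around the exchange. A minimal sketch of the standard offset-and-delay arithmetic (the timestamp values here are invented for illustration):

```python
# t1: client sends the request      (client clock)
# t2: server receives the request   (server clock)
# t3: server sends the reply        (server clock)
# t4: client receives the reply     (client clock)

def ntp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Return (clock offset, round-trip delay) in seconds."""
    delay = (t4 - t1) - (t3 - t2)           # time actually spent on the network
    offset = ((t2 - t1) + (t3 - t4)) / 2    # how far the client clock is off
    return offset, delay

# Example: roughly 40 ms spent on the wire, client about 150 ms behind the server
offset, delay = ntp_offset_and_delay(100.000, 100.170, 100.171, 100.041)
print(f"offset={offset:+.3f}s delay={delay:.3f}s")  # offset=+0.150s delay=0.040s
```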

Maybe network time is not so trivial after all. But let’s ask this: what is the problem if it is one or two seconds off? Well, a second is a long time. Long enough for hundreds, or thousands, of things to happen. Computers are fast. If you need to compare a log on one computer with a log on another computer to match up a transaction, a second might as well be a year.

Mills was deeply involved in routing protocols. He was the first chairman of the Internet Architecture Task Force, and he created Fuzzball, the DEC PDP-11-based routing software that was used to build the National Science Foundation Network (NSFNET), also sometimes known as the internet backbone. Mills designed NTP to handle the challenges of network timekeeping by polling a group of peers and using their responses to detect transmission errors and calculate message-transmission times. Peer-to-peer (P2P) networking in 1981!

The algorithms that implement this P2P network without creating logical loops borrow from network-routing techniques, in particular the Bellman-Ford algorithm (for those who are interested), which gives the network the robustness to survive even the loss of the primary timekeeping source. To quote Request for Comments (RFC) 1059: “Timekeeping accuracy throughout most portions of the internet can be ordinarily maintained to within a few tens of milliseconds, even in cases of failure or disruption of clocks, time servers or nets.”
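
For the curious, the heart of Bellman-Ford is simple relaxation: keep improving your best-known distance to every node until nothing improves any more. A textbook sketch of the general algorithm (not Mills’s NTP code):

```python
def bellman_ford(nodes, edges, source):
    """Shortest distances from source; edges is a list of (u, v, weight) tuples."""
    dist = {n: float("inf") for n in nodes}
    dist[source] = 0.0
    for _ in range(len(nodes) - 1):      # at most |V| - 1 rounds of relaxation
        for u, v, w in edges:
            if dist[u] + w < dist[v]:    # found a shorter path to v through u
                dist[v] = dist[u] + w
    return dist

# Example: link costs between four nodes, measured from node "A"
nodes = ["A", "B", "C", "D"]
edges = [("A", "B", 1), ("B", "C", 2), ("A", "C", 5), ("C", "D", 1)]
print(bellman_ford(nodes, edges, "A"))   # {'A': 0.0, 'B': 1.0, 'C': 3.0, 'D': 4.0}
```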

Synchronized computer clocks are used for everything, from personal-file management to copy protection (e.g., trial licenses) to criminal investigation. It is truly hard to imagine a world without NTP, the second program in the land of forgotten software.

Is there an Echo in here?

In 1983, Mike Muuss attended a meeting in Norway held by DARPA (the U.S. Department of Defense agency responsible for the development of ARPANET). There, he had a conversation with Mills, who, as Muuss later recalled, “described some work that he had done on his Fuzzball LSI-11 systems to measure path latency using timed ICMP Echo packets.”

Muuss was working at the Ballistic Research Laboratory and wanted to explore some unusual network behavior. Remembering his conversation with Mills, he sat down and wrote a bit of code to probe the network using timed ICMP Echo packets. In Muuss’s own words:

“The code compiled just fine, but it didn’t work—there was no kernel support for raw ICMP sockets! Incensed, I coded up the kernel support and had everything working well before sunrise … If I’d known then that it would be my most famous accomplishment in life, I might have worked on it another day or two and added some more options.”

“The folks at Berkeley eagerly took back my kernel modifications and the PING source code, and it’s been a standard part of Berkeley UNIX ever since.”

Ping (which was named after the pinging noise of submarine sonar) is one of the most-used programs in the world. It was written in one day (all 40kB of it) and probably has a higher ratio of hours of usage to hours of coding than any other program ever written.
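
For a sense of how small the core idea is, here is a minimal sketch of building an ICMP Echo Request the way ping-style tools do, with the send time carried in the payload so the reply can be timed on return. This is an illustration, not Muuss’s code, and actually sending the packet requires a raw socket, which normally needs root privileges:

```python
import struct
import time

def icmp_checksum(data: bytes) -> int:
    """Standard Internet checksum: one's-complement sum of 16-bit words."""
    if len(data) % 2:
        data += b"\x00"
    total = sum(struct.unpack(f"!{len(data) // 2}H", data))
    total = (total >> 16) + (total & 0xFFFF)
    total += total >> 16
    return ~total & 0xFFFF

def build_echo_request(ident: int, seq: int) -> bytes:
    """ICMP Echo Request (type 8, code 0) carrying the send time as its payload."""
    payload = struct.pack("!d", time.time())              # timestamp travels inside the packet
    header = struct.pack("!BBHHH", 8, 0, 0, ident, seq)   # checksum field zeroed first
    checksum = icmp_checksum(header + payload)
    header = struct.pack("!BBHHH", 8, 0, checksum, ident, seq)
    return header + payload

packet = build_echo_request(ident=0x1234, seq=1)
print(f"{len(packet)} bytes, ready to hand to a raw ICMP socket")
```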

PostScript

In 1982, Dr. John Warnock quit his job at Xerox’s Palo Alto Research Center (better known as Xerox PARC) because he could not convince management to commercialize the Interpress page-description language that he had helped write. Interpress was based on the Forth programming language. He and two former colleagues from Xerox, Charles Geschke and Dan Putman, founded Adobe Systems, and it was only natural that the Interpress clone he cowrote be called PostScript, because, like its progenitor, PostScript was Forth-influenced and used postfix operators.
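
“Postfix” simply means the operands come before the operator and everything is evaluated with a stack, which is what makes Forth, Interpress, and PostScript so easy to interpret. A toy evaluator, written in Python rather than PostScript, shows the principle:

```python
# Evaluates postfix expressions such as "3 4 add 2 mul",
# which in infix notation is (3 + 4) * 2.
OPS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
    "div": lambda a, b: a / b,
}

def eval_postfix(program: str) -> float:
    stack = []
    for token in program.split():
        if token in OPS:
            b = stack.pop()                  # the operator consumes the top two operands...
            a = stack.pop()
            stack.append(OPS[token](a, b))   # ...and pushes the result back
        else:
            stack.append(float(token))
    return stack.pop()

print(eval_postfix("3 4 add 2 mul"))   # 14.0
```

A real PostScript interpreter works on the same principle, just with a far richer set of operators for putting text and graphics on a page.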

Warnock set out to build a networked printer that ran on PostScript, but Steve Jobs caught wind of the plan and convinced Warnock instead to license PostScript and let Apple manufacture the printer. Then a programmer named Paul Brainerd got wind of that collaboration and created a “desktop publishing” program called Aldus PageMaker (an electronic typesetting and page-layout program) just in time for the release of the LaserWriter in 1985.

PageMaker and its desktop-publishing branding completely assured the success of the new LaserWriter and established PostScript as the de facto standard for electronic typesetting and page layout in one fell swoop.

Adobe made a lot of money from licensing PostScript, enough to buy Aldus for just under $500m only nine years after PageMaker’s debut.

Adobe InDesign descends from the second-generation PageMaker that was under development at Aldus when Adobe acquired the company.

PostScript lives on in many places and is perhaps the most ubiquitous of all the forgotten software, because it is at the heart of the PDF file format. It would be difficult to overestimate the significance and ubiquity of PDF.

Post-PostScript

These stories are about just a few programs from the land of forgotten software. Maybe the next time you send a PDF, or sort your files by creation date, you will think about the long history of computer programming that has allowed each subsequent generation of programmers to stand on the shoulders of giants.

This article is part of Behind the Code, the media for developers, by developers. Discover more articles and videos by visiting Behind the Code!


Illustration by Victoria Roussel
