In the 1990s, something new was happening: The World Wide Web had arrived and was transforming the Internet.
At first, websites were static, with limited styling and graphic design. To see what a typical website looked like in the mid-1990s, take a look at C2.com: It hasn’t changed since it was first created in March 1995 (and, by the way, it was also the original Wiki). Although it was technically possible to build a dynamic website using the Common Gateway Interface (CGI) and Perl or PHP, the user interface lacked the sophistication that most people felt an application required. Remember, this was also the period when personal-computer users had only just become familiar with graphical user interfaces such as Windows 95 or the Macintosh, with their scrolling lists, menus, radio buttons, checkboxes, pop-ups, and all of the other rich user-interface elements that we take for granted today.
However, in August 1995, Larry Ellison, the CEO of Oracle, unveiled Oracle Webserver on stage at the Boston Macworld expo (which this author attended): It was a seminal moment in the history of the web. Ellison presented a bold vision, a reversal of the client/server model of distributed applications that was popular at the time. In the client/server model, business rules and data transformation were handled by “thick” applications on personal computers, with the server mostly dedicated to database activities. Ellison suggested that application design would swing back towards the centralized model of a “mainframe” with “dumb terminals,” with Oracle Webserver taking the place of the mainframe and web browsers taking the place of the dumb terminals. It was a compelling and exciting vision and, as we can see in retrospect, a prophetic one, too. Unlike Perl and PHP, Oracle’s PL/SQL was a language that enterprise IT departments were familiar and comfortable with. It was also exceedingly robust.
Oracle was a company almost as trusted and established in the corporate IT world as IBM, and so its endorsement of the web, and the delivery of tools to build websites, sent a signal to the corporate world that web applications were something to be taken seriously. This indirectly led to the popularity of our next language, Java, aided by Oracle and Sun Microsystems’ fervent support (along with Apple, IBM, and Netscape) of a concept they called the network computer, an inexpensive computer that would be the modern equivalent of a dumb terminal. The development of Java would prove to be an integral part of this vision.
In the early 1990s, Sun Microsystems, with its popular Unix and SPARC microprocessor-based workstations and servers, had declared war on the Microsoft Windows and Intel partnership. It hoped that the Unix-SPARC combination could be used in personal computers, a move that would increase the market (and profits) of its recently developed SPARC processor a hundredfold. It joined forces with ON Technology, the company founded by software entrepreneur Mitch Kapor, to create an alternative to Microsoft Windows.
That partnership eventually dissolved without bearing much fruit, although Sun did end up inheriting some software: A Smalltalk clone that included a virtual machine, an integrated development environment (IDE), and its own syntax. It was originally developed in C++, but was ultimately delivered using Objective-C, and it became the foundation for Java. Its virtual machine (VM), like all VMs, emulated an imaginary central processing unit (CPU) by converting its virtual opcodes to the real opcodes of the physical CPU it was being run on. One could write as many VMs as needed: ones that ran on IBM CPUs, ones that ran on Sun SPARC, ones that ran on Intel chips, and ones that ran on the ARM chips in your phone. Once a program had been compiled into the generic opcodes of the VM, it could run on all of these platforms.
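The dispatch loop at the heart of a VM is simple enough to sketch in a few lines. The following toy stack machine (in Python, with made-up opcode names; nothing here is Java’s actual instruction set) shows the basic idea: virtual opcodes are read one at a time and translated into real operations on the host machine.

```python
# A toy stack-based virtual machine: a dispatch loop that maps
# virtual opcodes to operations on the host CPU. The opcode names
# (PUSH, ADD, MUL, HALT) are hypothetical, chosen for illustration.

PUSH, ADD, MUL, HALT = range(4)

def run(bytecode):
    """Execute a list of (opcode, operand) pairs; return the top of the stack."""
    stack = []
    pc = 0  # program counter
    while pc < len(bytecode):
        op, arg = bytecode[pc]
        pc += 1
        if op == PUSH:
            stack.append(arg)
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == HALT:
            break
    return stack[-1]

# Compute (2 + 3) * 4 in the virtual instruction set.
program = [(PUSH, 2), (PUSH, 3), (ADD, None), (PUSH, 4), (MUL, None), (HALT, None)]
print(run(program))  # → 20
```

Any machine with such a loop can run the same bytecode, which is exactly what made "write once, run anywhere" possible.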
At the time, a team led by the Sun programmer Patrick Naughton was targeting the market for “set-top boxes” and decided to make a consumer device for interactive television that they tried to sell to cable-TV companies. They decided to power this box with an operating system based on the Smalltalk clone, code-named Oak, and optimized it heavily for performance (consumer electronics do not have a lot of CPU power or memory).
Even though the set-top-box project never worked out, the software that the team designed presented a compelling use case. Designed as it was for television-set-top boxes controlled by a central authority (such as a cable-TV company), it was good at two things: Operating in a very small memory space and sending programs as objects (called Java applets) across a slow network. This is what motivated the web-browser company Netscape to officially adopt it and include a Java VM in its browser. It was actually quite a good decision, as it allowed browsers to become delivery mechanisms for small applications that communicated with larger applications on a server. It is what made web-based applications possible, because the first versions of HTML couldn’t support a rich enough user interface to allow web applications to compete successfully with desktop applications.
In summary, Java’s philosophy was to be lightweight (which it was at first), and to provide a rich user interface in a small applet that could be “sent down the wire” on demand. It was supported by a very large, stable company that would be around forever (or so it seemed at the time). And, unlike Smalltalk, it could be used for free. As web-based applications became the norm, adoption of Java by IT departments of large organizations increased exponentially—and, perhaps predictably, so did the language’s size and complexity. Here’s what the irrepressible Larry Wall, creator of Perl, had to say about it in 2011:
“Java is sort of the COBOL of the 21st century, I think. It’s kind of heavyweight, verbose, and everyone loves to hate it—though not everyone will admit that. But managers kind of like it because it looks like you’re getting a lot done. If 100 lines of Java code accomplish a task, then it looks like you’ve written 100 lines, even though, in a different language, it might only take 5 lines.”
Just like Perl, Python started as a scripting language. Its author, Guido van Rossum, had the same goal as Wall: To create a language that filled the gap between shell-scripting languages and a language like C, with which you could write an entire application.
Van Rossum added a few extra philosophical concerns. Famously, he wished to create it as a “batteries included” language. Today that does not seem so special, but in the late 1980s, when he first started thinking about Python, most languages were a formal specification written up as a paper and released without standard libraries. In creating Python, he wished to make sure that commonly performed operations (for a developer) could be executed with as little as one line of code. To that end, he designed Python to be extremely sparse, without the need for boilerplate, or setup, or any ritual incantations. And he planned extensive standard libraries that could be loaded as needed (to avoid bloat), to perform commonly used functions such as file or network access.
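The “batteries included” idea is easy to demonstrate in modern Python, where everyday tasks need only the standard library. (The snippets below are illustrative choices of ours, not examples from the original design.)

```python
# "Batteries included": common tasks handled by the standard
# library alone, each in a line or two.
import json
import statistics
from collections import Counter

# Parse JSON text into native Python data structures.
config = json.loads('{"debug": true, "workers": 4}')

# Count word frequencies in a sentence.
counts = Counter("the quick brown fox jumps over the lazy dog".split())

# Basic statistics without any third-party numeric package.
avg = statistics.mean([2, 4, 6])

print(config["workers"], counts["the"], avg)
```

Each of these would require either hand-rolled code or an external library in most languages of the era.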
Finally, he had strong feelings about aesthetics and readability, and consciously tried to remedy all of the sources of ugliness and ambiguity that he had experienced with Fortran, ALGOL, and the language that he had previously worked on, ABC.
Python had a slow start, and had a small (but devoted) community for many years. However, after 2004, it started to gain momentum, aided no doubt by Google’s acquisition of YouTube and its subsequent adoption of Python as an “official” language, even going so far as to hire van Rossum. It is now the most popular language in the world by some measures.
Another programmer who wanted to balance functional programming with imperative programming was Yukihiro “Matz” Matsumoto. Like van Rossum, he wanted an updated, Perl-like language, and he was aware of Python. However, he “didn’t think [Python] was a true object-oriented language” (this changed somewhat in later releases but, prior to version 2.2, Python was not truly consistently object oriented) and he “really wanted a genuine object-oriented, easy-to-use scripting language.”
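Matsumoto’s objection predates Python 2.2, which unified types and classes. In today’s Python the object model is uniform, as a few lines show (a quick illustration of the modern language, not a claim about the 1990s implementations):

```python
# Post-2.2 Python: every value, including literals, classes, and
# built-in functions, is an object with a class.
print((5).__class__)             # <class 'int'>
print((5).bit_length())          # 3: methods work on integer literals
print(isinstance(int, object))   # True: classes are objects too
```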
Matsumoto’s syntax was influenced by Lisp, but he did away with macros. He added objects and messaging inspired by Smalltalk, and he rounded off the whole affair with some inspiration from Perl. So it would be fair to say that the philosophy behind this language was also about filling the gap between shell-scripting languages and a language like C, with which you could write an entire application, but this time with objects.
In April 1995, Brendan Eich found himself trying to make the best out of a bad situation. He had just started working at Netscape and had been given the mission of creating a scripting language for its browser.
He tried to infuse the project with a specific philosophy: He started with the intention of creating a lightweight Self implementation, which is another way of saying a lightweight Smalltalk implementation, since Self originally began as a stripped-down version of Smalltalk. The philosophy therefore would have been about the same aesthetic tradition as Smalltalk: Simplicity, composability, uniformity.
This chapter in the series covers the languages that most people use today. However, it’s important not to ignore the future. In the next, and final, chapter, we will look at a few languages that are increasing in popularity and may take on a more significant role in the years to come.
This article is part of Behind the Code, the media for developers, by developers.
Illustration by Victoria Roussel