There are more than 2,500 programming languages in existence today (some would say almost 9,000!), of which about 250 are in more or less current use. Where do all these languages come from? Why are there so many? What’s the difference between them?
General purpose vs. domain specific
What most people think of when they hear the words “programming language” is code that can create apps that people can use. That kind of language is called a “general purpose language”. The other kind of programming language is called a “domain specific language” (DSL).
A DSL is one where the syntax of the language reflects the concerns of a specific application or family of applications. Some examples are PostScript (specific to the domain of page description and rendering), sed (designed for parsing and transforming streams of text), Gherkin (a language designed to define test cases to check the behavior of software), and SQL (a language for querying relational databases).
These languages are not intended to be used to create interactive computer programs for end users—for that, a programmer will almost always use a general purpose programming language.
Philosophies and paradigms
The people who create programming languages do so from a particular perspective. Usually it is to remedy some deficiency in the languages that already exist, or to provide a different approach to writing code.
In natural language there are many ways of saying the same thing. On a cold January day, one could say, “The weather is bad outside”, “It is inclement today”, or “It’s snowing pretty hard”, and for most English speakers, how it’s understood would be the same. It is a question of style and preference and context. If you were to say, “The weather is bad outside” in late April or in September, people would automatically understand that it is windy and raining.
Similarly, because all general purpose languages must be converted to machine language before the CPU will execute them, which one to use is mostly a question of style and convenience, as long as the language is “Turing complete”. This means it can be used to implement a Turing machine, which is a conceptual computer introduced by the great English mathematician Alan Turing in his 1936 paper On Computable Numbers, with an Application to the Entscheidungsproblem. It is also the primitive model upon which modern microprocessors were originally based.
The most important quality of Turing-complete languages is that they can all implement the universal computing machine. This means that they all have a common denominator, or lingua franca. And that means that anything you can express or write in one of these languages, you can express in any of the others.
However, this does not mean that each language is equally easy to write in. Excel and PowerPoint are Turing complete, but no sane person would want to write a program using them. And this is why different programming languages exist. Some styles of programming are easier in one language compared with others. Some languages include libraries that do a lot of the work for you.
Now let’s take a look at the ideas behind the design of a small number of the major programming languages. We will start with the oldest languages first because later languages were strongly influenced by these earlier ones and their design philosophies—understanding these “proto” languages will allow you to better understand the modern languages that followed.
Plankalkül: The lost language
In the early 1940s, the first programming language, Plankalkül, was designed by the German civil engineer Konrad Zuse for use on the Z4, one of the world’s first programmable computers. However, since this was during the Second World War, his inventions were kept top secret, and the rest of the world was introduced to the concept of programmable computers via the publication of the First Draft of a Report on the EDVAC (Electronic Discrete Variable Automatic Computer) in 1945, assembled and edited by John von Neumann at the request of the US Army. Before this, computers (other than Zuse’s Z series) were not programmable. The programming was built into the electronic circuitry, much as it is in a pocket calculator, only nowhere near as powerful.
A-0: DRY (Don’t Repeat Yourself) Programming
For the next decade, most programmers worked directly in machine language. Since machine language is not the easiest thing to work with, to make life easier for themselves, EDVAC programmers kept notebooks on commonly used programs that had been debugged and tested. These were basically subroutines, and the world’s second programming language/compiler was written to automate this process.
J. Presper Eckert and John Mauchly, the designers of the EDVAC, sold their company to Remington Rand, where they completed work on their next-generation computer, the UNIVAC (Universal Automatic Computer). One of the people on their team was Grace Hopper, and in 1951 she created A-0 (Arithmetic Language version 0) to make repetitive programming jobs easier. Just like the old saw about prisoners assigning numbers to their jokes, Hopper assigned numbers or indices to commonly used subroutines and the data to be processed as arguments. The A-0 compiler (actually more of a linker) would use the index to get the subroutine’s full machine-language implementation and pass the argument to it. Hopper recounts: “nobody would touch it because, they carefully told me, computers could only do arithmetic; they could not do programs.” This mistrust was to be a pattern repeated over and over again with each successive generation of compilers.
Hopper then set her sights on something more ambitious: getting computers to process text. Remember, the computers of the day were seen as very large and very fast adding machines. People didn’t think of them as we do today: machines that can process text and images and even understand semantics. Being able to use human readable text to program was the first major paradigm shift in programming philosophy.
Hopper was truly a pioneer. She went on to create a language called FLOW-MATIC, which by the end of 1956 had given programmers 20 human readable reserved words (programming language instructions; for comparison, C has 32 reserved words and Smalltalk only 6) with which to create business programs such as payroll or bill processing.
FORTRAN: It’s all about the math
As Hopper was beginning work on FLOW-MATIC, John Backus at IBM was taking a philosophically different approach to creating human readable programming languages. Whereas Hopper was interested in getting computers to “speak English,” Backus was interested in having them speak the language of mathematics. The particular deficiency that Backus set out to address was the lack of floating point arithmetic. There was also the desire to make programming more accessible to scientists of all sorts, and by doing so, promote increased use of computers for scientific research. Speedcode (also known as Speedcoding) was released in 1953 as an interpreter, but in November 1954 Backus submitted a proposal to his bosses at IBM for a more extensive, compiled, mathematically based language that was to become the FORTRAN (Formula Translating System) language. FORTRAN was enormously influential, and the vast majority of current popular languages can trace their origins back to it.
Its design philosophy is strongly influenced by the IBM technology of punched (or punch) cards. In FORTRAN one made a series of “statements” that could fit into a fixed format on a punch card. In modern terms, each punch card was a line of code. The cards were then fed into the computer as a batch, and a printout was returned with results… or errors, if there was a bug. Because of the space limitations on a punch card, a premium was placed on succinctness and on packing as much as possible into the smallest space. FORTRAN code does not make much sense unless one is familiar with mathematics.
COBOL: Hello Computer!
By comparison, the work that Hopper was doing, which would result in the introduction of COBOL (Common Business Oriented Language) in 1959, was being done on the UNIVAC computers, which used a keyboard to enter programs onto large spools of magnetic tape. This eliminated the space constraints associated with punch cards, and there was no need to keep the language concise. Instead, Hopper pursued her idea of making computers understand human language.
Known as an extremely verbose language, COBOL has hundreds of reserved words—between 350 and 550, depending upon which version you look at. As a point of comparison, that is not much more than a minimal working vocabulary in English or the full vocabulary of a pidgin language.
LISP: Do what I mean, not what I say
In 1958, the deficiency that John McCarthy (one of the founders of the discipline of artificial intelligence) wanted to address was similar to Hopper’s. Not only did he want computers to “speak English,” he also wanted them to be able to “handle declarative as well as imperative sentences” and “exhibit ‘common sense’ in carrying out its instructions”. In other words, he wanted computers to be able to understand human readable expressions about decision-making, logic, and procedure.
The core of LISP’s philosophy is exhibited in its macros, the composable aspect of the language that makes it possible to design and develop procedures from the top down. This ability to invent rules and syntax that can then be used as if they were built into the language made LISP the language of choice for early AI researchers, and it was used for many of the early expert systems, notably the American Express Authorizer’s Assistant.
LISP was seminal. It is one of the 2 main ancestors of almost all currently used general purpose programming languages (the other is FORTRAN, via ALGOL, which we will look at next). LISP introduced higher-order functions (a function that can either take another function as an argument, or can return a function), composability (macros in LISP; making building blocks of code and reusing them), the virtual machine (a conceptual processor that uses a conceptual assembly/machine language), garbage collection (a routine built into the programming language that automatically handles the very tedious job of memory management for the programmer), and parentheses—lots and lots of parentheses.
“You cannot teach beginners top-down programming, because they don’t know which end is up”—C.A.R. Hoare
ALGOL: It knows when you’ve been naughty…
ALGOL was a strange language from the start. Designed by committee in 1958, its original name was International Algebraic Language, but it was soon changed to ALGOrithmic Language. It originally ran on the Z22. Yes, the latest generation of Zuse’s computer, now with vacuum tubes instead of electromechanical switches!
Designed as it was by a committee, the philosophy behind it was for it to be all things to all people, which as likely as not was the cause of its downfall. The only language of its time to be more complex was the ill-fated PL/1, which was intended to bring the best (?) of ALGOL, FORTRAN, and COBOL into one language to rule them all. Edsger W. Dijkstra—one of the greatest computer scientists to have ever lived—compared it to flying an airplane with 7,000 switches and controls. Like all the world-domination schemes that had come before (and since), it eventually collapsed under its own weight in the mid-1970s.
Backus was on the original committee, along with 7 other mathematicians and professors, only 2 of whom had any previous language-design experience. McCarthy was to join a bit later on for the second version of ALGOL (ALGOL 60). Despite the committee-driven attention deficit disorder, ALGOL managed to introduce 4 concepts that were to influence almost every language that came after it:
- Scoped variables
- Structured programming
- Compile time type checking
- Paged, or virtual, memory
This made ALGOL much stricter than previous languages, which was resented by many older journeyman programmers who were used to writing in FORTRAN or assembly language. The difference of opinion reached fever pitch in 1968 with the publication of Dijkstra’s Go To Statement Considered Harmful in Communications of the ACM. The resentment it triggered will be apparent in the “don’t you tell me what to do” philosophy of BCPL and its successor C.
It is worth noting that every language below is able to trace its DNA back to ALGOL, even C.
Simula: Art imitates life
Simula had a very clear philosophy: add to ALGOL the higher-level programming constructs required to make simulations and models easier to code. To do this, the concepts of objects, classes, inheritance, polymorphism, and methods were created. Objects were originally intended to be used differently from how they are put into practice today.
In Simula, and its successor, Smalltalk, objects were conceived as virtual embodiments of real-world actors and objects, complete with attributes or properties, behaviors, and states. Both languages’ goal was to model everyday physical objects in the real world (as opposed to the virtual world of scrollbars and buttons).
BCPL/C: The macho side of programming
The ubiquitous C programming language was based on BCPL, which was designed specifically for the purpose of writing compilers and was the first language in which “Hello, World!” was programmed. It was extremely low level, having only one data type: a 16-bit word, which had a one-to-one relationship with machine-language words. Its design philosophy was expressly stated by its creator, Martin Richards: “The philosophy of BCPL is not one of the tyrant who thinks he knows best and lays down the law on what is and what is not allowed; rather, BCPL acts more as a servant offering his services to the best of his ability without complaint, even when confronted with apparent nonsense. The programmer is always assumed to know what he is doing and is not hemmed in by petty restrictions.”
The reference to “the tyrant” was aimed directly at the ALGOL committee and the structured programming advocates who created ALGOL descendants that were considered too restrictive by the “real” programmers, who wore their deep understanding of the underlying operations of the CPU as a badge of honor, and scorned programmers who might “shoot themselves in the foot” with a language that allowed them to directly manipulate memory and program counters.
Dennis Ritchie (a member of the team that created UNIX, and the creator of the C programming language) was seeking a language with which to write utilities for the new operating system called UNIX that he was working on with Ken Thompson, Brian Kernighan, Douglas McIlroy, and Joe Ossanna at Bell Labs. He started from “B,” Thompson’s stripped-down descendant of BCPL, and evolved it into “C.” Within a year, the team had rewritten the entire UNIX kernel in C.
C retains the original philosophy of BCPL of providing deep access to the “metal” of the computer without the pesky safety restrictions that might be needed to protect lesser minds. It adds the structured programming paradigms of ALGOL (BCPL was more like FORTRAN than ALGOL in many ways). In short, C’s philosophy is: “I know what I’m doing, so get out of my way.”
C signaled the end of an era, the era of “big iron.” C was developed for single-user computing (UNIX was the singular of Multics), which was something completely new, almost unimaginable, in an era when people queued up for precious computing time, or sat at timesharing terminals in a room with 10 other programmers, all working on the same computer.
Everything would change after that.
This article is part of Behind the Code, the media for developers, by developers. Discover more articles and videos by visiting Behind the Code!
Illustration by Victoria Roussel