Death of an era

This October, three notable people in the computing world passed away, with one death receiving more attention than the rest. Their respective achievements are nearly immeasurable, writes Conor O’Nolan

John McCarthy

On October 23rd, John McCarthy passed away. His contribution to computer science was immense: he coined the term 'artificial intelligence' and spent his academic life working on AI, cognitive science and mathematical logic.

He showed prowess for maths from an early age, teaching himself college-level mathematics before enrolling at the California Institute of Technology and going on to a PhD at the prestigious Princeton University. After finishing his PhD he spent time at other highly respected universities, including MIT and Dartmouth, before finally settling as a professor at Stanford.

While at MIT he did what is arguably his most influential work: he invented LISP, one of the oldest programming languages still in use today. In its early days it was widely used for artificial intelligence programming, which accelerated progress in AI research.

One element of LISP that has had enormous consequences for programming was its 'garbage collection' system, in which computer memory is managed automatically. Before this became commonplace, memory had to be managed manually, a painstaking process that often resulted in catastrophic errors in programs. Over the last few decades LISP has spawned dozens of implementations and dialects, many of which have strongly influenced the more popular languages used today, such as Python, Ruby and JavaScript.
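As a rough illustration of the drudgery garbage collection removes, here is a minimal C sketch (C manages memory manually; the buffer size and variable names are invented for the example) of the bookkeeping a programmer must otherwise do by hand:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* Without garbage collection, every allocation must be
           paired with an explicit free by the programmer. */
        char *greeting = malloc(32);
        if (greeting == NULL)
            return 1;       /* allocation can fail and must be checked */

        strcpy(greeting, "hello");
        printf("%s\n", greeting);

        free(greeting);     /* forgetting this line leaks memory;      */
        return 0;           /* freeing it a second time, or using it   */
    }                       /* after this point, corrupts the program  */

In a garbage-collected language such as LISP, that final free and its attendant hazards simply disappear: the runtime reclaims the memory once nothing refers to it.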

Steve Jobs

Steve Jobs died on October 5th after a lengthy battle with a pancreatic tumour. His passing was met with global tributes to his incredible life’s work.

Jobs cofounded Apple in 1976, and in 1977 the company began selling their first mass-produced computer, the Apple II, which sold in the millions over the course of its lifespan. Jobs had a keen eye for technologies that could change the world of computing. In 1979 he negotiated access to technology from Xerox that Xerox's own executives saw little value in, but which went on to become the way we interact with computers today: the graphical user interface, with mouse input.

Jobs was a demanding manager who expected an incredible amount from his staff, and the resulting tensions led to his departure from the company in 1985. He went on to found NeXT Computer, which served the very high-end computer market. The venture was never hugely profitable, but it produced some brilliant technology, which proved its worth when Apple bought the company and brought Jobs back. At the time of the acquisition Apple was performing very poorly financially, and Jobs' shrewd business skills helped save the company.

Jobs may not have contributed much to computing in a strictly technical sense, but he left a serious mark on the world of computers and technology as a whole. His keen eye for detail led to the aesthetic decisions for which Apple are now renowned, from the design of their computers to the rendering of fonts in their operating system. The iPod, iPad and iPhone have all changed the way we consume media.

Dennis Ritchie

Dennis Ritchie died on October 12th, the week after Steve Jobs. As a result his death garnered comparatively little attention, but his contributions were arguably far more important. He will be remembered for two things above all: the UNIX operating system and the C programming language.

In the 1970s, while working at Bell Labs, Ritchie helped write the UNIX operating system. At the time, computer programs tended to be written in ways that tied them to the specific hardware they were developed on. In 1973, UNIX was rewritten in Ritchie's C programming language, which made it far more portable. This comparatively 'easy' operating system made computing vastly more accessible.

Ritchie's work on UNIX has indirectly formed the backbone of an enormous amount of modern technology. The vast majority of internet servers run Linux or BSD, systems derived from or modelled on UNIX. Apple computers run OS X, which contains a great deal of UNIX-derived code, as do iPads, iPod touches and iPhones. Google's Android runs a modified Linux kernel, itself a UNIX-like system.

Ritchie's C programming language was also hugely influential. It is a compact language that can be used on almost any computer and for almost any task. Despite being almost forty years old, it remains one of the most widely used languages in the world. Even its manual, The C Programming Language (which Ritchie co-authored with Brian Kernighan), is seen as a template for programming books and a model of clear technical writing.
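The book's famous opening example, which simply prints a line of text, captures that economy. The version below is lightly modernised to current C style (the original used the terser pre-ANSI conventions of its day):

    #include <stdio.h>

    /* The first program in Kernighan and Ritchie's book:
       print one line of text and exit. */
    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }

Countless programming books since have opened with a 'hello, world' in imitation of it.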
